WorldWideScience

Sample records for streamlined automated approaches

  1. Streamlined approach to waste management at CRL

    International Nuclear Information System (INIS)

    Adams, L.; Campbell, B.

    2011-01-01

    Radioactive, mixed, hazardous and non-hazardous wastes have been and continue to be generated at Chalk River Laboratories (CRL) as a result of research and development activities and operations since the 1940s. Over the years, the wastes produced as a byproduct of activities delivering the core missions of the CRL site have been of many types, and today over thirty distinct waste streams have been identified, all requiring efficient management. With the commencement of decommissioning of the legacy created during the development of the Canadian nuclear industry, the volume and range of wastes to be managed have been increasing and will continue to increase into the future. A streamlined approach is therefore key to successful waste management at CRL. Waste management guidelines that address all of the requirements have become complex, and so have the various waste management groups receiving waste, with their many different processes and capabilities. This has made it difficult for waste generators to understand all of the requirements of the various CRL waste receivers, whose primary concerns are to operate safely and in compliance with their acceptance criteria and license conditions. As a result, waste movement on site can often be very slow, especially for non-routine waste types. Recognizing an opportunity for improvement, the Waste Management organization at CRL has implemented a more streamlined approach with emphasis on early identification of waste type and possible disposition path. This paper presents the streamlined approach to waste identification and waste management at CRL, the implementation methodology applied and the early results achieved from this process improvement. (author)

  2. Streamlining and automation of radioanalytical methods at a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, J.T.; Dillard, J.W. [IT Corp., Knoxville, TN (United States)]

    1993-12-31

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  3. Streamlining and automation of radioanalytical methods at a commercial laboratory

    International Nuclear Information System (INIS)

    Harvey, J.T.; Dillard, J.W.

    1993-01-01

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  4. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    Science.gov (United States)

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives the inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams at the observation level. This paper describes the modelling approach we have developed for TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing and pointing block parameters calculated from the observation geometry. This approach is considered an improvement with respect to previous missions, where the generation of observation pointing and commanding requests was performed manually by the instrument teams. Automation software helps us handle the high density of planned orbits and the increasing volume of scientific data effectively, and to meet opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs with the flight dynamics products to generate the Pointing Requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC sequence level, and the tool computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.
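
    As an illustration of the resource-envelope bookkeeping this record describes, the following minimal Python sketch sums data volume and checks peak power for a set of commanded observations against planning-cycle limits. The class, instrument names, and limits are illustrative assumptions, not the actual TGO planning-tool interface.

```python
# Hypothetical sketch of a resource-envelope check: each timeline entry
# contributes data volume and power draw, and the planner verifies the totals
# stay inside mission constraints. Names and limits are illustrative only.
from dataclasses import dataclass

@dataclass
class TimelineEntry:
    instrument: str
    duration_s: float        # commanded duration of the observation
    data_rate_kbps: float    # average science data rate while active
    power_w: float           # average power draw while active

def check_envelopes(entries, max_volume_gbit, max_power_w):
    """Return (ok, total_volume_gbit, peak_power_w) for one planning cycle."""
    total_volume_gbit = sum(e.duration_s * e.data_rate_kbps for e in entries) / 1e6
    peak_power_w = max((e.power_w for e in entries), default=0.0)
    ok = total_volume_gbit <= max_volume_gbit and peak_power_w <= max_power_w
    return ok, total_volume_gbit, peak_power_w

# Example with two invented observations checked against illustrative limits.
plan = [TimelineEntry("NOMAD", 1800, 200.0, 35.0),
        TimelineEntry("CaSSIS", 600, 800.0, 28.0)]
print(check_envelopes(plan, max_volume_gbit=2.0, max_power_w=60.0))
```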

  5. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    International Nuclear Information System (INIS)

    Chess, Jordan J.; Montoya, Sergio A.; Harvey, Tyler R.; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E.; McMorran, Benjamin J.

    2017-01-01

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which include many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
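
    To make the single-image idea concrete, here is a hedged numerical sketch of the simplified TIE: assuming a uniform in-focus intensity (the thin, uniform film case the abstract names), one defocused image approximates the intensity derivative along the beam, and the phase follows from an inverse Laplacian evaluated in Fourier space. Parameter names, the finite-difference derivative, and the regularization are the editor's assumptions, not the authors' exact implementation.

```python
# Minimal sketch of single-image, simplified TIE phase retrieval.
# Assumes constant in-focus intensity I0, so lap(phi) = -(k/I0) * dI/dz.
import numpy as np

def tie_phase_single_image(I_defocus, defocus_m, wavelength_m, pixel_m, reg=1e-3):
    I0 = I_defocus.mean()                      # uniform in-focus intensity assumption
    dIdz = (I_defocus - I0) / defocus_m        # finite-difference intensity derivative
    k = 2.0 * np.pi / wavelength_m
    rhs = -k * dIdz / I0                       # right-hand side of the simplified TIE
    ny, nx = I_defocus.shape
    qy = np.fft.fftfreq(ny, d=pixel_m) * 2.0 * np.pi
    qx = np.fft.fftfreq(nx, d=pixel_m) * 2.0 * np.pi
    q2 = qy[:, None] ** 2 + qx[None, :] ** 2
    q2[0, 0] = np.inf                          # drop the undetermined constant offset
    phi = np.fft.ifft2(np.fft.fft2(rhs) / (-q2 - reg)).real
    return phi

# The in-plane magnetic induction is then proportional to the phase gradient
# rotated by 90 degrees, e.g. obtained from np.gradient(phi).
```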

  6. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Energy Technology Data Exchange (ETDEWEB)

    Chess, Jordan J., E-mail: jchess@uoregon.edu [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Montoya, Sergio A. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); Harvey, Tyler R. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Ophus, Colin [National Center for Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); McMorran, Benjamin J. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States)

    2017-06-15

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which include many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.

  7. Cognitive Approaches to Automated Instruction.

    Science.gov (United States)

    Regian, J. Wesley, Ed.; Shute, Valerie J., Ed.

    This book contains a snapshot of state-of-the-art research on the design of automated instructional systems. Selected cognitive psychologists were asked to describe their approach to instruction and cognitive diagnosis, the theoretical basis of the approach, its utility and applicability, and the knowledge engineering or task analysis methods…

  8. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    Science.gov (United States)

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volume-modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with a commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  9. Approaches to automated protein crystal harvesting

    Energy Technology Data Exchange (ETDEWEB)

    Deller, Marc C., E-mail: mdeller@scripps.edu; Rupp, Bernhard, E-mail: mdeller@scripps.edu

    2014-01-28

    Approaches to automated and robot-assisted harvesting of protein crystals are critically reviewed. While no true turn-key solutions for automation of protein crystal harvesting are currently available, systems incorporating advanced robotics and micro-electromechanical systems represent exciting developments with the potential to revolutionize the way in which protein crystals are harvested.

  10. Adaptation : A Partially Automated Approach

    NARCIS (Netherlands)

    Manjing, Tham; Bukhsh, F.A.; Weigand, H.

    2014-01-01

    This paper showcases the possibility of creating an adaptive auditing system. Adaptation in an audit environment needs human intervention at some point. Based on a case study, this paper focuses on automation of the adaptation process. It is divided into solution design and validation parts. The artifact

  11. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    Science.gov (United States)

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high-capacity, high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation conditions. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we use the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching. We also compare the IC50 results obtained with our method for the five major CYP isoforms to values reported in the literature.
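
    As an illustration of the kind of IC50 calculation performed on pooled four-point data like this, the following hedged sketch fits percent-of-control activity to a standard log-logistic inhibition model. The concentrations, activity values, and fixed Hill slope are invented for the example and are not assay results from this record.

```python
# Illustrative IC50 fit from four inhibitor concentrations (values are made up).
import numpy as np
from scipy.optimize import curve_fit

def inhibition(conc_uM, ic50_uM, hill=1.0):
    """Percent of control activity remaining at a given inhibitor concentration."""
    return 100.0 / (1.0 + (conc_uM / ic50_uM) ** hill)

conc = np.array([0.3, 1.0, 3.0, 10.0])            # four test concentrations (uM)
activity = np.array([92.0, 75.0, 48.0, 21.0])     # metabolite/IS ratio, % of control

popt, _ = curve_fit(inhibition, conc, activity, p0=[3.0])  # fit IC50 only
print(f"estimated IC50 = {popt[0]:.2f} uM")
```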

  12. STREAMLINED APPROACH FOR ENVIRONMENTAL RESTORATION PLAN FOR CORRECTIVE ACTION UNIT 116: AREA 25 TEST CELL C FACILITY NEVADA TEST SITE, NEVADA

    International Nuclear Information System (INIS)

    2006-01-01

    This Streamlined Approach for Environmental Restoration Plan identifies the activities required for the closure of Corrective Action Unit 116, Area 25 Test Cell C Facility. The Test Cell C Facility is located in Area 25 of the Nevada Test Site approximately 25 miles northwest of Mercury, Nevada

  13. InterviewStreamliner, a minimalist, free, open source, relational approach to computer-assisted qualitative data analysis software

    NARCIS (Netherlands)

    H.D. Pruijt (Hans)

    2010-01-01

    textabstractInterviewStreamliner is a free, open source, minimalist alternative to complex computer-assisted qualitative data analysis packages. It builds on the flexibility of relational database management technology.

  14. Photogrammetric approach to automated checking of DTMs

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2005-01-01

    Geometrically accurate digital terrain models (DTMs) are essential for orthoimage production and many other applications. Collecting reference data and visual inspection are reliable but time-consuming and therefore expensive methods for finding errors in DTMs. In this paper, a photogrammetric approach to automated checking and improving of DTMs is evaluated. Corresponding points in two overlapping orthoimages are found by means of area-based matching. Provided the image orientation is correct, discovered displacements correspond to DTM errors. Improvements of the method regarding its...
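
    The following hedged sketch shows one common form of the area-based matching step described above: a small patch from one orthoimage is compared against a search window in the overlapping orthoimage using normalized cross-correlation, and the offset of the best match is the displacement that would be attributed to a DTM error. Window sizes and the brute-force search are illustrative assumptions, not the paper's exact procedure.

```python
# Area-based matching by normalized cross-correlation (illustrative sketch).
import numpy as np

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_patch(img1, img2, row, col, half=10, search=5):
    """Find the (drow, dcol) shift of the patch around (row, col) between images.
    Assumes the patch and search window lie fully inside both images."""
    template = img1[row - half:row + half + 1, col - half:col + half + 1]
    best, best_shift = -1.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            candidate = img2[row + dr - half:row + dr + half + 1,
                             col + dc - half:col + dc + half + 1]
            score = ncc(template, candidate)
            if score > best:
                best, best_shift = score, (dr, dc)
    return best_shift, best   # displacement in pixels and its correlation score
```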

  15. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 574: Neptune, Nevada National Security Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Environmental Restoration

    2011-08-31

    This Streamlined Approach for Environmental Restoration (SAFER) Plan identifies the activities required for closure of Corrective Action Unit (CAU) 574, Neptune. CAU 574 is included in the Federal Facility Agreement and Consent Order (FFACO) (1996 [as amended March 2010]) and consists of the following two Corrective Action Sites (CASs) located in Area 12 of the Nevada National Security Site: (1) CAS 12-23-10, U12c.03 Crater (Neptune); (2) CAS 12-45-01, U12e.05 Crater (Blanca). This plan provides the methodology for the field activities that will be performed to gather the necessary information for closure of the two CASs. There is sufficient information and process knowledge regarding the expected nature and extent of potential contaminants to recommend closure of CAU 574 using the SAFER process. Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, field screening, analytical results, the results of the data quality objective (DQO) process (Section 3.0), and an evaluation of corrective action alternatives (Appendix B), closure in place with administrative controls is the expected closure strategy for CAU 574. Additional information will be obtained by conducting a field investigation to verify and support the expected closure strategy and provide a defensible recommendation that no further corrective action is necessary. This will be presented in a Closure Report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval.

  16. TU-C-BRE-10: A Streamlined Approach to EPID Transit Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Morris, B; Fontenot, J [Louisiana State University, Baton Rouge, LA (United States); Mary Bird Perkins Cancer Center, Baton Rouge, LA (United States)

    2014-06-15

    Purpose: To investigate the feasibility of a simple and efficient transit dosimetry method using the electronic portal imaging device (EPID) for dose delivery error detection and prevention. Methods: In the proposed method, 2D reference transit images are generated for comparison with online images acquired during treatment. Reference transit images are generated by convolving through-air EPID measurements of each field with pixel-specific kernels selected from a library of pre-calculated Monte Carlo pencil kernels of varying radiological thickness. The kernel used for each pixel is selected based on the calculated radiological thickness of the patient along a line joining the pixel and the virtual source. The accuracy of the technique was evaluated in flat homogeneous and heterogeneous plastic water phantoms, a heterogeneous cylindrical phantom, and an anthropomorphic head phantom. A gamma criterion of 3%/3 mm was used to quantify the accuracy of the technique for the various cases. Results: An average of 99.9% and 99.7% of the points in the comparison between the measured and predicted images passed a 3%/3 mm gamma for the homogeneous and heterogeneous plastic water phantoms, respectively. 97.1% of the points passed for the analysis of the heterogeneous cylindrical phantom. For the anthropomorphic head phantom, an average of 97.8% of points passed the 3%/3 mm gamma criterion for all field sizes. Failures were observed primarily in areas of drastic thickness or material changes and at the edges of the fields. Conclusion: The data suggest that the proposed transit dosimetry method is a feasible approach to in vivo dose monitoring. Future research efforts could include implementation for more complex fields and sensitivity testing of the method to setup errors and changes in anatomy. Oncology Data Systems provided partial funding support but did not participate in the collection or analysis of data.
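
    For readers unfamiliar with the 3%/3 mm gamma comparison used to score predicted against measured images, here is a hedged, brute-force 2D sketch of the standard gamma-index pass rate. The global dose-difference normalization and the limited search neighborhood are simplifying assumptions for illustration, not the authors' implementation.

```python
# Simplified 2D global gamma analysis (3%/3 mm by default), illustrative only.
import numpy as np

def gamma_pass_rate(reference, evaluated, pixel_mm, dose_pct=3.0, dta_mm=3.0):
    dd = dose_pct / 100.0 * reference.max()           # global dose-difference criterion
    search = int(np.ceil(dta_mm / pixel_mm)) + 1      # neighborhood radius in pixels
    ny, nx = reference.shape
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            best = np.inf
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        dist2 = ((di * pixel_mm) ** 2 + (dj * pixel_mm) ** 2) / dta_mm ** 2
                        diff2 = ((evaluated[ii, jj] - reference[i, j]) / dd) ** 2
                        best = min(best, dist2 + diff2)
            total += 1
            passed += int(np.sqrt(best) <= 1.0)        # gamma <= 1 means the point passes
    return 100.0 * passed / total
```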

  17. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 408: Bomblet Target Area, Tonopah Test Range, Nevada

    International Nuclear Information System (INIS)

    NSTec Environmental Management

    2006-01-01

    This Streamlined Approach for Environmental Restoration Plan provides the details for the closure of Corrective Action Unit (CAU) 408, Bomblet Target Area. CAU 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order of 1996. One Corrective Action Site (CAS) is included in CAU 408: • CAS TA-55-002-TAB2, Bomblet Target Areas. Based on historical documentation, personnel interviews, process knowledge, site visits, aerial photography, multispectral data, preliminary geophysical surveys, and the results of the data quality objectives process (Section 3.0), clean closure will be implemented for CAU 408. CAU 408 closure activities will consist of identification and clearance of bomblet target areas, identification and removal of depleted uranium (DU) fragments on South Antelope Lake, and collection of verification samples. Any soil containing contaminants at concentrations above the action levels will be excavated and transported to an appropriate disposal facility. Based on existing information, contaminants of potential concern at CAU 408 include explosives. In addition, at South Antelope Lake, bomblets containing DU were tested. None of these contaminants is expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results. The corrective action investigation and closure activities have been planned to include data collection and hold points throughout the process. Hold points are designed to allow decision makers to review the existing data and decide which of the available options are most suitable. Hold points include the review of radiological, geophysical, and analytical data and field observations.

  19. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 107: Low Impact Soil Sites, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    2008-01-01

    This Streamlined Approach for Environmental Restoration Plan covers activities associated with Corrective Action Unit (CAU) 107 of the Federal Facility Agreement and Consent Order (FFACO, 1996 (as amended February 2008)). CAU 107 consists of the following Corrective Action Sites (CASs) located in Areas 1, 2, 3, 4, 5, 9, 10, and 18 of the Nevada Test Site. (1) CAS 01-23-02, Atmospheric Test Site - High Alt; (2) CAS 02-23-02, Contaminated Areas (2); (3) CAS 02-23-03, Contaminated Berm; (4) CAS 02-23-10, Gourd-Amber Contamination Area; (5) CAS 02-23-11, Sappho Contamination Area; (6) CAS 02-23-12, Scuttle Contamination Area; (7) CAS 03-23-24, Seaweed B Contamination Area; (8) CAS 03-23-27, Adze Contamination Area; (9) CAS 03-23-28, Manzanas Contamination Area; (10) CAS 03-23-29, Truchas-Chamisal Contamination Area; (11) CAS 04-23-02, Atmospheric Test Site T4-a; (12) CAS 05-23-06, Atmospheric Test Site; (13) CAS 09-23-06, Mound of Contaminated Soil; (14) CAS 10-23-04, Atmospheric Test Site M-10; and (15) CAS 18-23-02, U-18d Crater (Sulky). Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, engineering drawings, field screening, analytical results, and the results of data quality objectives process (Section 3.0), closure in place with administrative controls or no further action will be implemented for CAU 107. CAU 107 closure activities will consist of verifying that the current postings required under Title 10 Code of Federal Regulations (CFR) Part 835 are in place and implementing use restrictions (URs) at two sites, CAS 03-23-29 and CAS 18-23-02. The current radiological postings combined with the URs are adequate administrative controls to limit site access and worker dose

  20. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 107: Low Impact Soil Sites, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    2009-01-01

    This Streamlined Approach for Environmental Restoration Plan covers activities associated with Corrective Action Unit (CAU) 107 of the Federal Facility Agreement and Consent Order (1996 (as amended February 2008)). CAU 107 consists of the following Corrective Action Sites (CASs) located in Areas 1, 2, 3, 4, 5, 9, 10, and 18 of the Nevada Test Site: • CAS 01-23-02, Atmospheric Test Site - High Alt; • CAS 02-23-02, Contaminated Areas (2); • CAS 02-23-03, Contaminated Berm; • CAS 02-23-10, Gourd-Amber Contamination Area; • CAS 02-23-11, Sappho Contamination Area; • CAS 02-23-12, Scuttle Contamination Area; • CAS 03-23-24, Seaweed B Contamination Area; • CAS 03-23-27, Adze Contamination Area; • CAS 03-23-28, Manzanas Contamination Area; • CAS 03-23-29, Truchas-Chamisal Contamination Area; • CAS 04-23-02, Atmospheric Test Site T4-a; • CAS 05-23-06, Atmospheric Test Site; • CAS 09-23-06, Mound of Contaminated Soil; • CAS 10-23-04, Atmospheric Test Site M-10; • CAS 18-23-02, U-18d Crater (Sulky). Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, engineering drawings, field screening, analytical results, and the results of the data quality objectives process (Section 3.0), closure in place with administrative controls or no further action will be implemented for CAU 107.

  1. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 398: Area 25 Spill Sites, Nevada Test Site, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    K. B. Campbell

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the activities necessary to close Corrective Action Unit (CAU) 398: Area 25 Spill Sites. CAU 398, located in Area 25 of the Nevada Test Site, is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) (FFACO, 1996), and consists of the following 13 Corrective Action Sites (CASs) (Figure 1): (1) CAS 25-44-01 , a fuel spill on soil that covers a concrete pad. The origins and use of the spill material are unknown, but the spill is suspected to be railroad bedding material. (2) CAS 25-44-02, a spill of liquid to the soil from leaking drums. (3) CAS 25-44-03, a spill of oil from two leaking drums onto a concrete pad and surrounding soil. (4) CAS 25-44-04, a spill from two tanks containing sulfuric acid and sodium hydroxide used for a water demineralization process. (5) CAS 25-25-02, a fuel or oil spill from leaking drums that were removed in 1992. (6) CAS 25-25-03, an oil spill adjacent to a tipped-over drum. The source of the drum is not listed, although it is noted that the drum was removed in 1991. (7) CAS 25-25-04, an area on the north side of the Engine-Maintenance, Assembly, and Disassembly (E-MAD) facility, where oils and cooling fluids from metal machining operations were poured directly onto the ground. (8) CAS 25-25-05, an area of oil and/or hydraulic fluid spills beneath the heavy equipment once stored there. (9) CAS 25-25-06, an area of diesel fuel staining beneath two generators that have since been removed. (10) CAS 25-25-07, an area of hydraulic oil spills associated with a tunnel-boring machine abandoned inside X-Tunnel. (11) CAS 25-25-08, an area of hydraulic fluid spills associated with a tunnel-boring machine abandoned inside Y-Tunnel. (12) CAS 25-25-16, a diesel fuel spill from an above-ground storage tank located near Building 3320 at Engine Test Stand-1 (ETS-1) that was removed in 1998. (13) CAS 25-25-17, a hydraulic oil spill

  2. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    Le Go, Roland

    1972-01-01

    The methods of approach developed with a view to automatic processing of the different stages of chromosome analysis are described in this study, which is divided into three parts. Part 1 relates the study of automated selection of metaphase spreads, which operates a decision process in order to reject all the non-pertinent images and keep the good ones. This approach has been achieved by writing a simulation program that made it possible to establish the proper selection algorithms in order to design a kit of electronic logical units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded according to a 16, 32 or 64 grey-level scale, and is processed by a pattern recognition program isolating the chromosomes and investigating their characteristic features (arm tips, centromere areas), in order to get measurements equivalent to the lengths of the four arms. Part 3 studies a program of automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: 1/ a list of the pairs, 2/ a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and 3/ another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the really abnormal or non-standard images unpaired by the program, which are of special interest for the biologist. (author) [fr
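
    To illustrate the "optimized pairing" idea in Part 3, the following hedged sketch reduces each chromosome to a total length and a centromeric index and pairs chromosomes greedily by smallest feature distance, flagging leftovers as potentially abnormal. The greedy match and the distance weighting are the editor's simplifications of the study's optimization, and the feature values are supplied by the caller.

```python
# Greedy pairing of chromosomes by length and centromeric index (illustrative).
import itertools

def features(short_arm, long_arm):
    total = short_arm + long_arm
    return total, short_arm / total          # total length and centromeric index

def pair_chromosomes(measurements):
    """measurements: list of (short_arm, long_arm); returns (pairs, unpaired indices)."""
    feats = [features(s, l) for s, l in measurements]
    max_len = max(f[0] for f in feats)
    def dist(i, j):
        # combine normalized length difference and centromeric-index difference
        return abs(feats[i][0] - feats[j][0]) / max_len + abs(feats[i][1] - feats[j][1])
    unpaired = set(range(len(measurements)))
    pairs = []
    while len(unpaired) > 1:
        i, j = min(itertools.combinations(unpaired, 2), key=lambda p: dist(*p))
        pairs.append((i, j))
        unpaired -= {i, j}
    return pairs, sorted(unpaired)            # leftovers flag possible abnormal images
```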

  3. Automated Announcements of Approaching Emergency Vehicles

    Science.gov (United States)

    Bachelder, Aaron; Foster, Conrad

    2006-01-01

    Street intersections that are equipped with traffic lights would also be equipped with means for generating audible announcements of approaching emergency vehicles, according to a proposal. The means to generate the announcements would be implemented in the intersection-based subsystems of emergency traffic-light-preemption systems like those described in the two immediately preceding articles and in "Systems Would Preempt Traffic Lights for Emergency Vehicles" (NPO-30573), NASA Tech Briefs, Vol. 28, No. 10 (October 2004), page 36. Preempting traffic lights is not, by itself, sufficient to warn pedestrians at affected intersections that emergency vehicles are approaching. Automated visual displays that warn of approaching emergency vehicles can be helpful as a supplement to preemption of traffic lights, but experience teaches that for a variety of reasons, pedestrians often do not see such displays. Moreover, in noisy and crowded urban settings, the lights and sirens on emergency vehicles are often not noticed until a few seconds before the vehicles arrive. According to the proposal, the traffic-light preemption subsystem at each intersection would generate an audible announcement (for example, "Emergency vehicle approaching, please clear intersection") whenever a preemption was triggered. The subsystem would estimate the time of arrival of an approaching emergency vehicle by use of vehicle identity, position, and time data from one or more sources that could include units connected to traffic loops and/or transponders connected to diagnostic and navigation systems in participating emergency vehicles. The intersection-based subsystem would then start the announcement far enough in advance to enable pedestrians to leave the roadway before any emergency vehicles arrive.

  4. Smart management of sample dilution using an artificial neural network to achieve streamlined processes and saving resources: the automated nephelometric testing of serum free light chain as case study.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Saving resources is a paramount issue for the modern laboratory, and new trainable, smart technologies can allow automated instrumentation to manage samples more efficiently in order to achieve streamlined processes. In this regard the serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires a number of assays before an acceptable result within the analytical range is achieved. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) wasted tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although it was restricted to follow-up samples, the MLP-ANN showed good predictive performance, which, alongside the possibility of implementing it in any automated system, makes it a suitable solution for achieving streamlined laboratory processes and saving resources.
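
    The following hedged sketch shows the general shape of such a model: a multi-layer perceptron trained on information already available in the LIS to predict a starting dilution class for a follow-up sample. The feature names (previous kappa/lambda results, days since the last test), the dilution classes, and the tiny data set are illustrative assumptions, not the published model or its inputs.

```python
# Illustrative MLP predicting a starting dilution class from LIS-style features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# columns: previous kappa (mg/L), previous lambda (mg/L), days since last test
X = np.array([[12.0, 15.0, 30], [850.0, 10.0, 14], [25.0, 900.0, 21], [9.0, 11.0, 60]])
y = np.array(["1:100", "1:10000", "1:10000", "1:100"])   # starting dilution classes

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X, y)
print(model.predict([[700.0, 12.0, 20]]))    # expect the higher starting dilution
```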

  5. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 121: Storage Tanks and Miscellaneous Sites, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    NSTec Environmental Restoration

    2007-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan identifies the activities required for the closure of Corrective Action Unit (CAU) 121, Storage Tanks and Miscellaneous Sites. CAU 121 is currently listed in Appendix III of the ''Federal Facility Agreement and Consent Order'' (FFACO, 1996) and consists of three Corrective Action Sites (CASs) located in Area 12 of the Nevada Test Site (NTS): CAS 12-01-01, Aboveground Storage Tank; CAS 12-01-02, Aboveground Storage Tank; and CAS 12-22-26, Drums; 2 AST's. CASs 12-01-01 and 12-01-02 are located to the west of the Area 12 Camp, and CAS 12-22-26 is located near the U-12g Tunnel, also known as G-tunnel, in Area 12 (Figure 1). The aboveground storage tanks (ASTs) present at CASs 12-01-01 and 12-01-02 will be removed and disposed of at an appropriate facility. Soil below the ASTs will be sampled to identify whether it has been impacted with chemicals or radioactivity above action levels. If impacted soil above action levels is present, the soil will be excavated and disposed of at an appropriate facility. The CAS 12-22-26 site is composed of two overlapping areas, one where drums had formerly been stored, and the other where an AST was used to dispense diesel for locomotives used at G-tunnel. This area is located above an underground radioactive materials area (URMA), and within an area that may have elevated background radioactivity because of containment breaches during nuclear tests and associated tunnel reentry operations. CAS 12-22-26 does not include the URMA or the elevated background radioactivity. An AST that had previously been used to store liquid magnesium chloride (MgCl) was properly disposed of several years ago, and releases from this tank are not an environmental concern. The diesel AST will be removed and disposed of at an appropriate facility. Soil at the former drum area and the diesel AST area will be sampled to identify whether it has been impacted by releases, from the drums or the

  6. Scientific Evaluation and Review of Claims in Health Care (SEaRCH): A Streamlined, Systematic, Phased Approach for Determining “What Works” in Healthcare

    Science.gov (United States)

    Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela

    2017-01-01

    Abstract Background: Answering the question of “what works” in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. Methods: SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods—CAP, REAL, and EP—can be integrated into a strategic approach to help answer the question “what works in healthcare?” and what it means in a comprehensive way. Discussion: SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and

  7. Scientific Evaluation and Review of Claims in Health Care (SEaRCH): A Streamlined, Systematic, Phased Approach for Determining "What Works" in Healthcare.

    Science.gov (United States)

    Jonas, Wayne B; Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela

    2017-01-01

    Answering the question of "what works" in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL © ), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods-CAP, REAL, and EP-can be integrated into a strategic approach to help answer the question "what works in healthcare?" and what it means in a comprehensive way. SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and wellness.

  8. Approach to plant automation with evolving technology

    International Nuclear Information System (INIS)

    White, J.D.

    1989-01-01

    The US Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants. 24 refs

  9. Approach to plant automation with evolving technology

    International Nuclear Information System (INIS)

    White, J.D.

    1989-01-01

    This paper reports that the U.S. Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants.

  10. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 484: Surface Debris, Waste Sites, and Burn Area, Tonopah Test Range, Nevada

    International Nuclear Information System (INIS)

    Bechtel Nevada

    2004-01-01

    This Streamlined Approach for Environmental Restoration plan details the activities necessary to close Corrective Action Unit (CAU) 484: Surface Debris, Waste Sites, and Burn Area (Tonopah Test Range). CAU 484 consists of sites located at the Tonopah Test Range, Nevada, and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order. CAU 484 consists of the following six Corrective Action Sites: (1) CAS RG-52-007-TAML, Davis Gun Penetrator Test; (2) CAS TA-52-001-TANL, NEDS Detonation Area; (3) CAS TA-52-004-TAAL, Metal Particle Dispersion Test; (4) CAS TA-52-005-TAAL, Joint Test Assembly DU Sites; (5) CAS TA-52-006-TAPL, Depleted Uranium Site; and (6) CAS TA-54-001-TANL, Containment Tank and Steel Structure

  11. Streamlined approach for environmental restoration plan for corrective action unit 430, buried depleted uranium artillery round No. 1, Tonopah test range

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    This plan addresses actions necessary for the restoration and closure of Corrective Action Unit (CAU) No. 430, Buried Depleted Uranium (DU) Artillery Round No. 1 (Corrective Action Site No. TA-55-003-0960), a buried and unexploded W-79 Joint Test Assembly (JTA) artillery test projectile with high explosives (HE), at the U.S. Department of Energy, Nevada Operations Office (DOE/NV) Tonopah Test Range (TTR) in south-central Nevada. It describes activities that will occur at the site as well as the steps that will be taken to gather adequate data to obtain a notice of completion from Nevada Division of Environmental Protection (NDEP). This plan was prepared under the Streamlined Approach for Environmental Restoration (SAFER) concept, and it will be implemented in accordance with the Federal Facility Agreement and Consent Order (FFACO) and the Resource Conservation and Recovery Act (RCRA) Industrial Sites Quality Assurance Project Plan.

  12. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 113: Reactor Maintenance, Assembly, and Disassembly Building Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    Smith, J. L.

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the action necessary for the closure in place of Corrective Action Unit (CAU) 113 Area 25 Reactor Maintenance, Assembly, and Disassembly Facility (R-MAD). CAU 113 is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) (NDEP, 1996). The CAU is located in Area 25 of the Nevada Test Site (NTS) and consists of Corrective Action Site (CAS) 25-04-01, R-MAD Facility (Figures 1-2). This plan provides the methodology for closure in place of CAU 113. The site contains radiologically impacted and hazardous material. Based on preassessment field work, there is sufficient process knowledge to close in place CAU 113 using the SAFER process. At a future date when funding becomes available, the R-MAD Building (25-3110) will be demolished and inaccessible radiologic waste will be properly disposed in the Area 3 Radiological Waste Management Site (RWMS)

  13. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 113: Reactor Maintenance, Assembly, and Disassembly Building Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Smith

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the action necessary for the closure in place of Corrective Action Unit (CAU) 113 Area 25 Reactor Maintenance, Assembly, and Disassembly Facility (R-MAD). CAU 113 is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) (NDEP, 1996). The CAU is located in Area 25 of the Nevada Test Site (NTS) and consists of Corrective Action Site (CAS) 25-04-01, R-MAD Facility (Figures 1-2). This plan provides the methodology for closure in place of CAU 113. The site contains radiologically impacted and hazardous material. Based on preassessment field work, there is sufficient process knowledge to close in place CAU 113 using the SAFER process. At a future date when funding becomes available, the R-MAD Building (25-3110) will be demolished and inaccessible radiologic waste will be properly disposed in the Area 3 Radiological Waste Management Site (RWMS).

  14. Streamlined approach for environmental restoration plan for corrective action unit 430, buried depleted uranium artillery round No. 1, Tonopah test range

    International Nuclear Information System (INIS)

    1996-09-01

    This plan addresses actions necessary for the restoration and closure of Corrective Action Unit (CAU) No. 430, Buried Depleted Uranium (DU) Artillery Round No. 1 (Corrective Action Site No. TA-55-003-0960), a buried and unexploded W-79 Joint Test Assembly (JTA) artillery test projectile with high explosives (HE), at the U.S. Department of Energy, Nevada Operations Office (DOE/NV) Tonopah Test Range (TTR) in south-central Nevada. It describes activities that will occur at the site as well as the steps that will be taken to gather adequate data to obtain a notice of completion from Nevada Division of Environmental Protection (NDEP). This plan was prepared under the Streamlined Approach for Environmental Restoration (SAFER) concept, and it will be implemented in accordance with the Federal Facility Agreement and Consent Order (FFACO) and the Resource Conservation and Recovery Act (RCRA) Industrial Sites Quality Assurance Project Plan

  15. Modern approaches to agent-based complex automated negotiation

    CERN Document Server

    Bai, Quan; Ito, Takayuki; Zhang, Minjie; Ren, Fenghui; Aydoğan, Reyhan; Hadfi, Rafik

    2017-01-01

    This book addresses several important aspects of complex automated negotiations and introduces a number of modern approaches for facilitating agents to conduct complex negotiations. It demonstrates that autonomous negotiation is one of the most important areas in the field of autonomous agents and multi-agent systems. Further, it presents complex automated negotiation scenarios that involve negotiation encounters that may have, for instance, a large number of agents, a large number of issues with strong interdependencies and/or real-time constraints.

  16. Self-optimizing approach for automated laser resonator alignment

    Science.gov (United States)

    Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.

    2012-02-01

    Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. From a competition perspective, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition exerted mainly from Asia. From an economical point of view, an automated assembly of laser systems defines a better approach to produce more reliable units at lower cost. However, the step from today's manual solutions towards an automated assembly requires parallel developments regarding product design, automation equipment and assembly processes. This paper introduces briefly the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, which is one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for a robot-based precision assembly, as well as passive and active alignment methods, which are based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of the laser resonator assembly. These results as well as future development perspectives are discussed.
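
    As a concrete illustration of what an "active, self-optimizing" alignment step could look like, the following hedged sketch adjusts two mirror tilt axes and keeps each move only when the measured output power improves (a simple coordinate-ascent loop). The functions measure_power() and move_mirror() are hypothetical stand-ins for the actual instrumentation interface described in the paper.

```python
# Self-optimizing active alignment by coordinate ascent (illustrative sketch).
def self_optimizing_alignment(measure_power, move_mirror, step_urad=50.0,
                              min_step_urad=1.0, shrink=0.5):
    """Iteratively nudge the mirror about two axes, keeping moves that raise power."""
    best = measure_power()
    while step_urad >= min_step_urad:
        improved = False
        for axis in ("tip", "tilt"):
            for direction in (+1, -1):
                move_mirror(axis, direction * step_urad)
                power = measure_power()
                if power > best:                  # keep the move
                    best = power
                    improved = True
                else:                             # undo the move
                    move_mirror(axis, -direction * step_urad)
        if not improved:
            step_urad *= shrink                   # refine once no axis improves
    return best
```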

  17. System approach to automation and robotization of drivage

    Science.gov (United States)

    Zinov’ev, VV; Mayorov, AE; Starodubov, AN; Nikolaev, PI

    2018-03-01

    The authors consider a system approach to finding ways of unmanned drilling and blasting in the face area by means of automation and robotization of operations, with a view to reducing injuries in mines. The analysis is carried out in terms of the drilling and blasting technology applied in the Makarevskoe Coal Field, Kuznetsk Coal Basin. Within the system-functional approach and using the INDEFO procedure, the processes of drilling and blasthole charging are decomposed into related elementary operations. Automation and robotization methods that avoid the presence of miners at the face are identified for each operation.

  18. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation
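
    As an illustration of the final step described above, where candidate features are merged with a linear discriminant classifier and scored by ROC analysis, here is a hedged sketch using synthetic stand-ins for the morphological and gray-level features computed by the detection stage. The feature names, distributions, and sample sizes are invented for the example.

```python
# Linear discriminant classification of nodule candidates (synthetic features).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# columns: effective diameter (mm), sphericity, mean attenuation (HU)
benign = np.column_stack([rng.normal(6, 2, 40), rng.normal(0.6, 0.1, 40),
                          rng.normal(-650, 80, 40)])
malignant = np.column_stack([rng.normal(14, 4, 40), rng.normal(0.8, 0.1, 40),
                             rng.normal(-300, 120, 40)])
X = np.vstack([benign, malignant])
y = np.array([0] * 40 + [1] * 40)             # 0 = benign, 1 = malignant

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.decision_function(X)             # merged discriminant score per candidate
print("ROC AUC (training data, illustration only):", round(roc_auc_score(y, scores), 3))
```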

  19. Automate Your Physical Plant Using the Building Block Approach.

    Science.gov (United States)

    Michaelson, Matt

    1998-01-01

    Illustrates how Mount Saint Vincent University (Halifax), by upgrading the control and monitoring of one building or section of the school at a time, could produce savings in energy and operating costs and improve the environment. Explains a gradual, "building block" approach to facility automation that provides flexibility without a…

  20. A new systems engineering approach to streamlined science and mission operations for the Far Ultraviolet Spectroscopic Explorer (FUSE)

    Science.gov (United States)

    Butler, Madeline J.; Sonneborn, George; Perkins, Dorothy C.

    1994-01-01

    The Mission Operations and Data Systems Directorate (MO&DSD, Code 500), the Space Sciences Directorate (Code 600), and the Flight Projects Directorate (Code 400) have developed a new approach to combine the science and mission operations for the FUSE mission. FUSE, the last of the Delta-class Explorer missions, will obtain high resolution far ultraviolet spectra (910 - 1220 A) of stellar and extragalactic sources to study the evolution of galaxies and conditions in the early universe. FUSE will be launched in 2000 into a 24-hour highly eccentric orbit. Science operations will be conducted in real time for 16-18 hours per day, in a manner similar to the operations performed today for the International Ultraviolet Explorer. In a radical departure from previous missions, the operations concept combines spacecraft and science operations and data processing functions in a single facility to be housed in the Laboratory for Astronomy and Solar Physics (Code 680). A small missions operations team will provide the spacecraft control, telescope operations and data handling functions in a facility designated as the Science and Mission Operations Center (SMOC). This approach will utilize the Transportable Payload Operations Control Center (TPOCC) architecture for both spacecraft and instrument commanding. Other concepts of integrated operations being developed by the Code 500 Renaissance Project will also be employed for the FUSE SMOC. The primary objective of this approach is to reduce development and mission operations costs. The operations concept, integration of mission and science operations, and extensive use of existing hardware and software tools will decrease both development and operations costs extensively. This paper describes the FUSE operations concept, discusses the systems engineering approach used for its development, and the software, hardware and management tools that will make its implementation feasible.

  1. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 408: Bomblet Target Area Tonopah Test Range (TTR), Nevada, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Mark Krauss

    2010-03-01

    This Streamlined Approach for Environmental Restoration Plan addresses the actions needed to achieve closure of Corrective Action Unit (CAU) 408, Bomblet Target Area (TTR). Corrective Action Unit 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order. Corrective Action Unit 408 comprises Corrective Action Site TA-55-002-TAB2, Bomblet Target Areas. Clean closure of CAU 408 will be accomplished by removal of munitions and explosives of concern within seven target areas and potential disposal pits. The target areas were used to perform submunitions-related tests for the U.S. Department of Energy (DOE). The scope of CAU 408 is limited to submunitions released from DOE activities. However, it is recognized that other types of unexploded ordnance and munitions may be present within the target areas due to the activities of other government organizations. The CAU 408 closure activities consist of: • Clearing bomblet target areas within the study area. • Identifying and remediating disposal pits. • Collecting verification samples. • Performing radiological screening of soil. • Removing soil containing contaminants at concentrations above the action levels. Based on existing information, contaminants of potential concern at CAU 408 include unexploded submunitions, explosives, Resource Conservation and Recovery Act metals, and depleted uranium. Contaminants are not expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results.

  2. Streamlined approach for environmental restoration (SAFER) plan for corrective action unit 412: clean slate I plutonium dispersion (TTR) tonopah test range, Nevada, revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick K.

    2015-04-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 412. CAU 412 is located on the Tonopah Test Range and consists of a single corrective action site (CAS), TA-23-01CS, Pu Contaminated Soil. There is sufficient information and historical documentation from previous investigations and the 1997 interim corrective action to recommend closure of CAU 412 using the SAFER process. Based on existing data, the presumed corrective action for CAU 412 is clean closure. However, additional data will be obtained during a field investigation to document and verify the adequacy of existing information and determine whether the CAU 412 closure objectives have been achieved. This SAFER Plan provides the methodology to gather the necessary information for closing the CAU. The following summarizes the SAFER activities that will support the closure of CAU 412: • Collect environmental samples from designated target populations to confirm or disprove the presence of contaminants of concern (COCs) as necessary to supplement existing information. • If no COCs are present, establish clean closure as the corrective action. • If COCs are present, the extent of contamination will be defined and further corrective actions will be evaluated with the stakeholders (NDEP, USAF). • Confirm the preferred closure option is sufficient to protect human health and the environment.

  3. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 408: Bomblet Target Area Tonopah Test Range (TTR), Nevada, Revision 1

    International Nuclear Information System (INIS)

    Krauss, Mark

    2010-01-01

    This Streamlined Approach for Environmental Restoration Plan addresses the actions needed to achieve closure of Corrective Action Unit (CAU) 408, Bomblet Target Area (TTR). Corrective Action Unit 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order. Corrective Action Unit 408 comprises Corrective Action Site TA-55-002-TAB2, Bomblet Target Areas. Clean closure of CAU 408 will be accomplished by removal of munitions and explosives of concern within seven target areas and potential disposal pits. The target areas were used to perform submunitions-related tests for the U.S. Department of Energy (DOE). The scope of CAU 408 is limited to submunitions released from DOE activities. However, it is recognized that other types of unexploded ordnance and munitions may be present within the target areas due to the activities of other government organizations. The CAU 408 closure activities consist of: (1) Clearing bomblet target areas within the study area. (2) Identifying and remediating disposal pits. (3) Collecting verification samples. (4) Performing radiological screening of soil. (5) Removing soil containing contaminants at concentrations above the action levels. Based on existing information, contaminants of potential concern at CAU 408 include unexploded submunitions, explosives, Resource Conservation and Recovery Act metals, and depleted uranium. Contaminants are not expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results.

  4. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 415: Project 57 No. 1 Plutonium Dispersion (NTTR), Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick; Burmeister, Mark

    2014-04-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 415, Project 57 No. 1 Plutonium Dispersion (NTTR). CAU 415 is located on Range 4808A of the Nevada Test and Training Range (NTTR) and consists of one corrective action site: NAFR-23-02, Pu Contaminated Soil. The CAU 415 site consists of the atmospheric release of radiological contaminants to surface soil from the Project 57 safety experiment conducted in 1957. The safety experiment released plutonium (Pu), uranium (U), and americium (Am) to the surface soil over an area of approximately 1.9 square miles. This area is currently fenced and posted as a radiological contamination area. Vehicles and debris contaminated by the experiment were subsequently buried in a disposal trench within the surface-contaminated, fenced area and are assumed to have released radiological contamination to subsurface soils. Potential source materials in the form of pole-mounted electrical transformers were also identified at the site and will be removed as part of closure activities.

  5. Streamlined approach for environmental restoration work plan for Corrective Action Unit 126: Closure of aboveground storage tanks, Nevada Test Site, Nevada. Revision 1

    International Nuclear Information System (INIS)

    1998-07-01

    This plan addresses the closure of several aboveground storage tanks in Area 25 of the Nevada Test Site. The unit is currently identified as Corrective Action Unit 126 in the Federal Facility Agreement and Consent Order and is listed as having six Corrective Action Sites. This plan addresses the Streamlined Approach for Environmental Restoration closure for five of the six sites. Four of the CASs are located at the Engine Test Stand complex and one is located in the Central Support Area. The sites consist of aboveground tanks: two were used to store diesel fuel, and one stored Nalcool (an antifreeze mixture). The remaining tanks were used as part of a water demineralization process and stored either sulfuric acid or sodium hydroxide, and one was used as a charcoal adsorption furnace. Closure will be completed by removal of the associated piping, tank supports, and tanks using a front end loader, backhoe, and/or crane. When possible, the tanks will be salvaged as scrap metal. The piping that is not removed will be sealed using a cement grout.

  6. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 538: Spill Sites, Nevada Test Site, Nevada, Rev. No.: 0

    Energy Technology Data Exchange (ETDEWEB)

    Alfred Wickline

    2006-04-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions necessary for the closure of Corrective Action Unit (CAU) 538: Spill Sites, Nevada Test Site, Nevada. It has been developed in accordance with the ''Federal Facility Agreement and Consent Order'' (FFACO) (1996) that was agreed to by the State of Nevada, the U.S. Department of Energy (DOE), and the U.S. Department of Defense. A SAFER may be performed when the following criteria are met: (1) Conceptual corrective actions are clearly identified (although some degree of investigation may be necessary to select a specific corrective action before completion of the Corrective Action Investigation [CAI]). (2) Uncertainty of the nature, extent, and corrective action must be limited to an acceptable level of risk. (3) The SAFER Plan includes decision points and criteria for making data quality objective (DQO) decisions. The purpose of the investigation will be to document and verify the adequacy of existing information; to affirm the decision for either clean closure, closure in place, or no further action; and to provide sufficient data to implement the corrective action. The actual corrective action selected will be based on characterization activities implemented under this SAFER Plan. This SAFER Plan identifies decision points developed in cooperation with the Nevada Division of Environmental Protection (NDEP) and where DOE will reach consensus with NDEP before beginning the next phase of work.

  7. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 575: Area 15 Miscellaneous Sites, Nevada National Security Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States)

    2014-12-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 575, Area 15 Miscellaneous Sites, identified in the Federal Facility Agreement and Consent Order (FFACO). CAU 575 comprises the following four corrective action sites (CASs) located in Area 15 of the Nevada National Security Site: 15-19-02, Waste Burial Pit; 15-30-01, Surface Features at Borehole Sites; 15-64-01, Decontamination Area; and 15-99-03, Aggregate Plant. This plan provides the methodology for field activities needed to gather the necessary information for closing each CAS. There is sufficient information and process knowledge from historical documentation and investigations of similar sites regarding the expected nature and extent of potential contaminants to recommend closure of CAU 575 using the SAFER process. Additional information will be obtained by conducting a field investigation to document and verify the adequacy of existing information, to affirm the predicted corrective action decisions, and to provide sufficient data to implement the corrective actions. This will be presented in a closure report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval.

  8. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 425: Area 9 Main Lake Construction Debris Disposal Area, Tonopah Test Range, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    K. B. Campbell

    2002-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 425, Area 9 Main Lake Construction Debris Disposal Area. This CAU is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO, 1996). This site will be cleaned up under the SAFER process since the volume of waste exceeds the 23 cubic meters (m³) (30 cubic yards [yd³]) limit established for housekeeping sites. CAU 425 is located on the Tonopah Test Range (TTR) and consists of one Corrective Action Site (CAS), 09-08-001-TA09, Construction Debris Disposal Area (Figure 1). CAS 09-08-001-TA09 is an area that was used to collect debris from various projects in and around Area 9. The site is located approximately 81 meters (m) (265 feet [ft]) north of Edwards Freeway, northeast of Main Lake on the TTR. The site is composed of concrete slabs with metal infrastructure, metal rebar, wooden telephone poles, and concrete rubble from the Hard Target and early Tornado Rocket sled tests. Other items such as wood scraps, plastic pipes, soil, and miscellaneous nonhazardous items have also been identified in the debris pile. It is estimated that this site contains approximately 2280 m³ (3000 yd³) of construction-related debris.

  9. Fleet Sizing of Automated Material Handling Using Simulation Approach

    Science.gov (United States)

    Wibisono, Radinal; Ai, The Jin; Ratna Yuniartha, Deny

    2018-03-01

    Automated material handling tends to be chosen over human labor for material handling activities on the production floor of manufacturing companies. One critical issue in implementing automated material handling is the design phase, which must ensure that material handling becomes more efficient in terms of cost. Fleet sizing is one of the topics in this design phase. In this research, a simulation approach is used to solve the fleet sizing problem in flow shop production and to ensure an optimum situation, where an optimum situation means minimum flow time and maximum capacity on the production floor. A simulation approach is used because the flow shop can be modelled as a queuing network and the inter-arrival times do not follow an exponential distribution. The contribution of this research is therefore the solution of the multi-objective fleet sizing problem in flow shop production using a simulation approach with the ARENA software.
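    A minimal discrete-event sketch of the fleet-sizing experiment (the cited work uses ARENA; this stand-alone Python/SimPy version is only illustrative) simulates transport jobs with non-exponential inter-arrival times and reports mean flow time as a function of fleet size. All timing parameters are invented for the example.

```python
# Illustrative fleet-sizing experiment with a simple discrete-event model (not ARENA).
# Requires: pip install simpy
import random
import simpy

def run(fleet_size, n_jobs=500, seed=1):
    random.seed(seed)
    env = simpy.Environment()
    fleet = simpy.Resource(env, capacity=fleet_size)  # automated vehicles
    flow_times = []

    def job(env):
        arrived = env.now
        with fleet.request() as req:
            yield req                                    # wait for a free vehicle
            yield env.timeout(random.uniform(2.0, 6.0))  # transport time
        flow_times.append(env.now - arrived)

    def source(env):
        for _ in range(n_jobs):
            # Non-exponential inter-arrival times, as assumed in the abstract.
            yield env.timeout(random.triangular(1.0, 5.0, 2.0))
            env.process(job(env))

    env.process(source(env))
    env.run()
    return sum(flow_times) / len(flow_times)

for size in (1, 2, 3, 4):
    print(f"fleet size {size}: mean flow time {run(size):.2f}")
```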

  10. Exploration on Automated Software Requirement Document Readability Approaches

    OpenAIRE

    Chen, Mingda; He, Yao

    2017-01-01

    Context. The requirements analysis phase, as the very beginning of software development process, has been identified as a quite important phase in the software development lifecycle. Software Requirement Specification (SRS) is the output of requirements analysis phase, whose quality factors play an important role in the evaluation work. Readability is a quite important SRS quality factor, but there are few available automated approaches for readability measurement, because of the tight depend...

  11. Automation of seismic network signal interpretation: an artificial intelligence approach

    International Nuclear Information System (INIS)

    Chiaruttini, C.; Roberto, V.

    1988-01-01

    After discussing the current status of automation in signal interpretation from seismic networks, a new approach, based on artificial-intelligence techniques, is proposed. The knowledge of the human expert analyst is examined, with emphasis on its objects, strategies, and reasoning techniques. It is argued that knowledge-based systems (or expert systems) provide the most appropriate tools for designing an automatic system modelled on the expert's behaviour.

  12. Photonomics: automation approaches yield economic aikido for photonics device manufacture

    Science.gov (United States)

    Jordan, Scott

    2002-09-01

    In the glory days of photonics, with exponentiating demand for photonics devices came exponentiating competition, with new ventures commencing deliveries seemingly weekly. Suddenly the industry was faced with a commodity marketplace well before a commodity cost structure was in place. Economic issues like cost, scalability, and yield (call it all "Photonomics") now drive the industry. Automation and throughput-optimization are obvious answers, but until now, suitable modular tools had not been introduced. Available solutions were barely compatible with typical transverse alignment tolerances and could not automate angular alignments of collimated devices and arrays. And settling physics served as the insoluble bottleneck to throughput and resolution advancement in packaging, characterization and fabrication processes. The industry has addressed these needs in several ways, ranging from special configurations of catalog motion devices to integrated microrobots based on a novel mini-hexapod configuration. This intriguing approach allows tip/tilt alignments to be automated about any point in space, such as a beam waist, a focal point, the cleaved face of a fiber, or the optical axis of a waveguide, and is ideal for MEMS packaging automation and array alignment. Meanwhile, patented new low-cost settling-enhancement technology has been applied in applications ranging from air-bearing long-travel stages to subnanometer-resolution piezo positioners to advance resolution and process cycle-times in sensitive applications such as optical coupling characterization and fiber Bragg grating generation. Background, examples and metrics are discussed, providing an up-to-date industry overview of available solutions.

  13. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 411. Double Tracks Plutonium Dispersion (Nellis), Nevada Test and Training Range, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick K. [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States)

    2015-03-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 411, Double Tracks Plutonium Dispersion (Nellis). CAU 411 is located on the Nevada Test and Training Range and consists of a single corrective action site (CAS), NAFR-23-01, Pu Contaminated Soil. There is sufficient information and historical documentation from previous investigations and the 1996 interim corrective action to recommend closure of CAU 411 using the SAFER process. Based on existing data, the presumed corrective action for CAU 411 is clean closure. However, additional data will be obtained during a field investigation to document and verify the adequacy of existing information, and to determine whether the CAU 411 closure objectives have been achieved. This SAFER Plan provides the methodology to gather the necessary information for closing the CAU. The results of the field investigation will be presented in a closure report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval. The site will be investigated based on the data quality objectives (DQOs) developed on November 20, 2014, by representatives of NDEP, the U.S. Air Force (USAF), and the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office. The DQO process was used to identify and define the type, amount, and quality of data needed to determine whether CAU 411 closure objectives have been achieved. The following summarizes the SAFER activities that will support the closure of CAU 411: collect environmental samples from designated target populations to confirm or disprove the presence of contaminants of concern (COCs) as necessary to supplement existing information; if COCs are no longer present, establish clean closure as the corrective action; if COCs are present, the extent of contamination will be defined and further corrective actions

  14. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 326: Areas 6 and 27 Release Sites, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    A. T. Urbon

    2001-09-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 326, Areas 6 and 27 Release Sites. This CAU is currently listed in the January 2001 Appendix III of the Federal Facilities Agreement and Consent Order (FFACO) (FFACO, 1996). CAU 326 is located on the Nevada Test Site (NTS) and consists of the following four Corrective Action Sites (CASs) (Figure 1): CAS 06-25-01--Is a rupture in an underground pipe that carried heating oil (diesel) from the underground heating oil tank (Tank 6-CP-1) located to the west of Building CP-70 to the boiler in Building CP-1 in the Area 6 Control Point (CP) compound. CAS 06-25-02--A heating oil spill that is a result of overfilling an underground heating oil tank (Tank 6-DAF-5) located at the Area 6 Device Assembly Facility (DAF). CAS 06-25-04--A release of waste oil that occurred while removing used oil from Tank 6-619-4. Tank 6-619-4 is located northwest of Building 6-619 at the Area 6 Gas Station. CAS 27-25-01--Consists of an excavation that was created in an attempt to remove impacted stained soil from the Site Maintenance Yard in Area 27. Approximately 53.5 cubic meters (m³) (70 cubic yards [yd³]) of soil impacted by total petroleum hydrocarbons (TPH) and polychlorinated biphenyls (PCBs) was excavated before the excavation activities were halted. The excavation activities were stopped because the volume of impacted soil exceeded estimated quantities and budget.

  15. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 330: Areas 6, 22, and 23 Tanks and Spill Sites, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    T. M. Fitzmaurice

    2001-08-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 330, Areas 6, 22, and 23 Tanks and Spill Sites. This CAU is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO). This CAU is located at the Nevada Test Site (NTS) (Figure 1). CAU 330 consists of the following Corrective Action Sites (CASs): (1) CAS 06-02-04 - Consists of an underground tank and piping. This CAS is close to an area that was part of the Animal Investigation Program (AIP), conducted under the U.S. Public Health Service. Its purpose was to study and perform tests on the cattle and wild animals in and around the NTS that were exposed to radionuclides. It is unknown if this tank was part of these operations. (2) CAS 22-99-06 - Is a fuel spill that is believed to be a waste oil release which occurred when Camp Desert Rock was an active facility. This CAS was originally identified as being a small depression where liquids were poured onto the ground, located on the west side of Building T-1001. This building has been identified as housing a fire station, radio station, and radio net remote and telephone switchboard. (3) CAS 23-01-02 - Is a large aboveground storage tank (AST) farm that was constructed to provide gasoline and diesel storage in Area 23. The site consists of two ASTs, a concrete foundation, a surrounding earthen berm, associated piping, and unloading stations. (4) CAS 23-25-05 - Consists of an asphalt oil spill/tar release that contains a wash covered with asphalt oil/tar material, a half buried 208-liter (L) (55-gallon [gal]) drum, rebar, and concrete located in the vicinity.

  16. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 330: Areas 6, 22, and 23 Tanks and Spill Sites, Nevada Test Site, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    T. M. Fitzmaurice

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 330, Areas 6, 22, and 23 Tanks and Spill Sites. This CAU is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO). This CAU is located at the Nevada Test Site (NTS) (Figure 1). CAU 330 consists of the following Corrective Action Sites (CASs): (1) CAS 06-02-04 - Consists of an underground tank and piping. This CAS is close to an area that was part of the Animal Investigation Program (AIP), conducted under the U.S. Public Health Service. Its purpose was to study and perform tests on the cattle and wild animals in and around the NTS that were exposed to radionuclides. It is unknown if this tank was part of these operations. (2) CAS 22-99-06 - Is a fuel spill that is believed to be a waste oil release which occurred when Camp Desert Rock was an active facility. This CAS was originally identified as being a small depression where liquids were poured onto the ground, located on the west side of Building T-1001. This building has been identified as housing a fire station, radio station, and radio net remote and telephone switchboard. (3) CAS 23-01-02 - Is a large aboveground storage tank (AST) farm that was constructed to provide gasoline and diesel storage in Area 23. The site consists of two ASTs, a concrete foundation, a surrounding earthen berm, associated piping, and unloading stations. (4) CAS 23-25-05 - Consists of an asphalt oil spill/tar release that contains a wash covered with asphalt oil/tar material, a half buried 208-liter (L) (55-gallon [gal]) drum, rebar, and concrete located in the vicinity.

  17. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 326: Areas 6 and 27 Release Sites, Nevada Test Site, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    A. T. Urbon

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 326, Areas 6 and 27 Release Sites. This CAU is currently listed in the January 2001 Appendix III of the Federal Facilities Agreement and Consent Order (FFACO) (FFACO, 1996). CAU 326 is located on the Nevada Test Site (NTS) and consists of the following four Corrective Action Sites (CASs) (Figure 1): CAS 06-25-01-Is a rupture in an underground pipe that carried heating oil (diesel) from the underground heating oil tank (Tank 6-CP-1) located to the west of Building CP-70 to the boiler in Building CP-1 in the Area 6 Control Point (CP) compound. CAS 06-25-02-A heating oil spill that is a result of overfilling an underground heating oil tank (Tank 6-DAF-5) located at the Area 6 Device Assembly Facility (DAF). CAS 06-25-04-A release of waste oil that occurred while removing used oil from Tank 6-619-4. Tank 6-619-4 is located northwest of Building 6-619 at the Area 6 Gas Station. CAS 27-25-01-Consists of an excavation that was created in an attempt to remove impacted stained soil from the Site Maintenance Yard in Area 27. Approximately 53.5 cubic meters (m³) (70 cubic yards [yd³]) of soil impacted by total petroleum hydrocarbons (TPH) and polychlorinated biphenyls (PCBs) was excavated before the excavation activities were halted. The excavation activities were stopped because the volume of impacted soil exceeded estimated quantities and budget.

  18. Streamlining Compliance Validation Through Automation Processes

    Science.gov (United States)

    2014-03-01

    Of course, a common standard for DoD security personnel to write and share compliance validation content would prevent duplicate work and aid in ... process and consume much of the SCAP content available. Finally, it is free and easy to install as part of the Apache/MySQL/PHP (AMP) stack.

  19. An automated approach for annual layer counting in ice cores

    Science.gov (United States)

    Winstrup, M.; Svensson, A.; Rasmussen, S. O.; Winther, O.; Steig, E.; Axelrod, A.

    2012-04-01

    The temporal resolution of some ice cores is sufficient to preserve seasonal information in the ice core record. In such cases, annual layer counting represents one of the most accurate methods to produce a chronology for the core. Yet, manual layer counting is a tedious and sometimes ambiguous job. As reliable layer recognition becomes more difficult, a manual approach increasingly relies on human interpretation of the available data. Thus, much may be gained by an automated and therefore objective approach for annual layer identification in ice cores. We have developed a novel method for automated annual layer counting in ice cores, which relies on Bayesian statistics. It uses algorithms from the statistical framework of Hidden Markov Models (HMM), originally developed for use in machine speech recognition. The strength of this layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on purely objective criteria for annual layer identification. With this methodology, it is possible to determine the most likely position of multiple layer boundaries in an entire section of ice core data at once. It provides a probabilistic uncertainty estimate of the resulting layer count, hence ensuring a proper treatment of ambiguous layer boundaries in the data. Furthermore, multiple data series can be incorporated and used at once, allowing for a full multi-parameter annual layer counting method similar to a manual approach. In this study, the automated layer counting algorithm has been applied to data from the NGRIP ice core, Greenland. The NGRIP ice core has very high temporal resolution with depth, and hence the potential to be dated by annual layer counting far back in time. In previous studies [Andersen et al., 2006; Svensson et al., 2008], manual layer counting has been carried out back to 60 kyr BP. A comparison between the counted annual layers based on the two approaches will be presented.
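    To make the hidden Markov model idea concrete, the sketch below (purely illustrative and far simpler than the cited algorithm) uses a two-state "within layer"/"layer boundary" HMM with Viterbi decoding to place boundaries in a synthetic seasonal signal. All model parameters are invented for the example.

```python
# Illustrative two-state HMM for layer-boundary detection (much simpler than the cited method).
import numpy as np

def viterbi(log_emission, log_trans, log_start):
    """Most likely state sequence; state 0 = within layer, state 1 = layer boundary."""
    T, S = log_emission.shape
    dp = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    dp[0] = log_start + log_emission[0]
    for t in range(1, T):
        for s in range(S):
            cand = dp[t - 1] + log_trans[:, s]   # best predecessor for state s
            back[t, s] = np.argmax(cand)
            dp[t, s] = cand[back[t, s]] + log_emission[t, s]
    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]

# Synthetic seasonal signal: minima roughly mark annual boundaries.
rng = np.random.default_rng(0)
depth = np.arange(300)
signal = np.sin(2 * np.pi * depth / 30) + rng.normal(scale=0.3, size=depth.size)

# Boundaries are rare (transition prior) and favored where the signal is low (emission model).
log_trans = np.log(np.array([[0.97, 0.03], [0.90, 0.10]]))
log_start = np.log(np.array([0.95, 0.05]))
log_emission = np.column_stack([-0.5 * (signal - 1.0) ** 2, -0.5 * (signal + 1.0) ** 2])

states = viterbi(log_emission, log_trans, log_start)
print("estimated layer boundaries at indices:",
      [i for i in range(1, len(states)) if states[i] == 1 and states[i - 1] == 0])
```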

  20. Analysis Streamlining in ATLAS

    CERN Document Server

    Heinrich, Lukas; The ATLAS collaboration

    2018-01-01

    We present recent work within the ATLAS collaboration to centrally provide tools that facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...
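    A minimal sketch of the container-pipeline idea, reduced to its essentials: two analysis steps, each run in its own Linux container image and chained through a shared work directory. The image names, commands, and the use of the docker CLI are assumptions for illustration only; the actual tooling described above is yadage/REANA orchestrating jobs on Kubernetes.

```python
# Illustrative sketch only: chaining two containerized analysis steps, loosely in the
# spirit of the pipelines described above (not the actual yadage/REANA tooling).
# Image names and commands are hypothetical.
import subprocess

STEPS = [
    {"image": "example/atlas-skim:latest",
     "cmd": ["skim", "--input", "/data/input.root", "--output", "/data/skim.root"]},
    {"image": "example/atlas-fit:latest",
     "cmd": ["fit", "--input", "/data/skim.root", "--output", "/data/results.json"]},
]

def run_pipeline(workdir):
    for step in STEPS:
        subprocess.run(
            ["docker", "run", "--rm", "-v", f"{workdir}:/data", step["image"], *step["cmd"]],
            check=True,  # abort the pipeline if a step fails
        )

if __name__ == "__main__":
    run_pipeline("/tmp/analysis")
```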

  1. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 553: Areas 19, 20 Mud Pits and Cellars, Nevada Test Site, Nevada, Rev. No. 0

    International Nuclear Information System (INIS)

    Boehlecke, Robert F.

    2006-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions necessary for the closure of Corrective Action Unit (CAU) 553: Areas 19, 20 Mud Pits and Cellars, Nevada Test Site (NTS), Nevada. It has been developed in accordance with the ''Federal Facility Agreement and Consent Order'' (FFACO) (1996) that was agreed to by the State of Nevada, the U.S. Department of Energy (DOE), and the U.S. Department of Defense. A SAFER may be performed when the following criteria are met: (1) Conceptual corrective actions are clearly identified (although some degree of investigation may be necessary to select a specific corrective action before completion of the Corrective Action Investigation [CAI]); (2) Uncertainty of the nature, extent, and corrective action must be limited to an acceptable level of risk; (3) The SAFER Plan includes decision points and criteria for making data quality objective (DQO) decisions. The purpose of the investigation will be to document and verify the adequacy of existing information; to affirm the decision for clean closure, closure in place, or no further action; and to provide sufficient data to implement the corrective action. The actual corrective action selected will be based on characterization activities implemented under this SAFER Plan. This SAFER Plan identifies decision points developed in cooperation with the Nevada Division of Environmental Protection (NDEP), where the DOE, National Nuclear Security Administration Nevada Site Office (NNSA/NSO) will reach consensus with the NDEP before beginning the next phase of work. Corrective Action Unit 553 is located in Areas 19 and 20 of the NTS, approximately 65 miles (mi) northwest of Las Vegas, Nevada (Figure 1-1). Corrective Action Unit 553 comprises the four Corrective Action Sites (CASs) shown on Figure 1-1 and listed below: 19-99-01, Mud Spill; 19-99-11, Mud Spill; 20-09-09, Mud Spill; and 20-99-03, Mud Spill. There is sufficient information and process

  2. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.
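    As an illustration of the analysis style mentioned above, the sketch below fits quantile regressions of time to collision on an uncertainty-display indicator using statsmodels. The data are synthetic and the effect size is invented, so it only shows the mechanics, not the study's results.

```python
# Illustrative quantile regression on synthetic data (not the study's data):
# does showing an uncertainty symbol shift the quantiles of time to collision?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400
uncertainty_shown = rng.integers(0, 2, size=n)           # 0 = baseline, 1 = symbol shown
ttc = rng.gamma(shape=2.0, scale=1.5, size=n) + 0.8 * uncertainty_shown
data = pd.DataFrame({"ttc": ttc, "uncertainty_shown": uncertainty_shown})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("ttc ~ uncertainty_shown", data).fit(q=q)
    print(f"q={q}: estimated effect of uncertainty display = {fit.params['uncertainty_shown']:.2f}")
```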

  3. An automated approach to mapping corn from Landsat imagery

    Science.gov (United States)

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
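    One way to read the use of "readily available agricultural areal estimates" is as a calibration step: rank pixels by a corn-likeness score and choose thresholds so that the mapped area brackets the reported corn acreage. The sketch below illustrates that idea with synthetic scores and a hypothetical county estimate; it is not the authors' actual procedure.

```python
# Illustrative: label pixels as unlikely/likely/highly likely corn by choosing score
# thresholds so the mapped area brackets a county-level corn areal estimate.
import numpy as np

rng = np.random.default_rng(7)
score = rng.random(100_000)            # synthetic per-pixel corn-likeness score (0-1)
pixel_area_ha = 0.09                   # one 30 m Landsat pixel = 0.09 ha
reported_corn_ha = 3_000.0             # hypothetical county areal estimate

n_reported = int(reported_corn_ha / pixel_area_ha)
order = np.argsort(score)[::-1]        # pixels ranked from most to least corn-like

labels = np.zeros(score.size, dtype=int)                          # 0 = unlikely corn
labels[order[: int(0.8 * n_reported)]] = 2                        # 2 = highly likely corn
labels[order[int(0.8 * n_reported): int(1.2 * n_reported)]] = 1   # 1 = likely corn (buffer)

for value, name in [(2, "highly likely"), (1, "likely"), (0, "unlikely")]:
    print(f"{name:>13} corn: {(labels == value).sum() * pixel_area_ha:,.0f} ha")
```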

  4. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 356: Mud Pits and Disposal Sites, Nevada Test Site, Nevada (Revision No. 0, August 2001); FINAL

    International Nuclear Information System (INIS)

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions necessary for the characterization and closure of Corrective Action Unit (CAU) 356, Mud Pits and Disposal Sites, as identified in the Federal Facility Agreement and Consent Order (FFACO). The CAU, located on the Nevada Test Site in Nevada, consists of seven Corrective Action Sites (CASs): CAS 03-04-01, Area 3 Change House Septic System; CAS 03-09-01, Mud Pit Spill Over; CAS 03-09-03, Mud Pit; CAS 03-09-04, Mud Pit; CAS 03-09-05, Mud Pit; CAS 20-16-01, Landfill; CAS 20-22-21, Drums. Sufficient information and process knowledge from historical documentation and investigations are the basis for the development of the phased approach chosen to address the data collection activities prior to implementing the preferred closure alternative for each CAS. The Phase I investigation will determine through collection of environmental samples from targeted populations (i.e., mud/soil cuttings above textural discontinuity) if contaminants of potential concern (COPCs) are present in concentrations exceeding preliminary action levels (PALs) at each of the CASs. If COPCs are present above PALs, a Phase II investigation will be implemented to determine the extent of contamination to support the appropriate corrective action alternative to complete closure of the site. Groundwater impacts from potentially migrating contaminants are not expected due to the depths to groundwater and limiting hydrologic drivers of low precipitation and high evaporation rates. Future land-use scenarios limit future uses to industrial activities; therefore, future residential uses are not considered. Potential exposure routes to site workers from contaminants of concern in septage and soils include oral ingestion, inhalation, or dermal contact (absorption) through inadvertent disturbance of contaminated structures and/or soils. Diesel within drilling muds is expected to be the primary COPC based on process

  5. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 539: Area 25 and Area 26 Railroad Tracks, Nevada Test Site, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Mark Krauss

    2010-06-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 539, Areas 25 and 26 Railroad Tracks, as identified in the Federal Facility Agreement and Consent Order (FFACO). A modification to the FFACO was approved in May 2010 to transfer the two Railroad Tracks corrective action sites (CASs) from CAU 114 into CAU 539. The two CASs are located in Areas 25 and 26 of the Nevada Test Site: • 25-99-21, Area 25 Railroad Tracks • 26-99-05, Area 26 Railroad Tracks. This plan provides the methodology for field activities needed to gather the necessary information for closing the two CASs. There is sufficient information and process knowledge from historical documentation and investigations of similar sites regarding the expected nature and extent of potential contaminants to recommend closure of the CAU 539 Railroad Tracks CASs using the SAFER process. Additional information will be obtained by conducting a field investigation before selecting the appropriate corrective action for each CAS. The results of the field investigation should support a defensible recommendation that no further corrective action is necessary. If it is determined that complete clean closure cannot be accomplished during the SAFER, then a hold point will have been reached and the Nevada Division of Environmental Protection (NDEP) will be consulted to determine whether the remaining contamination will be closed under the alternative corrective action of closure in place with use restrictions. This will be presented in a closure report that will be prepared and submitted to the NDEP for review and approval. The sites will be investigated based on the data quality objectives (DQOs) developed on December 14, 2009, by representatives of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office; Navarro Nevada Environmental Services, LLC (NNES); and National Security Technologies

  6. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 465: Hydronuclear Nevada National Security Site, Nevada, with ROTC 1, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Patrick Matthews

    2011-11-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 465, Hydronuclear, identified in the Federal Facility Agreement and Consent Order (FFACO). Corrective Action Unit 465 comprises the following four corrective action sites (CASs) located in Areas 6 and 27 of the Nevada National Security Site: (1) 00-23-01, Hydronuclear Experiment; (2) 00-23-02, Hydronuclear Experiment; (3) 00-23-03, Hydronuclear Experiment; (4) 06-99-01, Hydronuclear. The sites will be investigated based on the data quality objectives (DQOs) developed on July 6, 2011, by representatives of the Nevada Division of Environmental Protection (NDEP) and the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office. The DQO process was used to identify and define the type, amount, and quality of data needed to determine and implement appropriate corrective actions for each CAS in CAU 465. For CAU 465, two potential release components have been identified. The subsurface release component includes potential releases of radiological and nonradiological contaminants from the subsurface hydronuclear experiments and disposal boreholes. The surface release component consists of other potential releases of radiological and nonradiological contaminants to surface soils that may have occurred during the pre- and post-test activities. This plan provides the methodology for collection of the necessary information for closing each CAS component. There is sufficient information and process knowledge from historical documentation, contaminant characteristics, existing regional and site groundwater models, and investigations of similar sites regarding the expected nature and extent of potential contaminants to recommend closure of CAU 465 using the SAFER process. For potential subsurface releases, flow and transport models will be developed to integrate existing data into a conservative

  7. Streamlined Approach for Environmental Restoration Work Plan for Corrective Action Unit 461: Joint Test Assembly Sites and Corrective Action Unit 495: Unconfirmed Joint Test Assembly Sites Tonopah Test Range, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Jeff Smith

    1998-08-01

    This Streamlined Approach for Environmental Restoration plan addresses the action necessary for the clean closure of Corrective Action Unit 461 (Test Area Joint Test Assembly Sites) and Corrective Action Unit 495 (Unconfirmed Joint Test Assembly Sites). The Corrective Action Units are located at the Tonopah Test Range in south central Nevada. Closure for these sites will be completed by excavating and evaluating the condition of each artillery round (if found); detonating the rounds (if necessary); excavating the impacted soil and debris; collecting verification samples; backfilling the excavations; disposing of the impacted soil and debris at an approved low-level waste repository at the Nevada Test Site

  8. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 116: Area 25 Test Cell C Facility, Nevada Test Site, Nevada, Revision 1

    International Nuclear Information System (INIS)

    2008-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan identifies the activities required for the closure of Corrective Action Unit (CAU) 116, Area 25 Test Cell C Facility. The Test Cell C (TCC) Facility is located in Area 25 of the Nevada Test Site (NTS) approximately 25 miles northwest of Mercury, Nevada (Figure 1). CAU 116 is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) of 1996 (as amended February 2008) and consists of two Corrective Action Sites (CASs): (1) CAS 25-23-20, Nuclear Furnace Piping; and (2) CAS 25-41-05, Test Cell C Facility. CAS 25-41-05 is described in the FFACO as the TCC Facility but actually includes Building 3210 and attached concrete shield wall only. CAU 116 will be closed by demolishing Building 3210, the attached concrete shield wall, and the nuclear furnace piping. In addition, as a best management practice (BMP), Building 3211 (moveable shed) will be demolished due to its close proximity to Building 3210. This will aid in demolition and disposal operations. Radiological surveys will be performed on the demolition debris to determine the proper disposal pathway. As much of the demolition debris as space allows will be placed into the Building 3210 basement structure. After filling to capacity with demolition debris, the basement structure will be mounded or capped and closed with administrative controls. Prior to beginning demolition activities and according to an approved Sampling and Analysis Plan (SAP), representative sampling of surface areas that are known, suspected, or have the potential to contain hazardous constituents such as lead or polychlorinated biphenyls (PCBs) will be performed throughout all buildings and structures. Sections 2.3.2, 4.2.2.2, 4.2.2.3, 4.3, and 6.2.6.1 address the methodologies employed that assure the solid debris placed in the basement structure will not contain contaminants of concern (COCs) above hazardous waste levels. The anticipated post

  9. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of which are proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model-based technology to achieve rich functionality and usability was implemented. (orig.)
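    A toy message-passing model in the spirit described above might wire components such as switches and lamps through a shared message bus. The sketch is purely illustrative; the class names and topics are invented and do not correspond to the cited system.

```python
# Toy message-passing home-automation model (illustrative only).
class Bus:
    """Minimal publish/subscribe message bus connecting components."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers.get(topic, []):
            handler(message)

class Switch:
    def __init__(self, bus, topic):
        self.bus, self.topic = bus, topic

    def press(self, on):
        self.bus.publish(self.topic, {"on": on})   # model a wall-switch event

class Lamp:
    def __init__(self, bus, topic, name):
        self.name = name
        bus.subscribe(topic, self.on_message)

    def on_message(self, message):
        print(f"{self.name}: {'on' if message['on'] else 'off'}")

bus = Bus()
Lamp(bus, "livingroom/light", "ceiling lamp")
Switch(bus, "livingroom/light").press(True)
```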

  10. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  11. Streamlined bioreactor-based production of human cartilage tissues.

    Science.gov (United States)

    Tonnarelli, B; Santoro, R; Adelaide Asnaghi, M; Wendt, D

    2016-05-27

    Engineered tissue grafts have been manufactured using methods based predominantly on traditional labour-intensive manual benchtop techniques. These methods impart significant regulatory and economic challenges, hindering the successful translation of engineered tissue products to the clinic. Alternatively, bioreactor-based production systems have the potential to overcome such limitations. In this work, we present an innovative manufacturing approach to engineer cartilage tissue within a single bioreactor system, starting from freshly isolated human primary chondrocytes, through the generation of cartilaginous tissue grafts. The limited number of primary chondrocytes that can be isolated from a small clinically-sized cartilage biopsy could be seeded and extensively expanded directly within a 3D scaffold in our perfusion bioreactor (5.4 ± 0.9 doublings in 2 weeks), bypassing conventional 2D expansion in flasks. Chondrocytes expanded in 3D scaffolds better maintained a chondrogenic phenotype than chondrocytes expanded on plastic flasks (collagen type II mRNA, 18-fold; Sox-9, 11-fold). After this "3D expansion" phase, bioreactor culture conditions were changed to subsequently support chondrogenic differentiation for two weeks. Engineered tissues based on 3D-expanded chondrocytes were more cartilaginous than tissues generated from chondrocytes previously expanded in flasks. We then demonstrated that this streamlined bioreactor-based process could be adapted to effectively generate up-scaled cartilage grafts in a size with clinical relevance (50 mm diameter). Streamlined and robust tissue engineering processes, as the one described here, may be key for the future manufacturing of grafts for clinical applications, as they facilitate the establishment of compact and closed bioreactor-based production systems, with minimal automation requirements, lower operating costs, and increased compliance to regulatory guidelines.

  12. An agent-oriented approach to automated mission operations

    Science.gov (United States)

    Truszkowski, Walt; Odubiyi, Jide

    1994-01-01

    As we plan for the next generation of Mission Operations Control Center (MOCC) systems, there are many opportunities for the increased utilization of innovative knowledge-based technologies. The innovative technology discussed is an advanced use of agent-oriented approaches to the automation of mission operations. The paper presents an overview of this technology and discusses applied operational scenarios currently being investigated and prototyped. A major focus of the current work is the development of a simple user mechanism that would empower operations staff members to create, in real time, software agents to assist them in common, labor intensive operations tasks. These operational tasks would include: handling routine data and information management functions; amplifying the capabilities of a spacecraft analyst/operator to rapidly identify, analyze, and correct spacecraft anomalies by correlating complex data/information sets and filtering error messages; improving routine monitoring and trend analysis by detecting common failure signatures; and serving as a sentinel for spacecraft changes during critical maneuvers enhancing the system's capabilities to support nonroutine operational conditions with minimum additional staff. An agent-based testbed is under development. This testbed will allow us to: (1) more clearly understand the intricacies of applying agent-based technology in support of the advanced automation of mission operations and (2) access the full set of benefits that can be realized by the proper application of agent-oriented technology in a mission operations environment. The testbed under development addresses some of the data management and report generation functions for the Explorer Platform (EP)/Extreme UltraViolet Explorer (EUVE) Flight Operations Team (FOT). We present an overview of agent-oriented technology and a detailed report on the operation's concept for the testbed.
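    The kind of message-filtering agent described above can be pictured as a small rule-based filter over a telemetry stream. The sketch below is a hypothetical illustration; the message types, thresholds, and failure signature are invented and are not taken from the EP/EUVE testbed.

```python
# Illustrative sketch of a simple telemetry-monitoring "agent": it suppresses routine
# messages and escalates only anomalous ones. All formats and thresholds are invented.
ROUTINE = {"HEARTBEAT", "HOUSEKEEPING"}

def monitor_agent(messages, battery_limit=24.0):
    """Yield only the messages an operator should see."""
    for msg in messages:
        if msg["type"] in ROUTINE:
            continue                                   # filter routine traffic
        if msg["type"] == "BATTERY" and msg["volts"] < battery_limit:
            yield {"alert": "low bus voltage", **msg}  # escalate a failure signature
        elif msg["type"] == "ERROR":
            yield msg

telemetry = [
    {"type": "HEARTBEAT"},
    {"type": "BATTERY", "volts": 23.1},
    {"type": "ERROR", "code": "GYRO_SAT"},
]
for alert in monitor_agent(telemetry):
    print(alert)
```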

  13. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 544: Cellars, Mud Pits, and Oil Spills, Nevada Test Site, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Mark Krauss

    2010-07-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 544, Cellars, Mud Pits, and Oil Spills, identified in the Federal Facility Agreement and Consent Order (FFACO). Corrective Action Unit 544 comprises the following 20 corrective action sites (CASs) located in Areas 2, 7, 9, 10, 12, 19, and 20 of the Nevada Test Site (NTS): • 02-37-08, Cellar & Mud Pit • 02-37-09, Cellar & Mud Pit • 07-09-01, Mud Pit • 09-09-46, U-9itsx20 PS #1A Mud Pit • 10-09-01, Mud Pit • 12-09-03, Mud Pit • 19-09-01, Mud Pits (2) • 19-09-03, Mud Pit • 19-09-04, Mud Pit • 19-25-01, Oil Spill • 19-99-06, Waste Spill • 20-09-01, Mud Pits (2) • 20-09-02, Mud Pit • 20-09-03, Mud Pit • 20-09-04, Mud Pits (2) • 20-09-06, Mud Pit • 20-09-07, Mud Pit • 20-09-10, Mud Pit • 20-25-04, Oil Spills • 20-25-05, Oil Spills This plan provides the methodology for field activities needed to gather the necessary information for closing each CAS. There is sufficient information and process knowledge from historical documentation and investigations of similar sites regarding the expected nature and extent of potential contaminants to recommend closure of CAU 544 using the SAFER process. Using the approach approved for previous mud pit investigations (CAUs 530–535), 14 mud pits have been identified that • are either a single mud pit or a system of mud pits, • are not located in a radiologically posted area, and • have no evident biasing factors based on visual inspections. These 14 mud pits are recommended for no further action (NFA), and further field investigations will not be conducted. For the sites that do not meet the previously approved closure criteria, additional information will be obtained by conducting a field investigation before selecting the appropriate corrective action for each CAS. The results of the field investigation will support a defensible

  14. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  15. Automated mango fruit assessment using fuzzy logic approach

    Science.gov (United States)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit product next to pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders. This method is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making scheme based on a minimum entropy formulation is developed to analyse the data and classify the mango fruit. The proposed method is capable of differentiating three different grades of mango fruit automatically with an overall accuracy of 77.78% compared to human graders' sorting. This method was found to be helpful for application in the current agricultural industry.
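    A grading scheme of this kind can be illustrated with simple fuzzy membership functions over a sensor-derived feature. The Python sketch below is a minimal illustration only; the feature (mean reflectance) and the membership breakpoints are assumptions for demonstration, not the calibration reported in the paper:

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b with support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade_mango(rgb_readings):
    """Assign one of three grades from RGB fibre-optic sensor readings.
    The feature and membership breakpoints are illustrative assumptions."""
    readings = np.asarray(rgb_readings, dtype=float)
    feature = readings.mean()  # e.g., mean of the min/max/mean channel statistics
    memberships = {
        "Grade A": triangular(feature, 150.0, 200.0, 255.0),
        "Grade B": triangular(feature, 80.0, 130.0, 180.0),
        "Grade C": triangular(feature, 0.0, 50.0, 100.0),
    }
    best = max(memberships, key=memberships.get)
    return best, memberships

grade, mu = grade_mango([180, 190, 210])
print(grade, {k: round(v, 2) for k, v in mu.items()})
```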

  16. Automated approach to nuclear facility safeguards effectiveness evaluation

    International Nuclear Information System (INIS)

    1977-01-01

    Concern over the security of nuclear facilities has generated a need for a reliable, time efficient, and easily applied method of evaluating the effectiveness of safeguards systems. Such an evaluation technique could be used (1) by the Nuclear Regulatory Commission to evaluate a licensee's proposal, (2) to assess the security status of a system, or (3) to design and/or upgrade nuclear facilities. The technique should be capable of starting with basic information, such as the facility layout and performance parameters for physical protection components, and analyzing that information so that a reliable overall facility evaluation is obtained. Responding to this expressed need, an automated approach to facility safeguards effectiveness evaluation has been developed. This procedure consists of a collection of functional modules for facility characterization, critical path generation, and path evaluation combined into a continuous stream of operations. The technique has been implemented on an interactive computer-timesharing system and makes use of computer graphics for the handling and presentation of information. Using this technique a thorough facility evaluation can be made by systematically varying parameters that characterize the physical protection components of a facility according to changes in perceived adversary attributes and strategy, environmental conditions, and site status

  17. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  18. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However...... (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges....

  19. Automated approach to detecting behavioral states using EEG-DABS

    Directory of Open Access Journals (Sweden)

    Zachary B. Loris

    2017-07-01

    Full Text Available Electrocorticographic (ECoG) signals represent cortical electrical dipoles generated by synchronous local field potentials that result from simultaneous firing of neurons at distinct frequencies (brain waves). Since different brain waves correlate with different behavioral states, ECoG signals present a novel strategy to detect complex behaviors. We developed a program, EEG Detection Analysis for Behavioral States (EEG-DABS), that advances Fast Fourier Transforms through the ECoG signal time series, separating it into (user-defined) frequency bands and normalizing them to reduce variability. EEG-DABS identifies events if segments of an experimental ECoG record have significantly different power bands than a selected control pattern of EEG. Events are identified at every epoch and frequency band and are then displayed as output graphs by the program. Certain patterns of events correspond to specific behaviors. Once a predetermined pattern was selected for a behavioral state, EEG-DABS correctly identified the desired behavioral event. The selection of frequency band combinations for detection of the behavior affects the accuracy of the method. All instances of certain behaviors, such as freezing, were correctly identified from the event patterns generated with EEG-DABS. Detecting behaviors is typically achieved by visually discerning unique animal phenotypes, a process that is time consuming, unreliable, and subjective. EEG-DABS removes variability by using defined parameters of EEG/ECoG for a desired behavior over chronic recordings. EEG-DABS presents a simple and automated approach to quantify different behavioral states from ECoG signals.
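    The core operation (comparing per-epoch frequency-band power against a control recording) can be sketched in a few lines of Python. The sampling rate, band, epoch length and z-score criterion below are illustrative assumptions, not EEG-DABS defaults:

```python
import numpy as np

def band_power(signal, fs, band):
    """Power of `signal` within a frequency band (Hz) from an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return power[mask].sum()

def detect_events(ecog, control, fs=500, epoch_s=1.0, band=(4, 8), z_thresh=2.0):
    """Flag epochs whose band power deviates from the control recording by more
    than z_thresh standard deviations (illustrative criterion)."""
    n = int(epoch_s * fs)
    ctrl = np.array([band_power(control[i:i + n], fs, band)
                     for i in range(0, len(control) - n + 1, n)])
    mu, sigma = ctrl.mean(), ctrl.std() + 1e-12
    events = []
    for i in range(0, len(ecog) - n + 1, n):
        z = (band_power(ecog[i:i + n], fs, band) - mu) / sigma
        if abs(z) > z_thresh:
            events.append((i / fs, round(z, 1)))  # (epoch start time in s, z-score)
    return events

rng = np.random.default_rng(0)
control = rng.normal(size=5000)
test = rng.normal(size=5000)
t = np.arange(2500, 3000) / 500.0
test[2500:3000] += 3 * np.sin(2 * np.pi * 6 * t)  # inject a theta-band burst
print(detect_events(test, control))
```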

  20. An Automated Approach to Reasoning Under Multiple Perspectives

    Science.gov (United States)

    deBessonet, Cary

    2004-01-01

    This is the final report, with emphasis on research during the last term. The context for the research has been the development of an automated reasoning technology for use in SMS (Symbolic Manipulation System), a system used to build and query knowledge bases (KBs) using a special knowledge representation language SL (Symbolic Language). SMS interprets assertive SL input and enters the results as components of its universe. The system operates in two basic modes: 1) constructive mode (for building KBs); and 2) query/search mode (for querying KBs). Query satisfaction consists of matching query components with KB components. The system allows "penumbral matches," that is, matches that do not exactly meet the specifications of the query, but which are deemed relevant for the conversational context. If the user wants to know whether SMS has information that holds, say, for "any chow," the scope of relevancy might be set so that the system would respond based on a finding that it has information that holds for "most dogs," although this is not exactly what was called for by the query. The response would be qualified accordingly, as would normally be the case in ordinary human conversation. The general goal of the research was to develop an approach by which assertive content could be interpreted from multiple perspectives so that reasoning operations could be successfully conducted over the results. The interpretation of an SL statement such as, "{person believes [captain (asserted (perhaps)) (astronaut saw (comet (bright)))]}," which in English would amount to asserting something to the effect that, "Some person believes that a captain perhaps asserted that an astronaut saw a bright comet," would require the recognition of multiple perspectives, including some that are: a) epistemically-based (focusing on "believes"); b) assertion-based (focusing on "asserted"); c) perception-based (focusing on "saw"); d) adjectivally-based (focusing on "bright"); and e) modally

  1. The Automated Aircraft Rework System (AARS): A system integration approach

    Science.gov (United States)

    Benoit, Michael J.

    1994-01-01

    The Mercer Engineering Research Center (MERC), under contract to the United States Air Force (USAF) since 1989, has been actively involved in providing the Warner Robins Air Logistics Center (WR-ALC) with a robotic workcell designed to perform automated defastening and hole location/transfer rework operations on F-15 wings. This paper describes the activities required to develop and implement this workcell, known as the Automated Aircraft Rework System (AARS). AARS is scheduled to be completely installed and in operation at WR-ALC by September 1994.

  2. An automated approach for annual layer counting in ice cores

    DEFF Research Database (Denmark)

    Winstrup, Mai; Svensson, A. M.; Rasmussen, S. O.

    2012-01-01

    A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of Hidden Markov Models (HMMs), which originally was developed for use in machine speech-recognition. The strength of the layer...

  3. An automated approach for annual layer counting in ice cores

    DEFF Research Database (Denmark)

    Winstrup, Mai; Svensson, A. M.; Rasmussen, S. O.

    2012-01-01

    A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of hidden Markov models (HMMs), which originally was developed for use in machine speech recognition. The strength of the layer...

  4. An automated approach to the design of decision tree classifiers

    Science.gov (United States)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
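    The by-product mentioned above, the global probability of correct classification under statistical independence of the node decision rules, is easy to illustrate. The Python sketch below uses made-up per-node accuracies and class priors for a small three-class tree; it is not the Landsat design itself:

```python
import numpy as np

# Illustrative two-level decision tree: node1 separates class C1 from {C2, C3},
# node2 separates C2 from C3.  Each entry is the probability that the node makes
# the correct decision for samples of the given class (assumed values).
node_accuracy = {
    "node1": {"C1": 0.97, "C2": 0.95, "C3": 0.94},
    "node2": {"C2": 0.92, "C3": 0.90},
}
# Nodes visited on the path from the root to each class decision.
path = {"C1": ["node1"], "C2": ["node1", "node2"], "C3": ["node1", "node2"]}
priors = {"C1": 0.5, "C2": 0.3, "C3": 0.2}

# Under the independence assumption, the probability of correctly classifying a
# sample of a class is the product of per-node accuracies along its path; the
# global accuracy is the prior-weighted sum over classes.
global_pcc = sum(
    priors[c] * np.prod([node_accuracy[n][c] for n in path[c]]) for c in priors
)
print(f"Global probability of correct classification: {global_pcc:.3f}")
```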

  5. Streamline-based microfluidic device

    Science.gov (United States)

    Tai, Yu-Chong (Inventor); Zheng, Siyang (Inventor); Kasdan, Harvey (Inventor)

    2013-01-01

    The present invention provides a streamline-based device and a method for using the device for continuous separation of particles including cells in biological fluids. The device includes a main microchannel and an array of side microchannels disposed on a substrate. The main microchannel has a plurality of stagnation points with a predetermined geometric design, for example, each of the stagnation points has a predetermined distance from the upstream edge of each of the side microchannels. The particles are separated and collected in the side microchannels.

  6. Automated Generation of OCL Constraints: NL based Approach vs Pattern Based Approach

    Directory of Open Access Journals (Sweden)

    IMRAN SARWAR BAJWA

    2017-04-01

    Full Text Available This paper presents an approach for the automated generation of software constraints. In this model, an SBVR (Semantics of Business Vocabulary and Rules) based semi-formal representation is obtained from the syntactic and semantic analysis of an NL (Natural Language) sentence, such as English. An SBVR representation is easy to translate to other formal languages, as SBVR is based on higher-order logic like other formal languages such as OCL (Object Constraint Language). The proposed model provides a systematic and powerful means of incorporating NL knowledge into formal languages. A prototype was constructed in Java (an Eclipse plug-in) as a proof of concept. The performance was tested on a few sample texts taken from existing research thesis reports and books.

  7. An automated approach for annual layer counting in ice cores

    Directory of Open Access Journals (Sweden)

    M. Winstrup

    2012-11-01

    Full Text Available A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of hidden Markov models (HMMs), which originally was developed for use in machine speech recognition. The strength of the layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on statistical criteria for annual layer identification. The most likely positions of multiple layer boundaries in a section of ice core data are determined simultaneously, and a probabilistic uncertainty estimate of the resulting layer count is provided, ensuring an objective treatment of ambiguous layers in the data. Furthermore, multiple data series can be incorporated and used simultaneously. In this study, the automated layer counting algorithm has been applied to two ice core records from Greenland: one displaying a distinct annual signal and one which is more challenging. The algorithm shows high skill in reproducing the results from manual layer counts, and the resulting timescale compares well to absolute-dated volcanic marker horizons where these exist.
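    The flavour of HMM-based layer detection can be conveyed with a toy example: a two-state (summer/winter) model decoded with the Viterbi algorithm, where annual layers are counted as returns to the summer state. This Python sketch is a didactic simplification with assumed transition probabilities and Gaussian emissions; the published algorithm instead infers layer boundaries and their uncertainties directly:

```python
import numpy as np

def viterbi(obs, states, start_p, trans_p, emit_logpdf):
    """Most likely hidden-state path for a 1-D observation series (log domain)."""
    n, k = len(obs), len(states)
    logv = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    logv[0] = np.log(start_p) + np.array([emit_logpdf(obs[0], s) for s in states])
    for t in range(1, n):
        for j in range(k):
            scores = logv[t - 1] + np.log(trans_p[:, j])
            back[t, j] = int(np.argmax(scores))
            logv[t, j] = scores[back[t, j]] + emit_logpdf(obs[t], states[j])
    path = [int(np.argmax(logv[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]

def emit_logpdf(x, state):
    """Gaussian emission (unit variance, constants dropped): summer high, winter low."""
    mu = 1.0 if state == "summer" else -1.0
    return -0.5 * (x - mu) ** 2

rng = np.random.default_rng(2)
years = 5
signal = np.concatenate([np.r_[rng.normal(1.0, 0.3, 6), rng.normal(-1.0, 0.3, 6)]
                         for _ in range(years)])
states = ["summer", "winter"]
trans = np.array([[0.8, 0.2],    # P(next state | summer)
                  [0.2, 0.8]])   # P(next state | winter)
path = viterbi(signal, states, start_p=np.array([0.5, 0.5]),
               trans_p=trans, emit_logpdf=emit_logpdf)
# Count annual layers as re-entries into the summer state, plus the first summer.
layers = 1 + sum(1 for a, b in zip(path, path[1:])
                 if states[a] == "winter" and states[b] == "summer")
print("annual layers counted:", layers)
```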

  8. A high-resolution peak fractionation approach for streamlined screening of nuclear-factor-E2-related factor-2 activators in Salvia miltiorrhiza.

    Science.gov (United States)

    Zhang, Hui; Luo, Li-Ping; Song, Hui-Peng; Hao, Hai-Ping; Zhou, Ping; Qi, Lian-Wen; Li, Ping; Chen, Jun

    2014-01-24

    Generation of a high-purity fraction library for efficiently screening active compounds from natural products is challenging because of their chemical diversity and complex matrices. In this work, a strategy combining high-resolution peak fractionation (HRPF) with a cell-based assay was proposed for target screening of bioactive constituents from natural products. In this approach, peak fractionation was conducted under chromatographic conditions optimized for high-resolution separation of the natural product extract. The HRPF approach was automatically performed according to the predefinition of certain peaks based on their retention times from a reference chromatographic profile. The corresponding HRPF database was collected with a parallel mass spectrometer to ensure purity and characterize the structures of compounds in the various fractions. Using this approach, a set of 75 peak fractions on the microgram scale was generated from 4mg of the extract of Salvia miltiorrhiza. After screening by an ARE-luciferase reporter gene assay, 20 diterpene quinones were selected and identified, and 16 of these compounds were reported to possess novel Nrf2 activation activity. Compared with conventional fixed-time interval fractionation, the HRPF approach could significantly improve the efficiency of bioactive compound discovery and facilitate the uncovering of minor active components. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  10. Smartnotebook: A semi-automated approach to protein sequential NMR resonance assignments

    International Nuclear Information System (INIS)

    Slupsky, Carolyn M.; Boyko, Robert F.; Booth, Valerie K.; Sykes, Brian D.

    2003-01-01

    Complete and accurate NMR spectral assignment is a prerequisite for high-throughput automated structure determination of biological macromolecules. However, completely automated assignment procedures generally encounter difficulties for all but the most ideal data sets. Sources of these problems include difficulty in resolving correlations in crowded spectral regions, as well as complications arising from dynamics, such as weak or missing peaks, or atoms exhibiting more than one peak due to exchange phenomena. Smartnotebook is a semi-automated assignment software package designed to combine the best features of the automated and manual approaches. The software finds and displays potential connections between residues, while the spectroscopist makes decisions on which connection is correct, allowing rapid and robust assignment. In addition, smartnotebook helps the user fit chains of connected residues to the primary sequence of the protein by comparing the experimentally determined chemical shifts with expected shifts derived from a chemical shift database, while providing bookkeeping throughout the assignment procedure

  11. An approach for automated analysis of particle holograms

    Science.gov (United States)

    Stanton, A. C.; Caulfield, H. J.; Stewart, G. W.

    1984-01-01

    A simple method for analyzing droplet holograms is proposed that is readily adaptable to automation using modern image digitizers and analyzers for determination of the number, location, and size distributions of spherical or nearly spherical droplets. The method determines these parameters by finding the spatial location of best focus of the droplet images. With this location known, the particle size may be determined by direct measurement of image area in the focal plane. Particle velocity and trajectory may be determined by comparison of image locations at different instants in time. The method is tested by analyzing digitized images from a reconstructed in-line hologram, and the results show that the method is more accurate than a time-consuming plane-by-plane search for sharpest focus.
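    The key step, locating the plane of best focus before measuring the droplet image area, can be mimicked with a simple sharpness metric applied to a stack of reconstructed planes. The Python sketch below uses a variance-of-Laplacian focus measure on synthetic data; the metric and the synthetic stack are illustrative assumptions, not the procedure of the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def focus_metric(image):
    """Variance of the Laplacian: largest when droplet edges are sharply focused."""
    return laplace(image).var()

def locate_best_focus(stack, depths):
    """Return the reconstruction depth whose plane maximises the focus metric."""
    scores = [focus_metric(img) for img in stack]
    return depths[int(np.argmax(scores))]

# Synthetic stack of three reconstruction planes: a small disc, sharp only in the middle.
yy, xx = np.mgrid[:64, :64]
disc = (((xx - 32) ** 2 + (yy - 32) ** 2) < 8 ** 2).astype(float)
stack = [gaussian_filter(disc, 4.0), disc, gaussian_filter(disc, 4.0)]
print("best focus at depth:", locate_best_focus(stack, depths=[10.0, 12.0, 14.0]))

# With the in-focus plane known, droplet size follows directly from the image area.
area_pixels = (disc > 0.5).sum()
print("equivalent diameter [pixels]:", round(2 * np.sqrt(area_pixels / np.pi), 1))
```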

  12. Streamlining Smart Meter Data Analytics

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    2015-01-01

    Today smart meters are increasingly used worldwide. Smart meters are advanced meters capable of measuring customer energy consumption at a fine-grained time interval, e.g., every 15 minutes. The data are very sizable and might come from different sources, along with other socio-economic metrics such as the geographic information of meters, the information about users and their property, geographic location and others, which makes the data management very complex. On the other hand, data-mining and the emerging cloud computing technologies make the collection, management, and analysis of this so-called big data possible. This can improve energy management, e.g., help utilities improve the management of energy and services, and help customers save money. In this regard, the paper focuses on building an innovative software solution to streamline smart meter data analytics, aiming at dealing with these challenges.

  13. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    Science.gov (United States)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  14. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  15. An automated approach for finding variable-constant pairing bugs

    DEFF Research Database (Denmark)

    Lawall, Julia; Lo, David

    2010-01-01

    program-analysis and data-mining based approach to identify the uses of named constants and to identify anomalies in these uses.  We have applied our approach to a recent version of the Linux kernel and have found a number of bugs affecting both correctness and software maintenance.  Many of these bugs...... have been validated by the Linux developers....

  16. An automated approach to network features of protein structure ensembles

    Science.gov (United States)

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-01-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from PDB and simulations establishes a need to introduce a standalone-efficient program that assembles network concepts/parameters under one hood in an automated manner. Herein, we discuss the development/application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulation/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighing scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings in dynamical/chemical knowledge into the network representation. Also, the results are mapped on a graphical display of the structure, allowing an easy access of network analysis to a general biological community. The potential of PSN-Ensemble toward examining structural ensemble is exemplified using MD trajectories of an ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single-static structures of active/inactive states of β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html. PMID:23934896

  17. An Integrated Systems Approach: A Description of an Automated Circulation Management System.

    Science.gov (United States)

    Seifert, Jan E.; And Others

    These bidding specifications describe requirements for a turn-key automated circulation system for the University of Oklahoma Libraries. An integrated systems approach is planned, and requirements are presented for various subsystems: acquisitions, fund accounting, reserve room, and bibliographic and serials control. Also outlined are hardware…

  18. A data driven approach for automating vehicle activated signs

    OpenAIRE

    Jomaa, Diala

    2016-01-01

    Vehicle activated signs (VAS) display a warning message when drivers exceed a particular threshold. VAS are often installed on local roads to display a warning message depending on the speed of the approaching vehicles. VAS are usually powered by electricity; however, battery and solar powered VAS are also commonplace. This thesis investigated the development of an automatic trigger speed for vehicle activated signs in order to influence driver behaviour, the effect of which has been measured in ...

  19. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depends on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we can first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting dune toe, dune crest, and dune heel based upon relative relief is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and the identification of features is based on the calculated RR values. The RR approach out-performed contemporary approaches and represents a fast objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel are spatially continuous, which is important because dune morphology is likely naturally variable alongshore.
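    The relative relief (RR) computation at the heart of the method can be illustrated on a one-dimensional cross-shore profile. The Python sketch below uses one common definition of RR, a moving-window normalized elevation, averaged across several window sizes; the window sizes, the synthetic profile and the crest-picking rule are assumptions for demonstration only:

```python
import numpy as np

def relative_relief(profile, window):
    """Relative position of each elevation within a moving window:
    (z - local min) / (local max - local min)."""
    half = window // 2
    rr = np.zeros(len(profile))
    for i in range(len(profile)):
        lo, hi = max(0, i - half), min(len(profile), i + half + 1)
        zmin, zmax = profile[lo:hi].min(), profile[lo:hi].max()
        rr[i] = (profile[i] - zmin) / (zmax - zmin) if zmax > zmin else 0.5
    return rr

def mean_relative_relief(profile, windows=(5, 11, 21)):
    """Average RR across several window sizes (the multi-scale part of the approach)."""
    return np.mean([relative_relief(profile, w) for w in windows], axis=0)

# Synthetic cross-shore profile: low beach and back-barrier with a single dune ridge.
x = np.arange(100)
profile = 1.0 + 3.0 * np.exp(-((x - 45) / 8.0) ** 2)
rr = mean_relative_relief(profile)

# Illustrative crest pick: highest elevation among points with consistently high RR.
candidates = np.where(rr > 0.8)[0]
crest = candidates[np.argmax(profile[candidates])]
print("dune crest near x =", int(crest))
```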

  20. ACHP | News | ACHP Issues Program Comment to Streamline Communication

    Science.gov (United States)

    ACHP Issues Program Comment to Streamline Communication Facilities Construction and Modification. The Advisory Council on ...

  1. An Automated Approach for Complementing Ad Blockers’ Blacklists

    Directory of Open Access Journals (Sweden)

    Gugelmann David

    2015-06-01

    Full Text Available Privacy in the Web has become a major concern, resulting in the popular use of various tools for blocking tracking services. Most of these tools rely on manually maintained blacklists, which need to be kept up-to-date to protect Web users’ privacy efficiently. It is challenging to keep pace with today’s quickly evolving advertisement and analytics landscape. In order to support blacklist maintainers with this task, we identify a set of Web traffic features for identifying privacy-intrusive services. Based on these features, we develop an automatic approach that learns the properties of advertisement and analytics services listed by existing blacklists and proposes new services for inclusion on blacklists. We evaluate our technique on real traffic traces of a campus network and find on the order of 200 new privacy-intrusive Web services that are not listed by the most popular Firefox plug-in, Adblock Plus. The proposed Web traffic features are easy to derive, allowing a distributed implementation of our approach.

  2. An Approach to Automated Fusion System Design and Adaptation

    Directory of Open Access Journals (Sweden)

    Alexander Fritze

    2017-03-01

    Full Text Available Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  3. An Approach to Automated Fusion System Design and Adaptation.

    Science.gov (United States)

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-03-16

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  4. Accelerated Logistics: Streamlining the Army's Supply Chain

    National Research Council Canada - National Science Library

    Wang, Mark

    2000-01-01

    ...) initiative, the Army has dramatically streamlined its supply chain, cutting order and ship times for repair parts by nearly two-thirds nationwide and over 75 percent at several of the major Forces Command (FORSCOM) installations...

  5. Creating customer value by streamlining business processes.

    Science.gov (United States)

    Vantrappen, H

    1992-02-01

    Much of the strategic preoccupation of senior managers in the 1990s is focusing on the creation of customer value. Companies are seeking competitive advantage by streamlining the three processes through which they interact with their customers: product creation, order handling and service assurance. 'Micro-strategy' is a term which has been coined for the trade-offs and decisions on where and how to streamline these three processes. The article discusses micro-strategies applied by successful companies.

  6. Towards Automated Binding Affinity Prediction Using an Iterative Linear Interaction Energy Approach

    Directory of Open Access Journals (Sweden)

    C. Ruben Vosmeer

    2014-01-01

    Full Text Available Binding affinity prediction of potential drugs to target and off-target proteins is an essential asset in drug development. These predictions require the calculation of binding free energies. In such calculations, it is a major challenge to properly account for both the dynamic nature of the protein and the possible variety of ligand-binding orientations, while keeping computational costs tractable. Recently, an iterative Linear Interaction Energy (LIE) approach was introduced, in which results from multiple simulations of a protein-ligand complex are combined into a single binding free energy using a Boltzmann weighting-based scheme. This method was shown to reach experimental accuracy for flexible proteins while retaining the computational efficiency of the general LIE approach. Here, we show that the iterative LIE approach can be used to predict binding affinities in an automated way. A workflow was designed using preselected protein conformations, automated ligand docking and clustering, and a (semi-)automated molecular dynamics simulation setup. We show that using this workflow, binding affinities of aryloxypropanolamines to the malleable Cytochrome P450 2D6 enzyme can be predicted without a priori knowledge of dominant protein-ligand conformations. In addition, we provide an outlook for an approach to assess the quality of the LIE predictions, based on simulation outcomes only.
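    The combination step can be illustrated compactly: each simulation yields a LIE estimate of the binding free energy, and the estimates are merged with Boltzmann weights so that the most favourable orientations dominate. The Python sketch below uses typical literature LIE coefficients and made-up interaction energies; the exact weighting scheme of the paper may differ in detail:

```python
import numpy as np

KB_T = 2.49  # kB*T in kJ/mol at ~300 K

def lie_free_energy(d_vdw, d_el, alpha=0.18, beta=0.33, gamma=0.0):
    """Standard LIE estimate for one simulation: weighted differences of the
    ligand-surroundings van der Waals and electrostatic interaction energies
    between the bound and free states (coefficients are typical literature values)."""
    return alpha * d_vdw + beta * d_el + gamma

def boltzmann_combined(dg_values):
    """Combine per-simulation estimates with Boltzmann weighting so that the
    most favourable binding orientations dominate (illustrative scheme)."""
    dg = np.asarray(dg_values, dtype=float)
    return -KB_T * np.log(np.mean(np.exp(-dg / KB_T)))

# Three simulations started from different docked ligand orientations (made-up numbers).
per_simulation = [lie_free_energy(-35.0, -20.0),
                  lie_free_energy(-28.0, -15.0),
                  lie_free_energy(-31.0, -25.0)]
print("per-simulation estimates [kJ/mol]:", [round(x, 2) for x in per_simulation])
print("combined binding free energy [kJ/mol]:", round(boltzmann_combined(per_simulation), 2))
```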

  7. Alternative approach to automated management of load flow in engineering networks considering functional reliability

    Directory of Open Access Journals (Sweden)

    Ирина Александровна Гавриленко

    2016-02-01

    Full Text Available An approach to the automated management of load flow in engineering networks that takes functional reliability into account is proposed in the article. Improvements to the concept of operational and strategic management of load flow in engineering networks are considered. The problem addressed by the thesis research is stated, namely, the development of an information technology for the exact calculation of the functional reliability of the network, or of the risk of short delivery of the purpose-oriented product to consumers.

  8. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2013-07-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing their first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 TB of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize the data range and variance of each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed, and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.
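    Plausibility tests of this kind reduce to simple binary flags per observation. The Python sketch below shows a range test, a step test and a persistence (low-variance) test on a toy temperature series; the thresholds here are placeholders, whereas the observatory derives them statistically from historical data:

```python
import numpy as np

def range_test(x, lo, hi):
    """Binary plausibility flag: 1 where the value falls outside [lo, hi]."""
    return ((x < lo) | (x > hi)).astype(int)

def step_test(x, max_step):
    """Flag implausibly large jumps between consecutive readings."""
    step = np.abs(np.diff(x, prepend=x[0]))
    return (step > max_step).astype(int)

def persistence_test(x, window, min_var):
    """Flag 'stuck' sensors: any moving window whose variance is suspiciously low."""
    flags = np.zeros(len(x), dtype=int)
    for i in range(len(x) - window + 1):
        if np.var(x[i:i + window]) < min_var:
            flags[i:i + window] = 1
    return flags

# Example air-temperature series with a spike and a stuck interval.
temp = np.array([12.1, 12.3, 12.2, 45.0, 12.4, 12.4, 12.4, 12.4, 12.4, 12.6])
qc = {
    "range": range_test(temp, lo=-40.0, hi=40.0),
    "step": step_test(temp, max_step=5.0),
    "persistence": persistence_test(temp, window=5, min_var=1e-3),
}
for name, flags in qc.items():
    print(name, flags)
```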

  9. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation

    Directory of Open Access Journals (Sweden)

    Rahul Deb Das

    2016-11-01

    Full Text Available Understanding travel behavior is critical for an effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers’ smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges a novel, state based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that can work in real time, near-real time and offline mode and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation.

  10. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.

    Science.gov (United States)

    Das, Rahul Deb; Winter, Stephan

    2016-11-23

    Understanding travel behavior is critical for an effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges a novel, state based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that can work in real time, near-real time and offline mode and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation.

  11. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Khoualdi, Kamel

    1994-01-01

    Nowadays, the supervision process of industrial installations is more and more complex, involving the automation of their control. A malfunction generates an avalanche of alarms. The operator in charge of the supervision must face the incident and execute the right actions to recover a normal situation. Generally, the operator is overwhelmed by the great number of alarms. Our aim, within the framework of our research, is to build an alarm filtering system for an automated metro line, to help the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system but using two different approaches. The first part was developed in the frame of the SARA project (an operator assistance system for an automated metro line), which is an expert system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach has shown its limits. In the second part of our work, we have considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach was motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge. Thus, each piece of equipment has been considered as an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author) [fr]

  12. Impact assessment: Eroding benefits through streamlining?

    Energy Technology Data Exchange (ETDEWEB)

    Bond, Alan, E-mail: alan.bond@uea.ac.uk [School of Environmental Sciences, University of East Anglia (United Kingdom); School of Geo and Spatial Sciences, North-West University (South Africa); Pope, Jenny, E-mail: jenny@integral-sustainability.net [Integral Sustainability (Australia); Curtin University Sustainability Policy Institute (Australia); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, North-West University (South Africa); Environmental Science, Murdoch University (Australia); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, North-West University (South Africa); Gunn, Jill A.E., E-mail: jill.gunn@usask.ca [Department of Geography and Planning and School of Environment and Sustainability, University of Saskatchewan (Canada)

    2014-02-15

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are ‘consistency and fairness’, ‘early warning’, ‘environment and development’, and ‘public involvement’. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment.

  13. Impact assessment: Eroding benefits through streamlining?

    International Nuclear Information System (INIS)

    Bond, Alan; Pope, Jenny; Morrison-Saunders, Angus; Retief, Francois; Gunn, Jill A.E.

    2014-01-01

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are ‘consistency and fairness’, ‘early warning’, ‘environment and development’, and ‘public involvement’. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment

  14. Main approaches to automation of management systems in the coal industry. [Czechoslovakia

    Energy Technology Data Exchange (ETDEWEB)

    Zafouk, P; Dlabaya, I; Frous, S

    1980-01-01

    The main approaches to automation of management systems in the coal industry of Czechoslovakia are enumerated. Organizational structure of the branch and concern form of organization. Complex improvement of management system and source of continued development of the branch. Automated control systems, an integral part of the complex management system. Primary problem - automation in the area of design of the information system. Centralization of methodological management of operations in the area of control system development. Unified approach to breakdown of control system into branches. Organizational support of the development of the control system, problems solved by the department of control system development of the Ministry, main department of control system development of the Research Institute, departmental committees in the branch. The use of principles of control system development in the Ostravsko-Karvinsk mining concern is demonstrated. Preparation for development of the control system in the concern: elaboration of concepts and programs of control system development. Design of control system of the concern. Control system of an enterprise in the concern as an integral control system. Support of control system development in organizations, participants in this process, their jurisdiction and obligations. Annual plans of control system development. Centralized subsystems and enterprises. Methods of coordination of the process of improvement of control and support of the harmony of decisions made. Technical support of control system development, construction of a unified network of computer centers in enterprises with combined resources of computer technology.

  15. PASA - A Program for Automated Protein NMR Backbone Signal Assignment by Pattern-Filtering Approach

    International Nuclear Information System (INIS)

    Xu Yizhuang; Wang Xiaoxia; Yang Jun; Vaynberg, Julia; Qin Jun

    2006-01-01

    We present a new program, PASA (Program for Automated Sequential Assignment), for assigning protein backbone resonances based on multidimensional heteronuclear NMR data. Distinct from existing programs, PASA emphasizes a per-residue-based pattern-filtering approach during the initial stage of the automated 13Cα and/or 13Cβ chemical shift matching. The pattern filter employs one or multiple constraints, such as 13Cα/13Cβ chemical shift ranges for different amino acid types and side-chain spin systems, which helps to rule out, in a stepwise fashion, improbable assignments resulting from resonance degeneracy or missing signals. Such a stepwise filtering approach substantially minimizes early false linkage problems that often propagate, amplify, and ultimately cause complication or combinatorial explosion of the automation process. Our program (http://www.lerner.ccf.org/moleccard/qin/) was tested on four representative small- to large-sized proteins with various degrees of resonance degeneracy and missing signals, and we show that PASA achieved the assignments efficiently and rapidly, fully consistent with those obtained by laborious manual protocols. The results demonstrate that PASA may be a valuable tool for NMR-based structural analyses, genomics, and proteomics.
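    The per-residue pattern filter can be pictured as a compatibility check of observed Cα/Cβ shifts against expected ranges per amino-acid type. The Python sketch below uses rough textbook shift ranges (ppm) as placeholders; PASA's actual statistics and constraint set are richer:

```python
# Illustrative per-residue pattern filter: candidate amino-acid types for a spin
# system are kept only if the observed CA/CB shifts fall inside the expected
# range for that type.  The ranges are approximate textbook values, not PASA's.
EXPECTED_RANGES = {
    "ALA": {"CA": (50.0, 55.5), "CB": (16.0, 21.5)},
    "GLY": {"CA": (43.0, 47.5), "CB": None},          # glycine has no CB
    "SER": {"CA": (56.0, 61.5), "CB": (62.0, 66.0)},
    "THR": {"CA": (59.0, 66.0), "CB": (67.0, 72.0)},
}

def filter_residue_types(observed, ranges=EXPECTED_RANGES, tol=0.5):
    """Return amino-acid types compatible with the observed CA/CB chemical shifts."""
    compatible = []
    for aa, expect in ranges.items():
        ok = True
        for atom, shift in observed.items():
            if shift is None:                 # missing peak: cannot rule anything out
                continue
            window = expect.get(atom)
            if window is None:                # e.g. a CB observed but GLY has none
                ok = False
                break
            lo, hi = window
            if not (lo - tol <= shift <= hi + tol):
                ok = False
                break
        if ok:
            compatible.append(aa)
    return compatible

print(filter_residue_types({"CA": 52.3, "CB": 18.9}))   # -> ['ALA']
print(filter_residue_types({"CA": 60.1, "CB": None}))   # missing CB: several candidates
```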

  16. A Semi-automated Approach to Improve the Efficiency of Medical Imaging Segmentation for Haptic Rendering.

    Science.gov (United States)

    Banerjee, Pat; Hu, Mengqi; Kannan, Rahul; Krishnaswamy, Srinivasan

    2017-08-01

    The Sensimmer platform represents our ongoing research on simultaneous haptics and graphics rendering of 3D models. For simulation of medical and surgical procedures using Sensimmer, 3D models must be obtained from medical imaging data, such as magnetic resonance imaging (MRI) or computed tomography (CT). Image segmentation techniques are used to determine the anatomies of interest from the images. 3D models are obtained from segmentation and their triangle reduction is required for graphics and haptics rendering. This paper focuses on creating 3D models by automating the segmentation of CT images based on the pixel contrast for integrating the interface between Sensimmer and medical imaging devices, using the volumetric approach, Hough transform method, and manual centering method. Hence, automating the process has reduced the segmentation time by 56.35% while maintaining the same accuracy of the output at ±2 voxels.

  17. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
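    The per-pixel classification rule described above (an NDSI threshold applied to each clear-sky late-summer scene, followed by a persistence-ratio threshold across the multi-year stack) can be sketched briefly. In the Python example below, the NDSI and ratio thresholds are illustrative assumptions rather than the values used in the study:

```python
import numpy as np

def ndsi(green, swir):
    """Normalized Difference Snow Index for one scene (reflectance bands)."""
    return (green - swir) / (green + swir + 1e-9)

def persistent_snow_ice(green_stack, swir_stack, cloud_mask,
                        ndsi_threshold=0.4, ratio_threshold=0.9):
    """Classify pixels as persistent ice/snow when the fraction of clear-sky
    observations with NDSI above the threshold exceeds the ratio threshold."""
    snow = (ndsi(green_stack, swir_stack) > ndsi_threshold) & ~cloud_mask
    valid = (~cloud_mask).sum(axis=0)
    ratio = np.divide(snow.sum(axis=0), valid,
                      out=np.zeros(valid.shape, dtype=float), where=valid > 0)
    return ratio >= ratio_threshold

# Tiny synthetic example: 5 scenes of a 2x2 area; pixel (0, 0) is a perennial snowfield.
rng = np.random.default_rng(1)
green = rng.uniform(0.05, 0.15, size=(5, 2, 2))
swir = rng.uniform(0.05, 0.15, size=(5, 2, 2))
green[:, 0, 0], swir[:, 0, 0] = 0.8, 0.1   # bright in green, dark in SWIR every year
clouds = np.zeros((5, 2, 2), dtype=bool)
print(persistent_snow_ice(green, swir, clouds))
```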

  18. Industrial Compositional Streamline Simulation for Efficient and Accurate Prediction of Gas Injection and WAG Processes

    Energy Technology Data Exchange (ETDEWEB)

    Margot Gerritsen

    2008-10-31

    number of streamlines to number of threads is sufficiently high, which is the case in real-field applications. This is an important result, as it eases the transition of serial to parallel streamline codes. The parallel speedup itself depends on the relative contribution of the tracing and mapping stages as compared to the solution of the transport equations along streamlines. As the physical complexity of the simulated 1D transport process increases, the contribution of the less efficient tracing and mapping stages is reduced and near-linear scalabilities can be obtained. Our work clearly shows that the owner approach, in which threads are assigned whole streamlines, is more attractive than a distributed model, in which streamline segments are assigned to threads, because it allows re-use of existing sequential code for the 1D streamline solves, also for implicit time-stepping algorithms.

  19. An approach to evaluate task allocation between operators and automation with respect to safety of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Macwan, A.; Wei, Z.G.; Wieringa, P.A.

    1994-01-01

    Even though the use of automation is increasing in complex systems, its effect on safety cannot be systematically analyzed using current techniques. Of particular interest is task allocation between operators and automation. In evaluating its effect on safety, a quantitative definition of degree of automation (doA) is used. The definition of doA accounts for the effect of task on safety, irrespective of whether the task is carried out by operator or automation. Also included is the indirect effect due to the change in workload perceived by the operator. This workload is translated into stress which affects operator performance, expressed as human error probability, and subsequently, safety. The approach can be useful for evaluation of existing task allocation schemes as well as in making decisions about task allocation between operators and automation. (author). 13 refs, 1 fig
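    The abstract does not reproduce the paper's doA or workload formulas, so the following is a purely hypothetical sketch of the causal chain it describes (task allocation, operator workload, stress, human error probability, risk); every function and constant here is an assumption made only for illustration.

```python
# Hypothetical illustration only; not the paper's actual doA definition.
def human_error_probability(base_hep, workload, nominal_workload=1.0):
    # Assumed stress multiplier: error grows once workload exceeds nominal.
    stress_factor = max(1.0, workload / nominal_workload)
    return min(1.0, base_hep * stress_factor)

def system_risk(tasks, automated_ids, base_hep=1e-3):
    # Tasks left with the operator contribute to workload; automated ones do not.
    manual = [t for t in tasks if t["id"] not in automated_ids]
    workload = sum(t["load"] for t in manual)
    hep = human_error_probability(base_hep, workload)
    risk = 0.0
    for t in tasks:
        failure_prob = t["machine_failure"] if t["id"] in automated_ids else hep
        risk += t["criticality"] * failure_prob
    return risk
```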

  20. A holistic approach to ZigBee performance enhancement for home automation networks.

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-08-14

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  1. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-01-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004

  2. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Directory of Open Access Journals (Sweden)

    August Betzler

    2014-08-01

    Full Text Available Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  3. Automated delineation and characterization of drumlins using a localized contour tree approach

    Science.gov (United States)

    Wang, Shujie; Wu, Qiusheng; Ward, Dylan

    2017-10-01

    Drumlins are ubiquitous landforms in previously glaciated regions, formed through a series of complex subglacial processes operating underneath the paleo-ice sheets. Accurate delineation and characterization of drumlins are essential for understanding the formation mechanism of drumlins as well as the flow behaviors and basal conditions of paleo-ice sheets. Automated mapping of drumlins is particularly important for examining the distribution patterns of drumlins across large spatial scales. This paper presents an automated vector-based approach to mapping drumlins from high-resolution light detection and ranging (LiDAR) data. The rationale is to extract sets of concentric contours by building localized contour trees and establishing topological relationships. This automated method can overcome the shortcomings of previous manual and automated methods for mapping drumlins, for instance the azimuthal biases introduced when generating shaded relief images. A case study was carried out over a portion of the New York Drumlin Field. Overall, 1181 drumlins were identified from the LiDAR-derived DEM across the study region, a number that had been underestimated in the previous literature. The delineation results were visually and statistically compared to the manual digitization results. The morphology of drumlins was characterized by quantifying the length, width, elongation ratio, height, area, and volume. Statistical and spatial analyses were conducted to examine the distribution pattern and spatial variability of drumlin size and form. The drumlins and their morphologic characteristics exhibit significant spatial clustering rather than randomly distributed patterns. The form of drumlins varies from ovoid to spindle shapes towards the downstream direction of paleo ice flows, along with decreases in width, area, and volume. This observation is in line with previous studies and may be explained by variations in sediment thickness and/or velocity increases of the ice flows
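    A much-simplified sketch of the concentric-contour idea follows (closed elevation contours nested inside one another flag a candidate drumlin); this is not the authors' full localized contour tree implementation, and the contour interval and nesting depth are assumed.

```python
# Simplified concentric closed-contour detection on a LiDAR-derived DEM.
import numpy as np
from skimage import measure

def closed_contours(dem, interval=1.0):
    levels = np.arange(dem.min(), dem.max(), interval)
    closed = []
    for level in levels:
        for c in measure.find_contours(dem, level):
            if np.allclose(c[0], c[-1]):          # keep closed loops only
                closed.append((level, c))
    return closed

def nests(inner, outer):
    # Crude nesting test: inner bounding box lies inside the outer one.
    return (inner.min(0) >= outer.min(0)).all() and (inner.max(0) <= outer.max(0)).all()

def drumlin_candidates(dem, min_depth=3):
    contours = closed_contours(dem)
    candidates = []
    for level, outer in contours:
        children = [c for lev, c in contours if lev > level and nests(c, outer)]
        if len(children) >= min_depth:            # enough nested contours
            candidates.append(outer)
    return candidates
```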

  4. Two Automated Techniques for Carotid Lumen Diameter Measurement: Regional versus Boundary Approaches.

    Science.gov (United States)

    Araki, Tadashi; Kumar, P Krishna; Suri, Harman S; Ikeda, Nobutaka; Gupta, Ajay; Saba, Luca; Rajan, Jeny; Lavra, Francesco; Sharma, Aditya M; Shafique, Shoaib; Nicolaides, Andrew; Laird, John R; Suri, Jasjit S

    2016-07-01

    The degree of stenosis in the carotid artery can be predicted using automated carotid lumen diameter (LD) measurement from B-mode ultrasound images. Systolic velocity-based methods for measurement of LD are subjective. With the advancement of high resolution imaging, image-based methods have started to emerge. However, they require robust image analysis for accurate LD measurement. This paper presents two different algorithms for automated segmentation of the lumen borders in carotid ultrasound images. Both algorithms are modeled as a two-stage process. Stage one consists of a global-based model using a scale-space framework for the extraction of the region of interest. This stage is common to both algorithms. Stage two is modeled using a local-based strategy that extracts the lumen interfaces. At this stage, algorithm-1 is modeled as a region-based strategy using a classification framework, whereas algorithm-2 is modeled as a boundary-based approach that uses the level set framework. Two databases (DB), a Japan DB (JDB) (202 patients, 404 images) and a Hong Kong DB (HKDB) (50 patients, 300 images), were used in this study. Two trained neuroradiologists performed manual LD tracings. The mean automated LD was 6.35 ± 0.95 mm for JDB and 6.20 ± 1.35 mm for HKDB. The precision-of-merit was 97.4% and 98.0% w.r.t. the two manual tracings for JDB, and 99.7% and 97.9% w.r.t. the two manual tracings for HKDB. Statistical tests such as ANOVA, Chi-squared, t-test, and Mann-Whitney test were conducted to show the stability and reliability of the automated techniques.

  5. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention preserving stochastic semantics able to model both probabilistic- and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.
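    The search loop itself can be summarized by a generic evolutionary skeleton; `mutate` and `evaluate` below are placeholders for the framework's BPMN restructuring moves and its stochastic-model-checking objective, and are not part of the published tool.

```python
# Generic evolutionary search skeleton (placeholders, lower score = better).
import random

def evolve(initial_workflow, mutate, evaluate, population=20, generations=50):
    pool = [initial_workflow]
    for _ in range(generations):
        # Propose restructured variants of the current candidates.
        pool += [mutate(random.choice(pool)) for _ in range(population)]
        # Keep the variants with the best model-checked quantities.
        pool = sorted(pool, key=evaluate)[:population]
    return pool[0]
```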

  6. Automated element identification for EDS spectra evaluation using quantification and integrated spectra simulation approaches

    International Nuclear Information System (INIS)

    Eggert, F

    2010-01-01

    This work describes the first real automated solution for the qualitative evaluation of EDS spectra in X-ray microanalysis. It uses a combination of integrated standardless quantitative evaluation, computation of analytical errors to a final uncertainty, and parts of recently developed simulation approaches. Multiple spectra reconstruction assessments and peak searches of the residual spectrum are powerful enough to solve the qualitative analytical question automatically for totally unknown specimens. The integrated quantitative assessment is useful for improving the confidence of the qualitative analysis. The qualitative element analysis therefore becomes part of an integrated quantitative spectrum evaluation, in which the quantitative results are used to iteratively refine element decisions, spectrum deconvolution, and simulation steps.

  7. Streamlining the Bankability Process using International Standards

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Repins, Ingrid L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Kelly, George [Sunset Technology, Mount Airy, MD]; Ramu, Govind [SunPower, San Jose, California]; Heinz, Matthias [TUV Rheinland, Cologne, Germany]; Chen, Yingnan [CGC (China General Certification Center), Beijing]; Wohlgemuth, John [PowerMark, Union Hall, VA]; Lokanath, Sumanth [First Solar, Tempe, Arizona]; Daniels, Eric [Suncycle USA, Frederick, MD]; Hsi, Edward [Swiss RE, Zurich, Switzerland]; Yamamichi, Masaaki [RTS, Trumbull, CT]

    2017-09-27

    NREL has supported the international efforts to create a streamlined process for documenting bankability and/or completion of each step of a PV project plan. IECRE was created for this purpose in 2014. This poster describes the goals, current status of this effort, and how individuals and companies can become involved.

  8. Hydrodynamic Drag on Streamlined Projectiles and Cavities

    KAUST Repository

    Jetly, Aditya

    2016-04-19

    The air cavity formation resulting from the water-entry of solid objects has been the subject of extensive research due to its application in various fields such as biology, marine vehicles, sports, and the oil and gas industries. Recently we demonstrated that, under certain conditions, following the closing of the air cavity formed by the initial impact of a superhydrophobic sphere on a free water surface, a stable streamlined air cavity can remain attached to the sphere. The superhydrophobic sphere and its attached air cavity reach a steady state during the free fall. In this thesis we further explore this novel phenomenon to quantify the drag on streamlined cavities. The drag on the sphere-cavity formation is then compared with the drag on solid projectiles designed to have a shape self-similar to that of the cavity. Solid projectiles of adjustable weight were produced using a 3D printing technique. In a set of free-fall experiments we determined the variation of the projectile drag coefficient as a function of the projectile length-to-diameter ratio and specific weight, covering a range of intermediate Reynolds numbers, Re ~ 10^4–10^5, which are characteristic of our streamlined cavity experiments. Parallel free-fall experiments with a sphere-attached streamlined air cavity and a projectile of the same shape and effective weight clearly demonstrated the drag reduction due to the stress-free boundary condition at the cavity-liquid interface. The streamlined cavity experiments can be used as an upper-bound estimate of the drag reduction by air layers naturally sustained on superhydrophobic surfaces in contact with water. In the final part of the thesis we design an experiment to test the drag reduction capacity of robust superhydrophobic coatings deposited on the surface of various model vessels.
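    As a rough guide to how such drag coefficients are typically extracted from free-fall data (an assumed textbook force balance, not necessarily the thesis's exact procedure): at steady fall the drag balances the submerged weight, so

```latex
% Illustrative force balance with assumed symbols: W weight, F_b buoyancy,
% A frontal area, U_t terminal velocity, D diameter, rho_s/rho_w densities.
C_d = \frac{2\,(W - F_b)}{\rho_w A U_t^{2}}
    = \frac{2\,(\rho_s - \rho_w)\, V g}{\rho_w A U_t^{2}},
\qquad
\mathrm{Re} = \frac{\rho_w U_t D}{\mu} \sim 10^{4}\text{--}10^{5}
```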

  9. A streamlined ribosome profiling protocol for the characterization of microorganisms

    DEFF Research Database (Denmark)

    Latif, Haythem; Szubin, Richard; Tan, Justin

    2015-01-01

    Ribosome profiling is a powerful tool for characterizing in vivo protein translation at the genome scale, with multiple applications ranging from detailed molecular mechanisms to systems-level predictive modeling. Though highly effective, this intricate technique has yet to become widely used in the microbial research community. Here we present a streamlined ribosome profiling protocol with reduced barriers to entry for microbial characterization studies. Our approach provides simplified alternatives during harvest, lysis, and recovery of monosomes and also eliminates several time-consuming steps...

  10. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  11. Lessons learned in streamlining the preparation of SNM standard solutions

    International Nuclear Information System (INIS)

    Clark, J.P.; Johnson, S.R.

    1986-01-01

    Improved safeguard measurements have produced a demand for greater quantities of reliable SNM solution standards. At the Savannah River Plant (SRP), the demand for these standards has been met by several innovations to improve the productivity and reliability of standards preparation. With the use of a computer-controlled balance, large batches of SNM stock solutions are prepared on a gravimetric basis. Accurately dispensed quantities of the stock solution are weighed and stored in bottles. When needed, they are quantitatively transferred to tared containers, matrix-adjusted to target concentrations, weighed, and measured for density at 25 °C. Concentrations of SNM are calculated both gravimetrically and volumetrically. Calculated values are confirmed analytically before the standards are used in measurement control program (MCP) activities. The lessons learned include: MCP goals include error identification and management; strategy modifications are required to improve error management; administrative controls can minimize certain types of errors; automation can eliminate redundancy and streamline preparations; and prudence and simplicity enhance automation success. The effort expended to increase productivity has increased the reliability of standards and provided better documentation for quality assurance

  12. Silhouette-based approach of 3D image reconstruction for automated image acquisition using robotic arm

    Science.gov (United States)

    Azhar, N.; Saad, W. H. M.; Manap, N. A.; Saad, N. M.; Syafeeza, A. R.

    2017-06-01

    This study presents an approach to 3D image reconstruction using an autonomous robotic arm for the image acquisition process. A low-cost automated imaging platform is created using a pair of G15 servo motors connected in series to an Arduino UNO as the main microcontroller. Two sets of sequential images were obtained using different projection angles of the camera. The silhouette-based approach is used in this study for 3D reconstruction from the sequential images captured from several different angles of the object. In addition, an analysis of the effect of the number of sequential images on the accuracy of the 3D model reconstruction was carried out with a fixed camera projection angle. The factors affecting the 3D reconstruction are discussed, and the overall result of the analysis is concluded according to the prototype of the imaging platform.
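    A minimal voxel-carving sketch of the silhouette-based reconstruction idea follows; the camera model is a toy orthographic projection rotating about the vertical axis, not the study's calibrated servo/Arduino setup.

```python
# Toy visual-hull carving from silhouettes taken at known rotation angles.
import numpy as np

def carve(silhouettes, angles_deg, grid_size=64):
    """silhouettes: list of 2D boolean masks (H x W), one per projection angle."""
    n = grid_size
    occupied = np.ones((n, n, n), dtype=bool)
    # Voxel centre coordinates in [-1, 1]^3 (same C-order as `occupied`).
    coords = (np.indices((n, n, n)).reshape(3, -1).T / (n - 1)) * 2 - 1

    for sil, ang in zip(silhouettes, angles_deg):
        h, w = sil.shape
        t = np.deg2rad(ang)
        # Rotate voxels about the z axis, project orthographically onto x-z.
        x = coords[:, 0] * np.cos(t) + coords[:, 1] * np.sin(t)
        z = coords[:, 2]
        u = np.clip(((x + 1) / 2 * (w - 1)).astype(int), 0, w - 1)
        v = np.clip(((1 - (z + 1) / 2) * (h - 1)).astype(int), 0, h - 1)
        keep = sil[v, u]          # a voxel survives only if inside every silhouette
        occupied &= keep.reshape(n, n, n)
    return occupied
```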

  13. An automated approach for segmentation of intravascular ultrasound images based on parametric active contour models

    International Nuclear Information System (INIS)

    Vard, Alireza; Jamshidi, Kamal; Movahhedinia, Naser

    2012-01-01

    This paper presents a fully automated approach to detect the intima and media-adventitia borders in intravascular ultrasound images based on parametric active contour models. To detect the intima border, we compute a new image feature by applying a combination of short-term autocorrelations calculated for the contour pixels. These feature values are employed to define an energy function of the active contour called the normalized cumulative short-term autocorrelation. Exploiting this energy function, the intima border is separated accurately from the blood region contaminated by high speckle noise. To extract the media-adventitia boundary, we define a new form of energy function based on edge, texture and spring forces for the active contour. Utilizing this active contour, the media-adventitia border is identified correctly even in the presence of branch openings and calcifications. Experimental results indicate the accuracy of the proposed methods. In addition, statistical analysis demonstrates high conformity between manual tracing and the results obtained by the proposed approaches.
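    For orientation, a generic parametric snake in scikit-image is shown below as a stand-in for the paper's custom autocorrelation and edge/texture/spring energies; the initial radius and the alpha/beta weights are assumed values.

```python
# Generic active-contour (snake) extraction of a roughly circular border.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def lumen_snake(ivus_image, center, radius=40, n_points=200):
    s = np.linspace(0, 2 * np.pi, n_points)
    init = np.column_stack([center[0] + radius * np.sin(s),
                            center[1] + radius * np.cos(s)])  # (row, col) pairs
    smoothed = gaussian(ivus_image, sigma=3, preserve_range=True)
    # alpha and beta control the elasticity and rigidity of the contour.
    return active_contour(smoothed, init, alpha=0.015, beta=10.0, gamma=0.001)
```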

  14. Report: Follow-Up Report: EPA Proposes to Streamline the Review, Management and Disposal of Hazardous Waste Pharmaceuticals

    Science.gov (United States)

    Report #15-P-0260, August 19, 2015. EPA states that it intends to issue a proposed rule, Management Standards for Hazardous Waste, which will attempt to streamline the approach to managing and disposing of hazardous and nonhazardous pharmaceutical waste.

  15. A Crowd-Based Intelligence Approach for Measurable Security, Privacy, and Dependability in Internet of Automated Vehicles with Vehicular Fog

    Directory of Open Access Journals (Sweden)

    Ashish Rauniyar

    2018-01-01

    Full Text Available With the advent of Internet of things (IoT) and cloud computing technologies, we are in the era of automation, device-to-device (D2D) and machine-to-machine (M2M) communications. Automated vehicles have recently gained huge attention worldwide, and they have created a new wave of revolution in the automobile industry. However, in order to fully establish automated vehicles and their connectivity to the surroundings, security, privacy, and dependability always remain a crucial issue. One cannot deny the fact that such automated vehicles are highly vulnerable to different kinds of security attacks. Also, today such systems are built from generic components. Prior analysis of different attack trends and vulnerabilities enables us to deploy security solutions effectively. Moreover, scientific research has shown that a "group" can perform better than individuals in making decisions and predictions. Therefore, this paper deals with the measurable security, privacy, and dependability of automated vehicles through a crowd-based intelligence approach inspired by swarm intelligence. We have studied three use case scenarios of automated vehicles and systems with vehicular fog and have analyzed the security, privacy, and dependability metrics of such systems. Our systematic approaches to measuring efficient system configuration, security, privacy, and dependability of automated vehicles are essential for getting the overall picture of the system, such as design patterns, best practices for system configuration, metrics, and measurements.

  16. Streamlining Research by Using Existing Tools

    OpenAIRE

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...

  17. A Noble Approach of Process Automation in Galvanized Nut, Bolt Manufacturing Industry

    Directory of Open Access Journals (Sweden)

    Akash Samanta

    2012-05-01

    Full Text Available "Corrosion costs money": the Battelle Columbus Institute estimates that corrosion costs Americans more than $220 billion annually, about 4.3% of the gross national product [1]. Nowadays, due to increasing pollution, the rate of corrosion is also rising day by day, mainly in India; so, to save steel structures, galvanizing is the best and simplest solution. For this reason galvanizing industries have been growing since the mid-1700s. Galvanizing is a controlled metallurgical combination of zinc and steel that can provide corrosion resistance in a wide variety of environments. In fact, the galvanized metal corrosion resistance factor can be some 70 to 80 times greater than that of the base metal. Keeping in mind the importance of this industry, a noble approach to process automation in a galvanized nut-bolt manufacturing plant is presented here, as nuts and bolts are the prime ingredient of any structure. In this paper the main objectives of any industry, such as survival, profit maximization, profit satisfying and sales growth, are fulfilled. Furthermore, the environmental aspects, i.e. pollution control and energy saving, are also considered in this paper. The whole automation process is implemented using a programmable logic controller (PLC), which has a number of unique advantages: it is faster, more reliable, requires less maintenance and is reprogrammable. The whole system has been designed and tested using a GE Fanuc PLC.

  18. Expert system isssues in automated, autonomous space vehicle rendezvous

    Science.gov (United States)

    Goodwin, Mary Ann; Bochsler, Daniel C.

    1987-01-01

    The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, the approach used, and its knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station program and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.

  19. Rapid prototyping of an automated video surveillance system: a hardware-software co-design approach

    Science.gov (United States)

    Ngo, Hau T.; Rakvic, Ryan N.; Broussard, Randy P.; Ives, Robert W.

    2011-06-01

    FPGA devices with embedded DSP and memory blocks, and high-speed interfaces are ideal for real-time video processing applications. In this work, a hardware-software co-design approach is proposed to effectively utilize FPGA features for a prototype of an automated video surveillance system. Time-critical steps of the video surveillance algorithm are designed and implemented in the FPGA's logic elements to maximize parallel processing. Other non-time-critical tasks are achieved by executing a high-level language program on an embedded Nios-II processor. Pre-tested and verified video and interface functions from a standard video framework are utilized to significantly reduce development and verification time. Custom and parallel processing modules are integrated into the video processing chain via Altera's Avalon Streaming video protocol. Other data control interfaces are achieved by connecting hardware controllers to the Nios-II processor using Altera's Avalon Memory Mapped protocol.

  20. Development of an automated guided vehicle controller using a systems engineering approach

    Directory of Open Access Journals (Sweden)

    Ferreira, Tremaine

    2016-08-01

    Full Text Available Automated guided vehicles (AGVs) are widely used for transporting materials in industry and commerce. In this research, an intelligent AGV-based material-handling system was developed using a model-based systems engineering (MBSE) approach. The core of the AGV, the controller, was designed in the systems modelling language environment using Visual Paradigm software, and then implemented in hardware. As a result, the AGV's complex tasks of material handling, navigation, and communication were successfully accomplished and tested in a real industrial environment. The developed AGV is capable of towing trolleys with a weight of up to 200 kg at walking speed. The AGV can be incorporated into an intelligent material-handling system with multiple autonomous vehicles and work stations, thus providing flexibility and reconfigurability for the whole manufacturing system. Ergonomic and safety aspects were also considered in the design of the AGV. A comprehensive safety system that is compliant with industrial standards was implemented.

  1. A data-driven multiplicative fault diagnosis approach for automation processes.

    Science.gov (United States)

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.

  2. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    Science.gov (United States)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  3. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    Science.gov (United States)

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  4. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    Directory of Open Access Journals (Sweden)

    Markus Niklasson

    2015-01-01

    Full Text Available The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  5. A Machine Learning Approach to Automated Gait Analysis for the Noldus Catwalk System.

    Science.gov (United States)

    Frohlich, Holger; Claes, Kasper; De Wolf, Catherine; Van Damme, Xavier; Michel, Anne

    2018-05-01

    Gait analysis of animal disease models can provide valuable insights into in vivo compound effects and thus help in preclinical drug development. The purpose of this paper is to establish a computational gait analysis approach for the Noldus Catwalk system, in which footprints are automatically captured and stored. We present a, to our knowledge, first machine-learning-based approach for the Catwalk system, which comprises step decomposition, definition and extraction of meaningful features, multivariate step sequence alignment, feature selection, and training of different classifiers (gradient boosting machine, random forest, and elastic net). Using animal-wise leave-one-out cross-validation we demonstrate that with our method we can reliably separate movement patterns of a putative Parkinson's disease animal model and several control groups. Furthermore, we show that we can predict the time point after, and the type of, different brain lesions, and can even forecast the brain region where the intervention was applied. We provide an in-depth analysis of the features involved in our classifiers via statistical techniques for model interpretation. A machine learning method for automated analysis of data from the Noldus Catwalk system was established. Our work shows the ability of machine learning to discriminate pharmacologically relevant animal groups based on their walking behavior in a multivariate manner. Further interesting aspects of the approach include the ability to learn from past experiments, improve as more data arrive, and make predictions for single animals in future studies.
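    The animal-wise leave-one-out evaluation maps directly onto grouped cross-validation in scikit-learn; in the sketch below, feature extraction from the Catwalk runs is assumed to have been done already and the classifier settings are illustrative.

```python
# Leave-one-animal-out evaluation of a gradient boosting classifier.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def evaluate_gait_classifier(X, y, animal_ids):
    """X: (n_runs, n_features) gait features; y: group labels per run;
    animal_ids: one identifier per run so each animal is held out whole."""
    clf = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05)
    scores = cross_val_score(clf, X, y, groups=animal_ids, cv=LeaveOneGroupOut())
    return np.mean(scores), np.std(scores)
```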

  6. Streamlining of the Decontamination and Demolition Document Preparation Process

    International Nuclear Information System (INIS)

    Durand, Nick; Meincke, Carol; Peek, Georgianne

    1999-01-01

    During the past five years, the Sandia National Laboratories Decontamination, Decommissioning, Demolition, and Reuse (D3R) Program has evolved and become more focused and efficient. Historical approaches to project documentation, requirements, and drivers are discussed, detailing key assumptions, oversight authority, and project approvals. Discussion of efforts to streamline the D3R project planning and preparation process includes the incorporation of the principles of graded approach, Total Quality Management, and the Observational Method (CH2MHILL April 1989) [1]. Process improvements were realized by clearly defining regulatory requirements for each phase of a project, establishing general guidance for the program and combining project-specific documents to eliminate redundant and unneeded information. Process improvements to cost, schedule, and quality are discussed in detail for several projects.

  7. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    Directory of Open Access Journals (Sweden)

    Mutlu Ozdogan

    Full Text Available In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: (i) creating masks for water, non-forested areas, clouds, and cloud shadows; (ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; (iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, (iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for
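    Step (ii), the automated extraction of training pixels from local windows of the SWIR difference image, can be sketched as follows; the window size, the sigma cut-offs and the label encoding are illustrative assumptions.

```python
# Automated training-label extraction from a SWIR difference image.
import numpy as np

def auto_training_labels(swir_t1, swir_t2, forest_mask, window=255, k=2.0):
    diff = swir_t2.astype(float) - swir_t1.astype(float)
    labels = np.zeros(diff.shape, dtype=np.int8)         # 0 = unlabeled
    for r in range(0, diff.shape[0], window):
        for c in range(0, diff.shape[1], window):
            block = diff[r:r + window, c:c + window]
            valid = forest_mask[r:r + window, c:c + window]
            if valid.sum() == 0:
                continue
            mu, sigma = block[valid].mean(), block[valid].std()
            sub = labels[r:r + window, c:c + window]      # view into labels
            sub[valid & (block > mu + k * sigma)] = 1             # disturbance
            sub[valid & (np.abs(block - mu) < 0.5 * sigma)] = 2   # stable forest
    return labels
```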

  8. Automated EEG sleep staging in the term-age baby using a generative modelling approach

    Science.gov (United States)

    Pillay, Kirubin; Dereymaeker, Anneleen; Jansen, Katrien; Naulaers, Gunnar; Van Huffel, Sabine; De Vos, Maarten

    2018-06-01

    Objective. We develop a method for automated four-state sleep classification of preterm and term-born babies at a term-age of 38-40 weeks postmenstrual age (the age since the last menstrual cycle of the mother) using multichannel electroencephalogram (EEG) recordings. At this critical age, EEG differentiates from the broader quiet sleep (QS) and active sleep (AS) stages into four, more complex states, and the quality and timing of this differentiation is indicative of the level of brain development. However, existing methods for automated sleep classification remain focussed only on QS and AS classification. Approach. EEG features were calculated from 16 EEG recordings, in 30 s epochs, and personalized feature scaling was used to correct for some of the inter-recording variability, by standardizing each recording's feature data using its mean and standard deviation. Hidden Markov models (HMMs) and Gaussian mixture models (GMMs) were trained, with the HMM incorporating knowledge of the sleep state transition probabilities. The performance of the GMM and HMM (with and without scaling) was compared, and Cohen's kappa agreement was calculated between the estimates and the clinicians' visual labels. Main results. For four-state classification, the HMM proved superior to the GMM. With the inclusion of personalized feature scaling, mean kappa (±standard deviation) was 0.62 (±0.16) compared to the GMM value of 0.55 (±0.15). Without feature scaling, kappas for the HMM and GMM dropped to 0.56 (±0.18) and 0.51 (±0.15), respectively. Significance. This is the first study to present a successful method for the automated staging of four states in term-age sleep using multichannel EEG. Results suggested a benefit in incorporating transition information using an HMM, and correcting for inter-recording variability through personalized feature scaling. The timing and quality of these states are indicative of developmental delays in both preterm and term-born babies that may
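    Two ingredients highlighted above, per-recording feature scaling and a four-state hidden Markov decode, can be sketched as follows; hmmlearn's GaussianHMM is used here as a generic stand-in, and the paper's supervised estimation of transition and emission models is not reproduced.

```python
# Personalized feature scaling followed by a 4-state HMM decode of 30 s epochs.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def personalize(features):
    # Standardise each recording by its own mean and standard deviation.
    return (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-9)

def decode_sleep_states(features, n_states=4):
    X = personalize(features)                  # (n_epochs, n_features)
    model = GaussianHMM(n_components=n_states, covariance_type="full", n_iter=50)
    model.fit(X)
    return model.predict(X)                    # one state label per epoch
```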

  9. Lightroom 5 streamlining your digital photography process

    CERN Document Server

    Sylvan, Rob

    2014-01-01

    Manage your images with Lightroom and this beautifully illustrated guide. Image management can soak up huge amounts of a photographer's time, but help is on hand. This complete guide teaches you how to use Adobe Lightroom 5 to import, manage, edit, and showcase large quantities of images with impressive results. The authors, both professional photographers and Lightroom experts, walk you through step by step, demonstrating real-world techniques as well as a variety of practical tips, tricks, and shortcuts that save you time. Streamline image management tasks like a pro, and get back to doing

  10. Automated, high accuracy classification of Parkinsonian disorders: a pattern recognition approach.

    Directory of Open Access Journals (Sweden)

    Andre F Marquand

    Full Text Available Progressive supranuclear palsy (PSP), multiple system atrophy (MSA) and idiopathic Parkinson's disease (IPD) can be clinically indistinguishable, especially in the early stages, despite distinct patterns of molecular pathology. Structural neuroimaging holds promise for providing objective biomarkers for discriminating these diseases at the single subject level but all studies to date have reported incomplete separation of disease groups. In this study, we employed multi-class pattern recognition to assess the value of anatomical patterns derived from a widely available structural neuroimaging sequence for automated classification of these disorders. To achieve this, 17 patients with PSP, 14 with IPD and 19 with MSA were scanned using structural MRI along with 19 healthy controls (HCs). An advanced probabilistic pattern recognition approach was employed to evaluate the diagnostic value of several pre-defined anatomical patterns for discriminating the disorders, including: (i) a subcortical motor network; (ii) each of its component regions and (iii) the whole brain. All disease groups could be discriminated simultaneously with high accuracy using the subcortical motor network. The region providing the most accurate predictions overall was the midbrain/brainstem, which discriminated all disease groups from one another and from HCs. The subcortical network also produced more accurate predictions than the whole brain and all of its constituent regions. PSP was accurately predicted from the midbrain/brainstem, cerebellum and all basal ganglia compartments; MSA from the midbrain/brainstem and cerebellum and IPD from the midbrain/brainstem only. This study demonstrates that automated analysis of structural MRI can accurately predict diagnosis in individual patients with Parkinsonian disorders, and identifies distinct patterns of regional atrophy particularly useful for this process.

  11. Are the new automated methods for bone age estimation advantageous over the manual approaches?

    Science.gov (United States)

    De Sanctis, Vincenzo; Soliman, Ashraf T; Di Maio, Salvatore; Bedair, Said

    2014-12-01

    Bone Age Assessment (BAA) is performed worldwide for the evaluation of endocrine, genetic and chronic diseases, to monitor response to medical therapy and to determine the growth potential of children and adolescents. It is also used for consultation in planning orthopedic procedures, for determination of chronological age for adopted children, for youth sports participation and in forensic settings. The main clinical methods for skeletal bone age estimation are the Greulich and Pyle (GP) and the Tanner and Whitehouse (TW) methods. Seventy-six per cent (76%) of radiologists or pediatricians usually use the GP method, 20% the TW method and 4% other methods. The advantages of using the TW method, as opposed to the GP method, are that it overcomes the subjectivity problem and its results are more reproducible. However, it is complex and time consuming; for this reason its usage is only about 20% on a world-wide scale. Moreover, there is some evidence that bone age assignments by different physicians can differ significantly. Computerized and Quantitative Ultrasound Technologies (QUS) for assessing skeletal maturity have been developed with the aim of reducing many of the inconsistencies associated with radiographic investigations. In spite of the fact that the number of automated methods for BAA has increased, the majority of them are still in an early phase of development. QUS is comparable to the GP-based method, but there are not yet enough established data for the healthy population. The authors wish to stimulate attention on the accuracy, reliability and consistency of BAA and to initiate a debate on manual versus automated approaches to enhance our assessment of skeletal maturation in children and adolescents.

  12. Assessment of tobacco smoke effects on neonatal cardiorespiratory control using a semi-automated processing approach.

    Science.gov (United States)

    Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy

    2018-05-10

    A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) on the cardiorespiratory control of newborn lambs. The system consists of several steps beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiograph and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The unique results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early life challenges, such as prematurity or viral infection.

  13. A Generic Deep-Learning-Based Approach for Automated Surface Inspection.

    Science.gov (United States)

    Ren, Ruoxu; Hung, Terence; Tan, Kay Chen

    2018-03-01

    Automated surface inspection (ASI) is a challenging task in industry, as collecting a training dataset is usually costly and related methods are highly dataset-dependent. In this paper, a generic approach that requires little training data for ASI is proposed. First, this approach builds a classifier on the features of image patches, where the features are transferred from a pretrained deep learning network. Next, pixel-wise prediction is obtained by convolving the trained classifier over the input image. An experiment on three public data sets and one industrial data set is carried out. The experiment involves two tasks: 1) image classification and 2) defect segmentation. The results of the proposed algorithm are compared against several of the best benchmarks in the literature. In the classification tasks, the proposed method improves accuracy by 0.66%-25.50%. In the segmentation tasks, the proposed method reduces error escape rates by 6.00%-19.00% for three defect types and improves accuracies by 2.29%-9.86% for all seven defect types. In addition, the proposed method achieves a 0.0% error escape rate in the segmentation task on the industrial data.
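    The transfer-feature idea (patch features from a pretrained CNN feeding a light classifier, later convolved over the image for pixel-wise prediction) can be sketched as below; ResNet-18 and the SVM are assumed stand-ins, not the paper's actual backbone and classifier.

```python
# Patch classification on features transferred from a pretrained CNN.
import torch
import torchvision.models as models
from sklearn.svm import SVC

backbone = models.resnet18(pretrained=True)   # on newer torchvision: weights="DEFAULT"
backbone.fc = torch.nn.Identity()             # keep the 512-d pooled features
backbone.eval()

@torch.no_grad()
def patch_features(patches):                  # patches: (N, 3, 224, 224) float tensor
    return backbone(patches).numpy()

def train_patch_classifier(train_patches, labels):
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(patch_features(train_patches), labels)
    return clf
```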

  14. Streamlining environmental product declarations: a stage model

    Science.gov (United States)

    Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael

    2001-02-01

    General public environmental awareness and education are increasing, stimulating the demand for reliable, objective and comparable information about products' environmental performance. The recently published standard series ISO 14040 and ISO 14025 normalize the preparation of Environmental Product Declarations (EPDs), which contain comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations (mostly from Europe) have experimented with the preparation of EPDs, demonstrating its great potential as a marketing weapon. However, the preparation of EPDs is a complex process, requiring collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In the foreseeable future, streamlining of the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) to make them capable of gathering and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is twofold: first, to propose a transitional model towards green supply chain management and EPD preparation; second, to identify key technologies and methodologies that allow streamlining of the EPD process and, subsequently, the transition toward sustainable product development

  15. Streamlining cardiovascular clinical trials to improve efficiency and generalisability.

    Science.gov (United States)

    Zannad, Faiez; Pfeffer, Marc A; Bhatt, Deepak L; Bonds, Denise E; Borer, Jeffrey S; Calvo-Rojas, Gonzalo; Fiore, Louis; Lund, Lars H; Madigan, David; Maggioni, Aldo Pietro; Meyers, Catherine M; Rosenberg, Yves; Simon, Tabassome; Stough, Wendy Gattis; Zalewski, Andrew; Zariffa, Nevine; Temple, Robert

    2017-08-01

    Controlled trials provide the most valid determination of the efficacy and safety of an intervention, but large cardiovascular clinical trials have become extremely costly and complex, making it difficult to study many important clinical questions. A critical question, and the main objective of this review, is how trials might be simplified while maintaining randomisation to preserve scientific integrity and unbiased efficacy assessments. Experience with alternative approaches is accumulating, specifically with registry-based randomised controlled trials that make use of data already collected. This approach addresses bias concerns while still capitalising on the benefits and efficiencies of a registry. Several completed or ongoing trials illustrate the feasibility of using registry-based controlled trials to answer important questions relevant to daily clinical practice. Randomised trials within healthcare organisation databases may also represent streamlined solutions for some types of investigations, although data quality (endpoint assessment) is likely to be a greater concern in those settings. These approaches are not without challenges, and issues pertaining to informed consent, blinding, data quality and regulatory standards remain to be fully explored. Collaboration among stakeholders is necessary to achieve standards for data management and analysis, to validate large data sources for use in randomised trials, and to re-evaluate ethical standards to encourage research while also ensuring that patients are protected. The rapidly evolving efforts to streamline cardiovascular clinical trials have the potential to lead to major advances in promoting better care and outcomes for patients with cardiovascular disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects have pushed forward the development of automated crystallization platforms that are now commonly used. This created an urgent need for adapted and automated equipment for crystal analysis. However, these crystals first have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam represents the bottleneck in the analysis process. In this article, a new method to accelerate this process is presented, based on accurately recording the local geometry coordinates of each crystal in the crystallization plate. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for automated crystal centering in the beam, in situ screening or data collection. Here the preliminary results of such a semi-automated pipeline are reported for two distinct test proteins. (authors)

  17. Improving automated multiple sclerosis lesion segmentation with a cascaded 3D convolutional neural network approach.

    Science.gov (United States)

    Valverde, Sergi; Cabezas, Mariano; Roura, Eloy; González-Villà, Sandra; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Rovira, Àlex; Oliver, Arnau; Lladó, Xavier

    2017-07-15

    In this paper, we present a novel automated method for White Matter (WM) lesion segmentation of Multiple Sclerosis (MS) patient images. Our approach is based on a cascade of two 3D patch-wise convolutional neural networks (CNNs). The first network is trained to be highly sensitive, revealing possible candidate lesion voxels, while the second network is trained to reduce the number of misclassified voxels coming from the first network. This cascaded CNN architecture tends to learn well from a small (n≤35) set of labeled data of the same MRI contrast, which can be very interesting in practice, given the difficulty of obtaining manual label annotations and the large amount of available unlabeled Magnetic Resonance Imaging (MRI) data. We evaluate the accuracy of the proposed method on the public MS lesion segmentation challenge MICCAI2008 dataset, comparing it with other state-of-the-art MS lesion segmentation tools. Furthermore, the proposed method is also evaluated on two private MS clinical datasets, where its performance is compared with different recent publicly available state-of-the-art MS lesion segmentation methods. At the time of writing this paper, our method is the best ranked approach on the MICCAI2008 challenge, outperforming the rest of the 60 participant methods when using all the available input modalities (T1-w, T2-w and FLAIR), while still in the top rank (3rd position) when using only T1-w and FLAIR modalities. On clinical MS data, our approach exhibits a significant increase in the accuracy of WM lesion segmentation when compared with the rest of the evaluated methods, correlating highly (r≥0.97) with the expected lesion volume. Copyright © 2017 Elsevier Inc. All rights reserved.
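    A heavily simplified cascade sketch in PyTorch follows (a toy 3D patch network and an assumed candidate threshold, not the authors' architecture or training procedure): the first, sensitive network proposes candidate voxels and the second re-classifies only those candidates.

```python
# Toy two-stage cascade of 3D patch-wise CNNs for candidate/refinement scoring.
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    def __init__(self, in_channels=3):          # e.g. T1-w, T2-w, FLAIR patches
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 32, 3), nn.ReLU(),
            nn.Conv3d(32, 64, 3), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(64, 2),
        )

    def forward(self, x):                       # x: (N, C, 11, 11, 11) patches
        return self.net(x)

def cascade_predict(stage1, stage2, patches, threshold=0.5):
    # Stage 1 keeps anything remotely lesion-like; stage 2 prunes false positives.
    p1 = torch.softmax(stage1(patches), dim=1)[:, 1]
    keep = p1 > threshold
    p2 = torch.zeros_like(p1)
    if keep.any():
        p2[keep] = torch.softmax(stage2(patches[keep]), dim=1)[:, 1]
    return p2
```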

  18. The systems approach for applying artificial intelligence to space station automation (Invited Paper)

    Science.gov (United States)

    Grose, Vernon L.

    1985-12-01

    The progress of technology is marked by fragmentation -- dividing research and development into ever narrower fields of specialization. Ultimately, specialists know everything about nothing. And hope for integrating those slender slivers of specialty into a whole fades. Without an integrated, all-encompassing perspective, technology becomes applied in a lopsided and often inefficient manner. A decisionary model, developed and applied for NASA's Chief Engineer toward establishment of commercial space operations, can be adapted to the identification, evaluation, and selection of optimum application of artificial intelligence for space station automation -- restoring wholeness to a situation that is otherwise chaotic due to increasing subdivision of effort. Issues such as functional assignments for space station task, domain, and symptom modules can be resolved in a manner understood by all parties rather than just the person with assigned responsibility -- and ranked by overall significance to mission accomplishment. Ranking is based on the three basic parameters of cost, performance, and schedule. This approach has successfully integrated many diverse specialties in situations like worldwide terrorism control, coal mining safety, medical malpractice risk, grain elevator explosion prevention, offshore drilling hazards, and criminal justice resource allocation -- all of which would have otherwise been subject to "squeaky wheel" emphasis and support of decision-makers.
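
    A toy Python sketch of the ranking idea mentioned above: candidate automation functions are scored against cost, performance and schedule and ordered by a combined significance score. The candidate names, scores and weights are illustrative assumptions, not values from the decisionary model.

      # Candidate functions scored 1-10 on each parameter (illustrative data).
      candidates = {
          "fault diagnosis":      {"cost": 6, "performance": 9, "schedule": 7},
          "inventory management": {"cost": 8, "performance": 5, "schedule": 9},
          "attitude control aid": {"cost": 4, "performance": 8, "schedule": 5},
      }
      weights = {"cost": 0.3, "performance": 0.5, "schedule": 0.2}   # assumed weighting

      def score(c):
          return sum(weights[k] * v for k, v in c.items())

      for name, c in sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True):
          print(f"{name:22s} overall significance = {score(c):.2f}")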

  19. Holistic approach for automated background EEG assessment in asphyxiated full-term infants

    Science.gov (United States)

    Matic, Vladimir; Cherian, Perumpillichira J.; Koolen, Ninah; Naulaers, Gunnar; Swarte, Renate M.; Govaert, Paul; Van Huffel, Sabine; De Vos, Maarten

    2014-12-01

    Objective. To develop an automated algorithm to quantify background EEG abnormalities in full-term neonates with hypoxic ischemic encephalopathy. Approach. The algorithm classifies 1 h of continuous neonatal EEG (cEEG) into a mild, moderate or severe background abnormality grade. These classes are well established in the literature and a clinical neurophysiologist labeled 272 1 h cEEG epochs selected from 34 neonates. The algorithm is based on adaptive EEG segmentation and mapping of the segments into the so-called segments’ feature space. Three features are suggested and further processing is obtained using a discretized three-dimensional distribution of the segments’ features represented as a 3-way data tensor. Further classification has been achieved using recently developed tensor decomposition/classification methods that reduce the size of the model and extract a significant and discriminative set of features. Main results. Effective parameterization of cEEG data has been achieved resulting in high classification accuracy (89%) to grade background EEG abnormalities. Significance. For the first time, the algorithm for the background EEG assessment has been validated on an extensive dataset which contained major artifacts and epileptic seizures. The demonstrated high robustness, while processing real-case EEGs, suggests that the algorithm can be used as an assistive tool to monitor the severity of hypoxic insults in newborns.
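
    A minimal numpy sketch of the data representation described above: each adaptive EEG segment contributes one point in a three-dimensional feature space, and the points are discretised into a 3-way tensor of counts. The feature choices, ranges and bin counts are assumptions; the subsequent tensor decomposition and classification are not reproduced here.

      import numpy as np

      # One row per adaptive EEG segment: three illustrative features
      # (e.g. segment duration, amplitude range, dominant frequency).
      rng = np.random.default_rng(0)
      segment_features = rng.random((500, 3)) * [2.0, 120.0, 30.0]

      # Discretize the 3-D feature space into an 8 x 8 x 8 tensor of counts.
      tensor, edges = np.histogramdd(
          segment_features,
          bins=(8, 8, 8),
          range=[(0, 2.0), (0, 120.0), (0, 30.0)],
      )
      tensor /= tensor.sum()      # normalized 3-way distribution for one 1-h epoch
      print(tensor.shape)         # (8, 8, 8): one such tensor per EEG epoch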

  20. DEF: an automated dead-end filling approach based on quasi-endosymbiosis.

    Science.gov (United States)

    Liu, Lili; Zhang, Zijun; Sheng, Taotao; Chen, Ming

    2017-02-01

    Gap filling in the reconstruction of metabolic networks aims to restore the connectivity of metabolites by finding high-confidence reactions that may be missing in the target organism. Current methods for gap filling either rely solely on network topology or have limited capability in finding missing reactions that are indirectly related to dead-end metabolites but of biological importance to the target model. We present an automated dead-end filling (DEF) approach, which draws on endosymbiosis theory, to fill gaps by finding the most efficient dead-end utilization paths in a constructed quasi-endosymbiosis model. The recalls of reactions and dead ends of DEF reach around 73% and 86%, respectively. This method is capable of finding indirectly dead-end-related reactions of biological importance for the target organism and is applicable to any given metabolic model. In the E. coli iJR904 model, for instance, about 42% of the dead-end metabolites were fixed by our proposed method. DEF is publicly available at http://bis.zju.edu.cn/DEF/. Contact: mchen@zju.edu.cn. Supplementary data are available at Bioinformatics online.
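
    The starting point of any gap-filling method, including DEF, is the list of dead-end metabolites. The illustrative numpy sketch below flags, in a toy stoichiometric matrix, metabolites that are only produced or only consumed (reaction reversibility is ignored for simplicity); the matrix itself is an assumption, not a model from the paper.

      import numpy as np

      # Toy stoichiometric matrix S (metabolites x reactions): negative = consumed,
      # positive = produced. Reversibility is ignored in this simplified sketch.
      S = np.array([
          [-1,  0,  0],   # A: only consumed          -> dead end
          [ 1, -1,  0],   # B: produced and consumed  -> connected
          [ 0,  1,  0],   # C: only produced          -> dead end
          [ 0,  0,  1],   # D: only produced          -> dead end
      ])
      metabolites = ["A", "B", "C", "D"]

      produced = (S > 0).any(axis=1)
      consumed = (S < 0).any(axis=1)
      dead_ends = [m for m, p, c in zip(metabolites, produced, consumed) if p != c]
      print(dead_ends)   # ['A', 'C', 'D'], the candidates for gap filling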

  1. A machine learning approach for automated assessment of retinal vasculature in the oxygen induced retinopathy model.

    Science.gov (United States)

    Mazzaferri, Javier; Larrivée, Bruno; Cakir, Bertan; Sapieha, Przemyslaw; Costantino, Santiago

    2018-03-02

    Preclinical studies of vascular retinal diseases rely on the assessment of developmental dystrophies in the oxygen induced retinopathy rodent model. The quantification of vessel tufts and avascular regions is typically computed manually from flat mounted retinas imaged using fluorescent probes that highlight the vascular network. Such manual measurements are time-consuming and hampered by user variability and bias, thus a rapid and objective method is needed. Here, we introduce a machine learning approach to segment and characterize vascular tufts, delineate the whole vasculature network, and identify and analyze avascular regions. Our quantitative retinal vascular assessment (QuRVA) technique uses a simple machine learning method and morphological analysis to provide reliable computations of vascular density and pathological vascular tuft regions, devoid of user intervention within seconds. We demonstrate the high degree of error and variability of manual segmentations, and designed, coded, and implemented a set of algorithms to perform this task in a fully automated manner. We benchmark and validate the results of our analysis pipeline using the consensus of several manually curated segmentations using commonly used computer tools. The source code of our implementation is released under version 3 of the GNU General Public License ( https://www.mathworks.com/matlabcentral/fileexchange/65699-javimazzaf-qurva ).

  2. Original Approach for Automated Quantification of Antinuclear Autoantibodies by Indirect Immunofluorescence

    Directory of Open Access Journals (Sweden)

    Daniel Bertin

    2013-01-01

    Full Text Available Introduction. Indirect immunofluorescence (IIF) is the gold standard method for the detection of antinuclear antibodies (ANA), which are essential markers for the diagnosis of systemic autoimmune rheumatic diseases. For the discrimination of positive and negative samples, we propose here an original approach named Immunofluorescence for Computed Antinuclear antibody Rational Evaluation (ICARE), based on the calculation of a fluorescence index (FI). Methods. We made a comparison between FI and visual evaluations on 237 consecutive samples and on a cohort of 25 patients with SLE. Results. We obtained very good technical performance of FI (95% sensitivity, 98% specificity, and a kappa of 0.92), even in a subgroup of weakly positive samples. A significant correlation between quantification of FI and IIF ANA titers was found (Spearman's ρ=0.80, P<0.0001). Clinical performance of ICARE was validated on a cohort of patients with SLE, corroborating the fact that FI could represent an attractive alternative for the evaluation of antibody titer. Conclusion. Our results represent a major step for automated quantification of IIF ANA, opening attractive perspectives such as rapid sample screening and laboratory standardization.
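
    A small hedged sketch of the kind of fluorescence index (FI) computation that underlies ICARE: the exact FI definition used by the authors is not reproduced here, so this simply ratios the mean fluorescence of a sample image against that of a negative control and applies a positivity cut-off. The images and the 1.5 cut-off are simulated assumptions.

      import numpy as np

      def fluorescence_index(sample_img, control_img, cutoff=1.5):
          """Illustrative FI: mean sample fluorescence / mean negative-control fluorescence."""
          fi = float(np.mean(sample_img)) / max(float(np.mean(control_img)), 1e-9)
          return fi, fi >= cutoff      # (index, positive/negative call)

      rng = np.random.default_rng(1)
      control = rng.normal(50, 5, (512, 512)).clip(0, 255)     # simulated IIF images
      sample = rng.normal(120, 20, (512, 512)).clip(0, 255)
      print(fluorescence_index(sample, control))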

  3. Quantum mechanical streamlines. I - Square potential barrier

    Science.gov (United States)

    Hirschfelder, J. O.; Christoph, A. C.; Palke, W. E.

    1974-01-01

    Exact numerical calculations are made for scattering of quantum mechanical particles hitting a square two-dimensional potential barrier (an exact analog of the Goos-Haenchen optical experiments). Quantum mechanical streamlines are plotted and found to be smooth and continuous, to have continuous first derivatives even through the classical forbidden region, and to form quantized vortices around each of the nodal points. A comparison is made between the present numerical calculations and the stationary wave approximation, and good agreement is found between both the Goos-Haenchen shifts and the reflection coefficients. The time-independent Schroedinger equation for real wavefunctions is reduced to solving a nonlinear first-order partial differential equation, leading to a generalization of the Prager-Hirschfelder perturbation scheme. Implications of the hydrodynamical formulation of quantum mechanics are discussed, and cases are cited where quantum and classical mechanical motions are identical.

  4. Automated mitosis detection using texture, SIFT features and HMAX biologically inspired approach.

    Science.gov (United States)

    Irshad, Humayun; Jalali, Sepehr; Roux, Ludovic; Racoceanu, Daniel; Hwee, Lim Joo; Naour, Gilles Le; Capron, Frédérique

    2013-01-01

    According to Nottingham grading system, mitosis count in breast cancer histopathology is one of three components required for cancer grading and prognosis. Manual counting of mitosis is tedious and subject to considerable inter- and intra-reader variations. The aim is to investigate the various texture features and Hierarchical Model and X (HMAX) biologically inspired approach for mitosis detection using machine-learning techniques. We propose an approach that assists pathologists in automated mitosis detection and counting. The proposed method, which is based on the most favorable texture features combination, examines the separability between different channels of color space. Blue-ratio channel provides more discriminative information for mitosis detection in histopathological images. Co-occurrence features, run-length features, and Scale-invariant feature transform (SIFT) features were extracted and used in the classification of mitosis. Finally, a classification is performed to put the candidate patch either in the mitosis class or in the non-mitosis class. Three different classifiers have been evaluated: Decision tree, linear kernel Support Vector Machine (SVM), and non-linear kernel SVM. We also evaluate the performance of the proposed framework using the modified biologically inspired model of HMAX and compare the results with other feature extraction methods such as dense SIFT. The proposed method has been tested on Mitosis detection in breast cancer histological images (MITOS) dataset provided for an International Conference on Pattern Recognition (ICPR) 2012 contest. The proposed framework achieved 76% recall, 75% precision and 76% F-measure. Different frameworks for classification have been evaluated for mitosis detection. In future work, instead of regions, we intend to compute features on the results of mitosis contour segmentation and use them to improve detection and classification rate.
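
    The blue-ratio transform mentioned above can be written in a few lines of numpy; the formula below is one common definition from the mitosis-detection literature and is an assumption here, as the paper may use a slightly different variant.

      import numpy as np

      def blue_ratio(rgb):
          """Blue-ratio channel (one common definition; assumed, not taken from the paper)."""
          rgb = rgb.astype(np.float64)
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          return (100.0 * b / (1.0 + r + g)) * (256.0 / (1.0 + r + g + b))

      # Candidate patches would be extracted from this channel and described with
      # co-occurrence, run-length and SIFT features before SVM classification.
      patch = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in H&E patch
      br = blue_ratio(patch)
      print(br.shape, float(br.max()))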

  5. Automated mitosis detection using texture, SIFT features and HMAX biologically inspired approach

    Directory of Open Access Journals (Sweden)

    Humayun Irshad

    2013-01-01

    Full Text Available Context: According to Nottingham grading system, mitosis count in breast cancer histopathology is one of three components required for cancer grading and prognosis. Manual counting of mitosis is tedious and subject to considerable inter- and intra-reader variations. Aims: The aim is to investigate the various texture features and Hierarchical Model and X (HMAX) biologically inspired approach for mitosis detection using machine-learning techniques. Materials and Methods: We propose an approach that assists pathologists in automated mitosis detection and counting. The proposed method, which is based on the most favorable texture features combination, examines the separability between different channels of color space. Blue-ratio channel provides more discriminative information for mitosis detection in histopathological images. Co-occurrence features, run-length features, and Scale-invariant feature transform (SIFT) features were extracted and used in the classification of mitosis. Finally, a classification is performed to put the candidate patch either in the mitosis class or in the non-mitosis class. Three different classifiers have been evaluated: Decision tree, linear kernel Support Vector Machine (SVM), and non-linear kernel SVM. We also evaluate the performance of the proposed framework using the modified biologically inspired model of HMAX and compare the results with other feature extraction methods such as dense SIFT. Results: The proposed method has been tested on Mitosis detection in breast cancer histological images (MITOS) dataset provided for an International Conference on Pattern Recognition (ICPR) 2012 contest. The proposed framework achieved 76% recall, 75% precision and 76% F-measure. Conclusions: Different frameworks for classification have been evaluated for mitosis detection. In future work, instead of regions, we intend to compute features on the results of mitosis contour segmentation and use them to improve detection and classification rate.

  6. A streamlined failure mode and effects analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD 21287 (United States)

    2014-06-15

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
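
    The scoring arithmetic behind the analysis is compact enough to sketch: each failure mode receives occurrence, severity and detectability scores, the risk priority number is their product, and modes with RPN > 150 are flagged for intervention. The failure-mode names echo the text, but the individual scores below are illustrative assumptions.

      failure_modes = {
          # name: (occurrence, severity, detectability) on 1-10 scales (illustrative scores)
          "delay in film check":               (6, 5, 6),
          "missing pacemaker consent":         (4, 9, 5),
          "critical structure not contoured":  (3, 9, 7),
          "wrong patient chart pulled":        (2, 8, 3),
      }

      def rpn(o, s, d):
          return o * s * d            # standard FMEA risk priority number

      ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
      for name, scores in ranked:
          value = rpn(*scores)
          flag = "intervene" if value > 150 else "monitor"
          print(f"{name:35s} RPN = {value:4d}  -> {flag}")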

  7. Streamlining the license renewal review process

    International Nuclear Information System (INIS)

    Dozier, J.; Lee, S.; Kuo, P.T.

    2001-01-01

    The staff of the NRC has been developing three regulatory guidance documents for license renewal: the Generic Aging Lessons Learned (GALL) report, Standard Review Plan for License Renewal (SRP-LR), and Regulatory Guide (RG) for Standard Format and Content for Applications to Renew Nuclear Power Plant Operating Licenses. These documents are designed to streamline the license renewal review process by providing clear guidance for license renewal applicants and the NRC staff in preparing and reviewing license renewal applications. The GALL report systematically catalogs aging effects on structures and components; identifies the relevant existing plant programs; and evaluates the existing programs against the attributes considered necessary for an aging management program to be acceptable for license renewal. The GALL report also provides guidance for the augmentation of existing plant programs for license renewal. The revised SRP-LR allows an applicant to reference the GALL report to preclude further NRC staff evaluation if the plant's existing programs meet the criteria described in the GALL report. During the review process, the NRC staff will focus primarily on existing programs that should be augmented or new programs developed specifically for license renewal. The Regulatory Guide is expected to endorse the Nuclear Energy Institute (NEI) guideline, NEI 95-10, Revision 2, entitled 'Industry Guideline for Implementing the Requirements of 10 CFR Part 54 - The License Renewal Rule', which provides guidance for preparing a license renewal application. This paper will provide an introduction to the GALL report, SRP-LR, Regulatory Guide, and NEI 95-10 to show how these documents are interrelated and how they will be used to streamline the license renewal review process. This topic will be of interest to domestic power utilities considering license renewal and international ICONE participants seeking state-of-the-art information about license renewal in the United States

  8. A streamlined failure mode and effects analysis

    International Nuclear Information System (INIS)

    Ford, Eric C.; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-01-01

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.

  9. A streamlined failure mode and effects analysis.

    Science.gov (United States)

    Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-06-01

    Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.

  10. Validation of an automated surveillance approach for drain-related meningitis : A multicenter study

    NARCIS (Netherlands)

    Van Mourik, Maaike S M; Troelstra, Annet; Van Der Sprenkel, Jan Willem Berkelbach; Van Der Jagt-Zwetsloot, Marischka C E; Nelson, Jolande H.; Vos, Piet; Arts, Mark P.; Dennesen, Paul J W; Moons, K. (Carl) G.M.; Bonten, Marc J M

    2015-01-01

    Objective. Manual surveillance of healthcare-associated infections is cumbersome and vulnerable to subjective interpretation. Automated systems are under development to improve efficiency and reliability of surveillance, for example by selecting high-risk patients requiring manual chart review. In

  11. The standard laboratory module approach to automation of the chemical laboratory

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.H.

    1993-01-01

    Automation of the technology and practice of the environmental laboratory has not been as rapid or complete as one might expect. Confined to autosamplers and limited robotic systems, our ability to apply production concepts to environmental analysis is not great. With the impending remediation of hazardous waste sites in the US, only the application of production chemistry techniques will even begin to provide those responsible with the knowledge necessary to accomplish the cleanup expeditiously and safely. Tightening regulatory requirements have already mandated staggering increases in sampling and characterization needs, with the future only guaranteeing greater demands. The Contaminant Analysis Automation Program has been initiated by the US government to address these current and future characterization needs through a new robotic paradigm for analytical chemistry. By using standardized modular instruments, named Standard Laboratory Modules, flexible automation systems can rapidly be configured to apply production techniques to the nation's environmental problems on-site

  12. Analysis of Real Time Technical Data Obtained While Shotcreting: An Approach Towards Automation

    OpenAIRE

    Rodríguez, Ángel; Río, Olga

    2010-01-01

    Automation of the shotcreting process is a key factor in both improving working conditions and increasing productivity, as well as in increasing the quality of shotcrete. The confidence in the quality of the automation process itself and of shotcrete linings can be improved by real-time monitoring of pumping as well as other shotcreting-machine-related parameters. Prediction of how the different technical parameters of application govern the whole process has been a subject of increasing ...

  13. Automated design of analog and high-frequency circuits a computational intelligence approach

    CERN Document Server

    Liu, Bo; Fernández, Francisco V

    2014-01-01

    Computational intelligence techniques are becoming more and more important for automated problem solving nowadays. Due to the growing complexity of industrial applications and the increasingly tight time-to-market requirements, the time available for thorough problem analysis and development of tailored solution methods is decreasing. There is no doubt that this trend will continue in the foreseeable future. Hence, it is not surprising that robust and general automated problem solving methods with satisfactory performance are needed.

  14. A Systems Approach to Information Technology (IT) Infrastructure Design for Utility Management Automation Systems

    OpenAIRE

    A. Fereidunian; H. Lesani; C. Lucas; M. Lehtonen; M. M. Nordman

    2006-01-01

    Almost all electric utility companies are planning to improve their management automation systems in order to meet the changing requirements of the new liberalized energy market and to benefit from innovations in information and communication technology (ICT or IT). Architectural design of utility management automation (UMA) systems for their IT-enabling requires proper selection of IT choices for the UMA system, which leads to multi-criteria decision-making (MCDM). In resp...

  15. An Accelerated Testing Approach for Automated Vehicles with Background Traffic Described by Joint Distributions

    OpenAIRE

    Huang, Zhiyuan; Lam, Henry; Zhao, Ding

    2017-01-01

    This paper proposes a new framework based on joint statistical models for evaluating risks of automated vehicles in a naturalistic driving environment. The previous studies on the Accelerated Evaluation for automated vehicles are extended from multi-independent-variate models to joint statistics. The proposed toolkit includes exploration of the rare event (e.g. crash) sets and construction of accelerated distributions for Gaussian Mixture models using Importance Sampling techniques. Furthermo...

  16. A Federated Enterprise Architecture and MBSE Modeling Framework for Integrating Design Automation into a Global PLM Approach

    OpenAIRE

    Vosgien , Thomas; Rigger , Eugen; Schwarz , Martin; Shea , Kristina

    2017-01-01

    Part 1: PLM Maturity, Implementation and Adoption; International audience; PLM and Design Automation (DA) are two interdependent and necessary approaches to increase the performance and efficiency of product development processes. Often, DA systems’ usability suffers due to a lack of integration in industrial business environments stemming from the independent consideration of PLM and DA. This article proposes a methodological and modeling framework for developing and deploying DA solutions w...

  17. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    Science.gov (United States)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

    Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and interpretation have been found to be highly dependent on experts. Therefore, an automated monitoring method is required to reduce the cost and time consumed in the interpretation of AE signals. This paper investigates the application of two of the most common machine learning approaches, namely artificial neural network (ANN) and support vector machine (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study based on the predictive performance of ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor under different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automated diagnosis of the valve condition due to its ability to handle a high number of input features with small training data sets.
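
    An illustrative scikit-learn sketch of the comparison described above: an MLP (the ANN) and an RBF-kernel SVM are trained on the same AE parameter table and compared on held-out accuracy. The synthetic feature matrix stands in for the compressor AE parameters, and the hyperparameters are assumptions.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.neural_network import MLPClassifier
      from sklearn.svm import SVC

      # Synthetic stand-in for AE parameters (amplitude, counts, energy, rise time, ...)
      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 5))
      y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # healthy vs faulty valve (toy labels)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      models = {
          "ANN (MLP)": make_pipeline(StandardScaler(),
                                     MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                                                   random_state=0)),
          "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
      }
      for name, model in models.items():
          model.fit(X_tr, y_tr)
          print(name, "accuracy:", round(model.score(X_te, y_te), 3))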

  18. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  19. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    Science.gov (United States)

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  20. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost effective generation of reliable kinetic models useful for bioconversion process ... -erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 µL scale. The derived kinetic parameters were then verified in a second round of experiments where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original ...

  1. The design of the Comet streamliner: An electric land speed record motorcycle

    Science.gov (United States)

    McMillan, Ethan Alexander

    The development of the land speed record electric motorcycle streamliner, the Comet, is discussed herein. Its design process includes a detailed literature review of past and current motorcycle streamliners in an effort to highlight the main components of such a vehicle's design, while providing baseline data for performance comparisons. A new approach to balancing a streamliner at low speeds is also addressed, a system henceforth referred to as landing gear, which has proven an effective means for allowing the driver to control the low speed instabilities of the vehicle with relative ease compared to traditional designs. This is accompanied by a dynamic stability analysis conducted on a test chassis that was developed for the primary purpose of understanding the handling dynamics of streamliners, while also providing a test bed for the implementation of the landing gear system and a means to familiarize the driver with the operation and handling of such a vehicle. Data gathered through the use of GPS-based velocity tracking, accelerometers, and a linear potentiometer provided a means to validate a dynamic stability analysis of the weave and wobble modes of the vehicle through linearization of a streamliner model developed in the BikeSIM software suite. Results indicate agreement between the experimental data and the simulation, showing that the conventional recumbent design of a streamliner chassis is in fact highly stable throughout the performance envelope beyond extremely low speeds. A computational fluid dynamics study was also performed and used in the development of the body of the Comet, for which a series of tests were conducted in order to develop a shape that was both practical to transport and highly efficient. By creating a hybrid airfoil from a NACA 0018 and NACA 66-018, a drag coefficient of 0.1 and frontal area of 0.44 m2 have been found for the final design. Utilizing a performance model based on the proposed vehicle's motor, its rolling resistance, and
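
    As a rough, hedged check of what the quoted body figures imply, aerodynamic drag force is F = ½ρv²C_dA and the power absorbed by drag is Fv; the sketch below uses the C_d ≈ 0.1 and A = 0.44 m² reported above, while the air density and target speeds are assumed values.

      RHO = 1.2              # air density, kg/m^3 (assumed near-sea-level value)
      CD, AREA = 0.1, 0.44   # drag coefficient and frontal area reported above

      def drag_power_kw(speed_kmh):
          v = speed_kmh / 3.6
          force = 0.5 * RHO * v**2 * CD * AREA     # F = 1/2 rho v^2 Cd A
          return force * v / 1000.0                # power needed to overcome drag, kW

      for s in (150, 250, 350):                    # assumed target speeds, km/h
          print(f"{s} km/h -> {drag_power_kw(s):.1f} kW aerodynamic drag power")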

  2. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach.

    Science.gov (United States)

    Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai

    2017-01-01

    Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on literature review, this is the first study in the development of tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptor, as well as the combination of hybrid descriptors were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested in the myDAUN dataset, Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation dataset on the proposed methods. The results showed that the hybrid of all descriptors of ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and 99
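
    One of the shape-descriptor families listed above, Hu invariant moments, can be computed directly from a binary leaf mask; the scikit-image sketch below uses a toy elliptical mask as a stand-in for a segmented leaf, and the resulting seven-value vector would be concatenated with the other descriptors before classification.

      import numpy as np
      from skimage import measure

      # Toy binary leaf mask (an ellipse); a real pipeline would use segmented leaf images.
      yy, xx = np.mgrid[0:200, 0:200]
      leaf_mask = (((xx - 100) / 80.0) ** 2 + ((yy - 100) / 40.0) ** 2 <= 1).astype(float)

      mu = measure.moments_central(leaf_mask)
      nu = measure.moments_normalized(mu)
      hu = measure.moments_hu(nu)      # 7 translation/scale/rotation-invariant values
      print(hu)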

  3. Framework to Implement Collaborative Robots in Manual Assembly: A Lean Automation Approach

    DEFF Research Database (Denmark)

    Malik, Ali Ahmad; Bilberg, Arne

    The recent proliferation of smart manufacturing technologies has emerged the concept of hybrid automation for assembly systems utilizing the best of humans and robots in a combination. Based on the ability to work alongside human-workers the next generation of industrial robots (or robotics 2...... of virtual simulations is discussed for validation and optimization of human-robot work environment....

  4. Computer-Automated Approach for Scoring Short Essays in an Introductory Statistics Course

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Kang, Hyun Bin; Kim, Kyung; Gao, Mengzhao; Johnson, Glenn; Clariana, Roy; Zhang, Fan

    2018-01-01

    Over two semesters short essay prompts were developed for use with the Graphical Interface for Knowledge Structure (GIKS), an automated essay scoring system. Participants were students in an undergraduate-level online introductory statistics course. The GIKS compares students' writing samples with an expert's to produce keyword occurrence and…

  5. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  6. A systematic engineering tool chain approach for self-organizing building automation systems

    NARCIS (Netherlands)

    Mc Gibney, A.; Rea, S.; Lehmann, M.; Thior, S.; Lesecq, S.; Hendriks, M.; Guyon-Gardeux, C.; Mai, Linh Tuan; Pacull, F.; Ploennigs, J.; Basten, T.; Pesch, D.

    2013-01-01

    There is a strong push towards smart buildings that aim to achieve comfort, safety and energy efficiency, through building automation systems (BAS) that incorporate multiple subsystems such as heating and air-conditioning, lighting, access control etc. The design, commissioning and operation of BAS

  7. NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.

    Science.gov (United States)

    Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J

    2018-03-14

    Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate the structure calculation, and analysis of dynamics and interactions of macromolecules. Recent advancements in handling big data, together with an outburst of machine learning techniques, offer an opportunity to tackle the peak picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually-annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at the levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (a platform for protein NMR data analysis), and is available at https://dumpling.bio/. Contact: michaljerzywalczak@gmail.com; piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.

  8. An Automated Design Approach for High-Lift Systems incorporating Eccentric Beam Actuators

    NARCIS (Netherlands)

    Steenhuizen, D.; Van Tooren, M.J.L.

    2010-01-01

    In order to assess the merit of novel high-lift structural concepts for the design of contemporary and future transport aircraft, a highly automated design routine is elaborated. The structure, purpose and evolution of this design routine are set out with the use of Knowledge-Based Engineering

  9. Results of a multivariate approach to automated oestrus and mastitis detection

    NARCIS (Netherlands)

    Mol, de R.M.; Kroeze, G.H.; Achten, J.M.F.H.; Maatje, K.; Rossing, W.

    1997-01-01

    In modern dairy farming sensors can be used to measure on-line milk yield, milk temperature, electrical conductivity of quarter milk, concentrate intake and the cow's activity. Together with information from the management information system (MIS), the sensor data can be used for the automated

  10. Different approaches to synovial membrane volume determination by magnetic resonance imaging: manual versus automated segmentation

    DEFF Research Database (Denmark)

    Østergaard, Mikkel

    1997-01-01

    Automated fast (5-20 min) synovial membrane volume determination by MRI, based on pre-set post-gadolinium-DTPA enhancement thresholds, was evaluated as a substitute for a time-consuming (45-120 min), previously validated, manual segmentation method. Twenty-nine knees [rheumatoid arthritis (RA) 13...

  11. Fast and accurate approaches for large-scale, automated mapping of food diaries on food composition tables

    DEFF Research Database (Denmark)

    Lamarine, Marc; Hager, Jörg; Saris, Wim H M

    2018-01-01

    the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English...... not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages...... and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We demonstrate that both high precision and recall can be achieved. Our solutions can be used with any FCT and do not require any programming background...

  12. An approach to automated chromosome analysis; Etudes pour une methode d'automatisation des analyses chromosomiques

    Energy Technology Data Exchange (ETDEWEB)

    Le Go, Roland

    1972-05-03

    The methods of approach developed with a view to automatic processing of the different stages of chromosome analysis are described in this study, divided into three parts. Part 1 relates the study of automated selection of metaphase spreads, which operates a decision process in order to reject all the non-pertinent images and keep the good ones. This approach has been achieved by developing a simulation program that made it possible to establish the proper selection algorithms in order to design a kit of electronic logical units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded according to a 16, 32 or 64 grey-level scale, and is processed by a pattern recognition program isolating the chromosomes and investigating their characteristic features (arm tips, centromere areas), in order to get measurements equivalent to the lengths of the four arms. Part 3 studies a program of automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: 1/ a list of the pairs, 2/ a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and 3/ another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the really abnormal or non-standard images unpaired by the program, which are of special interest for the biologist. (author) [French] This work presents the approach methods studied with a view to

  13. Air Force Information Workflow Automation through Synchronized Air Power Management (SAPM)

    National Research Council Canada - National Science Library

    Benkley, Carl; Chang, Irene; Crowley, John; Oristian, Thomas

    2004-01-01

    .... Implementing Extensible Markup Language (XML) messages, web services, and workflow automation, SAPM expands existing web-based capabilities, enables machine-to-machine interfaces, and streamlines the war fighter kill chain process...

  14. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    Science.gov (United States)

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  15. Case studies in geographic information systems for environmental streamlining

    Science.gov (United States)

    2012-05-31

    This 2012 summary report addresses the current use of geographic information systems (GIS) and related technologies by State Departments of Transportation (DOTs) for environmental streamlining and stewardship, particularly in relation to the National...

  16. Novel diffusion tensor imaging technique reveals developmental streamline volume changes in the corticospinal tract associated with leg motor control.

    Science.gov (United States)

    Kamson, David O; Juhász, Csaba; Chugani, Harry T; Jeong, Jeong-Won

    2015-04-01

    Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. DTI data of 31 children (1-14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=-.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left hemispheric dominance and are consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. Copyright © 2014 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  17. An automated and fast approach to detect single-trial visual evoked potentials with application to brain-computer interface.

    Science.gov (United States)

    Tu, Yiheng; Hung, Yeung Sam; Hu, Li; Huang, Gan; Hu, Yong; Zhang, Zhiguo

    2014-12-01

    This study aims (1) to develop an automated and fast approach for detecting visual evoked potentials (VEPs) in single trials and (2) to apply the single-trial VEP detection approach in designing a real-time and high-performance brain-computer interface (BCI) system. The single-trial VEP detection approach uses common spatial pattern (CSP) as a spatial filter and wavelet filtering (WF) as a temporal-spectral filter to jointly enhance the signal-to-noise ratio (SNR) of single-trial VEPs. The performance of the joint spatial-temporal-spectral filtering approach was assessed in a four-command VEP-based BCI system. The offline classification accuracy of the BCI system was significantly improved from 67.6±12.5% (raw data) to 97.3±2.1% (data filtered by CSP and WF). The proposed approach was successfully implemented in an online BCI system, where subjects could make 20 decisions in one minute with classification accuracy of 90%. The proposed single-trial detection approach is able to obtain robust and reliable VEP waveforms in an automatic and fast way, and it is applicable in VEP-based online BCI systems. This approach provides a real-time and automated solution for single-trial detection of evoked potentials or event-related potentials (EPs/ERPs) in various paradigms, which could benefit many applications such as BCI and intraoperative monitoring. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
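
    A compact numpy/scipy sketch of the common spatial pattern (CSP) step described above, assuming band-pass-filtered, epoched EEG is already available; the wavelet-filtering stage and the authors' four-class extension are not reproduced, and the toy data and filter count are assumptions.

      import numpy as np
      from scipy.linalg import eigh

      def csp_filters(epochs_a, epochs_b, n_pairs=2):
          """epochs_*: (trials, channels, samples). Returns (channels, 2*n_pairs) filters."""
          def mean_cov(epochs):
              return np.mean([np.cov(e) for e in epochs], axis=0)
          ca, cb = mean_cov(epochs_a), mean_cov(epochs_b)
          # Generalized eigenvalue problem: Ca w = lambda (Ca + Cb) w
          vals, vecs = eigh(ca, ca + cb)
          order = np.argsort(vals)
          picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])  # both extremes
          return vecs[:, picks]

      rng = np.random.default_rng(0)
      class_a = rng.normal(size=(40, 16, 256))   # toy epochs: trials x channels x samples
      class_b = rng.normal(size=(40, 16, 256))
      W = csp_filters(class_a, class_b)
      projected = np.einsum("tcs,cf->tfs", class_a, W)   # spatially filtered single trials
      print(projected.shape)                             # (40, 4, 256)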

  18. Fast and Accurate Approaches for Large-Scale, Automated Mapping of Food Diaries on Food Composition Tables

    Directory of Open Access Journals (Sweden)

    Marc Lamarine

    2018-05-01

    Full Text Available Aim of Study: The use of weighed food diaries in nutritional studies provides a powerful method to quantify food and nutrient intakes. Yet, mapping these records onto food composition tables (FCTs) is a challenging, time-consuming and error-prone process. Experts make this effort manually and no automation has been previously proposed. Our study aimed to assess automated approaches to map food items onto FCTs. Methods: We used food diaries (~170,000 records pertaining to 4,200 unique food items) from the DiOGenes randomized clinical trial. We attempted to map these items onto six FCTs available from the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English translation. Top matching pairs were reviewed manually to derive performance metrics: precision (the percentage of correctly mapped items) and recall (percentage of mapped items). Results: The simpler approach, fuzzy matching, provided very good performance. Under a relaxed threshold (score > 50%), this approach enabled remapping of 99.49% of the items with a precision of 88.75%. With a slightly more stringent threshold (score > 63%), the precision could be significantly improved to 96.81% while keeping a recall rate > 95% (i.e., only 5% of the queried items would not be mapped). The machine learning approach did not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We
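
    A minimal Python sketch of the fuzzy-matching idea (the study itself provides R packages): each diary item is matched to the most similar FCT food name by string similarity, and matches below a threshold are routed to manual review. The food names are toy examples; the 0.63 threshold echoes the 63% score cut-off above, but the similarity measure used here is an assumption.

      from difflib import SequenceMatcher

      fct_names = ["whole milk", "semi-skimmed milk", "wholemeal bread", "cheddar cheese"]
      diary_items = ["milk, semi skimmed", "bread wholemeal sliced", "chedar cheese"]

      def best_match(item, candidates):
          """Return (similarity, FCT food name) for the closest candidate."""
          scored = [(SequenceMatcher(None, item.lower(), c.lower()).ratio(), c)
                    for c in candidates]
          return max(scored)

      for item in diary_items:
          score, match = best_match(item, fct_names)
          status = "mapped" if score > 0.63 else "manual review"   # threshold echoes the study
          print(f"{item:25s} -> {match:20s} ({score:.2f}, {status})")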

  19. Streamline segment statistics of premixed flames with nonunity Lewis numbers

    Science.gov (United States)

    Chakraborty, Nilanjan; Wang, Lipo; Klein, Markus

    2014-03-01

    The interaction of flame and surrounding fluid motion is of central importance in the fundamental understanding of turbulent combustion. It is demonstrated here that this interaction can be represented using streamline segment analysis, which was previously applied in nonreactive turbulence. The present work focuses on the effects of the global Lewis number (Le) on streamline segment statistics in premixed flames in the thin-reaction-zones regime. A direct numerical simulation database of freely propagating thin-reaction-zones regime flames with Le ranging from 0.34 to 1.2 is used to demonstrate that Le has significant influences on the characteristic features of the streamline segment, such as the curve length, the difference in the velocity magnitude at two extremal points, and their correlations with the local flame curvature. The strengthenings of the dilatation rate, flame normal acceleration, and flame-generated turbulence with decreasing Le are principally responsible for these observed effects. An expression for the probability density function (pdf) of the streamline segment length, originally developed for nonreacting turbulent flows, captures the qualitative behavior for turbulent premixed flames in the thin-reaction-zones regime for a wide range of Le values. The joint pdfs between the streamline length and the difference in the velocity magnitude at two extremal points for both unweighted and density-weighted velocity vectors are analyzed and compared. Detailed explanations are provided for the observed differences in the topological behaviors of the streamline segment in response to the global Le.

  20. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    Directory of Open Access Journals (Sweden)

    Chinmay A. Shukla

    2017-05-01

    Full Text Available The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its applications in synthesis, viz. autosampling and inline monitoring, optimization and process control. Subsequently, we critically review a few multistep flow syntheses and suggest possible control strategies that help to reliably transfer a laboratory-scale synthesis to pilot scale at its optimum conditions. Owing to the vast literature on multistep synthesis, we have classified the literature and identified case studies based on a few criteria, viz. type of reaction, heating methods, processes involving in-line separation units, telescopic synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers the broader range of the multistep synthesis literature.

  1. Automated Guided Vehicle For Physically Handicapped People - A Cost Effective Approach

    Science.gov (United States)

    Kumar, G. Arun, Dr.; Sivasubramaniam, Mr. A.

    2017-12-01

    An automated guided vehicle (AGV) is like a robot that can deliver materials from the supply area to the technician automatically. This is faster and more efficient. The robot can be accessed wirelessly, so a technician can directly control it to deliver components rather than working through a human operator (over phone, computer, etc.) who would have to program the robot or ask a delivery person to make the delivery. The vehicle is guided automatically along its route. To avoid collisions, a proximity sensor is attached to the system; it detects obstacles and stops the vehicle when one is present, so accidents can be avoided. This suits the present industrial trend, in which material and equipment handling are automated in a simple, time-saving way.

  2. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
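    A comparable extraction can be sketched in Python with the pydicom package; the tag selection and file path below are illustrative assumptions and do not reproduce the authors' software, which parsed the structured elements of DICOM dose report files.

        # Illustrative sketch: read exposure-related fields from a CT DICOM header
        # with pydicom (header only, no pixel data). Path and tag choices are assumptions.
        import pydicom

        def exposure_summary(path):
            ds = pydicom.dcmread(path, stop_before_pixels=True)
            return {
                "StudyDate": ds.get("StudyDate", ""),
                "KVP": ds.get("KVP", None),
                "ExposureTime": ds.get("ExposureTime", None),
                "XRayTubeCurrent": ds.get("XRayTubeCurrent", None),
                "CTDIvol": ds.get("CTDIvol", None),
            }

        # Example call (the file name is hypothetical):
        # print(exposure_summary("ct_slice_0001.dcm"))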

  3. Approach to automation of a process of yeast inoculum production on industrial scale for ethanol production

    Directory of Open Access Journals (Sweden)

    Ibeth Viviana Ordóñez-Ortega

    2013-07-01

    Full Text Available The results of applied research on automating the reproduction stage of Saccharomyces cerevisiae yeast for ethanol production are presented in this paper. The identification of the variables to be instrumented, the instrumentation requirements and the proposed control scheme are based on an analysis of the functioning and operation of the process.

  4. Automated data mining: an innovative and efficient web-based approach to maintaining resident case logs.

    Science.gov (United States)

    Bhattacharya, Pratik; Van Stavern, Renee; Madhavan, Ramesh

    2010-12-01

    Use of resident case logs has been considered by the Residency Review Committee for Neurology of the Accreditation Council for Graduate Medical Education (ACGME). This study explores the effectiveness of a data-mining program for creating resident logs and compares the results to a manual data-entry system. Other potential applications of data mining to enhancing resident education are also explored. Patient notes dictated by residents were extracted from the Hospital Information System and analyzed using an unstructured data-mining program. History, examination, and ICD codes were obtained by the automated method for a 30-day period and compared to the existing manual case logs. The automated method extracted all resident dictations with the dates of encounter and transcription. The automated data-miner processed information from all 19 residents, while only 4 residents logged manually. The manual method identified only broad categories of diseases; the major categories were stroke or vascular disorder 53 (27.6%), epilepsy 28 (14.7%), and pain syndromes 26 (13.5%). In the automated method, epilepsy 114 (21.1%), cerebral atherosclerosis 114 (21.1%), and headache 105 (19.4%) were the most frequent primary diagnoses, and headache 89 (16.5%), seizures 94 (17.4%), and low back pain 47 (9%) were the most common chief complaints. More detailed patient information, such as tobacco use 227 (42%), alcohol use 205 (38%), and drug use 38 (7%), was extracted by the data-mining method. Manual case logs are time-consuming, provide limited information, and may be unpopular with residents. Data mining is a time-effective tool that may aid in the assessment of resident experience or the ACGME core competencies, or in resident clinical research. More study of this method in larger numbers of residency programs is needed.

  5. Estimating Regional Mass Balance of Himalayan Glaciers Using Hexagon Imagery: An Automated Approach

    Science.gov (United States)

    Maurer, J. M.; Rupper, S.

    2013-12-01

    Currently there is much uncertainty regarding the present and future state of Himalayan glaciers, which supply meltwater for river systems vital to more than 1.4 billion people living throughout Asia. Previous assessments of regional glacier mass balance in the Himalayas using various remote sensing and field-based methods give inconsistent results, and most assessments are over relatively short (e.g., single decade) timescales. This study aims to quantify multi-decadal changes in volume and extent of Himalayan glaciers through efficient use of the large database of declassified 1970-80s era Hexagon stereo imagery. Automation of the DEM extraction process provides an effective workflow for many images to be processed and glacier elevation changes quantified with minimal user input. The tedious procedure of manual ground control point selection necessary for block-bundle adjustment (as ephemeral data is not available for the declassified images) is automated using the Maximally Stable Extremal Regions algorithm, which matches image elements between raw Hexagon images and georeferenced Landsat 15 meter panchromatic images. Additional automated Hexagon DEM processing, co-registration, and bias correction allow for direct comparison with modern ASTER and SRTM elevation data, thus quantifying glacier elevation and area changes over several decades across largely inaccessible mountainous regions. As consistent methodology is used for all glaciers, results will likely reveal significant spatial and temporal patterns in regional ice mass balance. Ultimately, these findings could have important implications for future water resource management in light of environmental change.
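    The MSER-based tie-point idea can be sketched with OpenCV in Python; the synthetic image, default detector parameters and helper name mser_centroids are illustrative assumptions, and matching the detected regions between Hexagon and Landsat scenes (as in the workflow above) would require an additional descriptor-matching step.

        # Sketch: detect Maximally Stable Extremal Regions (MSER) as candidate
        # ground-control tie points. The demo image is synthetic.
        import cv2
        import numpy as np

        def mser_centroids(gray):
            """Return one (x, y) centroid per detected stable region."""
            mser = cv2.MSER_create()
            regions, _ = mser.detectRegions(gray)
            return [tuple(np.mean(region, axis=0)) for region in regions]

        img = np.zeros((200, 200), dtype=np.uint8)
        img[30:60, 40:80] = 200            # two bright patches on a dark background
        img[120:160, 100:140] = 230
        print(mser_centroids(img))
        # Centroids from a raw Hexagon frame and a georeferenced Landsat panchromatic
        # image would next be matched to provide ground control points.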

  6. Vision-Based Geo-Monitoring - A New Approach for an Automated System

    Science.gov (United States)

    Wagner, A.; Reiterer, A.; Wasmeier, P.; Rieke-Zapp, D.; Wunderlich, T.

    2012-04-01

    Object points can be measured in two configurations: (1) combining two measurement systems and measuring object points by spatial intersection, or (2) using one measurement system and combining image-based techniques with the integrated distance measurement unit. Besides the system configuration, the detection of features inside the captured images can be done on the basis of different approaches, e.g. template-, edge-, and/or point-based methods. Our system is able to select a suitable algorithm based on different object characteristics, such as object geometry, texture, behaviour, etc. The long-term objective is the research, development and installation of a fully automated measurement system, including a data analysis and interpretation component. Acknowledgments: The presented research has been supported by the Alexander von Humboldt Foundation, and by the European Sciences Foundation (ESF).

  7. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach

    Directory of Open Access Journals (Sweden)

    Miraemiliana Murat

    2017-09-01

    Full Text Available Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on a literature review, this is the first study in the development of a tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as combinations of hybrid descriptors, were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested on the myDAUN dataset: Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia
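    One of the shape descriptors listed above, the Hu invariant moments, can be computed from a binary leaf mask in a few lines with OpenCV; the synthetic elliptical mask below is an illustrative stand-in for a segmented leaf, and feeding the seven values to a classifier (ANN, SVM, ...) would follow.

        # Sketch: Hu invariant moments of a binary leaf mask (synthetic ellipse here).
        import cv2
        import numpy as np

        def hu_moments(mask):
            """mask: uint8 binary image with the leaf as foreground (255)."""
            m = cv2.moments(mask, binaryImage=True)
            hu = cv2.HuMoments(m).flatten()
            # Log-scale the moments, as is common, to compress their dynamic range.
            return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

        mask = np.zeros((200, 200), dtype=np.uint8)
        cv2.ellipse(mask, (100, 100), (70, 30), 25, 0, 360, 255, thickness=-1)
        print(hu_moments(mask))   # seven rotation/scale/translation-invariant values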

  8. Streamlined Modeling for Characterizing Spacecraft Anomalous Behavior

    Science.gov (United States)

    Klem, B.; Swann, D.

    2011-09-01

    Anomalous behavior of on-orbit spacecraft can often be detected using passive, remote sensors which measure electro-optical signatures that vary in time and spectral content. Analysts responsible for assessing spacecraft operational status and detecting detrimental anomalies using non-resolved imaging sensors are often presented with various sensing and identification issues. Modeling and measuring spacecraft self emission and reflected radiant intensity when the radiation patterns exhibit a time varying reflective glint superimposed on an underlying diffuse signal contribute to assessment of spacecraft behavior in two ways: (1) providing information on body component orientation and attitude; and, (2) detecting changes in surface material properties due to the space environment. Simple convex and cube-shaped spacecraft, designed to operate without protruding solar panel appendages, may require an enhanced level of preflight characterization to support interpretation of the various physical effects observed during on-orbit monitoring. This paper describes selected portions of the signature database generated using streamlined signature modeling and simulations of basic geometry shapes apparent to non-imaging sensors. With this database, summarization of key observable features for such shapes as spheres, cylinders, flat plates, cones, and cubes in specific spectral bands that include the visible, mid wave, and long wave infrared provide the analyst with input to the decision process algorithms contained in the overall sensing and identification architectures. The models typically utilize baseline materials such as Kapton, paints, aluminum surface end plates, and radiators, along with solar cell representations covering the cylindrical and side portions of the spacecraft. Multiple space and ground-based sensors are assumed to be located at key locations to describe the comprehensive multi-viewing aspect scenarios that can result in significant specular reflection

  9. A novel approach to sequence validating protein expression clones with automated decision making

    Directory of Open Access Journals (Sweden)

    Mohr Stephanie E

    2007-06-01

    Full Text Available Abstract Background Whereas the molecular assembly of protein expression clones is readily automated and routinely accomplished in high throughput, sequence verification of these clones is still largely performed manually, an arduous and time-consuming process. The ultimate goal of validation is to determine if a given plasmid clone matches its reference sequence sufficiently to be "acceptable" for use in protein expression experiments. Given the accelerating increase in availability of tens of thousands of unverified clones, there is a strong demand for rapid, efficient and accurate software that automates clone validation. Results We have developed an Automated Clone Evaluation (ACE) system – the first comprehensive, multi-platform, web-based plasmid sequence verification software package. ACE automates the clone verification process by defining each clone sequence as a list of multidimensional discrepancy objects, each describing a difference between the clone and its expected sequence, including the resulting polypeptide consequences. To evaluate clones automatically, this list can be compared against user acceptance criteria that specify the allowable number of discrepancies of each type. This strategy allows users to re-evaluate the same set of clones against different acceptance criteria as needed for use in other experiments. ACE manages the entire sequence validation process, including contig management, identifying and annotating discrepancies, determining if discrepancies correspond to polymorphisms, and clone finishing. Designed to manage thousands of clones simultaneously, ACE maintains a relational database to store information about clones at various completion stages, project processing parameters and acceptance criteria. In a direct comparison, the automated analysis by ACE took less time and was more accurate than a manual analysis of a 93 gene clone set. Conclusion ACE was designed to facilitate high throughput clone sequence
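    The acceptance-criteria idea, comparing a clone's list of discrepancies against per-type limits, can be illustrated with a small Python sketch; the discrepancy types and limits below are invented for illustration and are not ACE's actual schema.

        # Sketch: evaluate a clone's discrepancy list against user acceptance
        # criteria (maximum count allowed per discrepancy type). Values illustrative.
        from collections import Counter

        acceptance_criteria = {"silent": 5, "missense": 1, "nonsense": 0, "frameshift": 0}

        def clone_is_acceptable(discrepancies, criteria=acceptance_criteria):
            """discrepancies: list of type labels, e.g. ['silent', 'missense']."""
            counts = Counter(discrepancies)
            within_limits = all(counts.get(kind, 0) <= limit
                                for kind, limit in criteria.items())
            no_unknown_types = all(kind in criteria for kind in counts)
            return within_limits and no_unknown_types

        print(clone_is_acceptable(["silent", "silent"]))         # True
        print(clone_is_acceptable(["missense", "frameshift"]))   # False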

  10. Streamlining Collaboration for the Gravitational-wave Astronomy Community

    Science.gov (United States)

    Koranda, S.

    2016-12-01

    In the morning hours of September 14, 2015 the Laser Interferometer Gravitational-wave Observatory (LIGO) directly detected gravitational waves from inspiraling and coalescing black holes, confirming a major prediction of Albert Einstein's general theory of relativity and beginning the era of gravitational-wave astronomy. With the LIGO detectors in the United States, the Virgo and GEO detectors in Europe, and the KAGRA detector in Japan, the gravitational-wave astronomy community is opening a new window on our Universe. Realizing the full science potential of LIGO and the other interferometers requires global collaboration not only within the gravitational-wave astronomy community but also with the astronomers and astrophysicists across multiple disciplines working to realize and leverage the power of multi-messenger astronomy. Enabling thousands of researchers from around the world and across multiple projects to efficiently collaborate, share, and analyze data, and providing streamlined access to services, computing, and tools, requires new and scalable approaches to identity and access management (IAM). We will discuss LIGO's IAM journey that began in 2007 and how today LIGO leverages internal identity federations like InCommon and eduGAIN to provide scalable and managed access for the gravitational-wave astronomy community. We will discuss the steps both large and small research organizations and projects take as their IAM infrastructure matures from ad-hoc silos of independent services to fully integrated and federated services that streamline collaboration so that scientists can focus on research and not on managing passwords.

  11. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality

    Science.gov (United States)

    Albert, Océane; Reintsch, Wolfgang E.; Chan, Peter; Robaire, Bernard

    2016-01-01

    STUDY QUESTION Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? SUMMARY ANSWER We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. WHAT IS KNOWN ALREADY The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. STUDY DESIGN, SIZE, DURATION The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3–5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. PARTICIPANTS/MATERIALS, SETTING, METHODS Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. MAIN RESULTS AND THE ROLE OF CHANCE We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi

  12. A Streamlined Approach by a Combination of Bioindication and Geostatistical Methods for Assessing Air Contaminants and Their Effects on Human Health in Industrialized Areas: A Case Study in Southern Brazil

    Directory of Open Access Journals (Sweden)

    Angélica B. Ferreira

    2017-09-01

    Full Text Available Industrialization in developing countries associated with urban growth results in a number of economic benefits, especially in small or medium-sized cities, but leads to a number of environmental and public health consequences. This problem is further aggravated when adequate infrastructure is lacking to monitor the environmental impacts left by industries and refineries. In this study, a new protocol was designed combining biomonitoring and geostatistics to evaluate the possible effects of shale industry emissions on human health and wellbeing. Furthermore, the traditional and expensive air quality method based on PM2.5 measurement was also used to validate the low-cost geostatistical approach. Chemical analysis was performed using an Energy Dispersive X-ray Fluorescence Spectrometer (EDXRF) to measure inorganic elements in tree bark and retorted shale samples in São Mateus do Sul city, Southern Brazil. Fe, S, and Si were considered potential pollutants in the study area. Distribution maps of element concentrations were generated from the dataset and used to estimate the spatial behavior of Fe, S, and Si and the range from their hot spot(s), highlighting the regions surrounding the shale refinery. This evidence was also demonstrated in the measurements of PM2.5 concentrations, which are in agreement with the information obtained from the biomonitoring and geostatistical model. Factor and descriptive analyses performed on the concentrations of tree bark contaminants suggest that Fe, S, and Si might be used as indicators of industrial emissions. The number of cases of respiratory diseases obtained from the local basic health unit was used to assess a possible correlation between shale refinery emissions and cases of respiratory disease. These data are public and may be accessed on the website of the Brazilian Ministry of Health. Significant associations were found between the health data and refinery activities. The combination of the spatial

  13. Automated genotyping of dinucleotide repeat markers

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Hoffman, E.P. [Carnegie Mellon Univ., Pittsburgh, PA (United States)]|[Univ. of Pittsburgh, PA (United States)

    1994-09-01

    The dinucleotide repeats (i.e., microsatellites) such as CA-repeats are a highly polymorphic, highly abundant class of PCR-amplifiable markers that have greatly streamlined genetic mapping experimentation. It is expected that over 30,000 such markers (including tri- and tetranucleotide repeats) will be characterized for routine use in the next few years. Since only size determination, and not sequencing, is required to determine alleles, in principle, dinucleotide repeat genotyping is easily performed on electrophoretic gels, and can be automated using DNA sequencers. Unfortunately, PCR stuttering with these markers generates not one band for each allele, but a pattern of bands. Since closely spaced alleles must be disambiguated by human scoring, this poses a key obstacle to full automation. We have developed methods that overcome this obstacle. Our model is that the observed data is generated by arithmetic superposition (i.e., convolution) of multiple allele patterns. By quantitatively measuring the size of each component band, and exploiting the unique stutter pattern associated with each marker, closely spaced alleles can be deconvolved; this unambiguously reconstructs the "true" allele bands, with stutter artifact removed. We used this approach in a system for automated diagnosis of (X-linked) Duchenne muscular dystrophy; four multiplexed CA-repeats within the dystrophin gene were assayed on a DNA sequencer. Our method accurately detected small variations in gel migration that shifted the allele size estimate. In 167 nonmutated alleles, 89% (149/167) showed no size variation, 9% (15/167) showed 1 bp variation, and 2% (3/167) showed 2 bp variation. We are currently developing a library of dinucleotide repeat patterns; together with our deconvolution methods, this library will enable fully automated genotyping of dinucleotide repeats from sizing data.
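    The convolution model described above can be illustrated with a short NumPy/SciPy sketch: an observed lane is modeled as the convolution of the true allele bands with a marker-specific stutter pattern, and the alleles are recovered by deconvolution. All values below are invented for illustration and are not real marker data.

        # Sketch: recover allele bands by deconvolving a known stutter pattern.
        import numpy as np
        from scipy.signal import deconvolve

        stutter = np.array([1.0, 0.45, 0.20, 0.08])          # assumed stutter profile
        true_alleles = np.array([0, 1.0, 0, 0.8, 0, 0, 0])   # two alleles, two bins apart
        observed = np.convolve(true_alleles, stutter)        # superposed band pattern

        recovered, remainder = deconvolve(observed, stutter)
        print(np.round(recovered, 2))                        # ~ [0. 1. 0. 0.8 0. 0. 0.]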

  14. An Automated Quiet Sleep Detection Approach in Preterm Infants as a Gateway to Assess Brain Maturation.

    Science.gov (United States)

    Dereymaeker, Anneleen; Pillay, Kirubin; Vervisch, Jan; Van Huffel, Sabine; Naulaers, Gunnar; Jansen, Katrien; De Vos, Maarten

    2017-09-01

    Sleep state development in preterm neonates can provide crucial information regarding functional brain maturation and give insight into neurological well being. However, visual labeling of sleep stages from EEG requires expertise and is very time consuming, prompting the need for an automated procedure. We present a robust method for automated detection of preterm sleep from EEG, over a wide postmenstrual age ([Formula: see text] age) range, focusing first on Quiet Sleep (QS) as an initial marker for sleep assessment. Our algorithm, CLuster-based Adaptive Sleep Staging (CLASS), detects QS if it remains relatively more discontinuous than non-QS over PMA. CLASS was optimized on a training set of 34 recordings aged 27-42 weeks PMA, and performance then assessed on a distinct test set of 55 recordings of the same age range. Results were compared to visual QS labeling from two independent raters (with inter-rater agreement [Formula: see text]), using Sensitivity, Specificity, Detection Factor ([Formula: see text] of visual QS periods correctly detected by CLASS) and Misclassification Factor ([Formula: see text] of CLASS-detected QS periods that are misclassified). CLASS performance proved optimal across recordings at 31-38 weeks (median [Formula: see text], median MF 0-0.25, median Sensitivity 0.93-1.0, and median Specificity 0.80-0.91 across this age range), with minimal misclassifications at 35-36 weeks (median [Formula: see text]). To illustrate the potential of CLASS in facilitating clinical research, normal maturational trends over PMA were derived from CLASS-estimated QS periods, visual QS estimates, and nonstate specific periods (containing QS and non-QS) in the EEG recording. CLASS QS trends agreed with those from visual QS, with both showing stronger correlations than nonstate specific trends. This highlights the benefit of automated QS detection for exploring brain maturation.

  15. Automated-biasing approach to Monte Carlo shipping-cask calculations

    International Nuclear Information System (INIS)

    Hoffman, T.J.; Tang, J.S.; Parks, C.V.; Childs, R.L.

    1982-01-01

    Computer Sciences at Oak Ridge National Laboratory, under a contract with the Nuclear Regulatory Commission, has developed the SCALE system for performing standardized criticality, shielding, and heat transfer analyses of nuclear systems. During the early phase of shielding development in SCALE, it was established that Monte Carlo calculations of radiation levels exterior to a spent fuel shipping cask would be extremely expensive. This cost can be substantially reduced by proper biasing of the Monte Carlo histories. The purpose of this study is to develop and test an automated biasing procedure for the MORSE-SGC/S module of the SCALE system

  16. An Automated Approach to Syntax-based Analysis of Classical Latin

    Directory of Open Access Journals (Sweden)

    Anjalie Field

    2016-12-01

    Full Text Available The goal of this study is to present an automated method for analyzing the style of Latin authors. Many of the common automated methods in stylistic analysis are based on lexical measures, which do not work well with Latin because of the language’s high degree of inflection and free word order. In contrast, this study focuses on analysis at a syntax level by examining two constructions, the ablative absolute and the cum clause. These constructions are often interchangeable, which suggests an author’s choice of construction is typically more stylistic than functional. We first identified these constructions in hand-annotated texts. Next we developed a method for identifying the constructions in unannotated texts, using probabilistic morphological tagging. Our methods identified constructions with enough accuracy to distinguish among different genres and different authors. In particular, we were able to determine which book of Caesar’s Commentarii de Bello Gallico was not written by Caesar. Furthermore, the usage of ablative absolutes and cum clauses observed in this study is consistent with the usage scholars have observed when analyzing these texts by hand. The proposed methods for an automatic syntax-based analysis are shown to be valuable for the study of classical literature.

  17. MIDAS: Automated Approach to Design Microwave Integrated Inductors and Transformers on Silicon

    Directory of Open Access Journals (Sweden)

    L. Aluigi

    2013-09-01

    Full Text Available The design of modern radiofrequency integrated circuits on silicon operating at microwave and millimeter-waves requires the integration of several spiral inductors and transformers that are not commonly available in the process design-kits of the technologies. In this work we present an auxiliary CAD tool for Microwave Inductor (and transformer) Design Automation on Silicon (MIDAS) that exploits commercial simulators and allows the implementation of an automatic design flow, including three-dimensional layout editing and electromagnetic simulations. In detail, MIDAS allows the designer to derive a preliminary sizing of the inductor (transformer) on the basis of the design entries (specifications). It draws the inductor (transformer) layers for the specific process design kit, including vias and underpasses, with or without a patterned ground shield, and launches the electromagnetic simulations, achieving effective design automation with respect to the traditional design flow for RFICs. With the present software suite the complete design time is reduced significantly (typically 1 hour on a PC based on an Intel® Pentium® Dual 1.80GHz CPU with 2-GB RAM). Afterwards both the device equivalent circuit and the layout are ready to be imported into the Cadence environment.

  18. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. MRF-ANN: a machine learning approach for automated ER scoring of breast cancer immunohistochemical images.

    Science.gov (United States)

    Mungle, T; Tewary, S; DAS, D K; Arun, I; Basak, B; Agarwal, S; Ahmed, R; Chatterjee, S; Chakraborty, C

    2017-08-01

    Molecular pathology, especially immunohistochemistry, plays an important role in evaluating hormone receptor status along with diagnosis of breast cancer. Time consumption and inter-/intra-observer variability are major hindrances for evaluating the receptor score. In view of this, the paper proposes an automated Allred scoring methodology for the estrogen receptor (ER). White balancing is used to normalize the colour image, taking into consideration colour variation during staining in different labs. A Markov random field model with expectation-maximization optimization is employed to segment the ER cells. The proposed segmentation methodology is found to have an F-measure of 0.95. An artificial neural network is subsequently used to obtain an intensity-based score for ER cells from pixel colour intensity features. Simultaneously, the proportion score - the percentage of ER-positive cells - is computed via cell counting. The final ER score is computed by adding the intensity and proportion scores - the standard Allred scoring system followed by pathologists. The classification accuracy for classification of cells by the classifier in terms of F-measure is 0.9626. The problem of subjective interobserver variability is addressed by quantifying the ER score from two expert pathologists and the proposed methodology. The intraclass correlation achieved is greater than 0.90. The study has the potential advantage of assisting pathologists in decision making over the manual procedure and could evolve as a part of an automated decision support system together with other receptor scoring/analysis procedures. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
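    The final scoring step, adding a proportion score to an intensity score, follows the standard Allred scheme and can be written down directly; the cut-offs below are the conventional Allred bins and the inputs are illustrative, so this is a sketch of the scoring rule rather than the authors' pipeline.

        # Sketch: standard Allred combination of proportion score (0-5) and
        # intensity score (0-3); the total ranges from 0 to 8.
        def proportion_score(pct_positive):
            """Map the percentage of ER-positive cells to the Allred proportion score."""
            if pct_positive == 0:
                return 0
            if pct_positive < 1:
                return 1
            if pct_positive <= 10:
                return 2
            if pct_positive <= 33:
                return 3
            if pct_positive <= 66:
                return 4
            return 5

        def allred_score(pct_positive, intensity_score):
            """intensity_score: 0 (none), 1 (weak), 2 (intermediate), 3 (strong)."""
            return proportion_score(pct_positive) + int(intensity_score)

        print(allred_score(45, 2))   # -> 4 + 2 = 6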

  20. Approach to analysis of single nucleotide polymorphisms by automated constant denaturant capillary electrophoresis

    International Nuclear Information System (INIS)

    Bjoerheim, Jens; Abrahamsen, Torveig Weum; Kristensen, Annette Torgunrud; Gaudernack, Gustav; Ekstroem, Per O.

    2003-01-01

    Melting gel techniques have proven to be amenable and powerful tools in point mutation and single nucleotide polymorphism (SNP) analysis. With the introduction of commercially available capillary electrophoresis instruments, a partly automated platform for denaturant capillary electrophoresis with potential for routine screening of selected target sequences has been established. The aim of this article is to demonstrate the use of automated constant denaturant capillary electrophoresis (ACDCE) in single nucleotide polymorphism analysis of various target sequences. Optimal analysis conditions for different single nucleotide polymorphisms on ACDCE are evaluated with the Poland algorithm. Laboratory procedures include only PCR and electrophoresis. For direct genotyping of individual SNPs, the samples are analyzed with an internal standard and the alleles are identified by co-migration of sample and standard peaks. In conclusion, SNPs suitable for melting gel analysis based on theoretical thermodynamics were separated by ACDCE under appropriate conditions. With this instrumentation (ABI 310 Genetic Analyzer), 48 samples could be analyzed without any intervention. Several institutions have capillary instrumentation in-house, thus making this SNP analysis method accessible to large groups of researchers without any need for instrument modification

  1. Planning for Office Automation.

    Science.gov (United States)

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  2. An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study

    Science.gov (United States)

    Maddox, Brian G.; Swadley, Casey L.

    2002-01-01

    Attempts at fully automated object recognition systems have met with varying levels of success over the years. However, none of the systems have achieved high enough accuracy rates to be run unattended. One of the reasons for this may be that they are designed from the computer's point of view and rely mainly on image-processing methods. A better solution to this problem may be to make use of modern advances in computational intelligence and distributed processing to try to mimic how the human brain is thought to recognize objects. As humans combine cognitive processes with detection techniques, such a system would combine traditional image-processing techniques with computer-based intelligence to determine the identity of various objects in a scene.

  3. A Novel Approach for Enhancement of Automobile Clutch Engagement Quality Using Mechatronics Based Automated Clutch System

    Science.gov (United States)

    Tripathi, K.

    2013-01-01

    In an automated manual clutch (AMC), a mechatronic system controls the clutch force trajectory through an actuator governed by a control system. The present study identifies relevant characteristics of this trajectory and their effects on driveline dynamics and engagement quality. A new type of force trajectory is identified which gives good engagement quality. However, this trajectory is not achievable through a conventional clutch control mechanism. In an AMC, a mechatronic system based on electro-hydraulic or electro-mechanical elements can make it feasible. A mechatronic add-on system is presented that can be used to implement the novel force trajectory without replacing the traditional diaphragm-spring-based clutch in a vehicle with manual transmission.

  4. Evolutionary approaches for scheduling a flexible manufacturing system with automated guided vehicles and robots

    Directory of Open Access Journals (Sweden)

    Ramaraj Natarajan

    2012-08-01

    Full Text Available This paper addresses the scheduling of machines, an Automated Guided Vehicle (AGV and two robots in a Flexible Manufacturing System (FMS formed in three loop layouts, with objectives to minimize the makespan, mean flow time and mean tardiness. The scheduling optimization is carried out using Sheep Flock Heredity Algorithm (SFHA and Artificial Immune System (AIS algorithm. AGV is used for carrying jobs between the Load/Unload station and the machines. The robots are used for loading and unloading the jobs in the machines, and also used for transferring jobs between the machines. The algorithms are applied for test problems taken from the literature and the results obtained using the two algorithms are compared. The results indicate that SFHA performs better than AIS for this problem.

  5. Qualification of academic facilities for small-scale automated manufacture of autologous cell-based products.

    Science.gov (United States)

    Hourd, Paul; Chandra, Amit; Alvey, David; Ginty, Patrick; McCall, Mark; Ratcliffe, Elizabeth; Rayment, Erin; Williams, David J

    2014-01-01

    Academic centers, hospitals and small companies, as typical development settings for UK regenerative medicine assets, are significant contributors to the development of autologous cell-based therapies. Often lacking the appropriate funding, quality assurance heritage or specialist regulatory expertise, qualifying aseptic cell processing facilities for GMP compliance is a significant challenge. The qualification of a new Cell Therapy Manufacturing Facility with automated processing capability, the first of its kind in a UK academic setting, provides a unique demonstrator for the qualification of small-scale, automated facilities for GMP-compliant manufacture of autologous cell-based products in these settings. This paper shares our experiences in qualifying the Cell Therapy Manufacturing Facility, focusing on our approach to streamlining the qualification effort, the challenges, project delays and inefficiencies we encountered, and the subsequent lessons learned.

  6. AEDs at your fingertips: automated external defibrillators on college campuses and a novel approach for increasing accessibility.

    Science.gov (United States)

    Berger, Ryan J; O'Shea, Jesse G

    2014-01-01

    The use of automated external defibrillators (AEDs) increases survival in cardiac arrest events. Building on the success of previous efforts and on free, readily available mobile mapping software, this discussion emphasizes the importance of AEDs in preventing sudden cardiac arrest-related deaths on college campuses and beyond, while suggesting a novel approach to addressing access and awareness issues. A user-friendly mobile application (a low-cost iOS map) was developed at Florida State University to decrease AED retrieval distance and time. The development of mobile AED maps is feasible for a variety of universities and other entities, with the potential to save lives. Just having AEDs installed is not enough--they need to be easily locatable. Society increasingly relies on phones to provide information, and there are opportunities to use mobile technology to locate and share information about relevant emergency devices; these should be incorporated into the chain of survival.

  7. An automated Pearson's correlation change classification (APC3) approach for GC/MS metabonomic data using total ion chromatograms (TICs).

    Science.gov (United States)

    Prakash, Bhaskaran David; Esuvaranathan, Kesavan; Ho, Paul C; Pasikanti, Kishore Kumar; Chan, Eric Chun Yong; Yap, Chun Wei

    2013-05-21

    A fully automated and computationally efficient Pearson's correlation change classification (APC3) approach is proposed and shown to have overall comparable performance, with an average accuracy and an average AUC of 0.89 ± 0.08, while being 3.9 to 7 times faster, easier to use and less susceptible to outliers than other dimensional reduction and classification combinations, using only the total ion chromatogram (TIC) intensities of GC/MS data. The use of only the TIC permits the possible application of APC3 to other metabonomic data such as LC/MS TICs or NMR spectra. A RapidMiner implementation is available for download at http://padel.nus.edu.sg/software/padelapc3.
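    The core ingredient, Pearson correlation between TIC intensity vectors, can be sketched in a few lines of Python; the nearest-profile rule, the class labels and the synthetic data below are illustrative assumptions and not the published APC3 algorithm.

        # Sketch: assign a sample TIC to the class whose mean TIC it correlates with best.
        import numpy as np

        def pearson(x, y):
            return np.corrcoef(x, y)[0, 1]

        def classify_tic(sample_tic, class_profiles):
            """class_profiles: dict mapping class label -> mean TIC intensity vector."""
            scores = {label: pearson(sample_tic, profile)
                      for label, profile in class_profiles.items()}
            return max(scores, key=scores.get), scores

        rng = np.random.default_rng(0)
        control_mean = rng.random(500)
        case_mean = control_mean + 0.3 * rng.random(500)
        sample = case_mean + 0.05 * rng.standard_normal(500)
        print(classify_tic(sample, {"control": control_mean, "case": case_mean})[0])  # 'case'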

  8. Intelligent Systems Approach for Automated Identification of Individual Control Behavior of a Human Operator

    Science.gov (United States)

    Zaychik, Kirill B.; Cardullo, Frank M.

    2012-01-01

    Results have been obtained using conventional techniques to model the generic human operator's control behavior; however, little research has been done to identify an individual based on control behavior. The hypothesis investigated is that different operators exhibit different control behavior when performing a given control task. Two enhancements to existing human operator models, which allow personalization of the modeled control behavior, are presented. One enhancement accounts for the testing control signals, which are introduced by an operator for more accurate control of the system and/or to adjust the control strategy. This uses an Artificial Neural Network which can be fine-tuned to model the testing control. Another enhancement takes the form of an equiripple filter which conditions the control system power spectrum. A novel automated parameter identification technique was developed to facilitate the identification process of the parameters of the selected models. This utilizes a Genetic Algorithm based optimization engine called the Bit-Climbing Algorithm. Enhancements were validated using experimental data obtained from three different sources: the Manual Control Laboratory software experiments, Unmanned Aerial Vehicle simulation, and NASA Langley Research Center Visual Motion Simulator studies. This manuscript also addresses applying human operator models to evaluate the effectiveness of motion feedback when simulating actual pilot control behavior in a flight simulator.

  9. A new approach to automated assessment of fractionation of endocardial electrograms during atrial fibrillation

    International Nuclear Information System (INIS)

    Křemen, V; Lhotská, L; Macaš, M; Čihák, R; Vančura, V; Kautzner, J; Wichterle, D

    2008-01-01

    Complex fractionated atrial electrograms (CFAEs) may represent the electrophysiological substrate for atrial fibrillation (AF). Progress in signal processing algorithms to identify sites of CFAEs is crucial for the development of AF ablation strategies. A novel algorithm for automated description of fractionation of atrial electrograms (A-EGMs) based on the wavelet transform has been proposed. The algorithm was developed and validated using a representative set of 1.5 s A-EGM (n = 113) ranked by three experts into four categories: 1—organized atrial activity; 2—mild; 3—intermediate; 4—high degree of fractionation. A tight relationship between a fractionation index and expert classification of A-EGMs (Spearman correlation ρ = 0.87) was documented with a sensitivity of 82% and specificity of 90% for the identification of highly fractionated A-EGMs. This operator-independent description of A-EGM complexity may be easily incorporated into mapping systems to facilitate CFAE identification and to guide AF substrate ablation
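    The wavelet idea can be illustrated with a simple PyWavelets sketch that scores how much of an electrogram's energy lies in the high-frequency detail coefficients; the wavelet choice, decomposition level and synthetic signals are illustrative assumptions, and this is not the published algorithm.

        # Sketch: a wavelet-energy index for a 1.5 s atrial electrogram (illustrative).
        import numpy as np
        import pywt

        def fractionation_index(egm, wavelet="db4", level=4):
            """Fraction of signal energy carried by the detail (high-frequency) coefficients."""
            coeffs = pywt.wavedec(egm, wavelet, level=level)
            approx, details = coeffs[0], coeffs[1:]
            detail_energy = sum(float(np.sum(d ** 2)) for d in details)
            return detail_energy / (detail_energy + float(np.sum(approx ** 2)))

        fs = 1000                                    # assumed sampling rate [Hz]
        t = np.arange(0, 1.5, 1 / fs)
        organized = np.sin(2 * np.pi * 6 * t)                        # regular activity
        fractionated = organized + 0.8 * np.random.randn(t.size)     # irregular content
        print(fractionation_index(organized) < fractionation_index(fractionated))  # True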

  10. Novel approach to the behavioural characterization of inbred mice: automated home cage observations.

    Science.gov (United States)

    de Visser, L; van den Bos, R; Kuurman, W W; Kas, M J H; Spruijt, B M

    2006-08-01

    Here we present a newly developed tool for continuous recordings and analysis of novelty-induced and baseline behaviour of mice in a home cage-like environment. Aim of this study was to demonstrate the strength of this method by characterizing four inbred strains of mice, C57BL/6, DBA/2, C3H and 129S2/Sv, on locomotor activity. Strains differed in circadian rhythmicity, novelty-induced activity and the time-course of specific behavioural elements. For instance, C57BL/6 and DBA/2 mice showed a much faster decrease in activity over time than C3H and 129S2/Sv mice. Principal component analysis revealed two major factors within locomotor activity, which were defined as 'level of activity' and 'velocity/stops'. These factors were able to distinguish strains. Interestingly, mice that displayed high levels of activity in the initial phase of the home cage test were also highly active during an open-field test. Velocity and the number of stops during movement correlated positively with anxiety-related behaviour in the elevated plus maze. The use of an automated home cage observation system yields temporal changes in elements of locomotor activity with an advanced level of spatial resolution. Moreover, it avoids the confounding influence of human intervention and saves time-consuming human observations.

  11. Automated detection of pain from facial expressions: a rule-based approach using AAM

    Science.gov (United States)

    Chen, Zhanli; Ansari, Rashid; Wilkie, Diana J.

    2012-02-01

    In this paper, we examine the problem of using video analysis to assess pain, an important problem especially for critically ill, non-communicative patients, and people with dementia. We propose and evaluate an automated method to detect the presence of pain manifested in patient videos using a unique and large collection of cancer patient videos captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS) that is widely used for objective assessment in pain analysis. In our research, a person-specific Active Appearance Model (AAM) based on Project-Out Inverse Compositional Method is trained for each patient individually for the modeling purpose. A flexible representation of the shape model is used in a rule-based method that is better suited than the more commonly used classifier-based methods for application to the cancer patient videos in which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on the feature points that provide facial action cues and is extracted from the shape vertices of AAM, which have a natural correspondence to face muscular movement. In this paper, we investigate the detection of a commonly used set of pain-related action units in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained FACS coders who independently reviewed and scored the action units in the cancer patient videos.

  12. Streamline Your Project: A Lifecycle Model.

    Science.gov (United States)

    Viren, John

    2000-01-01

    Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.

  13. How streamlining telecommunications can cut IT expense.

    Science.gov (United States)

    McIntyre, Greg

    2016-02-01

    Hospitals and health systems can save IT expenses by implementing more efficient processes in accordance with the principles of effective telecommunications expense management. This approach involves three primary steps: inventory of existing infrastructure, charge verification, and optimization of rates and design for continual improvement.

  14. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S [University of Nebraska Medical Center, Omaha, NE (United States)

    2016-06-15

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the Excel data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the need for redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6 MV photons with a 60-degree wedge. Conclusion: Automated output factor measurements can save time by 40% compared with the conventional manual approach. This work lays the groundwork for further automation of Linac commissioning.
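    The analysis step, turning a continuously logged electrometer trace into per-field readings and output factors, can be sketched in Python; the log layout (one cumulative-charge sample every 0.5 s), the beam-on threshold and the synthetic data are illustrative assumptions, not the authors' Matlab program.

        # Sketch: split a cumulative-charge log into per-field readings, then
        # normalize to a chosen reference field to obtain output factors.
        import numpy as np

        def field_readings(cumulative_charge, on_threshold=1e-3):
            """Sum the charge collected during each contiguous beam-on segment."""
            increments = np.diff(cumulative_charge)
            readings, current = [], 0.0
            for inc in increments:
                if inc > on_threshold:            # beam on: accumulate charge
                    current += inc
                elif current > 0:                 # beam just switched off: close the field
                    readings.append(current)
                    current = 0.0
            if current > 0:
                readings.append(current)
            return np.array(readings)

        def output_factors(readings, reference_index):
            return readings / readings[reference_index]

        # Synthetic log: three fields delivered with pauses in between.
        log = np.concatenate([np.zeros(5),
                              np.linspace(0.00, 1.00, 20), np.full(10, 1.00),
                              np.linspace(1.00, 2.07, 20), np.full(10, 2.07),
                              np.linspace(2.07, 2.98, 20)])
        print(output_factors(field_readings(log), reference_index=1))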

  15. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    International Nuclear Information System (INIS)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S

    2016-01-01

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the Excel data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the need for redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6 MV photons with a 60-degree wedge. Conclusion: Automated output factor measurements can save time by 40% compared with the conventional manual approach. This work lays the groundwork for further automation of Linac commissioning.

  16. 48 CFR 12.602 - Streamlined evaluation of offers.

    Science.gov (United States)

    2010-10-01

    ... offers. 12.602 Section 12.602 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... for Commercial Items 12.602 Streamlined evaluation of offers. (a) When evaluation factors are used... evaluation factors. (b) Offers shall be evaluated in accordance with the criteria contained in the...

  17. Application-Tailored I/O with Streamline

    NARCIS (Netherlands)

    de Bruijn, W.J.; Bos, H.J.; Bal, H.E.

    2011-01-01

    Streamline is a stream-based OS communication subsystem that spans from peripheral hardware to userspace processes. It improves performance of I/O-bound applications (such as webservers and streaming media applications) by constructing tailor-made I/O paths through the operating system for each

  18. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    Science.gov (United States)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has significant potential for 3D topographic change detection. The present case study uses the latest point cloud generation and analysis capabilities to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m². It focuses on Pléiades high-resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics runs via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation, a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM 2008) to the post-DEM. • Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtraction of the pre-DEM from the post-DEM. • Filtering and threshold-based classification of
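    The final differencing and thresholding step can be sketched generically in Python with NumPy; the study itself ran this workflow through ENVI/IDL tasks, so the arrays, the 2 m threshold and the helper name below are illustrative assumptions only.

        # Generic sketch: difference two co-registered DEMs and flag cells whose
        # elevation change exceeds a threshold (values are synthetic).
        import numpy as np

        def elevation_change(pre_dem, post_dem, threshold_m=2.0):
            """Return the difference surface and a boolean change mask (|dz| > threshold)."""
            dz = post_dem.astype(float) - pre_dem.astype(float)
            return dz, np.abs(dz) > threshold_m

        pre = np.zeros((100, 100))
        post = pre.copy()
        post[40:60, 40:60] -= 5.0          # synthetic 5 m surface drop (e.g., slide scar)
        dz, changed = elevation_change(pre, post)
        print(changed.sum(), "cells changed; mean elevation change:", dz[changed].mean(), "m")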

  19. TU-H-206-01: An Automated Approach for Identifying Geometric Distortions in Gamma Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Mann, S; Nelson, J [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Samei, E [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-06-15

    Purpose: To develop a clinically-deployable, automated process for detecting artifacts in routine nuclear medicine (NM) quality assurance (QA) bar phantom images. Methods: An artifact detection algorithm was created to analyze bar phantom images as part of an ongoing QA program. A low noise, high resolution reference image was acquired from an x-ray of the bar phantom with a Philips Digital Diagnost system utilizing image stitching. NM bar images, acquired for 5 million counts over a 512×512 matrix, were registered to the template image by maximizing mutual information (MI). The MI index was used as an initial test for artifacts; low values indicate an overall presence of distortions regardless of their spatial location. Images with low MI scores were further analyzed for bar linearity, periodicity, alignment, and compression to locate differences with respect to the template. Findings from each test were spatially correlated and locations failing multiple tests were flagged as potential artifacts requiring additional visual analysis. The algorithm was initially deployed for GE Discovery 670 and Infinia Hawkeye gamma cameras. Results: The algorithm successfully identified clinically relevant artifacts from both systems previously unnoticed by technologists performing the QA. Average MI indices for artifact-free images are 0.55. Images with MI indices < 0.50 have shown 100% sensitivity and specificity for artifact detection when compared with a thorough visual analysis. Correlation of geometric tests confirms the ability to spatially locate the most likely image regions containing an artifact regardless of initial phantom orientation. Conclusion: The algorithm shows the potential to detect gamma camera artifacts that may be missed by routine technologist inspections. Detection and subsequent correction of artifacts ensures maximum image quality and may help to identify failing hardware before it impacts clinical workflow. Going forward, the algorithm is being
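
As a rough illustration of the screening step described above, the sketch below computes a histogram-based mutual information index between a registered bar image and its reference template and flags images that fall under the 0.50 cut-off quoted in the abstract; the bin count and the MI formulation are assumptions rather than the authors' implementation.

```python
# Hedged sketch of an MI-based artifact screen; image arrays and bins are illustrative.
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                   # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def flag_for_artifacts(nm_image, template, threshold=0.50):
    """True if the image should be sent for further geometric tests / visual review."""
    return mutual_information(nm_image, template) < threshold
```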

  20. Automated nodule location and size estimation using a multi-scale Laplacian of Gaussian filtering approach.

    Science.gov (United States)

    Jirapatnakul, Artit C; Fotin, Sergei V; Reeves, Anthony P; Biancardi, Alberto M; Yankelevitz, David F; Henschke, Claudia I

    2009-01-01

Estimation of nodule location and size is an important pre-processing step in some nodule segmentation algorithms to determine the size and location of the region of interest. Ideally, such estimation methods will consistently find the same nodule location regardless of where the seed point (provided either manually or by a nodule detection algorithm) is placed relative to the "true" center of the nodule, and the size should be a reasonable estimate of the true nodule size. We developed a method that estimates nodule location and size using multi-scale Laplacian of Gaussian (LoG) filtering. Nodule candidates near a given seed point are found by searching for blob-like regions with high filter response. The candidates are then pruned according to filter response and location, and the remaining candidates are sorted by size and the largest candidate selected. This method was compared to a previously published template-based method. The methods were evaluated on the basis of stability of the estimated nodule location to changes in the initial seed point and how well the size estimates agreed with volumes determined by a semi-automated nodule segmentation method. The LoG method exhibited better stability to changes in the seed point, with 93% of nodules having the same estimated location even when the seed point was altered, compared to only 52% of nodules for the template-based method. Both methods also showed good agreement with sizes determined by a nodule segmentation method, with an average relative size difference of 5% and -5% for the LoG and template-based methods respectively.
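
A minimal sketch of the core idea, multi-scale Laplacian of Gaussian filtering around a seed point, is given below; the scale range, normalization and candidate selection are simplified assumptions and omit the pruning step described in the abstract.

```python
# Illustrative multi-scale LoG blob response around a seed point (not the authors' code).
import numpy as np
from scipy.ndimage import gaussian_laplace

def log_blob_response(volume, sigmas):
    """Return a (n_scales, *volume.shape) stack of scale-normalized LoG responses."""
    responses = []
    for s in sigmas:
        # bright blobs give a negative LoG; negate and scale-normalize with sigma^2
        responses.append(-(s ** 2) * gaussian_laplace(volume.astype(float), sigma=s))
    return np.stack(responses)

def estimate_nodule(volume, seed, sigmas=(1, 2, 3, 4, 5)):
    stack = log_blob_response(volume, sigmas)
    z, y, x = seed
    scale_idx = int(np.argmax(stack[:, z, y, x]))   # scale with the strongest response at the seed
    # assumed 3-D blob relation radius ~ sqrt(3) * sigma
    return {"center": seed, "radius_vox": float(np.sqrt(3) * sigmas[scale_idx])}
```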

  1. Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback.

    Science.gov (United States)

    van der Krieke, Lian; Blaauw, Frank J; Emerencia, Ando C; Schenk, Hendrika M; Slaets, Joris P J; Bos, Elisabeth H; de Jonge, Peter; Jeronimus, Bertus F

    Recent developments in research and mobile health enable a quantitative idiographic approach in health research. The present study investigates the potential of an electronic diary crowdsourcing study in the Netherlands for (1) large-scale automated self-assessment for individual-based health promotion and (2) enabling research at both the between-persons and within-persons level. To illustrate the latter, we examined between-persons and within-persons associations between somatic symptoms and quality of life. A website provided the general Dutch population access to a 30-day (3 times a day) diary study assessing 43 items related to health and well-being, which gave participants personalized feedback. Associations between somatic symptoms and quality of life were examined with a linear mixed model. A total of 629 participants completed 28,430 assessments, with a mean (SD) of 45 (32) assessments per participant. Most participants (n = 517 [82%]) were women and 531 (84%) had high education. Almost 40% of the participants (n = 247) completed enough assessments (t = 68) to generate personalized feedback including temporal dynamics between well-being, health behavior, and emotions. Substantial between-person variability was found in the within-person association between somatic symptoms and quality of life. We successfully built an application for automated diary assessments and personalized feedback. The application was used by a sample of mainly highly educated women, which suggests that the potential of our intensive diary assessment method for large-scale health promotion is limited. However, a rich data set was collected that allows for group-level and idiographic analyses that can shed light on etiological processes and may contribute to the development of empirical-based health promotion solutions.
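
For the analysis step mentioned above (a linear mixed model relating somatic symptoms to quality of life), a minimal sketch using statsmodels might look as follows; the column names and the random-slope specification are assumptions for illustration, not the study's exact model.

```python
# Hedged sketch: random intercepts and slopes per participant separate between-person
# and within-person variability in the symptom / quality-of-life association.
import pandas as pd
import statsmodels.formula.api as smf

def fit_within_person_association(df: pd.DataFrame):
    # df is assumed to have columns: participant_id, somatic_symptoms, quality_of_life
    model = smf.mixedlm(
        "quality_of_life ~ somatic_symptoms",
        data=df,
        groups=df["participant_id"],
        re_formula="~somatic_symptoms",   # person-specific (random) slope for the association
    )
    return model.fit()
```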

  2. Reliability centered maintenance streamlining through lessons learned

    International Nuclear Information System (INIS)

    Strong, D.K.

    1991-01-01

In late 1986, PSE and G concluded that the Nuclear Department would develop a consistent approach to maintenance at Artificial Island (Salem and Hope Creek nuclear units). Preventive maintenance (PM) would be the heart of this approach. In the last six months of 1987 departments affected by the maintenance program participated in working groups that developed the Artificial Island maintenance philosophy. The central theme of the maintenance philosophy is the RCM (reliability centered maintenance) process. A pilot project tested the process in 1988. In 1989 the Central PM Group formed and in 1990 was given responsibility and authority to analyze, approve, implement, and control PM program changes. RCM is the central theme of the PM improvement effort but not the whole effort. Other important pieces included in this paper are: development of a common PM program, improvement of work instructions, development of predictive maintenance techniques into programs, development of a PM basis database, development of PM feedback from failure trends, root cause analysis, maintenance performance indicators, technicians, and engineers

  3. Intelligent systems approach for automated identification of individual control behavior of a human operator

    Science.gov (United States)

    Zaychik, Kirill B.

    Acceptable results have been obtained using conventional techniques to model the generic human operator's control behavior. However, little research has been done in an attempt to identify an individual based on his/her control behavior. The main hypothesis investigated in this dissertation is that different operators exhibit different control behavior when performing a given control task. Furthermore, inter-person differences are manifested in the amplitude and frequency content of the non-linear component of the control behavior. Two enhancements to the existing models of the human operator, which allow personalization of the modeled control behavior, are presented in this dissertation. One of the proposed enhancements accounts for the "testing" control signals, which are introduced by an operator for more accurate control of the system and/or to adjust his/her control strategy. Such enhancement uses the Artificial Neural Network (ANN), which can be fine-tuned to model the "testing" control behavior of a given individual. The other model enhancement took the form of an equiripple filter (EF), which conditions the power spectrum of the control signal before it is passed through the plant dynamics block. The filter design technique uses Parks-McClellan algorithm, which allows parameterization of the desired levels of power at certain frequencies. A novel automated parameter identification technique (APID) was developed to facilitate the identification process of the parameters of the selected models of the human operator. APID utilizes a Genetic Algorithm (GA) based optimization engine called the Bit-climbing Algorithm (BCA). Proposed model enhancements were validated using the experimental data obtained at three different sources: the Manual Control Laboratory software experiments, Unmanned Aerial Vehicle simulation, and NASA Langley Research Center Visual Motion Simulator studies. Validation analysis involves comparison of the actual and simulated control
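
The equiripple-filter enhancement mentioned above relies on the Parks-McClellan algorithm; the sketch below uses scipy.signal.remez to design such a filter with prescribed gain levels per frequency band. All band edges, gains and tap counts are arbitrary example values, not those used in the dissertation.

```python
# Hedged illustration of Parks-McClellan (equiripple) FIR design with per-band gain targets.
import numpy as np
from scipy.signal import remez, freqz

fs = 100.0                                    # sampling rate of the control signal (assumed)
bands = [0.0, 5.0, 10.0, 20.0, 25.0, fs / 2]  # Hz band edges: pass / intermediate / stop bands
desired = [1.0, 0.5, 0.0]                     # relative gain target for each band
taps = remez(numtaps=73, bands=bands, desired=desired, fs=fs)

w, h = freqz(taps, worN=2048, fs=fs)
print(f"gain near 2 Hz: {abs(h[np.argmin(np.abs(w - 2.0))]):.2f}")
```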

  4. Water table fluctuations and soil biogeochemistry: An experimental approach using an automated soil column system

    Science.gov (United States)

    Rezanezhad, F.; Couture, R.-M.; Kovac, R.; O'Connell, D.; Van Cappellen, P.

    2014-02-01

    Water table fluctuations significantly affect the biological and geochemical functioning of soils. Here, we introduce an automated soil column system in which the water table regime is imposed using a computer-controlled, multi-channel pump connected to a hydrostatic equilibrium reservoir and a water storage reservoir. The potential of this new system is illustrated by comparing results from two columns filled with 45 cm of the same homogenized riparian soil. In one soil column the water table remained constant at -20 cm below the soil surface, while in the other the water table oscillated between the soil surface and the bottom of the column, at a rate of 4.8 cm d-1. The experiment ran for 75 days at room temperature (25 ± 2 °C). Micro-sensors installed at -10 and -30 cm below the soil surface in the stable water table column recorded constant redox potentials on the order of 600 and -200 mV, respectively. In the fluctuating water table column, redox potentials at the same depths oscillated between oxidizing (∼700 mV) and reducing (∼-100 mV) conditions. Pore waters collected periodically and solid-phase analyses on core material obtained at the end of the experiment highlighted striking geochemical differences between the two columns, especially in the time series and depth distributions of Fe, Mn, K, P and S. Soil CO2 emissions derived from headspace gas analysis exhibited periodic variations in the fluctuating water table column, with peak values during water table drawdown. Transient redox conditions caused by the water table fluctuations enhanced microbial oxidation of soil organic matter, resulting in a pronounced depletion of particulate organic carbon in the midsection of the fluctuating water table column. Denaturing Gradient Gel Electrophoresis (DGGE) revealed the onset of differentiation of the bacterial communities in the upper (oxidizing) and lower (reducing) soil sections, although no systematic differences in microbial community structure

  5. Set Based PLM Implementation, a Modular Approach to PLM Process Knowledge, Management and Automation

    NARCIS (Netherlands)

    Koomen, Sebastiaan Pieter; Ríos, J.; Bernard, A.; Bouras, A.; Foufou, S.

    2017-01-01

In many cases PLM implementations are halted in the first phases of larger projects. On average, implementation projects take longer and cost more than planned, and not all goals are achieved despite modern software implementation methods like Agile or Scrum. This paper proposes another approach, in

  6. Guards: An approach safety-related systems using cots example of MMI and reactor automation in nuclear submarine application

    International Nuclear Information System (INIS)

    Brun, M.

    1998-01-01

For at least 10 years, the nuclear industry has designed and licensed specific digital safety-critical systems (IEC 1226 class A). One key issue for future programs is to design and licence safety-related systems providing more complex functions and using Commercial-Off-The-Shelf components. This issue is especially raised for reactor automation and the Man-Machine Interface. The usual I and C (Instrumentation and Control) organisation for these functions is based on redundancy between a commercial, up-to-date, unclassified system and a simplified classified system using traditional technologies. It clearly appears that such an organisation is not satisfying from the point of view of the people who actually have to operate these systems: the operator is supposed not to trust the normal system and to rely on the back-up system, which is less helpful and which he uses very rarely. This paper presents a new approach to that problem using COTS components in low-level layers, safety architecture and mechanisms at the medium-level layer (the GUARDS architecture developed in the current ESPRIT project number 20716), and a pre-validated functional layer. The aim of this solution is to comply with the IEC 1226 class B requirements, at lower overall cost (design, implementation, licensing, long-term confidence). This approach is illustrated by its application in the Man-Machine Interface (MMI) for our future nuclear submarine program. (author)

  7. What is a Dune: Developing AN Automated Approach to Extracting Dunes from Digital Elevation Models

    Science.gov (United States)

    Taylor, H.; DeCuir, C.; Wernette, P. A.; Taube, C.; Eyler, R.; Thopson, S.

    2016-12-01

Coastal dunes can absorb storm surge and mitigate inland erosion caused by elevated water levels during a storm. In order to understand how a dune responds to and recovers from a storm, it is important that we can first identify and differentiate the beach and dune from the rest of the landscape. Current literature does not provide a consistent definition of what the dune features (e.g. dune toe, dune crest) are or how they can be extracted. The purpose of this research is to develop enhanced approaches to extracting dunes from a digital elevation model (DEM). Manual delineation, convergence index, least-cost path, relative relief, and vegetation abundance were compared and contrasted on a small area of Padre Island National Seashore (PAIS). Preliminary results indicate that the method used to extract the dune greatly affects our interpretation of how the dune changes. The manual delineation method was time-intensive and subjective, while the convergence index approach was useful to easily identify the dune crest through maximum and minimum values. The least-cost path method proved to be time-intensive due to data clipping; however, this approach resulted in continuous geomorphic landscape features (e.g. dune toe, dune crest). While the relative relief approach shows the most features at multiple resolutions, it is difficult to assess the accuracy of the extracted features because extracted features appear as points that can vary widely in their location from one meter to the next. The vegetation approach was greatly impacted by the seasonal and annual fluctuations of growth but is advantageous in historical change studies because it can be used to extract consistent dune formation from historical aerial imagery. Improving our ability to more accurately assess dune response to and recovery from a storm will enable coastal managers to more accurately predict how dunes may respond to future climate change scenarios.

  8. Joint statistics and conditional mean strain rates of streamline segments

    International Nuclear Information System (INIS)

    Schaefer, P; Gampert, M; Peters, N

    2013-01-01

Based on four different direct numerical simulations of turbulent flows with Taylor-based Reynolds numbers ranging from Re_λ = 50 to 300, among which are two homogeneous isotropic decaying flows, one forced and one homogeneous shear flow, streamlines are identified and the obtained space curves are parameterized with the pseudo-time as well as the arclength. Based on local extrema of the absolute value of the velocity along the streamlines, the latter are partitioned into segments following Wang (2010 J. Fluid Mech. 648 183–203). Streamline segments are then statistically analyzed based on both parameterizations using the joint probability density function of the pseudo-time lag τ (respectively the arclength l) between the extrema and the velocity difference Δu at the extrema: P(τ,Δu) (respectively P(l,Δu)). We distinguish positive and negative streamline segments depending on the sign of the velocity difference Δu. Differences as well as similarities in the statistical description for both parameterizations are discussed. In particular, it turns out that the normalized probability distribution functions (pdfs) (of both parameterizations) of the length of positive, negative and all segments assume a universal shape for all Reynolds numbers and flow types and are well described by a model derived in Schaefer P et al (2012 Phys. Fluids 24 045104). Particular attention is given to the conditional mean velocity difference at the ending points of the segments, which can be understood as a first-order structure function in the context of streamline segment analysis. It determines to a large extent the stretching (compression) of positive (negative) streamline segments and corresponds to the convective velocity in phase space in the transport model equation for the pdf. While based on the random sweeping hypothesis a scaling ∝ (u_rms ε τ)^(1/3) is found for the parameterization based on the pseudo-time, the parameterization with the arclength l yields a much larger than expected l^(1/3) scaling.
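
To make the segmentation idea concrete, the following sketch splits a velocity-magnitude signal sampled along one streamline at its local extrema and records each segment's arclength l and end-point velocity difference Δu; this is an illustrative reading of the procedure, not the authors' code.

```python
# Hedged sketch: partition a streamline into segments between local extrema of |u|.
import numpy as np
from scipy.signal import argrelextrema

def streamline_segments(arclength, u_mag):
    """Return a list of segments, each with length l and velocity difference du."""
    ext = np.sort(np.concatenate([
        argrelextrema(u_mag, np.greater)[0],   # local maxima of |u|
        argrelextrema(u_mag, np.less)[0],      # local minima of |u|
    ]))
    segs = []
    for i0, i1 in zip(ext[:-1], ext[1:]):
        segs.append({"l": arclength[i1] - arclength[i0],   # segment arclength
                     "du": u_mag[i1] - u_mag[i0]})         # >0: positive segment, <0: negative
    return segs
```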

  9. Environmental protection: Streamlining petroleum exploration and production

    International Nuclear Information System (INIS)

    Hunt, A.M.

    1991-01-01

The petroleum industry is inherently subject to a tremendous degree of volatility through fluctuation in world market prices and vagaries of world politics. A more recent stressful demand on the existing domestic petroleum exploration and production system has been the burgeoning number of environmental regulations imposed on this segment of the industry. Prudent and acceptable oil-field practices must now include agency-regulated environmental protection measures. Many independent producers are unfamiliar not only with the regulatory agencies, but also with the jargon and ambiguities of regulations that vary widely from state to state. Whereas some companies perceive only the restrictions and added cost of regulatory compliance, other companies have sought to optimize benefits while minimizing financial burdens by approaching this modern necessity more creatively, thereby discovering numerous means to become even more competitive. The domestic oil field of the 1990s will be increasingly affected by environmental regulation and public opinion. A number of companies have taken a proactive position on environmental issues. Industry examples include Louisiana Land and Exploration Company's history of wetlands conservation and Chevron's SMART (Save Money and Reduce Toxics). The future of the quality of life of this nation, and indeed the planet as a whole, lies in our capability to deal concurrently with the issues of a petroleum-based economy while protecting the natural environment that sustains life

  10. Solving for the Surface: An Automated Approach to THEMIS Atmospheric Correction

    Science.gov (United States)

    Ryan, A. J.; Salvatore, M. R.; Smith, R.; Edwards, C. S.; Christensen, P. R.

    2013-12-01

    Here we present the initial results of an automated atmospheric correction algorithm for the Thermal Emission Imaging System (THEMIS) instrument, whereby high spectral resolution Thermal Emission Spectrometer (TES) data are queried to generate numerous atmospheric opacity values for each THEMIS infrared image. While the pioneering methods of Bandfield et al. [2004] also used TES spectra to atmospherically correct THEMIS data, the algorithm presented here is a significant improvement because of the reduced dependency on user-defined inputs for individual images. Additionally, this technique is particularly useful for correcting THEMIS images that have captured a range of atmospheric conditions and/or surface elevations, issues that have been difficult to correct for using previous techniques. Thermal infrared observations of the Martian surface can be used to determine the spatial distribution and relative abundance of many common rock-forming minerals. This information is essential to understanding the planet's geologic and climatic history. However, the Martian atmosphere also has absorptions in the thermal infrared which complicate the interpretation of infrared measurements obtained from orbit. TES has sufficient spectral resolution (143 bands at 10 cm-1 sampling) to linearly unmix and remove atmospheric spectral end-members from the acquired spectra. THEMIS has the benefit of higher spatial resolution (~100 m/pixel vs. 3x5 km/TES-pixel) but has lower spectral resolution (8 surface sensitive spectral bands). As such, it is not possible to isolate the surface component by unmixing the atmospheric contribution from the THEMIS spectra, as is done with TES. Bandfield et al. [2004] developed a technique using atmospherically corrected TES spectra as tie-points for constant radiance offset correction and surface emissivity retrieval. This technique is the primary method used to correct THEMIS but is highly susceptible to inconsistent results if great care in the

  11. A Novel Approach for Fully Automated, Personalized Health Coaching for Adults with Prediabetes: Pilot Clinical Trial.

    Science.gov (United States)

    Everett, Estelle; Kane, Brian; Yoo, Ashley; Dobs, Adrian; Mathioudakis, Nestoras

    2018-02-27

Prediabetes is a high-risk state for the future development of type 2 diabetes, which may be prevented through physical activity (PA), adherence to a healthy diet, and weight loss. Mobile health (mHealth) technology is a practical and cost-effective method of delivering diabetes prevention programs in a real-world setting. Sweetch (Sweetch Health, Ltd) is a fully automated, personalized mHealth platform designed to promote adherence to PA and weight reduction in people with prediabetes. The objective of this pilot study was to calibrate the Sweetch app and determine the feasibility, acceptability, safety, and effectiveness of the Sweetch app in combination with a digital body weight scale (DBWS) in adults with prediabetes. This was a 3-month prospective, single-arm, observational study of adults with a diagnosis of prediabetes and body mass index (BMI) between 24 kg/m² and 40 kg/m². Feasibility was assessed by study retention. Acceptability of the mobile platform and DBWS was evaluated using validated questionnaires. Effectiveness measures included change in PA, weight, BMI, glycated hemoglobin (HbA1c), and fasting blood glucose from baseline to 3-month visit. The significance of changes in outcome measures was evaluated using paired t test or Wilcoxon matched pairs test. The study retention rate was 47 out of 55 (86%) participants. There was a high degree of acceptability of the Sweetch app, with a median (interquartile range [IQR]) score of 78% (73%-80%) out of 100% on the validated System Usability Scale. Satisfaction regarding the DBWS was also high, with median (IQR) score of 93% (83%-100%). PA increased by 2.8 metabolic equivalent of task (MET)-hours per week (SD 6.8; P=.02), with mean weight loss of 1.6 kg (SD 2.5; P<.001) from baseline. The median change in HbA1c was -0.1% (IQR -0.2% to 0.1%; P=.04), with no significant change in fasting blood glucose (-1 mg/dL; P=.59). There were no adverse events reported. The Sweetch mobile

  12. An Evaluation of the Acquisition Streamlining Methods at the Fleet and Industrial Supply Center Pearl Harbor Hawaii

    National Research Council Canada - National Science Library

    Henry, Mark

    1999-01-01

    ...) Pearl Harbor's implementation of acquisition streamlining initiatives and recommends viable methods of streamlining the acquisition process at FISC Pearl Harbor and other Naval Supply Systems Command...

  13. Streamlining: Reducing costs and increasing STS operations effectiveness

    Science.gov (United States)

    Petersburg, R. K.

    1985-01-01

The development of streamlining as a concept, its inclusion in the space transportation system engineering and operations support (STSEOS) contract, and how it serves as an incentive to management and technical support personnel are discussed. The mechanics of encouraging and processing streamlining suggestions, reviews, feedback to submitters, recognition, and the use of individual employee performance evaluations for motivation are discussed. Several items that were implemented are mentioned. Information reported and the methodology of determining estimated dollar savings are outlined. The overall effect of this activity on the ability of the McDonnell Douglas flight preparation and mission operations team to support a rapidly increasing flight rate without a proportional increase in cost is illustrated.

  14. Streamline topology: Patterns in fluid flows and their bifurcations

    DEFF Research Database (Denmark)

    Brøns, Morten

    2007-01-01

Using dynamical systems theory, we consider structures such as vortices and separation in the streamline patterns of fluid flows. Bifurcation of patterns under variation of external parameters is studied using simplifying normal form transformations. Flows away from boundaries, flows close to fixed walls, and axisymmetric flows are analyzed in detail. We show how to apply the ideas from the theory to analyze numerical simulations of the vortex breakdown in a closed cylindrical container.

  15. Damage Detection with Streamlined Structural Health Monitoring Data

    OpenAIRE

    Li, Jian; Deng, Jun; Xie, Weizhi

    2015-01-01

The huge amounts of sensor data generated by large-scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems’ capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compressio...

  16. Zephyr: A secure Internet process to streamline engineering

    Energy Technology Data Exchange (ETDEWEB)

Jordan, C.W.; Niven, W.A.; Cavitt, R.E. [and others]

    1998-05-12

Lawrence Livermore National Laboratory (LLNL) is implementing an Internet-based process pilot called 'Zephyr' to streamline engineering and commerce using the Internet. Major benefits have accrued by using Zephyr in facilitating industrial collaboration, speeding the engineering development cycle, reducing procurement time, and lowering overall costs. Programs at LLNL are capitalizing on the efficiencies introduced since implementing Zephyr. Zephyr's pilot functionality is undergoing full integration with Business Systems, Finance, and Vendors to support major programs at the Laboratory.

  17. A Streamlined Artificial Variable Free Version of Simplex Method

    OpenAIRE

    Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad

    2015-01-01

This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it can start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1 without showing any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new ...

  18. Cubic Bezier Curve Approach for Automated Offline Signature Verification with Intrusion Identification

    Directory of Open Access Journals (Sweden)

    Arun Vijayaragavan

    2014-01-01

Authentication is a process of identifying a person’s rights over a system. Many authentication types are used in various systems, wherein biometric authentication systems are of special concern. Signature verification is a basic biometric authentication technique used widely. The signature matching algorithm uses image correlation and graph matching techniques, which can produce false rejections or acceptances. We propose a model to compare knowledge extracted from the signature. Intrusion into the signature repository system results in a copy of the signature that leads to false acceptance. Our approach uses a Bezier curve algorithm to identify the curve points and uses the behaviors of the signature for verification. An analyzing mobile agent is used to identify the input signature parameters and compare them with a reference signature repository. It identifies duplication of a signature due to intrusion and rejects it. Experiments are conducted on a database with thousands of signature images from various sources and the results are favorable.
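
The building block of the approach above is the cubic Bezier curve; a minimal sketch of evaluating such a curve from four control points (so that sampled points can be compared against a reference signature) is shown below. The control points and any downstream comparison are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=100):
    """Sample n points on the cubic Bezier curve defined by control points p0..p3 (Bernstein form)."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Toy usage: a gentle S-shaped stroke from four 2-D control points
pts = cubic_bezier(np.array([0.0, 0.0]), np.array([1.0, 2.0]),
                   np.array([2.0, -1.0]), np.array([3.0, 1.0]))
print(pts.shape)  # (100, 2)
```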

  19. Are Flow Injection-based Approaches Suitable for Automated Handling of Solid Samples?

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Cerdà, Victor

    Flow-based approaches were originally conceived for liquid-phase analysis, implying that constituents in solid samples generally had to be transferred into the liquid state, via appropriate batch pretreatment procedures, prior to analysis. Yet, in recent years, much effort has been focused...... electrolytic or aqueous leaching, on-line dialysis/microdialysis, in-line filtration, and pervaporation-based procedures have been successfully implemented in continuous flow/flow injection systems. In this communication, the new generation of flow analysis, including sequential injection, multicommutated flow.......g., soils, sediments, sludges), and thus, ascertaining the potential mobility, bioavailability and eventual impact of anthropogenic elements on biota [2]. In this context, the principles of sequential injection-microcolumn extraction (SI-MCE) for dynamic fractionation are explained in detail along...

  20. An automated approach for early detection of diabetic retinopathy using SD-OCT images.

    Science.gov (United States)

    ElTanboly, Ahmed H; Palacio, Agustina; Shalaby, Ahmed M; Switala, Andrew E; Helmy, Omar; Schaal, Shlomit; El-Baz, Ayman

    2018-01-01

This study demonstrates the feasibility of an automatic approach for early detection of diabetic retinopathy (DR) from SD-OCT images. Scans were prospectively collected from 200 subjects through the fovea and then automatically segmented into 12 layers. Each layer was characterized by its thickness, tortuosity, and normalized reflectivity. 26 diabetic patients, without DR changes visible by funduscopic examination, were matched with 26 controls, according to age and sex, for purposes of statistical analysis using mixed effects ANOVA. The INL was narrower in diabetes (p = 0.14), while the NFL (p = 0.04) and IZ (p = 0.34) were thicker. Tortuosity of layers NFL through the OPL was greater in diabetes (all p diabetes. In turn, this carries the promise of a reliable non-invasive diagnostic tool for early detection of DR.

  1. Dividing Streamline Formation Channel Confluences by Physical Modeling

    Directory of Open Access Journals (Sweden)

    Minarni Nur Trilita

    2010-02-01

Channel confluences are often found in open channel network systems and are among their most important elements. The flow entering the main channel from a branch channel takes various forms and causes vortex flow. This phenomenon can cause erosion of the channel side walls, scour of the channel bed, and sedimentation downstream of the confluence. Controlling these problems requires research into the width of the incoming flow from the branch channel. The flow entering the main channel from the branch channel is bounded by a dividing streamline. In this paper, the width of the dividing streamline is observed in the laboratory using a physical model of two square open channels that form an angle of 30º. Observations were made with a variety of flows coming from each channel. The laboratory results show that the width of the dividing streamline is influenced by the discharge ratio between the branch channel and the main channel, while a comparison with previous studies shows that the laboratory observations are smaller than the results of previous research.

  2. Automated Classification of Radiology Reports for Acute Lung Injury: Comparison of Keyword and Machine Learning Based Natural Language Processing Approaches.

    Science.gov (United States)

    Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M

    2009-11-01

    This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigram and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-gram achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword based system and achieves comparable results to highest performing physician annotators.
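
The best-performing configuration reported above (a maximum entropy model over character 6-grams) can be sketched with scikit-learn as follows, using multinomial logistic regression as the MaxEnt classifier; the toy reports and labels are placeholders, not the study corpora.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy report texts and labels (1 = ALI, 0 = not ALI); the real corpora held 96 and 857 reports
reports = ["bilateral airspace opacities consistent with ali",
           "clear lungs, no acute cardiopulmonary process"]
labels = [1, 0]

maxent_char6 = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(6, 6)),  # character 6-gram features
    LogisticRegression(max_iter=1000),                     # maximum entropy classifier
)
maxent_char6.fit(reports, labels)
print(maxent_char6.predict(["diffuse bilateral airspace opacities"]))
```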

3. A new method for calculating volumetric sweep efficiency using streamline simulation concepts

    International Nuclear Information System (INIS)

    Hidrobo, E A

    2000-01-01

    One of the purposes of reservoir engineering is to quantify the volumetric sweep efficiency for optimizing reservoir management decisions. The estimation of this parameter has always been a difficult task. Until now, sweep efficiency correlations and calculations have been limited to mostly homogeneous 2-D cases. Calculating volumetric sweep efficiency in a 3-D heterogeneous reservoir becomes difficult due to inherent complexity of multiple layers and arbitrary well configurations. In this paper, a new method for computing volumetric sweep efficiency for any arbitrary heterogeneity and well configuration is presented. The proposed method is based on Datta-Gupta and King's formulation of streamline time-of-flight (1995). Given the fact that the time-of-flight reflects the fluid front propagation at various times, then the connectivity in the time-of-flight represents a direct measure of the volumetric sweep efficiency. The proposed approach has been applied to synthetic as well as field examples. Synthetic examples are used to validate the volumetric sweep efficiency calculations using the streamline time-of-flight connectivity criterion by comparison with analytic solutions and published correlations. The field example, which illustrates the feasibility of the approach for large-scale field applications, is from the north Robertson unit, a low permeability carbonate reservoir in west Texas
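
The connectivity idea described above reduces, in its simplest form, to a pore-volume-weighted fraction of cells whose streamline time-of-flight does not exceed the elapsed time; the toy sketch below illustrates that reading and is not the paper's full workflow.

```python
# Hedged numeric sketch: sweep efficiency at time t from per-cell time-of-flight and pore volume.
import numpy as np

def sweep_efficiency(tof, pore_volume, t):
    swept = pore_volume[tof <= t].sum()      # pore volume already reached by the displacing front
    return swept / pore_volume.sum()

tof = np.array([0.2, 0.5, 1.0, 2.5, 4.0])    # time-of-flight per cell (years, assumed)
pv = np.array([1.0, 1.0, 2.0, 1.0, 1.0])     # pore volume per cell (arbitrary units)
print(sweep_efficiency(tof, pv, t=1.0))      # -> 0.666..., i.e. 4 of 6 pore-volume units swept
```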

  4. A machine learning approach to automated structural network analysis: application to neonatal encephalopathy.

    Directory of Open Access Journals (Sweden)

    Etay Ziv

    Full Text Available Neonatal encephalopathy represents a heterogeneous group of conditions associated with life-long developmental disabilities and neurological deficits. Clinical measures and current anatomic brain imaging remain inadequate predictors of outcome in children with neonatal encephalopathy. Some studies have suggested that brain development and, therefore, brain connectivity may be altered in the subgroup of patients who subsequently go on to develop clinically significant neurological abnormalities. Large-scale structural brain connectivity networks constructed using diffusion tractography have been posited to reflect organizational differences in white matter architecture at the mesoscale, and thus offer a unique tool for characterizing brain development in patients with neonatal encephalopathy. In this manuscript we use diffusion tractography to construct structural networks for a cohort of patients with neonatal encephalopathy. We systematically map these networks to a high-dimensional space and then apply standard machine learning algorithms to predict neurological outcome in the cohort. Using nested cross-validation we demonstrate high prediction accuracy that is both statistically significant and robust over a broad range of thresholds. Our algorithm offers a novel tool to evaluate neonates at risk for developing neurological deficit. The described approach can be applied to any brain pathology that affects structural connectivity.
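
The evaluation strategy mentioned above, nested cross-validation, can be sketched with scikit-learn as follows; the feature matrix (vectorized connectivity networks), labels and the linear SVM are illustrative assumptions rather than the authors' exact pipeline.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 300))          # 40 subjects x 300 connectivity features (toy data)
y = rng.integers(0, 2, size=40)         # binary neurological outcome label (toy data)

# Inner loop: tune the SVM regularization strength on the training folds only
inner = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1.0]}, cv=StratifiedKFold(3))
# Outer loop: estimate out-of-sample accuracy of the whole tuned pipeline
outer_scores = cross_val_score(inner, X, y, cv=StratifiedKFold(5))
print(outer_scores.mean())
```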

  5. A hybrid computational-experimental approach for automated crystal structure solution

    Science.gov (United States)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  6. Improving the iterative Linear Interaction Energy approach using automated recognition of configurational transitions.

    Science.gov (United States)

    Vosmeer, C Ruben; Kooi, Derk P; Capoferri, Luigi; Terpstra, Margreet M; Vermeulen, Nico P E; Geerke, Daan P

    2016-01-01

    Recently an iterative method was proposed to enhance the accuracy and efficiency of ligand-protein binding affinity prediction through linear interaction energy (LIE) theory. For ligand binding to flexible Cytochrome P450s (CYPs), this method was shown to decrease the root-mean-square error and standard deviation of error prediction by combining interaction energies of simulations starting from different conformations. Thereby, different parts of protein-ligand conformational space are sampled in parallel simulations. The iterative LIE framework relies on the assumption that separate simulations explore different local parts of phase space, and do not show transitions to other parts of configurational space that are already covered in parallel simulations. In this work, a method is proposed to (automatically) detect such transitions during the simulations that are performed to construct LIE models and to predict binding affinities. Using noise-canceling techniques and splines to fit time series of the raw data for the interaction energies, transitions during simulation between different parts of phase space are identified. Boolean selection criteria are then applied to determine which parts of the interaction energy trajectories are to be used as input for the LIE calculations. Here we show that this filtering approach benefits the predictive quality of our previous CYP 2D6-aryloxypropanolamine LIE model. In addition, an analysis is performed of the gain in computational efficiency that can be obtained from monitoring simulations using the proposed filtering method and by prematurely terminating simulations accordingly.

  7. A rapid approach for automated comparison of independently derived stream networks

    Science.gov (United States)

    Stanislawski, Larry V.; Buttenfield, Barbara P.; Doumbouya, Ariel T.

    2015-01-01

    This paper presents an improved coefficient of line correspondence (CLC) metric for automatically assessing the similarity of two different sets of linear features. Elevation-derived channels at 1:24,000 scale (24K) are generated from a weighted flow-accumulation model and compared to 24K National Hydrography Dataset (NHD) flowlines. The CLC process conflates two vector datasets through a raster line-density differencing approach that is faster and more reliable than earlier methods. Methods are tested on 30 subbasins distributed across different terrain and climate conditions of the conterminous United States. CLC values for the 30 subbasins indicate 44–83% of the features match between the two datasets, with the majority of the mismatching features comprised of first-order features. Relatively lower CLC values result from subbasins with less than about 1.5 degrees of slope. The primary difference between the two datasets may be explained by different data capture criteria. First-order, headwater tributaries derived from the flow-accumulation model are captured more comprehensively through drainage area and terrain conditions, whereas capture of headwater features in the NHD is cartographically constrained by tributary length. The addition of missing headwaters to the NHD, as guided by the elevation-derived channels, can substantially improve the scientific value of the NHD.

  8. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover as most smallholder systems occur in the tropics. We develop an automated method termed the MODIS Scaling Approach (MSA) to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m² or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R² ranging between 0.19 and 0.89, with an overall R² of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods
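
As a rough illustration of the scaling idea described above, the sketch below flags a pixel as winter-cropped when its seasonal EVI peak exceeds a bare-soil level and scales the within-pixel cropped fraction linearly up to a full-canopy level; the thresholds and the linear form are assumptions, not the calibrated MSA parameters.

```python
import numpy as np

def winter_cropped_fraction(evi_series, evi_bare=0.2, evi_full=0.6):
    """evi_series: EVI samples spanning the winter growing season for one pixel (assumed cloud-free)."""
    peak = float(np.max(evi_series))
    if peak <= evi_bare:                      # no winter green-up -> pixel treated as not cropped
        return 0.0
    frac = (peak - evi_bare) / (evi_full - evi_bare)
    return float(np.clip(frac, 0.0, 1.0))     # within-pixel cropped fraction in [0, 1]

print(winter_cropped_fraction([0.18, 0.25, 0.41, 0.33]))  # ~0.53
```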

  9. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  10. Automated Inspection of Defects in Optical Fiber Connector End Face Using Novel Morphology Approaches.

    Science.gov (United States)

    Mei, Shuang; Wang, Yudan; Wen, Guojun; Hu, Yang

    2018-05-03

    Increasing deployment of optical fiber networks and the need for reliable high bandwidth make the task of inspecting optical fiber connector end faces a crucial process that must not be neglected. Traditional end face inspections are usually performed by manual visual methods, which are low in efficiency and poor in precision for long-term industrial applications. More seriously, the inspection results cannot be quantified for subsequent analysis. Aiming at the characteristics of typical defects in the inspection process for optical fiber end faces, we propose a novel method, “difference of min-max ranking filtering” (DO2MR), for detection of region-based defects, e.g., dirt, oil, contamination, pits, and chips, and a special model, a “linear enhancement inspector” (LEI), for the detection of scratches. The DO2MR is a morphology method that intends to determine whether a pixel belongs to a defective region by comparing the difference of gray values of pixels in the neighborhood around the pixel. The LEI is also a morphology method that is designed to search for scratches at different orientations with a special linear detector. These two approaches can be easily integrated into optical inspection equipment for automatic quality verification. As far as we know, this is the first time that complete defect detection methods for optical fiber end faces are available in the literature. Experimental results demonstrate that the proposed DO2MR and LEI models yield good comprehensive performance with high precision and accepted recall rates, and the image-level detection accuracies reach 96.0 and 89.3%, respectively.
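
A plausible minimal reading of the DO2MR idea, the difference between the maximum and minimum gray value in each pixel's neighborhood followed by a threshold, is sketched below; the window size and threshold rule are assumptions, not the published parameters.

```python
# Hedged sketch of a min-max "ranking" residual for region-based defect candidates.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def do2mr_defect_mask(gray, window=5, k=3.0):
    gray = gray.astype(float)
    residual = maximum_filter(gray, size=window) - minimum_filter(gray, size=window)
    thresh = residual.mean() + k * residual.std()   # simple global threshold (assumed)
    return residual > thresh                        # True where a region-based defect is suspected
```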

  11. Automated Inspection of Defects in Optical Fiber Connector End Face Using Novel Morphology Approaches

    Directory of Open Access Journals (Sweden)

    Shuang Mei

    2018-05-01

Increasing deployment of optical fiber networks and the need for reliable high bandwidth make the task of inspecting optical fiber connector end faces a crucial process that must not be neglected. Traditional end face inspections are usually performed by manual visual methods, which are low in efficiency and poor in precision for long-term industrial applications. More seriously, the inspection results cannot be quantified for subsequent analysis. Aiming at the characteristics of typical defects in the inspection process for optical fiber end faces, we propose a novel method, “difference of min-max ranking filtering” (DO2MR), for detection of region-based defects, e.g., dirt, oil, contamination, pits, and chips, and a special model, a “linear enhancement inspector” (LEI), for the detection of scratches. The DO2MR is a morphology method that intends to determine whether a pixel belongs to a defective region by comparing the difference of gray values of pixels in the neighborhood around the pixel. The LEI is also a morphology method that is designed to search for scratches at different orientations with a special linear detector. These two approaches can be easily integrated into optical inspection equipment for automatic quality verification. As far as we know, this is the first time that complete defect detection methods for optical fiber end faces are available in the literature. Experimental results demonstrate that the proposed DO2MR and LEI models yield good comprehensive performance with high precision and accepted recall rates, and the image-level detection accuracies reach 96.0 and 89.3%, respectively.

  12. Application aware approach to compression and transmission of H.264 encoded video for automated and centralized transportation surveillance.

    Science.gov (United States)

    2012-10-01

In this report we present a transportation video coding and wireless transmission system specifically tailored to automated vehicle tracking applications. By taking into account the video characteristics and the lossy nature of the wireless channe...

  13. SEMI-AUTOMATED APPROACH FOR MAPPING URBAN TREES FROM INTEGRATED AERIAL LiDAR POINT CLOUD AND DIGITAL IMAGERY DATASETS

    Directory of Open Access Journals (Sweden)

    M. A. Dogon-Yaro

    2016-09-01

Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from these detailed, up-to-date data sources. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with some constraints, such as labour-intensive field work and high financial requirements, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to predominant studies on tree extraction, mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper reveals that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.

  14. Approaches towards the automated interpretation and prediction of electrospray tandem mass spectra of non-peptidic combinatorial compounds.

    Science.gov (United States)

    Klagkou, Katerina; Pullen, Frank; Harrison, Mark; Organ, Andy; Firth, Alistair; Langley, G John

    2003-01-01

    Combinatorial chemistry is widely used within the pharmaceutical industry as a means of rapid identification of potential drugs. With the growth of combinatorial libraries, mass spectrometry (MS) became the key analytical technique because of its speed of analysis, sensitivity, accuracy and ability to be coupled with other analytical techniques. In the majority of cases, electrospray mass spectrometry (ES-MS) has become the default ionisation technique. However, due to the absence of fragment ions in the resulting spectra, tandem mass spectrometry (MS/MS) is required to provide structural information for the identification of an unknown analyte. This work discusses the first steps of an investigation into the fragmentation pathways taking place in electrospray tandem mass spectrometry. The ultimate goal for this project is to set general fragmentation rules for non-peptidic, pharmaceutical, combinatorial compounds. As an aid, an artificial intelligence (AI) software package is used to facilitate interpretation of the spectra. This initial study has focused on determining the fragmentation rules for some classes of compound types that fit the remit as outlined above. Based on studies carried out on several combinatorial libraries of these compounds, it was established that different classes of drug molecules follow unique fragmentation pathways. In addition to these general observations, the specific ionisation processes and the fragmentation pathways involved in the electrospray mass spectra of these systems were explored. The ultimate goal will be to incorporate our findings into the computer program and allow identification of an unknown, non-peptidic compound following insertion of its ES-MS/MS spectrum into the AI package. The work herein demonstrates the potential benefit of such an approach in addressing the issue of high-throughput, automated MS/MS data interpretation. Copyright 2003 John Wiley & Sons, Ltd.

  15. Migraine Subclassification via a Data-Driven Automated Approach Using Multimodality Factor Mixture Modeling of Brain Structure Measurements.

    Science.gov (United States)

    Schwedt, Todd J; Si, Bing; Li, Jing; Wu, Teresa; Chong, Catherine D

    2017-07-01

The current subclassification of migraine is according to headache frequency and aura status. The variability in migraine symptoms, disease course, and response to treatment suggests the presence of additional heterogeneity or subclasses within migraine. The study objective was to subclassify migraine via a data-driven approach, identifying latent factors by jointly exploiting multiple sets of brain structural features obtained via magnetic resonance imaging (MRI). Migraineurs (n = 66) and healthy controls (n = 54) had brain MRI measurements of cortical thickness, cortical surface area, and volumes for 68 regions. A multimodality factor mixture model was used to subclassify MRIs and to determine the brain structural factors that most contributed to the subclassification. Clinical characteristics of subjects in each subgroup were compared. Automated MRI classification divided the subjects into two subgroups. Migraineurs in subgroup #1 had more severe allodynia symptoms during migraines (6.1 ± 5.3 vs. 3.6 ± 3.2, P = .03), more years with migraine (19.2 ± 11.3 years vs 13 ± 8.3 years, P = .01), and higher Migraine Disability Assessment (MIDAS) scores (25 ± 22.9 vs 15.7 ± 12.2, P = .04). There were no differences in headache frequency or migraine aura status between the two subgroups. Data-driven subclassification of brain MRIs based upon structural measurements identified two subgroups. Amongst migraineurs, the subgroups differed in allodynia symptom severity, years with migraine, and migraine-related disability. Since allodynia is associated with this imaging-based subclassification of migraine and prior publications suggest that allodynia impacts migraine treatment response and disease prognosis, future migraine diagnostic criteria could consider allodynia when defining migraine subgroups. © 2017 American Headache Society.

  16. Automated Grading of Gliomas using Deep Learning in Digital Pathology Images: A modular approach with ensemble of convolutional neural networks.

    Science.gov (United States)

    Ertosun, Mehmet Günhan; Rubin, Daniel L

    2015-01-01

Brain glioma is the most common primary malignant brain tumor in adults, with different pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. The survival and treatment options are highly dependent on the glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic quality statistics, such as precision, sensitivity and specificity, of the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to developing less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or use of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirements of minimum accuracy levels that are demanded by the context of different decision points within a multi-class classification scheme. Convolutional Neural Networks are trained for each module for each sub-task with more than 90% classification accuracy on the validation data set, and achieved a classification accuracy of 96% for the task of GBM vs LGG classification and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set coming from new patients from the multi-institutional repository.

  17. Semi-Automated Approach for Mapping Urban Trees from Integrated Aerial LiDAR Point Cloud and Digital Imagery Datasets

    Science.gov (United States)

    Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from this detailed, up-to-date data source. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are constrained by labour-intensive field work and high cost, limitations that can be overcome by means of integrated LiDAR and digital image datasets. In contrast to most studies on tree extraction, which focus on purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper shows that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in understanding how much canopy cover exists, identifying new planting, removal, or reforestation opportunities, and determining which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes in the urban trees over time and inform future management decisions.

  18. Streamlining digital signal processing a tricks of the trade guidebook

    CERN Document Server

    2012-01-01

    Streamlining Digital Signal Processing, Second Edition, presents recent advances in DSP that simplify or increase the computational speed of common signal processing operations and provides practical, real-world tips and tricks not covered in conventional DSP textbooks. It offers new implementations of digital filter design, spectrum analysis, signal generation, high-speed function approximation, and various other DSP functions. It provides: great tips, tricks of the trade, secrets, practical shortcuts, and clever engineering solutions from seasoned signal processing professionals; an assortment.

  19. Streamlined library programming how to improve services and cut costs

    CERN Document Server

    Porter-Reynolds, Daisy

    2014-01-01

    In their roles as community centers, public libraries offer many innovative and appealing programs; but under current budget cuts, library resources are stretched thin. With slashed budgets and limited staff hours, what can libraries do to best serve their publics? This how-to guide provides strategies for streamlining library programming in public libraries while simultaneously maintaining, or even improving, quality delivery. The wide variety of principles and techniques described can be applied on a selective basis to libraries of all sizes. Based upon the author's own extensive experience as

  20. Topology of streamlines and vorticity contours for two - dimensional flows

    DEFF Research Database (Denmark)

    Andersen, Morten

    on the vortex filament by the localised induction approximation the stream function is slightly modified and an extra parameter is introduced. In this setting two new flow topologies arise, but not more than two critical points occur for any combination of the parameters. The analysis of the closed form show...... by a point vortex above a wall in inviscid fluid. There is no reason to a priori expect equivalent results of the three vortex definitions. However, the study is mainly motivated by the findings of Kudela & Malecha (Fluid Dyn. Res. 41, 2009) who find good agreement between the vorticity and streamlines...

  1. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 2: Approaches based on impregnated membranes and porous supports.

    Science.gov (United States)

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-11

    A critical overview of the automation of modern liquid phase microextraction (LPME) approaches based on the liquid impregnation of porous sorbents and membranes is presented. It is the continuation of part 1, in which non-dispersive LPME techniques based on the use of the extraction phase (EP) in the form of drop, plug, film, or microflow have been surveyed. Compared to the approaches described in part 1, porous materials provide an improved support for the EP. Simultaneously, they enlarge its contact surface and reduce the risk of loss by incident flow or by components of the surrounding matrix. Solvent-impregnated membranes or hollow fibres are further ideally suited for analyte extraction with simultaneous or subsequent back-extraction. Their use can therefore improve procedure robustness and reproducibility, and it "opens the door" to new operation modes and fields of application. However, additional work and time are required for membrane replacement and renewed impregnation. Automation of porous support-based and membrane-based approaches plays an important role in achieving better reliability, rapidity, and reproducibility compared to manual assays. Automated renewal of the extraction solvent and coupling of sample pretreatment with the detection instrumentation can be named as examples. The different LPME methodologies using impregnated membranes and porous supports for the extraction phase, the different strategies for their automation, and their analytical applications are comprehensively described and discussed in this part. Finally, an outlook on future demands and perspectives of LPME techniques from both parts as a promising area in the field of sample pretreatment is given. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, it demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are paramount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog-based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972)

  3. Evaluation of two streamlined life cycle assessment methods

    International Nuclear Information System (INIS)

    Hochschomer, Elisabeth; Finnveden, Goeran; Johansson, Jessica

    2002-02-01

    Two different methods for streamlined life cycle assessment (LCA) are described: the MECO method and SLCA. Both methods are tested on an existing case study of cars fuelled with petrol or ethanol, and electric cars with electricity produced from hydro power or coal. The report also contains some background information on LCA and streamlined LCA, and a description of the case study used. The evaluation of the MECO and SLCA methods is based on a comparison of the results from the case study as well as practical aspects. One conclusion is that the SLCA method has some limitations. Among the limitations are that the whole life cycle is not covered, that it requires quite a lot of information, and that there is room for arbitrariness. It is not very flexible, which makes it difficult to develop further. We are therefore not recommending the SLCA method. The MECO method, in comparison, shows several attractive features. It is also interesting to note that the MECO method produces information that is complementary to a more traditional quantitative LCA. We suggest that the MECO method needs some further development and adjustment to Swedish conditions.

  4. Study of streamline flow in the portal system

    International Nuclear Information System (INIS)

    Atkins, H.L.; Deitch, J.S.; Oster, Z.H.; Perkes, E.A.

    1985-01-01

    The study was undertaken to determine if streamline flow occurs in the portal vein, thus separating inflow from the superior mesenteric artery (SMA) and the inferior mesenteric artery. Previously published data on this subject is inconsistent. Patients undergoing abdominal angiography received two administrations of Tc-99m sulfur colloid, first via the SMA during angiography and, after completion of the angiographic procedure, via a peripheral vein (IV). Anterior images of the liver were recorded over a three minute acquisition before and after the IV injection without moving the patient. The image from the SMA injection was subtracted from the SMA and IV image to provide a pure IV image. Analysis of R to L ratios for selected regions of interest as well as whole lobes was carried out and the shift of R to L (SMA to IV) determined. Six patients had liver metastases from the colon, four had cirrhosis and four had no known liver disease. The shift in the ratio was highly variable without a consistent pattern. Large changes in some patients could be attributed to hepatic artery flow directed to metastases. No consistent evidence for streamlining of portal flow was discerned

  5. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
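    The constraint-screening idea behind such build-optimization tools can be pictured with a small local check; the GC-content window and homopolymer limit below are generic assumptions made for the sketch, not BOOST's actual vendor rules or its API:

    ```python
    # Illustrative screen for two common DNA-synthesis constraints: overall GC content
    # and long homopolymer runs. Thresholds are assumptions, not BOOST's rule set.
    def violates_synthesis_constraints(seq, gc_min=0.25, gc_max=0.65, max_homopolymer=8):
        seq = seq.upper()
        problems = []
        gc = (seq.count("G") + seq.count("C")) / len(seq)
        if not gc_min <= gc <= gc_max:
            problems.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
        for base in "ACGT":
            if base * (max_homopolymer + 1) in seq:
                problems.append(f"homopolymer run of {base} longer than {max_homopolymer}")
        return problems

    # Example: a sequence with a long A-run and very high GC content fails both checks.
    print(violates_synthesis_constraints("ATG" + "A" * 12 + "GGGCCC" * 20))
    ```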

  6. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    Science.gov (United States)

    Popa, L.; Popa, V.

    2017-08-01

    The article is focused on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm's operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation controller. The innovative modeling and simulation procedures address specific problems in the development of a new type of technical product in the field of robotics. In this way, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions of varying, and often high, levels of complexity.

  7. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  8. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  9. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  10. Identifying and Quantifying Cultural Factors That Matter to the IT Workforce: An Approach Based on Automated Content Analysis

    DEFF Research Database (Denmark)

    Schmiedel, Theresa; Müller, Oliver; Debortoli, Stefan

    2016-01-01

    builds on 112,610 online reviews of Fortune 500 IT companies collected from Glassdoor, an online platform on which current and former employees can anonymously review companies and their management. We perform an automated content analysis to identify cultural factors that employees emphasize...

  11. The streamline upwind Petrov-Galerkin stabilising method for the numerical solution of highly advective problems

    Directory of Open Access Journals (Sweden)

    Carlos Humberto Galeano Urueña

    2009-05-01

    Full Text Available This article describes the streamline upwind Petrov-Galerkin (SUPG) method as a stabilisation technique for solving the diffusion-advection-reaction equation by finite elements. The first part of the article gives a short analysis of the importance of this type of differential equation in modelling physical phenomena in multiple fields. A one-dimensional description of the SUPG method is then given and extended to two and three dimensions. The outcome of a strongly advective experiment of high numerical complexity is presented. The results show how the implemented version of the SUPG technique produced stabilised approximations in space, even for high Peclet numbers. Additional graphs of the numerical experiments presented here can be downloaded from www.gnum.unal.edu.co.
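    As a rough illustration of the stabilisation idea, the sketch below assembles the 1D steady advection-diffusion model problem a*u' - k*u'' = 0 with linear elements and adds the classical SUPG term with its optimal stabilisation parameter; the coefficients and mesh are illustrative assumptions, not the article's multidimensional test cases:

    ```python
    # 1D SUPG sketch for a*u' - k*u'' = 0 on (0,1), u(0)=0, u(1)=1, linear elements.
    import numpy as np

    a, k, n = 1.0, 0.005, 20          # advection speed, diffusivity, number of elements
    h = 1.0 / n
    Pe = a * h / (2.0 * k)            # element Peclet number (advection dominated if >> 1)
    tau = (h / (2.0 * a)) * (1.0 / np.tanh(Pe) - 1.0 / Pe)   # optimal SUPG parameter

    # Element matrices for linear shape functions
    K_diff = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])            # diffusion
    C_adv  = (a / 2.0) * np.array([[-1.0, 1.0], [-1.0, 1.0]])          # Galerkin advection
    K_supg = (tau * a * a / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # SUPG extra term

    A = np.zeros((n + 1, n + 1))
    for e in range(n):                # assemble the global system matrix
        idx = [e, e + 1]
        A[np.ix_(idx, idx)] += K_diff + C_adv + K_supg

    b = np.zeros(n + 1)
    A[0, :], A[0, 0], b[0] = 0.0, 1.0, 0.0       # Dirichlet condition u(0) = 0
    A[-1, :], A[-1, -1], b[-1] = 0.0, 1.0, 1.0   # Dirichlet condition u(1) = 1
    u = np.linalg.solve(A, b)

    x = np.linspace(0.0, 1.0, n + 1)
    u_exact = (np.exp(a * x / k) - 1.0) / (np.exp(a / k) - 1.0)
    print("max nodal error:", np.max(np.abs(u - u_exact)))   # small for this 1D model problem
    ```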

  12. Investigating the effects of streamline-based fiber tractography on matrix scaling in brain connective network.

    Science.gov (United States)

    Jan, Hengtai; Chao, Yi-Ping; Cho, Kuan-Hung; Kuo, Li-Wei

    2013-01-01

    Investigating the brain connective network using modern graph theory has been widely applied in cognitive and clinical neuroscience research. In this study, we aimed to investigate the effects of streamline-based fiber tractography on changes in network properties and established a systematic framework for determining an adequate network matrix scaling. The network properties, including degree, efficiency and betweenness centrality, show a similar tendency in both left and right hemispheres. By employing a curve-fitting process with an exponential law and measuring the residuals, the association between changes in network properties and the track-number threshold is found, and an adequate range for investigating the lateralization of the brain network is suggested. The proposed approach can be further applied in clinical applications to improve diagnostic sensitivity using network analysis with graph theory.
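    The curve-fitting step can be pictured with a small synthetic example, assuming (for illustration only) that a network property decays exponentially with the track-number threshold:

    ```python
    # Fit an exponential law to a network property measured at increasing streamline-count
    # thresholds and inspect the residuals; data and functional form are made-up examples.
    import numpy as np
    from scipy.optimize import curve_fit

    def exp_law(t, a, b, c):
        return a * np.exp(-t / b) + c

    thresholds = np.arange(1, 51)                    # track-number thresholds
    rng = np.random.default_rng(0)
    efficiency = exp_law(thresholds, 0.15, 8.0, 0.42) + rng.normal(0, 0.005, thresholds.size)

    popt, _ = curve_fit(exp_law, thresholds, efficiency, p0=(0.1, 5.0, 0.4))
    residuals = efficiency - exp_law(thresholds, *popt)
    print("fitted (a, b, c):", np.round(popt, 3))
    print("RMS residual:", np.sqrt(np.mean(residuals ** 2)))
    ```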

  13. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution for facing the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and the demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  14. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  15. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry drove the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on technological development is also considered.

  16. The Zig-zag Instability of Streamlined Bodies

    Science.gov (United States)

    Guillet, Thibault; Coux, Martin; Quere, David; Clanet, Christophe

    2017-11-01

    When a floating bluff body, like a sphere, impacts water with a vertical velocity, its trajectory is straight and the depth of its dive increases with its initial velocity. Even though we observe the same phenomenon at low impact speed for axisymmetric streamlined bodies, the trajectory is found to deviate from the vertical when the velocity exceeds a critical value. This instability results from a competition between the destabilizing torque of the lift and the stabilizing torque of the Archimedes (buoyancy) force. Balancing these torques yields a prediction of the critical velocity above which the instability appears. This theoretical value is found to depend on the position of the projectile's center of gravity and predicts, in full agreement, the behaviour observed in our different experiments. Project funded by DGA.

  17. Streamlining air import operations by trade facilitation measures

    Directory of Open Access Journals (Sweden)

    Yuri da Cunha Ferreira

    2017-12-01

    Full Text Available Global operations are subject to considerable uncertainties. Due to the Trade Facilitation Agreement that became effective in February 2017, the study of measures to streamline customs controls is urgent. This study aims to assess the impact of trade facilitation measures on import flows. An experimental study was performed at the largest cargo airport in South America using discrete-event simulation and design of experiments. The operational impacts of three trade facilitation measures on air import flows are assessed. We shed light on the following trade facilitation measures: the use of X-ray equipment for physical inspection; an increase in the number of qualified companies in the trade facilitation program; and performance targets for customs officials. All trade facilitation measures used indicated the potential to provide more predictability, cost savings, time reduction, and increased security in the international supply chain.
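    A toy discrete-event sketch of the first measure, written with the SimPy library; the arrival rate, inspection times, 25% inspection share, and station count are illustrative assumptions rather than the airport's calibrated model:

    ```python
    # Compare mean clearance time with slow manual inspection vs faster X-ray inspection.
    import random
    import simpy

    INSPECT_SHARE = 0.25          # fraction of declarations sent to physical inspection

    def run_scenario(mean_inspect_time, seed=42, horizon=8 * 60):
        random.seed(seed)
        env = simpy.Environment()
        stations = simpy.Resource(env, capacity=2)     # two inspection stations
        clearance_times = []

        def declaration(env):
            start = env.now
            yield env.timeout(random.expovariate(1.0 / 30.0))   # document review, ~30 min mean
            if random.random() < INSPECT_SHARE:
                with stations.request() as req:                  # queue for an inspection station
                    yield req
                    yield env.timeout(random.expovariate(1.0 / mean_inspect_time))
            clearance_times.append(env.now - start)

        def arrivals(env):
            while True:
                yield env.timeout(random.expovariate(1.0 / 4.0))  # one declaration ~every 4 min
                env.process(declaration(env))

        env.process(arrivals(env))
        env.run(until=horizon)
        return sum(clearance_times) / len(clearance_times)

    print("manual inspection, mean clearance (min):", round(run_scenario(30.0), 1))
    print("X-ray inspection,  mean clearance (min):", round(run_scenario(8.0), 1))
    ```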

  18. State Models to Incentivize and Streamline Small Hydropower Development

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, Taylor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Levine, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Johnson, Kurt [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-10-31

    In 2016, the hydropower fleet in the United States produced more than 6 percent (approximately 265,829 gigawatt-hours [GWh]) of the total net electricity generation. The median-size hydroelectric facility in the United States is 1.6 MW and 75 percent of total facilities have a nameplate capacity of 10 MW or less. Moreover, the U.S. Department of Energy's Hydropower Vision study identified approximately 79 GW hydroelectric potential beyond what is already developed. Much of the potential identified is at low-impact new stream-reaches, existing conduits, and non-powered dams with a median project size of 10 MW or less. To optimize the potential and value of small hydropower development, state governments are crafting policies that provide financial assistance and expedite state and federal review processes for small hydroelectric projects. This report analyzes state-led initiatives and programs that incentivize and streamline small hydroelectric development.

  19. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.
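    A minimal Sikuli-style script (Jython, executed inside the Sikuli IDE or its script runner) illustrating the image-based GUI automation flow; the .png names are placeholder screenshots a user would capture for their own application:

    ```python
    # Image-driven login test: waits for, clicks, and types into screen regions matched
    # against reference screenshots. All image files are hypothetical placeholders.
    wait("login_button.png", 10)        # wait up to 10 s for the login button to appear
    click("username_field.png")
    type("test_user")
    click("password_field.png")
    type("secret")
    click("login_button.png")
    if exists("welcome_banner.png", 15):
        print("login flow passed")
    else:
        print("login flow FAILED")
    ```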

  20. Recent developments in automated determinations of trace level concentrations of elements and on-line fractionations schemes exploiting the micro-sequential injection - lab-on-valve approach

    DEFF Research Database (Denmark)

    Hansen, Elo Harald; Miró, Manuel; Long, Xiangbao

    2006-01-01

    The determination of trace level concentrations of elements, such as metal species, in complex matrices by atomic absorption or emission spectrometric methods often require appropriate pretreatments comprising separation of the analyte from interfering constituents and analyte preconcentration...... are presented as based on the exploitation of micro-sequential injection (μSI-LOV) using hydrophobic as well as hydrophilic bead materials. The examples given comprise the presentation of a universal approach for SPE-assays, front-end speciation of Cr(III) and Cr(VI) in a fully automated and enclosed set...

  1. The juvenile face as a suitable age indicator in child pornography cases: a pilot study on the reliability of automated and visual estimation approaches.

    Science.gov (United States)

    Ratnayake, M; Obertová, Z; Dose, M; Gabriel, P; Bröker, H M; Brauckmann, M; Barkus, A; Rizgeliene, R; Tutkuviene, J; Ritz-Timme, S; Marasciuolo, L; Gibelli, D; Cattaneo, C

    2014-09-01

    In cases of suspected child pornography, the age of the victim represents a crucial factor for legal prosecution. The conventional methods for age estimation provide unreliable age estimates, particularly if teenage victims are concerned. In this pilot study, the potential of age estimation for screening purposes is explored for juvenile faces. In addition to a visual approach, an automated procedure is introduced, which has the ability to rapidly scan through large numbers of suspicious image data in order to trace juvenile faces. Age estimations were performed by experts, non-experts and the Demonstrator of a developed software on frontal facial images of 50 females aged 10-19 years from Germany, Italy, and Lithuania. To test the accuracy, the mean absolute error (MAE) between the estimates and the real ages was calculated for each examiner and the Demonstrator. The Demonstrator achieved the lowest MAE (1.47 years) for the 50 test images. Decreased image quality had no significant impact on the performance and classification results. The experts delivered slightly less accurate MAE (1.63 years). Throughout the tested age range, both the manual and the automated approach led to reliable age estimates within the limits of natural biological variability. The visual analysis of the face produces reasonably accurate age estimates up to the age of 18 years, which is the legally relevant age threshold for victims in cases of pedo-pornography. This approach can be applied in conjunction with the conventional methods for a preliminary age estimation of juveniles depicted on images.
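    The accuracy measure reported above, the mean absolute error (MAE) between estimated and chronological ages, reduces to a one-liner; the ages below are made-up values used only to illustrate the computation:

    ```python
    # MAE between estimated and true ages (illustrative values).
    import numpy as np

    true_age  = np.array([12.0, 14.5, 16.0, 17.5, 19.0])
    estimated = np.array([13.1, 13.8, 17.2, 16.9, 18.4])
    mae = np.mean(np.abs(estimated - true_age))
    print(f"MAE = {mae:.2f} years")
    ```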

  2. A systematic approach to the application of Automation, Robotics, and Machine Intelligence Systems /ARAMIS/ to future space projects

    Science.gov (United States)

    Smith, D. B. S.

    1982-01-01

    The potential applications of Automation, Robotics, and Machine Intelligence Systems (ARAMIS) to space projects are investigated, through a systematic method. In this method selected space projects are broken down into space project tasks, and 69 of these tasks are selected for study. Candidate ARAMIS options are defined for each task. The relative merits of these options are evaluated according to seven indices of performance. Logical sequences of ARAMIS development are also defined. Based on this data, promising applications of ARAMIS are

  3. Development of an Integrated Approach to Routine Automation of Neutron Activation Analysis. Results of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2018-04-01

    Neutron activation analysis (NAA) is a powerful technique for determining bulk composition of major and trace elements. Automation may contribute significantly to keep NAA competitive for end-users. It provides opportunities for a larger analytical capacity and a shorter overall turnaround time if large series of samples have to be analysed. This publication documents and disseminates the expertise generated on automation in NAA during a coordinated research project (CRP). The CRP participants presented different cost-effective designs of sample changers for gamma-ray spectrometry as well as irradiation devices, and were able to construct and successfully test these systems. They also implemented, expanded and improved quality control and quality assurance as cross-cutting topical area of their automated NAA procedures. The publication serves as a reference of interest to NAA practitioners, experts, and research reactor personnel, but also to various stakeholders and users interested in basic research and/or services provided by NAA. The individual country reports are available on the CD-ROM attached to this publication.

  4. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed the access mode of services on the part of consumers. ICT-enabled services have further stimulated the perception of automated service quality, with renewed dimensions and their subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study has been conducted on the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  5. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...

  6. Automated Assessment in Massive Open Online Courses

    Science.gov (United States)

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as the learning and solution validation tool. This approach allows automated individual variant generation and automated solution checks without involving the course…

  7. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    Energy Technology Data Exchange (ETDEWEB)

    Acciarri, R.; Bagby, L.; Baller, B.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Greenlee, H.; James, C.; Jostlein, H.; Ketchum, W.; Kirby, M.; Kobilarcik, T.; Lockwitz, S.; Lundberg, B.; Marchionni, A.; Moore, C.D.; Palamara, O.; Pavlovic, Z.; Raaf, J.L.; Schukraft, A.; Snider, E.L.; Spentzouris, P.; Strauss, T.; Toups, M.; Wolbers, S.; Yang, T.; Zeller, G.P. [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Adams, C. [Harvard University, Cambridge, MA (United States); Yale University, New Haven, CT (United States); An, R.; Littlejohn, B.R.; Martinez Caicedo, D.A. [Illinois Institute of Technology (IIT), Chicago, IL (United States); Anthony, J.; Escudero Sanchez, L.; De Vries, J.J.; Marshall, J.; Smith, A.; Thomson, M. [University of Cambridge, Cambridge (United Kingdom); Asaadi, J. [University of Texas, Arlington, TX (United States); Auger, M.; Ereditato, A.; Goeldi, D.; Kreslo, I.; Lorca, D.; Luethi, M.; Rudolf von Rohr, C.; Sinclair, J.; Weber, M. [Universitaet Bern, Bern (Switzerland); Balasubramanian, S.; Fleming, B.T.; Gramellini, E.; Hackenburg, A.; Luo, X.; Russell, B.; Tufanli, S. [Yale University, New Haven, CT (United States); Barnes, C.; Mousseau, J.; Spitz, J. [University of Michigan, Ann Arbor, MI (United States); Barr, G.; Bass, M.; Del Tutto, M.; Laube, A.; Soleti, S.R.; De Pontseele, W.V. [University of Oxford, Oxford (United Kingdom); Bay, F. [TUBITAK Space Technologies Research Institute, Ankara (Turkey); Bishai, M.; Chen, H.; Joshi, J.; Kirby, B.; Li, Y.; Mooney, M.; Qian, X.; Viren, B.; Zhang, C. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Blake, A.; Devitt, D.; Lister, A.; Nowak, J. [Lancaster University, Lancaster (United Kingdom); Bolton, T.; Horton-Smith, G.; Meddage, V.; Rafique, A. [Kansas State University (KSU), Manhattan, KS (United States); Camilleri, L.; Caratelli, D.; Crespo-Anadon, J.I.; Fadeeva, A.A.; Genty, V.; Kaleko, D.; Seligman, W.; Shaevitz, M.H. [Columbia University, New York, NY (United States); Church, E. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Cianci, D.; Karagiorgi, G. [Columbia University, New York, NY (United States); The University of Manchester (United Kingdom); Cohen, E.; Piasetzky, E. [Tel Aviv University, Tel Aviv (Israel); Collin, G.H.; Conrad, J.M.; Hen, O.; Hourlier, A.; Moon, J.; Wongjirad, T.; Yates, L. [Massachusetts Institute of Technology (MIT), Cambridge, MA (United States); Convery, M.; Eberly, B.; Rochester, L.; Tsai, Y.T.; Usher, T. [SLAC National Accelerator Laboratory, Menlo Park, CA (United States); Dytman, S.; Graf, N.; Jiang, L.; Naples, D.; Paolone, V.; Wickremasinghe, D.A. [University of Pittsburgh, Pittsburgh, PA (United States); Esquivel, J.; Hamilton, P.; Pulliam, G.; Soderberg, M. [Syracuse University, Syracuse, NY (United States); Foreman, W.; Ho, J.; Schmitz, D.W.; Zennamo, J. [University of Chicago, IL (United States); Furmanski, A.P.; Garcia-Gamez, D.; Hewes, J.; Hill, C.; Murrells, R.; Porzio, D.; Soeldner-Rembold, S.; Szelc, A.M. [The University of Manchester (United Kingdom); Garvey, G.T.; Huang, E.C.; Louis, W.C.; Mills, G.B.; De Water, R.G.V. [Los Alamos National Laboratory (LANL), Los Alamos, NM (United States); Gollapinni, S. [Kansas State University (KSU), Manhattan, KS (United States); University of Tennessee, Knoxville, TN (United States); and others

    2018-01-15

    The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies. (orig.)

  8. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    CERN Document Server

    Acciarri, R.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2017-01-01

    The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the...

  9. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    Science.gov (United States)

    Acciarri, R.; Adams, C.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2018-01-01

    The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.

  10. Systematic approach to the application of automation, robotics, and machine intelligence systems (aramis) to future space projects

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D B.S.

    1983-01-01

    The potential applications of automation, robotics and machine intelligence systems (ARAMIS) to space projects are investigated, through a systematic method. In this method selected space projects are broken down into space project tasks, and 69 of these tasks are selected for study. Candidate ARAMIS options are defined for each task. The relative merits of these options are evaluated according to seven indices of performance. Logical sequences of ARAMIS development are also defined. Based on this data, promising applications of ARAMIS are identified for space project tasks. General conclusions and recommendations for further study are also presented. 6 references.

  11. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
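    A heavily simplified stand-in for the confluency-monitoring step: estimate foreground coverage by thresholding a grayscale field of view and flag the dish once a passage threshold is crossed. The synthetic image, Otsu thresholding, and the 80% trigger are assumptions; the actual system uses a dedicated phase-contrast computer vision pipeline:

    ```python
    # Estimate confluency as the fraction of pixels above an automatic threshold,
    # then decide whether to notify the operator. Synthetic data for illustration.
    import numpy as np
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(1)
    background = rng.normal(0.2, 0.02, (512, 512))
    cells = rng.normal(0.6, 0.05, (512, 512))
    mask = rng.random((512, 512)) < 0.85          # ~85% of pixels "covered" by cells
    image = np.where(mask, cells, background)

    t = threshold_otsu(image)
    confluency = (image > t).mean()
    print(f"estimated confluency: {confluency:.1%}")
    if confluency > 0.80:
        print("notify operator: subculture within 4 hours")
    ```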

  12. Will the Measurement Robots Take Our Jobs? An Update on the State of Automated M&V for Energy Efficiency Programs

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Taylor, Cody [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-28

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
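    The streamlined, data-driven M&V idea can be sketched as a simple temperature-regression baseline fitted to pre-retrofit interval data and projected onto the post-retrofit period; the synthetic data and plain linear model are assumptions, and real tools use richer models (e.g., time-of-week and temperature):

    ```python
    # Baseline regression M&V sketch: fit pre-period energy vs outdoor temperature,
    # report goodness of fit (CV(RMSE), NMBE), then estimate post-period avoided energy.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(7)
    temp_pre = rng.uniform(5, 30, 180)                           # daily mean outdoor temp, deg C
    energy_pre = 200 + 12 * temp_pre + rng.normal(0, 15, 180)    # kWh/day, pre-retrofit

    model = LinearRegression().fit(temp_pre.reshape(-1, 1), energy_pre)
    pred_pre = model.predict(temp_pre.reshape(-1, 1))
    cvrmse = np.sqrt(np.mean((energy_pre - pred_pre) ** 2)) / energy_pre.mean()
    nmbe = np.sum(energy_pre - pred_pre) / (energy_pre.size * energy_pre.mean())
    print(f"baseline CV(RMSE) = {cvrmse:.1%}, NMBE = {nmbe:.2%}")

    temp_post = rng.uniform(5, 30, 90)
    energy_post = 180 + 9 * temp_post + rng.normal(0, 15, 90)    # retrofit lowered the load
    baseline_projection = model.predict(temp_post.reshape(-1, 1))
    savings = np.sum(baseline_projection - energy_post)
    print(f"estimated avoided energy over 90 days: {savings:,.0f} kWh")
    ```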

  13. Streamlined islands and the English Channel megaflood hypothesis

    Science.gov (United States)

    Collier, J. S.; Oggioni, F.; Gupta, S.; García-Moreno, D.; Trentesaux, A.; De Batist, M.

    2015-12-01

    Recognising ice-age catastrophic megafloods is important because they had significant impact on large-scale drainage evolution and patterns of water and sediment movement to the oceans, and likely induced very rapid, short-term effects on climate. It has been previously proposed that a drainage system on the floor of the English Channel was initiated by catastrophic flooding in the Pleistocene, but this suggestion has remained controversial. Here we examine this hypothesis through an analysis of key landform features. We use a new compilation of multi- and single-beam bathymetry together with sub-bottom profiler data to establish the internal structure, planform geometry and hence origin of a set of 36 mid-channel islands. Whilst there is evidence of modern-day surficial sediment processes, the majority of the islands can be clearly demonstrated to be formed of bedrock, and are hence erosional remnants rather than depositional features. The islands display classic lemniscate or tear-drop outlines, with elongated tips pointing downstream, typical of streamlined islands formed during high-magnitude water flow. The length-to-width ratio for the entire island population is 3.4 ± 1.3 and the degree-of-elongation or k-value is 3.7 ± 1.4. These values are comparable to streamlined islands in other proven Pleistocene catastrophic flood terrains and are distinctly different to values found in modern-day rivers. The island geometries show a correlation with bedrock type: those carved from Upper Cretaceous chalk have larger length-to-width ratios (3.2 ± 1.3) than those carved into more mixed Paleogene terrigenous sandstones, siltstones and mudstones (3.0 ± 1.5). We attribute these differences to the former rock unit having a lower skin friction, which allowed longer island growth to achieve minimum drag. The Paleogene islands, although less numerous than the Chalk islands, also assume more perfect lemniscate shapes. These lithologies therefore reached island
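    For reference, the degree of elongation quoted above is commonly computed from the lemniscate fit of a streamlined form (assuming the standard definition, which the authors may vary), where L is the island length and A its plan-view area:

    ```latex
    % Standard lemniscate elongation for streamlined forms (assumed definition)
    k = \frac{\pi L^{2}}{4A}
    ```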

  14. Evaluating an Automated Approach for Monitoring Forest Disturbances in the Pacific Northwest from Logging, Fire and Insect Outbreaks with Landsat Time Series Data

    Science.gov (United States)

    Neigh, Christopher S. R.; Bolton, Douglas K.; Williams, Jennifer J.; Diabate, Mouhamad

    2014-01-01

    Forests are the largest aboveground sink for atmospheric carbon (C), and understanding how they change through time is critical to reduce our C-cycle uncertainties. We investigated a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 1991 in Pacific Northwest forests, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRRs). To understand the causal factors of this decline, we evaluated an automated classification method developed for Landsat time series stacks (LTSS) to map forest change. This method included: (1) multiple disturbance index thresholds; and (2) a spectral trajectory-based image analysis with multiple confidence thresholds. We produced 48 maps and verified their accuracy with air photos, Monitoring Trends in Burn Severity data and insect aerial detection survey data. Area-based accuracy estimates for change in forest cover resulted in producer's and user's accuracies of 0.21 +/- 0.06 to 0.38 +/- 0.05 for insect disturbance, 0.23 +/- 0.07 to 1 +/- 0 for burned area and 0.74 +/- 0.03 to 0.76 +/- 0.03 for logging. We believe that accuracy was low for insect disturbance because air photo reference data were temporally sparse, hence missing some outbreaks, and the annual anniversary time step is not dense enough to track defoliation and progressive stand mortality. Producer's and user's accuracy for burned area was low due to the temporally abrupt nature of fire and harvest with a similar response of spectral indices between the disturbance index and normalized burn ratio. We conclude that the spectral trajectory approach also captures multi-year stress that could be caused by climate, acid deposition, pathogens, partial harvest, thinning, etc. Our study focused on understanding the transferability of previously successful methods to new ecosystems and found that this automated method does not perform with the same accuracy in Pacific Northwest forests.
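    The threshold-based component of such change detection can be caricatured for a single pixel's annual spectral-index trajectory; the index values and the -0.15 change threshold below are illustrative assumptions, not the study's calibrated disturbance index or trajectory model:

    ```python
    # Flag a disturbance year when the annual spectral index drops sharply (toy example).
    import numpy as np

    years = np.arange(1984, 1996)
    nbr = np.array([0.62, 0.61, 0.63, 0.60, 0.59, 0.22, 0.28, 0.35, 0.41, 0.47, 0.52, 0.55])

    change = np.diff(nbr)                         # year-to-year change in the index
    disturbed = np.where(change < -0.15)[0] + 1   # index of the year following the drop
    for i in disturbed:
        print(f"disturbance flagged in {years[i]} (delta index = {change[i - 1]:+.2f})")
    ```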

  15. Toward better public health reporting using existing off the shelf approaches: The value of medical dictionaries in automated cancer detection using plaintext medical data.

    Science.gov (United States)

    Kasthurirathne, Suranga N; Dixon, Brian E; Gichoya, Judy; Xu, Huiping; Xia, Yuni; Mamlin, Burke; Grannis, Shaun J

    2017-05-01

    Existing approaches to derive decision models from plaintext clinical data frequently depend on medical dictionaries as the sources of potential features. Prior research suggests that decision models developed using non-dictionary based feature sourcing approaches and "off the shelf" tools could predict cancer with performance metrics between 80% and 90%. We sought to compare non-dictionary based models to models built using features derived from medical dictionaries. We evaluated the detection of cancer cases from free text pathology reports using decision models built with combinations of dictionary or non-dictionary based feature sourcing approaches, 4 feature subset sizes, and 5 classification algorithms. Each decision model was evaluated using the following performance metrics: sensitivity, specificity, accuracy, positive predictive value, and area under the receiver operating characteristics (ROC) curve. Decision models parameterized using dictionary and non-dictionary feature sourcing approaches produced performance metrics between 70 and 90%. The source of features and feature subset size had no impact on the performance of a decision model. Our study suggests there is little value in leveraging medical dictionaries for extracting features for decision model building. Decision models built using features extracted from the plaintext reports themselves achieve comparable results to those built using medical dictionaries. Overall, this suggests that existing "off the shelf" approaches can be leveraged to perform accurate cancer detection using less complex Named Entity Recognition (NER) based feature extraction, automated feature selection and modeling approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
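    The "off the shelf", non-dictionary route can be sketched as a standard scikit-learn text-classification pipeline in which features come straight from the report text; the four toy reports and labels below stand in for the real pathology corpus:

    ```python
    # Bag-of-words (TF-IDF) features from raw report text, fed to a standard classifier.
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    reports = [
        "infiltrating ductal carcinoma identified in the biopsy specimen",
        "benign fibroadipose tissue with no evidence of malignancy",
        "poorly differentiated adenocarcinoma involving surgical margins",
        "chronic inflammation, negative for tumor cells",
    ]
    labels = [1, 0, 1, 0]          # 1 = cancer case, 0 = non-case

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                        LogisticRegression(max_iter=1000))
    clf.fit(reports, labels)
    print(clf.predict(["specimen shows invasive carcinoma"]))
    ```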

  16. Intelligent production of rotor blades using IT-aided automation approaches; Intelligente Fertigung von Rotorblaettern durch IT-gestuetzte Automationsansaetze

    Energy Technology Data Exchange (ETDEWEB)

    Ohlendorf, J.H.; Thoben, K.D. [Bremen Univ. (Germany). BIK Institut fuer integrierte Produktentwicklung; Hans, C.; Ghrairi, Z. [Bremer Institut fuer Produktion und Logistik GmbH, Bremen (Germany)

    2010-08-15

    In this paper, the necessity of a control system for resin infusion and curing is elaborated using the example of rotor blade manufacturing. An innovative multi-component learning control system based on the results of the iReMo project is also introduced. It reveals previously hidden processes within the moulding tools via a sophisticated sensor network, meets all imposed requirements and will thereby considerably change and improve the moulding process of reinforced polymer composite materials. In an environment that today has little contact with automation technology, these innovations already bear great potential to increase efficiency and quality and to reduce costs and lead times. Further benefits will result from the interconnection with other automation solutions for fibre-reinforced composite products, e.g. the automatic placement of fabrics in the mould, and will be the subject of future research activities. An emphasis will be put on continuous data management to assure quality and traceability. A consistent gathering and filing of all product data in terms of Product Lifecycle Management (PLM) would also be possible for fibre-reinforced composites. (orig.)

  17. Damage Detection with Streamlined Structural Health Monitoring Data

    Directory of Open Access Journals (Sweden)

    Jian Li

    2015-04-01

    Full Text Available The huge amounts of sensor data generated by large scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems' capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compression, interactive sensor data retrieval, and structural knowledge discovery, which aim to enhance the reliability, efficiency, and robustness of on-line SHM systems. Adoption of this new concept will enable the design of an on-line SHM system with more uniform data generation and data handling capacity for its subsystems. To examine this concept in the context of vibration-based SHM systems, real sensor data from an on-line SHM system comprising a scaled steel bridge structure and an on-line data acquisition system with remote data access was used in this study. Vibration test results clearly demonstrated the prominent performance characteristics of the proposed integrated SHM system including rapid data access, interactive data retrieval and knowledge discovery of structural conditions on a global level.

  18. Damage detection with streamlined structural health monitoring data.

    Science.gov (United States)

    Li, Jian; Deng, Jun; Xie, Weizhi

    2015-04-15

    The huge amounts of sensor data generated by large scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems' capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compression, interactive sensor data retrieval, and structural knowledge discovery, which aim to enhance the reliability, efficiency, and robustness of on-line SHM systems. Adoption of this new concept will enable the design of an on-line SHM system with more uniform data generation and data handling capacity for its subsystems. To examine this concept in the context of vibration-based SHM systems, real sensor data from an on-line SHM system comprising a scaled steel bridge structure and an on-line data acquisition system with remote data access was used in this study. Vibration test results clearly demonstrated the prominent performance characteristics of the proposed integrated SHM system including rapid data access, interactive data retrieval and knowledge discovery of structural conditions on a global level.

  19. An integrated billing application to streamline clinician workflow.

    Science.gov (United States)

    Vawdrey, David K; Walsh, Colin; Stetson, Peter D

    2014-01-01

    Between 2008 and 2010, our academic medical center transitioned to electronic provider documentation using a commercial electronic health record system. For attending physicians, one of the most frustrating aspects of this experience was the system's failure to support their existing electronic billing workflow. Because of poor system integration, it was difficult to verify the supporting documentation for each bill and impractical to track whether billable notes had corresponding charges. We developed and deployed in 2011 an integrated billing application called "iCharge" that streamlines clinicians' documentation and billing workflow, and simultaneously populates the inpatient problem list using billing diagnosis codes. Each month, over 550 physicians use iCharge to submit approximately 23,000 professional service charges for over 4,200 patients. On average, about 2.5 new problems are added to each patient's problem list. This paper describes the challenges and benefits of workflow integration across disparate applications and presents an example of innovative software development within a commercial EHR framework.

  20. VISMASHUP: streamlining the creation of custom visualization applications

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [Los Alamos National Laboratory]; Santos, Emanuele [UNIV OF UTAH]; Lins, Lauro [UNIV OF UTAH]; Freire, Juliana [UNIV OF UTAH]; Silva, Cláudio T [UNIV OF UTAH]

    2010-01-01

    Visualization is essential for understanding the increasing volumes of digital data. However, the process required to create insightful visualizations is involved and time consuming. Although several visualization tools are available, including tools with sophisticated visual interfaces, they are out of reach for users who have little or no knowledge of visualization techniques and/or who do not have programming expertise. In this paper, we propose VISMASHUP, a new framework for streamlining the creation of customized visualization applications. Because these applications can be customized for very specific tasks, they can hide much of the complexity in a visualization specification and make it easier for users to explore visualizations by manipulating a small set of parameters. We describe the framework and how it supports the various tasks a designer needs to carry out to develop an application, from mining and exploring a set of visualization specifications (pipelines), to the creation of simplified views of the pipelines, and the automatic generation of the application and its interface. We also describe the implementation of the system and demonstrate its use in two real application scenarios.

  1. Microdiversification in genome-streamlined ubiquitous freshwater Actinobacteria.

    Science.gov (United States)

    Neuenschwander, Stefan M; Ghai, Rohit; Pernthaler, Jakob; Salcher, Michaela M

    2018-01-01

    Actinobacteria of the acI lineage are the most abundant microbes in freshwater systems, but there are so far no pure living cultures of these organisms, possibly because of metabolic dependencies on other microbes. This, in turn, has hampered an in-depth assessment of the genomic basis for their success in the environment. Here we present genomes from 16 axenic cultures of acI Actinobacteria. The isolates were not only of minute cell size, but also among the most streamlined free-living microbes, with extremely small genome sizes (1.2-1.4 Mbp) and low genomic GC content. Genome reduction in these bacteria might have led to auxotrophy for various vitamins, amino acids and reduced sulphur sources, thus creating dependencies on co-occurring organisms (the 'Black Queen' hypothesis). Genome analyses, moreover, revealed a surprising degree of inter- and intraspecific diversity in metabolic pathways, especially of carbohydrate transport and metabolism, and mainly encoded in genomic islands. The striking genotype microdiversification of acI Actinobacteria might explain their global success in highly dynamic freshwater environments with complex seasonal patterns of allochthonous and autochthonous carbon sources. We propose a new order within Actinobacteria ('Candidatus Nanopelagicales') with two new genera ('Candidatus Nanopelagicus' and 'Candidatus Planktophila') and nine new species.

  2. A streamlined artificial variable free version of simplex method.

    Directory of Open Access Journals (Sweden)

    Syed Inayatullah

    Full Text Available This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it can start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1, without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new method is also presented, which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to choose whether to start with the primal artificial-free version or the dual artificial-free version, without any reformulation of the LP structure. Last but not least, the method provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.

  3. A streamlined artificial variable free version of simplex method.

    Science.gov (United States)

    Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad

    2015-01-01

    This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it can start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1, without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new method is also presented, which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to choose whether to start with the primal artificial-free version or the dual artificial-free version, without any reformulation of the LP structure. Last but not least, the method provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.
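
    For readers unfamiliar with the pivoting being discussed, the sketch below shows a plain textbook tableau simplex for the easy case, maximize c^T x subject to Ax <= b, x >= 0 with b >= 0, where the all-slack basis is already feasible and no artificial variables or phase 1 are needed. It is background only and is not the authors' artificial-variable-free method:

```python
import numpy as np

def simplex_max(c, A, b):
    """Textbook tableau simplex for: maximize c^T x subject to A x <= b, x >= 0, with b >= 0."""
    m, n = A.shape
    T = np.zeros((m + 1, n + m + 1))
    T[:m, :n] = A
    T[:m, n:n + m] = np.eye(m)          # slack variables
    T[:m, -1] = b
    T[-1, :n] = -c                      # objective row
    basis = list(range(n, n + m))       # the all-slack basis is feasible because b >= 0
    while True:
        col = int(np.argmin(T[-1, :-1]))
        if T[-1, col] >= -1e-12:        # no negative reduced cost left: optimal
            break
        rhs, colvals = T[:m, -1], T[:m, col]
        ratios = np.full(m, np.inf)
        positive = colvals > 1e-12
        ratios[positive] = rhs[positive] / colvals[positive]
        row = int(np.argmin(ratios))    # ratio test picks the leaving row
        if not np.isfinite(ratios[row]):
            raise ValueError("LP is unbounded")
        T[row] /= T[row, col]           # pivot
        for r in range(m + 1):
            if r != row:
                T[r] -= T[r, col] * T[row]
        basis[row] = col
    x = np.zeros(n + m)
    x[basis] = T[:m, -1]
    return x[:n], T[-1, -1]

# maximize 3x + 2y s.t. x + y <= 4, x + 3y <= 6  ->  optimum (4, 0) with value 12
print(simplex_max(np.array([3.0, 2.0]), np.array([[1.0, 1.0], [1.0, 3.0]]), np.array([4.0, 6.0])))
```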

  4. Automating payroll, billing, and medical records. Using technology to do more with less.

    Science.gov (United States)

    Vetter, E

    1995-08-01

    As home care agencies grow, so does the need to streamline the paperwork involved in running an agency. One agency found a way to reduce its payroll, billing, and medical records paperwork by implementing an automated, image-based data collection system that saves time, money, and paper.

  5. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    Science.gov (United States)

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  6. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publicly available. Digital elevation models (DEM) of the transient glacier surface preserved in each imagery timestamp can be derived, then differenced to calculate glacier volume and mass change to improve regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure from motion (SfM) photogrammetry software to semi-automate DEM creation, and orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, to airborne LiDAR data, and to DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on: (1) applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State, and (2) evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co-registration.
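
    A minimal sketch of the scan preprocessing step described above (rotation, fiducial cropping and histogram equalization with OpenCV); the file paths, rotation angle and crop margin are placeholders, and the real pipeline would pass the results on to Agisoft Photoscan:

```python
import glob
import cv2  # OpenCV, as mentioned in the abstract

def preprocess(path, crop_px=200, angle=0.0):
    """Rotate, crop and histogram-equalize one scanned aerial photograph.

    crop_px and angle are placeholder values; the actual rotation and fiducial
    crop would come from the scan metadata.
    """
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    h, w = img.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)      # rotation to reduce scaling issues
    img = cv2.warpAffine(img, M, (w, h))
    img = img[crop_px:h - crop_px, crop_px:w - crop_px]          # crop away fiducial marks
    return cv2.equalizeHist(img)                                 # histogram equalization

# batch over a folder of scans; the output directory is assumed to exist
for f in glob.glob("scans/*.tif"):
    cv2.imwrite(f.replace("scans/", "preprocessed/"), preprocess(f))
```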

  7. Stream-lined Gating Systems with Improved Yield - Dimensioning and Experimental Validation

    DEFF Research Database (Denmark)

    Tiedje, Niels Skat; Skov-Hansen, Søren Peter

    The paper describes how a stream-lined gating system, in which the melt is confined and controlled during filling, can be designed. Commercial numerical modelling software has been used to compare the stream-lined design with a traditional gating system. These results are confirmed by experiments in which the two types of lay-outs are cast in production. It is shown that flow in the stream-lined lay-out is well controlled and that the quality of the castings is at least equal to that of castings produced with a traditional lay-out. Further, the yield is improved by 4% relative to a traditional lay-out.

  8. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  9. Exploring nomological link between automated service quality, customer satisfaction and behavioural intentions with CRM performance indexing approach: Empirical evidence from Indian banking industry

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2013-01-01

    Full Text Available Automation in service delivery has increased the consumers’ expectation with regard to service quality and subsequently the perception of the same. Technology-driven services redefined quality dimensions and their subsequent impact on the behavioural outcomes of the consumers with specific reference to attitudinal loyalty and propensity to switch. Customer Relationship Management (CRM) has further reinforced the operational aspects of a service provider by integrating the behavioural perspectives with technology. This paper attempts to explore the nomological link between automated service quality and its behavioural consequences with specific reference to consumers’ attitudinal loyalty and their intention to switch or defect from their present service provider. The study further takes into consideration the moderating effects of the performance of the dimensions and attributes of customer relationship management by introducing a novel approach to CRM performance indexing. The cross-sectional study was carried out with the customers of State Bank of India at Asansol, Durgapur, Bolpur and Santiniketan in West Bengal, India. The study used structural equation modeling (SEM) to assess and validate the nomological relationship between the variables.

  10. An automated approach for single-cell tracking in epifluorescence microscopy applied to E. coli growth analysis on microfluidics biochips

    Science.gov (United States)

    Fetita, Catalin; Kirov, Boris; Jaramillo, Alfonso; Lefevre, Christophe

    2012-03-01

    In recent years, with the accumulation of knowledge of the intimate molecular mechanisms governing the processes inside living cells, the ability to characterize the performance of elementary genetic circuits and parts at the single-cell level is becoming of crucial importance. Biological science is arriving at the point where it can develop hypotheses for the action of each molecule participating in the biochemical reactions and needs proper techniques to test those hypotheses. Microfluidics is emerging as the technology that, combined with high-magnification microscopy, will allow for the long-term single-cell level observation of bacterial physiology. In this study we design, build and characterize the gene dynamics of genetic circuits as one of the basic parts governing programmed cell behavior. We use E. coli as the model organism and grow it in microfluidics chips, which we observe with epifluorescence microscopy. One of the most invaluable segments of this technology is the subsequent image processing, since it allows for the automated analysis of vast amounts of single-cell observations and the fast and easy derivation of conclusions based on those data. Specifically, we are interested in promoter activity as a function of time. We expect it to be oscillatory, and for that we use GFP (green fluorescent protein) as a reporter in our genetic circuits. In this paper, an automated framework for single-cell tracking in phase-contrast microscopy is developed, combining 2D segmentation of cell time frames and graph-based reconstruction of their spatiotemporal evolution with fast tracking of the associated fluorescence signal. The results obtained on the investigated biological database are presented and discussed.
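
    As a simplified illustration of the per-frame segmentation and frame-to-frame linking such a framework needs (OpenCV-based, with invented size and distance thresholds; the paper's own graph-based reconstruction is more involved), one possible sketch, assuming 8-bit fluorescence frames:

```python
import numpy as np
import cv2

def segment_cells(frame):
    """Segment bright cells in one 8-bit fluorescence frame; return centroids and mean intensities."""
    blur = cv2.GaussianBlur(frame, (5, 5), 0)
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    cells = []
    for i in range(1, n):                              # label 0 is background
        if stats[i, cv2.CC_STAT_AREA] < 20:            # discard noise specks (placeholder size)
            continue
        mean_gfp = frame[labels == i].mean()           # per-cell reporter intensity
        cells.append((tuple(centroids[i]), mean_gfp))
    return cells

def link_frames(prev_cells, curr_cells, max_dist=15.0):
    """Nearest-centroid linking between consecutive frames (greedy, illustrative only)."""
    links = []
    for j, (c, _) in enumerate(curr_cells):
        dists = [np.hypot(c[0] - p[0], c[1] - p[1]) for p, _ in prev_cells]
        if dists and min(dists) < max_dist:
            links.append((int(np.argmin(dists)), j))
    return links
```

    Per-cell mean GFP intensities collected along the resulting tracks would then give the promoter-activity time series the record is interested in.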

  11. An automated multi-modal object analysis approach to coronary calcium scoring of adaptive heart isolated MSCT images

    Science.gov (United States)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-02-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis where, in clinical practice, the observer must identify firstly the presence, followed by the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. This can be challenging for a human observer as it is difficult to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. The inclusion of false positive or exclusion of true positive calcified plaques will alter the patient calcium score incorrectly, thus leading to the possibility of incorrect treatment prescription. In addition to the benefits to scoring accuracy, the use of fast, low dose multi-slice CT imaging to perform the cardiac scan is capable of acquiring the entire heart within a single breath hold, thus exposing the patient to a lower radiation dose, which for a progressive disease such as atherosclerosis, where multiple scans may be required, is beneficial to their health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the Volume scoring method. Elimination of the unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity, and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground truth scores averaged from three expert observers. The results presented here are intended to show the requirement and feasibility of an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.

  12. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    Science.gov (United States)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis where, in clinical practice, the observer must identify firstly the presence, followed by the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. However, it can be difficult for a human observer to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, the use of fast, low dose multi-slice CT imaging to perform the cardiac scan is capable of acquiring the entire heart within a single breath hold, thus exposing the patient to a lower radiation dose, which for a progressive disease such as atherosclerosis, where multiple scans may be required, is beneficial to their health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of the unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity, and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground truth scores averaged from three expert observers. The results presented here are intended to show the feasibility and requirement for an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
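
    For orientation, a deliberately simplified sketch of the traditional Agatston scoring step is given below. It assumes the coronary-artery masks are already available (producing them is exactly what the two records automate), uses the standard 130 HU threshold and density weights, and ignores the minimum-lesion-size and slice-thickness corrections used clinically:

```python
import numpy as np
from scipy import ndimage

def agatston_weight(max_hu):
    """Density weighting factor from the peak attenuation of a lesion."""
    if max_hu >= 400: return 4
    if max_hu >= 300: return 3
    if max_hu >= 200: return 2
    return 1  # 130-199 HU

def agatston_score(slices_hu, coronary_masks, pixel_area_mm2):
    """Simplified Agatston score over axial slices restricted to coronary-artery masks."""
    total = 0.0
    for hu, mask in zip(slices_hu, coronary_masks):
        calcified = (hu >= 130) & mask                 # calcium threshold of 130 HU
        labels, n = ndimage.label(calcified)           # separate individual lesions
        for i in range(1, n + 1):
            lesion = labels == i
            area = lesion.sum() * pixel_area_mm2
            total += area * agatston_weight(hu[lesion].max())
    return total

# toy usage with synthetic data: a small dense plaque in a 4-slice volume
hu = np.random.normal(40, 20, size=(4, 64, 64))
hu[:, 30:33, 30:33] = 450
masks = np.ones_like(hu, dtype=bool)
print(round(agatston_score(hu, masks, pixel_area_mm2=0.25), 1))
```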

  13. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  14. The benefits of life cycle inventory parametric models in streamlining data collection. A case study in the wooden pallet sector

    DEFF Research Database (Denmark)

    Niero, Monia; Di Felice, F.; Ren, J.

    2014-01-01

    LCA methodology is time and resource consuming, particularly when it comes to data collection and handling; therefore companies, particularly Small and Medium Enterprises (SMEs), are inclined to use streamlined approaches to shorten the resource-consuming life cycle inventory (LCI) phase. This work is a case study of an SME in the wooden pallet sector, investigating to what extent the use of parametric LCI models can be beneficial both in evaluating the environmental impacts of similar products and in providing a preliminary assessment of the potential environmental impacts of new products. We developed an LCI parametric model describing the LCI of a range of wooden pallets and tested its effectiveness with a reference product, namely a non-reversible pallet with four-way blocks. The identified parameters refer to the technical characteristics of the product system, e.g. the number and dimension of the pallet components.
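
    To make the idea of a parametric LCI model concrete, a toy sketch follows; the parameter names, wood density and emission factors are hypothetical placeholders and not the study's inventory data:

```python
def pallet_lci(n_boards, board_dims_m, n_blocks, block_dims_m,
               wood_density=500.0,        # kg/m3, placeholder softwood value
               gwp_per_kg_wood=0.06,      # kg CO2-eq per kg of processed wood, hypothetical factor
               gwp_per_nail=0.002,        # kg CO2-eq per nail, hypothetical factor
               nails_per_board=6):
    """Toy parametric LCI: pallet geometry in, a single impact indicator out."""
    board_volume = n_boards * board_dims_m[0] * board_dims_m[1] * board_dims_m[2]
    block_volume = n_blocks * block_dims_m[0] * block_dims_m[1] * block_dims_m[2]
    wood_mass = (board_volume + block_volume) * wood_density
    gwp = wood_mass * gwp_per_kg_wood + n_boards * nails_per_board * gwp_per_nail
    return {"wood_mass_kg": round(wood_mass, 2), "gwp_kg_co2eq": round(gwp, 2)}

# hypothetical non-reversible block pallet
print(pallet_lci(n_boards=11, board_dims_m=(1.2, 0.10, 0.022),
                 n_blocks=9, block_dims_m=(0.145, 0.145, 0.078)))
```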

  15. Large-scale renewable energy project barriers: Environmental impact assessment streamlining efforts in Japan and the EU

    International Nuclear Information System (INIS)

    Schumacher, Kim

    2017-01-01

    Environmental Impact Assessment (EIA) procedures have been identified as a major barrier to renewable energy (RE) development with regard to large-scale projects (LS-RE). However, EIA laws have also been neglected by many decision-makers, who have been underestimating their impact on RE development and the stifling potential they possess. As a consequence, apart from acknowledging the shortcomings of the systems currently in place, few governments currently have concrete plans to reform their EIA laws. By looking at recent EIA streamlining efforts in two industrialized regions that underwent major transformations in their energy sectors, this paper attempts to assess how such reform efforts can act as a means to support the balancing of environmental protection and climate change mitigation with socio-economic challenges. Thereby this paper fills this intellectual void by identifying the strengths and weaknesses of the Japanese EIA law and contrasting it with the recently revised EIA Directive of the European Union (EU). This enables the identification of the regulatory provisions that impact RE development the most and the determination of how structured EIA law reforms would affect domestic RE project development. The main focus lies on the evaluation of regulatory streamlining efforts in the Japanese and EU contexts through the application of a mixed-methods approach, consisting of in-depth literary and legal reviews, followed by a comparative analysis and a series of semi-structured interviews. Highlighting several legal inconsistencies, in combination with the views of EIA professionals, academics and law- and policymakers, allowed for a more comprehensive assessment of which streamlining elements of the reformed EU EIA Directive and the proposed Japanese EIA framework modifications could either promote or stifle further RE deployment. - Highlights: •Performs an in-depth review of EIA reforms in OECD territories •First paper to compare Japan and the European Union.

  16. Surrogate Based Optimization of Aerodynamic Noise for Streamlined Shape of High Speed Trains

    Directory of Open Access Journals (Sweden)

    Zhenxu Sun

    2017-02-01

    Full Text Available Aerodynamic noise increases with the sixth power of the running speed. As the speed increases, aerodynamic noise becomes predominant and begins to be the main noise source at a certain high speed. As a result, aerodynamic noise has to be focused on when designing new high-speed trains. In order to perform the aerodynamic noise optimization, the equivalent continuous sound pressure level (SPL) has been used in the present paper, which takes all of the far field observation probes into consideration. The Non-Linear Acoustics Solver (NLAS) approach has been utilized for acoustic calculation. With the use of a Kriging surrogate model, a multi-objective optimization of the streamlined shape of high-speed trains has been performed, which takes the noise level in the far field and the drag of the whole train as the objectives. To efficiently construct the Kriging model, the cross validation approach has been adopted. Optimization results reveal that both the equivalent continuous sound pressure level and the drag of the whole train are reduced to a certain extent.
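
    As an illustration of the surrogate step only (not the authors' NLAS/CFD workflow; the design variables, training samples and objective weights below are synthetic), a Kriging-style Gaussian-process surrogate with cross-validation and a crude weighted-sum search might look like:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 3))                  # 3 synthetic shape parameters of the nose
spl = 80 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 0.2, 40)               # stand-in far-field SPL (dB)
drag = 1.0 - 0.2 * X[:, 1] + 0.1 * X[:, 2] ** 2 + rng.normal(0, 0.01, 40)   # stand-in drag coefficient

kernel = ConstantKernel() * RBF(length_scale=[0.3, 0.3, 0.3])
gp_spl = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp_drag = GaussianProcessRegressor(kernel=kernel, normalize_y=True)

# cross-validation to check surrogate quality before trusting it in optimization
print("SPL surrogate R2:", cross_val_score(gp_spl, X, spl, cv=5).mean())

gp_spl.fit(X, spl)
gp_drag.fit(X, drag)

# crude multi-objective search: weighted sum over a dense set of candidate designs
candidates = rng.uniform(0, 1, size=(5000, 3))
score = 0.5 * gp_spl.predict(candidates) / spl.max() + 0.5 * gp_drag.predict(candidates) / drag.max()
print("best candidate design parameters:", candidates[np.argmin(score)].round(3))
```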

  17. A streamlined DNA tool for global identification of heavily exploited coastal shark species (genus Rhizoprionodon).

    Directory of Open Access Journals (Sweden)

    Danillo Pinhal

    Full Text Available Obtaining accurate species-specific landings data is an essential step toward achieving sustainable shark fisheries. Globally distributed sharpnose sharks (genus Rhizoprionodon) exhibit life-history characteristics (rapid growth, early maturity, annual reproduction) that suggest that they could be fished in a sustainable manner assuming an investment in monitoring, assessment and careful management. However, obtaining species-specific landings data for sharpnose sharks is problematic because they are morphologically very similar to one another. Moreover, sharpnose sharks may also be confused with other small sharks (either small species or juveniles of large species) once they are processed (i.e., the head and fins are removed). Here we present a highly streamlined molecular genetics approach based on seven species-specific PCR primers in a multiplex format that can simultaneously discriminate body parts from the seven described sharpnose shark species commonly occurring in coastal fisheries worldwide. The species-specific primers are based on nucleotide sequence differences among species in the nuclear ribosomal internal transcribed spacer 2 locus (ITS2). This approach also distinguishes sharpnose sharks from a wide range of other sharks (52 species) and can therefore assist in the regulation of coastal shark fisheries around the world.

  18. Streamlining of the RELAP5-3D Code

    International Nuclear Information System (INIS)

    Mesina, George L; Hykes, Joshua; Guillen, Donna Post

    2007-01-01

    RELAP5-3D is widely used by the nuclear community to simulate general thermal hydraulic systems and has proven to be so versatile that the spectrum of transient two-phase problems that can be analyzed has increased substantially over time. To accommodate the many new types of problems that are analyzed by RELAP5-3D, both the physics and numerical methods of the code have been continuously improved. In the area of computational methods and mathematical techniques, many upgrades and improvements have been made to decrease code run time and increase solution accuracy. These include vectorization, parallelization, use of improved equation solvers for thermal hydraulics and neutron kinetics, and incorporation of improved library utilities. In the area of applied nuclear engineering, expanded capabilities include boron and level tracking models, radiation/conduction enclosure model, feedwater heater and compressor components, fluids and corresponding correlations for modeling Generation IV reactor designs, and coupling to computational fluid dynamics solvers. Ongoing and proposed future developments include improvements to the two-phase pump model, conversion to FORTRAN 90, and coupling to more computer programs. This paper summarizes the general improvements made to RELAP5-3D, with an emphasis on streamlining the code infrastructure for improved maintenance and development. With all these past, present and planned developments, it is necessary to modify the code infrastructure to incorporate modifications in a consistent and maintainable manner. Modifying a complex code such as RELAP5-3D to incorporate new models, upgrade numerics, and optimize existing code becomes more difficult as the code grows larger. The difficulty of this, as well as the chance of introducing errors, is significantly reduced when the code is structured. To streamline the code into a structured program, a commercial restructuring tool, FOR_STRUCT, was applied to the RELAP5-3D source files.

  19. Streamlined, Inexpensive 3D Printing of the Brain and Skull.

    Science.gov (United States)

    Naftulin, Jason S; Kimchi, Eyal Y; Cash, Sydney S

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.

  20. Streamlined, Inexpensive 3D Printing of the Brain and Skull.

    Directory of Open Access Journals (Sweden)

    Jason S Naftulin

    Full Text Available Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.

  1. Streamlined, Inexpensive 3D Printing of the Brain and Skull

    Science.gov (United States)

    Cash, Sydney S.

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3–4 in consumable plastic filament as described, and the total process takes 14–17 hours, almost all of which is unsupervised (preprocessing = 4–6 hr; printing = 9–11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1–5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes. PMID:26295459
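
    A minimal sketch of the DICOM-to-STL conversion described in these records; the particular libraries (pydicom, scikit-image, numpy-stl), the isosurface level and the file paths are illustrative assumptions rather than the authors' stated toolchain, and voxel spacing is ignored for brevity:

```python
import glob
import numpy as np
import pydicom                          # read clinical DICOM slices
from skimage import measure            # marching cubes surface extraction
from stl import mesh                    # numpy-stl, writes binary STL

# load an axial series and stack it into a 3D volume, ordered by slice position
files = sorted(glob.glob("series/*.dcm"),
               key=lambda f: float(pydicom.dcmread(f).ImagePositionPatient[2]))
volume = np.stack([pydicom.dcmread(f).pixel_array for f in files]).astype(float)

# extract an isosurface; the level (e.g. roughly bone for a skull print) is an illustrative choice
verts, faces, normals, values = measure.marching_cubes(volume, level=300)

# pack the triangles into an STL mesh for the slicer / 3D printing service
data = np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype)
for i, tri in enumerate(faces):
    data["vectors"][i] = verts[tri]
mesh.Mesh(data).save("skull.stl")
```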

  2. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131 I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases

  3. Streamlining genomes: toward the generation of simplified and stabilized microbial systems

    NARCIS (Netherlands)

    Leprince, A.; Passel, van M.W.J.; Martins Dos Santos, V.A.P.

    2012-01-01

    At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for increased understanding of cellular circuitry, and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and

  4. West Virginia Peer Exchange : Streamlining Highway Safety Improvement Program Project Delivery - An RSPCB Peer Exchange

    Science.gov (United States)

    2014-09-01

    The West Virginia Division of Highways (WV DOH) hosted a Peer Exchange to share information and experiences for streamlining Highway Safety Improvement Program (HSIP) project delivery. The event was held September 23 to 24, 2014 in Charleston, West V...

  5. 77 FR 50691 - Request for Information (RFI): Guidance on Data Streamlining and Reducing Undue Reporting Burden...

    Science.gov (United States)

    2012-08-22

    .... Attention: HIV Data Streamlining. FOR FURTHER INFORMATION CONTACT: Andrew D. Forsyth Ph.D. or Vera... of HIV/AIDS programs that vary in their specifications (e.g., numerators, denominators, time frames...

  6. West Virginia peer exchange : streamlining highway safety improvement program project delivery.

    Science.gov (United States)

    2015-01-01

    The West Virginia Division of Highways (WV DOH) hosted a Peer Exchange to share information and experiences : for streamlining Highway Safety Improvement Program (HSIP) project delivery. The event was held September : 22 to 23, 2014 in Charleston, We...

  7. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  8. New approach for simplified and automated measurement of left ventricular ejection fraction by ECG gated blood pool scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Inagaki, Suetsugu; Adachi, Haruhiko; Sugihara, Hiroki; Katsume, Hiroshi; Ijichi, Hamao; Okamoto, Kunio; Hosoba, Minoru

    1984-12-01

    Background (BKG) correction is important but debatable in the measurement of left ventricular ejection fraction (LVEF) with ECG gated blood pool scintigraphy. We devised a new simplified BKG processing method (fixed BKG method) without BKG region-of-interest (ROI) assignment, and its accuracy and reproducibility were assessed in 25 patients with various heart diseases and 5 normal subjects by comparison with LVEF obtained by contrast left ventriculography (LVG-EF). Four additional protocols for LVEF measurement with BKG-ROI assignment were also assessed for reference. LVEF calculated using the fixed BKG ratio of 0.64 (BKG count rates set to 64% of the end-diastolic count rates of the LV) with a fixed LV-ROI was best correlated with LVG-EF (r = 0.936, p < 0.001) and approximated it most closely (fixed BKG ratio method EF: 61.1 ± 20.1%, LVG-EF: 61.2 ± 20.4%; mean ± SD) among the protocols. The wide applicability of the fixed value of 0.64 was tested across various diseases, body sizes and end-diastolic volumes by LVG, and the results were found to be little influenced by them. Furthermore, the fixed BKG method produced lower inter- and intra-observer variability than the other protocols requiring BKG-ROI assignment, probably due to its simplified processing. In conclusion, the fixed BKG ratio method simplifies the measurement of LVEF, and is feasible for automated processing and single probe systems.
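
    A minimal sketch of the arithmetic as we read it from the abstract (the count values are invented; 0.64 is the fixed background ratio reported):

```python
def lvef_fixed_bkg(ed_counts, es_counts, bkg_ratio=0.64):
    """LVEF from gated blood-pool counts with a fixed background fraction of the ED counts."""
    bkg = bkg_ratio * ed_counts                 # background estimated as 64% of end-diastolic counts
    ed_net = ed_counts - bkg
    es_net = es_counts - bkg
    return 100.0 * (ed_net - es_net) / ed_net   # ejection fraction in percent

# toy example: 120k counts at end-diastole, 95k at end-systole within the LV ROI
print(round(lvef_fixed_bkg(120_000, 95_000), 1), "%")
```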

  9. A framework for evaluating the performance of automated teller machine in banking industries: A queuing model-cum-TOPSIS approach

    Directory of Open Access Journals (Sweden)

    Christopher Osita Anyaeche

    2018-04-01

    Full Text Available The improvement in the provision of banking services to customers enhances a bank's performance (profitability and productivity), the amount of dividend declared to shareholders, as well as the bank's competitiveness. One means of fast-tracking the service time for bank customers is through the use of self-servicing machines, such as automated teller machines (ATMs). Total service cost, expected waiting time in queue, ATM utilization and percentage of customer loss are some of the performance indices that are used to evaluate the service rendered by a bank's ATM. This study proposes a framework for evaluating the performance of ATMs by integrating a queuing model and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) methodology. Applicability of the framework was tested using practical data obtained from four banks in Nigeria. It was observed that the average ATM usage in the study area was less than 50%. The TOPSIS results identified Bank A as the best ranked bank. In addition, the results obtained revealed that banks with two ATMs were ranked higher than banks with more than two ATMs.
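
    To make the framework concrete, a small sketch combining steady-state M/M/s queue metrics with a TOPSIS ranking is given below; the arrival rates, service rates, ATM counts and criterion weights are invented and are not the study's data:

```python
import math
import numpy as np

def mms_metrics(lam, mu, s):
    """Steady-state M/M/s queue metrics: (utilization, expected queue wait Wq, queue length Lq)."""
    a = lam / mu                                   # offered load (Erlangs)
    rho = a / s                                    # per-server utilization, must be < 1
    p0 = 1.0 / (sum(a**n / math.factorial(n) for n in range(s))
                + a**s / (math.factorial(s) * (1 - rho)))
    lq = p0 * a**s * rho / (math.factorial(s) * (1 - rho) ** 2)
    return rho, lq / lam, lq

# hypothetical banks: (arrivals per hour, service rate per ATM per hour, number of ATMs)
banks = {"A": (40, 30, 2), "B": (55, 30, 2), "C": (70, 30, 3), "D": (90, 30, 4)}
rows = np.array([mms_metrics(*v) for v in banks.values()])   # criteria: utilization, Wq, Lq

# TOPSIS: utilization treated as a benefit criterion; waiting time and queue length as costs
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([True, False, False])
v = rows / np.linalg.norm(rows, axis=0) * weights
ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
closeness = (np.linalg.norm(v - anti, axis=1)
             / (np.linalg.norm(v - ideal, axis=1) + np.linalg.norm(v - anti, axis=1)))
for name, c in sorted(zip(banks, closeness), key=lambda x: -x[1]):
    print(name, round(c, 3))
```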

  10. Improving and streamlining the workflow in the graphic arts and printing industry

    Science.gov (United States)

    Tuijn, Chris

    2003-01-01

    In order to survive in the economy of today, an ever-increasing productivity is required from all the partners participating in a specific business process. This is not different for the printing industry. One of the ways to remain profitable is, on one hand, to reduce costs by automation and aiming for large-scale projects and, on the other hand, to specialize and become an expert in the area in which one is active. One of the ways to realize these goals is by streamlining the communication of the different partners and focusing on the core business. If we look at the graphic arts and printing industry, we can identify different important players that eventually help in the realization of printed material. For the printing company (as is the case for any other company), the most important player is the customer. This role can be adopted by many different players including publishers, companies, non-commercial institutions, private persons, etc. Sometimes, the customer will be the content provider as well, but this is not always the case. Often, the content is provided by other organizations such as design and prepress agencies, advertising companies, etc. In most printing organizations, the customer has one contact person often referred to as the CSR (Customer Service Representative). Other people involved at the printing organization include the sales representatives, prepress operators, printing operators, postpress operators, planners, the logistics department, the financial department, etc. In the first part of this article, we propose a solution that will improve the communication between all the different actors in the graphic arts and printing industry considerably and will optimize and streamline the overall workflow as well. This solution consists of an environment in which the customer can communicate with the CSR to ask for a quote based on a specific product intent; the CSR will then (after the approval from the customer's side) organize the work and brief

  11. A semi-automated approach for mapping geomorphology of El Bardawil Lake, Northern Sinai, Egypt, using integrated remote sensing and GIS techniques

    Directory of Open Access Journals (Sweden)

    Nabil Sayed Embabi

    2014-06-01

    Full Text Available Among the other coastal lakes of the Mediterranean northern coast of Egypt, Bardawil Lake is a unique lagoon, as it is fed only by seawater. The lagoon is composed of two main basins, and several other internal small basins interconnected to one another. Although the general geomorphologic characteristics are treated in some regional studies, we used a semi-automated approach based on a wide variety of digital image processing for mapping the major geomorphological landforms of the lake on a medium scale of 1:250,000. The approach is based primarily on data fusion of Landsat ETM+ imagery, validated by other ancillary spatial data (e.g. topographic maps, Google images and GPS in situ data). Interpretations of high resolution space images by Google Earth and the large-scale topographic maps (1:25,000), in particular, revealed new microforms and some detailed geomorphologic aspects with the aid of GPS measurements. Small sand barriers, submerged sand dunes, tidal channels, fans and flats, and micro-lagoons are the recurrent forms in the lake. The approach used in this study could be widely applied to study the low-lying coastal lands along the Nile Delta. However, it is concluded from geological data and geomorphologic aspects that Bardawil Lake is of a tectonic origin; it was much deeper than it is currently, and has been filled with sediments mostly since the Flandrian transgression (∼8–6 ka BP).

  12. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    Full Text Available We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
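
    To illustrate the general idea of grammar-based test generation with a semantic oracle (this is not TAO's notation or API; the toy grammar, oracle and inputs are invented), a few lines of Python follow; in TAO's setting the generated inputs would drive a Selenium script against the web application under test:

```python
import random

# a toy grammar for form inputs on a hypothetical parking-reservation page
GRAMMAR = {
    "<booking>": [["<hours>", " hours, plate ", "<plate>"]],
    "<hours>":   [["1"], ["2"], ["8"], ["24"], ["0"], ["-1"]],
    "<plate>":   [["ABC123"], ["XYZ 99"], [""]],
}

def generate(symbol="<booking>"):
    """Expand a nonterminal by randomly choosing one production (grammar-based test generation)."""
    if symbol not in GRAMMAR:
        return symbol                                  # terminal: emit as-is
    return "".join(generate(tok) for tok in random.choice(GRAMMAR[symbol]))

def oracle(test_input):
    """Toy semantic oracle: a booking is valid only for 1-24 hours and a non-empty plate."""
    hours = int(test_input.split(" hours")[0])
    plate = test_input.split("plate ")[1]
    return 1 <= hours <= 24 and plate.strip() != ""

for _ in range(5):
    case = generate()
    print(repr(case), "->", "accept" if oracle(case) else "reject")
```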

  13. Two-dimensional parallel array technology as a new approach to automated combinatorial solid-phase organic synthesis

    Science.gov (United States)

    Brennan; Biddison; Frauendorf; Schwarcz; Keen; Ecker; Davis; Tinder; Swayze

    1998-01-01

    An automated, 96-well parallel array synthesizer for solid-phase organic synthesis has been designed and constructed. The instrument employs a unique reagent array delivery format, in which each reagent utilized has a dedicated plumbing system. An inert atmosphere is maintained during all phases of a synthesis, and temperature can be controlled via a thermal transfer plate which holds the injection molded reaction block. The reaction plate assembly slides in the X-axis direction, while eight nozzle blocks holding the reagent lines slide in the Y-axis direction, allowing for the extremely rapid delivery of any of 64 reagents to 96 wells. In addition, there are six banks of fixed nozzle blocks, which deliver the same reagent or solvent to eight wells at once, for a total of 72 possible reagents. The instrument is controlled by software which allows the straightforward programming of the synthesis of a larger number of compounds. This is accomplished by supplying a general synthetic procedure in the form of a command file, which calls upon certain reagents to be added to specific wells via lookup in a sequence file. The bottle position, flow rate, and concentration of each reagent is stored in a separate reagent table file. To demonstrate the utility of the parallel array synthesizer, a small combinatorial library of hydroxamic acids was prepared in high throughput mode for biological screening. Approximately 1300 compounds were prepared on a 10 μmole scale (3-5 mg) in a few weeks. The resulting crude compounds were generally >80% pure, and were utilized directly for high throughput screening in antibacterial assays. Several active wells were found, and the activity was verified by solution-phase synthesis of analytically pure material, indicating that the system described herein is an efficient means for the parallel synthesis of compounds for lead discovery. Copyright 1998 John Wiley & Sons, Inc.

  14. Filaments in curved streamlines: rapid formation of Staphylococcus aureus biofilm streamers

    International Nuclear Information System (INIS)

    Kevin Kim, Minyoung; Drescher, Knut; Shun Pak, On; Stone, Howard A; Bassler, Bonnie L

    2014-01-01

    Biofilms are surface-associated conglomerates of bacteria that are highly resistant to antibiotics. These bacterial communities can cause chronic infections in humans by colonizing, for example, medical implants, heart valves, or lungs. Staphylococcus aureus, a notorious human pathogen, causes some of the most common biofilm-related infections. Despite the clinical importance of S. aureus biofilms, it remains mostly unknown how physical effects, in particular flow, and surface structure influence biofilm dynamics. Here we use model microfluidic systems to investigate how environmental factors, such as surface geometry, surface chemistry, and fluid flow affect biofilm development of S. aureus. We discovered that S. aureus rapidly forms flow-induced, filamentous biofilm streamers, and furthermore if surfaces are coated with human blood plasma, streamers appear within minutes and clog the channels more rapidly than if the channels are uncoated. To understand how biofilm streamer filaments reorient in flows with curved streamlines to bridge the distances between corners, we developed a mathematical model based on resistive force theory of slender filaments. Understanding physical aspects of biofilm formation of S. aureus may lead to new approaches for interrupting biofilm formation of this pathogen. (paper)

  15. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by a computer sample workflow control (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain), for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference with TLA is the replacement of transport belts by personnel of the laboratory. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  16. Geologic storage of carbon dioxide and enhanced oil recovery. I. Uncertainty quantification employing a streamline based proxy for reservoir flow simulation

    International Nuclear Information System (INIS)

    Kovscek, A.R.; Wang, Y.

    2005-01-01

    Carbon dioxide (CO2) is already injected into a limited class of reservoirs for oil recovery purposes; however, the engineering design question for simultaneous oil recovery and storage of anthropogenic CO2 is significantly different from that of oil recovery alone. Currently, the volumes of CO2 injected solely for oil recovery are minimized due to the purchase cost of CO2. If and when CO2 emissions to the atmosphere are managed, it will be necessary to maximize simultaneously both economic oil recovery and the volumes of CO2 emplaced in oil reservoirs. This process is coined 'cooptimization'. This paper proposes a work flow for cooptimization of oil recovery and geologic CO2 storage. An important component of the work flow is the assessment of uncertainty in predictions of performance. Typical methods for quantifying uncertainty employ exhaustive flow simulation of multiple stochastic realizations of the geologic architecture of a reservoir. Such approaches are computationally intensive and thereby time consuming. An analytic streamline based proxy for full reservoir simulation is proposed and tested. Streamline trajectories represent the three-dimensional velocity field during multiphase flow in porous media and so are useful for quantifying the similarity and differences among various reservoir models. The proxy allows rational selection of a representative subset of equi-probable reservoir models that encompass uncertainty with respect to true reservoir geology. The streamline approach is demonstrated to be thorough and rapid.

  17. A study of low cost approaches to scientific experiment implementation for shuttle launched and serviced automated spacecraft

    Science.gov (United States)

    1975-01-01

    Cost reductions that can be obtained in experiment instrumentation by the use of standardized electronics and by the relaxation of instrument reliability requirements are studied. The feasibility of using standardized equipment for experiment instrumentation is assessed and a system design approach that most effectively incorporates standardized equipment is developed. The level and form of modularization that is appropriate for the standardized equipment is determined. Mission assurance aspects of instrument development are examined to determine the cost reductions that might be derived from the relaxation of reliability requirements and to formulate a systematic approach to the optimization of mission assurance cost reductions. The results of the analyses are applied to a representative model HEAO payload in order to provide a concrete example of the cost reductions that can be achieved by a standardized approach to the instrument electronics.

  18. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3d laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on the interpretation and transformation of the resulting Point Cloud data into information, which can be used in architectural and engineering design workflows. Our approach to tackle this problem is, in contrast to existing ones which work on the level of points, based on the detection of building elements...

  19. Effectiveness of and obstacles to antibiotic streamlining to amoxicillin monotherapy in bacteremic pneumococcal pneumonia.

    Science.gov (United States)

    Blot, Mathieu; Pivot, Diane; Bourredjem, Abderrahmane; Salmon-Rousseau, Arnaud; de Curraize, Claire; Croisier, Delphine; Chavanet, Pascal; Binquet, Christine; Piroth, Lionel

    2017-09-01

    Antibiotic streamlining is pivotal to reduce the emergence of resistant bacteria. However, whether streamlining is frequently performed and safe in difficult situations, such as bacteremic pneumococcal pneumonia (BPP), has still to be assessed. All adult patients admitted to Dijon Hospital (France) from 2005 to 2013 who had BPP without complications, and were alive on the third day were enrolled. Clinical, biological, radiological, microbiological and therapeutic data were recorded. A first analysis was conducted to assess factors associated with being on amoxicillin on the third day. A second analysis, adjusting for a propensity score, was performed to determine whether 30-day mortality was associated with streamlining to amoxicillin monotherapy. Of the 196 patients hospitalized for BPP, 161 were still alive on the third day and were included in the study. Treatment was streamlined to amoxicillin in 60 patients (37%). Factors associated with not streamlining were severe pneumonia (OR 3.11, 95%CI [1.23-7.87]) and a first-line antibiotic combination (OR 3.08, 95%CI [1.34-7.09]). By contrast, starting with amoxicillin monotherapy correlated inversely with the risk of subsequent treatment with antibiotics other than amoxicillin (OR 0.06, 95%CI [0.01-0.30]). The Cox model adjusted for the propensity-score analysis showed that streamlining to amoxicillin during BPP was not significantly associated with a higher risk of 30-day mortality (HR 0.38, 95%CI [0.08-1.87]). Streamlining to amoxicillin is insufficiently implemented during BPP. This strategy is safe and potentially associated with ecological and economic benefits; therefore, it should be further encouraged, particularly when antibiotic combinations are started for severe pneumonia. Copyright © 2017. Published by Elsevier B.V.

  20. A framework for streamlining research workflow in neuroscience and psychology

    Directory of Open Access Journals (Sweden)

    Jonas eKubilius

    2014-01-01

    Full Text Available Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers.

  1. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Full Text Available Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. However, this study focused on extracting cranial landmarks from large sets of cross sectional CT slices using a combined method of the two aforementioned approaches. The proposed method of this study is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules which are a characteristic of the knowledge based method. Secondly, the same template sets were employed to perform template matching related to the learning methodologies approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and within the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined template sets centric approach proposed in this study.

  2. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3d laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on th...

  3. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3d laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent...

  4. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  5. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  6. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  7. Food intake monitoring: an acoustical approach to automated food intake activity detection and classification of consumed food

    International Nuclear Information System (INIS)

    Päßler, Sebastian; Fischer, Wolf-Joachim; Wolff, Matthias

    2012-01-01

    Obesity and nutrition-related diseases are currently growing challenges for medicine. A precise and timesaving method for food intake monitoring is needed. For this purpose, an approach based on the classification of sounds produced during food intake is presented. Sounds are recorded non-invasively by miniature microphones in the outer ear canal. A database of 51 participants eating seven types of food and consuming one drink has been developed for algorithm development and model training. The database is labeled manually using a protocol with introductions for annotation. The annotation procedure is evaluated using Cohen's kappa coefficient. The food intake activity is detected by the comparison of the signal energy of in-ear sounds to environmental sounds recorded by a reference microphone. Hidden Markov models are used for the recognition of single chew or swallowing events. Intake cycles are modeled as event sequences in finite-state grammars. Classification of consumed food is realized by a finite-state grammar decoder based on the Viterbi algorithm. We achieved a detection accuracy of 83% and a food classification accuracy of 79% on a test set of 10% of all records. Our approach faces the need of monitoring the time and occurrence of eating. With differentiation of consumed food, a first step toward the goal of meal weight estimation is taken. (paper)
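
    As a rough illustration of the decoding step described in this record, the Python sketch below runs the standard Viterbi algorithm over a toy two-state model (chew vs. swallow). The states, observation symbols, and probabilities are invented for illustration and are not the paper's trained models.

        import numpy as np

        # Toy HMM: decode a sequence of acoustic frame labels into chew/swallow events (illustrative only).
        states = ["chew", "swallow"]
        obs_symbols = {"crunch": 0, "gulp": 1, "quiet": 2}

        log_pi = np.log([0.7, 0.3])                    # initial state probabilities
        log_A = np.log([[0.8, 0.2],                    # transition probabilities
                        [0.4, 0.6]])
        log_B = np.log([[0.6, 0.1, 0.3],               # emission probabilities per state
                        [0.1, 0.6, 0.3]])

        def viterbi(observations):
            T = len(observations)
            delta = np.zeros((T, 2))
            psi = np.zeros((T, 2), dtype=int)
            delta[0] = log_pi + log_B[:, observations[0]]
            for t in range(1, T):
                scores = delta[t - 1][:, None] + log_A     # scores[i, j]: best path ending in i, moving to j
                psi[t] = np.argmax(scores, axis=0)
                delta[t] = scores[psi[t], range(2)] + log_B[:, observations[t]]
            path = [int(np.argmax(delta[-1]))]
            for t in range(T - 1, 0, -1):                  # backtrack the most likely state sequence
                path.append(int(psi[t][path[-1]]))
            return [states[s] for s in reversed(path)]

        frames = [obs_symbols[s] for s in ["crunch", "crunch", "quiet", "gulp"]]
        print(viterbi(frames))   # ['chew', 'chew', 'chew', 'swallow']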

  8. Streamlining CASTOR to manage the LHC data torrent

    International Nuclear Information System (INIS)

    Presti, G Lo; Curull, X Espinal; Cano, E; Fiorini, B; Ieri, A; Murray, S; Ponce, S; Sindrilaru, E

    2014-01-01

    This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, going towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system and it has been enhanced in order to automate the repacking of several tens of petabytes, required in 2014 in order to prepare for the next LHC run. Finally the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service a solid monitoring infrastructure is required, able to analyze the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed, which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, and real-time data aggregation and visualization. The outlook for the future is also presented. Directions and possible evolution will be discussed in view of the restart of data taking activities.
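
    The monitoring layer described above reduces, at its core, to aggregating a high-rate log stream by key and time window before visualization. The Python snippet below is a minimal, stand-alone sketch of that MapReduce-style aggregation; the log format and field names are invented, and the production system uses a messaging layer with Hadoop/HBase rather than in-process code.

        from collections import Counter
        from datetime import datetime

        # Minimal map/reduce-style aggregation of log lines per (component, level, minute); format is hypothetical.
        def parse(line):
            # Hypothetical format: "2014-02-03T12:04:56 component=stager level=ERROR msg=timeout"
            ts_str, rest = line.split(" ", 1)
            fields = dict(kv.split("=", 1) for kv in rest.split(" ") if "=" in kv)
            minute = datetime.fromisoformat(ts_str).replace(second=0, microsecond=0)
            return (fields.get("component", "unknown"), fields.get("level", "INFO"), minute)

        def aggregate(lines):
            counts = Counter()
            for line in lines:
                key = parse(line)          # "map": extract a key from each record
                counts[key] += 1           # "reduce": sum occurrences per key
            return counts

        sample = [
            "2014-02-03T12:04:56 component=stager level=ERROR msg=timeout",
            "2014-02-03T12:04:58 component=stager level=ERROR msg=timeout",
            "2014-02-03T12:05:01 component=tapeserver level=INFO msg=mounted",
        ]
        for key, n in aggregate(sample).items():
            print(key, n)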

  9. Streamlining resummed QCD calculations using Monte Carlo integration

    Energy Technology Data Exchange (ETDEWEB)

    Farhi, David; Feige, Ilya; Freytsis, Marat; Schwartz, Matthew D. [Center for the Fundamental Laws of Nature, Harvard University,17 Oxford St., Cambridge, MA 02138 (United States)

    2016-08-18

    Some of the most arduous and error-prone aspects of precision resummed calculations are related to the partonic hard process, having nothing to do with the resummation. In particular, interfacing to parton-distribution functions, combining various channels, and performing the phase space integration can be limiting factors in completing calculations. Conveniently, however, most of these tasks are already automated in many Monte Carlo programs, such as MADGRAPH http://dx.doi.org/10.1007/JHEP07(2014)079, ALPGEN http://dx.doi.org/10.1088/1126-6708/2003/07/001 or SHERPA http://dx.doi.org/10.1088/1126-6708/2009/02/007. In this paper, we show how such programs can be used to produce distributions of partonic kinematics with associated color structures representing the hard factor in a resummed distribution. These distributions can then be used to weight convolutions of jet, soft and beam functions producing a complete resummed calculation. In fact, only around 1000 unweighted events are necessary to produce precise distributions. A number of examples and checks are provided, including e+e− two- and four-jet event shapes, n-jettiness and jet-mass related observables at hadron colliders at next-to-leading-log (NLL) matched to leading order (LO). Attached code can be used to modify MADGRAPH to export the relevant LO hard functions and color structures for arbitrary processes.

  10. Automated attribution of remotely-sensed ecological disturbances using spatial and temporal characteristics of common disturbance classes.

    Science.gov (United States)

    Cooper, L. A.; Ballantyne, A.

    2017-12-01

    Forest disturbances are critical components of ecosystems. Knowledge of their prevalence and impacts is necessary to accurately describe forest health and ecosystem services through time. While there are currently several methods available to identify and describe forest disturbances, especially those which occur in North America, the process remains inefficient and inaccessible in many parts of the world. Here, we introduce a preliminary approach to streamline and automate both the detection and attribution of forest disturbances. We use a combination of the Breaks for Additive Season and Trend (BFAST) detection algorithm to detect disturbances in combination with supervised and unsupervised classification algorithms to attribute the detections to disturbance classes. Both spatial and temporal disturbance characteristics are derived and utilized for the goal of automating the disturbance attribution process. The resulting preliminary algorithm is applied to up-scaled (100m) Landsat data for several different ecosystems in North America, with varying success. Our results indicate that supervised classification is more reliable than unsupervised classification, but that limited training data are required for a region. Future work will improve the algorithm through refining and validating at sites within North America before applying this approach globally.
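
    The two-stage idea in this record (detect a break in a vegetation-index time series, then attribute it from simple temporal descriptors) can be prototyped compactly. The sketch below is not BFAST itself: it finds the largest mean shift in a single pixel's time series and attributes it with a nearest-centroid rule over hypothetical class signatures; the thresholds and centroids are invented.

        import numpy as np

        # Illustrative stand-in for detection + attribution; thresholds and class signatures are invented.
        def largest_mean_shift(series):
            """Return (index, magnitude) of the single breakpoint that best splits the series."""
            best_i, best_mag = None, 0.0
            for i in range(2, len(series) - 2):
                mag = abs(series[i:].mean() - series[:i].mean())
                if mag > best_mag:
                    best_i, best_mag = i, mag
            return best_i, best_mag

        # Hypothetical class centroids: (drop magnitude, recovery slope after the break)
        CENTROIDS = {
            "fire":    np.array([0.35, 0.020]),
            "harvest": np.array([0.30, 0.010]),
            "insect":  np.array([0.10, 0.002]),
        }

        def attribute(series):
            i, mag = largest_mean_shift(series)
            if i is None or mag < 0.05:
                return "no disturbance"
            post = series[i:]
            slope = np.polyfit(np.arange(len(post)), post, 1)[0]   # crude recovery-rate descriptor
            features = np.array([mag, slope])
            return min(CENTROIDS, key=lambda c: np.linalg.norm(features - CENTROIDS[c]))

        ndvi = np.concatenate([np.full(20, 0.80), np.full(20, 0.45) + np.arange(20) * 0.01])
        print(attribute(ndvi))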

  11. Quantifying the performance of automated GIS-based geomorphological approaches for riparian zone delineation using digital elevation models

    Directory of Open Access Journals (Sweden)

    D. Fernández

    2012-10-01

    Full Text Available Riparian zone delineation is a central issue for managing rivers and adjacent areas; however, criteria used to delineate them are still under debate. The area inundated by a 50-yr flood has been indicated as an optimal hydrological descriptor for riparian areas. This detailed hydrological information is usually only available for populated areas at risk of flooding. In this work we created several floodplain surfaces by means of two different GIS-based geomorphological approaches using digital elevation models (DEMs, in an attempt to find hydrologically meaningful potential riparian zones for river networks at the river basin scale. Objective quantification of the performance of the two geomorphologic models is provided by analysing coinciding and exceeding areas with respect to the 50-yr flood surface in different river geomorphological types.
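
    The performance measure used in this record (coinciding versus exceeding area relative to the 50-yr flood surface) is a raster overlap computation once both surfaces are available as boolean masks. The Python sketch below shows that computation on synthetic masks; the cell size and the masks themselves are placeholders.

        import numpy as np

        # Synthetic example: compare a DEM-derived floodplain mask against a reference 50-yr flood mask.
        def overlap_metrics(model_mask, reference_mask, cell_area_m2=25.0):
            coinciding = np.logical_and(model_mask, reference_mask).sum() * cell_area_m2
            exceeding = np.logical_and(model_mask, ~reference_mask).sum() * cell_area_m2
            missed = np.logical_and(~model_mask, reference_mask).sum() * cell_area_m2
            return {"coinciding_m2": coinciding, "exceeding_m2": exceeding, "missed_m2": missed}

        rng = np.random.default_rng(0)
        reference = rng.random((100, 100)) < 0.3          # stand-in for the 50-yr flood surface
        model = reference.copy()
        model[rng.random((100, 100)) < 0.05] = True       # model over-predicts in a few cells

        print(overlap_metrics(model, reference))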

  12. Streamline-concentration balance model for in-situ uranium leaching and site restoration

    International Nuclear Information System (INIS)

    Bommer, P.M.; Schechter, R.S.; Humenick, M.J.

    1981-03-01

    This work presents two computer models. One describes in-situ uranium leaching and the other describes post leaching site restoration. Both models use a streamline generator to set up the flow field over the reservoir. The leaching model then uses the flow data in a concentration balance along each streamline coupled with the appropriate reaction kinetics to calculate uranium production. The restoration model uses the same procedure except that binary cation exchange is used as the restoring mechanism along each streamline and leaching cation clean up is simulated. The mathematical basis for each model is shown in detail along with the computational schemes used. Finally, the two models have been used with several data sets to point out their capabilities and to illustrate important leaching and restoration parameters and schemes
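
    The core of such a streamline model is a one-dimensional concentration balance solved along each streamline, with the flow field supplying the velocity and a kinetic term supplying the source or sink. The sketch below solves a first-order dissolution/exchange balance along one streamline with an explicit upwind scheme; the geometry, velocity, and rate constants are made up for illustration and are not the report's models.

        import numpy as np

        # Explicit upwind solve of  dc/dt + v dc/dx = k (c_eq - c)  along a single streamline (illustrative).
        def advect_react(v=1.0, k=0.05, c_eq=1.0, length=100.0, nx=200, t_end=200.0):
            dx = length / nx
            dt = 0.5 * dx / v                  # CFL-limited time step
            c = np.zeros(nx)                   # initial concentration along the streamline
            for _ in range(int(t_end / dt)):
                c_up = np.empty(nx)
                c_up[0] = 0.0                  # injection-well boundary: fresh lixiviant
                c_up[1:] = c[:-1]
                c = c - v * dt / dx * (c - c_up) + dt * k * (c_eq - c)
            return c                           # concentration profile; c[-1] is the producer-end value

        profile = advect_react()
        print(round(float(profile[-1]), 3))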

  13. Self streamlining wind tunnel: Further low speed testing and final design studies for the transonic facility

    Science.gov (United States)

    Wolf, S. W. D.

    1978-01-01

    Work was continued with the low speed self streamlining wind tunnel (SSWT) using the NACA 0012-64 airfoil in an effort to explain the discrepancies between the NASA Langley low turbulence pressure tunnel (LTPT) and SSWT results obtained with the airfoil stalled. Conventional wind tunnel corrections were applied to straight wall SSWT airfoil data, to illustrate the inadequacy of standard correction techniques in circumstances of high blockage. Also one SSWT test was re-run at different air speeds to investigate the effects of such changes (perhaps through changes in Reynolds number and freestream turbulence levels) on airfoil data and wall contours. Mechanical design analyses for the transonic self-streamlining wind tunnel (TSWT) were completed by the application of theoretical airfoil flow field data to the elastic beam and streamline analysis. The control system for the transonic facility, which will eventually allow on-line computer operation of the wind tunnel, was outlined.

  14. Streamline-concentration balance model for in situ uranium leaching and site restoration

    International Nuclear Information System (INIS)

    Bommer, P.M.

    1979-01-01

    This work presents two computer models. One describes in situ uranium leaching and the other describes post leaching site restoration. Both models use a streamline generator to set up the flow field over the reservoir. The leaching model then uses the flow data in a concentration balance along each streamline coupled with the appropriate reaction kinetics to calculate uranium production. The restoration model uses the same procedure except that binary cation exchange is used as the restoring mechanism along each streamline and leaching cation clean up is simulated. The mathematical basis for each model is shown in detail along with the computational schemes used. Finally, the two models have been used with several data sets to point out their capabilities and to illustrate important leaching and restoration parameters and schemes.

  15. Application of a new genetic classification and semi-automated geomorphic mapping approach in the Perth submarine canyon, Australia

    Science.gov (United States)

    Picard, K.; Nanson, R.; Huang, Z.; Nichol, S.; McCulloch, M.

    2017-12-01

    The acquisition of high resolution marine geophysical data has intensified in recent years (e.g. multibeam echo-sounding, sub-bottom profiling). This progress provides the opportunity to classify and map the seafloor in greater detail, using new methods that preserve the links between processes and morphology. Geoscience Australia has developed a new genetic classification approach, nested within the Harris et al (2014) global seafloor mapping framework. The approach divides parent units into sub-features based on established classification schemes and feature descriptors defined by Bradwell et al. (2016: http://nora.nerc.ac.uk/), the International Hydrographic Organization (https://www.iho.int) and the Coastal Marine and Ecological Classification Standard (https://www.cmecscatalog.org). Owing to the ecological significance of submarine canyon systems in particular, much recent attention has focused on defining their variation in form and process, whereby they can be classified using a range of topographic metrics, fluvial dis/connection and shelf-incising status. The Perth Canyon is incised into the continental slope and shelf of southwest Australia, covering an area of >1500 km2 and extending from 4700 m water depth to the shelf break in 170 m. The canyon sits within a Marine Protected Area, incorporating a Marine National Park and Habitat Protection Zone in recognition of its benthic and pelagic biodiversity values. However, detailed information of the spatial patterns of the seabed habitats that influence this biodiversity is lacking. Here we use 20 m resolution bathymetry and acoustic backscatter data acquired in 2015 by the Schmidt Ocean Institute, plus sub-bottom datasets and sediment samples collected by Geoscience Australia in 2005, to apply the new geomorphic classification system to the Perth Canyon. This presentation will show the results of the geomorphic feature mapping of the canyon and its application to better defining potential benthic habitats.

  16. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King' s College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  17. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  18. Unique encoding for streamline topologies of incompressible and inviscid flows in multiply connected domains

    Energy Technology Data Exchange (ETDEWEB)

    Sakajo, T [Department of Mathematics, Kyoto University, Kitashirakawa Oiwake-cho, Sakyo-ku, Kyoto 606-8502 (Japan); Sawamura, Y; Yokoyama, T, E-mail: sakajo@math.kyoto-u.ac.jp [JST CREST, Kawaguchi, Saitama 332-0012 (Japan)

    2014-06-01

    This study considers the flow of incompressible and inviscid fluid in two-dimensional multiply connected domains. For such flows, encoding algorithms to assign a unique sequence of words to any structurally stable streamline topology based on the theory presented by Yokoyama and Sakajo (2013 Proc. R. Soc. A 469 20120558) are proposed. As an application, we utilize the algorithms to characterize the evolution of an incompressible and viscid flow around a flat plate inclined to the uniform flow in terms of the change of the word representations for their instantaneous streamline topologies. (papers)

  19. Streamline Patterns and their Bifurcations near a wall with Navier slip Boundary Conditions

    DEFF Research Database (Denmark)

    Tophøj, Laust; Møller, Søren; Brøns, Morten

    2006-01-01

    We consider the two-dimensional topology of streamlines near a surface where the Navier slip boundary condition applies. Using transformations to bring the streamfunction into a simple normal form, we obtain bifurcation diagrams of streamline patterns under variation of one or two external parameters. Topologically, these are identical with the ones previously found for no-slip surfaces. We use the theory to analyze the Stokes flow inside a circle, and show how it can be used to predict new bifurcation phenomena. ©2006 American Institute of Physics

  20. Analysis of Streamline Separation at Infinity Using Time-Discrete Markov Chains.

    Science.gov (United States)

    Reich, W; Scheuermann, G

    2012-12-01

    Existing methods for analyzing separation of streamlines are often restricted to a finite time or a local area. In our paper we introduce a new method that complements them by allowing an infinite-time evaluation of steady planar vector fields. Our algorithm unifies combinatorial and probabilistic methods and introduces the concept of separation in time-discrete Markov chains. We compute particle distributions instead of the streamlines of single particles. We encode the flow into a map and then into a transition matrix for each time direction. Finally, we compare the results of our grid-independent algorithm to the popular finite-time Lyapunov exponents and discuss the discrepancies.
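
    The central object in the method sketched above is a row-stochastic transition matrix built from a discrete flow map: each cell of the domain distributes its particle mass among the cells its points reach after one time step. The Python example below builds such a matrix for a simple 1-D periodic map and iterates a particle distribution; it is a schematic of the idea, not the authors' implementation, and the flow map is invented.

        import numpy as np

        # Build a transition matrix from a discrete flow map on a 1-D periodic domain (illustrative).
        def transition_matrix(flow_map, n_cells=50, samples_per_cell=20):
            P = np.zeros((n_cells, n_cells))
            edges = np.linspace(0.0, 1.0, n_cells + 1)
            for i in range(n_cells):
                xs = np.linspace(edges[i], edges[i + 1], samples_per_cell, endpoint=False)
                targets = np.searchsorted(edges, flow_map(xs) % 1.0, side="right") - 1
                for j in targets:
                    P[i, j] += 1.0 / samples_per_cell    # fraction of cell i mapped into cell j
            return P

        flow = lambda x: x + 0.05 * np.sin(2 * np.pi * x) + 0.02   # simple steady 1-D "flow" over one step

        P = transition_matrix(flow)
        dist = np.full(P.shape[0], 1.0 / P.shape[0])               # start from a uniform particle distribution
        for _ in range(200):                                       # long-time behaviour instead of single streamlines
            dist = dist @ P
        print("cells holding 90% of the mass:", int(np.sum(np.sort(dist)[::-1].cumsum() < 0.9)) + 1)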

  1. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States.

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-02-01

    Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. A set of command line-based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion.
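
    The wrapper pattern described in this record amounts to generating standardized input, invoking an external reconstruction program, and collecting its output. The Python sketch below shows that shape with a deliberately hypothetical command line (reconstruct_asr); it is not WARACS itself and does not reproduce the actual Mesquite, BayesTraits, or TreeGraph2 interfaces.

        import subprocess
        from pathlib import Path

        # Generic wrapper pattern: write input, call an external tool, collect output (tool name is hypothetical).
        def run_reconstruction(tree_file: str, characters_file: str, out_dir: str = "asr_results") -> Path:
            out = Path(out_dir)
            out.mkdir(exist_ok=True)
            result_file = out / "ancestral_states.txt"

            cmd = [
                "reconstruct_asr",             # hypothetical command-line reconstruction program
                "--trees", tree_file,
                "--characters", characters_file,
                "--criterion", "likelihood",
                "--out", str(result_file),
            ]
            completed = subprocess.run(cmd, capture_output=True, text=True)
            if completed.returncode != 0:
                raise RuntimeError(f"reconstruction failed:\n{completed.stderr}")
            return result_file

        # Example (assumes the hypothetical tool and input files exist):
        # states = run_reconstruction("posterior_trees.nex", "habit.tsv")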

  2. Providing Data Management Support to NASA Airborne Field Studies through Streamlined Usability Design

    Science.gov (United States)

    Beach, A. L., III; Northup, E. A.; Early, A. B.; Chen, G.

    2016-12-01

    Airborne field studies are an effective way to gain a detailed understanding of atmospheric processes for scientific research on climate change and air quality relevant issues. One major function of airborne project data management is to maintain seamless data access within the science team. This allows individual instrument principal investigators (PIs) to process and validate their own data, which requires analysis of data sets from other PIs (or instruments). The project's web platform streamlines data ingest, distribution processes, and data format validation. In May 2016, the NASA Langley Research Center (LaRC) Atmospheric Science Data Center (ASDC) developed a new data management capability to help support the Korea U.S.-Air Quality (KORUS-AQ) science team. This effort is aimed at providing direct NASA Distributed Active Archive Center (DAAC) support to an airborne field study. Working closely with the science team, the ASDC developed a scalable architecture that allows investigators to easily upload and distribute their data and documentation within a secure collaborative environment. The user interface leverages modern design elements to intuitively guide the PI through each step of the data management process. In addition, the new framework creates an abstraction layer between how the data files are stored and how the data itself is organized (i.e., grouping files by PI). This approach makes it easy for PIs to simply transfer their data to one directory, while the system itself can automatically group/sort data as needed. Moreover, the platform is "server agnostic" to a certain degree, making deployment and customization more straightforward as hardware needs change. This flexible design will improve development efficiency and can be leveraged for future field campaigns. This presentation will examine the KORUS-AQ data portal as a scalable solution that applies consistent and intuitive usability design practices to support ingest and management of airborne

  3. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    Science.gov (United States)

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

    With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from new and FDA methods.

  4. Streamlining the Design Tradespace for Earth Imaging Constellations

    Science.gov (United States)

    Nag, Sreeja; Hughes, Steven P.; Le Moigne, Jacqueline J.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth
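
    As a schematic of the dual approach described above, the Python snippet below combines a parametric mass-based CER with an analogy estimate and applies a learning-curve discount for multiple identical spacecraft. All coefficients, the analogy value, and the learning-curve slope are placeholders, not values used in TAT-C.

        import math

        # Placeholder cost model: parametric CER + analogy estimate + learning curve (coefficients are invented).
        def parametric_cost(bus_dry_mass_kg, a=1.2, b=0.66):
            """Hypothetical CER: first-unit cost in $M as a power law of dry mass."""
            return a * bus_dry_mass_kg ** b

        def learning_curve_factor(n_units, slope=0.9):
            """Total-cost multiplier for n identical units under a Wright learning curve."""
            p = math.log(slope, 2)                     # unit i costs i**p times the first unit
            return sum(i ** p for i in range(1, n_units + 1))

        def constellation_estimate(bus_dry_mass_kg, n_units, analogy_first_unit_cost=None):
            first_unit = parametric_cost(bus_dry_mass_kg)
            if analogy_first_unit_cost is not None:
                first_unit = 0.5 * (first_unit + analogy_first_unit_cost)   # blend the two perspectives
            return first_unit * learning_curve_factor(n_units)

        print(f"estimate for 12 x 150 kg spacecraft: ${constellation_estimate(150.0, 12, 40.0):.1f}M")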

  5. Southern Ocean overturning across streamlines in an eddying simulation of the Antarctic Circumpolar Current

    Directory of Open Access Journals (Sweden)

    A. M. Treguier

    2007-12-01

    Full Text Available An eddying global model is used to study the characteristics of the Antarctic Circumpolar Current (ACC) in a streamline-following framework. Previous model-based estimates of the meridional circulation were calculated using zonal averages: this method leads to a counter-intuitive poleward circulation of the less dense waters, and underestimates the eddy effects. We show that on the contrary, the upper ocean circulation across streamlines agrees with the theoretical view: an equatorward mean flow partially cancelled by a poleward eddy mass flux. Two model simulations, in which the buoyancy forcing above the ACC changes from positive to negative, suggest that the relationship between the residual meridional circulation and the surface buoyancy flux is not as straightforward as assumed by the simplest theoretical models: the sign of the residual circulation cannot be inferred from the surface buoyancy forcing only. Among the other processes that likely play a part in setting the meridional circulation, our model results emphasize the complex three-dimensional structure of the ACC (probably not well accounted for in streamline-averaged, two-dimensional models) and the distinct role of temperature and salinity in the definition of the density field. Heat and salt transports by the time-mean flow are important even across time-mean streamlines. Heat and salt are balanced in the ACC, the model drift being small, but the nonlinearity of the equation of state cannot be ignored in the density balance.

  6. Streamlining the Online Course Development Process by Using Project Management Tools

    Science.gov (United States)

    Abdous, M'hammed; He, Wu

    2008-01-01

    Managing the design and production of online courses is challenging. Insufficient instructional design and inefficient management often lead to issues such as poor course quality and course delivery delays. In an effort to facilitate, streamline, and improve the overall design and production of online courses, this article discusses how we…

  7. Less is More : Better Compliance and Increased Revenues by Streamlining Business Registration in Uganda

    OpenAIRE

    Sander, Cerstin

    2003-01-01

    A pilot of a streamlined business registration system in Entebbe, Uganda, reduced compliance costs for enterprises by 75 percent, raised registration numbers and fee revenue by 40 percent and reduced the cost of administering the system. It also reduced opportunities for corruption, improved relations between businesses and the local authorities and resulted in better compliance.

  8. Zephyr: A secure Internet-based process to streamline engineering procurements using the World Wide Web

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, C.W.; Cavitt, R.E.; Niven, W.A.; Warren, F.E.; Taylor, S.S.; Sharick, T.M.; Vickers, D.L.; Mitschkowetz, N.; Weaver, R.L.

    1996-08-13

    Lawrence Livermore National Laboratory (LLNL) is piloting an Internet-based paperless process called 'Zephyr' to streamline engineering procurements. Major benefits have accrued by using Zephyr in reducing procurement time, speeding the engineering development cycle, facilitating industrial collaboration, and reducing overall costs. Programs at LLNL are benefiting by the efficiencies introduced since implementing Zephyr's engineering and commerce on the Internet.

  9. Streamline topologies near simple degenerate critical points in two-dimensional flow away from boundaries

    DEFF Research Database (Denmark)

    Brøns, Morten; Hartnack, Johan Nicolai

    1998-01-01

    Streamline patterns and their bifurcations in two-dimensional incompressible flow are investigated from a topological point of view. The velocity field is expanded at a point in the fluid, and the expansion coefficients are considered as bifurcation parameters. A series of non-linear coordinate c...

  10. Streamline topologies near simple degenerate critical points in two-dimensional flow away from boundaries

    DEFF Research Database (Denmark)

    Brøns, Morten; Hartnack, Johan Nicolai

    1999-01-01

    Streamline patterns and their bifurcations in two-dimensional incompressible flow are investigated from a topological point of view. The velocity field is expanded at a point in the fluid, and the expansion coefficients are considered as bifurcation parameters. A series of nonlinear coordinate ch...

  11. Streamlined Total Synthesis of Trioxacarcins and Its Application to the Design, Synthesis, and Biological Evaluation of Analogues Thereof. Discovery of Simpler Designed and Potent Trioxacarcin Analogues.

    Science.gov (United States)

    Nicolaou, K C; Chen, Pengxi; Zhu, Shugao; Cai, Quan; Erande, Rohan D; Li, Ruofan; Sun, Hongbao; Pulukuri, Kiran Kumar; Rigol, Stephan; Aujay, Monette; Sandoval, Joseph; Gavrilyuk, Julia

    2017-11-01

    A streamlined total synthesis of the naturally occurring antitumor agents trioxacarcins is described, along with its application to the construction of a series of designed analogues of these complex natural products. Biological evaluation of the synthesized compounds revealed a number of highly potent, and yet structurally simpler, compounds that are effective against certain cancer cell lines, including a drug-resistant line. A novel one-step synthesis of anthraquinones and chloro anthraquinones from simple ketone precursors and phenylselenyl chloride is also described. The reported work, featuring novel chemistry and cascade reactions, has potential applications in cancer therapy, including targeted approaches as in antibody-drug conjugates.

  12. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  13. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  14. Automated Parallel Capillary Electrophoretic System

    Science.gov (United States)

    Li, Qingbo; Kane, Thomas E.; Liu, Changsheng; Sonnenschein, Bernard; Sharer, Michael V.; Kernan, John R.

    2000-02-22

    An automated electrophoretic system is disclosed. The system employs a capillary cartridge having a plurality of capillary tubes. The cartridge has a first array of capillary ends projecting from one side of a plate. The first array of capillary ends are spaced apart in substantially the same manner as the wells of a microtitre tray of standard size. This allows one to simultaneously perform capillary electrophoresis on samples present in each of the wells of the tray. The system includes a stacked, dual carousel arrangement to eliminate cross-contamination resulting from reuse of the same buffer tray on consecutive executions of electrophoresis. The system also has a gel delivery module, containing a gel syringe with a stepper motor or a high-pressure chamber with a pump, to quickly and uniformly deliver gel through the capillary tubes. The system further includes a multi-wavelength beam generator which produces a laser beam with a wide range of wavelengths. An off-line capillary reconditioner thoroughly cleans a capillary cartridge to enable simultaneous execution of electrophoresis with another capillary cartridge. The streamlined nature of the off-line capillary reconditioner offers the advantage of increased system throughput with a minimal increase in system cost.

  15. Robotic automation of medication-use management.

    Science.gov (United States)

    Enright, S M

    1993-11-01

    In the October 1993 issue of Physician Assistant, we published "Robots for Health Care," the first of two articles on the medical applications of robotics. That article discussed ways in which robots could help patients with manipulative disabilities to perform activities of daily living and hold paid employment; transfer patients from bed to chair and back again; add precision to the most exacting surgical procedures; and someday carry out diagnostic and therapeutic techniques from within the human body. This month, we are pleased to offer an article by Sharon Enright, an authority on pharmacy operations, who considers how an automated medication-management system that makes use of bar-code technology is capable of streamlining drug dispensing, controlling safety, increasing cost-effectiveness, and ensuring accurate and complete record-keeping.

  16. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers methods for building an automation plan and designing automation facilities; automation of chip processes, such as the basics of cutting, NC processing machines, and chip handling; automation units, such as drilling, tapping, boring, milling, and slide units; application of oil pressure (hydraulics), including its characteristics and basic hydraulic circuits; application of pneumatics; and the kinds of automation and their application to processes, assembly, transportation, automatic machines, and factory automation.

  17. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial in improvement of a robot workcell. Design automation of multi-function fingers is highly demanded by robot industries to overcome the current iterative, time consuming and complex manual design process. However, the existing approaches for the multi-function finger design automation are unable to entirely meet the robot industries’ need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked with existing approaches.

  18. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  19. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems. .

  20. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  1. Investing in the Future: Automation Marketplace 2009

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    In a year where the general economy presented enormous challenges, libraries continued to make investments in automation, especially in products that help improve what and how they deliver to their end users. Access to electronic content remains a key driver. In response to anticipated needs for new approaches to library automation, many companies…

  2. Quantification of bacteria on abiotic surfaces by laser scanning cytometry: An automated approach to screen the antifouling properties of new surface coatings

    DEFF Research Database (Denmark)

    Regina, Viduthalai R.; Poulsen, Morten; Søhoel, Helmer

    2012-01-01

    Bacterial biofilms are a persistent source of contamination, and much effort invested in developing antifouling surfaces or coatings. A bottle-neck in developing such coatings is often the time-consuming task of screening and evaluating a large number of surface materials. An automated high...

  3. Mean streamline analysis for performance prediction of cross-flow fans

    International Nuclear Information System (INIS)

    Kim, Jae Won; Oh, Hyoung Woo

    2004-01-01

    This paper presents the mean streamline analysis using the empirical loss correlations for performance prediction of cross-flow fans. Comparison of overall performance predictions with test data of a cross-flow fan system with a simplified vortex wall scroll casing and with the published experimental characteristics for a cross-flow fan has been carried out to demonstrate the accuracy of the proposed method. Predicted performance curves by the present mean streamline analysis agree well with experimental data for two different cross-flow fans over the normal operating conditions. The prediction method presented herein can be used efficiently as a tool for the preliminary design and performance analysis of general-purpose cross-flow fans

  4. A streamlined Western blot exercise: An efficient and greener approach in the laboratory classroom.

    Science.gov (United States)

    Ness, Traci L; Robinson, Rebekah L; Mojadedi, Wais; Peavy, Lydia; Weiland, Mitch H

    2015-01-01

    SDS-PAGE and western blotting are two commonly taught protein detection techniques in biochemistry and molecular biology laboratory classrooms. A pitfall associated with incorporating these techniques into the laboratory is the significant wait time, which does not allow students to obtain timely results. The waiting associated with SDS-PAGE comes from staining and destaining, whereas with western blotting it is the times required for antibody incubations and the numerous wash steps. This laboratory exercise incorporates 2,2,2-trichloroethanol (TCE) into the SDS-PAGE gel, allowing for visualization of migrated proteins in a matter of minutes, saving both the time and chemical waste associated with traditional Coomassie staining. Additionally, TCE staining does not affect protein transfer, eliminating the requirement for duplicated gels for total protein and western analyses. Protein transfer can be confirmed immediately without the use of Ponceau S staining. Lastly, this western blot procedure has been further shortened by using an HRP-conjugated primary antibody, which eliminates the secondary antibody incubation and washes, and uses colorimetric detection to allow for visualization by students without the need for specialized equipment. © 2015 The International Union of Biochemistry and Molecular Biology.

  5. Chemically Modified Bacteriophage as a Streamlined Approach to Noninvasive Breast Cancer Imaging

    Science.gov (United States)

    2013-12-01


  6. Project management for small business: a streamlined approach from planning to completion

    National Research Council Canada - National Science Library

    Phillips, Joseph

    2012-01-01

    ... a Project Management Plan; Developing the Work Breakdown Structure; Selecting Your Project Management Software. CHAPTER 4: MANAGING PROJECT COSTS. Building a Cost Management Frame...

  7. Automating quantum experiment control

    Science.gov (United States)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.

  8. A New Automated Instrument Calibration Facility at the Savannah River Site

    International Nuclear Information System (INIS)

    Polz, E.; Rushton, R.O.; Wilkie, W.H.; Hancock, R.C.

    1998-01-01

    The Health Physics Instrument Calibration Facility at the Savannah River Site in Aiken, SC was expressly designed and built to calibrate portable radiation survey instruments. The facility incorporates recent advances in automation technology, building layout and construction, and computer software to improve the calibration process. Nine new calibration systems automate instrument calibration and data collection. The building is laid out so that instruments are moved from one area to another in a logical, efficient manner. New software and hardware integrate all functions such as shipping/receiving, work flow, calibration, testing, and report generation. Benefits include a streamlined and integrated program, improved efficiency, reduced errors, and better accuracy

  9. Calculation of heat transfer in transversely stream-lined tube bundles with chess arrangement

    International Nuclear Information System (INIS)

    Migaj, V.K.

    1978-01-01

    A semiempirical theory of heat transfer in transversely stream-lined chess-board tube bundles has been developed. The theory is based on a single cylinder model and involves external flow parameter evaluation on the basis of the solidification principle of a vortex zone. The effect of turbulence is estimated according to experimental results. The method is extended to both average and local heat transfer coefficients. Comparison with experiment shows satisfactory agreement

  10. Streamline processing of discrete nuclear spectra by means of authoregularized iteration process (the KOLOBOK code)

    International Nuclear Information System (INIS)

    Gadzhokov, V.; Penev, I.; Aleksandrov, L.

    1979-01-01

    A brief description of the KOLOBOK computer code designed for streamline processing of discrete nuclear spectra with symmetric Gaussian shape of the single line on computers of the ES series, models 1020 and above, is given. The program solves the stream of discrete-spectrometry generated nonlinear problems by means of authoregularized iteration process. The Fortran-4 text of the code is reported in an Appendix

  11. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  12. The Effects of Propulsive Jetting on Drag of a Streamlined body

    Science.gov (United States)

    Krieg, Michael; Mohseni, Kamran

    2017-11-01

    Recently, an abundance of bioinspired underwater vehicles has emerged to leverage eons of evolution. Our group has developed a propulsion technique inspired by jellyfish and squid. Propulsive jets are generated by ingesting and expelling water from a flexible internal cavity. We have demonstrated thruster capabilities for maneuvering on AUV platforms, where the internal thruster geometry minimized forward drag; however, such a setup cannot characterize propulsive efficiency. Therefore, we created a new streamlined vehicle platform that produces unsteady jets for forward propulsion rather than maneuvering. The streamlined jetting body is placed in a water tunnel and held stationary while jetting frequency and background flow velocity are varied. For each frequency/velocity pair the flow field is measured around the surface and in the wake using PIV. Using the zero jetting frequency as a baseline for each background velocity, the passive body drag is related to the velocity distribution. For cases with active jetting the drag and jetting forces are estimated from the velocity field and compared to the passive case. For this streamlined body, the entrainment of surrounding flow into the propulsive jet can reduce drag forces in addition to the momentum transfer of the jet itself. Office of Naval Research.

  13. Development of a web-based tool for automated processing and cataloging of a unique combinatorial drug screen.

    Science.gov (United States)

    Dalecki, Alex G; Wolschendorf, Frank

    2016-07-01

    Facing totally resistant bacteria, traditional drug discovery efforts have proven to be of limited use in replenishing our depleted arsenal of therapeutic antibiotics. Recently, the natural anti-bacterial properties of metal ions in synergy with metal-coordinating ligands have shown potential for generating new molecule candidates with potential therapeutic downstream applications. We recently developed a novel combinatorial screening approach to identify compounds with copper-dependent anti-bacterial properties. Through a parallel screening technique, the assay distinguishes between copper-dependent and independent activities against Mycobacterium tuberculosis with hits being defined as compounds with copper-dependent activities. These activities must then be linked to a compound master list to process and analyze the data and to identify the hit molecules, a labor intensive and mistake-prone analysis. Here, we describe a software program built to automate this analysis in order to streamline our workflow significantly. We conducted a small, 1440 compound screen against M. tuberculosis and used it as an example framework to build and optimize the software. Though specifically adapted to our own needs, it can be readily expanded for any small- to medium-throughput screening effort, parallel or conventional. Further, by virtue of the underlying Linux server, it can be easily adapted for chemoinformatic analysis of screens through packages such as OpenBabel. Overall, this setup represents an easy-to-use solution for streamlining processing and analysis of biological screening data, as well as offering a scaffold for ready functionality expansion. Copyright © 2016 Elsevier B.V. All rights reserved.
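
    The hit-calling step that the software automates reduces to a join between the parallel readouts and the compound master list. The sketch below illustrates only that logic; the dictionary layout, well identifiers and the 50% inhibition cutoff are assumptions for illustration, not the authors' actual file formats or thresholds.

        def copper_dependent_hits(master, plus_cu, minus_cu, cutoff=50.0):
            """master: {well: compound_id}; plus_cu / minus_cu: {well: % growth inhibition}."""
            hits = []
            for well, compound in master.items():
                active_with_cu = plus_cu.get(well, 0.0) >= cutoff
                active_without_cu = minus_cu.get(well, 0.0) >= cutoff
                if active_with_cu and not active_without_cu:
                    hits.append(compound)      # copper-dependent activity only
            return hits

        master = {"A01": "CMPD-0001", "A02": "CMPD-0002", "A03": "CMPD-0003"}
        with_copper = {"A01": 92.0, "A02": 95.0, "A03": 12.0}
        without_copper = {"A01": 88.0, "A02": 7.0, "A03": 9.0}
        print(copper_dependent_hits(master, with_copper, without_copper))   # ['CMPD-0002']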

  14. The role of streamline curvature in sand dune dynamics: evidence from field and wind tunnel measurements

    Science.gov (United States)

    Wiggs, Giles F. S.; Livingstone, Ian; Warren, Andrew

    1996-09-01

    Field measurements on an unvegetated, 10 m high barchan dune in Oman are compared with measurements over a 1:200 scale fixed model in a wind tunnel. Both the field and wind tunnel data demonstrate similar patterns of wind and shear velocity over the dune, confirming significant flow deceleration upwind of and at the toe of the dune, acceleration of flow up the windward slope, and deceleration between the crest and brink. This pattern, including the widely reported upwind reduction in shear velocity, reflects observations of previous studies. Such a reduction in shear velocity upwind of the dune should result in a reduction in sand transport and subsequent sand deposition. This is not observed in the field. Wind tunnel modelling using a near-surface pulse-wire probe suggests that the field method of shear velocity derivation is inadequate. The wind tunnel results exhibit no reduction in shear velocity upwind of or at the toe of the dune. Evidence provided by Reynolds stress profiles and turbulence intensities measured in the wind tunnel suggest that this maintenance of upwind shear stress may be a result of concave (unstable) streamline curvature. These additional surface stresses are not recorded by the techniques used in the field measurements. Using the occurrence of streamline curvature as a starting point, a new 2-D model of dune dynamics is deduced. This model relies on the establishment of an equilibrium between windward slope morphology, surface stresses induced by streamline curvature, and streamwise acceleration. Adopting the criteria that concave streamline curvature and streamwise acceleration both increase surface shear stress, whereas convex streamline curvature and deceleration have the opposite effect, the relationships between form and process are investigated in each of three morphologically distinct zones: the upwind interdune and concave toe region of the dune, the convex portion of the windward slope, and the crest-brink region. The

  15. Brownfields Assessing Contractor Capabilities for Streamlined Site Investigation: Additional Information Regarding All Appropriate Inquiries and Hiring an Environmental Professional

    Science.gov (United States)

    This document assists Brownfields grantees and other decision makers as they assess the capabilities of contractors and consultants and determine their qualifications to provide streamlined and innovative strategies for site assessment and cleanup.

  16. Evaluation of Automated Flagger Assistance Devices

    Science.gov (United States)

    2018-02-01

    Automated flagger assistance devices (AFADs) are designed to improve worker safety by replacing flaggers who are typically located near traffic approaching a work zone. In this study, a new AFAD developed by the Missouri Department of Transportation ...

  17. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Keyser, John

    2013-01-01

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation

  18. System analysis of automated speed enforcement implementation.

    Science.gov (United States)

    2016-04-01

    Speeding is a major factor in a large proportion of traffic crashes, injuries, and fatalities in the United States. Automated Speed Enforcement (ASE) is one of many approaches shown to be effective in reducing speeding violations and crashes. However...

  19. Guessing right for the next war: streamlining, pooling, and right-timing force design decisions for an environment of uncertainty

    Science.gov (United States)

    2017-05-25

    key ingredients for not only how the Army fought World War II, but also how it continues to organize today. In essence, streamlining pares down every... Germans. The Battle of Mortain reflected the US Army in World War II at its best. It defined US Army success in the European theater of operations... continues to organize today. In essence, streamlining pared down every unit to its essentials based around a critical capability it provided to

  20. Development of an Integrated Approach to Routine Automation of Neutron Activation Analysis. Results of a Coordinated Research Project. Companion CD-ROM. Annex II: Country Reports

    International Nuclear Information System (INIS)

    2018-04-01

    Neutron activation analysis (NAA) is a powerful technique for determining bulk composition of major and trace elements. Automation may contribute significantly to keep NAA competitive for end-users. It provides opportunities for a larger analytical capacity and a shorter overall turnaround time if large series of samples have to be analysed. This publication documents and disseminates the expertise generated on automation in NAA during a coordinated research project (CRP). The CRP participants presented different cost-effective designs of sample changers for gamma-ray spectrometry as well as irradiation devices, and were able to construct and successfully test these systems. They also implemented, expanded and improved quality control and quality assurance as cross-cutting topical area of their automated NAA procedures. The publication serves as a reference of interest to NAA practitioners, experts, and research reactor personnel, but also to various stakeholders and users interested in basic research and/or services provided by NAA. This CD-ROM contains the individual country reports

  1. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Bender, M.A.

    1976-01-01

    Though some, at least, of the goals of automation systems for analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or at least likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, await practical methods of automation. Important as are the clinical diagnostic applications, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are being increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed

  2. Computer automation and artificial intelligence

    International Nuclear Information System (INIS)

    Hasnain, S.B.

    1992-01-01

    Rapid advances in computing, resulting from the microchip revolution, have increased its applications manifold, particularly for computer automation. Yet the level of automation available has limited its application to more complex and dynamic systems, which require intelligent computer control. In this paper a review of artificial intelligence techniques used to augment automation is presented. The current sequential processing approach usually adopted in artificial intelligence has succeeded in emulating the symbolic processing part of intelligence, but the processing power required to capture the more elusive aspects of intelligence leads towards parallel processing. An overview of parallel processing, with emphasis on the transputer, is also provided. A fuzzy knowledge-based controller for amination drug delivery in muscle relaxant anesthesia, implemented on a transputer, is described. 4 figs. (author)

  3. Automated analysis of autoradiographic imagery

    International Nuclear Information System (INIS)

    Bisignani, W.T.; Greenhouse, S.C.

    1975-01-01

    A research programme is described which has as its objective the automated characterization of neurological tissue regions from autoradiographs by utilizing hybrid-resolution image processing techniques. An experimental system is discussed which includes raw imagery, scanning and digitizing equipment, feature-extraction algorithms, and regional characterization techniques. The parameters extracted by these algorithms are presented as well as the regional characteristics which are obtained by operating on the parameters with statistical sampling techniques. An approach is presented for validating the techniques and initial experimental results are obtained from an analysis of an autoradiograph of a region of the hypothalamus. An extension of these automated techniques to other biomedical research areas is discussed as well as the implications of applying automated techniques to biomedical research problems. (author)

  4. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  5. Development of an unbiased, semi-automated approach for classifying plasma cell immunophenotype following multicolor flow cytometry of bone marrow aspirates.

    Science.gov (United States)

    Post, Steven R; Post, Ginell R; Nikolic, Dejan; Owens, Rebecca; Insuasti-Beltran, Giovanni

    2018-03-24

    Despite increased usage of multiparameter flow cytometry (MFC) to assess diagnosis, prognosis, and therapeutic efficacy (minimal residual disease, MRD) in plasma cell neoplasms (PCNs), standardization of methodology and data analysis is suboptimal. We investigated the utility of using the mean and median fluorescence intensities (FI) obtained from MFC to objectively describe parameters that distinguish plasma cell (PC) phenotypes. In this retrospective study, flow cytometry results from bone marrow aspirate specimens from 570 patients referred to the Myeloma Institute at UAMS were evaluated. Mean and median FI data were obtained from 8-color MFC of non-neoplastic, malignant, and mixed PC populations using antibodies to CD38, CD138, CD19, CD20, CD27, CD45, CD56, and CD81. Of the 570 cases, 252 showed only non-neoplastic PCs, 168 showed only malignant PCs, and 150 showed mixed PC populations. Statistical analysis of median FI data for each CD marker showed no difference in expression intensity on either non-neoplastic or malignant PCs between pure and mixed PC populations. ROC analysis of the median FI of CD expression in non-neoplastic and malignant PCs was used to develop an algorithm that converts quantitative FI values to qualitative assessments, including "negative," "positive," "dim," and "heterogeneous" expression. FI data derived from 8-color MFC can thus be used to define marker expression on PCs. Translation of FI data from Infinicyt software to an Excel worksheet streamlines workflow and eliminates transcriptional errors when generating flow reports. © 2018 International Clinical Cytometry Society.
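
    A minimal sketch of how ROC-style cutoffs could turn median FI values into the qualitative calls listed above. The per-marker cutoffs, the dispersion-based "heterogeneous" rule and the example intensities are invented placeholders, not the published algorithm or thresholds.

        def classify_marker(median_fi, negative_cutoff, positive_cutoff, cv=None, cv_limit=1.0):
            """Return 'negative', 'dim', 'positive' or 'heterogeneous' for one CD marker."""
            if cv is not None and cv > cv_limit:
                return "heterogeneous"            # very broad distribution across the PC population
            if median_fi < negative_cutoff:
                return "negative"
            if median_fi < positive_cutoff:
                return "dim"
            return "positive"

        plasma_cell_fi = {"CD19": 120.0, "CD56": 8500.0, "CD45": 950.0}          # median FI per marker
        cutoffs = {"CD19": (400.0, 2000.0), "CD56": (300.0, 1500.0), "CD45": (500.0, 3000.0)}
        report = {m: classify_marker(fi, *cutoffs[m]) for m, fi in plasma_cell_fi.items()}
        print(report)   # {'CD19': 'negative', 'CD56': 'positive', 'CD45': 'dim'}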

  6. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  7. Stable–streamlined and helical cavities following the impact of Leidenfrost spheres

    KAUST Repository

    Mansoor, Mohammad M.

    2017-06-23

    We report results from an experimental study on the formation of stable–streamlined and helical cavity wakes following the free-surface impact of Leidenfrost spheres. Similar to the observations of Mansoor et al. (J. Fluid Mech., vol. 743, 2014, pp. 295–326), we show that acoustic ripples form along the interface of elongated cavities entrained in the presence of wall effects as soon as the primary cavity pinch-off takes place. The crests of these ripples can act as favourable points for closure, producing multiple acoustic pinch-offs, which are found to occur in an acoustic pinch-off cascade. We show that these ripples pacify with time in the absence of physical contact between the sphere and the liquid, leading to extremely smooth cavity wake profiles. More importantly, the downward-facing jet at the apex of the cavity is continually suppressed due to a skin-friction drag effect at the colliding cavity-wall junction, which ultimately produces a stable–streamlined cavity wake. This streamlined configuration is found to experience drag coefficients an order of magnitude lower than those acting on room-temperature spheres. A striking observation is the formation of helical cavities which occur for impact Reynolds numbers and are characterized by multiple interfacial ridges, stemming from and rotating synchronously about an evident contact line around the sphere equator. The contact line is shown to result from the degeneration of Kelvin–Helmholtz billows into turbulence which are observed forming along the liquid–vapour interface around the bottom hemisphere of the sphere. Using sphere trajectory measurements, we show that this helical cavity wake configuration has 40 %–55 % smaller force coefficients than those obtained in the formation of stable cavity wakes.

  8. Stable–streamlined and helical cavities following the impact of Leidenfrost spheres

    KAUST Repository

    Mansoor, Mohammad M.; Vakarelski, Ivan Uriev; Marston, J. O.; Truscott, T. T.; Thoroddsen, Sigurdur T

    2017-01-01

    We report results from an experimental study on the formation of stable–streamlined and helical cavity wakes following the free-surface impact of Leidenfrost spheres. Similar to the observations of Mansoor et al. (J. Fluid Mech., vol. 743, 2014, pp. 295–326), we show that acoustic ripples form along the interface of elongated cavities entrained in the presence of wall effects as soon as the primary cavity pinch-off takes place. The crests of these ripples can act as favourable points for closure, producing multiple acoustic pinch-offs, which are found to occur in an acoustic pinch-off cascade. We show that these ripples pacify with time in the absence of physical contact between the sphere and the liquid, leading to extremely smooth cavity wake profiles. More importantly, the downward-facing jet at the apex of the cavity is continually suppressed due to a skin-friction drag effect at the colliding cavity-wall junction, which ultimately produces a stable–streamlined cavity wake. This streamlined configuration is found to experience drag coefficients an order of magnitude lower than those acting on room-temperature spheres. A striking observation is the formation of helical cavities which occur for impact Reynolds numbers and are characterized by multiple interfacial ridges, stemming from and rotating synchronously about an evident contact line around the sphere equator. The contact line is shown to result from the degeneration of Kelvin–Helmholtz billows into turbulence which are observed forming along the liquid–vapour interface around the bottom hemisphere of the sphere. Using sphere trajectory measurements, we show that this helical cavity wake configuration has 40 %–55 % smaller force coefficients than those obtained in the formation of stable cavity wakes.

  9. Theoretical Calculations on Sediment Transport on Titan, and the Possible Production of Streamlined Forms

    Science.gov (United States)

    Burr, D. M.; Emery, J. P.; Lorenz, R. D.

    2005-01-01

    The Cassini Imaging Science System (ISS) has been returning images of Titan, along with other Saturnian satellites. Images taken through the 938 nm methane window see down to Titan's surface. One of the purposes of the Cassini mission is to investigate possible fluid cycling on Titan. Lemniscate features shown recently and radar evidence of surface flow prompted us to consider theoretically the creation by methane fluid flow of streamlined forms on Titan. This follows work by other groups in theoretical consideration of fluid motion on Titan's surface.

  10. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only a few of them consider dependencies from outside the Code Under Test’s scope such...
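
    As a concrete illustration of one approach named in the abstract (combinatorial testing), the sketch below builds a greedy pairwise-covering test suite for a small parameter model. The parameters and values are invented, and production tools use far more refined strategies.

        from itertools import combinations, product

        def pairwise_tests(parameters):
            """Greedy 2-wise covering: every value pair of every parameter pair appears in some test."""
            names = list(parameters)
            uncovered = {(a, b): set(product(parameters[a], parameters[b]))
                         for a, b in combinations(names, 2)}
            tests = []
            while any(uncovered.values()):
                best, best_gain = None, -1
                for candidate in product(*(parameters[n] for n in names)):
                    assignment = dict(zip(names, candidate))
                    gain = sum((assignment[a], assignment[b]) in pairs
                               for (a, b), pairs in uncovered.items())
                    if gain > best_gain:
                        best, best_gain = assignment, gain
                for (a, b), pairs in uncovered.items():
                    pairs.discard((best[a], best[b]))
                tests.append(best)
            return tests

        params = {"browser": ["firefox", "chrome"], "os": ["linux", "windows", "macos"], "locale": ["en", "de"]}
        for test in pairwise_tests(params):
            print(test)       # a handful of tests instead of all 12 full combinations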

  11. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  12. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  13. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  14. Automating linear accelerator quality assurance.

    Science.gov (United States)

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was twofold. The first purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold-off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC
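
    The core of the log-file analysis can be thought of as comparing expected and actual component positions against tolerance thresholds. The sketch below shows only that pattern; the record layout and tolerance values are invented, and reading real TrueBeam trajectory logs requires the vendor's binary format.

        TOLERANCES_MM = {"mlc_leaf": 1.0, "jaw": 1.0, "gantry_sag": 0.5}   # illustrative limits

        def check_log(records):
            """records: iterable of dicts with 'component', 'expected_mm' and 'actual_mm'."""
            failures = []
            for rec in records:
                deviation = abs(rec["actual_mm"] - rec["expected_mm"])
                limit = TOLERANCES_MM.get(rec["component"])
                if limit is not None and deviation > limit:
                    failures.append((rec["component"], round(deviation, 3), limit))
            return failures

        sample = [{"component": "mlc_leaf", "expected_mm": 35.0, "actual_mm": 35.2},
                  {"component": "jaw", "expected_mm": 100.0, "actual_mm": 101.4}]
        print(check_log(sample))   # [('jaw', 1.4, 1.0)]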

  15. A multi-stage approach to maximizing geocoding success in a large population-based cohort study through automated and interactive processes

    Directory of Open Access Journals (Sweden)

    Jennifer S. Sonderman

    2012-05-01

    To enable spatial analyses within a large, prospective cohort study of nearly 86,000 adults enrolled in a 12-state area in the southeastern United States of America from 2002-2009, a multi-stage geocoding protocol was developed to efficiently maximize the proportion of participants assigned an address-level geographic coordinate. Addresses were parsed, cleaned and standardized before applying a combination of automated and interactive geocoding tools. Our full protocol increased the non-Post Office (PO) Box match rate from 74.5% to 97.6%. Overall, we geocoded 99.96% of participant addresses, with only 5.2% at the ZIP code centroid level (2.8% PO Box and 2.3% non-PO Box addresses). One key to reducing the need for interactive geocoding was the use of multiple base maps. Still, addresses in areas with population density 920 persons/km2 (odds ratio (OR) = 5.24; 95% confidence interval (CI): 4.23, 6.49) were more likely to require interactive geocoding, as were addresses collected from participants during in-person interviews compared with mailed questionnaires (OR = 1.83; 95% CI: 1.59, 2.11). This study demonstrates that population density and address ascertainment method can influence automated geocoding results and that high success in address level geocoding is achievable for large-scale studies covering wide geographical areas.
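
    A schematic sketch of a geocoding cascade of the kind described: standardize the address, attempt automated matching against several base maps, queue failures for interactive review, and fall back to a ZIP-centroid match. The geocoder functions and ZIP handling below are placeholders, not the study's actual tools or base maps.

        def standardize(address):
            return " ".join(address.upper().replace(".", "").split())

        def geocode_cascade(address, base_maps, interactive_queue, zip_centroids):
            addr = standardize(address)
            for name, geocoder in base_maps:              # stage 1: automated, multiple base maps
                hit = geocoder(addr)
                if hit is not None:
                    return {"coords": hit, "level": "address", "source": name}
            interactive_queue.append(addr)                # stage 2: flag for interactive geocoding
            zip_code = addr.split()[-1]
            if zip_code in zip_centroids:                 # stage 3: coarse fallback
                return {"coords": zip_centroids[zip_code], "level": "zip_centroid", "source": "zip"}
            return {"coords": None, "level": "unmatched", "source": None}

        queue = []
        maps = [("county_parcels", lambda a: None), ("streets_2010", lambda a: (36.16, -86.78))]
        print(geocode_cascade("123 Main St., Nashville TN 37203", maps, queue, {"37203": (36.15, -86.79)}))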

  16. A methodological approach to automated and integrated manufacturing systems development

    Directory of Open Access Journals (Sweden)

    Marco Antonio Busetti de Paula

    2008-01-01

    This paper presents a design methodology for automated and integrated manufacturing systems. The methodology consists of a cyclic development in three stages - modeling, synthesis and implementation - repeated until the application demanded of the real system is met, resulting in the design of the automated and integrated system. This form of development allows a continuous review of the results obtained at each stage. To test and validate the methodology, an example is given of the redesign of a prototype manufacturing system prompted by the need to introduce a new product.

  17. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    in operating situations involving a procedural type of activity was precise. The measure of operators' trust in the automation yielded no significant results in relation to the procedural type of activity scenarios. Based on the experience and outcome of this experiment, a preliminary approach for defining automation types is proposed. In addition, a set of hypotheses about how automation influences operator performance is derived from the outcome of the experiment, and a set of recommendations for future studies is offered (author) (ml)

  18. Streamlining the process: A strategy for making NEPA work better and cost less

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, R.P.; Hansen, J.D. [Hansen Environmental Consultants, Englewood, CO (United States); Wolff, T.A. [Sandia National Labs., Albuquerque, NM (United States)

    1998-05-01

    When the National Environmental Policy Act (NEPA) was enacted in 1969, neither Congress nor the Federal Agencies affected anticipated that implementation of the NEPA process would result in the intolerable delays, inefficiencies, duplication of effort, commitments of excessive financial and personnel resources, and bureaucratic gridlock that have become institutionalized. The 1975 Council on Environmental Quality (CEQ) regulations, which were intended to make the NEPA process more efficient and more useful to decision makers and the public, have either been largely ignored or unintentionally subverted. Agency policy mandates, like those of former Secretary of Energy Hazel R. O'Leary, to "make NEPA work better and cost less" have, so far, been disappointingly ineffectual. Federal Agencies have reached the point where almost every constituent of the NEPA process must be subjected to crisis management. This paper focuses on a ten-point strategy for streamlining the NEPA process in order to achieve the Act's objectives while easing the considerable burden on agencies, the public, and the judicial system. How the ten points are timed and implemented is critical to any successful streamlining.

  19. The impact of groundwater velocity fields on streamlines in an aquifer system with a discontinuous aquitard (Inner Mongolia, China)

    Science.gov (United States)

    Wu, Qiang; Zhao, Yingwang; Xu, Hua

    2018-04-01

    Many numerical methods that simulate groundwater flow, particularly the continuous Galerkin finite element method, do not produce velocity information directly. Many algorithms have been proposed to improve the accuracy of velocity fields computed from hydraulic potentials. The differences in the streamlines generated from velocity fields obtained using different algorithms are presented in this report. The superconvergence method employed by FEFLOW, a popular commercial code, and some dual-mesh methods proposed in recent years are selected for comparison. Applications in which streamlines are used to depict hydrogeologic conditions are examined, and errors in streamlines are shown to lead to notable errors in boundary conditions, the locations of material interfaces, fluxes and conductivities. Furthermore, the effects of the procedures used in these two types of methods, including velocity integration and local conservation, are analyzed. The method of interpolating velocities across edges using fluxes is shown to be able to eliminate errors associated with refraction points that are not located along material interfaces and streamline ends at no-flow boundaries. Local conservation is shown to be a crucial property of velocity fields and can result in more accurate streamline densities. A case study involving both three-dimensional and two-dimensional cross-sectional models of a coal mine in Inner Mongolia, China, is used to support the conclusions presented.
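
    To make the dependence of streamlines on the underlying velocity field concrete, the sketch below traces a path through a two-dimensional gridded velocity field using Runge-Kutta stepping and simple bilinear sampling. A locally conservative, flux-based interpolation of the kind favoured in the abstract would replace the bilinear sampler; the grid and field here are synthetic.

        import numpy as np

        def sample(field, x, y, dx, dy):
            """Bilinear interpolation of a node-centred field at point (x, y)."""
            i, j = x / dx, y / dy
            i0 = int(np.clip(np.floor(i), 0, field.shape[0] - 2))
            j0 = int(np.clip(np.floor(j), 0, field.shape[1] - 2))
            fi, fj = i - i0, j - j0
            return ((1 - fi) * (1 - fj) * field[i0, j0] + fi * (1 - fj) * field[i0 + 1, j0]
                    + (1 - fi) * fj * field[i0, j0 + 1] + fi * fj * field[i0 + 1, j0 + 1])

        def trace_streamline(u, v, start, dx, dy, dt=0.1, steps=500):
            """Fourth-order Runge-Kutta particle tracking through the (u, v) field."""
            path = [np.asarray(start, dtype=float)]
            vel = lambda q: np.array([sample(u, q[0], q[1], dx, dy), sample(v, q[0], q[1], dx, dy)])
            for _ in range(steps):
                p = path[-1]
                k1 = vel(p); k2 = vel(p + 0.5 * dt * k1)
                k3 = vel(p + 0.5 * dt * k2); k4 = vel(p + dt * k3)
                path.append(p + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0)
            return np.array(path)

        # Synthetic rotational field on a 50 x 50 grid with 1 m spacing.
        x, y = np.meshgrid(np.arange(50.0), np.arange(50.0), indexing="ij")
        u, v = -(y - 25.0), (x - 25.0)
        print(trace_streamline(u, v, start=(40.0, 25.0), dx=1.0, dy=1.0)[:3])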

  20. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  1. Using process-oriented interfaces for solving the automation paradox in highly automated navy vessels

    NARCIS (Netherlands)

    Diggelen, J. van; Post, W.; Rakhorst, M.; Plasmeijer, R.; Staal, W. van

    2014-01-01

    This paper describes a coherent engineering method for developing high level human machine interaction within a highly automated environment consisting of sensors, actuators, automatic situation assessors and planning devices. Our approach combines ideas from cognitive work analysis, cognitive

  2. Evaluation of right ventricular function by coronary computed tomography angiography using a novel automated 3D right ventricle volume segmentation approach: a validation study.

    Science.gov (United States)

    Burghard, Philipp; Plank, Fabian; Beyer, Christoph; Müller, Silvana; Dörler, Jakob; Zaruba, Marc-Michael; Pölzl, Leo; Pölzl, Gerhard; Klauser, Andrea; Rauch, Stefan; Barbieri, Fabian; Langer, Christian-Ekkehardt; Schgoer, Wilfried; Williamson, Eric E; Feuchtner, Gudrun

    2018-06-04

    To evaluate right ventricle (RV) function by coronary computed tomography angiography (CTA) using a novel automated three-dimensional (3D) RV volume segmentation tool in comparison with clinical reference modalities. Twenty-six patients with severe end-stage heart failure [reduced left ventricle (LV) ejection fraction (EF)] were assessed with CTA, transthoracic echocardiography (TTE) and right heart invasive catheterisation (IC). Automated 3D RV volume segmentation was successful in 26 (100%) patients. Read-out time was 3 min 33 s (range, 1 min 50 s-4 min 33 s). RV EF by CTA was more strongly correlated with right atrial pressure (RAP) by IC (r = -0.595; p = 0.006) than with TAPSE (r = 0.366, p = 0.94). When comparing TAPSE with RAP by IC (r = -0.317, p = 0.231), a weak-to-moderate, non-significant inverse correlation was found. Interobserver correlation was high (r = 0.96). Attenuation of the right atrium (RA) and right ventricle (RV) was 196.9 ± 75.3 and 217.5 ± 76.1 HU, respectively. Measurement of RV function by CTA using a novel 3D volumetric segmentation tool is fast and reliable when a dedicated biphasic injection protocol is applied. The RV EF from CTA is a closer surrogate of RAP than TAPSE by TTE. • Evaluation of RV function by cardiac CTA using a novel 3D volume segmentation tool is fast and reliable. • A biphasic contrast agent injection protocol ensures homogeneous RV contrast attenuation. • Cardiac CT is a valuable alternative modality to CMR for the evaluation of RV function.
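
    The functional metric reported above rests on a simple volume relation: ejection fraction computed from the end-diastolic and end-systolic volumes of the segmented ventricle, sketched below with invented example volumes rather than patient data from the study.

        def ejection_fraction(edv_ml, esv_ml):
            """Ejection fraction (%) from end-diastolic and end-systolic volumes."""
            stroke_volume = edv_ml - esv_ml
            return 100.0 * stroke_volume / edv_ml

        print(f"RV EF = {ejection_fraction(edv_ml=180.0, esv_ml=120.0):.1f} %")   # 33.3 %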

  3. Automated Scheduling Via Artificial Intelligence

    Science.gov (United States)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.

  4. LC-HR-MS/MS standard urine screening approach: Pros and cons of automated on-line extraction by turbulent flow chromatography versus dilute-and-shoot and comparison with established urine precipitation.

    Science.gov (United States)

    Helfer, Andreas G; Michely, Julian A; Weber, Armin A; Meyer, Markus R; Maurer, Hans H

    2017-02-01

    Comprehensive urine screening for drugs and metabolites by LC-HR-MS/MS using Orbitrap technology has been described with precipitation as a simple workup. In order to speed up, automate, and/or simplify the workup, on-line extraction by turbulent flow chromatography and a dilute-and-shoot approach were developed and compared. After chromatographic separation within 10 min, the Q-Exactive mass spectrometer was run in full scan mode with positive/negative switching and subsequent data-dependent acquisition mode. The workup approaches were validated concerning selectivity, recovery, matrix effects, process efficiency, and limits of identification and detection for typical drug representatives and metabolites. The total workup time for on-line extraction was 6 min, and for the dilution approach 3 min. For comparison, the established urine precipitation and evaporation lasted 10 min. The validation results were acceptable. The limits for on-line extraction were comparable with those described for precipitation, but lower than for dilution. Thanks to the high sensitivity of the LC-HR-MS/MS system, all three workup approaches were sufficient for comprehensive urine screening and allowed fast, reliable, and reproducible detection of cardiovascular drugs, drugs of abuse, and other CNS-acting drugs after common doses. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Shuttle Repair Tools Automate Vehicle Maintenance

    Science.gov (United States)

    2013-01-01

    Successfully building, flying, and maintaining the space shuttles was an immensely complex job that required a high level of detailed, precise engineering. After each shuttle landed, it entered a maintenance, repair, and overhaul (MRO) phase. Each system was thoroughly checked and tested, and worn or damaged parts replaced, before the shuttle was rolled out for its next mission. During the MRO period, workers needed to record exactly what needed replacing and why, as well as follow precise guidelines and procedures in making their repairs. That meant traceability, and with it lots of paperwork. In 2007, the number of reports generated during electrical system repairs was getting out of hand-placing among the top three systems in terms of paperwork volume. Repair specialists at Kennedy Space Center were unhappy spending so much time at a desk and so little time actually working on the shuttle. "Engineers weren't spending their time doing technical work," says Joseph Schuh, an electrical engineer at Kennedy. "Instead, they were busy with repetitive, time-consuming processes that, while important in their own right, provided a low return on time invested." The strain of such inefficiency was bad enough that slow electrical repairs jeopardized rollout on several occasions. Knowing there had to be a way to streamline operations, Kennedy asked Martin Belson, a project manager with 30 years experience as an aerospace contractor, to co-lead a team in developing software that would reduce the effort required to document shuttle repairs. The result was System Maintenance Automated Repair Tasks (SMART) software. SMART is a tool for aggregating and applying information on every aspect of repairs, from procedures and instructions to a vehicle s troubleshooting history. Drawing on that data, SMART largely automates the processes of generating repair instructions and post-repair paperwork. In the case of the space shuttle, this meant that SMART had 30 years worth of operations

  6. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States1

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-01-01

    Premise of the study: Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. Methods and Results: A set of command line–based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. Conclusions: WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion. PMID:26949580
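
    The wrapper pattern described here, driving external reconstruction programs from scripts with standardized input and output, can be sketched as follows. The executable name, argument order and command keywords are placeholders, not the actual Mesquite, BayesTraits or TreeGraph2 interfaces.

        import subprocess

        def run_reconstruction(tree_file, data_file, commands, executable="reconstruct_tool"):
            """Feed a command script to an external reconstruction program and return its stdout."""
            script = "\n".join(commands) + "\n"
            result = subprocess.run([executable, tree_file, data_file],
                                    input=script, capture_output=True, text=True, check=True)
            return result.stdout

        # Hypothetical usage (command keywords invented for illustration):
        # output = run_reconstruction("trees.nex", "traits.txt", ["MultiState", "MCMC", "Run"])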

  7. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  8. Analytical Work in Support of the Design and Operation of Two Dimensional Self Streamlining Test Sections

    Science.gov (United States)

    Judd, M.; Wolf, S. W. D.; Goodyer, M. J.

    1976-01-01

    A method has been developed for accurately computing the imaginary flow fields outside a flexible walled test section, applicable to lifting and non-lifting models. The tolerances in the setting of the flexible walls introduce only small levels of aerodynamic interference at the model. While it is not possible to apply corrections for the interference effects, they may be reduced by improving the setting accuracy of the portions of wall immediately above and below the model. Interference effects of the truncation of the length of the streamlined portion of a test section are brought to an acceptably small level by the use of a suitably long test section with the model placed centrally.

  9. Streamlining interventional radiology admissions: The role of the interventional radiology clinic and physician's assistant

    International Nuclear Information System (INIS)

    White, R.I. Jr.; Rizer, D.M.; Shuman, K.; White, E.J.; Adams, P.; Doyle, K.; Kinnison, M.

    1987-01-01

    During a 5-year period (1982-1987), 376 patients were admitted to an interventional radiology service where they were managed by the senior physician and interventional radiology fellows. Sixty-eight percent of patients were admitted for angioplasty and 32% for elective embolotherapy/diagnostic angiography. A one-half-day, twice weekly interventional radiology clinic and employment of a physician's assistant who performed preadmission history and physicals and wrote orders accounted, in part, for a decrease in hospital stay length from 3.74 days (1982-1983) to 2.41 days (1986-1987). The authors conclude that use of the clinic and the physician's assistant streamlines patient flow and the admitting process and is partially responsible for a decreased length of stay for patients admitted to an interventional radiology service

  10. Streamlining and Standardizing Due Diligence to Ensure Quality of PV Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-28

    Those investing in PV power plants would like to have confidence that the plants will provide the anticipated return on investment. While due diligence is capably performed by independent engineers today, as PV systems mature, there will be benefit in standardization and streamlining of this process. The IECRE has defined technical information that is needed as a basis for each transaction step such as approving a design to begin construction, documenting readiness to operate, quantifying performance after a year of operation, and assessing the health of the plant in preparation for sale of the plant. The technical requirements have been defined by IEC Technical Committee 82 and have been designed to be both effective and efficient in completing the assessments. This workshop will describe these new tools that are now available to the community and will include a panel/audience discussion about how and when they can be most effectively used.

  11. Roof Box Shape Streamline Adaptation and the Impact towards Fuel Consumption

    Directory of Open Access Journals (Sweden)

    Abdul Latif M.F.

    2017-01-01

    The recent fuel price hike is a sensational national issue in Malaysia. Since the rationalization of fuel subsidies, many have been affected, especially middle-income families. Vehicle aerodynamics is directly related to fuel consumption: extra frontal area results in a higher drag force and hence higher fuel consumption. A roof box is among the largest contributors to this extra drag, so rationalizing the roof box shape is a prominent way to reduce it. The idea of adopting a water-drop shape for the roof box design shows promising results. The roof box was simulated in a MIRA virtual wind tunnel model using a commercial computational fluid dynamics (CFD) package. The streamlined shape drastically reduces the drag force by 34%, resulting in a 1.7% fuel saving compared with a conventional boxy roof box. This is an effort to reduce the carbon footprint for a sustainable, greener world.
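
    A back-of-envelope check of how a drag reduction translates into power at highway speed uses the standard relation F = 0.5 * rho * Cd * A * v**2. The air density, speed, drag coefficient and frontal area below are illustrative assumptions, not the paper's CFD results; only the 34% drag reduction is taken from the abstract.

        RHO = 1.2            # air density, kg/m^3
        SPEED = 110 / 3.6    # 110 km/h in m/s

        def added_drag_power(cd, frontal_area_m2, speed=SPEED, rho=RHO):
            force = 0.5 * rho * cd * frontal_area_m2 * speed ** 2    # drag force, N
            return force * speed                                     # power to overcome it, W

        boxy = added_drag_power(cd=0.55, frontal_area_m2=0.35)                      # assumed boxy roof box
        streamlined = added_drag_power(cd=0.55 * (1 - 0.34), frontal_area_m2=0.35)  # 34% drag cut
        print(f"extra power: boxy {boxy:.0f} W, streamlined {streamlined:.0f} W, saving {boxy - streamlined:.0f} W")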

  12. Easy XMM-Newton Data Analysis with the Streamlined ABC Guide!

    Science.gov (United States)

    Valencic, Lynne A.; Snowden, Steven L.; Pence, William D.

    2016-01-01

    The US XMM-Newton GOF has streamlined the time-honored XMM-Newton ABC Guide, making it easier to find and use what users may need to analyze their data. It takes into account what type of data a user might have, if they want to reduce the data on their own machine or over the internet with Web Hera, and if they prefer to use the command window or a GUI. The GOF has also included an introduction to analyzing EPIC and RGS spectra, and PN Timing mode data. The guide is provided for free to students, educators, and researchers for educational and research purposes. Try it out at: http://heasarc.gsfc.nasa.gov/docs/xmm/sl/intro.html

  13. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24h after onset of symptom from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, arrival time independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with estimated BAT by a single observer. In addition, gamma-variate curve-fitting errors of AIF and inter-subject variability of AIFs were analyzed. Lastly, two observes independently assessed the quality and area of hypoperfusion mismatched with restricted diffusion area from motion corrected MTT maps and compared that with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9+/-2.5s (mean+/-s.d.) without motion correction and 267+/-80s (mean+/-s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
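
    One building block mentioned above, estimating the bolus arrival time (BAT) and fitting the bolus passage, can be illustrated with a generic gamma-variate model on synthetic data, as sketched below. This is a textbook formulation, not the authors' piecewise-regression, deconvolution or AIF-search implementation.

        import numpy as np
        from scipy.optimize import curve_fit

        def gamma_variate(t, t0, k, alpha, beta):
            dt = np.clip(t - t0, 0.0, None)
            return k * dt ** alpha * np.exp(-dt / beta)

        def estimate_bat(curve, threshold_fraction=0.1):
            """Crude BAT estimate: first sample exceeding a fraction of the curve's peak."""
            return int(np.argmax(curve > threshold_fraction * curve.max()))

        t = np.arange(0, 60, 1.0)                                        # seconds
        truth = gamma_variate(t, t0=12.0, k=3.0, alpha=2.5, beta=3.0)    # synthetic concentration curve
        noisy = truth + np.random.default_rng(0).normal(0.0, 0.05, t.size)

        bat_index = estimate_bat(noisy)
        p0 = (float(bat_index), 1.0, 2.0, 2.0)                           # initial guess seeded by BAT
        params, _ = curve_fit(gamma_variate, t, noisy, p0=p0,
                              bounds=([0.0, 0.0, 0.1, 0.1], [30.0, 100.0, 10.0, 20.0]))
        print("estimated BAT index:", bat_index, "fitted t0, k, alpha, beta:", np.round(params, 2))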

  14. Fully-automated approach to hippocampus segmentation using a graph-cuts algorithm combined with atlas-based segmentation and morphological opening.

    Science.gov (United States)

    Kwak, Kichang; Yoon, Uicheul; Lee, Dong-Kyun; Kim, Geon Ha; Seo, Sang Won; Na, Duk L; Shim, Hack-Joon; Lee, Jong-Min

    2013-09-01

    The hippocampus is known to be an important structure and biomarker for Alzheimer's disease (AD) and other neurological and psychiatric diseases. However, its use requires accurate, robust and reproducible delineation of hippocampal structures. In this study, an automated hippocampal segmentation method based on a graph-cuts algorithm combined with atlas-based segmentation and morphological opening was proposed. First, atlas-based segmentation was applied to define the initial hippocampal region as a priori information for graph-cuts. The definition of initial seeds was further elaborated by incorporating estimation of partial volume probabilities at each voxel. Finally, morphological opening was applied to reduce false positives in the result produced by graph-cuts. In experiments with twenty-seven healthy normal subjects, the proposed method showed more reliable results (similarity index = 0.81 ± 0.03) than the conventional atlas-based segmentation method (0.72 ± 0.04). As for segmentation accuracy, measured in terms of the ratios of false positives and false negatives, the proposed method (precision = 0.76 ± 0.04, recall = 0.86 ± 0.05) outperformed the conventional method (0.73 ± 0.05, 0.72 ± 0.06), demonstrating its plausibility for accurate, robust and reliable segmentation of the hippocampus. Copyright © 2013 Elsevier Inc. All rights reserved.
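
    A toy two-dimensional illustration of the pipeline's three ingredients, an atlas-derived prior feeding the unary terms, a graph cut producing the binary segmentation, and morphological opening removing small false positives, is sketched below. The weights, the synthetic image and the stand-in prior are invented; the actual method works on 3D MRI with carefully constructed seeds and partial-volume estimates.

        import numpy as np
        import networkx as nx
        from scipy import ndimage

        def graph_cut_segment(image, prior, lam=2.0, sigma=0.1):
            """Binary s-t min-cut segmentation with prior-based t-links and contrast-based n-links."""
            h, w = image.shape
            g, src, snk, eps = nx.DiGraph(), "SRC", "SNK", 1e-6
            idx = lambda r, c: r * w + c
            for r in range(h):
                for c in range(w):
                    p = float(np.clip(prior[r, c], eps, 1 - eps))
                    g.add_edge(src, idx(r, c), capacity=-np.log(1 - p))   # cost of labelling background
                    g.add_edge(idx(r, c), snk, capacity=-np.log(p))       # cost of labelling foreground
                    for dr, dc in ((0, 1), (1, 0)):                       # 4-neighbour smoothness terms
                        rr, cc = r + dr, c + dc
                        if rr < h and cc < w:
                            wgt = lam * np.exp(-((image[r, c] - image[rr, cc]) ** 2) / (2 * sigma ** 2))
                            g.add_edge(idx(r, c), idx(rr, cc), capacity=wgt)
                            g.add_edge(idx(rr, cc), idx(r, c), capacity=wgt)
            _, (reachable, _) = nx.minimum_cut(g, src, snk)
            mask = np.zeros((h, w), bool)
            for node in reachable:
                if node != src:
                    mask[node // w, node % w] = True                      # source side = foreground
            return mask

        image = np.zeros((20, 20)); image[6:14, 6:14] = 1.0
        image += np.random.default_rng(1).normal(0.0, 0.05, image.shape)
        prior = ndimage.gaussian_filter((image > 0.5).astype(float), 2)   # stand-in for an atlas prior
        segmentation = ndimage.binary_opening(graph_cut_segment(image, prior))
        print(segmentation.sum(), "pixels labelled hippocampus in the toy example")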

  15. Intelligent automation of high-performance liquid chromatography method development by means of a real-time knowledge-based approach.

    Science.gov (United States)

    I, Ting-Po; Smith, Randy; Guhan, Sam; Taksen, Ken; Vavra, Mark; Myers, Douglas; Hearn, Milton T W

    2002-09-27

    We describe the development, attributes and capabilities of a novel type of artificial intelligence system, called LabExpert, for automation of HPLC method development. Unlike other computerised method development systems, LabExpert operates in real-time, using an artificial intelligence system and design engine to provide experimental decision outcomes relevant to the optimisation of complex separations as well as the control of the instrumentation, column selection, mobile phase choice and other experimental parameters. LabExpert manages every input parameter to a HPLC data station and evaluates each output parameter of the HPLC data station in real-time as part of its decision process. Based on a combination of inherent and user-defined evaluation criteria, the artificial intelligence system programs use a reasoning process, applying chromatographic principles and acquired experimental observations to iteratively provide a regime for a priori development of an acceptable HPLC separation method. Because remote monitoring and control are also functions of LabExpert, the system allows full-time utilisation of analytical instrumentation and associated laboratory resources. Based on our experience with LabExpert with a wide range of analyte mixtures, this artificial intelligence system consistently identified in a similar or faster time-frame preferred sets of analytical conditions that are equal in resolution, efficiency and throughput to those empirically determined by highly experienced chromatographic scientists. An illustrative example, demonstrating the potential of LabExpert in the process of method development of drug substances, is provided.
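
    The closed-loop, criteria-driven iteration described above can be pictured with a toy sketch: run an experiment, score it against acceptance criteria, apply a rule to adjust a condition, and repeat. Everything below (function names, the single adjusted parameter, the resolution rule) is an illustrative assumption and not LabExpert's actual reasoning engine.

```python
def run_separation(organic_fraction):
    """Stand-in for an HPLC run plus peak integration; returns the worst-pair resolution (simulated)."""
    return 3.5 - 2.0 * organic_fraction        # pretend resolution improves as %B is lowered

def develop_method(target_rs=2.0, start_fraction=0.90, step=0.05, max_runs=20):
    """Iteratively adjust one mobile-phase parameter until the acceptance criterion is met."""
    fraction = start_fraction
    for run in range(1, max_runs + 1):
        rs = run_separation(fraction)
        print(f"run {run}: organic fraction {fraction:.2f}, worst Rs {rs:.2f}")
        if rs >= target_rs:                    # acceptance criterion satisfied
            return fraction
        fraction = max(fraction - step, 0.05)  # rule: lower %B to increase retention and resolution
    raise RuntimeError("acceptance criterion not met within the run budget")

print("accepted organic fraction:", develop_method())
```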

  16. Psychological distress and streamlined BreastScreen follow-up assessment versus standard assessment.

    Science.gov (United States)

    Sherman, Kerry A; Winch, Caleb J; Borecky, Natacha; Boyages, John

    2013-11-04

    To establish whether altered protocol characteristics of streamlined StepDown breast assessment clinics heightened or reduced the psychological distress of women in attendance compared with standard assessment. Willingness to attend future screening was also compared between the assessment groups. Observational, prospective study of women attending either a mammogram-only StepDown or a standard breast assessment clinic. Women completed questionnaires on the day of assessment and 1 month later. Women attending StepDown (136 women) or standard assessment clinics (148 women) at a BreastScreen centre between 10 November 2009 and 7 August 2010. Breast cancer worries; positive and negative psychological consequences of assessment (Psychological Consequences Questionnaire); breast cancer-related intrusion and avoidance (Impact of Event Scale); and willingness to attend, and uneasiness about, future screening. At 1-month follow-up, no group differences were evident between those attending standard and StepDown clinics on breast cancer worries (P= 0.44), positive (P= 0.88) and negative (P = 0.65) consequences, intrusion (P = 0.64), and avoidance (P = 0.87). Willingness to return for future mammograms was high, and did not differ between groups (P = 0.16), although higher levels of unease were associated with lessened willingness to rescreen (P = 0.04). There was no evidence that attending streamlined StepDown assessments had different outcomes in terms of distress than attending standard assessment clinics for women with a BreastScreen-detected abnormality. However, unease about attending future screening was generally associated with less willingness to do so in both groups; thus, there is a role for psycho-educational intervention to address these concerns.

  17. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation, a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  18. Automated NMR fragment based screening identified a novel interface blocker to the LARG/RhoA complex.

    Directory of Open Access Journals (Sweden)

    Jia Gao

    Full Text Available The small GTPase cycles between the inactive GDP form and the activated GTP form, catalyzed by upstream guanine exchange factors. The modulation of such a process by small molecules has proven to be a fruitful route for therapeutic intervention to prevent the over-activation of the small GTPase. The fragment-based approach emerging in the past decade has demonstrated its paramount potential in the discovery of inhibitors targeting such novel and challenging protein-protein interactions. The details of the procedure of NMR fragment screening from scratch have rarely been disclosed comprehensively, which restricts its wider application. To achieve a consistent screening protocol applicable to a number of targets, we developed a highly automated protocol to cover every aspect of NMR fragment screening, including the construction of a small but diverse library, determination of aqueous solubility by NMR, grouping of compounds with mutual dispersity into cocktails, and automated processing and visualization of the ligand-based screening spectra. We exemplified our streamlined screening on RhoA alone and on the complex of the small GTPase RhoA and its upstream guanine exchange factor LARG. Two hits were confirmed from the primary screening in cocktails and secondary screening over individual hits for the LARG/RhoA complex, while one of them was also identified from the screening for RhoA alone. HSQC titration of the two hits over RhoA and LARG alone, respectively, identified one compound binding to RhoA.GDP at 0.11 mM affinity, which perturbed residues at the switch II region of RhoA. This hit blocked the formation of the LARG/RhoA complex, validated by native gel electrophoresis and by titration of RhoA into ¹⁵N-labeled LARG in the absence and presence of the compound, respectively. It therefore provides a starting point toward a more potent inhibitor of RhoA activation catalyzed by LARG.
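
    The cocktail-grouping step mentioned above amounts to packing fragments into mixtures whose 1H resonances do not overlap. One plausible greedy sketch is shown below; the clash window, cocktail size and greedy strategy are assumptions for illustration, not the authors' protocol.

```python
def build_cocktails(peak_lists, max_size=8, min_sep_ppm=0.02):
    """Greedily group fragments into cocktails so that no two members have 1H peaks
    closer than min_sep_ppm. peak_lists: dict {compound_name: list of shifts in ppm}."""
    def clashes(shifts_a, shifts_b):
        return any(abs(a - b) < min_sep_ppm for a in shifts_a for b in shifts_b)

    cocktails = []
    for name, shifts in peak_lists.items():
        for group in cocktails:
            if len(group) < max_size and not any(clashes(shifts, peak_lists[member]) for member in group):
                group.append(name)
                break
        else:
            cocktails.append([name])       # start a new cocktail if every existing one clashes
    return cocktails

library = {"frag1": [1.20, 7.25], "frag2": [1.21, 6.80], "frag3": [2.50, 8.10]}
print(build_cocktails(library))            # frag1 and frag2 clash at ~1.2 ppm, so they are separated
```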

  19. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Automated cloning methods.; TOPICAL

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  1. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  2. Automated Subsystem Control for Life Support System (ASCLSS)

    Science.gov (United States)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest, or process control, level, enhancing system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates real-time process control while keeping accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), which moves the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  3. CCD characterization and measurements automation

    International Nuclear Information System (INIS)

    Kotov, I.V.; Frank, J.; Kotov, A.I.; Kubanek, P.; O'Connor, P.; Prouza, M.; Radeka, V.; Takacs, P.

    2012-01-01

    Modern mosaic cameras have grown both in size and in number of sensors. The required volume of sensor testing and characterization has grown accordingly. For camera projects as large as the LSST, test automation becomes a necessity. A CCD testing and characterization laboratory was built and is in operation for the LSST project. Characterization of LSST study contract sensors has been performed. The characterization process and its automation are discussed, and results are presented. Our system automatically acquires images, populates a database with metadata information, and runs express analysis. This approach is illustrated on 55Fe data analysis. 55Fe data are used to measure gain, charge transfer efficiency and charge diffusion. Examples of express analysis results are presented and discussed.
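
    The 55Fe gain measurement mentioned here is conventionally done by locating the Mn K-alpha peak (about 1620 electrons in silicon) in a histogram of isolated single-pixel events. A minimal sketch, with assumed function names and synthetic data rather than the LSST analysis code, is:

```python
import numpy as np

K_ALPHA_ELECTRONS = 1620    # ~5.9 keV Mn K-alpha / 3.65 eV per electron-hole pair in Si

def gain_from_fe55(single_pixel_events_adu, bins=200):
    """Estimate system gain (electrons/ADU) from the K-alpha peak of isolated 55Fe events."""
    hist, edges = np.histogram(single_pixel_events_adu, bins=bins)
    i = hist.argmax()
    peak_adu = 0.5 * (edges[i] + edges[i + 1])
    return K_ALPHA_ELECTRONS / peak_adu

# Synthetic example: a K-alpha peak near 1000 ADU implies a gain of roughly 1.6 e-/ADU
events = np.random.normal(1000.0, 15.0, 5000)
print("gain ~", round(gain_from_fe55(events), 3), "e-/ADU")
```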

  4. Safeguards Automated Facility Evaluation (SAFE) methodology

    International Nuclear Information System (INIS)

    Chapman, L.D.; Grady, L.M.; Bennett, H.A.; Sasser, D.W.; Engi, D.

    1978-08-01

    An automated approach to facility safeguards effectiveness evaluation has been developed. This automated process, called Safeguards Automated Facility Evaluation (SAFE), consists of a collection of operational modules, run as a continuous stream, for facility characterization, the selection of critical paths, and the evaluation of safeguards effectiveness along these paths. The technique has been implemented on an interactive computer time-sharing system and makes use of computer graphics for the processing and presentation of information. Using this technique, a comprehensive evaluation of a safeguards system can be provided by systematically varying the parameters that characterize the physical protection components of a facility to reflect the perceived adversary attributes and strategy, environmental conditions, and site operational conditions. The SAFE procedure has broad applications in the nuclear facility safeguards field as well as in the security field in general. Any fixed facility containing valuable materials or components to be protected from theft or sabotage could be analyzed using this same automated evaluation technique.

  5. Performance evaluation of automated urine microscopy as a rapid, non-invasive approach for the diagnosis of non-gonococcal urethritis

    Science.gov (United States)

    Pond, Marcus J; Nori, Achyuta V; Patel, Sheel; Laing, Ken; Ajayi, Margarita; Copas, Andrew J; Butcher, Philip D; Hay, Phillip; Sadiq, Syed Tariq

    2015-01-01

    Objectives: Gram-stained urethral smear (GSUS), the standard point-of-care test for non-gonococcal urethritis (NGU), is operator dependent and poorly specific. The performance of rapid automated urine flow cytometry (AUFC) of first void urine (FVU) white cell counts (UWCC) for predicting Mycoplasma genitalium and Chlamydia trachomatis urethral infections was assessed and its application to asymptomatic infection was evaluated. Methods: Receiver operating characteristic curve analysis, determining the FVU-UWCC threshold for predicting M. genitalium or C. trachomatis infection, was performed on 208 ‘training’ samples from symptomatic patients and subsequently validated using 228 additional FVUs obtained from prospective unselected patients. Results: An optimal diagnostic threshold of >29 UWC/µL gave sensitivities and specificities for either infection of 81.5% (95% CI 65.1% to 91.6%) and 85.8% (79.5% to 90.4%), respectively, compared with 86.8% (71.1% to 95%) and 64.7% (56.9% to 71.7%), respectively, for GSUS, using the training set samples. FVU-UWCC demonstrated sensitivities and specificities of 69.2% (95% CI 48.1% to 84.9%) and 92% (87.2% to 95.2%), respectively, when using validation samples. In asymptomatic patients where GSUS was not used, AUFC would have enabled more infections to be detected compared with clinical considerations only (71.4% vs 28.6%; p=0.03). The correlation between UWCC and bacterial load was stronger for M. genitalium compared with C. trachomatis (τ=0.426, p≤0.001 vs τ=0.295, p=0.022, respectively). Conclusions: AUFC offers improved specificity over microscopy for predicting C. trachomatis or M. genitalium infection. Universal AUFC may enable non-invasive diagnosis of asymptomatic NGU at the PoC. The degree of urethral inflammation exhibits a stronger association with pathogen load for M. genitalium compared with C. trachomatis. PMID:25614466
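
    The receiver operating characteristic (ROC) analysis used to pick the >29 WBC/µL cut-off can be sketched as below. The abstract does not state which optimality criterion was used, so the Youden-index choice, function names and synthetic data here are assumptions for illustration only.

```python
import numpy as np
from sklearn.metrics import roc_curve

def optimal_uwcc_threshold(uwcc, infected):
    """Pick the UWCC cut-off that maximises Youden's J = sensitivity + specificity - 1."""
    fpr, tpr, thresholds = roc_curve(infected, uwcc)
    j = tpr - fpr
    best = int(np.argmax(j))
    return thresholds[best], tpr[best], 1.0 - fpr[best]

# Synthetic illustration: uninfected vs infected UWCC distributions
rng = np.random.default_rng(0)
uwcc = np.concatenate([rng.gamma(2.0, 8.0, 150), rng.gamma(6.0, 12.0, 50)])
label = np.concatenate([np.zeros(150), np.ones(50)])
thr, sens, spec = optimal_uwcc_threshold(uwcc, label)
print(f"threshold > {thr:.0f} WBC/uL, sensitivity {sens:.2f}, specificity {spec:.2f}")
```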

  6. Use of Vortex Generators to Reduce Distortion for Mach 1.6 Streamline-Traced Supersonic Inlets

    Science.gov (United States)

    Baydar, Ezgihan; Lu, Frank; Slater, John W.; Trefny, Chuck

    2016-01-01

    The objective was to reduce the total pressure distortion at the engine-fan face due to low-momentum flow caused by the interaction of an external terminal shock with the turbulent boundary layer along a streamline-traced external-compression (STEX) inlet for Mach 1.6.

  7. Proposed Model for a Streamlined, Cohesive, and Optimized K-12 STEM Curriculum with a Focus on Engineering

    Science.gov (United States)

    Locke, Edward

    2009-01-01

    This article presents a proposed model for a clear description of K-12 age-possible engineering knowledge content, in terms of the selection of analytic principles and predictive skills for various grades, based on the mastery of mathematics and science pre-requisites, as mandated by national or state performance standards; and a streamlined,…

  8. Rapid Evidence Assessment of the Literature (REAL(©)): streamlining the systematic review process and creating utility for evidence-based health care.

    Science.gov (United States)

    Crawford, Cindy; Boyd, Courtney; Jain, Shamini; Khorsan, Raheleh; Jonas, Wayne

    2015-11-02

    Systematic reviews (SRs) are widely recognized as the best means of synthesizing clinical research. However, traditional approaches can be costly and time-consuming and can be subject to selection and judgment bias. It can also be difficult to interpret the results of a SR in a meaningful way in order to make research recommendations, clinical or policy decisions, or practice guidelines. Samueli Institute has developed the Rapid Evidence Assessment of the Literature (REAL) SR process to address these issues. REAL provides up-to-date, rigorous, high quality SR information on health care practices, products, or programs in a streamlined, efficient and reliable manner. This process is a component of the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™) program developed by Samueli Institute, which aims at answering the question of "What works?" in health care. The REAL process (1) tailors a standardized search strategy to a specific and relevant research question developed with various stakeholders to survey the available literature; (2) evaluates the quantity and quality of the literature using structured tools and rulebooks to ensure objectivity, reliability and reproducibility of reviewer ratings in an independent fashion and; (3) obtains formalized, balanced input from trained subject matter experts on the implications of the evidence for future research and current practice. Online tools and quality assurance processes are utilized for each step of the review to ensure a rapid, rigorous, reliable, transparent and reproducible SR process. The REAL is a rapid SR process developed to streamline and aid in the rigorous and reliable evaluation and review of claims in health care in order to make evidence-based, informed decisions, and has been used by a variety of organizations aiming to gain insight into "what works" in health care. Using the REAL system allows for the facilitation of recommendations on appropriate next steps in policy, funding

  9. THE QUESTION OF DEVELOPMENT OF AUTOMATED SYSTEMS FOR TRAFFIC MANAGEMENT

    Directory of Open Access Journals (Sweden)

    V. Shirin

    2015-12-01

    Full Text Available The current systems and methods for automated traffic management in cities are analyzed, and the management levels are specified. The general requirements, objectives and functions of automated systems for traffic management are formulated with regard to modern transport problems, and additional management and information functions are proposed. A phased approach to the implementation of projects on the creation of automated systems for traffic management is offered.

  10. Automating Hyperspectral Data for Rapid Response in Volcanic Emergencies

    Science.gov (United States)

    Davies, Ashley G.; Doubleday, Joshua R.; Chien, Steve A.

    2013-01-01

    In a volcanic emergency, time is of the essence. It is vital to quantify eruption parameters (thermal emission, effusion rate, location of activity) and distribute this information as quickly as possible to decision-makers in order to enable effective evaluation of eruption-related risk and hazard. The goal of this work was to automate and streamline processing of spacecraft hyperspectral data, automate product generation, and automate distribution of products. [Figure: visible and short-wave infrared images of the volcanic eruption in Iceland in May 2010.] The software rapidly processes hyperspectral data, correcting for incident sunlight where necessary, and atmospheric transmission; detects thermally anomalous pixels; fits data with model black-body thermal emission spectra to determine radiant flux; calculates atmospheric convection thermal removal; and then calculates total heat loss. From these results, an estimation of effusion rate is made. Maps are generated of thermal emission and location (see figure). Products are posted online, and relevant parties notified. Effusion rate data are added to the historical record and plotted to identify spikes in activity for persistently active eruptions. The entire process from start to end is autonomous. Future spacecraft, especially those in deep space, can react to detection of transient processes without the need to communicate with Earth, thus increasing science return. Terrestrially, this removes the need for human intervention.
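
    The black-body fitting step described above is commonly implemented by fitting a Planck function, scaled by the hot fractional pixel area, to the measured short-wave infrared radiances, then converting the fitted temperature to a radiant flux. The sketch below is a hedged illustration of that idea, not the flight software; the wavelengths, bounds and pixel size are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23
SIGMA = 5.670e-8                      # Stefan-Boltzmann constant, W m^-2 K^-4

def planck(wav_m, temp_k):
    """Black-body spectral radiance, W m^-2 sr^-1 m^-1."""
    return (2.0 * H * C**2 / wav_m**5) / np.expm1(H * C / (wav_m * KB * temp_k))

def pixel_radiance(wav_m, temp_k, hot_fraction):
    """Radiance of a pixel whose hot fraction is at temp_k (cool background neglected)."""
    return hot_fraction * planck(wav_m, temp_k)

def fit_hot_component(wavelengths_um, radiance, pixel_area_m2):
    """Fit temperature and hot-area fraction, then convert to radiant flux (W) via Stefan-Boltzmann."""
    wav = wavelengths_um * 1e-6
    (temp_k, frac), _ = curve_fit(pixel_radiance, wav, radiance, p0=[900.0, 0.01],
                                  bounds=([400.0, 1e-6], [1500.0, 1.0]))
    radiant_flux_w = frac * pixel_area_m2 * SIGMA * temp_k**4
    return temp_k, frac, radiant_flux_w

# Synthetic example: three SWIR bands, a 30 m pixel, 2% of the pixel at ~1050 K
wav_um = np.array([1.6, 2.2, 2.4])
obs = pixel_radiance(wav_um * 1e-6, 1050.0, 0.02)
print(fit_hot_component(wav_um, obs, 30.0 * 30.0))
```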

  11. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  12. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  13. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  14. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  15. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  16. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  17. Performance evaluation of automated urine microscopy as a rapid, non-invasive approach for the diagnosis of non-gonococcal urethritis.

    Science.gov (United States)

    Pond, Marcus J; Nori, Achyuta V; Patel, Sheel; Laing, Ken; Ajayi, Margarita; Copas, Andrew J; Butcher, Philip D; Hay, Phillip; Sadiq, Syed Tariq

    2015-05-01

    Gram-stained urethral smear (GSUS), the standard point-of-care test for non-gonococcal urethritis (NGU) is operator dependent and poorly specific. The performance of rapid automated urine flow cytometry (AUFC) of first void urine (FVU) white cell counts (UWCC) for predicting Mycoplasma genitalium and Chlamydia trachomatis urethral infections was assessed and its application to asymptomatic infection was evaluated. Receiver operating characteristic curve analysis, determining FVU-UWCC threshold for predicting M. genitalium or C. trachomatis infection was performed on 208 'training' samples from symptomatic patients and subsequently validated using 228 additional FVUs obtained from prospective unselected patients. An optimal diagnostic threshold of >29 UWC/µL gave sensitivities and specificities for either infection of 81.5% (95% CI 65.1% to 91.6%) and 85.8% (79.5% to 90.4%), respectively, compared with 86.8% (71.1% to 95%) and 64.7% (56.9% to 71.7%), respectively, for GSUS, using the training set samples. FVU-UWCC demonstrated sensitivities and specificities of 69.2% (95% CI 48.1% to 84.9%) and 92% (87.2% to 95.2%), respectively, when using validation samples. In asymptomatic patients where GSUS was not used, AUFC would have enabled more infections to be detected compared with clinical considerations only (71.4% vs 28.6%; p=0.03). The correlation between UWCC and bacterial load was stronger for M. genitalium compared with C. trachomatis (τ=0.426, p≤0.001 vs τ=0.295, p=0.022, respectively). AUFC offers improved specificity over microscopy for predicting C. trachomatis or M. genitalium infection. Universal AUFC may enable non-invasive diagnosis of asymptomatic NGU at the PoC. The degree of urethral inflammation exhibits a stronger association with pathogen load for M. genitalium compared with C. trachomatis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  18. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests and reduces human errors in patient identification and transcription. Documentation and traceability of tests, reagents and processes, and archiving of results are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  19. 76 FR 22854 - Streamlined Patent Reexamination Proceedings; Notice of Public Meeting

    Science.gov (United States)

    2011-04-25

    ... identified a number of automation and information technology upgrades that will be instituted as part of the... what the SNQ is believed to be). Current practice does not set forth a consistent format in which the... is new and non-cumulative of what had been considered in any previous or pending USPTO examination of...

  20. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  1. Social aspects of automation: Some critical insights

    Science.gov (United States)

    Nouzil, Ibrahim; Raza, Ali; Pervaiz, Salman

    2017-09-01

    Sustainable development has been recognized globally as one of the major driving forces behind current technological innovations. To achieve sustainable development and attain its associated goals, it is very important to properly address its concerns in different aspects of technological innovation. Several industrial sectors have enjoyed productivity and economic gains due to the advent of automation technology. It is important to characterize sustainability for automation technology. Sustainability is a key factor that will determine the future of our neighbours in time, and it must be tightly wrapped around the double-edged sword of technology. In this study, different impacts of automation have been addressed using the ‘Circles of Sustainability’ approach as a framework, covering economic, political, cultural and ecological aspects and their implications. A systematic literature review of automation technology from its inception is outlined and plotted against its many outcomes covering a broad spectrum. The study focuses primarily on the social aspects of automation technology. It also reviews the literature to analyse employment deficiency as one end of the social impact spectrum. On the other end of the spectrum, benefits to society through technological advancements, such as the Internet of Things (IoT) coupled with automation, are presented.

  2. Monte Carlo shielding analyses using an automated biasing procedure

    International Nuclear Information System (INIS)

    Tang, J.S.; Hoffman, T.J.

    1988-01-01

    A systematic and automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete ordinates calculation are used to generate biasing parameters for a Monte Carlo calculation. The entire procedure of adjoint calculation, biasing parameters generation, and Monte Carlo calculation has been automated. The automated biasing procedure has been applied to several realistic deep-penetration shipping cask problems. The results obtained for neutron and gamma-ray transport indicate that with the automated biasing procedure Monte Carlo shielding calculations of spent-fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost
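
    The adjoint-flux biasing summarized above can be illustrated with importance sampling of the source: the biased source density is made proportional to the physical source times the adjoint (importance) function, and each started particle carries a compensating statistical weight so the tally stays unbiased. The sketch below is a toy 1-D illustration with assumed names, not the procedure described in the record.

```python
import numpy as np

def source_biasing(source_pdf, adjoint_flux):
    """Biased source distribution q*(x) ~ q(x) * phi_dagger(x) and the weights w = q/q*
    that keep the Monte Carlo estimate unbiased."""
    importance = source_pdf * adjoint_flux
    biased_pdf = importance / importance.sum()
    weights = np.where(biased_pdf > 0.0, source_pdf / biased_pdf, 0.0)
    return biased_pdf, weights

# Toy 1-D shield: uniform source, adjoint flux rising toward a detector on the right
cells = 10
q = np.full(cells, 1.0 / cells)
phi_dagger = np.exp(np.linspace(0.0, 3.0, cells))
biased, w = source_biasing(q, phi_dagger)
print(np.round(biased, 3))   # more histories started where the importance is high...
print(np.round(w, 3))        # ...each carrying a proportionally smaller weight
```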

  3. Automating linear accelerator quality assurance

    International Nuclear Information System (INIS)

    Eckhause, Tobias; Thorwarth, Ryan; Moran, Jean M.; Al-Hallaq, Hania; Farrey, Karl; Ritter, Timothy; DeMarco, John; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Park, SungYong; Perez, Mario; Booth, Jeremy T.

    2015-01-01

    Purpose: The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish as a reference for other centers. Methods: The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. Results: For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The
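
    The log-file checks described here reduce, at their core, to comparing expected and actual component positions at every control point against a tolerance. A minimal sketch of such a check (tolerance values and names are assumptions, not the consortium's thresholds) is:

```python
import numpy as np

TOLERANCES_MM = {"mlc": 1.0, "jaw": 1.0, "gantry_sag": 0.5}   # illustrative thresholds only

def check_log(expected, actual, component, tolerances=TOLERANCES_MM):
    """Return the maximum deviation and the indices of control points exceeding tolerance."""
    deviation = np.abs(np.asarray(expected, dtype=float) - np.asarray(actual, dtype=float))
    flagged = np.flatnonzero(deviation > tolerances[component])
    return deviation.max(), flagged

# Toy MLC leaf trajectory (mm): the third control point drifts past tolerance
planned = np.array([10.0, 12.0, 14.0, 16.0])
delivered = np.array([10.1, 12.0, 15.2, 16.1])
max_dev, flagged = check_log(planned, delivered, "mlc")
print(f"max deviation {max_dev:.2f} mm, flagged control points: {flagged.tolist()}")
```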

  4. Automating linear accelerator quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Eckhause, Tobias; Thorwarth, Ryan; Moran, Jean M., E-mail: jmmoran@med.umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan 48109-5010 (United States); Al-Hallaq, Hania; Farrey, Karl [Department of Radiation Oncology and Cellular Oncology, The University of Chicago, Chicago, Illinois 60637 (United States); Ritter, Timothy [Ann Arbor VA Medical Center, Ann Arbor, Michigan 48109 (United States); DeMarco, John [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California, 90048 (United States); Pawlicki, Todd; Kim, Gwe-Ya [UCSD Medical Center, La Jolla, California 92093 (United States); Popple, Richard [Department of Radiation Oncology, University of Alabama Birmingham, Birmingham, Alabama 35249 (United States); Sharma, Vijeshwar; Park, SungYong [Karmanos Cancer Institute, McLaren-Flint, Flint, Michigan 48532 (United States); Perez, Mario; Booth, Jeremy T. [Royal North Shore Hospital, Sydney, NSW 2065 (Australia)

    2015-10-15

    Purpose: The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish as a reference for other centers. Methods: The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. Results: For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The

  5. Reactor pressure vessel stud management automation strategies

    International Nuclear Information System (INIS)

    Biach, W.L.; Hill, R.; Hung, K.

    1992-01-01

    The adoption of hydraulic tensioner technology as the standard for bolting and unbolting the reactor pressure vessel (RPV) head 35 yr ago represented an incredible commitment to new technology, but the existing technology was so primitive as to be clearly unacceptable. Today, a variety of approaches for improvement make the decision more difficult. Automation in existing installations must meet complex physical, logistic, and financial parameters while addressing the demands of reduced exposure, reduced critical path, and extended plant life. There are two generic approaches to providing automated RPV stud engagement and disengagement: the multiple stud tensioner and automated individual tools. A variation of the latter would include the handling system. Each has its benefits and liabilities

  6. OCT-based profiler for automating ocular surface prosthetic fitting (Conference Presentation)

    Science.gov (United States)

    Mujat, Mircea; Patel, Ankit H.; Maguluri, Gopi N.; Iftimia, Nicusor V.; Patel, Chirag; Agranat, Josh; Tomashevskaya, Olga; Bonte, Eugene; Ferguson, R. Daniel

    2016-03-01

    The use of a Prosthetic Replacement of the Ocular Surface Environment (PROSE) device is a revolutionary treatment for military patients that have lost their eyelids due to 3rd degree facial burns and for civilians who suffer from a host of corneal diseases. However, custom manual fitting is often a protracted painful, inexact process that requires multiple fitting sessions. Training for new practitioners is a long process. Automated methods to measure the complete corneal and scleral topology would provide a valuable tool for both clinicians and PROSE device manufacturers and would help streamline the fitting process. PSI has developed an ocular anterior-segment profiler based on Optical Coherence Tomography (OCT), which provides a 3D measure of the surface of the sclera and cornea. This device will provide topography data that will be used to expedite and improve the fabrication process for PROSE devices. OCT has been used to image portions of the cornea and sclera and to measure surface topology for smaller contact lenses [1-3]. However, current state-of-the-art anterior eye OCT systems can only scan about 16 mm of the eye's anterior surface, which is not sufficient for covering the sclera around the cornea. In addition, there is no systematic method for scanning and aligning/stitching the full scleral/corneal surface and commercial segmentation software is not optimized for the PROSE application. Although preliminary, our results demonstrate the capability of PSI's approach to generate accurate surface plots over relatively large areas of the eye, which is not currently possible with any other existing platform. Testing the technology on human volunteers is currently underway at Boston Foundation for Sight.

  7. M-Track: A New Software for Automated Detection of Grooming Trajectories in Mice.

    Directory of Open Access Journals (Sweden)

    Sheldon L Reeves

    2016-09-01

    Full Text Available Grooming is a complex and robust innate behavior, commonly performed by most vertebrate species. In mice, grooming consists of a series of stereotyped patterned strokes, performed along the rostro-caudal axis of the body. The frequency and duration of each grooming episode is sensitive to changes in stress levels, social interactions and pharmacological manipulations, and is therefore used in behavioral studies to gain insights into the function of brain regions that control movement execution and anxiety. Traditional approaches to analyze grooming rely on manually scoring the time of onset and duration of each grooming episode, and are often performed on grooming episodes triggered by stress exposure, which may not be entirely representative of spontaneous grooming in freely-behaving mice. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important to understand movement stereotypy and bilateral coordination in mice. Currently available commercial and freeware video-tracking software allow automated tracking of the whole body of a mouse or of its head and tail, not of individual forepaws. Here we describe a simple experimental set-up and a novel open-source code, named M-Track, for simultaneously tracking the movement of individual forepaws during spontaneous grooming in multiple freely-behaving mice. This toolbox provides a simple platform to perform trajectory analysis of forepaw movement during distinct grooming episodes. By using M-track we show that, in C57BL/6 wild type mice, the speed and bilateral coordination of the left and right forepaws remain unaltered during the execution of distinct grooming episodes. Stress exposure induces a profound increase in the length of the forepaw grooming trajectories. M-Track provides a valuable and user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies.
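
    The trajectory analysis described above (per-episode path length and left/right coordination) can be reproduced with a few lines of array arithmetic once forepaw coordinates are available. The sketch below is a generic illustration with assumed names and toy data, not the M-Track source code.

```python
import numpy as np

def trajectory_length(xy):
    """Total path length of a forepaw trajectory given as an (N, 2) array of positions."""
    return float(np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1)))

def bilateral_correlation(left_xy, right_xy):
    """Pearson correlation of frame-by-frame left/right forepaw speeds (bilateral coordination)."""
    v_left = np.linalg.norm(np.diff(left_xy, axis=0), axis=1)
    v_right = np.linalg.norm(np.diff(right_xy, axis=0), axis=1)
    return float(np.corrcoef(v_left, v_right)[0, 1])

# Toy trajectories for one 30-frame grooming episode
t = np.linspace(0.0, 2.0 * np.pi, 30)
left = np.column_stack([np.cos(t), np.sin(t)]) * 20.0
right = left + np.random.normal(0.0, 0.5, left.shape)
print(round(trajectory_length(left), 1), round(bilateral_correlation(left, right), 2))
```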

  8. M-Track: A New Software for Automated Detection of Grooming Trajectories in Mice.

    Science.gov (United States)

    Reeves, Sheldon L; Fleming, Kelsey E; Zhang, Lin; Scimemi, Annalisa

    2016-09-01

    Grooming is a complex and robust innate behavior, commonly performed by most vertebrate species. In mice, grooming consists of a series of stereotyped patterned strokes, performed along the rostro-caudal axis of the body. The frequency and duration of each grooming episode is sensitive to changes in stress levels, social interactions and pharmacological manipulations, and is therefore used in behavioral studies to gain insights into the function of brain regions that control movement execution and anxiety. Traditional approaches to analyze grooming rely on manually scoring the time of onset and duration of each grooming episode, and are often performed on grooming episodes triggered by stress exposure, which may not be entirely representative of spontaneous grooming in freely-behaving mice. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important to understand movement stereotypy and bilateral coordination in mice. Currently available commercial and freeware video-tracking software allow automated tracking of the whole body of a mouse or of its head and tail, not of individual forepaws. Here we describe a simple experimental set-up and a novel open-source code, named M-Track, for simultaneously tracking the movement of individual forepaws during spontaneous grooming in multiple freely-behaving mice. This toolbox provides a simple platform to perform trajectory analysis of forepaw movement during distinct grooming episodes. By using M-track we show that, in C57BL/6 wild type mice, the speed and bilateral coordination of the left and right forepaws remain unaltered during the execution of distinct grooming episodes. Stress exposure induces a profound increase in the length of the forepaw grooming trajectories. M-Track provides a valuable and user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies.

  9. Challenges and Obstacles of e-Government Streamlining: A Case Study

    Directory of Open Access Journals (Sweden)

    Anupam K. Nath

    2014-05-01

    Full Text Available e-Government streamlining has been a challenge since its inception in the domain of e-business. Business organizations face challenges while trying to collaborate with partners through the use of information technology in order to ensure efficient delivery of services. One of the major reasons for these inefficient services has been political bureaucracies among government organizations. To meet this challenge, a transparent and networked environment is required where government organizations can effectively partner with other relevant organizations. Using a case study analysis, we intend to identify not just the challenges in government organizations while providing services which require collaborative effort, but also the obstacles in adopting new technology for collaboration. We believe that the outcome of our research could provide a generalized guideline for government agencies where there is need for digital collaboration. Our findings will thus help government organizations to address the challenges in digital collaboration, and also help them implement new technology successfully to ensure efficient delivery of services.

  10. Localized Plasticity in the Streamlined Genomes of Vinyl Chloride Respiring Dehalococcoides

    Energy Technology Data Exchange (ETDEWEB)

    McMurdie, Paul J.; Behrens, Sebastien F.; Muller, Jochen A.; Goke, Jonathan; Ritalahti, Kirsti M.; Wagner, Ryan; Goltsman, Eugene; Lapidus, Alla; Holmes, Susan; Loffler, Frank E.; Spormann, Alfred M.

    2009-06-30

    Vinyl chloride (VC) is a human carcinogen and widespread priority pollutant. Here we report the first, to our knowledge, complete genome sequences of microorganisms able to respire VC, Dehalococcoides sp. strains VS and BAV1. Notably, the respective VC reductase encoding genes, vcrAB and bvcAB, were found embedded in distinct genomic islands (GEIs) with different predicted integration sites, suggesting that these genes were acquired horizontally and independently by distinct mechanisms. A comparative analysis that included two previously sequenced Dehalococcoides genomes revealed a contextually conserved core that is interrupted by two high plasticity regions (HPRs) near the Ori. These HPRs contain the majority of GEIs and strain-specific genes identified in the four Dehalococcoides genomes, an elevated number of repeated elements including insertion sequences (IS), as well as 91 of 96 rdhAB, genes that putatively encode terminal reductases in organohalide respiration. Only three core rdhA orthologous groups were identified, and only one of these groups is supported by synteny. The low number of core rdhAB, contrasted with the high rdhAB numbers per genome (up to 36 in strain VS), as well as their colocalization with GEIs and other signatures for horizontal transfer, suggests that niche adaptation via organohalide respiration is a fundamental ecological strategy in Dehalococcoides. This adaptation has been exacted through multiple mechanisms of recombination that are mainly confined within HPRs of an otherwise remarkably stable, syntenic, streamlined genome among the smallest of any free-living microorganism.

  11. Symbiotic adaptation drives genome streamlining of the cyanobacterial sponge symbiont "Candidatus Synechococcus spongiarum"

    KAUST Repository

    Gao, Zhao-Ming

    2014-04-01

    "Candidatus Synechococcus spongiarum" is a cyanobacterial symbiont widely distributed in sponges, but its functions at the genome level remain unknown. Here, we obtained the draft genome (1.66 Mbp, 90% estimated genome recovery) of "Ca. Synechococcus spongiarum" strain SH4 inhabiting the Red Sea sponge Carteriospongia foliascens. Phylogenomic analysis revealed a high dissimilarity between SH4 and free-living cyanobacterial strains. Essential functions, such as photosynthesis, the citric acid cycle, and DNA replication, were detected in SH4. Eukaryoticlike domains that play important roles in sponge-symbiont interactions were identified exclusively in the symbiont. However, SH4 could not biosynthesize methionine and polyamines and had lost partial genes encoding low-molecular-weight peptides of the photosynthesis complex, antioxidant enzymes, DNA repair enzymes, and proteins involved in resistance to environmental toxins and in biosynthesis of capsular and extracellular polysaccharides. These genetic modifications imply that "Ca. Synechococcus spongiarum" SH4 represents a low-light-adapted cyanobacterial symbiont and has undergone genome streamlining to adapt to the sponge\\'s mild intercellular environment. 2014 Gao et al.

  12. Supply chain cost improvement opportunities through streamlining cross-border operations

    Directory of Open Access Journals (Sweden)

    Jan Hendrik Havenga

    2013-09-01

    Full Text Available The Cross-Border Road Transport Agency (CBRTA) in South Africa aims to encourage and facilitate trade between South Africa and its neighbouring countries. The CBRTA sponsored a study by Stellenbosch University (SU) to determine the logistics cost impact of cross-border delays between South Africa and its major neighbouring trading partners, and prioritise opportunities for improvement. SU is the proprietor of both a comprehensive freight demand model and a logistics cost model for South Africa, which enable extractions and extensions of freight flows and related costs for specific purposes. Through the application of these models, the following information is identified and presented in this paper: South Africa’s most important border posts (based on traffic flows); a product profile for imports and exports through these border posts; the modal split (road and rail); and the annual logistics costs incurred on the corridors feeding the border posts, as well as the additional costs incurred due to border delays. The research has proved that the streamlining of border-post operations that takes a total supply chain view (i.e. of both border operations and those that could be moved from the border) is beneficial.

  13. Development of a Streamlined Work Flow for Handling Patients' Genetic Testing Insurance Authorizations.

    Science.gov (United States)

    Uhlmann, Wendy R; Schwalm, Katie; Raymond, Victoria M

    2017-08-01

    Obtaining genetic testing insurance authorizations for patients is a complex, time-involved process often requiring genetic counselor (GC) and physician involvement. In an effort to mitigate this complexity and meet the increasing number of genetic testing insurance authorization requests, GCs formed a novel partnership with an industrial engineer (IE) and a patient services associate (PSA) to develop a streamlined work flow. Eight genetics clinics and five specialty clinics at the University of Michigan were surveyed to obtain benchmarking data. Tasks needed for genetic testing insurance authorization were outlined and time-saving work flow changes were introduced, including: 1) creation of a password-protected shared Excel database between GCs and PSAs, used for initiating insurance authorization requests, tracking and follow-up; 2) having the PSAs send GCs a pre-clinic email noting each patient's genetic testing insurance coverage; 3) inclusion of test medical necessity documentation in the clinic visit summary note instead of writing a separate insurance letter; and 4) the PSAs' development of a manual with insurance provider and genetic testing laboratory information. These work flow changes made it more efficient to request and track genetic testing insurance authorizations for patients, enhanced GC-PSA communication, and reduced tasks done by clinicians.

  14. Streamlining Workflow for Endovascular Mechanical Thrombectomy: Lessons Learned from a Comprehensive Stroke Center.

    Science.gov (United States)

    Wang, Hongjin; Thevathasan, Arthur; Dowling, Richard; Bush, Steven; Mitchell, Peter; Yan, Bernard

    2017-08-01

    Recently, 5 randomized controlled trials confirmed the superiority of endovascular mechanical thrombectomy (EMT) to intravenous thrombolysis in acute ischemic stroke with large-vessel occlusion. The implication is that our health systems would witness an increasing number of patients treated with EMT. However, in-hospital delays, leading to increased time to reperfusion, are associated with poor clinical outcomes. This review outlines the in-hospital workflow of the treatment of acute ischemic stroke at a comprehensive stroke center and the lessons learned in reduction of in-hospital delays. The in-hospital workflow for acute ischemic stroke was described from prehospital notification to femoral arterial puncture in preparation for EMT. Systematic review of literature was also performed with PubMed. The implementation of workflow streamlining could result in reduction of in-hospital time delays for patients who were eligible for EMT. In particular, time-critical measures, including prehospital notification, the transfer of patients from door to computed tomography (CT) room, initiation of intravenous thrombolysis in the CT room, and the mobilization of neurointervention team in parallel with thrombolysis, all contributed to reduction in time delays. We have identified issues resulting in in-hospital time delays and have reported possible solutions to improve workflow efficiencies. We believe that these measures may help stroke centers initiate an EMT service for eligible patients. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  15. Streamlining Appointment, Promotion, and Tenure Procedures to Promote Early-Career Faculty Success.

    Science.gov (United States)

    Smith, Shannon B; Hollerbach, Ann; Donato, Annemarie Sipkes; Edlund, Barbara J; Atz, Teresa; Kelechi, Teresa J

    2016-01-01

    A critical component of the progression of a successful academic career is being promoted in rank. Early-career faculty are required to have an understanding of appointment, promotion, and tenure (APT) guidelines, but many factors often impede this understanding, thwarting a smooth and planned promotion pathway for professional advancement. This article outlines the steps taken by an APT committee to improve the promotion process from instructor to assistant professor. Six Sigma's DMAIC improvement model was selected as the guiding operational framework to remove variation in the promotion process. After faculty handbook revisions were made, several checklists were developed, and a process review rubric was implemented, recently promoted faculty were surveyed on satisfaction with the process. Faculty opinions captured in the survey suggest increased transparency in the process and perceived support offered by the APT committee. Positive outcomes include a strengthened faculty support framework, streamlined promotion processes, and improved faculty satisfaction. Changes to the APT processes resulted in an unambiguous and standardized pathway for successful promotion. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Comparative Analysis of Wolbachia Genomes Reveals Streamlining and Divergence of Minimalist Two-Component Systems

    Science.gov (United States)

    Christensen, Steen; Serbus, Laura Renee

    2015-01-01

    Two-component regulatory systems are commonly used by bacteria to coordinate intracellular responses with environmental cues. These systems are composed of functional protein pairs consisting of a sensor histidine kinase and cognate response regulator. In contrast to the well-studied Caulobacter crescentus system, which carries dozens of these pairs, the streamlined bacterial endosymbiont Wolbachia pipientis encodes only two pairs: CckA/CtrA and PleC/PleD. Here, we used bioinformatic tools to compare characterized two-component system relays from C. crescentus, the related Anaplasmataceae species Anaplasma phagocytophilum and Ehrlichia chaffeensis, and 12 sequenced Wolbachia strains. We found the core protein pairs and a subset of interacting partners to be highly conserved within Wolbachia and these other Anaplasmataceae. Genes involved in two-component signaling were positioned differently within the various Wolbachia genomes, whereas the local context of each gene was conserved. Unlike Anaplasma and Ehrlichia, Wolbachia two-component genes were more consistently found clustered with metabolic genes. The domain architecture and key functional residues standard for two-component system proteins were well-conserved in Wolbachia, although residues that specify cognate pairing diverged substantially from other Anaplasmataceae. These findings indicate that Wolbachia two-component signaling pairs share considerable functional overlap with other α-proteobacterial systems, whereas their divergence suggests the potential for regulatory differences and cross-talk. PMID:25809075
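
    The comparison of cognate-pairing residues described above can be illustrated, in a greatly simplified form, by scoring conservation at designated positions of two aligned sequences. The sketch below is a toy example; the sequences, positions, and function are invented for illustration and are not the authors' bioinformatic pipeline.

    ```python
    def residue_conservation(seq_a: str, seq_b: str, positions: list[int]) -> float:
        """Fraction of designated alignment positions at which two aligned
        sequences carry the same residue (gaps count as mismatches)."""
        matches = sum(1 for i in positions
                      if seq_a[i] == seq_b[i] and seq_a[i] != "-")
        return matches / len(positions)

    # Toy aligned fragments and made-up "specificity" positions (0-based).
    caulobacter = "MSKLVATDDHFARELLK"
    wolbachia   = "MSKIVSTDDHYVRELIK"
    specificity_positions = [2, 5, 9, 10, 14]

    print(f"conserved at specificity positions: "
          f"{residue_conservation(caulobacter, wolbachia, specificity_positions):.0%}")
    ```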

  17. Symbiotic adaptation drives genome streamlining of the cyanobacterial sponge symbiont "Candidatus Synechococcus pongiarum"

    KAUST Repository

    Gao, Zhao-Ming; Wang, Yong; Tian, Ren-Mao; Wong, Yue Him; Batang, Zenon B.; Al-Suwailem, Abdulaziz M.; Bajic, Vladimir B.; Qian, Pei-Yuan

    2014-01-01

    "Candidatus Synechococcus spongiarum" is a cyanobacterial symbiont widely distributed in sponges, but its functions at the genome level remain unknown. Here, we obtained the draft genome (1.66 Mbp, 90% estimated genome recovery) of "Ca. Synechococcus spongiarum" strain SH4 inhabiting the Red Sea sponge Carteriospongia foliascens. Phylogenomic analysis revealed a high dissimilarity between SH4 and free-living cyanobacterial strains. Essential functions, such as photosynthesis, the citric acid cycle, and DNA replication, were detected in SH4. Eukaryoticlike domains that play important roles in sponge-symbiont interactions were identified exclusively in the symbiont. However, SH4 could not biosynthesize methionine and polyamines and had lost partial genes encoding low-molecular-weight peptides of the photosynthesis complex, antioxidant enzymes, DNA repair enzymes, and proteins involved in resistance to environmental toxins and in biosynthesis of capsular and extracellular polysaccharides. These genetic modifications imply that "Ca. Synechococcus spongiarum" SH4 represents a low-light-adapted cyanobacterial symbiont and has undergone genome streamlining to adapt to the sponge's mild intercellular environment. 2014 Gao et al.

  18. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may involve quite a number of problems. The automation of dynamic operations requires complicated programmes that often interact across several branched areas. This reduces clarity for the operating and maintenance staff while increasing the possibility of errors. The synthesis and organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will, however, only be sufficiently and correctly turned to profit if the application of these equipment techniques is further improved and if its volume is tallied with a definite etc. (orig.) [de]

  19. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australian antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum was dispensed by an automated sampler into test tubes and then incubated under controlled time and temperature; the first counting step was omitted; labelled antibody was dispensed into the serum after washing; the samples were incubated and then centrifuged; radioactivity in the precipitate was counted with an auto-well counter; and the measurements were tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  20. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements of atomic number equal to or greater than that of boron, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very-low-magnification quantitative elemental mapping using stage control (which is of particular interest) has been accomplished, along with automated size, shape, and composition analysis over a large relative area.
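
    The low-magnification elemental mapping described above depends on automated stage control: step the stage over a grid, acquire a spectrum at each point, and accumulate per-element intensity maps. The sketch below illustrates only that raster pattern; move_stage and acquire_spectrum are hypothetical stand-ins, not calls into the actual Cameca/Tracor Northern control software, and the grid size, step, and elements are assumptions.

    ```python
    import numpy as np

    def move_stage(x_mm: float, y_mm: float) -> None:
        """Stand-in for an instrument-specific stage-positioning call."""
        pass

    def acquire_spectrum(dwell_s: float) -> dict[str, int]:
        """Stand-in for spectrum acquisition; returns per-element X-ray counts."""
        return {"Fe": int(np.random.poisson(100)), "Ni": int(np.random.poisson(40))}

    # Raster a 10 x 10 grid with 0.5 mm steps and build elemental intensity maps.
    nx, ny, step_mm, dwell_s = 10, 10, 0.5, 0.2
    maps = {"Fe": np.zeros((ny, nx)), "Ni": np.zeros((ny, nx))}

    for j in range(ny):
        for i in range(nx):
            move_stage(i * step_mm, j * step_mm)
            counts = acquire_spectrum(dwell_s)
            for element, value in counts.items():
                maps[element][j, i] = value

    print(maps["Fe"].shape)  # (10, 10) intensity map ready for display/enhancement
    ```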