WorldWideScience

Sample records for multiple automated sample

  1. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    Science.gov (United States)

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the quantity of cultures feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  2. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters..., glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal... corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  3. Microassay for interferon, using [3H]uridine, microculture plates, and a multiple automated sample harvester.

    Science.gov (United States)

    Richmond, J Y; Polatnick, J; Knudsen, R C

    1980-01-01

    A microassay for interferon is described which uses target cells grown in microculture wells, [3H]uridine to measure vesicular stomatitis virus replication in target cells, and a multiple automated sample harvester to collect the radioactively labeled viral ribonucleic acid onto glass fiber filter disks. The disks were placed in minivials, and radioactivity was counted in a liquid scintillation spectrophotometer. Interferon activity was calculated as the reciprocal of the highest titer which inhibited the incorporation of [3H]uridine into viral ribonucleic acid by 50%. Interferon titers determined by the microassay were similar to the plaque reduction assay when 100 plaque-forming units of challenge vesicular stomatitis virus was used. However, it was found that the interferon titers decreased approximately 2-fold for each 10-fold increase in the concentration of challenge vesicular stomatitis virus when tested in the range of 10(2) to 10(5) plaque-forming units. Interferon titers determined by the microassay show a high degree of repeatability, and the assay can be used to measure small and large numbers of interferon samples. PMID:6155105
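
    The endpoint calculation described above (titer as the reciprocal of the highest dilution giving at least 50% inhibition) can be sketched in a few lines. This is an illustrative reconstruction with invented dilution factors and counts, not the authors' code:

```python
# Hypothetical sketch of a 50%-inhibition endpoint titer calculation.
# Dilution factors and counts-per-minute values are illustrative only.

def interferon_titer(dilutions, sample_cpm, virus_control_cpm):
    """Return the reciprocal of the highest dilution at which
    [3H]uridine incorporation into viral RNA is inhibited by
    at least 50% relative to the virus-only control."""
    endpoint = None
    for dilution, cpm in sorted(zip(dilutions, sample_cpm)):
        inhibition = 1.0 - cpm / virus_control_cpm
        if inhibition >= 0.5:
            endpoint = dilution  # highest dilution so far meeting the cutoff
    return endpoint

# Example: a dilution series with CPM rising as interferon is diluted out
print(interferon_titer([4, 16, 64, 256], [500, 900, 2100, 3800], 4000))  # → 16
```

    At a 1:16 dilution the incorporation (900 CPM) is still below half the control (4000 CPM), while at 1:64 it is not, so the reported titer is 16.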

  4. Automated blood-sample handling in the clinical laboratory.

    Science.gov (United States)

    Godolphin, W; Bodtker, K; Uyeno, D; Goh, L O

    1990-09-01

    The only significant advances in blood-taking in 25 years have been the disposable needle and evacuated blood-drawing tube. With the exception of a few isolated barcode experiments, most sample-tracking is performed through handwritten or computer-printed labels. Attempts to reduce the hazards of centrifugation have resulted in air-tight lids or chambers, the use of which is time-consuming and cumbersome. Most commonly used clinical analyzers require serum or plasma, distributed into specialized containers, unique to that analyzer. Aliquots for different tests are prepared by handpouring or pipetting. Moderate to large clinical laboratories perform so many different tests that even multi-analyzers performing multiple analyses on a single sample may account for only a portion of all tests ordered for a patient. Thus several aliquots of each specimen are usually required. We have developed a proprietary serial centrifuge and blood-collection tube suitable for incorporation into an automated or robotic sample-handling system. The system we propose is (a) safe--avoids or prevents biological danger to the many "handlers" of blood; (b) small--minimizes the amount of sample taken and space required to adapt to the needs of satellite and mobile testing, and direct interfacing with analyzers; (c) serial--permits each sample to be treated according to its own "merits," optimizes throughput, and facilitates flexible automation; and (d) smart--ensures quality results through monitoring and intelligent control of patient identification, sample characteristics, and separation process.

  5. Microcomputer automation of the sample probe of an ESCA spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J W; Downey, R M; Schrawyer, L R; Meisenheimer, R G [California Univ., Livermore (USA). Lawrence Livermore Lab.

    1979-06-01

    Many of the currently available ESCA spectrometers can be obtained with fully automated control and data acquisition systems as well as a variety of algorithms for data processing. The majority of these computer-controlled functions, however, are directed toward the analysis of a single physical sample. Since many analyses require the collection of data for several hours for the purpose of ensemble averaging to enhance the signal-to-noise ratio, overnight operation is a common practice. It is during these periods that a capability for multiple sample analyses would be most useful. The subject instrument for the automation was a Hewlett-Packard Model 5950A ESCA spectrometer which was supplied with a Hewlett-Packard 5952A Data System consisting of a Hewlett-Packard 2100A small computer and a Hewlett-Packard 85001A Magnetic Tape Cassette I/O unit. A microprocessor-based system was used to implement this automation. The system uses the high-level language BASIC for virtually all control functions, providing the advantages of rapid programming and easy software modification over assembly language, although more ROM is required.

  6. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions where technical variations can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intraday and interday coefficients of variation (CVs) were assessed for 5 serum and 5 plasma samples over 5 days; samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. An automated trypsin digestion workflow yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.
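
    The CV reproducibility metric reported above is straightforward to compute; a toy illustration with invented replicate values:

```python
# Illustrative computation of the coefficient of variation (CV) used
# as the reproducibility criterion above; replicate values are made up.
import statistics

def cv_percent(values):
    """CV = sample standard deviation / mean, expressed as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# e.g. peptide peak-area ratios for one sample measured on 3 separate days
replicates = [0.92, 1.00, 0.95]
print(round(cv_percent(replicates), 1))  # → 4.2, comfortably below the 20% cutoff
```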

  7. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    International Nuclear Information System (INIS)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F.; Prasanna, P.G.S.

    2007-01-01

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and medical management.
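
    The chain-of-custody idea above (barcode scans at critical steps, validated against the expected workflow) can be sketched as follows; the step names and the barcode format are hypothetical, not taken from the paper:

```python
# Hypothetical sketch of barcode-based chain-of-custody tracking:
# each scan at a critical processing step is appended to a sample's
# audit trail, and custody is intact only while the scans follow the
# expected workflow order. Step names are invented.
from dataclasses import dataclass, field

EXPECTED_STEPS = ["received", "culture", "harvest", "slide", "analysis"]

@dataclass
class SampleRecord:
    barcode: str
    scans: list = field(default_factory=list)

    def scan(self, step):
        self.scans.append(step)

    def custody_intact(self):
        # Scans so far must match a prefix of the expected workflow
        return self.scans == EXPECTED_STEPS[:len(self.scans)]

s = SampleRecord("SAMPLE-0001")
for step in ["received", "culture", "harvest"]:
    s.scan(step)
print(s.custody_intact())  # → True; a skipped or out-of-order step would flag False
```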

  8. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    Energy Technology Data Exchange (ETDEWEB)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States); Prasanna, P.G.S. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States)], E-mail: prasanna@afrri.usuhs.mil

    2007-07-15

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and medical management.

  9. Automated multiple failure FMEA

    International Nuclear Information System (INIS)

    Price, C.J.; Taylor, N.S.

    2002-01-01

    Failure mode and effects analysis (FMEA) is typically performed by a team of engineers working together. In general, they will only consider single point failures in a system. Consideration of all possible combinations of failures is impractical for all but the simplest example systems. Even if the task of producing the FMEA report for the full multiple failure scenario were automated, it would still be impractical for the engineers to read, understand and act on all of the results. This paper shows how approximate failure rates for components can be used to select the most likely combinations of failures for automated investigation using simulation. The important information can be automatically identified from the resulting report, making it practical for engineers to study and act on the results. The strategy described in the paper has been applied to a range of electrical subsystems, and the results have confirmed that the strategy described here works well for realistically complex systems
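
    The selection strategy described above can be sketched simply: rank candidate failure combinations by approximate probability and pass only the most likely ones to simulation. The component names and failure rates below are hypothetical, and independence of failures is assumed:

```python
# Minimal sketch of likelihood-based pruning for multiple-failure FMEA.
# Components and per-mission failure probabilities are invented.
from itertools import combinations
from math import prod

failure_rates = {"relay": 1e-3, "fuse": 5e-4, "motor": 1e-4, "sensor": 2e-3}

def likely_combinations(rates, max_size=2, top_n=5):
    """Enumerate failure combinations up to max_size components and
    return the top_n most probable, assuming independent failures."""
    candidates = []
    for size in range(1, max_size + 1):
        for combo in combinations(rates, size):
            p = prod(rates[c] for c in combo)  # joint probability
            candidates.append((p, combo))
    candidates.sort(reverse=True)
    return candidates[:top_n]

for p, combo in likely_combinations(failure_rates):
    print(f"{combo}: {p:.2e}")
```

    Single failures dominate the ranking here; only the most probable pairs (e.g. relay plus sensor) survive the cut, which is what keeps the simulated-FMEA report small enough to read.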

  10. Portable Automation of Static Chamber Sample Collection for Quantifying Soil Gas Flux

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Morgan P.; Groh, Tyler A.; Parkin, Timothy B.; Williams, Ryan J.; Isenhart, Thomas M.; Hofmockel, Kirsten S.

    2018-01-01

    Quantification of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled is limited by the spacing between chambers and the availability of trained research technicians. An automated system for collecting gas samples from chambers in the field would eliminate the need for personnel to return to the chamber during a flux measurement period and would allow a single technician to sample multiple chambers simultaneously. This study describes Chamber Automated Sampling Equipment (FluxCASE) to collect and store chamber headspace gas samples at assigned time points for the measurement of soil gas flux. The FluxCASE design and operation are described, and the accuracy and precision of the FluxCASE system are evaluated. In laboratory measurements of nitrous oxide (N2O), carbon dioxide (CO2), and methane (CH4) concentrations of a standardized gas mixture, coefficients of variation associated with automated and manual sample collection were comparable, indicating no loss of precision. In the field, soil gas fluxes measured from FluxCASEs were in agreement with manual sampling for both N2O and CO2. Slopes of regression equations were 1.01 for CO2 and 0.97 for N2O. The 95% confidence limits of the slopes of the regression lines included the value of one, indicating no bias. Additionally, an expense analysis found a cost-recovery period ranging from 0.6 to 2.2 yr. Implementing the FluxCASE system is an alternative to improve the efficiency of the static chamber method for measuring soil gas flux while maintaining the accuracy and precision of manual sampling.
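
    The agreement check described above boils down to an ordinary least-squares slope between paired measurements, with a slope near 1 indicating no systematic bias. A sketch with invented flux data (not the study's measurements):

```python
# Sketch of the automated-vs-manual agreement check: regress one
# method's flux values on the other's and inspect the slope.
# The paired measurements below are invented for illustration.
def ols_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

manual    = [1.2, 2.5, 3.1, 4.8, 6.0]   # e.g. CO2 flux, manual sampling
automated = [1.3, 2.4, 3.2, 4.9, 5.9]   # same chambers, automated sampling
print(round(ols_slope(manual, automated), 2))  # → 0.98, close to 1
```

    In the study proper, the additional step is computing the 95% confidence interval of the slope and checking that it contains 1.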

  11. On-line Automated Sample Preparation-Capillary Gas Chromatography for the Analysis of Plasma Samples.

    NARCIS (Netherlands)

    Louter, A.J.H.; van der Wagt, R.A.C.A.; Brinkman, U.A.T.

    1995-01-01

    An automated sample preparation module, (the automated sample preparation with extraction columns, ASPEC), was interfaced with a capillary gas chromatograph (GC) by means of an on-column interface. The system was optimised for the determination of the antidepressant trazodone in plasma. The clean-up

  12. Automated bar coding of air samples at Hanford (ABCASH)

    International Nuclear Information System (INIS)

    Troyer, G.L.; Brayton, D.D.; McNeece, S.G.

    1992-10-01

    This article describes the basis, main features and benefits of an automated system for tracking and reporting radioactive air particulate samples. The system was developed due to a recognized need for improving the quality and integrity of air sample data related to personnel and environmental protection. The data capture, storage, and retrieval of air sample data are described. The automation of the associated data capture and data input eliminates a large potential for human error. The system utilizes personal computers, handheld computers, a commercial personal computer database package, commercial programming languages, and complete documentation to satisfy the system's automation objective.

  13. Possibilities for automating coal sampling

    Energy Technology Data Exchange (ETDEWEB)

    Helekal, J; Vankova, J

    1987-11-01

    Outlines sampling equipment in use (AVR-, AVP-, AVN- and AVK-series samplers and RDK- and RDH-series separators produced by the Coal Research Institute, Ostrava; extractors, crushers and separators produced by ORGREZ). The Ostrava equipment covers bituminous coal needs while ORGREZ provides equipment for energy coal requirements. This equipment is designed to handle coal up to 200 mm in size at a throughput of up to 1200 t/h. Automation of sampling equipment is foreseen.

  14. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  15. Sample handling and transport for the Secure Automated Fabrication line

    International Nuclear Information System (INIS)

    Sherrell, D.L.; Jensen, J.D.; Genoway, G.G.; Togesen, H.J.

    1983-06-01

    A totally automated system is described which packages, transports, receives, and unpackages sintered plutonium/uranium oxide fuel pellet samples to support automated chemical analysis equipment for the Secure Automated Fabrication (SAF) line. Samples are transferred 100 meters from the fuel production line to a different floor of the facility where automatic determinations are made for purposes of process control and fuel quality certification. The system automatically records identification numbers, net weights sent and received, and all other pertinent information such as fuel lot number, sample point, date, and time of day

  16. Automated platform for designing multiple robot work cells

    Science.gov (United States)

    Osman, N. S.; Rahman, M. A. A.; Rahman, A. A. Abdul; Kamsani, S. H.; Bali Mohamad, B. M.; Mohamad, E.; Zaini, Z. A.; Rahman, M. F. Ab; Mohamad Hatta, M. N. H.

    2017-06-01

    Designing multiple robot work cells is a knowledge-intensive, intricate, and time-consuming process. This paper elaborates the development of a computer-aided design program with a user-friendly interface for generating multiple robot work cells. The primary purpose of this work is to provide a fast and easy platform that reduces cost, human involvement, and trial-and-error adjustments. The automated platform is constructed based on the variant-shaped configuration concept and its mathematical model. The robot work cell layout, system components, and construction procedure of the automated platform are discussed; integrating these items allows the platform to automatically provide an optimum robot work cell design according to the information set by the user. The system is implemented on top of CATIA V5 software and utilises its Part Design, Assembly Design, and Macro tools. The current outcomes of this work provide a basis for future investigation into a flexible configuration system for multiple robot work cells.

  17. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    Science.gov (United States)

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
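
    For reference, the reference-state Hamiltonian in standard EDS (the formulation of Christ and van Gunsteren that this work builds on) combines the $N$ end-state Hamiltonians $H_i$ as

$$
H_R(\mathbf{r}) \;=\; -\frac{1}{\beta s}\,\ln \sum_{i=1}^{N} e^{-\beta s\left(H_i(\mathbf{r}) - E_i^{\mathrm{R}}\right)},
$$

    where $\beta = 1/k_{\mathrm{B}}T$, $s$ is a smoothness parameter, and the offsets $E_i^{\mathrm{R}}$ are tuned so that all end states are visited; free-energy differences between end states are then estimated from the single reference-state trajectory. This is a sketch of the standard construction only; the accelerated variant proposed above replaces it with a functional form that preserves the local energy minima of the combined end states.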

  18. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
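
    The worklist transformation above can be sketched as follows. The field names and the rule for splitting a factor into serial sub-dilutions are assumptions for illustration (the 1- to 10-, 11- to 100-, and 101- to 1000-fold ranges suggest one, two, or three serial steps); this is not the validated RSPP code:

```python
# Hypothetical sketch: turn sample records (id, sequence, dilution
# factor) into a CSV worklist, splitting each factor into equal
# serial sub-dilutions of at most 10x. Column names are invented.
import csv
import io
import math

def serial_steps(factor):
    """Split a 1- to 1000-fold dilution into sub-dilutions of <= 10x."""
    if not 1 <= factor <= 1000:
        raise ValueError("dilution factor outside the supported 1-1000 range")
    n = max(1, math.ceil(math.log10(factor)))  # 1, 2, or 3 serial steps
    step = factor ** (1.0 / n)
    return [round(step, 2)] * n

def write_worklist(samples):
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["sample_id", "sequence", "step_dilutions"])
    for sid, seq, factor in samples:
        writer.writerow([sid, seq, "x".join(map(str, serial_steps(factor)))])
    return buf.getvalue()

print(write_worklist([("S001", 1, 8), ("S002", 2, 250)]))
```

    A 250-fold dilution, for example, becomes three serial ~6.3-fold steps, keeping every individual pipetting step within the liquid handler's accurate range.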

  19. Influencing Trust for Human-Automation Collaborative Scheduling of Multiple Unmanned Vehicles.

    Science.gov (United States)

    Clare, Andrew S; Cummings, Mary L; Repenning, Nelson P

    2015-11-01

    We examined the impact of priming on operator trust and system performance when supervising a decentralized network of heterogeneous unmanned vehicles (UVs). Advances in autonomy have enabled a future vision of single-operator control of multiple heterogeneous UVs. Real-time scheduling for multiple UVs in uncertain environments requires the computational ability of optimization algorithms combined with the judgment and adaptability of human supervisors. Because of system and environmental uncertainty, appropriate operator trust will be instrumental to maintain high system performance and prevent cognitive overload. Three groups of operators experienced different levels of trust priming prior to conducting simulated missions in an existing, multiple-UV simulation environment. Participants who play computer and video games frequently were found to have a higher propensity to overtrust automation. By priming gamers to lower their initial trust to a more appropriate level, system performance was improved by 10% as compared to gamers who were primed to have higher trust in the automation. Priming was successful at adjusting the operator's initial and dynamic trust in the automated scheduling algorithm, which had a substantial impact on system performance. These results have important implications for personnel selection and training for futuristic multi-UV systems under human supervision. Although gamers may bring valuable skills, they may also be potentially prone to automation bias. Priming during training and regular priming throughout missions may be one potential method for overcoming this propensity to overtrust automation. © 2015, Human Factors and Ergonomics Society.

  20. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  1. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digest and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol had performance equal to or better than a manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Keyser, John

    2013-01-01

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation

  3. Automated Blood Sample Preparation Unit (ABSPU) for Portable Microfluidic Flow Cytometry.

    Science.gov (United States)

    Chaturvedi, Akhil; Gorthi, Sai Siva

    2017-02-01

    Portable microfluidic diagnostic devices, including flow cytometers, are being developed for point-of-care settings, especially in conjunction with inexpensive imaging devices such as mobile phone cameras. However, two pervasive drawbacks of these have been the lack of automated sample preparation processes and cells settling out of sample suspensions, leading to inaccurate results. We report an automated blood sample preparation unit (ABSPU) to prevent blood samples from settling in a reservoir during loading of samples in flow cytometers. This apparatus automates the preanalytical steps of dilution and staining of blood cells prior to microfluidic loading. It employs an assembly with a miniature vibration motor to drive turbulence in a sample reservoir. To validate performance of this system, we present experimental evidence demonstrating prevention of blood cell settling, cell integrity, and staining of cells prior to flow cytometric analysis. This setup is further integrated with a microfluidic imaging flow cytometer to investigate cell count variability. With no need for prior sample preparation, a drop of whole blood can be directly introduced to the setup without premixing with buffers manually. Our results show that integration of this assembly with microfluidic analysis provides a competent automation tool for low-cost point-of-care blood-based diagnostics.

  4. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  5. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also optimize operational time and the use of consumables.
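
    The trigger logic described above can be sketched as a small state machine. This is a minimal illustration, not the NASA implementation; the step names and conductivity thresholds are assumptions:

```python
LOW, HIGH = 5.0, 200.0  # assumed conductivity thresholds, in uS/cm

def next_step(step, inlet_uS, outlet_uS):
    """Advance the desalting protocol only when both non-contact probes
    confirm the conductivity state expected at the current step."""
    if step == "flush" and inlet_uS < LOW and outlet_uS < LOW:
        return "acidify"   # system neutralized -> wash with strong acid
    if step == "acidify" and inlet_uS > HIGH and outlet_uS > HIGH:
        return "basify"    # system acidified -> wash to a basic, high-pH state
    if step == "basify" and inlet_uS > HIGH and outlet_uS > HIGH:
        return "flush"     # basic state reached -> flush back to low conductivity
    return step            # expected condition not yet met; keep going

# While the condition is unmet the system stays in the same step, so the
# sequence never advances prematurely and no consumables are wasted.
```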

  6. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-01-01

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment and the removal of gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operation. Matrix-modification and speciation-control chemistries required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with a high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrices from the Hanford site.

  7. Automating Groundwater Sampling At Hanford, The Next Step

    International Nuclear Information System (INIS)

    Connell, C.W.; Conley, S.F.; Hildebrand, R.D.; Cunningham, D.E.

    2010-01-01

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very 'people intensive.' Approximately 1500 wells are sampled each year by field personnel or 'samplers.' These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from two official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the metadata associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process where the data is transferred to the database and the electronic form is filed in managed records, thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  8. Indigenous development of automated metallographic sample preparation system

    International Nuclear Information System (INIS)

    Kulkarni, A.P.; Pandit, K.M.; Deshmukh, A.G.; Sahoo, K.C.

    2005-01-01

    Surface preparation of specimens for metallographic studies on irradiated material involves considerable remote handling of radioactive material by skilled manpower. These are laborious and man-rem-intensive activities, and they limit the number of samples that can be prepared for metallographic studies. To overcome these limitations, automated systems have been developed for surface preparation of specimens in the PIE Division. The system includes (i) grinding and polishing stations, (ii) a water-jet cleaning station, (iii) ultrasonic cleaning stations, (iv) a drying station, (v) a sample loading and unloading station, (vi) a dispenser for slurries and diluents, and (vii) an automated head for moving the sample holder disc from one station to another. The system allows the operator to program or change the sequence of sample preparation, including remote changing of the grinding/polishing discs at the stations. Two such systems have been installed and commissioned in the hot cells of the PIE Division. These are being used for preparation of irradiated samples from nuclear fuels and structural components. This development has increased the throughput of metallography work and reduced the radiation exposure (man-sieverts) to operators. This presentation will provide details of the challenges in undertaking this developmental work. (author)

  9. Automated washing of FTA Card punches and PCR setup for reference samples using a LIMS-controlled Sias Xantus automated liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Olsen, Addie Nina; Frøslev, Tobias G.

    2009-01-01

    We have implemented and validated automated methods for washing FTA Card punches containing buccal samples and subsequent PCR setup using a Sias Xantus automated liquid handler. The automated methods were controlled by worklists generated by our LabWare Laboratory Information Management System...

  10. Automated SEM and TEM sample preparation applied to copper/low k materials

    Science.gov (United States)

    Reyes, R.; Shaapur, F.; Griffiths, D.; Diebold, A. C.; Foran, B.; Raz, E.

    2001-01-01

    We describe the use of automated microcleaving for preparation of both SEM and TEM samples as done by SELA's new MC500 and TEMstation tools. The MC500 is an automated microcleaving tool that is capable of producing cleaves with 0.25 μm accuracy resulting in SEM-ready samples. The TEMstation is capable of taking a sample output from the MC500 (or from SELA's earlier MC200 tool) and producing a FIB ready slice of 25±5 μm, mounted on a TEM-washer and ready for FIB thinning to electron transparency for TEM analysis. The materials selected for the tool set evaluation mainly included the Cu/TaN/HOSP low-k system. The paper is divided into three sections, experimental approach, SEM preparation and analysis of HOSP low-k, and TEM preparation and analysis of Cu/TaN/HOSP low-k samples. For the samples discussed, data is presented to show the quality of preparation provided by these new automated tools.

  11. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    Science.gov (United States)

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the
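
    The normalization step described above amounts to computing, per sample, the DNA and diluent volumes that deliver a target template amount in a fixed final volume. The sketch below is a simplified illustration; the target amount, volumes, and minimum-pipetting rule are assumptions, not taken from the Normalization Wizard software:

```python
def dilution(conc_ng_per_uL, target_ng=1.0, final_uL=10.0, min_pipet_uL=1.0):
    """Return (dna_uL, diluent_uL) delivering ~target_ng in final_uL."""
    if conc_ng_per_uL <= 0:
        return final_uL, 0.0            # undetectable sample: use it neat
    dna_uL = target_ng / conc_ng_per_uL
    if dna_uL >= final_uL:
        return final_uL, 0.0            # too dilute to reach the target
    dna_uL = max(dna_uL, min_pipet_uL)  # respect the pipetting minimum,
                                        # even if that overshoots the target
    return dna_uL, final_uL - dna_uL

# A 0.5 ng/uL extract: 2 uL DNA + 8 uL diluent gives 1 ng template in 10 uL.
```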

  12. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low-resolution, high-precision instrument was designed and realized in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. The analog-to-digital conversion circuits, the local control microcomputer, the automation systems and the performance checks are described. (authors)

  13. Automated dried blood spots standard and QC sample preparation using a robotic liquid handler.

    Science.gov (United States)

    Yuan, Long; Zhang, Duxi; Aubry, Anne-Francoise; Arnold, Mark E

    2012-12-01

    A dried blood spot (DBS) bioanalysis assay involves many steps, such as the preparation of standard (STD) and QC samples in blood, the spotting onto DBS cards, and the cutting-out of the spots. These steps are labor intensive and time consuming if done manually, which makes automation very desirable in DBS bioanalysis. A robotic liquid handler was successfully applied to the preparation of STD and QC samples in blood and to spotting the blood samples onto DBS cards, using buspirone as the model compound. This automated preparation was demonstrated to be accurate and consistent, with accuracy and precision comparable to those of manual preparation. The effect of spotting volume on accuracy was evaluated, and a trend of increasing concentrations of buspirone with increasing spotting volumes was observed. The automated STD and QC sample preparation process significantly improved the efficiency, robustness and safety of DBS bioanalysis.

  14. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receipt, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450℃. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for the initial handling of fecal samples, intended to automate the above procedure. Once developed, the system will eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  15. Automated injection of a radioactive sample for preparative HPLC with feedback control

    International Nuclear Information System (INIS)

    Iwata, Ren; Yamazaki, Shigeki

    1990-01-01

    The injection of a radioactive reaction mixture into a preparative HPLC column has been automated with computer control for rapid purification of routinely prepared positron-emitting radiopharmaceuticals. Using pneumatic valves, a motor-driven pump and a liquid-level sensor, two intelligent injection methods for the automation were compared with regard to efficient and rapid sample loading into a 2 mL loop of the 6-way valve. One, a precise but rather slow method, was demonstrated to be suitable for purification of 18F-radiopharmaceuticals, while the other, due to its rapid operation, was more suitable for 11C-radiopharmaceuticals. A sample volume of approximately 0.5 mL can be injected onto a preparative HPLC column with over 90% efficiency with the present automated system. (author)

  16. An automated blood sampling system used in positron emission tomography

    International Nuclear Information System (INIS)

    Eriksson, L.; Bohm, C.; Kesselberg, M.

    1988-01-01

    Fast dynamic function studies with positron emission tomography (PET) have the potential to give accurate information on physiological functions of the brain. This capability can be realised if the positron camera system accurately quantitates the tracer uptake in the brain with sufficiently high efficiency and in sufficiently short time intervals. In addition, however, the tracer concentration in blood must be accurately determined as a function of time. This paper describes and evaluates an automated blood sampling system. Two different detector units are compared. The use of the automated blood sampling system is demonstrated in studies of cerebral blood flow, of the blood-brain barrier transfer of amino acids, and of cerebral oxygen consumption. 5 refs.; 7 figs

  17. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample, due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged that integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of these newer sample-handling methods will be discussed, and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  18. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.
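
    A procedure of this kind is naturally expressed as an ordered list of (step, reagent, volume) entries that the robot replays for every column in a parallel run; one immediate use is totalling reagent consumption before starting. The step names and volumes below are illustrative only, not the published COLUMNSPIDER protocol:

```python
SR_SEPARATION = [                 # (step, reagent, mL per column), illustrative
    ("condition", "2.5 M HCl", 1.0),
    ("load",      "sample",    0.5),
    ("wash",      "2.5 M HCl", 4.0),
    ("elute",     "6 M HCl",   3.0),
]

def reagent_totals(procedure, n_columns):
    """Total volume of each reagent needed to run n_columns in parallel."""
    totals = {}
    for _step, reagent, ml in procedure:
        totals[reagent] = totals.get(reagent, 0.0) + ml * n_columns
    return totals

# Ten parallel columns consume 50 mL of 2.5 M HCl and 30 mL of 6 M HCl.
```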

  19. Development and Evaluation of a Pilot Prototype Automated Online Sampling System

    International Nuclear Information System (INIS)

    Whitaker, M.J.

    2000-01-01

    An automated online sampling system has been developed for the BNFL-Hanford Technetium Monitoring Program. The system was designed to be flexible and allows for the collection and delivery of samples to a variety of detection devices that may be used

  20. Estimates of Radionuclide Loading to Cochiti Lake from Los Alamos Canyon Using Manual and Automated Sampling

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Christopher T. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-07-01

    Los Alamos National Laboratory has a long-standing program of sampling storm water runoff inside the Laboratory boundaries. In 1995, the Laboratory started collecting the samples using automated storm water sampling stations; prior to this time the samples were collected manually. The Laboratory has also been periodically collecting sediment samples from Cochiti Lake. This paper presents the data for Pu-238 and Pu-239 bound to the sediments for Los Alamos Canyon storm water runoff and compares the sampling types by mass loading and as a percentage of the sediment deposition to Cochiti Lake. The data for both manual and automated sampling are used to calculate mass loads from Los Alamos Canyon on a yearly basis. The automated samples show mass loading 200-500 percent greater for Pu-238 and 300-700 percent greater for Pu-239 than the manual samples. Using the mean manual flow volume for mass loading calculations, the automated samples are over 900 percent greater for Pu-238 and over 1800 percent greater for Pu-239. Evaluating the Pu-238 and Pu-239 activities as a percentage of deposition to Cochiti Lake indicates that the automated samples are 700-1300 percent greater for Pu-238 and 200-500 percent greater for Pu-239. The variance was calculated by two methods. The first method calculates the variance for each sample event. The second method calculates the variance by the total volume of water discharged in Los Alamos Canyon for the year.
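
    The mass-loading arithmetic underlying such comparisons is a sum, over sampled runoff events, of activity concentration times discharge volume. A sketch with made-up numbers (not the Los Alamos Canyon data):

```python
def annual_load(events):
    """Annual mass load: sum of activity (pCi/L) x discharge volume (L)
    over the year's sampled runoff events; returns total pCi."""
    return sum(activity * volume for activity, volume in events)

manual    = annual_load([(0.2, 1.0e6), (0.1, 5.0e5)])  # ~2.5e5 pCi
automated = annual_load([(0.9, 1.0e6), (0.4, 5.0e5)])  # ~1.1e6 pCi
ratio_percent = 100.0 * automated / manual             # ~440 percent
```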

  1. Automated cellular sample preparation using a Centrifuge-on-a-Chip.

    Science.gov (United States)

    Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino

    2011-09-07

    The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL min⁻¹ scale, followed by fluorescent labeling of intra- and extra-cellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource poor settings.

  2. An Automated Sample Processing System for Planetary Exploration

    Science.gov (United States)

    Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther

    2012-01-01

    An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate TRL 4. This paper describes the function of the system, mechanism design, lessons learned, and several challenges that were overcome.

  3. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    International Nuclear Information System (INIS)

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-01

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. They are part of the fully automated robotics systems being developed at the Photon Factory for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch, and data collection. We have already installed sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to further reduce the time required for sample exchange, a prototype of a double-tonged system was developed. In preliminary experiments with the double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds, excluding the time required for pre-cooling and warming up the tongs.

  4. SAMPL4 & DOCK3.7: lessons for automated docking procedures

    Science.gov (United States)

    Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.

    2014-03-01

    The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, pose and affinity prediction of the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those of any other method used in SAMPL4, including methods that were similarly fast; (2) HIV integrase is a challenging target, but automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared to other methods), although affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important; serious errors were discovered with default settings that have since been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects will be discussed.

  5. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. Once programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over, have been demonstrated for samples in a variety of different matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.

  6. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    Science.gov (United States)

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85%-111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569

  7. A bench-top automated workstation for nucleic acid isolation from clinical sample types.

    Science.gov (United States)

    Thakore, Nitu; Garber, Steve; Bueno, Arial; Qu, Peter; Norville, Ryan; Villanueva, Michael; Chandler, Darrell P; Holmberg, Rebecca; Cooney, Christopher G

    2018-04-18

    Systems that automate extraction of nucleic acid from cells or viruses in complex clinical matrices have tremendous value even in the absence of an integrated downstream detector. We describe our bench-top automated workstation that integrates our previously-reported extraction method - TruTip - with our newly-developed mechanical lysis method. This is the first report of this method for homogenizing viscous and heterogeneous samples and lysing difficult-to-disrupt cells using "MagVor": a rotating magnet that rotates a miniature stir disk amidst glass beads confined inside of a disposable tube. Using this system, we demonstrate automated nucleic acid extraction from methicillin-resistant Staphylococcus aureus (MRSA) in nasopharyngeal aspirate (NPA), influenza A in nasopharyngeal swabs (NPS), human genomic DNA from whole blood, and Mycobacterium tuberculosis in NPA. The automated workstation yields nucleic acid with comparable extraction efficiency to manual protocols, which include commercially-available Qiagen spin column kits, across each of these sample types. This work expands the scope of applications beyond previous reports of TruTip to include difficult-to-disrupt cell types and automates the process, including a method for removal of organics, inside a compact bench-top workstation. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  9. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to computing the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian reliability assessment method is designed. Because of the drawbacks of the Markovian model for steady-state reliability computation, and of the neural network for the initial training pattern, an integrated approach, called Markov-neural, is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. For managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is also conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implications shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
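For the Markovian part, the steady-state computation the abstract refers to amounts to finding the stationary distribution of a continuous-time Markov chain. A minimal sketch for a hypothetical two-state AGV (operational/failed) with invented failure and repair rates:

```python
import numpy as np

def steady_state(Q):
    """Stationary distribution pi of a CTMC generator Q: pi @ Q = 0, sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])  # append the normalization constraint
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

lam, mu = 0.1, 0.9            # assumed failure and repair rates (per hour)
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])
availability = steady_state(Q)[0]  # long-run probability the AGV is operational
```

Analytically this equals mu / (lam + mu) = 0.9. The drawback noted in the abstract is that such closed-form steady-state results wash out transient behavior, which is where the neural network component is brought in.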

  10. Application of bar codes to the automation of analytical sample data collection

    International Nuclear Information System (INIS)

    Jurgensen, H.A.

    1986-01-01

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling, which is accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC; data collection, which is done on a central VAX 11/730 (Digital Equipment Corp.), where bar code readers are used to log in samples to be analyzed on liquid scintillation counters and the VAX 11/730 processes the data and generates reports; and data storage, which is on the VAX 11/730 and backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented

  11. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  12. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  13. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    Science.gov (United States)

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6'-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In two swine manure samples and one soil sample 21.6, 12.3, and 2.5% of the cells were positive with an archaeal probe (S-D-Arch-0915-a-A-20), respectively. Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
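The signal-classification step can be illustrated with a minimal one-dimensional fuzzy c-means implementation; the per-cell intensities below are invented for illustration, and the original work used the Visilog software rather than Python:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal 1-D fuzzy c-means: returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))                 # random initial memberships
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        w = u ** m
        centers = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))        # membership ~ inverse distance
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# toy per-cell FISH intensities: a dim (nontarget) and a bright (target) group
intensities = np.array([5.0, 6.0, 7.0, 50.0, 55.0, 60.0])
centers, u = fuzzy_cmeans(intensities)
target = int(np.argmax(centers))                # brighter cluster = probe-positive
positive = u.argmax(axis=1) == target
```

Clustering the intensities rather than thresholding them is what sidesteps the inconsistent intensity thresholds reported in the abstract.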

  14. Mockup of an automated material transport system for remote handling

    International Nuclear Information System (INIS)

    Porter, M.L.

    1992-01-01

    An Automated Material Transport System (AMTS) was identified for transport of samples within a Material and Process Control Laboratory (MPCL). The MPCL was designed with a dry sample handling laboratory and a wet chemistry analysis laboratory. Each laboratory contained several processing gloveboxes. The function of the AMTS was to automate the handling of materials, multiple process samples, and bulky items between process stations with a minimum of operator intervention and a minimum of waiting periods and nonproductive activities. This paper discusses the system design features, capabilities, and results of initial testing. The overall performance of the AMTS is very good. No major problems or concerns were identified. System commands are simple and logical, making the system user friendly. The operating principle and the design of individual components are simple. With the addition of various track modules, the system can be arranged in almost any configuration. The AMTS lends itself very well to integration with other automated systems or products. The AMTS is suited for applications involving light payloads which require multiple sample and material handling, lot tracking, and system integration with other products

  15. A data-driven multiplicative fault diagnosis approach for automation processes.

    Science.gov (United States)

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.
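A crude proxy for the variability-based evaluation described above is a per-variable variance ratio between normal and faulty operation; this simplified sketch is only an illustration of the idea, not the paper's actual contribution index:

```python
import numpy as np

def variance_ratio(X_normal, X_fault):
    """Per-variable variance ratio between fault and normal data:
    a ratio well above 1 flags a variable whose variability the
    (multiplicative) fault has inflated."""
    return X_fault.var(axis=0) / X_normal.var(axis=0)

# two process variables; a multiplicative fault scales variable 1 by 3
X_normal = np.array([[0., 0.], [1., 1.], [2., 2.], [3., 3.]])
X_fault = X_normal.copy()
X_fault[:, 1] *= 3.0
ratios = variance_ratio(X_normal, X_fault)  # variable 1 stands out
```

This captures the key contrast with additive faults: a multiplicative fault changes the spread of a variable rather than its mean, so mean-shift statistics alone would miss it.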

  16. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar Brunborg

    2014-10-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  17. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Hebel, S.; Hands, J.; Goering, F.; Kirchner, G.; Purtschert, R.

    2015-01-01

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. Prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and the application of Kr-85 measurements to novel Safeguards procedures. (author)

  18. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    Science.gov (United States)

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression capturing global and local information to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.

  19. Feasibility of automated speech sample collection with stuttering children using interactive voice response (IVR) technology.

    Science.gov (United States)

    Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena

    2015-04-01

    To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered, and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability, in terms of intra-class correlation, between the video and telephone acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.
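Agreement between the two recording channels was judged with intra-class correlation. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater, in the Shrout-Fleiss formulation) for an n subjects by k raters matrix, with made-up ratings:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()  # subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()  # raters
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# hypothetical severity ratings: column 0 = video, column 1 = telephone
ratings = np.array([[3.0, 3.0], [5.0, 5.0], [7.0, 7.0]])
icc = icc_2_1(ratings)  # perfect agreement between channels gives ICC = 1
```

ICC(2,1) penalizes both random disagreement and a systematic offset between channels, which is what "absolute agreement" buys over a plain correlation here.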

  20. Current status and future prospects of an automated sample exchange system PAM for protein crystallography

    Science.gov (United States)

    Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.

    2013-03-01

    To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) for sample exchange robots at PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.

  1. Oak ridge national laboratory automated clean chemistry for bulk analysis of environmental swipe samples

    Energy Technology Data Exchange (ETDEWEB)

    Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples—currently performed by overburdened chemists—with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods led to a reduction in the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in a series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns, as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with 233U and 244Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs

  2. Comparison and clinical utility evaluation of four multiple allergen simultaneous tests including two newly introduced fully automated analyzers

    Directory of Open Access Journals (Sweden)

    John Hoon Rim

    2016-04-01

    Background: We compared the diagnostic performances of two newly introduced fully automated multiple allergen simultaneous test (MAST) analyzers with two conventional MAST assays. Methods: The serum samples from a total of 53 and 104 patients were tested for food panels and inhalant panels, respectively, in four analyzers: AdvanSure AlloScreen (LG Life Science, Korea), AdvanSure Allostation Smart II (LG Life Science), PROTIA Allergy-Q (ProteomeTech, Korea), and RIDA Allergy Screen (R-Biopharm, Germany). We compared not only the total agreement percentages but also positive propensities among the four analyzers. Results: Evaluation of AdvanSure Allostation Smart II as an upgraded version of AdvanSure AlloScreen revealed good concordance, with total agreement percentages of 93.0% and 92.2% in the food and inhalant panels, respectively. Comparisons of AdvanSure Allostation Smart II or PROTIA Allergy-Q with RIDA Allergy Screen also showed good concordance, with positive propensities of the two new analyzers for common allergens (Dermatophagoides farinae and Dermatophagoides pteronyssinus). Changes of the cut-off level resulted in various total agreement percentage fluctuations among allergens by different analyzers, although the current cut-off level of class 2 appeared to be generally suitable. Conclusions: AdvanSure Allostation Smart II and PROTIA Allergy-Q presented favorable agreement with RIDA Allergy Screen, although positive propensities were noticed for common allergens. Keywords: Multiple allergen simultaneous test, Automated analyzer

  3. Automated high-resolution NMR with a sample changer

    International Nuclear Information System (INIS)

    Wade, C.G.; Johnson, R.D.; Philson, S.B.; Strouse, J.; McEnroe, F.J.

    1989-01-01

    Within the past two years, it has become possible to obtain high-resolution NMR spectra using automated commercial instrumentation. Software control of all spectrometer functions has reduced most of the tedious manual operations to typing a few computer commands or even making selections from a menu. Addition of an automatic sample changer is the next natural step in improving efficiency and sample throughput; it has a significant (and even unexpected) impact on how NMR laboratories are run and how it is taught. Such an instrument makes even sophisticated experiments routine, so that people with no previous exposure to NMR can run these experiments after a training session of an hour or less. This A/C Interface examines the impact of such instrumentation on both the academic and the industrial laboratory

  4. Automated Classification of Seedlings Using Computer Vision

    DEFF Research Database (Denmark)

    Dyrmann, Mads; Christiansen, Peter

    The objective of this project is to investigate the possibilities of recognizing plant species at multiple growth stages based on RGB images. Plants and leaves are initially segmented from a database through a partly automated procedure providing samples of 2438 plants and 4767 leaves distributed...

  5. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. 
The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  6. Automated sample mounting and alignment system for biological crystallography at a synchrotron source

    International Nuclear Information System (INIS)

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C.; Earnest, Thomas

    2004-01-01

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples, stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on 'puck-shaped' cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The capabilities of a robotic crystal mounting and alignment system with instrumentation control software and a relational database allows for automated screening and data collection to be developed

  7. Automated whole-genome multiple alignment of rat, mouse, and human

    Energy Technology Data Exchange (ETDEWEB)

    Brudno, Michael; Poliakov, Alexander; Salamov, Asaf; Cooper, Gregory M.; Sidow, Arend; Rubin, Edward M.; Solovyev, Victor; Batzoglou, Serafim; Dubchak, Inna

    2004-07-04

    We have built a whole genome multiple alignment of the three currently available mammalian genomes using a fully automated pipeline which combines the local/global approach of the Berkeley Genome Pipeline and the LAGAN program. The strategy is based on progressive alignment, and consists of two main steps: (1) alignment of the mouse and rat genomes; and (2) alignment of human to either the mouse-rat alignments from step 1, or the remaining unaligned mouse and rat sequences. The resulting alignments demonstrate high sensitivity, with 87% of all human gene-coding areas aligned in both mouse and rat. The specificity is also high: <7% of the rat contigs are aligned to multiple places in human and 97% of all alignments with human sequence > 100kb agree with a three-way synteny map built independently using predicted exons in the three genomes. At the nucleotide level <1% of the rat nucleotides are mapped to multiple places in the human sequence in the alignment; and 96.5% of human nucleotides within all alignments agree with the synteny map. The alignments are publicly available online, with visualization through the novel Multi-VISTA browser that we also present.

  8. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    Science.gov (United States)

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination through advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
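The instrument-time saving comes from recording only a fraction of the indirect-dimension increments. A uniform-random schedule is the simplest illustration; production NUS schedules typically use weighted or Poisson-gap sampling instead, so treat this as a toy sketch:

```python
import random

def nus_schedule(n_increments, fraction, seed=1):
    """Pick which indirect-dimension increments to acquire,
    uniformly at random, and return them in acquisition order."""
    rng = random.Random(seed)
    n_keep = int(n_increments * fraction)
    return sorted(rng.sample(range(n_increments), n_keep))

schedule = nus_schedule(128, 1 / 3)  # ~3x saving: 42 of 128 increments recorded
```

The skipped increments are later filled in by non-Fourier reconstruction, which is why the sampled data remain amenable to the same automated analysis as uniformly sampled spectra.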

  9. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    Science.gov (United States)

    Habershon, Scott

    2016-04-12

    In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enable determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles.
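Direct kinetic simulation of a sampled network means integrating the mass-action ODEs built from the computed rate constants. A toy two-step catalytic cycle (not the actual hydroformylation network, and with invented rate constants) shows the idea, including the catalyst-conservation check:

```python
import numpy as np
from scipy.integrate import solve_ivp

# toy cycle: S + Cat -> Int (rate constant k1), Int -> P + Cat (k2)
k1, k2 = 2.0, 1.0

def rhs(t, y):
    """Mass-action rate equations for [S, Cat, Int, P]."""
    s, cat, inter, p = y
    r1 = k1 * s * cat
    r2 = k2 * inter
    return [-r1, -r1 + r2, r1 - r2, r2]

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.1, 0.0, 0.0], rtol=1e-8, atol=1e-10)
s, cat, inter, p = sol.y[:, -1]
# catalyst is conserved around the cycle: cat + inter stays at 0.1
```

When the catalyst is saturated (k1*s much larger than k2), the overall turnover rate becomes nearly independent of substrate concentration; that kind of nontrivial concentration dependence is exactly what the authors extract as a phenomenological rate law without steady-state approximations.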

  10. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    Full text: The analysis of active samples on a regular basis for ambient air activity and floor contamination from the radiochemical lab accounts for a major chunk of the operational activity under the Health Physicist's responsibility. The requirement for daily air-sample analysis with immediate and delayed counting from various labs, in addition to smear-swipe-check samples from the labs, led to the development of a system that could perform multiple sample analyses in a time-programmed manner from a single sample loading. A multiple alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in ordered slots, which are counted in a time-programmed manner with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has resulted in a reduction of the man-hours consumed in counting and recording the results
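The time-programmed, single-loading sequence described above can be sketched as a simple scheduler stepping through the loaded slots. The slot structure and the `position_sample`/`acquire_counts` interfaces below are hypothetical stand-ins for the instrument's hardware calls, not the setup's actual software:

```python
from dataclasses import dataclass

@dataclass
class SampleSlot:
    """One of the 10 sample positions in the counting setup."""
    slot_id: int
    count_seconds: float   # counting time programmed by the user
    counts: int = -1       # filled in after counting

def run_counting_sequence(slots, position_sample, acquire_counts):
    """Step through the loaded slots in order, counting each one for its
    programmed time, and return (slot_id, counts) records for the PC log."""
    records = []
    for slot in slots:
        position_sample(slot.slot_id)              # move the slot under the detector
        slot.counts = acquire_counts(slot.count_seconds)
        records.append((slot.slot_id, slot.counts))
    return records
```

A real controller would add the timing corrections the abstract quantifies (1.6 ms transfer error); this sketch only captures the sequencing.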

  12. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure...... control, UV absorbance measurements and automated data analysis. As little as 15 μl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis...

  13. Automated aerosol sampling and analysis for the Comprehensive Test Ban Treaty

    International Nuclear Information System (INIS)

    Miley, H.S.; Bowyer, S.M.; Hubbard, C.W.; McKinnon, A.D.; Perkins, R.W.; Thompson, R.C.; Warner, R.A.

    1998-01-01

    Detecting nuclear debris from a nuclear weapon exploded in or substantially vented to the Earth's atmosphere constitutes the most certain indication that a violation of the Comprehensive Test Ban Treaty has occurred. For this reason, a radionuclide portion of the International Monitoring System is being designed and implemented. The IMS will monitor aerosols and gaseous xenon isotopes to detect atmospheric and underground tests, respectively. An automated system, the Radionuclide Aerosol Sampler/Analyzer (RASA), has been developed at Pacific Northwest National Laboratory to meet CTBT aerosol measurement requirements. This is achieved by the use of a novel sampling apparatus, a high-resolution germanium detector, and very sophisticated software. This system draws a large volume of air (∼20,000 m³/day), performs automated gamma-ray spectral measurements meeting the minimum detectable concentration (MDC) requirement for ¹⁴⁰Ba, and communicates this and other data to a central data facility. Automated systems offer the added benefit of rigid controls, easily implemented QA/QC procedures, and centralized depot maintenance and operation. Other types of automated communication include pull or push transmission of State-Of-Health data, commands, and configuration data. In addition, a graphical user interface, Telnet, and other interactive communications are supported over ordinary phone or network lines. This system has been the subject of a USAF commercialization effort to meet US CTBT monitoring commitments. It will also be available to other CTBT signatories and the monitoring community for various governmental, environmental, or commercial needs. The current status of the commercialization is discussed

  14. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    Science.gov (United States)

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
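The core discrepancy check, comparing the OCR'd name on the customer label against the patient name in the LIS and routing any mismatch to human inspection, can be sketched as below. The normalization rule (strip case, punctuation and spacing before comparing) is an illustrative assumption, not the system's actual matching algorithm:

```python
import re

def normalize_name(name: str) -> str:
    """Collapse case, punctuation and spacing so that trivial formatting
    differences ('SMITH, JOHN' vs 'Smith,John') do not raise a flag."""
    return re.sub(r"[^A-Z]", "", name.upper())

def flag_for_review(lis_name: str, label_name: str) -> bool:
    """True when the OCR'd label name disagrees with the LIS name,
    i.e. the tube must be routed to human inspection."""
    return normalize_name(lis_name) != normalize_name(label_name)
```

Note that this comparison would still flag the spelling discrepancies the validation found (e.g. 'SMYTH' vs 'SMITH'), which matches the paper's design of letting humans adjudicate every discrepancy rather than auto-passing near matches.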

  15. Automation of Sample Transfer and Counting on Fast Neutron Activation System

    International Nuclear Information System (INIS)

    Dewita; Budi-Santoso; Darsono

    2000-01-01

    The automation of sample transfer and counting covers the transfer of the sample to the activation and counting positions, which had previously been operated manually by switch and was subsequently developed into automatically programmed logic instructions. The development consisted of constructing the electronics hardware and the software for this communication. Transfer times are measured in seconds and are handled automatically with an error of 1.6 ms. The counting and activation times are set by the user in seconds and minutes; the execution error on minutes is 8.2 ms. This developed system will make it possible to measure short half-life elements and to run cyclic activation processes. (author)

  16. Automated, feature-based image alignment for high-resolution imaging mass spectrometry of large biological samples

    NARCIS (Netherlands)

    Broersen, A.; Liere, van R.; Altelaar, A.F.M.; Heeren, R.M.A.; McDonnell, L.A.

    2008-01-01

    High-resolution imaging mass spectrometry of large biological samples is the goal of several research groups. In mosaic imaging, the most common method, the large sample is divided into a mosaic of small areas that are then analyzed with high resolution. Here we present an automated alignment

  17. Novel diffusion cell for in vitro transdermal permeation, compatible with automated dynamic sampling

    NARCIS (Netherlands)

    Bosman, I.J; Lawant, A.L; Avegaart, S.R.; Ensing, K; de Zeeuw, R.A

    The development of a new diffusion cell for in vitro transdermal permeation is described. The so-called Kelder cells were used in combination with the ASPEC system (Automatic Sample Preparation with Extraction Columns), which is designed for the automation of solid-phase extractions (SPE). Instead

  18. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    Science.gov (United States)

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time-consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision, a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min, about 30 times faster than manual solvent extraction or single-cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high-performance liquid chromatography/mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  19. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    DEFF Research Database (Denmark)

    Qiao, Jixin

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development...... and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples...... for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including...

  20. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat...

  1. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  2. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    Automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to a different known thickness. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two-beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar⁺ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
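The final matching step, picking the simulated thickness whose pattern is most similar to the experimental one, can be sketched with a normalised least-squares comparison over stacked disk-intensity profiles. The similarity metric here is an illustrative assumption; the paper's own disk-by-disk comparison is more elaborate:

```python
import numpy as np

def estimate_thickness(experimental, simulated_patterns, thicknesses):
    """Return the thickness whose simulated CBED pattern minimises the
    sum of squared differences to the experimental pattern, after
    normalising each pattern to unit intensity norm."""
    exp = np.asarray(experimental, dtype=float)
    exp = exp / np.linalg.norm(exp)
    errors = []
    for sim in simulated_patterns:
        s = np.asarray(sim, dtype=float)
        s = s / np.linalg.norm(s)        # remove overall intensity scaling
        errors.append(np.sum((exp - s) ** 2))
    return thicknesses[int(np.argmin(errors))]
```

In practice the experimental and simulated patterns must first share a coordinate system, which is exactly what the disk detection and localisation steps in the abstract provide.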

  3. Advantages of automation in plasma sample preparation prior to HPLC/MS/MS quantification: application to the determination of cilazapril and cilazaprilat in a bioequivalence study.

    Science.gov (United States)

    Kolocouri, Filomila; Dotsikas, Yannis; Apostolou, Constantinos; Kousoulos, Constantinos; Soumelas, Georgios-Stefanos; Loukas, Yannis L

    2011-01-01

    An HPLC/MS/MS method characterized by complete automation and high throughput was developed for the determination of cilazapril and its active metabolite cilazaprilat in human plasma. All sample preparation and analysis steps were performed by using 2.2 mL 96 deep-well plates, while robotic liquid handling workstations were utilized for all liquid transfer steps, including liquid-liquid extraction. The whole procedure was very fast compared to a manual procedure with vials and no automation. The method also had a very short chromatographic run time of 1.5 min. Sample analysis was performed by RP-HPLC/MS/MS with positive electrospray ionization using multiple reaction monitoring. The calibration curve was linear in the range of 0.500-300 and 0.250-150 ng/mL for cilazapril and cilazaprilat, respectively. The proposed method was fully validated and proved to be selective, accurate, precise, reproducible, and suitable for the determination of cilazapril and cilazaprilat in human plasma. Therefore, it was applied to a bioequivalence study after per os administration of 2.5 mg tablet formulations of cilazapril.

  4. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    Science.gov (United States)

    Fischer, Jesse R.; Quist, Michael C.

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., two to four) used in optimal seasons were not present. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.
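The finding that a few well-timed gears capture most of the assemblage can be explored with a greedy gear-season selection over the species each combination detects. This is a sketch of the idea (greedy set cover), not the authors' analysis, and the combination names and species sets below are invented:

```python
def greedy_gear_selection(catches, k):
    """Greedily pick the k gear-season combinations that add the most
    new species. `catches` maps a combination name to the set of species
    it detected; returns (chosen combinations, species covered)."""
    chosen, covered = [], set()
    remaining = dict(catches)
    for _ in range(min(k, len(remaining))):
        # Pick the combination contributing the most not-yet-seen species.
        best = max(remaining, key=lambda c: len(remaining[c] - covered))
        chosen.append(best)
        covered |= remaining.pop(best)
    return chosen, covered
```

Running this on real detection data would reproduce the kind of statement in the abstract: how much of total richness a small subset of gear-season combinations accounts for.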

  5. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  6. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-01-01

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper

  7. Sample size determination in clinical trials with multiple endpoints

    CERN Document Server

    Sozu, Takashi; Hamasaki, Toshimitsu; Evans, Scott R

    2015-01-01

    This book integrates recent methodological developments for calculating the sample size and power in trials with more than one endpoint considered as multiple primary or co-primary, offering an important reference work for statisticians working in this area. The determination of sample size and the evaluation of power are fundamental and critical elements in the design of clinical trials. If the sample size is too small, important effects may go unnoticed; if the sample size is too large, it represents a waste of resources and unethically puts more participants at risk than necessary. Recently many clinical trials have been designed with more than one endpoint considered as multiple primary or co-primary, creating a need for new approaches to the design and analysis of these clinical trials. The book focuses on the evaluation of power and sample size determination when comparing the effects of two interventions in superiority clinical trials with multiple endpoints. Methods for sample size calculation in clin...
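For co-primary endpoints, the trial must succeed on every endpoint, so the joint power depends on the correlation between the endpoint test statistics. A Monte Carlo sketch of that evaluation is below; the bivariate-normal model for the two z-statistics is a standard simplification for illustration, not a method quoted from the book:

```python
from statistics import NormalDist

import numpy as np

def co_primary_power(n, delta1, delta2, rho, alpha=0.025, sims=50000, seed=7):
    """Simulated power to succeed on BOTH co-primary endpoints, each
    tested with a one-sided z-test at level alpha. delta1/delta2 are
    standardised effect sizes, n is the per-arm sample size, and rho is
    the correlation between the two endpoint test statistics."""
    zcrit = NormalDist().inv_cdf(1 - alpha)
    rng = np.random.default_rng(seed)
    # Two-arm design: each z-statistic is centred at delta * sqrt(n / 2).
    means = [delta1 * np.sqrt(n / 2.0), delta2 * np.sqrt(n / 2.0)]
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal(means, cov, size=sims)
    return float(np.mean((z[:, 0] > zcrit) & (z[:, 1] > zcrit)))
```

The simulation reproduces the qualitative points the book formalises: joint power is below each marginal power, rises with the endpoint correlation, and drives the sample size up relative to a single-endpoint trial.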

  8. Automating ActionScript Projects with Eclipse and Ant

    CERN Document Server

    Koning, Sidney

    2011-01-01

    Automating repetitive programming tasks is easier than many Flash/AS3 developers think. With the Ant build tool, the Eclipse IDE, and this concise guide, you can set up your own "ultimate development machine" to code, compile, debug, and deploy projects faster. You'll also get started with versioning systems, such as Subversion and Git. Create a consistent workflow for multiple machines, or even complete departments, with the help of extensive Ant code samples. If you want to work smarter and take your skills to a new level, this book will get you on the road to automation with Ant. Set up y

  9. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling

    Directory of Open Access Journals (Sweden)

    Sergi Valverde

    2015-01-01

    Full Text Available Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w Multiple Sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the % of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. First of all, images were processed using the LST and SLS toolkits with different pipeline combinations that differed in either automated or manual lesion segmentation, and lesion filling or masking out lesions. Then, images processed following each of the pipelines were segmented into gray matter (GM) and white matter (WM) using SPM8, and compared with the same images where expert lesion annotations were filled before segmentation. Our results showed that fully automated lesion segmentation and filling pipelines significantly reduced the % of error in GM and WM volume on images of MS patients, and performed similarly to the images where expert lesion annotations were masked before segmentation. In all the pipelines, the amount of misclassified lesion voxels was the main cause of the observed error in GM and WM volume. However, the % of error was significantly lower when automatically estimated lesions were filled rather than masked before segmentation. These results are relevant and suggest that the LST and SLS toolboxes allow accurate brain tissue volume measurements without any kind of manual intervention, which can be convenient not only in terms of time and economic costs, but also to avoid the inherent intra-/inter-rater variability between manual annotations.
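The % of error underlying such comparisons, a pipeline's tissue volumes against the volumes from the expert-filled reference segmentation, can be sketched directly from the label maps. The label codes (1 = GM, 2 = WM) are illustrative, not those of SPM8:

```python
import numpy as np

def tissue_volumes(label_map, voxel_ml):
    """GM/WM volume (ml) from a labelled image where 1 = GM and 2 = WM
    (illustrative codes): voxel count per tissue times voxel volume."""
    return {tissue: float(np.sum(label_map == code) * voxel_ml)
            for tissue, code in (("GM", 1), ("WM", 2))}

def percent_error(auto_map, ref_map, voxel_ml):
    """Percent volume error of an automated segmentation against the
    expert-filled reference, per tissue class."""
    auto = tissue_volumes(auto_map, voxel_ml)
    ref = tissue_volumes(ref_map, voxel_ml)
    return {t: 100.0 * abs(auto[t] - ref[t]) / ref[t] for t in auto}
```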

  10. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    Science.gov (United States)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  11. Automated cell counts on CSF samples: A multicenter performance evaluation of the GloCyte system.

    Science.gov (United States)

    Hod, E A; Brugnara, C; Pilichowska, M; Sandhaus, L M; Luu, H S; Forest, S K; Netterwald, J C; Reynafarje, G M; Kratz, A

    2018-02-01

    Automated cell counters have replaced manual enumeration of cells in blood and most body fluids. However, due to the unreliability of automated methods at very low cell counts, most laboratories continue to perform labor-intensive manual counts on many or all cerebrospinal fluid (CSF) samples. This multicenter clinical trial investigated if the GloCyte System (Advanced Instruments, Norwood, MA), a recently FDA-approved automated cell counter, which concentrates and enumerates red blood cells (RBCs) and total nucleated cells (TNCs), is sufficiently accurate and precise at very low cell counts to replace all manual CSF counts. The GloCyte System concentrates CSF and stains RBCs with fluorochrome-labeled antibodies and TNCs with nucleic acid dyes. RBCs and TNCs are then counted by digital image analysis. Residual adult and pediatric CSF samples obtained for clinical analysis at five different medical centers were used for the study. Cell counts were performed by the manual hemocytometer method and with the GloCyte System following the same protocol at all sites. The limits of the blank, detection, and quantitation, as well as precision and accuracy of the GloCyte, were determined. The GloCyte detected as few as 1 TNC/μL and 1 RBC/μL, and reliably counted as low as 3 TNCs/μL and 2 RBCs/μL. The total coefficient of variation was less than 20%. Comparison with cell counts obtained with a hemocytometer showed good correlation (>97%) between the GloCyte and the hemocytometer, including at very low cell counts. The GloCyte instrument is a precise, accurate, and stable system to obtain red cell and nucleated cell counts in CSF samples. It allows for the automated enumeration of even very low cell numbers, which is crucial for CSF analysis. These results suggest that GloCyte is an acceptable alternative to the manual method for all CSF samples, including those with normal cell counts. © 2017 John Wiley & Sons Ltd.
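The limits of blank and detection determined in such validations are conventionally derived from replicate measurements. A minimal sketch following the common CLSI-style formulas is below; the GloCyte validation protocol itself may differ in detail:

```python
import statistics

def limit_of_blank(blank_counts):
    """Limit of blank: mean of blank replicates plus 1.645 SD, i.e. the
    ~95th percentile of blank results under a normal assumption."""
    return statistics.mean(blank_counts) + 1.645 * statistics.stdev(blank_counts)

def limit_of_detection(blank_counts, low_counts):
    """Limit of detection: LoB plus 1.645 SD of replicates of a
    low-concentration sample, so a true low sample rarely reads as blank."""
    return limit_of_blank(blank_counts) + 1.645 * statistics.stdev(low_counts)
```

With counts this low (a few cells/μL), these limits dominate the decision of whether an automated counter can replace manual hemocytometer counts.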

  12. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.

  13. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management.

  14. Monte Carlo sampling of fission multiplicity.

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J. S. (John S.)

    2004-01-01

    Two new methods have been developed for fission multiplicity modeling in Monte Carlo calculations. The traditional method of sampling neutron multiplicity from fission is to sample the number of neutrons above or below the average. For example, if there are 2.7 neutrons per fission, three would be chosen 70% of the time and two would be chosen 30% of the time. For many applications, particularly ³He coincidence counting, a better estimate of the true number of neutrons per fission is required. Generally, this number is estimated by sampling a Gaussian distribution about the average. However, because the tail of the Gaussian distribution is negative and negative neutrons cannot be produced, a slight positive bias can be found in the average value. For criticality calculations, the result of rejecting the negative neutrons is an increase in k-eff of 0.1% in some cases. For spontaneous fission, where the average number of neutrons emitted from fission is low, the error also can be unacceptably large. If the Gaussian width approaches the average number of neutrons per fission, 10% too many fission neutrons are produced by not treating the negative Gaussian tail adequately. The first method to treat the Gaussian tail is to determine a correction offset, which then is subtracted from all sampled values of the number of neutrons produced. This offset depends on the average value for any given fission at any energy and must be computed efficiently at each fission from the non-integrable error function. The second method is to determine a corrected zero point so that all neutrons sampled between zero and the corrected zero point are killed to compensate for the negative Gaussian tail bias. Again, the zero point must be computed efficiently at each fission. Both methods give excellent results with a negligible computing time penalty. It is now possible to include the full effects of fission multiplicity without the negative Gaussian tail bias.
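The two sampling schemes contrasted in this abstract can be illustrated as follows. This is a sketch, not the production Monte Carlo code: fractional sampling about the mean preserves the average exactly, while naively clamping a Gaussian's negative tail at zero biases the mean upward, most visibly in the extreme case noted above where the Gaussian width approaches the mean.

```python
import math
import random

def sample_fractional(nu_bar, rng):
    """Traditional method: take the integer below or above the mean so the
    average is preserved, e.g. nu_bar = 2.7 gives 3 with probability 0.7
    and 2 with probability 0.3."""
    lo = math.floor(nu_bar)
    return lo + (1 if rng.random() < nu_bar - lo else 0)

def sample_gaussian_clamped(nu_bar, width, rng):
    """Naive Gaussian sampling: clamping negative draws at zero discards the
    negative tail and biases the sampled mean above nu_bar."""
    return max(0, round(rng.gauss(nu_bar, width)))

rng = random.Random(42)
n = 200_000
mean_frac = sum(sample_fractional(2.7, rng) for _ in range(n)) / n
# Extreme case from the abstract: Gaussian width comparable to the mean
mean_gauss = sum(sample_gaussian_clamped(2.7, 2.7, rng) for _ in range(n)) / n
print(mean_frac, mean_gauss)  # mean_frac stays near 2.7; mean_gauss runs high
```

The offset and corrected-zero-point methods described in the abstract both amount to shifting the sampling so that the clamped distribution recovers the correct mean.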

  15. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached to a cassette); the airflow through the filter is 800 m³/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample, and after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1–10 × 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept suits well for nuclear material safeguards, too.

  16. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  17. Sample Handling and Processing on Mars for Future Astrobiology Missions

    Science.gov (United States)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes especially when detecting low concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time, resource and manpower consuming in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, namely the Automated Sample Processing System (ASPS) receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers sample to multiple instruments for analysis (including for non-organic soluble species).

  18. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility – High throughput sample evaluation and automation

    International Nuclear Information System (INIS)

    Theveneau, P; Baker, R; Barrett, R; Beteva, A; Bowler, M W; Carpentier, P; Caserotto, H; Sanctis, D de; Dobias, F; Flot, D; Guijarro, M; Giraud, T; Lentini, M; Leonard, G A; Mattenet, M; McSweeney, S M; Morawe, C; Nurizzo, D; McCarthy, A A; Nanao, M

    2013-01-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This 'first generation' of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  19. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  20. Long-term monitoring of soil gas fluxes with closed chambers using automated and manual systems

    Energy Technology Data Exchange (ETDEWEB)

    Scott, A.; Crichton, I.; Ball, B.C.

    1999-10-01

    The authors describe two gas sample collection techniques, each of which is used in conjunction with custom-made automated or manually operated closed chambers. The automated system allows automatic collection of gas samples for simultaneous analysis of multiple trace gas efflux from soils, permitting long-term monitoring. Since the manual system is cheaper to produce, it can be replicated more extensively than the automated system and used to estimate the spatial variability of soil fluxes. The automated chamber covers a soil area of 0.5 m² and has a motor-driven lid that remains operational throughout a range of weather conditions. Both systems use gas-tight containers of robust metal construction, which give good sample retention, thereby allowing long-term storage and convenience of transport from remote locations. The containers in the automated system are filled by pumping gas from the closed chamber via a multiway rotary valve. Stored samples from both systems are analyzed simultaneously for N₂O and CO₂ using automated injection into laboratory-based gas chromatographs. The use of both collection systems is illustrated by results from a field experiment on sewage sludge disposal to land where N₂O fluxes were high. The automated gas sampling system permitted quantification of the marked temporal variability of concurrent N₂O and CO₂ fluxes and allowed improved estimation of cumulative fluxes. The automated measurement approach yielded higher estimates of cumulative flux because integration of manual point-in-time observations missed a number of transient high-flux events.
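The closing point of this record, that integrating sparse point-in-time observations can miss transient high-flux events, is easy to illustrate numerically. A sketch with a synthetic flux series (hypothetical numbers, trapezoidal integration):

```python
import math

def cumulative_flux(times_h, fluxes):
    """Trapezoidal integration of point-in-time flux observations."""
    total = 0.0
    for i in range(1, len(times_h)):
        total += 0.5 * (fluxes[i] + fluxes[i - 1]) * (times_h[i] - times_h[i - 1])
    return total

# Synthetic N2O flux (arbitrary units per hour): baseline 0.1 plus a short
# emission pulse peaking at 2.0 around t = 50 h (e.g. after rain on sludge)
def flux(t):
    return 0.1 + 2.0 * math.exp(-(((t - 50.0) / 3.0) ** 2))

hourly = [float(t) for t in range(169)]   # automated: hourly samples over 7 d
manual = [0.0, 84.0, 168.0]               # manual: three point-in-time visits

auto_total = cumulative_flux(hourly, [flux(t) for t in hourly])
manual_total = cumulative_flux(manual, [flux(t) for t in manual])
print(auto_total, manual_total)  # the sparse manual estimate misses the pulse
```

With these numbers the manual visits happen to fall outside the pulse entirely, so the sparse estimate reflects only the baseline, mirroring the underestimation the abstract reports.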

  1. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  2. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  3. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  4. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    Science.gov (United States)

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  5. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Franke, Daniel; Kikhney, Alexey G.; Svergun, Dmitri I.

    2012-01-01

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  6. A novel flow injection chemiluminescence method for automated and miniaturized determination of phenols in smoked food samples.

    Science.gov (United States)

    Vakh, Christina; Evdokimova, Ekaterina; Pochivalov, Aleksei; Moskvin, Leonid; Bulatov, Andrey

    2017-12-15

    An easily performed, fully automated and miniaturized flow injection chemiluminescence (CL) method for the determination of phenols in smoked food samples has been proposed. This method includes ultrasound-assisted solid-liquid extraction coupled with gas-diffusion separation of phenols from the smoked food sample and analyte absorption into a NaOH solution in a specially designed gas-diffusion cell. The flow system was designed with a focus on automation and miniaturization, with minimal sample and reagent consumption and inexpensive instrumentation. The luminol–N-bromosuccinimide system in an alkaline medium was used for the CL determination of phenols. The limit of detection of the proposed procedure was 3·10⁻⁸ mol L⁻¹ (0.01 mg kg⁻¹) in terms of phenol. The presented method was demonstrated to be a good tool for easy, rapid and cost-effective point-of-need screening of phenols in smoked food samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    Science.gov (United States)

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. © The Author(s) 2016.
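The "center of gravity" peak picking this record builds on can be sketched as follows. This is an illustrative implementation on a hypothetical spectrum, not the MicroLab algorithm: locate local maxima above a threshold, then refine each peak position as the intensity-weighted centroid of the surrounding points.

```python
def centroid_peaks(wavenumbers, absorbance, threshold=0.1, half_window=2):
    """Find local maxima above a threshold, then refine each position as the
    intensity-weighted centroid ('center of gravity') of neighbouring points."""
    peaks = []
    for i in range(1, len(absorbance) - 1):
        a = absorbance[i]
        if a > threshold and a >= absorbance[i - 1] and a > absorbance[i + 1]:
            lo = max(0, i - half_window)
            hi = min(len(absorbance), i + half_window + 1)
            weight = sum(absorbance[lo:hi])
            center = sum(w * x for w, x in zip(absorbance[lo:hi], wavenumbers[lo:hi]))
            peaks.append(center / weight)
    return peaks

# Hypothetical IR trace with two bands near 1712 and 1730 cm^-1
wn = [1700 + 2 * i for i in range(21)]
ab = [0.02] * 21
ab[5], ab[6], ab[7] = 0.4, 0.9, 0.5
ab[14], ab[15], ab[16] = 0.3, 0.7, 0.35
peaks = centroid_peaks(wn, ab)
print(peaks)
```

An automated decision layer of the kind described would then compare the refined peak positions against a table of characteristic cocaine bands and report a confidence level only when enough of them match.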

  8. Automated high-throughput flow-through real-time diagnostic system

    Science.gov (United States)

    Regan, John Frederick

    2012-10-30

    An automated real-time flow-through system capable of processing multiple samples in an asynchronous, simultaneous, and parallel fashion for nucleic acid extraction and purification, followed by assay assembly, genetic amplification, multiplex detection, analysis, and decontamination. The system is able to hold and access an unlimited number of fluorescent reagents that may be used to screen samples for the presence of specific sequences. The apparatus works by associating extracted and purified sample with a series of reagent plugs that have been formed in a flow channel and delivered to a flow-through real-time amplification detector that has a multiplicity of optical windows, to which the sample-reagent plugs are placed in an operative position. The diagnostic apparatus includes sample multi-position valves, a master sample multi-position valve, a master reagent multi-position valve, reagent multi-position valves, and an optical amplification/detection system.

  9. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by a computer sample workflow control (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain), to this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples working as an intermediate station. The only difference with TLA is the replacement of transport belts by personnel of the laboratory. The implementation of this virtual automation system has allowed us the achievement of the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  10. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Science.gov (United States)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  11. Fully automated gamma spectrometry gauge observing possible radioactive contamination of melting-shop samples

    International Nuclear Information System (INIS)

    Kroos, J.; Westkaemper, G.; Stein, J.

    1999-01-01

    At Salzgitter AG, several monitoring systems have been installed to check scrap transported by rail and by road. At present, scrap arriving by ship is reloaded onto wagons and monitored afterwards. In the future, a detection system will be mounted on a crane for a direct check of the scrap upon departure of the ship. Furthermore, at the Salzgitter AG Central Chemical Laboratory, a fully automated gamma spectrometry gauge has been installed to detect possible radioactive contamination of the products. The gamma spectrometer is integrated into the automated OE spectrometry line and tests melting-shop samples after the OE spectrometry has been performed. With this technique, the specific activity of selected nuclides and the dose rate are determined. The activity observation is part of the release procedure. The corresponding measurement data are stored in a database for quality management reasons. (author)

  12. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    Energy Technology Data Exchange (ETDEWEB)

    Kertesz, Vilmos [ORNL; Van Berkel, Gary J [ORNL

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.
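The co-registration step described here, mapping each sampled spot's position to its HPLC-MS/MS response, can be sketched with a simple grid builder (hypothetical spot coordinates and peak areas; the in-house software itself is not described in detail):

```python
def build_heatmap(records, nx, ny, step_mm=1.0):
    """Co-register (x_mm, y_mm, intensity) surface-sampling records onto an
    nx-by-ny grid at step_mm resolution; unsampled cells stay None."""
    grid = [[None] * nx for _ in range(ny)]
    for x_mm, y_mm, intensity in records:
        col = int(round(x_mm / step_mm))
        row = int(round(y_mm / step_mm))
        if 0 <= row < ny and 0 <= col < nx:
            grid[row][col] = intensity
    return grid

# Hypothetical MRM peak areas for the parent drug at three sampled spots
records = [(0.0, 0.0, 1.2e5), (1.0, 0.0, 3.4e5), (2.0, 1.0, 8.0e4)]
hm = build_heatmap(records, nx=3, ny=2)
print(hm)
```

With 1 mm spot spacing, as in the study, each grid cell corresponds to one droplet-sampled location, and the filled grid can be rendered directly as a heatmap for each monitored compound.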

  13. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australia antigen by radioimmunoassay are discussed. Samples were processed as follows: blood serum was dispensed by an automated sampler into test tubes and then incubated under controlled time and temperature; the first counting was omitted; labelled antibody was dispensed into the serum after washing; samples were incubated and then centrifuged; radioactivities in the precipitate were counted by an auto-well counter; and measurements were tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  14. Multiple biopsy probe sampling enabled minimally invasive electrical impedance tomography

    International Nuclear Information System (INIS)

    Shini, Mohanad; Rubinsky, Boris

    2008-01-01

    Biopsies are a reliable method for examining tissues and organs inside the body, in particular for detection of tumors. However, a single biopsy produces only limited information on the site from which it is taken. Therefore, tumor detection now employs multiple biopsy samplings to examine larger volumes of tissue. Nevertheless, even with multiple biopsies, the information remains discrete, while the costs of biopsy increase. Here we propose and evaluate the feasibility of using minimally invasive medical imaging as a means to overcome the limitations of discrete biopsy sampling. The minimally invasive medical imaging technique employs the biopsy probe as electrodes for measurements of electrical impedance tomography relevant data during each biopsy sampling. The data from multiple samplings are combined and used to produce an EIT image of the tissue. Two- and three-dimensional mathematical simulations confirm that the minimally invasive medical imaging technique can produce electrical impedance tomography images of the tissues between the biopsy probe insertion sites. We show that these images can detect tumors that would be missed with multiple biopsy samplings only, and that the technique may facilitate the detection of tumors with fewer biopsies, thereby reducing the cost of cancer detection.

  15. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic samplin...
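One common formulation of linear systematic sampling with multiple random starts (after Gautschi, 1957) can be sketched as follows; the exact scheme used in the paper may differ:

```python
import random

def multi_start_systematic(N, n, m, rng):
    """Linear systematic sampling with m random starts (one common formulation,
    after Gautschi 1957). Assumes N = n*k and that m divides n: draw m distinct
    starts in [0, m*k) and take every (m*k)-th unit from each start."""
    assert N % n == 0 and n % m == 0
    interval = m * (N // n)
    starts = rng.sample(range(interval), m)
    return sorted(s + i * interval for s in starts for i in range(n // m))

rng = random.Random(7)
sample = multi_start_systematic(N=100, n=10, m=2, rng=rng)
print(sample)  # 10 units: two interleaved systematic patterns, interval 20
```

Using several independent starts, rather than the single start of ordinary systematic sampling, makes an unbiased variance estimate possible, which is the motivation behind the extensions this record describes.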

  16. Retention model for sorptive extraction-thermal desorption of aqueous samples : application to the automated analysis of pesticides and polyaromatic hydrocarbons in water samples

    NARCIS (Netherlands)

    Baltussen, H.A.; David, F.; Sandra, P.J.F.; Janssen, J.G.M.; Cramers, C.A.M.G.

    1998-01-01

    In this report, an automated method for sorptive enrichment of aqueous samples is presented. It is based on sorption of the analytes of interest into a packed bed containing 100% polydimethylsiloxane (PDMS) particles followed by thermal desorption for complete transfer of the enriched solutes onto

  17. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  18. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.

  19. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality; these efforts validated the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  20. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminum turntable disk which can accommodate 19 polyethylene capsules containing samples to be sent, via the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated by a control switchboard that sends and returns capsules after a variable preset time and by two different routes, allowing the determination of short-, medium- and long-lived radionuclides. A second mechanism, called the 'exchange valve', changes the travel paths (pipelines), allowing the irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described in this article. (authors)

  1. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    International Nuclear Information System (INIS)

    El-Alaily, T.M.; El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M.; Assar, S.T.

    2015-01-01

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured by two calibrated commercial magnetometers, a Lake Shore 7410 and an LDJ Electronics Inc. (Troy, MI) model. Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our new lab-built VSM design proved successful and reliable.
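    The calibration described above reduces to scaling the raw pickup-coil signal by a factor obtained from a reference sample whose moment is known from a commercial magnetometer. A minimal sketch; the function names and the numbers are illustrative, not taken from the paper:

```python
def calibration_factor(signal_ref, moment_ref):
    """Calibration factor (moment units per volt) from a reference sample
    whose magnetic moment is known from a calibrated commercial magnetometer."""
    return moment_ref / signal_ref

def moment_from_signal(signal, cal):
    """Convert a measured pickup-coil signal into a magnetic moment."""
    return cal * signal

# Hypothetical numbers: a 5.0 emu reference moment produces a 1.0 V signal
cal = calibration_factor(signal_ref=1.0, moment_ref=5.0)
unknown_moment = moment_from_signal(2.0, cal)  # 2.0 V reading -> 10.0 emu
```

    In practice the factor also absorbs coil geometry and vibration amplitude, which is why it must be re-derived whenever the mechanical setup changes.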

  2. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured by two calibrated commercial magnetometers, a Lake Shore 7410 and an LDJ Electronics Inc. (Troy, MI) model. Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our new lab-built VSM design proved successful and reliable.

  3. Automated facility for analysis of soil samples by neutron activation, counting, and data control

    International Nuclear Information System (INIS)

    Voegele, A.L.; Jesse, R.H.; Russell, W.L.; Baker, J.

    1978-01-01

    An automated facility remotely and automatically analyzes soil, water, and sediment samples for uranium. The samples travel through pneumatic tubes and switches to be first irradiated by neutrons and then counted for resulting neutron and gamma emission. Samples are loaded into special carriers, or rabbits, which are then automatically loaded into the pneumatic transfer system. The sample carriers have been previously coded with an identification number, which can be automatically read in the system. This number is used for correlating and filing data about the samples. The transfer system, counters, and identification system are controlled by a network of microprocessors. A master microprocessor initiates routines in other microprocessors assigned to specific tasks. The software in the microprocessors is unique for this type of application and lends flexibility to the system.

  4. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  5. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    method opens a new avenue for automated DLLME that not only greatly expands the range of viable extractants, especially functional ILs but also enhances its application for various detection methods. Furthermore, multiple samples can be processed simultaneously, which accelerates the sample preparation and allows the examination of a large number of samples.

  6. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    for automated DLLME that not only greatly expands the range of viable extractants, especially functional ILs but also enhances its application for various detection methods. Furthermore, multiple samples can be processed simultaneously, which accelerates the sample preparation and allows the examination of a large number of samples

  7. The RABiT: a rapid automated biodosimetry tool for radiological triage. II. Technological developments.

    Science.gov (United States)

    Garty, Guy; Chen, Youhua; Turner, Helen C; Zhang, Jian; Lyulko, Oleksandra V; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Lawrence Yao, Y; Brenner, David J

    2011-08-01

    Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. The RABiT analyses fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cut-off dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. We have developed a new robotic system for lymphocyte processing, making use of increased laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Parallel handling of multiple samples through the use of dedicated, purpose-built robotics and high speed imaging allows analysis of up to 30,000 samples per day.

  8. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  9. Detection of virus-specific intrathecally synthesised immunoglobulin G with a fully automated enzyme immunoassay system

    Directory of Open Access Journals (Sweden)

    Weissbrich Benedikt

    2007-05-01

    Background: The determination of virus-specific immunoglobulin G (IgG) antibodies in cerebrospinal fluid (CSF) is useful for the diagnosis of virus-associated diseases of the central nervous system (CNS) and for the detection of a polyspecific intrathecal immune response in patients with multiple sclerosis. Quantification of virus-specific IgG in the CSF is frequently performed by calculation of a virus-specific antibody index (AI). Determination of the AI is a demanding and labour-intensive technique and therefore automation is desirable. We evaluated the precision and the diagnostic value of a fully automated enzyme immunoassay for the detection of virus-specific IgG in serum and CSF using the analyser BEP2000 (Dade Behring). Methods: The AI for measles, rubella, varicella-zoster, and herpes simplex virus IgG was determined from pairs of serum and CSF samples of patients with viral CNS infections, multiple sclerosis and of control patients. CSF and serum samples were tested simultaneously with reference to a standard curve. Starting dilutions were 1:6 and 1:36 for CSF and 1:1386 and 1:8316 for serum samples. Results: The interassay coefficient of variation was below 10% for all parameters tested. There was good agreement between AIs obtained with the BEP2000 and AIs derived from the semi-automated reference method. Conclusion: Determination of virus-specific IgG in serum-CSF pairs for calculation of the AI has been successfully automated on the BEP2000. Current limitations of the assay layout imposed by the analyser software should be resolved in future versions to offer more convenience in comparison to manual or semi-automated methods.
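    For reference, the antibody index is conventionally computed as the ratio of the virus-specific IgG CSF/serum quotient to the total IgG CSF/serum quotient (Reiber convention); the cut-off and the concentrations below are illustrative, not taken from this study:

```python
def antibody_index(spec_csf, spec_serum, total_csf, total_serum):
    """AI = Q_spec / Q_IgG, where each Q is a CSF/serum concentration quotient.
    Values well above 1 (commonly AI >= 1.5, a conventional cut-off) suggest
    intrathecal synthesis of virus-specific IgG."""
    q_spec = spec_csf / spec_serum          # quotient of virus-specific IgG
    q_igg = total_csf / total_serum         # quotient of total IgG
    return q_spec / q_igg

# Hypothetical concentrations in arbitrary units
ai = antibody_index(spec_csf=0.6, spec_serum=30.0,
                    total_csf=20.0, total_serum=4000.0)  # -> 4.0
```

    Normalising by the total-IgG quotient corrects for passive diffusion of serum IgG across the blood-CSF barrier, which is why the raw specific-IgG quotient alone is not diagnostic.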

  10. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
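    The plain iid estimator that this abstract generalises can be sketched in a few lines. The toy below uses iid Gaussian draws rather than Markov chain samples, and the self-normalised form of the estimator; the densities and sample size are illustrative:

```python
import math
import random

random.seed(0)

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_sampling(f, target_pdf, proposal_pdf, draws):
    """Self-normalised importance sampling estimate of E_target[f(X)]
    from draws taken under the proposal density."""
    weights = [target_pdf(x) / proposal_pdf(x) for x in draws]
    total = sum(weights)
    return sum(w * f(x) for w, x in zip(weights, draws)) / total

# Target N(1, 1), proposal N(0, 4): estimate E[X] (true value 1.0)
draws = [random.gauss(0.0, 2.0) for _ in range(100_000)]
est = importance_sampling(lambda x: x,
                          lambda x: normal_pdf(x, 1.0, 1.0),
                          lambda x: normal_pdf(x, 0.0, 2.0),
                          draws)
```

    Replacing `draws` with output from a Harris ergodic Markov chain with invariant density π1 leaves the estimator consistent, which is exactly the setting where the paper's regenerative variance estimator becomes necessary.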

  11. Mockup of an automated material transport system for remote handling

    International Nuclear Information System (INIS)

    Porter, M.L.

    1992-01-01

    The automated material transport system (AMTS) was conceived for the transport of samples within the material and process control laboratory (MPCL), located in the plutonium processing building of the special isotope separation (SIS) facility. The MPCL was designed with a dry sample handling laboratory and a wet chemistry analysis laboratory. Each laboratory contained several processing glove boxes. The function of the AMTS was to automate the handling of materials, multiple process samples, and bulky items between process stations with a minimum of operator intervention and with a minimum of waiting periods and nonproductive activities. The AMTS design requirements, design verification mockup plan, and AMTS mockup procurement specification were established prior to cancellation of the SIS project. Due to the AMTS's flexibility, the need for technology development, and applicability to other US Department of Energy facilities, mockup of the AMTS continued. This paper discusses the system design features, capabilities, and results of initial testing.

  12. Feasibility of surface sampling in automated inspection of concrete aggregates during bulk transport on a conveyor

    NARCIS (Netherlands)

    Bakker, M.C.M.; Di Maio, F.; Lotfi, S.; Bakker, M.; Hu, M.; Vahidi, A.

    2017-01-01

    Automated optic inspection of concrete aggregates for pollutants (e.g. wood, plastics, gypsum and brick) is required to establish the suitability for reuse in new concrete products. Inspection is more efficient when directly sampling the materials on the conveyor belt instead of feeding them in a

  13. Sample size determination for equivalence assessment with multiple endpoints.

    Science.gov (United States)

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from a joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest of the sample sizes required for the individual endpoints. However, such a method ignores the correlation among endpoints. When the objective is to reject all endpoints and the endpoints are uncorrelated, the power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for the correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method and the correlation-adjusted methods and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
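    When endpoints are independent, the joint power is the product of the per-endpoint powers, which is why sizing each endpoint separately can leave the overall study underpowered. The sketch below uses the standard normal approximation to TOST power; the z-value and formula are the textbook approximation, not taken from this paper:

```python
import math

Z_095 = 1.6449  # upper 5% point of the standard normal (alpha = 0.05 one-sided)

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tost_power(delta, theta, se):
    """Approximate power of TOST with equivalence margins (-theta, theta),
    true difference delta, and standard error se of the estimated difference."""
    return max(0.0, norm_cdf((theta - delta) / se - Z_095)
                    + norm_cdf((theta + delta) / se - Z_095) - 1.0)

def joint_power(powers):
    """Power to reject all endpoints when the endpoints are independent."""
    p = 1.0
    for pw in powers:
        p *= pw
    return p

# Hypothetical endpoint: margin 0.2, no true difference, se 0.05
p_auc = tost_power(delta=0.0, theta=0.2, se=0.05)
# Two independent endpoints each powered at 90% give only 81% joint power
p_joint = joint_power([0.90, 0.90])
```

    Correlated endpoints (e.g. AUC and Cmax from the same subjects) sit between the product bound and the smaller single-endpoint power, which is what the paper's exact power function quantifies.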

  14. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Objective: The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of the daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction for fetal sex determination in maternal plasma. Methods: A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results: Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion: We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and confirmed that samples were suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  15. Automated identification of best-quality coronary artery segments from multiple-phase coronary CT angiography (cCTA) for vessel analysis

    Science.gov (United States)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-03-01

    We are developing an automated method to identify the best-quality segment among the corresponding segments in multiple-phase cCTA. The coronary artery trees are automatically extracted from different cCTA phases using our multi-scale vessel segmentation and tracking method. An automated registration method is then used to align the multiple-phase artery trees. The corresponding coronary artery segments are identified in the registered vessel trees and are straightened by curved planar reformation (CPR). Four features are extracted from each segment in each phase as quality indicators in the original CT volume and the straightened CPR volume. Each quality indicator is used as a voting classifier to vote among the corresponding segments. A newly designed weighted voting ensemble (WVE) classifier is finally used to determine the best-quality coronary segment. An observer preference study is conducted with three readers who visually rate vessel quality with rankings from 1 to 6. Six and 10 cCTA cases are used as the training and test sets in this preliminary study. For the 10 test cases, the agreement between automatically identified best-quality (AI-BQ) segments and the radiologist's top 2 rankings is 79.7%, and the agreements between AI-BQ and the other two readers are 74.8% and 83.7%, respectively. The results demonstrated that the performance of our automated method was comparable to that of experienced readers for identification of the best-quality coronary segments.
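    The weighted-voting step can be sketched as below; the votes and weights here are illustrative stand-ins (the paper's WVE classifier determines its own weights from the four quality indicators):

```python
def weighted_vote(votes, weights):
    """Each quality indicator casts one vote for a phase index; votes are
    summed with per-indicator weights and the phase with the largest
    weighted total wins."""
    tally = {}
    for phase, w in zip(votes, weights):
        tally[phase] = tally.get(phase, 0.0) + w
    return max(tally, key=tally.get)

# Four hypothetical indicators vote between two cCTA phases (0 and 1);
# the first and last indicators prefer phase 0
best = weighted_vote(votes=[0, 1, 1, 0], weights=[0.5, 0.3, 0.2, 0.1])
```

    With these numbers phase 0 collects a weighted total of 0.6 against 0.5 for phase 1, so the ensemble can overrule a simple majority when the more reliable indicators agree.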

  16. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    Fisk, J.F.; Leasure, C.; Sauter, A.D.

    1993-01-01

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. To better manage the volume of paper that comes in daily, Superfund has required its laboratories to deliver the data contained on reporting forms on diskette as well, for uploading into databases for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples, data entry, checking and transmission, data assessment and qualification for use, and report generation that will tie the analytical data quality back to the performance criteria defined prior to sample collection. Industry-standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and protect LANL's investment for the future.

  17. Analysis of Product Sampling for New Product Diffusion Incorporating Multiple-Unit Ownership

    Directory of Open Access Journals (Sweden)

    Zhineng Hu

    2014-01-01

    Multiple-unit ownership of nondurable products is an important component of sales in many product categories. Based on the Bass model, this paper develops a new model that treats multiple-unit adoptions as a diffusion process under the influence of product sampling. The analysis determines the optimal dynamic sampling effort for a firm; the results demonstrate that experience sampling can accelerate the diffusion process and that the best time to send free samples is just before the product is launched. Multiple-unit purchasing behavior can increase sales and profit for a firm, but it requires more samples to make the product better known. The local sensitivity analysis shows that an increase in either the external or the internal coefficients has a negative influence on the sampling level, while the internal influence on subsequent multiple-unit adoptions has little significant effect on the sampling. Using logistic regression along with linear regression, the global sensitivity analysis examines the interaction of all factors; it shows that the external influence and the multiple-unit purchase rate are the two most important factors influencing the sampling level and the net present value of the new product, and it presents a two-stage method to determine the sampling level.
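    The underlying diffusion dynamics follow the classical Bass equation, where new adopters per period depend on an external (innovation) coefficient p and an internal (imitation) coefficient q. Modelling free samples as a boost to p is an illustrative assumption here, not the paper's exact formulation, and the parameter values are hypothetical:

```python
def bass_path(p, q, m, periods):
    """Cumulative adopters N(t) under the discrete Bass model:
    new adopters per period = (p + q * N / m) * (m - N),
    where m is the market potential."""
    N = 0.0
    path = []
    for _ in range(periods):
        N += (p + q * N / m) * (m - N)
        path.append(N)
    return path

base = bass_path(p=0.03, q=0.38, m=1000.0, periods=15)
# Sampling modelled as doubling the external coefficient around launch
boosted = bass_path(p=0.06, q=0.38, m=1000.0, periods=15)
```

    The boosted path pulls adoptions forward in time, which is the mechanism behind the paper's finding that sampling just before launch accelerates diffusion.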

  18. Fluorescence In Situ Hybridization (FISH) Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, a stack of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and map the multiple imaging slices recording FISH-probed signals into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently detected visually by an observer. The Kappa coefficients for agreement between CAD and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in the four testing samples. The study demonstrated the feasibility of automated FISH signal analysis by applying a CAD scheme to automatically generated 2-D projection images.
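    Mapping a stack of imaging slices onto a single 2-D projection image is commonly done with a per-pixel maximum-intensity projection (MIP); the abstract does not name its projection operator, so MIP is an assumption here, sketched in pure Python on a toy stack:

```python
def max_intensity_projection(stack):
    """Collapse a 3-D stack (a list of 2-D slices, each a list of rows)
    into one 2-D image by keeping the brightest value at each pixel."""
    rows, cols = len(stack[0]), len(stack[0][0])
    return [[max(sl[r][c] for sl in stack) for c in range(cols)]
            for r in range(rows)]

# Three 2x2 focal planes of a hypothetical FISH image stack
stack = [
    [[1, 7], [0, 2]],
    [[5, 3], [4, 1]],
    [[2, 2], [9, 0]],
]
projection = max_intensity_projection(stack)  # -> [[5, 7], [9, 2]]
```

    Because a FISH signal spot is in sharp focus in only a few planes, the brightest-pixel rule preserves every spot in one image while discarding the redundant out-of-focus data the abstract mentions.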

  19. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Ling-Yu Duan

    2010-01-01

    Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take into account intraclass diversity for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmarking datasets of different characteristics including Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, which has outperformed a canonical MKL.

  20. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Tian Yonghong

    2010-01-01

    Full Text Available Abstract Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take into account intraclass diversity for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmarking datasets of different characteristics including Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, which has outperformed a canonical MKL.
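The idea of sample-wise kernel weights can be illustrated with a toy stand-in. In PS-MKL the weights are jointly learned with the classifier; here they are replaced by a simple label-alignment heuristic, so everything below is an assumption for illustration rather than the authors' algorithm:

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian RBF kernel matrix between row-vector sets X and Y."""
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def per_sample_kernel_weights(kernels, y):
    """Heuristic per-sample kernel weights: for each training sample i,
    weight kernel m by the alignment between its similarity row K_m[i]
    and the label-agreement vector (+1 same class, -1 different class).
    Rows are normalized to sum to 1."""
    agree = np.where(y[None, :] == y[:, None], 1.0, -1.0)
    W = np.stack([(K * agree).sum(axis=1) for K in kernels], axis=1)
    W = np.clip(W, 1e-9, None)          # keep weights positive
    return W / W.sum(axis=1, keepdims=True)

# Two RBF kernels of different bandwidths over a tiny 1-D dataset
X = np.array([[0.0], [0.1], [1.0], [1.1]])
y = np.array([0, 0, 1, 1])
Ks = [rbf_kernel(X, X, g) for g in (0.5, 5.0)]
W = per_sample_kernel_weights(Ks, y)    # shape (n_samples, n_kernels)
```

The point of the sketch is only the data layout: each sample carries its own weight vector over the kernel set, instead of one global combination.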

  1. Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

    OpenAIRE

    Loke Mun Sei

    2015-01-01

    Software testing has become a mandatory process in assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, ...

  2. Simple and multiple linear regression: sample size considerations.

    Science.gov (United States)

    Hanley, James A

    2016-11-01

    The suggested "two subjects per variable" (2SPV) rule of thumb in the Austin and Steyerberg article is a chance to bring out some long-established and quite intuitive sample size considerations for both simple and multiple linear regression. This article distinguishes two of the major uses of regression models that imply very different sample size considerations, neither served well by the 2SPV rule. The first is etiological research, which contrasts mean Y levels at differing "exposure" (X) values and thus tends to focus on a single regression coefficient, possibly adjusted for confounders. The second research genre guides clinical practice. It addresses Y levels for individuals with different covariate patterns or "profiles." It focuses on the profile-specific (mean) Y levels themselves, estimating them via linear compounds of regression coefficients and covariates. By drawing on long-established closed-form variance formulae that lie beneath the standard errors in multiple regression, and by rearranging them for heuristic purposes, one arrives at quite intuitive sample size considerations for both research genres. Copyright © 2016 Elsevier Inc. All rights reserved.
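The closed-form variance the article rearranges can be exercised directly: for a covariate profile x0, Var(yhat) = sigma^2 * x0' (X'X)^{-1} x0, which at the mean of a single covariate reduces to sigma^2 / n. A sketch with illustrative data (variable names are ours):

```python
import numpy as np

def se_profile_mean(X, sigma, x0):
    """Standard error of the estimated mean response at covariate
    profile x0 in a linear model: sigma * sqrt(x0' (X'X)^{-1} x0)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    return sigma * np.sqrt(x0 @ XtX_inv @ x0)

# Simple linear regression: n subjects at x = 0..n-1, intercept included
n = 20
x = np.arange(n, dtype=float)
X = np.column_stack([np.ones(n), x])

# At the mean of x the profile SE collapses to sigma / sqrt(n),
# which is the intuition behind sample-size rules for mean estimation
se_at_mean = se_profile_mean(X, sigma=1.0, x0=np.array([1.0, x.mean()]))
```

Evaluating the same function at extreme profiles (far from the mean of x) shows why profile-oriented clinical prediction needs larger n than estimating a single slope.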

  3. Automated otolith image classification with multiple views: an evaluation on Sciaenidae.

    Science.gov (United States)

    Wong, J Y; Chu, C; Chong, V C; Dhillon, S K; Loh, K H

    2016-08-01

    Combined multiple 2D views (proximal, anterior and ventral aspects) of the sagittal otolith are proposed here as a method to capture shape information for fish classification. Comparing the classification performance of a single view with that of combined 2D views shows improved classification accuracy for the latter, for nine species of Sciaenidae. The effects of shape description methods (shape indices, Procrustes analysis and elliptical Fourier analysis) on classification performance were evaluated. Procrustes analysis and elliptical Fourier analysis perform better than shape indices when a single view is considered, but all perform equally well with combined views. A generic content-based image retrieval (CBIR) system that ranks dissimilarity (Procrustes distance) of otolith images was built to search query images without the need for detailed information on side (left or right), aspect (proximal or distal) and direction (positive or negative) of the otolith. Methods for the development of this automated classification system are discussed. © 2016 The Fisheries Society of the British Isles.
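Ranking a shape database by Procrustes distance, as the CBIR system above does, can be sketched with `scipy.spatial.procrustes`; the outlines and function names here are synthetic illustrations, not otolith data:

```python
import numpy as np
from scipy.spatial import procrustes

def rank_by_procrustes(query, database):
    """Rank database outlines by Procrustes disparity to the query.
    Each outline is an (n_points, 2) array of landmark coordinates;
    procrustes() removes translation, scale, and rotation first."""
    dists = np.array([procrustes(query, shape)[2] for shape in database])
    order = np.argsort(dists)
    return order, dists[order]

# Query: a unit circle sampled at 50 points
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])

# A rotated, scaled copy of the circle should rank first (disparity ~ 0)
c, s = np.cos(0.7), np.sin(0.7)
rot = np.array([[c, -s], [s, c]])
db = [ellipse, 3.0 * circle @ rot.T]
order, dists = rank_by_procrustes(circle, db)
```

Because Procrustes superimposition factors out pose and size, the retrieval is insensitive to how the otolith was oriented when imaged, which is the property the abstract exploits.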

  4. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  5. Microfluidic Sample Preparation for Diagnostic Cytopathology

    Science.gov (United States)

    Mach, Albert J.; Adeyiga, Oladunni B.; Di Carlo, Dino

    2014-01-01

    The cellular components of body fluids are routinely analyzed to identify disease and treatment approaches. While significant focus has been placed on developing cell analysis technologies, tools to automate the preparation of cellular specimens have been more limited, especially for body fluids beyond blood. Preparation steps include separating, concentrating, and exposing cells to reagents. Sample preparation continues to be routinely performed off-chip by technicians, preventing cell-based point-of-care diagnostics, increasing the cost of tests, and reducing the consistency of the final analysis following multiple manually-performed steps. Here, we review the assortment of biofluids for which suspended cells are analyzed, along with their characteristics and diagnostic value. We present an overview of the conventional sample preparation processes for cytological diagnosis. We finally discuss the challenges and opportunities in developing microfluidic devices for the purpose of automating or miniaturizing these processes, with particular emphases on preparing large or small volume samples, working with samples of high cellularity, automating multi-step processes, and obtaining high purity subpopulations of cells. We hope to convey the importance of and help identify new research directions addressing the vast biological and clinical applications in preparing and analyzing the array of available biological fluids. Successfully addressing the challenges described in this review can lead to inexpensive systems to improve diagnostic accuracy while simultaneously reducing overall systemic healthcare costs. PMID:23380972

  6. OASIS is Automated Statistical Inference for Segmentation, with applications to multiple sclerosis lesion segmentation in MRI.

    Science.gov (United States)

    Sweeney, Elizabeth M; Shinohara, Russell T; Shiee, Navid; Mateen, Farrah J; Chudgar, Avni A; Cuzzocreo, Jennifer L; Calabresi, Peter A; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2013-01-01

    Magnetic resonance imaging (MRI) can be used to detect lesions in the brains of multiple sclerosis (MS) patients and is essential for diagnosing the disease and monitoring its progression. In practice, lesion load is often quantified by either manual or semi-automated segmentation of MRI, which is time-consuming, costly, and associated with large inter- and intra-observer variability. We propose OASIS is Automated Statistical Inference for Segmentation (OASIS), an automated statistical method for segmenting MS lesions in MRI studies. We use logistic regression models incorporating multiple MRI modalities to estimate voxel-level probabilities of lesion presence. Intensity-normalized T1-weighted, T2-weighted, fluid-attenuated inversion recovery and proton density volumes from 131 MRI studies (98 MS subjects, 33 healthy subjects) with manual lesion segmentations were used to train and validate our model. Within this set, OASIS detected lesions with a partial area under the receiver operating characteristic curve for clinically relevant false positive rates of 1% and below of 0.59% (95% CI: [0.50%, 0.67%]) at the voxel level. An experienced MS neuroradiologist compared these segmentations to those produced by LesionTOADS, an image segmentation software that provides segmentation of both lesions and normal brain structures. For lesions, OASIS outperformed LesionTOADS in 74% (95% CI: [65%, 82%]) of cases for the 98 MS subjects. To further validate the method, we applied OASIS to 169 MRI studies acquired at a separate center. The neuroradiologist again compared the OASIS segmentations to those from LesionTOADS. For lesions, OASIS ranked higher than LesionTOADS in 77% (95% CI: [71%, 83%]) of cases. For a randomly selected subset of 50 of these studies, one additional radiologist and one neurologist also scored the images. 
Within this set, the neuroradiologist ranked OASIS higher than LesionTOADS in 76% (95% CI: [64%, 88%]) of cases, and the neurologist in 66% (95% CI: [52%, 78%]) of cases.
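The core OASIS step, voxel-level lesion probabilities from a logistic regression over multiple MRI modalities, can be sketched on synthetic data. The features below are invented for illustration; the published model additionally uses smoothed intensities and interaction terms:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_voxels = 500

# Hypothetical intensity-normalized modalities per voxel:
# columns stand in for T1, T2, FLAIR, and PD values
X = rng.normal(size=(n_voxels, 4))

# Synthetic "lesion" labels driven mainly by the FLAIR-like channel
y = (X[:, 2] + 0.3 * rng.normal(size=n_voxels) > 1.0).astype(int)

# Fit the logistic model and read off voxel-level lesion probabilities
model = LogisticRegression().fit(X, y)
prob = model.predict_proba(X)[:, 1]
```

Thresholding `prob` at a chosen operating point yields a binary lesion mask, which is how a probability map like OASIS's is turned into a segmentation.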

  7. Future of Automated Insulin Delivery Systems

    NARCIS (Netherlands)

    Castle, Jessica R.; DeVries, J. Hans; Kovatchev, Boris

    2017-01-01

    Advances in continuous glucose monitoring (CGM) have brought on a paradigm shift in the management of type 1 diabetes. These advances have enabled the automation of insulin delivery, where an algorithm determines the insulin delivery rate in response to the CGM values. There are multiple automated

  8. A real-time automated quality control of rain gauge data based on multiple sensors

    Science.gov (United States)

    Qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product, and agrees much better statistically with the independent gauges.
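A drastically simplified stand-in for such a radar-gauge consistency check is shown below; the thresholds and pass/fail logic are illustrative assumptions, not the NMQ/Q2 scheme, which also conditions on sampling geometry and freezing level:

```python
import numpy as np

def qc_gauges(gauge_mm, radar_mm, abs_tol=2.0, rel_tol=0.5):
    """Flag hourly gauge totals inconsistent with collocated radar QPE.

    A gauge passes QC if it agrees with the radar estimate within
    abs_tol millimetres OR within rel_tol as a fraction of the radar
    value. Returns a boolean array (True = keep the gauge)."""
    gauge_mm = np.asarray(gauge_mm, dtype=float)
    radar_mm = np.asarray(radar_mm, dtype=float)
    diff = np.abs(gauge_mm - radar_mm)
    rel = diff / np.maximum(radar_mm, 1e-6)  # guard against zero radar
    return (diff <= abs_tol) | (rel <= rel_tol)

# Gauge 2 reports 0 mm while radar sees 8 mm: likely clogged/failed
ok = qc_gauges([10.0, 0.0, 5.0], [9.0, 8.0, 5.5])
```

A gauge reporting zero under heavy radar-indicated rain is the classic failure mode (clogging, transmission loss) that this kind of cross-sensor check catches and neighborhood-only checks can miss in sparse networks.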

  9. Rapid and automated determination of plutonium and neptunium in environmental samples

    International Nuclear Information System (INIS)

    Qiao, J.

    2011-03-01

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development in this work consists of 5 subjects stated as follows: 1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination of inductively coupled plasma mass spectrometry detection (Paper II); (2) Methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) Investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) Exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  10. Rapid and automated determination of plutonium and neptunium in environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, J.

    2011-03-15

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development in this work consists of 5 subjects stated as follows: 1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination of inductively coupled plasma mass spectrometry detection (Paper II); (2) Methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) Investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) Exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  11. Automated controlled-potential coulometric determination of uranium

    International Nuclear Information System (INIS)

    Knight, C.H.; Clegg, D.E.; Wright, K.D.; Cassidy, R.M.

    1982-06-01

    A controlled-potential coulometer has been automated in our laboratory for routine determination of uranium in solution. The CRNL-designed automated system controls degassing, prereduction, and reduction of the sample. The final result is displayed on a digital coulometer readout. Manual and automated modes of operation are compared to show the precision and accuracy of the automated system. Results are also shown for the coulometric titration of typical uranium-aluminum alloy samples
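A controlled-potential coulometric result converts integrated charge to analyte mass through Faraday's law. As a minimal sketch (the two-electron U(VI) to U(IV) reduction is the usual assumption for uranium coulometry, and the function name is ours):

```python
# Faraday's-law conversion: m = Q * M / (z * F)
F = 96485.332    # Faraday constant, C/mol
M_U = 238.029    # molar mass of uranium, g/mol

def uranium_mass_mg(charge_coulombs, z=2):
    """Uranium mass (mg) from integrated reduction charge, assuming the
    z = 2 electron U(VI) -> U(IV) couple used in controlled-potential
    coulometry of uranium."""
    return charge_coulombs * M_U / (z * F) * 1000.0

# About 8.1 C of charge corresponds to roughly 10 mg of uranium
m = uranium_mass_mg(8.108)
```

Because the measurement is traceable to physical constants rather than to chemical standards, coulometry of this kind serves as an absolute method, which is why it suits routine accountancy of uranium solutions.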

  12. ASPIRE: An automated sample positioning and irradiation system for radiation biology experiments at Inter University Accelerator Centre, New Delhi

    International Nuclear Information System (INIS)

    Kothari, Ashok; Barua, P.; Archunan, M.; Rani, Kusum; Subramanian, E.T.; Pujari, Geetanjali; Kaur, Harminder; Satyanarayanan, V.V.V.; Sarma, Asitikantha; Avasthi, D.K.

    2015-01-01

    An automated irradiation setup for biological samples has been built at the Inter University Accelerator Centre (IUAC), New Delhi, India. It can automatically load and unload 20 biology samples in a single experimental run. It takes about 20 min [2% of the cell doubling time] to irradiate all 20 samples. Cell doubling time is the time taken by cells (kept in the medium) to double in number. The cells in the samples keep growing during the entire experiment. The fluence delivered to the samples is measured with two silicon surface barrier detectors. Tests show that the uniformity of fluence and dose of heavy ions reaches 2% over a sample area 40 mm in diameter. The accuracy of the mean fluence at the center of the target area is within 1%. The irradiation setup can be used for studies of radiation therapy, radiation dosimetry and molecular biology at the heavy ion accelerator. - Highlights: • Automated positioning and irradiation setup for biology samples at IUAC is built. • Loading and unloading of 20 biology samples can be carried out automatically. • Biological cells keep growing during the entire experiment. • Fluence and dose of heavy ions are measured by two silicon barrier detectors. • Uniformity of fluence and dose of heavy ions at the sample position reaches 2%
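For context on the fluence and dose figures above: absorbed dose follows from particle fluence and LET by the standard conversion D(Gy) = 1.602e-9 × LET(keV/µm) × Φ(cm⁻²) / ρ(g/cm³). A sketch with illustrative beam values (the actual ions and LETs used at IUAC are not given in the abstract):

```python
def dose_gray(fluence_per_cm2, let_kev_um, density_g_cm3=1.0):
    """Absorbed dose (Gy) from heavy-ion fluence and LET:
    D = 1.602e-9 * LET(keV/um) * fluence(cm^-2) / density(g/cm^3).
    The constant converts keV/um per cm^2 into J/kg for the target."""
    return 1.602e-9 * let_kev_um * fluence_per_cm2 / density_g_cm3

# Illustrative: 1e7 ions/cm^2 at LET 100 keV/um in unit-density tissue
d = dose_gray(1e7, 100.0)
```

This is why measuring fluence with surface barrier detectors, as in the setup above, is equivalent to dosimetry once the beam's LET in the sample material is known.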

  13. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents the analysis of modern tools for automated testing of various web based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution when compared to manual testing and overall cost deduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  14. Immunosuppressant therapeutic drug monitoring by LC-MS/MS: workflow optimization through automated processing of whole blood samples.

    Science.gov (United States)

    Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario

    2013-11-01

    Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted for SPE on-line. The only manual steps in the entire process were de-capping of the tubes, and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and those obtained with automated sample preparation was optimal (n=390, r=0.96). In daily routine (100 patient samples) the typical overall total turnaround time was less than 6h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.

  15. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
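A detection limit like the one quoted above is conventionally tied to counting statistics via the Currie formulation; the sketch below uses illustrative parameter values (the paper's actual background counts, counting time, and efficiency are not given here, only the 0.495 mL sample volume):

```python
import math

def currie_ld(background_counts):
    """Currie detection limit in counts for a paired-blank measurement:
    L_D = 2.71 + 4.65 * sqrt(B)."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

def mda_bq_per_ml(background_counts, count_time_s, efficiency, volume_ml):
    """Minimum detectable activity concentration (Bq/mL) given the
    counting time, overall measurement efficiency, and sample volume."""
    return currie_ld(background_counts) / (count_time_s * efficiency * volume_ml)

# Illustrative only: 100 background counts, 300 s count,
# 50% overall efficiency, 0.495 mL sample (the volume from the abstract)
mda = mda_bq_per_ml(100, 300, 0.5, 0.495)
```

This also shows why the instrument's self-diagnosed overall measurement efficiency matters: the detection limit scales inversely with it, so a drop in separation recovery directly degrades the achievable MDA.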

  16. Field evaluation of personal sampling methods for multiple bioaerosols.

    Science.gov (United States)

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  17. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  18. Mindboggle: Automated brain labeling with multiple atlases

    International Nuclear Information System (INIS)

    Klein, Arno; Mensh, Brett; Ghosh, Satrajit; Tourville, Jason; Hirsch, Joy

    2005-01-01

    To make inferences about brain structures or activity across multiple individuals, one first needs to determine the structural correspondences across their image data. We have recently developed Mindboggle as a fully automated, feature-matching approach to assign anatomical labels to cortical structures and activity in human brain MRI data. Label assignment is based on structural correspondences between labeled atlases and unlabeled image data, where an atlas consists of a set of labels manually assigned to a single brain image. In the present work, we study the influence of using variable numbers of individual atlases to nonlinearly label human brain image data. Each brain image voxel of each of 20 human subjects is assigned a label by each of the remaining 19 atlases using Mindboggle. The most common label is selected and is given a confidence rating based on the number of atlases that assigned that label. The automatically assigned labels for each subject brain are compared with the manual labels for that subject (its atlas). Unlike recent approaches that transform subject data to a labeled, probabilistic atlas space (constructed from a database of atlases), Mindboggle labels a subject by each atlas in a database independently. When Mindboggle labels a human subject's brain image with at least four atlases, the resulting label agreement with coregistered manual labels is significantly higher than when only a single atlas is used. Different numbers of atlases provide significantly higher label agreements for individual brain regions. Increasing the number of reference brains used to automatically label a human subject brain improves labeling accuracy with respect to manually assigned labels. Mindboggle software can provide confidence measures for labels based on probabilistic assignment of labels and could be applied to large databases of brain images
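The most-common-label selection with a vote-based confidence that the abstract describes can be sketched as a generic majority-vote label fusion; names and data are illustrative, not Mindboggle's actual code:

```python
import numpy as np

def fuse_labels(atlas_labels):
    """Majority-vote label fusion across atlases.

    atlas_labels: (n_atlases, n_voxels) integer array where row a holds
    the labels atlas a assigned to every voxel. Returns the fused label
    per voxel and a confidence equal to the winning vote fraction."""
    atlas_labels = np.asarray(atlas_labels)
    n_atlases, n_voxels = atlas_labels.shape
    fused = np.empty(n_voxels, dtype=atlas_labels.dtype)
    conf = np.empty(n_voxels)
    for v in range(n_voxels):
        votes, counts = np.unique(atlas_labels[:, v], return_counts=True)
        k = counts.argmax()
        fused[v] = votes[k]
        conf[v] = counts[k] / n_atlases
    return fused, conf

# Four atlases labeling three voxels; voxel 0 is unanimous,
# voxels 1 and 2 each have one dissenting atlas
labels = [[1, 2, 3],
          [1, 2, 2],
          [1, 4, 3],
          [1, 2, 3]]
fused, conf = fuse_labels(labels)
```

The confidence values make the abstract's point concrete: unanimity across atlases yields 1.0, while disagreement lowers the per-voxel confidence in proportion to the dissenting atlases.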

  19. Sample preparation automation for dosing plutonium in urine

    International Nuclear Information System (INIS)

    Jeanmaire, Lucien; Ballada, Jean; Ridelle Berger, Ariane

    1969-06-01

    After having indicated that the determination of urinary plutonium by the Henry technique can be divided into three stages (concentration of the plutonium by precipitation; passing the solution through an anionic resin column and eluting the plutonium; and evaporating the eluate to obtain a source whose radioactivity is measured), and recalled that the automation of the second stage has been reported in another document, this document describes the automation of the first stage, i.e. obtaining from urine a residue that contains the plutonium and is sufficiently mineralized to be analyzed by means of ion exchange resins. Two techniques are proposed, leading to slightly different devices. The different operations to be performed are indicated. The different components of the apparatus are described: beakers, hot plate stirrers, reagent circuits, a system for supernatant suction, and a control-command circuit. The operation and use are then described, and results are given

  20. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    Directory of Open Access Journals (Sweden)

    Kottawattage S. A. Kottawatta

    2017-11-01

    Full Text Available Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  1. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples.

    Science.gov (United States)

    Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-11-29

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.
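Batch-level contamination proportions like those reported above (e.g. 5/33 neck skin samples) can be given a sampling uncertainty with a standard Wilson score interval; the choice of interval here is ours for illustration, not the paper's analysis:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion k/n
    (z = 1.96 gives an approximate 95% interval)."""
    p = k / n
    denom = 1.0 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Neck-skin contamination after semi-automated processing: 5 of 33 batches
lo, hi = wilson_ci(5, 33)
```

With only 8 wet-market batches (2/8), the corresponding interval is far wider, which is worth keeping in mind when comparing the 15% and 25% figures.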

  2. MICROBIOLOGICAL MONITORING AND AUTOMATED EVENT SAMPLING AT KARST SPRINGS USING LEO-SATELLITES

    Science.gov (United States)

    Stadler, Hermann; Skritek, Paul; Sommer, Regina; Mach, Robert L.; Zerobin, Wolfgang; Farnleitner, Andreas H.

    2010-01-01

    Data communication via Low-Earth-Orbit satellites between portable hydro-meteorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments, also for E. coli field analysis. All activities in the course of the event sampling can be observed on an internet platform based on a Linux server. Samples taken conventionally by hand, compared with the auto-sampling procedure, gave corresponding results and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific die-off rates (0.10–0.14 day−1), compensating for losses due to sample storage at spring temperature in the autosampler. Two large summer events in 2005/2006 at a large alpine karst spring (LKAS2) were monitored, including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterization. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approx. 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early-warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite-based system, it is a helpful tool for early-warning systems in the field of drinking water protection. PMID:18776628
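The event-specific die-off correction mentioned above amounts to inverting first-order decay over the storage time; a sketch with illustrative names, using a rate inside the reported 0.10–0.14 day−1 range:

```python
import math

def correct_die_off(measured_conc, storage_days, k_per_day):
    """Back-correct a measured E. coli concentration for first-order
    die-off during storage in the autosampler: C0 = C_t * exp(k * t)."""
    return measured_conc * math.exp(k_per_day * storage_days)

# 100 CFU/100 mL measured after 2 days of storage, k = 0.12 / day
c0 = correct_die_off(100.0, 2.0, 0.12)
```

At these rates the correction stays modest (a few tens of percent over typical storage times), which is why auto-sampled and hand-taken samples could still agree after correction.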

  3. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Full Text Available Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  4. ARTIP: Automated Radio Telescope Image Processing Pipeline

    Science.gov (United States)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging radio-interferometric data. ARTIP starts with raw data, i.e. a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts, and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and multiple target sources, which may have arbitrary combinations of flux/bandpass/phase calibrators.
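    The stage structure described (flux, bandpass, and phase calibration, then imaging, with each stage runnable on its own) can be sketched as a simple stage registry. This is a hypothetical illustration of the pattern, not ARTIP's actual API; all stage and function names are invented:

    ```python
    # Hypothetical sketch of a stage-based pipeline in the spirit of ARTIP:
    # stages run in order by default, or any subset runs independently.
    from typing import Callable

    STAGES: dict = {}  # name -> stage function, in registration order

    def stage(name: str):
        def register(fn: Callable):
            STAGES[name] = fn
            return fn
        return register

    @stage("flux_cal")
    def flux_cal(ms_path): print(f"[flux_cal] setting flux scale for {ms_path}")

    @stage("bandpass_cal")
    def bandpass_cal(ms_path): print(f"[bandpass_cal] solving bandpass for {ms_path}")

    @stage("phase_cal")
    def phase_cal(ms_path): print(f"[phase_cal] solving phases for {ms_path}")

    @stage("imaging")
    def imaging(ms_path): print(f"[imaging] making continuum/spectral images from {ms_path}")

    def run(ms_path, only=None):
        # run the full chain, or only the requested stages
        for name in (only or list(STAGES)):
            STAGES[name](ms_path)

    run("target.ms", only=["bandpass_cal"])  # a single stage, independently
    run("target.ms")                         # the full chain
    ```

    The registry makes "each stage can also be run independently" a one-line feature, while the default path preserves the calibration order.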

  5. Biological Agent Sample Preparation for the Detection and Identification of Multiple Agents by Nucleic Acid-Based Analysis

    National Research Council Canada - National Science Library

    Fields, Robert

    2002-01-01

    .... The AutoLyser instrument, which we have developed, provides fully automated purification of viral, bacterial, and human genomic DNA and RNA from clinical samples, cell culture, and swabs in as little...

  6. Artificial Neural Network for Total Laboratory Automation to Improve the Management of Sample Dilution.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Diluting a sample to obtain a measurement within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show a simple artificial neural network that can be used to decide whether a sample needs predilution, using only the information available through the laboratory information system. In particular, a multilayer perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Such an artificial neural network can therefore be easily implemented into a total automation framework to appreciably reduce the turnaround time of critical orders otherwise delayed by the operations required to retrieve, dilute, and retest the sample.
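    The inference step can be illustrated with a deliberately simplified stand-in for the paper's multilayer perceptron: a single logistic unit over the two features the paper identifies as most relevant (previous assay result and cardiac event/surgery flag). All weights and thresholds below are made up for illustration; the real network was trained on 16,106 records:

    ```python
    import math

    # Made-up weights for an illustrative logistic decision unit
    W_PREV_RESULT = 0.9    # weight on log of the previous troponin I result
    W_CARDIAC_FLAG = 1.5   # weight on "known cardiac event or surgery" flag
    BIAS = -6.0

    def needs_predilution(prev_result_ng_l: float, cardiac_event: bool) -> bool:
        """Return True if the sample should be prediluted before the assay."""
        z = (W_PREV_RESULT * math.log1p(prev_result_ng_l)
             + W_CARDIAC_FLAG * cardiac_event + BIAS)
        p = 1.0 / (1.0 + math.exp(-z))   # probability of exceeding the range
        return p > 0.5

    print(needs_predilution(20.0, False))    # False: low prior result, no event
    print(needs_predilution(5000.0, True))   # True: very high result after an event
    ```

    In a total-automation line, a `True` result would route the tube through a dilution step before the first measurement instead of after a failed one.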

  7. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by NAA laboratory personnel, are time consuming and inefficient, especially in the sample counting and measurement process: the sample must be changed and the measurement software set up for every one-hour counting period, and both steps are performed manually for every sample. Hence, an automatic sample changer system (ASC), consisting of hardware and software, has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
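    The control loop described (load a sample, run a fixed-length count via the measurement software, repeat for up to 30 samples) can be sketched as follows. The real system is written in LabVIEW and drives GammaVision; this Python sketch stubs both out, and all function names are illustrative:

    ```python
    COUNT_SECONDS = 3600   # one-hour counting time per sample
    MAX_SAMPLES = 30       # capacity of the automatic sample changer

    def move_sample_to_detector(position: int) -> None:
        # Hardware call stubbed out; the real ASC moves the carousel here.
        print(f"changer: loading sample {position}")

    def run_count(position: int, seconds: int) -> str:
        # In the real system this step invokes the GammaVision measurement
        # software for `seconds`; here we just return a spectrum file name.
        return f"sample_{position:02d}.spc"

    def run_batch(n_samples: int, count_seconds: int = COUNT_SECONDS) -> list:
        """Count up to MAX_SAMPLES consecutively, unattended."""
        spectra = []
        for pos in range(1, min(n_samples, MAX_SAMPLES) + 1):
            move_sample_to_detector(pos)
            spectra.append(run_count(pos, count_seconds))
        return spectra

    print(run_batch(3, count_seconds=1))
    ```

    The point of the automation is exactly this loop: once started, no operator intervention is needed between the first and the thirtieth sample.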

  8. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection, and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results can be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  9. Multiple Robots Localization Via Data Sharing

    Science.gov (United States)

    2015-09-01

    multiple humans, each with specialized skills complementing each other, work to create the solution. Hence, there is a motivation to think in terms of... pygame.Color(255,255,255); COLORBLACK = pygame.Color(0,0,0) ... F. AUTOMATE.PY: The automate.py file is a helper file to assist in running multiple simulations

  10. Improving neutron multiplicity counting for the spatial dependence of multiplication: Results for spherical plutonium samples

    Energy Technology Data Exchange (ETDEWEB)

    Göttsche, Malte, E-mail: malte.goettsche@physik.uni-hamburg.de; Kirchner, Gerald

    2015-10-21

    The fissile mass deduced from a neutron multiplicity counting measurement of high mass dense items is underestimated if the spatial dependence of the multiplication is not taken into account. It is shown that an appropriate physics-based correction successfully removes the bias. It depends on four correction coefficients which can only be exactly determined if the sample geometry and composition are known. In some cases, for example in warhead authentication, available information on the sample will be very limited. MCNPX-PoliMi simulations have been performed to obtain the correction coefficients for a range of spherical plutonium metal geometries, with and without polyethylene reflection placed around the spheres. For hollow spheres, the analysis shows that the correction coefficients can be approximated with high accuracy as a function of the sphere's thickness depending only slightly on the radius. If the thickness remains unknown, less accurate estimates of the correction coefficients can be obtained from the neutron multiplication. The influence of isotopic composition is limited. The correction coefficients become somewhat smaller when reflection is present.

  11. Automated-immunosensor with centrifugal fluid valves for salivary cortisol measurement

    Directory of Open Access Journals (Sweden)

    Masaki Yamaguchi

    2014-08-01

    Full Text Available Point-of-care measurement of the stress hormone cortisol will greatly facilitate the timely diagnosis and management of stress-related disorders. We describe an automated salivary cortisol immunosensor, incorporating centrifugal fluid valves and a disposable disc-chip, that allows for rapid reporting of cortisol levels (<15 min). The performance characteristics of the immunosensor are optimized through select blocking agents that prevent the non-specific adsorption of proteins: immunoglobulin G (IgG) polymer for the pad and milk protein for the reservoirs and the flow channels. Incorporated centrifugal fluid valves allow for rapid and repeated washings to remove impurities from the saliva samples. An optical reader and laptop computer automate the immunoassay processes and provide easily accessible digital readouts of salivary cortisol measurements. Linear regression analysis of the calibration curve for the cortisol immunosensor showed a coefficient of determination (R²) of 0.92 and a coefficient of variation (CV) of 38.7% over a range of salivary cortisol concentrations between 0.4 and 11.3 ng/mL. Receiver operating characteristic (ROC) curve analysis of human saliva samples indicates potential utility of the biosensor for discriminating stress disorders. The performance of our salivary cortisol immunosensor approaches that of laboratory-based tests and allows noninvasive, quantitative, and automated analysis of human salivary cortisol levels with reporting times compatible with point-of-care applications. Keywords: Immunosensor, Centrifugal fluid valve, Automation, Cortisol, Saliva
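    The linear-regression calibration step reported above (a straight-line fit over the 0.4–11.3 ng/mL range, then inversion to read unknowns) can be sketched with stdlib tools. The calibrator readings below are invented; only the concentration range comes from the abstract:

    ```python
    from statistics import mean

    def fit_calibration(concs, signals):
        """Ordinary least squares: signal = slope * conc + intercept; also R^2."""
        xm, ym = mean(concs), mean(signals)
        sxx = sum((x - xm) ** 2 for x in concs)
        sxy = sum((x - xm) * (y - ym) for x, y in zip(concs, signals))
        slope = sxy / sxx
        intercept = ym - slope * xm
        ss_res = sum((y - (slope * x + intercept)) ** 2
                     for x, y in zip(concs, signals))
        ss_tot = sum((y - ym) ** 2 for y in signals)
        return slope, intercept, 1.0 - ss_res / ss_tot

    # Invented optical-reader signals over the paper's 0.4-11.3 ng/mL range
    concs = [0.4, 1.0, 2.5, 5.0, 8.0, 11.3]
    signals = [120, 260, 580, 1150, 1900, 2600]
    slope, intercept, r2 = fit_calibration(concs, signals)
    print(f"slope={slope:.1f}, intercept={intercept:.1f}, R2={r2:.3f}")

    def cortisol_from_signal(sig):
        # Invert the calibration to report an unknown sample's concentration
        return (sig - intercept) / slope
    ```

    In the instrument, this inversion is what turns the optical readout into the ng/mL value shown to the user.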

  12. Standardization of automated 25-hydroxyvitamin D assays: How successful is it?

    Science.gov (United States)

    Elsenberg, E H A M; Ten Boekel, E; Huijgen, H; Heijboer, A C

    2017-12-01

    Multiple 25(OH)D assays have recently been aligned to improve comparability. In this study we investigated the performance of these assays using both native single-donor sera with target values certified by a reference method and single-donor sera from a heterogeneous patient population. 25(OH)D levels were measured in twenty reference samples (Ref!25OHD; Labquality, Finland) using five automated methods (Lumipulse, Liaison, Cobas, iSYS and Access) and one aligned ID-XLC-MS/MS method (slope: 1.00; intercept: 0.00; R = 0.996). Furthermore, 25(OH)D concentrations measured in 50 pregnant women and 52 random patients using the 5 automated assays were compared to the ID-XLC-MS/MS. In addition, vitamin D binding protein (DBP) was measured. Most automated assays showed significant differences in 25(OH)D levels measured in reference samples: slopes varied from 1.00 to 1.33, intercepts from −5.48 to −15.81 nmol/L, and R from 0.971 to 0.997. This inaccuracy was even more prominent in the heterogeneous patient population: slopes varied from 0.75 to 1.35, intercepts from −9.02 to 11.51 nmol/L, and R from 0.840 to 0.949. For most assays, the deviation in 25(OH)D concentration increased with increasing DBP concentration, suggesting that DBP may be one of the factors contributing to the inaccuracy of currently used automated 25(OH)D methods. Despite the use of standardized assays, we observed significant differences in 25(OH)D concentrations for some automated methods using reference material obtained from healthy single-donor sera. In sera from a patient population this inaccuracy was even worse, which is highly concerning as it is patient samples that are investigated in clinical laboratories. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  13. Multiple Smaller Missions as a Direct Pathway to Mars Sample Return

    Science.gov (United States)

    Niles, P. B.; Draper, D. S.; Evans, C. A.; Gibson, E. K.; Graham, L. D.; Jones, J. H.; Lederer, S. M.; Ming, D.; Seaman, C. H.; Archer, P. D.; et al.

    2012-01-01

    Recent discoveries by the Mars Exploration Rovers, Mars Express, Mars Odyssey, and Mars Reconnaissance Orbiter spacecraft include multiple, tantalizing astrobiological targets representing both past and present environments on Mars. The most desirable path to Mars Sample Return (MSR) would be to collect and return samples from the site that provides the clearest examples of the variety of rock types considered a high priority for sample return (pristine igneous, sedimentary, and hydrothermal). Here we propose an MSR architecture in which the next steps (potentially launched in 2018) would entail a series of smaller missions, including caching, to multiple landing sites to verify the presence of high-priority sample return targets through in situ analyses. This alternative to a single flagship-class sample-caching mission to one site would preserve a direct path to MSR as stipulated by the Planetary Decadal Survey, while permitting investigation of diverse deposit types and providing comparison of the site of returned samples to other aqueous environments on early Mars.

  14. Cerebral perfusion and automated individual analysis using SPECT among an obsessive-compulsive population

    Directory of Open Access Journals (Sweden)

    Euclides Timóteo da Rocha

    2011-01-01

    Full Text Available OBJECTIVE: To make individual assessments using automated quantification methodology in order to screen for perfusion abnormalities in cerebral SPECT examinations among a sample of subjects with OCD. METHODS: Statistical parametric mapping (SPM) was used to compare 26 brain SPECT images from patients with OCD individually with an image bank of 32 normal subjects, using a statistical threshold of p < 0.05 (corrected for multiple comparisons at the level of individual voxels or clusters). The maps were analyzed, and regions presenting voxels that remained above this threshold were sought. RESULTS: Six of the 26 OCD images (23.07%) showed abnormalities at the cluster or voxel level according to the criteria described above. However, seven images from the normal group of 32 (21.8%) were also flagged as cases of perfusional abnormality. CONCLUSION: The automated quantification method was not considered to be a useful tool for clinical practice, as an analysis complementary to visual inspection.
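    The screening logic of comparing one patient against a bank of normal images can be illustrated in a greatly simplified, region-level form (SPM itself works voxel-wise with multiple-comparison correction, which this sketch does not attempt). All perfusion values below are invented:

    ```python
    from statistics import mean, stdev

    def abnormal_regions(patient, normal_bank, z_threshold=3.0):
        """Flag regions where one patient's perfusion deviates from a bank of
        normals; a crude region-level stand-in for SPM's voxel/cluster tests."""
        flagged = []
        for region in patient:
            values = [subject[region] for subject in normal_bank]
            z = (patient[region] - mean(values)) / stdev(values)
            if abs(z) > z_threshold:
                flagged.append(region)
        return flagged

    # Made-up normalized perfusion values for 11 "normal" subjects, 2 regions
    bank = [{"orbitofrontal": 1.00 + 0.01 * i, "caudate": 0.90 + 0.01 * i}
            for i in range(-5, 6)]
    patient = {"orbitofrontal": 1.30, "caudate": 0.91}
    print(abnormal_regions(patient, bank))  # ['orbitofrontal']
    ```

    The study's false-positive rate in normal controls (21.8%) is a reminder that the threshold choice, not just the comparison itself, determines clinical usefulness.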

  15. Sample handling for kinetics and molecular assembly in flow cytometry

    Energy Technology Data Exchange (ETDEWEB)

    Sklar, L.A. [Los Alamos National Lab., NM (United States). National Flow Cytometry Resource]|[Univ. of New Mexico, Albuquerque, NM (United States). School of Medicine; Seamer, L.C.; Kuckuck, F.; Prossnitz, E.; Edwards, B. [Univ. of New Mexico, Albuquerque, NM (United States). School of Medicine; Posner, G. [Northern Arizona Univ., Flagstaff, AZ (United States). Dept. of Chemistry

    1998-07-01

    Flow cytometry discriminates particle-associated fluorescence from the fluorescence of the surrounding medium. It permits assemblies of macromolecular complexes on beads or cells to be detected in real time with precision and specificity. The authors have investigated two types of robust sample handling systems which provide sub-second resolution and high throughput: (1) mixers which use stepper-motor-driven syringes to initiate chemical reactions in millisecond time frames; and (2) flow injection controllers with valves and automated syringes used in chemical process control. In the former system, the authors used fast valves to overcome the disparity between mixing 100 µL of sample in 100 ms and delivering sample to a flow cytometer at 1 µL/s. Particles were detected within 100 ms after mixing, but turbulence was created which lasted for 1 s after injection of the sample into the flow cytometer. They used optical criteria to discriminate particles which were out of alignment due to the turbulent flow. Complex sample handling protocols involving multiple mixing steps and sample dilution have also been achieved. With the latter system they were able to automate sample handling and delivery with intervals of a few seconds. The authors used a fluidic approach to defeat turbulence caused by sample introduction. By controlling both sheath and sample with individual syringes, the period of turbulence was reduced to approximately 200 ms. Automated sample handling and sub-second resolution should permit broad analytical and diagnostic applications of flow cytometry.

  16. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex...... sample matrix. The main result is a working prototype of a microfluidic system, integrating both centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, as well as in-situ electrochemical detection. As a case study......

  17. A longitudinal field multiple sampling ionization chamber for RIBLL2

    International Nuclear Information System (INIS)

    Tang Shuwen; Ma Peng; Lu Chengui; Duan Limin; Sun Zhiyu; Yang Herun; Zhang Jinxia; Hu Zhengguo; Xu Shanhu

    2012-01-01

    A longitudinal-field MUltiple Sampling Ionization Chamber (MUSIC), which makes multiple measurements of energy loss for very high energy heavy ions at RIBLL2, has been constructed and tested with a three-component α source (239Pu: 3.435 MeV, 241Am: 3.913 MeV, 244Cm: 4.356 MeV). The voltage plateau curve was plotted, and −500 V was determined as a suitable working voltage. The energy resolution of the sampling unit is 271.4 keV FWHM when 3.435 MeV is deposited. A Geant4 Monte Carlo simulation indicates that the detector can provide unique particle identification for ions with Z ≥ 4. (authors)

  18. Effects of (α,n) contaminants and sample multiplication on statistical neutron correlation measurements

    International Nuclear Information System (INIS)

    Dowdy, E.J.; Hansen, G.E.; Robba, A.A.; Pratt, J.C.

    1980-01-01

    The complete formalism for the use of statistical neutron fluctuation measurements for the nondestructive assay of fissionable materials has been developed. This formalism includes the effects of detector deadtime, neutron multiplicity, random neutron pulse contributions from (α,n) contaminants in the sample, and the sample multiplication of both fission-related and background neutrons.

  19. An Automated Approach to Reasoning Under Multiple Perspectives

    Science.gov (United States)

    deBessonet, Cary

    2004-01-01

    This is the final report, with emphasis on research during the last term. The context for the research has been the development of an automated reasoning technology for use in SMS (Symbolic Manipulation System), a system used to build and query knowledge bases (KBs) using a special knowledge representation language, SL (Symbolic Language). SMS interprets assertive SL input and enters the results as components of its universe. The system operates in two basic modes: 1) constructive mode (for building KBs); and 2) query/search mode (for querying KBs). Query satisfaction consists of matching query components with KB components. The system allows "penumbral matches," that is, matches that do not exactly meet the specifications of the query but which are deemed relevant for the conversational context. If the user wants to know whether SMS has information that holds, say, for "any chow," the scope of relevancy might be set so that the system would respond based on a finding that it has information that holds for "most dogs," although this is not exactly what was called for by the query. The response would be qualified accordingly, as would normally be the case in ordinary human conversation. The general goal of the research was to develop an approach by which assertive content could be interpreted from multiple perspectives so that reasoning operations could be successfully conducted over the results. The interpretation of an SL statement such as "{person believes [captain (asserted (perhaps)) (astronaut saw (comet (bright)))]}," which in English would amount to asserting something to the effect that "Some person believes that a captain perhaps asserted that an astronaut saw a bright comet," would require the recognition of multiple perspectives, including some that are: a) epistemically-based (focusing on "believes"); b) assertion-based (focusing on "asserted"); c) perception-based (focusing on "saw"); d) adjectivally-based (focusing on "bright"); and e) modally

  20. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source, which incorporates a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
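    The exponential fitting of the dissolution profiles can be sketched with a standard first-order release model, C(t) = C∞·(1 − e^(−kt)); the paper reports k = 0.43 ± 0.01 h⁻¹ for ibuprofen. The estimation scheme and data below are illustrative (a log-linearized least-squares fit on a synthetic profile), not the paper's actual fitting procedure:

    ```python
    import math

    def release_model(t_h, c_inf, k_per_h):
        """First-order release: C(t) = C_inf * (1 - exp(-k t))."""
        return c_inf * (1.0 - math.exp(-k_per_h * t_h))

    def estimate_k(times_h, concs, c_inf):
        """Estimate k by log-linearization: ln(1 - C/C_inf) = -k t,
        fitted as a zero-intercept least-squares regression."""
        num = sum(-t * math.log(1.0 - c / c_inf) for t, c in zip(times_h, concs))
        den = sum(t * t for t in times_h)
        return num / den

    # Synthetic 10 h profile generated with k = 0.43 / h and C_inf = 2.0 mM
    times = [1, 2, 4, 6, 8, 10]
    concs = [release_model(t, 2.0, 0.43) for t in times]
    print(round(estimate_k(times, concs, 2.0), 2))  # 0.43
    ```

    With noisy extraction-MS data, a nonlinear fit of the model would be preferred; the log-linear form here just makes the role of k transparent.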

  1. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    International Nuclear Information System (INIS)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L.

    2015-01-01

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source, which incorporates a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates the

  2. Digitally programmable microfluidic automaton for multiscale combinatorial mixing and sample processing†

    Science.gov (United States)

    Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.

    2013-01-01

    A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
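    The "language describing how unit operations are combined to form a microfluidic program" can be imagined as a small instruction-list interpreter over the valve array. The opcodes, program, and messages below are invented to illustrate the idea, not the paper's actual language:

    ```python
    # Hypothetical miniature of a microfluidic program: a list of unit
    # operations (opcode + arguments) executed against valve-array locations.
    def run_program(program):
        log = []
        ops = {
            "METER": lambda a, b: f"meter reagent {a} into valve {b}",
            "MIX":   lambda a, b: f"mix contents of valves {a} and {b}",
            "MOVE":  lambda a, b: f"transfer valve {a} -> valve {b}",
            "RINSE": lambda a: f"rinse valve {a}",
        }
        for op, *args in program:
            log.append(ops[op](*args))
        return log

    # A serial-dilution step composed from the unit operations
    serial_dilution = [
        ("METER", "sample", 1), ("METER", "buffer", 2),
        ("MIX", 1, 2),      # the paper reports complete mixing in <800 ms
        ("MOVE", 2, 3),     # carry the diluted sample to the next location
        ("RINSE", 1),
    ]
    for step in run_program(serial_dilution):
        print(step)
    ```

    Expressing protocols as data is what makes the automaton "digitally programmable": a new assay is a new operation list, not new hardware.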

  3. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    International Nuclear Information System (INIS)

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-01-01

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants

  4. Automated gravimetric sample pretreatment using an industrial robot for the high-precision determination of plutonium by isotope dilution mass spectrometry

    International Nuclear Information System (INIS)

    Surugaya, Naoki; Hiyama, Toshiaki; Watahiki, Masaru

    2008-01-01

    A robotized sample-preparation method for the determination, by isotope dilution mass spectrometry (IDMS), of Pu recovered in the extraction reprocessing of spent nuclear fuel is described. The automated system uses a six-axis industrial robot, whose motion is fast, accurate, and flexible, installed in a glove box. The automation of the weighing and dilution steps enables operator-unattended sample pretreatment for the high-precision analysis of Pu in aqueous solutions. Using the developed system, the Pu concentration in a HNO3 medium was successfully determined through a set of subsequent mass spectrometric measurements. The relative uncertainty in determining the Pu concentration by IDMS using this system was estimated to be less than 0.1% (k=2), equal to that expected of a skilled analyst. The operation time required was the same as that for a skilled operator. (author)
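    The IDMS calculation behind such a measurement (not the robot control itself) follows the standard isotope dilution relation: a spike of known 242Pu content is weighed into the sample, and the 239/242 ratio of the blend, together with the ratios of the pure spike and the unspiked sample, yields the 239Pu amount. The numerical values below are invented for demonstration:

    ```python
    def idms_pu239_amount(n242_spike_mol, r_spike, r_sample, r_blend):
        """Amount of 239Pu (mol) in the sample aliquot, from the 239Pu/242Pu
        isotope ratios of the spike (r_spike), the unspiked sample (r_sample),
        and the measured spiked blend (r_blend)."""
        return n242_spike_mol * r_sample * (r_blend - r_spike) / (r_sample - r_blend)

    # Invented values: a 242Pu spike with a small residual 239/242 ratio,
    # a sample dominated by 239Pu, and a measured blend ratio of 2.0
    n239 = idms_pu239_amount(1.0e-6, r_spike=0.01, r_sample=500.0, r_blend=2.0)
    print(f"n(239Pu) in aliquot: {n239:.3e} mol")
    ```

    Dividing this amount by the robot-weighed aliquot mass gives the concentration, which is why the gravimetric step is the one the automation must get right for sub-0.1% uncertainty.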

  5. Future of Automated Insulin Delivery Systems.

    Science.gov (United States)

    Castle, Jessica R; DeVries, J Hans; Kovatchev, Boris

    2017-06-01

    Advances in continuous glucose monitoring (CGM) have brought on a paradigm shift in the management of type 1 diabetes. These advances have enabled the automation of insulin delivery, where an algorithm determines the insulin delivery rate in response to the CGM values. There are multiple automated insulin delivery (AID) systems in development. A system that automates basal insulin delivery has already received Food and Drug Administration approval, and more systems are likely to follow. As the field of AID matures, future systems may incorporate additional hormones and/or multiple inputs, such as activity level. All AID systems are impacted by CGM accuracy and future CGM devices must be shown to be sufficiently accurate to be safely incorporated into AID. In this article, we summarize recent achievements in AID development, with a special emphasis on CGM sensor performance, and discuss the future of AID systems from the point of view of their input-output characteristics, form factor, and adaptability.

  6. A new and standardized method to sample and analyse vitreous samples by the Cellient automated cell block system.

    Science.gov (United States)

    Van Ginderdeuren, Rita; Van Calster, Joachim; Stalmans, Peter; Van den Oord, Joost

    2014-08-01

    In this prospective study, a universal protocol for sampling and analysing vitreous material was investigated. Vitreous biopsies are difficult to handle because of the paucity of cells and the gelatinous structure of the vitreous. Histopathological analysis of the vitreous is useful in difficult uveitis cases to differentiate uveitis from lymphoma or infection and to define the type of cellular reaction. One hundred consecutive vitreous samples were analysed with the Cellient tissue processor (Hologic), a fully automated processor that takes cells in a dedicated container of PreservCyt (a fixative fluid) through to a paraffin block. Cytology was compared between two fixatives, Cytolyt (which contains a mucolytic agent) and PreservCyt. Routine histochemical and immunostainings were evaluated. In 92% of the cases, sufficient material was found for diagnosis. In 14%, a Cytolyt wash was necessary to prevent clotting of the tubes in the Cellient due to the viscosity of the sample. In 23%, the diagnosis was acute inflammation (presence of granulocytes); in 33%, chronic active inflammation (presence of T lymphocytes); in 33%, low-grade inflammation (presence of CD68 cells without T lymphocytes); and in 3%, a malignant process. A standardized protocol for sampling and handling vitreous biopsies, fixing in PreservCyt and processing by the Cellient gives satisfactory results in morphology, cell yield and suitability for immunohistochemical staining. The diagnosis can be established or confirmed in more than 90% of cases. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  7. Application of In-Segment Multiple Sampling in Object-Based Classification

    Directory of Open Access Journals (Sweden)

    Nataša Đurić

    2014-12-01

    Full Text Available When object-based analysis is applied to very high-resolution imagery, pixels within the segments reveal large spectral inhomogeneity; their distribution can be considered complex rather than normal. When normality is violated, classification methods that rely on the assumption of normally distributed data are not as successful or accurate. It is hard to detect normality violations in small samples. The segmentation process produces segments that vary highly in size; samples can be very big or very small. This paper investigates whether the complexity within the segment can be addressed using multiple random sampling of segment pixels and multiple calculations of similarity measures. In order to analyze the effect sampling has on classification results, the statistics and probability values of the non-parametric two-sample Kolmogorov-Smirnov test and the parametric Student’s t-test are selected as similarity measures in the classification process. The performance of both classifiers was assessed on a WorldView-2 image for four land cover classes (roads, buildings, grass and trees) and compared to two commonly used object-based classifiers—k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM). Both proposed classifiers showed a slight improvement in overall classification accuracy and produced more accurate classification maps when compared to the ground truth image.

  8. MaxBin 2.0: an automated binning algorithm to recover genomes from multiple metagenomic datasets

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yu-Wei [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Simmons, Blake A. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Steven W. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-10-29

    The recovery of genomes from metagenomic datasets is a critical step to defining the functional roles of the underlying uncultivated populations. We previously developed MaxBin, an automated binning approach for high-throughput recovery of microbial genomes from metagenomes. Here, we present an expanded binning algorithm, MaxBin 2.0, which recovers genomes from co-assembly of a collection of metagenomic datasets. Tests on simulated datasets revealed that MaxBin 2.0 is highly accurate in recovering individual genomes, and the application of MaxBin 2.0 to several metagenomes from environmental samples demonstrated that it could achieve two complementary goals: recovering more bacterial genomes compared to binning a single sample as well as comparing the microbial community composition between different sampling environments. Availability and implementation: MaxBin 2.0 is freely available at http://sourceforge.net/projects/maxbin/ under BSD license. Supplementary information: Supplementary data are available at Bioinformatics online.

  9. Physiological Self-Regulation and Adaptive Automation

    Science.gov (United States)

    Prinzel, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, concerns have been voiced regarding issues associated with the use of adaptive automation, such as automation surprises. This study examined the use of psychophysiological self-regulation training with adaptive automation, which may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode, which included a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.

  10. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  11. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that can handle multiple workpieces are crucial to improving a robot workcell. Design automation of multi-function fingers is in high demand in the robotics industry, to replace the current iterative, time-consuming and complex manual design process. However, the existing approaches to multi-function finger design automation cannot entirely meet the industry's needs. This paper proposes a generic approach for the design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, it executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  12. The Protein Maker: an automated system for high-throughput parallel purification

    International Nuclear Information System (INIS)

    Smith, Eric R.; Begley, Darren W.; Anderson, Vanessa; Raymond, Amy C.; Haffner, Taryn E.; Robinson, John I.; Edwards, Thomas E.; Duncan, Natalie; Gerdts, Cory J.; Mixon, Mark B.; Nollert, Peter; Staker, Bart L.; Stewart, Lance J.

    2011-01-01

    The Protein Maker instrument addresses a critical bottleneck in structural genomics by allowing automated purification and buffer testing of multiple protein targets in parallel with a single instrument. Here, the use of this instrument to (i) purify multiple influenza-virus proteins in parallel for crystallization trials and (ii) identify optimal lysis-buffer conditions prior to large-scale protein purification is described. The Protein Maker is an automated purification system developed by Emerald BioSystems for high-throughput parallel purification of proteins and antibodies. This instrument allows multiple load, wash and elution buffers to be used in parallel along independent lines for up to 24 individual samples. To demonstrate its utility, its use in the purification of five recombinant PB2 C-terminal domains from various subtypes of the influenza A virus is described. Three of these constructs crystallized and one diffracted X-rays to sufficient resolution for structure determination and deposition in the Protein Data Bank. Methods for screening lysis buffers for a cytochrome P450 from a pathogenic fungus prior to upscaling expression and purification are also described. The Protein Maker has become a valuable asset within the Seattle Structural Genomics Center for Infectious Disease (SSGCID) and hence is a potentially valuable tool for a variety of high-throughput protein-purification applications

  13. Automated patterning and probing with multiple nanoscale tools for single-cell analysis.

    Science.gov (United States)

    Li, Jiayao; Kim, Yeonuk; Liu, Boyin; Qin, Ruwen; Li, Jian; Fu, Jing

    2017-10-01

    The nano-manipulation approach that combines Focused Ion Beam (FIB) milling with various imaging and probing techniques enables researchers to investigate cellular structures in three dimensions. Such a fusion approach, however, requires extensive effort to locate and examine randomly distributed targets, owing to the limited Field of View (FOV) when high magnification is desired. In the present study, we present a development that automates 'pattern and probe' particularly for single-cell analysis, achieved by computer-aided tools including feature recognition and geometric planning algorithms. Scheduling serial FOVs for imaging and probing multiple cells was treated as a rectangle covering problem, and optimal or near-optimal solutions were obtained with the heuristics developed. FIB milling was then employed automatically, followed by downstream analysis using Atomic Force Microscopy (AFM) to probe the cellular interior. Our strategy was applied to examine bacterial cells (Klebsiella pneumoniae) and achieved high efficiency with limited human intervention. The developed algorithms can be easily adapted and integrated with different imaging platforms towards high-throughput imaging analysis of single cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
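
    The FOV-scheduling step described above can be illustrated as a rectangle covering problem solved with a simple greedy heuristic. This is a sketch under the assumption that each candidate FOV is anchored at a target cell; it is not the paper's actual algorithm:

```python
def greedy_fov_cover(points, fov_w, fov_h):
    """Greedily choose field-of-view rectangles of size fov_w x fov_h
    until every target point is covered. Each candidate FOV is anchored
    with its lower-left corner at an uncovered point, and the FOV that
    covers the most uncovered points is picked each round."""
    uncovered = set(points)
    fovs = []
    while uncovered:
        best_anchor, best_hit = None, set()
        for (ax, ay) in uncovered:
            hit = {(x, y) for (x, y) in uncovered
                   if ax <= x <= ax + fov_w and ay <= y <= ay + fov_h}
            if len(hit) > len(best_hit):
                best_anchor, best_hit = (ax, ay), hit
        fovs.append(best_anchor)
        uncovered -= best_hit
    return fovs
```

    Greedy set-cover style heuristics like this give solutions within a logarithmic factor of optimal, which is usually enough to cut the number of high-magnification stage moves substantially.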

  14. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of elongated continuity defects in samples with plane surfaces in magnetographic defectoscopy are discussed. Two models of automated magnetographic flaw detectors are described: one with a built-in microcomputer and one in the form of a computer attachment. Directions for further research and development are discussed. 35 refs., 6 figs

  15. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained…

  16. Determining Sample Size for Accurate Estimation of the Squared Multiple Correlation Coefficient.

    Science.gov (United States)

    Algina, James; Olejnik, Stephen

    2000-01-01

    Discusses determining sample size for estimation of the squared multiple correlation coefficient and presents regression equations that permit determination of the sample size for estimating this parameter for up to 20 predictor variables. (SLD)

  17. Fast and scalable inference of multi-sample cancer lineages.

    KAUST Repository

    Popic, Victoria; Salari, Raheleh; Hajirasouliha, Iman; Kashef-Haghighi, Dorna; West, Robert B; Batzoglou, Serafim

    2015-01-01

    Somatic variants can be used as lineage markers for the phylogenetic reconstruction of cancer evolution. Since somatic phylogenetics is complicated by sample heterogeneity, novel specialized tree-building methods are required for cancer phylogeny reconstruction. We present LICHeE (Lineage Inference for Cancer Heterogeneity and Evolution), a novel method that automates the phylogenetic inference of cancer progression from multiple somatic samples. LICHeE uses variant allele frequencies of somatic single nucleotide variants obtained by deep sequencing to reconstruct multi-sample cell lineage trees and infer the subclonal composition of the samples. LICHeE is open source and available at http://viq854.github.io/lichee .


  19. Automated Registration of Images from Multiple Bands of Resourcesat-2 Liss-4 camera

    Science.gov (United States)

    Radhadevi, P. V.; Solanki, S. S.; Jyothi, M. V.; Varadan, G.

    2014-11-01

    Continuous and automated co-registration and geo-tagging of images from multiple bands of the Liss-4 camera is one of the interesting challenges of Resourcesat-2 data processing. The three arrays of the Liss-4 camera are physically separated in the focal plane in the along-track direction. Thus, the same line on the ground is imaged by the extreme bands with a time interval of as much as 2.1 seconds. During this time, the satellite covers a distance of about 14 km on the ground and the earth rotates through an angle of 30". Yaw steering is applied to compensate for the earth-rotation effect, ensuring a first-level registration between the bands. But this does not achieve perfect co-registration, because of attitude fluctuations, satellite movement, terrain topography, PSM steering and small variations (from the pre-launch values) in the angular placement of the CCD lines in the focal plane. This paper describes an algorithm, based on the viewing geometry of the satellite, for automatic band-to-band registration of Liss-4 MX imagery of Resourcesat-2 at Level 1A. The algorithm applies the principles of photogrammetric collinearity equations. The model employs an orbit trajectory and attitude fitting with polynomials, followed by direct geo-referencing with a global DEM, in which every pixel in the middle band is mapped to a position on the surface of the earth for the given attitude. Attitude is estimated by interpolating measurement data obtained from star sensors and gyros, which are sampled at low frequency. When the sampling rate of the attitude information is low compared to the frequency of jitter or micro-vibration, images processed by geometric correction suffer from distortion. Therefore, a set of conjugate points is identified between the bands to perform a relative attitude error estimation and correction, which ensures the internal accuracy and co-registration of the bands. Accurate calculation of the exterior orientation parameters with
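
    The attitude-interpolation step mentioned above (low-rate star-sensor/gyro samples evaluated at each scan line's time) can be sketched with a plain Lagrange polynomial through the attitude samples. This is an illustrative stand-in, not the polynomial models actually used in the Resourcesat-2 processing chain:

```python
def lagrange_interp(samples, t):
    """Evaluate the Lagrange interpolating polynomial through the
    (time, angle) samples at time t. With attitude sampled at low
    frequency, each scan line's acquisition time falls between
    samples and is interpolated this way."""
    result = 0.0
    for i, (ti, yi) in enumerate(samples):
        term = yi
        for j, (tj, _) in enumerate(samples):
            if i != j:
                term *= (t - tj) / (ti - tj)  # Lagrange basis factor
        result += term
    return result
```

    In practice, as the abstract notes, such smooth interpolation cannot capture jitter above the attitude sampling rate, which is why conjugate points between bands are still needed for a relative attitude correction.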

  20. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition, we need dedicated status lines for assessing the validity of the input to our black box and of the output passed to subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further consideration will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)
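
    The black-box definition above, with dedicated status lines for input and output validity, can be sketched as follows. The class name, the pH example and the limits are illustrative assumptions, not from the article:

```python
class AutomatedAnalyzer:
    """A black-box model of an automated analytical system: an input
    (sample), an output (measured value), and dedicated status flags
    reporting the validity of both to subsequent systems."""

    def __init__(self, valid_range=(0.0, 14.0)):  # e.g. a pH analyzer
        self.valid_range = valid_range

    def measure(self, sample_value, sample_ok=True):
        if not sample_ok:                     # input status line
            return None, "INPUT_INVALID"
        lo, hi = self.valid_range
        if not (lo <= sample_value <= hi):    # output status line
            return sample_value, "OUTPUT_OUT_OF_RANGE"
        return sample_value, "OK"
```

    A downstream system (e.g. a plant SCADA layer) would act on the status flag rather than blindly trusting the measured value.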

  1. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    Science.gov (United States)

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw it as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation, identifying the specific requirements of a laboratory and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.

  2. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    Science.gov (United States)

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

    Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the analytes initially suggested by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
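
    The kind of peak-area data reduction described above can be sketched as a fold-change filter between comparative samples. This is an illustration of the idea only, not OCTpy's actual implementation; the function name, the `fold` threshold and the `floor` substitute for missing areas are hypothetical:

```python
def select_changed_analytes(pre, post, fold=2.0, floor=1.0):
    """Keep analytes whose peak area grew or shrank by at least
    `fold` between comparative samples (e.g. soil pre- and
    post-bioremediation), i.e. candidates that formed or degraded.
    `floor` substitutes for missing/zero areas so ratios stay
    defined. Inputs are {analyte_name: peak_area} dicts."""
    selected = {}
    for name in set(pre) | set(post):
        a = max(pre.get(name, 0.0), floor)
        b = max(post.get(name, 0.0), floor)
        ratio = b / a
        if ratio >= fold or ratio <= 1.0 / fold:
            selected[name] = ratio
    return selected
```

    Analytes that pass the filter would then go to a trained analyst for confirmation, which is the workflow the abstract's 63-100% agreement figures refer to.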

  3. Sampling plans in attribute mode with multiple levels of precision

    International Nuclear Information System (INIS)

    Franklin, M.

    1986-01-01

    This paper describes a method for deriving sampling plans for nuclear material inventory verification. The method presented is different from the classical approach which envisages two levels of measurement precision corresponding to NDA and DA. In the classical approach the precisions of the two measurement methods are taken as fixed parameters. The new approach is based on multiple levels of measurement precision. The design of the sampling plan consists of choosing the number of measurement levels, the measurement precision to be used at each level and the sample size to be used at each level

  4. ReseqChip: Automated integration of multiple local context probe data from the MitoChip array in mitochondrial DNA sequence assembly

    Directory of Open Access Journals (Sweden)

    Spang Rainer

    2009-12-01

    Full Text Available Abstract Background The Affymetrix MitoChip v2.0 is an oligonucleotide tiling array for the resequencing of the human mitochondrial (mt) genome. For each of the 16,569 nucleotide positions of the mt genome it holds two sets of four 25-mer probes each that match the heavy and the light strand of a reference mt genome and vary only at their central position to interrogate all four possible alleles. In addition, the MitoChip v2.0 carries alternative local context probes to account for known mtDNA variants. These probes have been neglected in most studies due to the lack of software for their automated analysis. Results We provide ReseqChip, free software that automates the process of resequencing mtDNA using the multiple local context probes on the MitoChip v2.0. ReseqChip significantly improves base call rate and sequence accuracy. ReseqChip is available at http://code.open-bio.org/svnweb/index.cgi/bioperl/browse/bioperl-live/trunk/Bio/Microarray/Tools/. Conclusions ReseqChip allows for the automated consolidation of base calls from alternative local mt genome context probes. It thereby improves the accuracy of resequencing, while reducing the number of non-called bases.
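
    The consolidation idea can be illustrated by a per-position majority vote across the reference and alternative local-context probe calls, ignoring no-calls. This is a sketch of the concept only; ReseqChip's actual algorithm is more involved:

```python
from collections import Counter

def consolidate_base_calls(calls_per_probe_set):
    """Consolidate per-position base calls from several probe sets
    (reference plus alternative local-context probes) by majority
    vote, ignoring no-calls ('N'). Each input string is one probe
    set's call sequence; all must have equal length."""
    consensus = []
    for calls in zip(*calls_per_probe_set):
        votes = Counter(c for c in calls if c != 'N')
        consensus.append(votes.most_common(1)[0][0] if votes else 'N')
    return ''.join(consensus)
```

    The benefit mirrors the abstract: a position that is a no-call in one probe set can still be resolved by another set, reducing the number of non-called bases.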

  5. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices.

    Science.gov (United States)

    Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S

    2013-02-15

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, was demonstrated for detection of Escherichia coli 0157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.

  6. Carotid Catheterization and Automated Blood Sampling Induce Systemic IL-6 Secretion and Local Tissue Damage and Inflammation in the Heart, Kidneys, Liver and Salivary Glands in NMRI Mice

    DEFF Research Database (Denmark)

    Teilmann, Anne Charlotte; Rozell, Björn; Kalliokoski, Otto

    2016-01-01

    Automated blood sampling through a vascular catheter is a frequently utilized technique in laboratory mice. The potential immunological and physiological implications associated with this technique have, however, not been investigated in detail. The present study compared plasma levels of the cyt...... and embolized to distant sites. Thus, catheterization and subsequent automated blood sampling may have physiological impact. Possible confounding effects of visceral damage should be assessed and considered, when using catheterized mouse models....

  7. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. A multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
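
    Physical-model features of the kind described (humans query slowly, at irregular intervals, and click results) can be sketched together with a threshold-based binary decision. The features and thresholds here are hypothetical illustrations, not the paper's actual classifiers:

```python
def session_features(timestamps, num_clicks):
    """Physical-model features for one search session (assumes at
    least two queries): query rate, variance of inter-query gaps
    (scripted clients are often metronome-regular), and the fraction
    of queries followed by a result click."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    rate = len(timestamps) / max(timestamps[-1] - timestamps[0], 1e-9)
    mean_gap = sum(gaps) / len(gaps)
    var_gap = sum((g - mean_gap) ** 2 for g in gaps) / len(gaps)
    click_rate = num_clicks / len(timestamps)
    return rate, var_gap, click_rate

def looks_automated(timestamps, num_clicks,
                    max_rate=1.0, min_var=0.05, min_clicks=0.1):
    """Binary classifier stub: flag sessions that query faster than a
    plausible human, at near-constant intervals, or without clicking
    any result. Thresholds are illustrative only."""
    rate, var_gap, click_rate = session_features(timestamps, num_clicks)
    return rate > max_rate or var_gap < min_var or click_rate < min_clicks
```

    A real system would feed such features into trained binary classifiers and then a multiclass model, as the abstract describes, rather than relying on fixed cutoffs.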

  8. Multiple Property Cross Direction Control of Paper Machines

    Directory of Open Access Journals (Sweden)

    Markku Ohenoja

    2011-07-01

    Full Text Available Cross direction (CD) control in a sheet-forming process is a challenging, high-dimensional problem. Accounting for the interactions between different properties and actuators increases the dimensionality further and also raises computational issues. We present a multiple property controller particularly suited to imaging measurements, which provide a high sampling frequency and therefore enable a short control interval. The simulation results demonstrate the benefits of multiple property CD control over single property control and over single property control with full feedforward compensation. The controller presented may also be tuned in an automated manner, and the results demonstrate the effect of tuning on input saturation.

  9. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)

    2015-04-29

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. Samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage, an effective way to replace the redundant manual data entries that laboratory personnel previously had to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
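
    The batch registration step (auto-generating a code per sample, standard, SRM or blank) can be sketched as follows. The described system is written in LabVIEW; this Python sketch and the code format shown are hypothetical illustrations, not Nuclear Malaysia's actual scheme:

```python
from datetime import date

def register_batch(batch_no, samples, today=None):
    """Auto-generate a sample code for every item in one NAA batch.
    `samples` is a list of (name, kind) pairs, where kind is e.g.
    'sample', 'standard', 'srm' or 'blank'. Returns one registration
    record per item, ready for form printing and downstream analysis."""
    today = today or date.today()
    records = []
    for i, (name, kind) in enumerate(samples, start=1):
        # hypothetical code format: year, zero-padded batch, type, index
        code = f"NAA{today.year}-B{batch_no:03d}-{kind[:3].upper()}{i:02d}"
        records.append({"code": code, "name": name, "type": kind})
    return records
```

    Generating codes programmatically removes the redundant logbook/vial/form transcription steps the abstract describes, and the stored records can be passed directly to the analysis program.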

  10. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Science.gov (United States)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually: samples were recorded in a logbook and given an ID number, and all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. The sample registration software was developed as part of the IAEA/CRP project on 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage, an effective way to replace the redundant manual data entries that laboratory personnel previously had to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.

  11. Automated pose estimation of objects using multiple ID devices for handling and maintenance task in nuclear fusion reactor

    International Nuclear Information System (INIS)

    Umetani, Tomohiro; Morioka, Jun-ichi; Tamura, Yuichi; Inoue, Kenji; Arai, Tatsuo; Mae, Yasusi

    2011-01-01

    This paper describes a method for the automated estimation of the three-dimensional pose (position and orientation) of objects by autonomous robots, using multiple identification (ID) devices. Our goal is to estimate object pose for assembly and maintenance tasks in a real nuclear fusion reactor system, with autonomous robots cooperating with a virtual assembly system. This paper also discusses a method of motion generation for ID acquisition, using sensory data acquired by the measurement system attached to the robots and from the environment. Experimental results show the feasibility of the proposed method. (author)

  12. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    Science.gov (United States)

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for coagulation analyses. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet-poor plasma specimens. To this end, manually processed specimens centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between the different centrifugation methods, with correlation coefficients between 0.98 and 0.99 and a bias between -5% and +6%. For seldom-performed assays that do not mandate full automation, Passing-Bablok regression analysis showed acceptable to poor agreement between the different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
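Passing-Bablok regression, used above to compare the centrifugation methods, is a non-parametric method-comparison fit: the slope is a shifted median of all pairwise slopes, making the estimate robust to outliers. A minimal sketch, with simplifying assumptions (ties and pairwise slopes of exactly -1 are skipped, and no confidence intervals are computed):

```python
import statistics

def passing_bablok(x, y):
    """Simplified Passing-Bablok regression; returns (slope, intercept)."""
    slopes = []
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            dx = x[j] - x[i]
            if dx != 0:
                s = (y[j] - y[i]) / dx
                if s != -1:          # slopes of exactly -1 are excluded
                    slopes.append(s)
    slopes.sort()
    k = sum(1 for s in slopes if s < -1)   # offset correcting for negative slopes
    m = len(slopes)
    idx = (m - 1) / 2 + k                  # shifted median position
    if idx == int(idx):
        slope = slopes[int(idx)]
    else:
        lo = int(idx)
        slope = 0.5 * (slopes[lo] + slopes[lo + 1])
    intercept = statistics.median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Two hypothetical methods related by y = 2x + 1:
print(passing_bablok([1, 2, 3, 4], [3, 5, 7, 9]))  # (2.0, 1.0)
```

A slope near 1 and an intercept near 0 indicate agreement between the two methods; systematic proportional or constant bias shifts them away from those values.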

  13. Multiple steroid radioimmunoassays and automation. Versatile techniques for reproductive endocrinology

    International Nuclear Information System (INIS)

    Vihko, R.; Hammond, G.L.; Pakarinen, A.; Viinikka, L.

    1978-01-01

    The combination of the efficient steroid-separating properties of a lipophilic Sephadex derivative Lipidex-5000™ and the use of antibodies with carefully selected specificity allows the quantitative determination of pregnenolone, progesterone, 17α-hydroxyprogesterone, androstenedione, testosterone, 5α-dihydrotestosterone, 5α-androstanedione, androsterone and 5α-androstane-3α, 17β-diol in 1- to 2-ml samples of both blood serum and amniotic fluid as well as in 300- to 600-mg pieces of prostatic tissue. The adaptation of the pipetting unit and incubator of a discrete clinical chemical analyser, System Olli 3000, for the automation of the radioimmunoassays has resulted in a greatly increased throughput and has decreased the experimental error of the procedure. In studies on reproductive endocrinology, the methodology developed has allowed the detection of a sex difference in androgen composition of the amniotic fluid early in pregnancy. Further, it is very likely that the decline in steroid production by the testis seen during the first year of life and then in senescence is affected by basically different mechanisms. There are also important differences in the steroid content of normal, hyperplastic and carcinomatous prostate. (author)

  14. Multiple steroid radioimmunoassays and automation: versatile techniques for reproductive endocrinology

    International Nuclear Information System (INIS)

    Vihko, R.; Hammond, G.L.; Pakarinen, A.; Viinikka, L.

    1977-01-01

    The combination of the efficient steroid separating properties of a lipophilic Sephadex derivative Lipidex-5000™, with the use of antibodies with carefully selected specificity allows the quantitative determination of pregnenolone, progesterone, 17α-hydroxyprogesterone, androstenedione, testosterone, 5α-dihydrotestosterone, 5α-androstanedione, androsterone and 5α-androstane-3α, 17β-diol from 1-2 ml samples of blood serum, amniotic fluid or 300-600 mg pieces of prostatic tissue. The adaptation of the pipetting unit and incubator of a discrete clinical chemical analyzer, System Olli 3000, for the automation of the radioimmunoassays has resulted in a greatly increased throughput and decreased experimental error of the procedure. In studies on reproductive endocrinology, the methodology developed has allowed the detection of a sex difference in androgen composition of the amniotic fluid early in pregnancy. Further, it is very likely that the decline in steroid production by the testis seen during the first year of life and then in senescence is affected by basically different mechanisms. There are also important differences in the steroid content of normal, hyperplastic and carcinomatous prostate. (orig.)

  15. Single- versus multiple-sample method to measure glomerular filtration rate.

    Science.gov (United States)

    Delanaye, Pierre; Flamant, Martin; Dubourg, Laurence; Vidal-Petiot, Emmanuelle; Lemoine, Sandrine; Cavalier, Etienne; Schaeffner, Elke; Ebert, Natalie; Pottel, Hans

    2018-01-08

    There are many different ways to measure glomerular filtration rate (GFR) using various exogenous filtration markers, each having their own strengths and limitations. However, not only the marker, but also the methodology may vary in many ways, including the use of urinary or plasma clearance and, in the case of plasma clearance, the number of time points used to calculate the area under the concentration-time curve, ranging from only one (Jacobsson method) to eight (or more) blood samples. We collected the results obtained from 5106 plasma clearances (iohexol or 51Cr-ethylenediaminetetraacetic acid (EDTA)) using three to four time points, allowing GFR calculation using the slope-intercept method and the Bröchner-Mortensen correction. For each time point, the Jacobsson formula was applied to obtain the single-sample GFR. We used Bland-Altman plots to determine the accuracy of the Jacobsson method at each time point. The single-sample method showed concordance within 10% of the multiple-sample method in 66.4%, 83.6%, 91.4% and 96.0% of cases at the 120, 180, 240 and ≥300 min time points, respectively. Concordance was poorer at lower GFR levels, a trend that parallels increasing age. Results were similar in males and females. Some discordance was found in obese subjects. Single-sample GFR is highly concordant with a multiple-sample strategy, except in the low GFR range (<30 mL/min). © The Author 2018. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
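The slope-intercept calculation with the Bröchner-Mortensen correction used in the study above can be sketched as follows. This is a simplified illustration, assuming a mono-exponential terminal phase and consistent units (e.g. dose in MBq, concentrations in MBq/mL, times in minutes); the coefficients are the widely used adult Bröchner-Mortensen values:

```python
import math

def slope_intercept_gfr(dose, times_min, concentrations):
    """GFR from late plasma samples via the slope-intercept method.

    Fits ln(C) = ln(C0) - k*t by least squares on the terminal phase,
    then returns the one-pool GFR estimate dose * k / C0.
    """
    n = len(times_min)
    logs = [math.log(c) for c in concentrations]
    t_mean = sum(times_min) / n
    l_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(times_min, logs))
             / sum((t - t_mean) ** 2 for t in times_min))
    k = -slope                                  # elimination rate constant
    c0 = math.exp(l_mean - slope * t_mean)      # back-extrapolated intercept
    return dose * k / c0

def brochner_mortensen(gfr_one_pool):
    """Correct the one-pool overestimate (adult coefficients, BM 1972)."""
    return 0.990778 * gfr_one_pool - 0.001218 * gfr_one_pool ** 2
```

For a single-sample protocol, the Jacobsson formula replaces the fitted terminal slope with an iteratively estimated one, which is why its agreement with the multi-sample result improves at later sampling times, as the abstract reports.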

  16. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    Science.gov (United States)

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  17. Automated detection of multiple sclerosis lesions in serial brain MRI

    International Nuclear Information System (INIS)

    Llado, Xavier; Ganiler, Onur; Oliver, Arnau; Marti, Robert; Freixenet, Jordi; Valls, Laia; Vilanova, Joan C.; Ramio-Torrenta, Lluis; Rovira, Alex

    2012-01-01

    Multiple sclerosis (MS) is a serious disease, typically affecting the brain, for which diagnosis and the monitoring of treatment efficacy are vital. Magnetic resonance imaging (MRI) is frequently used in serial brain imaging due to the rich and detailed information provided. Time-series analysis of images is widely used for MS diagnosis and patient follow-up. However, conventional manual methods are time-consuming, subjective, and error-prone. Thus, the development of automated techniques for the detection and quantification of MS lesions is a major challenge. This paper presents an up-to-date review of the approaches that deal with the time-series analysis of brain MRI for detecting active MS lesions and quantifying lesion load change. We provide a comprehensive reference source for researchers in which several approaches to change detection and quantification of MS lesions are investigated and classified. We also analyze the results provided by the approaches, discuss open problems, and point out possible future trends. Lesion detection approaches are required for the detection of static lesions and for diagnostic purposes, while either quantification of detected lesions or change detection algorithms are needed to follow up MS patients. However, no single approach has yet emerged as a standard for clinical practice that automatically provides an accurate quantification of MS lesion evolution. Future trends will focus on combining lesion detection in single studies with the analysis of change detection in serial MRI. (orig.)

  18. Automated detection of multiple sclerosis lesions in serial brain MRI

    Energy Technology Data Exchange (ETDEWEB)

    Llado, Xavier; Ganiler, Onur; Oliver, Arnau; Marti, Robert; Freixenet, Jordi [University of Girona, Computer Vision and Robotics Group, Girona (Spain); Valls, Laia [Dr. Josep Trueta University Hospital, Department of Radiology, Girona (Spain); Vilanova, Joan C. [Girona Magnetic Resonance Center, Girona (Spain); Ramio-Torrenta, Lluis [Dr. Josep Trueta University Hospital, Institut d' Investigacio Biomedica de Girona, Multiple Sclerosis and Neuroimmunology Unit, Girona (Spain); Rovira, Alex [Vall d' Hebron University Hospital, Magnetic Resonance Unit, Department of Radiology, Barcelona (Spain)

    2012-08-15

    Multiple sclerosis (MS) is a serious disease, typically affecting the brain, for which diagnosis and the monitoring of treatment efficacy are vital. Magnetic resonance imaging (MRI) is frequently used in serial brain imaging due to the rich and detailed information provided. Time-series analysis of images is widely used for MS diagnosis and patient follow-up. However, conventional manual methods are time-consuming, subjective, and error-prone. Thus, the development of automated techniques for the detection and quantification of MS lesions is a major challenge. This paper presents an up-to-date review of the approaches that deal with the time-series analysis of brain MRI for detecting active MS lesions and quantifying lesion load change. We provide a comprehensive reference source for researchers in which several approaches to change detection and quantification of MS lesions are investigated and classified. We also analyze the results provided by the approaches, discuss open problems, and point out possible future trends. Lesion detection approaches are required for the detection of static lesions and for diagnostic purposes, while either quantification of detected lesions or change detection algorithms are needed to follow up MS patients. However, no single approach has yet emerged as a standard for clinical practice that automatically provides an accurate quantification of MS lesion evolution. Future trends will focus on combining lesion detection in single studies with the analysis of change detection in serial MRI. (orig.)

  19. Automation of TL brick dating by ADAM-1

    International Nuclear Information System (INIS)

    Cechak, T.; Gerndt, J.; Hirsl, P.; Jirousek, P.; Kubelik, M.; Musilek, L.; Kanaval, J.

    2000-01-01

    Thermoluminescence has become an established dating method for ceramics and, more recently, for bricks. Based on the experience of the work carried out since the late 1970s at the Rathgen-Forschungslabor in Berlin on the dating of bricks from historic architecture, and after evaluating all commercially available and some individually built automated and semi-automated TL-readers, a specially adapted machine for the fine-grain dating of bricks was constructed in an interdisciplinary research project, undertaken by a team recruited from three faculties of the Czech Technical University in Prague. The result is the automated TL-reader ADAM-1 (Automated Dating Apparatus for Monuments) for the dating of historic architecture. Both the specific adaptation of the technique and the necessary optimal automation have influenced the design of this TL-reader. The principal advantage of brick as opposed to ceramic TL-dating emerges from the possibility of obtaining both a large number of samples and an above-average quantity of datable material from each sample. This, together with the specific physical and chemical conditions in a brick wall, allowed a rethinking of the traditional error calculation and thus lower error margins than those obtained when dating ceramic shards. The TL-reader must therefore be able to measure and evaluate numerous samples automatically. The annular sample holder of ADAM-1 has 60 sample positions, which allow the irradiation and evaluation of samples taken from two locations. The thirty samples from one sampling point are divided into subgroups, which are processed in various ways. Three samples serve for a rough estimate of the TL sensitivity of the brick material. Nine samples are used for the measurement of 'natural TL' of the material. A further nine samples are used for testing the sensitivity of the material to beta radiation. The last nine samples serve for the testing of the sensitivity to alpha radiation. To determine the

  20. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day

  1. Low-sampling-rate M-ary multiple access UWB communications in multipath channels

    KAUST Repository

    Alkhodary, Mohammad T.

    2015-08-31

    The desirable characteristics of ultra-wideband (UWB) technology are challenged by a formidable sampling frequency, performance degradation in the presence of multi-user interference, and receiver complexity due to the channel estimation process. In this paper, a low-rate-sampling technique is used to implement M-ary multiple access UWB communications in both the detection and channel estimation stages. A novel approach is used for multiple-access-interference (MAI) cancellation for the purpose of channel estimation. Results show reasonable performance of the proposed receiver for different numbers of users operating many times below the Nyquist rate.

  2. Low-sampling-rate M-ary multiple access UWB communications in multipath channels

    KAUST Repository

    Alkhodary, Mohammad T.; Ballal, Tarig; Al-Naffouri, Tareq Y.; Muqaibel, Ali H.

    2015-01-01

    The desirable characteristics of ultra-wideband (UWB) technology are challenged by a formidable sampling frequency, performance degradation in the presence of multi-user interference, and receiver complexity due to the channel estimation process. In this paper, a low-rate-sampling technique is used to implement M-ary multiple access UWB communications in both the detection and channel estimation stages. A novel approach is used for multiple-access-interference (MAI) cancellation for the purpose of channel estimation. Results show reasonable performance of the proposed receiver for different numbers of users operating many times below the Nyquist rate.

  3. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, an approach to automated cytological investigations based on portable microfluidic microscopy is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes, at a high level, the network-form game framework (based on Bayesian networks and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  5. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    Science.gov (United States)

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities were trained and tested in an automated match to samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  6. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    Science.gov (United States)

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two key parameters, dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved over three months of continuous water quality monitoring. Compared with a traditional adaptive sampling algorithm (ASA) on the same dataset, while achieving around the same Normalized Mean Error (NME), DDASA is superior, saving 5.31% more battery energy.
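The core idea of data-driven adaptive sampling — lengthening the sampling interval when readings are stable and shortening it when they change quickly — can be illustrated with a small sketch. The halving/doubling rule, thresholds, and bounds below are illustrative assumptions, not the published DDASA:

```python
def adjust_interval(recent, interval, min_s=60, max_s=3600, threshold=0.05):
    """Adapt the sampling interval to the recent variability of a
    monitored parameter (e.g. dissolved oxygen or turbidity).

    Sample faster when the relative spread of recent readings exceeds
    the threshold; sample slower (saving battery) when they are stable.
    """
    if len(recent) < 2:
        return interval
    span = max(recent) - min(recent)
    baseline = abs(sum(recent) / len(recent)) or 1.0  # avoid divide-by-zero
    if span / baseline > threshold:
        return max(min_s, interval // 2)   # readings changing: sample faster
    return min(max_s, interval * 2)        # readings stable: sample slower

print(adjust_interval([7.0, 7.01, 7.0], 600))   # stable DO -> 1200 s
print(adjust_interval([7.0, 8.5, 6.2], 600))    # fluctuating DO -> 300 s
```

Bounding the interval on both sides guarantees a minimum temporal resolution during quiet periods and caps the power draw during turbulent ones.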

  7. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Tongxin Shu

    2017-11-01

    Full Text Available Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two key parameters, dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved over three months of continuous water quality monitoring. Compared with a traditional adaptive sampling algorithm (ASA) on the same dataset, while achieving around the same Normalized Mean Error (NME), DDASA is superior, saving 5.31% more battery energy.

  8. Manual versus automated streaking system in clinical microbiology laboratory: Performance evaluation of Previ Isola for blood culture and body fluid samples.

    Science.gov (United States)

    Choi, Qute; Kim, Hyun Jin; Kim, Jong Wan; Kwon, Gye Cheol; Koo, Sun Hoe

    2018-01-04

    The process of plate streaking has been automated to improve the routine workflow of clinical microbiology laboratories. Although many evaluation reports exist on the inoculation of various body fluid samples, few evaluations have been reported for blood. In this study, we evaluated the performance of the automated inoculating system Previ Isola for various routine clinical samples, including blood. Blood culture, body fluid, and urine samples were collected. All samples were inoculated on both sheep blood agar plates (BAP) and MacConkey agar plates (MCK) using Previ Isola and the manual method. We compared the two methods with respect to the quality and quantity of cultures and the sample processing time. To ensure objective colony counting, an enumeration reading reference was established through a preliminary experiment. A total of 377 nonduplicate samples (102 blood culture, 203 urine, 72 body fluid) were collected and inoculated. The concordance rate for quality was 100%, 97.0%, and 98.6% for blood, urine, and other body fluids, respectively. For quantity, it was 98.0%, 97.0%, and 95.8%, respectively. Previ Isola took slightly longer than the manual method to inoculate the specimens, but hands-on time decreased dramatically; the reduction was about 6 minutes per 10 samples. We demonstrated that Previ Isola shows high concordance with the manual method in the inoculation of various body fluids, especially blood culture samples. The use of Previ Isola in clinical microbiology laboratories is expected to save considerable time and human resources. © 2018 Wiley Periodicals, Inc.

  9. Pharmacokinetic Studies of Chinese Medicinal Herbs Using an Automated Blood Sampling System and Liquid Chromatography-mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Yu-Tse Wu

    2012-01-01

    Full Text Available The safety of herbal products is one of the major concerns for the modernization of traditional Chinese medicine, and pharmacokinetic data of medicinal herbs guide us to design the rational use of the herbal formula. This article reviews the advantages of automated blood sampling (ABS) systems for pharmacokinetic studies. In addition, three commonly used sample preparative methods, protein precipitation, liquid-liquid extraction and solid-phase extraction, are introduced. Furthermore, the definition, causes and evaluation of matrix effects in liquid chromatography-mass spectrometry (LC/MS) analysis are demonstrated. Finally, we present our previous works as practical examples of the application of ABS systems and LC/MS for the pharmacokinetic studies of Chinese medicinal herbs.

  10. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Science.gov (United States)

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every microscopic image is recorded. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248

  11. Automated long-term monitoring of parallel microfluidic operations applying a machine vision-assisted positioning method.

    Science.gov (United States)

    Yip, Hon Ming; Li, John C S; Xie, Kai; Cui, Xin; Prasad, Agrim; Gao, Qiannan; Leung, Chi Chiu; Lam, Raymond H W

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every microscopic image is recorded. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.

  12. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Directory of Open Access Journals (Sweden)

    Hon Ming Yip

    2014-01-01

    Full Text Available As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every microscopic image is recorded. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.
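The mark-based realignment described in the record above can be illustrated with a minimal sketch: locate the alignment mark in the captured frame, then compute the stage offset back to its preset position. The papers' actual image-processing algorithms are not specified here; this assumed stand-in uses exhaustive sum-of-squared-differences template matching on toy grayscale data:

```python
def locate_mark(image, template):
    """Return the (row, col) of the best template match in a 2-D
    grayscale image (lists of lists), by minimizing the sum of
    squared differences over every candidate position."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def stage_correction(found, expected):
    """Offset to move the stage so the mark returns to its preset spot."""
    return (expected[0] - found[0], expected[1] - found[1])
```

In practice a normalized correlation metric and subpixel interpolation would be preferred for robustness to illumination changes, but the control loop is the same: match, difference, move.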

  13. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to their relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as achievable heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  14. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to their relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as achievable heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.
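
    A rough timing model illustrates why asynchronous loading/unloading raises throughput: the transfer arm stages the next sample while the furnace processes the current one, so only the longer of the two steps paces the batch. All durations here are invented for illustration and do not come from the paper.

```python
def batch_time_minutes(n_samples, process_min, load_min, asynchronous=True):
    """Wall-clock time for a batch of furnace runs.

    asynchronous=True: loading overlaps processing (Robofurnace-style),
    so after the first load only the slower step paces each run.
    asynchronous=False: strictly serial manual operation."""
    if asynchronous:
        return load_min + n_samples * max(process_min, load_min)
    return n_samples * (process_min + load_min)
```

    For example, with 30 min runs and 5 min load steps, a 10-sample magazine finishes in 305 min asynchronously versus 350 min serially, under these assumed timings.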

  15. Automated track recognition and event reconstruction in nuclear emulsion

    International Nuclear Information System (INIS)

    Deines-Jones, P.; Aranas, A.; Cherry, M.L.; Dugas, J.; Kudzia, D.; Nilsen, B.S.; Sengupta, K.; Waddington, C.J.; Wefel, J.P.; Wilczynska, B.; Wilczynski, H.; Wosiek, B.

    1997-01-01

    The major advantages of nuclear emulsion for detecting charged particles are its submicron position resolution and sensitivity to minimum ionizing particles. These must be balanced, however, against the difficult manual microscope measurement by skilled observers required for the analysis. We have developed an automated system to acquire and analyze the microscope images from emulsion chambers. Each emulsion plate is analyzed independently, allowing coincidence techniques to be used in order to reject background and estimate error rates. The system has been used to analyze a sample of high-multiplicity Pb-Pb interactions (charged particle multiplicities ≈ 1100) produced by the 158 GeV/c per nucleon 208Pb beam at CERN. Automatically measured events agree with our best manual measurements on 97% of all the tracks. We describe the image analysis and track reconstruction techniques, and discuss the measurement and reconstruction uncertainties. (orig.)
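
    The plate-by-plate coincidence idea can be illustrated with a toy matcher: track coordinates measured independently on two plates are paired when they agree within a tolerance, and unmatched points are treated as background. The data layout, tolerance, and greedy pairing are assumptions for illustration, not the authors' reconstruction code.

```python
def match_tracks(plate_a, plate_b, tol=0.5):
    """Pair (x, y) track positions from two independently measured plates
    when both coordinates agree within tol; return (matched, background).
    Greedy first-match pairing, adequate for a sketch."""
    matched, background = [], []
    used = set()
    for p in plate_a:
        hit = next((i for i, q in enumerate(plate_b)
                    if i not in used
                    and abs(p[0] - q[0]) <= tol
                    and abs(p[1] - q[1]) <= tol), None)
        if hit is None:
            background.append(p)   # no coincidence -> candidate background
        else:
            used.add(hit)
            matched.append(p)
    return matched, background
```

    Because each plate is measured independently, the fraction of unmatched points also gives a handle on the measurement error rate, which is the point of the coincidence technique.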

  16. Performance of optimized McRAPD in identification of 9 yeast species frequently isolated from patient samples: potential for automation.

    Science.gov (United States)

    Trtkova, Jitka; Pavlicek, Petr; Ruskova, Lenka; Hamal, Petr; Koukalova, Dagmar; Raclavsky, Vladislav

    2009-11-10

    Rapid, easy, economical and accurate species identification of yeasts isolated from clinical samples remains an important challenge for routine microbiological laboratories, because susceptibility to antifungal agents, the probability of developing resistance, and the ability to cause disease vary among species. To overcome the drawbacks of the currently available techniques we have recently proposed an innovative approach to yeast species identification based on RAPD genotyping and termed McRAPD (Melting curve of RAPD). Here we have evaluated its performance on a broader spectrum of clinically relevant yeast species and also examined the potential of automated and semi-automated interpretation of McRAPD data for yeast species identification. A simple fully automated algorithm based on normalized melting data identified 80% of the isolates correctly. When this algorithm was supplemented by semi-automated matching of decisive peaks in first derivative plots, 87% of the isolates were identified correctly. However, a computer-aided visual matching of derivative plots showed the best performance, with an average of 98.3% of isolates identified accurately, almost matching the 99.4% performance of traditional RAPD fingerprinting. Since the McRAPD technique omits gel electrophoresis and can be performed in a rapid, economical and convenient way, we believe that it can find its place in routine identification of medically important yeasts in advanced diagnostic laboratories that are able to adopt this technique. It can also serve as a broad-range high-throughput technique for epidemiological surveillance.
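
    Automated matching of normalized melting data can be sketched as nearest-reference classification by Pearson correlation: scale each curve to [0, 1], then assign the species whose reference curve correlates best with the unknown. The species names and curves below are fabricated for illustration, and the published algorithm additionally compares derivative-plot peaks.

```python
import numpy as np

def normalize(curve):
    """Scale a melting curve to the [0, 1] range."""
    c = np.asarray(curve, dtype=float)
    return (c - c.min()) / (c.max() - c.min())

def identify(unknown, library):
    """Assign the species whose reference curve has the highest Pearson
    correlation with the unknown on normalized data; returns (species, r)."""
    u = normalize(unknown)
    best_species, best_r = None, -2.0
    for species, reference in library.items():
        r = float(np.corrcoef(u, normalize(reference))[0, 1])
        if r > best_r:
            best_species, best_r = species, r
    return best_species, best_r
```

    Normalization removes amplitude differences between runs, so only the shape of the melting transition drives the assignment.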

  17. Trace element analysis of environmental samples by multiple prompt gamma-ray analysis method

    International Nuclear Information System (INIS)

    Oshima, Masumi; Matsuo, Motoyuki; Shozugawa, Katsumi

    2011-01-01

    The multiple γ-ray detection method has proved to be a high-resolution, high-sensitivity method for nuclide quantification. The neutron prompt γ-ray analysis method is successfully extended by combining it with the multiple γ-ray detection method, an approach called multiple prompt γ-ray analysis (MPGA). In this review we show the principle of this method and its characteristics. Several examples of its application to environmental samples, especially river sediments from urban areas and sea sediments, are also described. (author)

  18. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In an age when celerity is expected of every sector, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries, and conservation of resources in multiple sectors. To implement RPA, many software ...

  19. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Directory of Open Access Journals (Sweden)

    Kamfai Chan

    Full Text Available Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.

  20. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424
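
    The reported 33% shortening is consistent with a simple per-cycle timing model in which short shuttle moves between two fixed-temperature baths replace the temperature ramps of a conventional cycler. The step durations below are assumptions chosen for illustration, not values from the paper.

```python
def pcr_runtime_minutes(cycles=35, denature_s=30, anneal_extend_s=60,
                        move_s=3, ramp_s=0):
    """Total run time of a two-temperature PCR protocol.

    Each cycle = denature hold + anneal/extend hold + two transitions.
    A transition is a quick tube shuttle (move_s) plus, for a conventional
    cycler, a temperature ramp (ramp_s)."""
    per_cycle_s = denature_s + anneal_extend_s + 2 * (move_s + ramp_s)
    return cycles * per_cycle_s / 60.0
```

    With 3 s shuttles and no ramping, a 35-cycle run takes 56 min; giving the same protocol 25 s ramps instead yields about 82 min, roughly the 33% saving the authors report, under these assumed timings.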

  1. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation.   Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  2. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation. Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  3. BOA: Framework for Automated Builds

    CERN Document Server

    Ratnikova, N

    2003-01-01

    Managing large-scale software products is a complex software engineering task. The automation of the software development, release, and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms, and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components that have been developed in CMS to facilitate software maintenance and improve software quality. The system allows one to generate, control, and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases, and installation of the existing versions.

  4. BOA: Framework for automated builds

    International Nuclear Information System (INIS)

    Ratnikova, N.

    2003-01-01

    Managing large-scale software products is a complex software engineering task. The automation of the software development, release, and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms, and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components that have been developed in CMS to facilitate software maintenance and improve software quality. The system allows one to generate, control, and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases, and installation of the existing versions.

  5. Automation of Acquisitions at Parkland College

    Directory of Open Access Journals (Sweden)

    Ruth C. Carter

    1972-06-01

    Full Text Available This paper presents a case study of the automation of acquisitions functions at Parkland College. This system, utilizing batch processing, demonstrates that small libraries can develop and support large-scale automated systems at a reasonable cost. In operation since September 1971, it provides machine-generated purchase orders, multiple order cards, budget statements, overdue notices to vendors, and many cataloging by-products. The entire collection, print and nonprint, of the Learning Resource Center is being accumulated gradually into a machine-readable data base.

  6. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    Energy Technology Data Exchange (ETDEWEB)

    Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McBay, Eddy H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-30

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  7. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Full Text Available Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well-handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.
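
    The MI framework the article borrows from combines per-imputation results with Rubin's rules: average the m point estimates, and add the between-imputation variance (inflated by 1 + 1/m) to the average within-imputation variance. This sketch shows only the standard rules, not the paper's synthetic-MI extension.

```python
def rubin_combine(estimates, variances):
    """Combine point estimates and variances from m imputed datasets
    using Rubin's rules; returns (pooled estimate, total variance)."""
    m = len(estimates)
    qbar = sum(estimates) / m                              # pooled estimate
    ubar = sum(variances) / m                              # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)  # between-imputation variance
    return qbar, ubar + (1 + 1 / m) * b
```

    The (1 + 1/m) factor accounts for using only a finite number of imputations, which is why the total variance exceeds the naive average of the per-dataset variances.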

  8. Automated identification and quantification of glycerophospholipid molecular species by multiple precursor ion scanning

    DEFF Research Database (Denmark)

    Ejsing, Christer S.; Duchoslav, Eva; Sampaio, Julio

    2006-01-01

    We report a method for the identification and quantification of glycerophospholipid molecular species that is based on the simultaneous automated acquisition and processing of 41 precursor ion spectra, specific for acyl anions of common fatty acid moieties and several lipid class-specific fragment... of glycerophospholipids. The automated analysis of total lipid extracts was powered by a robotic nanoflow ion source and produced currently the most detailed description of the glycerophospholipidome....

  9. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling when employed in conjunction with liquid capture followed by nanoelectrospray ionization provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm × 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent-resistant surfaces.

  10. Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.

    Science.gov (United States)

    Price, W R; Olsen, R A; Hunter, J E

    1972-04-01

    A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
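
    The saving from pooling can be quantified with the classic two-stage (Dorfman) calculation, assuming a positive pool triggers individual retests of the retained broth portions: pooling k samples costs 1/k of a test per sample up front, plus a full retest for every sample in a pool that turns out positive.

```python
def expected_tests_per_sample(p, k):
    """Expected number of analyses per food sample when k pre-enrichment
    broths are pooled into one enrichment test (the paper pools up to 25)
    and a positive pool is resolved by testing each member individually.
    p is the per-sample probability of Salmonella contamination."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k
```

    At 1% prevalence, pooling 25 broths needs only about 0.26 tests per sample instead of 1, which matches the abstract's point that pooling pays off when most samples are negative.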

  11. Efficient computation of the joint sample frequency spectra for multiple populations.

    Science.gov (United States)

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
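
    The observed joint SFS itself is a simple tabulation over polymorphic sites; this sketch builds it from per-site derived-allele counts in two populations. The hard problem momi solves, computing the *expected* SFS under a demographic model, is not attempted here; the data layout is an illustrative assumption.

```python
import numpy as np

def joint_sfs(counts1, counts2, n1, n2):
    """Joint sample frequency spectrum for two populations: entry (i, j)
    counts the polymorphic sites with i derived alleles among n1 sampled
    chromosomes in population 1 and j among n2 in population 2."""
    sfs = np.zeros((n1 + 1, n2 + 1), dtype=int)
    for i, j in zip(counts1, counts2):
        sfs[i, j] += 1
    return sfs
```

    SFS-based inference then compares this observed table against the expected spectrum implied by a candidate demographic history.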

  12. 32P-postlabeling assay for carcinogen-DNA adducts: description of beta shielding apparatus and semi-automatic spotting and washing devices that facilitate the handling of multiple samples

    International Nuclear Information System (INIS)

    Reddy, M.V.; Blackburn, G.R.

    1990-01-01

    The utilization of the 32P-postlabeling assay in combination with TLC for the sensitive detection and estimation of aromatic DNA adducts has been increasing. The procedure consists of 32P-labeling of carcinogen-adducted 3'-nucleotides in the DNA digests using γ-32P ATP and polynucleotide kinase, separation of 32P-labeled adducts by TLC, and their detection by autoradiography. During both 32P-labeling and initial phases of TLC, a relatively high amount of γ-32P ATP is handled when 30 samples are processed simultaneously. We describe the design of acrylic shielding apparatus, semi-automatic TLC spotting devices, and devices for development and washing of multiple TLC plates, which not only provide substantial protection from exposure to 32P beta radiation, but also allow quick and easy handling of a large number of samples. Specifically, the equipment includes: (i) a multi-tube carousel rack having 15 wells to hold capless Eppendorf tubes and a rotatable lid with an aperture to access individual tubes; (ii) a pipette shielder; (iii) two semi-automatic spotting devices to apply radioactive solutions to TLC plates; (iv) a multi-plate holder for TLC plates; and (v) a mechanical device for washing multiple TLC plates. Item (i) is small enough to be held in one hand, vortexed, and centrifuged to mix the solutions in each tube while beta radiation is shielded. Items (iii) to (iv) aid in the automation of the assay. (author)

  13. Reactor pressure vessel stud management automation strategies

    International Nuclear Information System (INIS)

    Biach, W.L.; Hill, R.; Hung, K.

    1992-01-01

    The adoption of hydraulic tensioner technology as the standard for bolting and unbolting the reactor pressure vessel (RPV) head 35 yr ago represented an incredible commitment to new technology, but the existing technology was so primitive as to be clearly unacceptable. Today, a variety of approaches for improvement make the decision more difficult. Automation in existing installations must meet complex physical, logistic, and financial parameters while addressing the demands of reduced exposure, reduced critical path, and extended plant life. There are two generic approaches to providing automated RPV stud engagement and disengagement: the multiple stud tensioner and automated individual tools. A variation of the latter would include the handling system. Each has its benefits and liabilities

  14. Non-destructive isotopic uranium assay by multiple delayed neutron measurements

    International Nuclear Information System (INIS)

    Papadopoulos, N.N.; Tsagas, N.F.

    1991-01-01

    The high accuracy and precision required in nuclear safeguards measurements can be achieved by an improved neutron activation technique based on multiple delayed fission neutron counting under various experimental conditions. For the necessary ultrahigh counting statistics required, cyclic activation of multiple subsamples has been applied. The home-made automated flexible analytical system, with neutron flux and spectrum differentiation by irradiation position adjustment and cadmium screening, permits the non-destructive determination of the U235 abundance and the total U element concentration needed in nuclear safeguards sample analysis, with a high throughput and a low operational cost. Careful experimental optimization led to considerable improvement of the results.
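
    The case for ultrahigh counting statistics follows from Poisson statistics: the fractional 1σ uncertainty of a total count of N delayed neutrons is 1/√N, so cycling multiple subsamples to accumulate counts tightens precision. The counts below are illustrative, not values from the paper.

```python
def relative_precision(total_counts):
    """Fractional 1-sigma uncertainty of a Poisson-distributed count:
    sqrt(N) / N = 1 / sqrt(N)."""
    return total_counts ** -0.5
```

    Accumulating 25 cycled subsamples of 10,000 counts each improves the relative precision from 1% to 0.2%, which is the motivation for the cyclic activation scheme.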

  15. Economic and workflow analysis of a blood bank automated system.

    Science.gov (United States)

    Shin, Kyung-Hwa; Kim, Hyung Hoi; Chang, Chulhun L; Lee, Eun Yup

    2013-07-01

    This study compared the estimated costs and times required for ABO/Rh(D) typing and unexpected antibody screening using an automated system and manual methods. The total cost included direct and labor costs. Labor costs were calculated on the basis of the average operator salaries and unit values (minutes), which was the hands-on time required to test one sample. To estimate unit values, workflows were recorded on video, and the time required for each process was analyzed separately. The unit values of ABO/Rh(D) typing using the manual method were 5.65 and 8.1 min during regular and unsocial working hours, respectively. The unit value was less than 3.5 min when several samples were tested simultaneously. The unit value for unexpected antibody screening was 2.6 min. The unit values using the automated method for ABO/Rh(D) typing, unexpected antibody screening, and both simultaneously were all 1.5 min. The total cost of ABO/Rh(D) typing of only one sample using the automated analyzer was lower than that of testing only one sample using the manual technique but higher than that of testing several samples simultaneously. The total cost of unexpected antibody screening using an automated analyzer was less than that using the manual method. ABO/Rh(D) typing using an automated analyzer incurs a lower unit value and cost than that using the manual technique when only one sample is tested at a time. Unexpected antibody screening using an automated analyzer always incurs a lower unit value and cost than that using the manual technique.
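
    The study's cost structure, direct cost plus salary rate times hands-on minutes (the "unit value"), can be written as a one-line model. The monetary figures below are invented for illustration; only the unit values (5.65, 3.5, and 1.5 min) come from the abstract.

```python
def cost_per_test(direct_cost, salary_per_min, unit_value_min):
    """Total cost per sample = reagent/direct cost plus labor cost,
    where labor = operator salary rate x hands-on time (unit value)."""
    return direct_cost + salary_per_min * unit_value_min
```

    With an assumed salary rate of 0.5 per minute, a direct cost of 1.0 for manual typing and 3.0 for the automated analyzer, the model reproduces the abstract's ordering: automation beats single-sample manual testing but not batched manual testing.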

  16. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    Energy Technology Data Exchange (ETDEWEB)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Andruch, Vasil, E-mail: vasil.andruch@upjs.sk [Department of Analytical Chemistry, University of P.J. Šafárik, SK-04154 Košice (Slovakia); Moskvin, Leonid [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Bulatov, Andrey, E-mail: bulatov_andrey@mail.ru [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation)

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L{sup −1} Na{sub 2}CO{sub 3}) and the proton donor solution (1 mol L{sup −1} CH{sub 3}COOH). Formation of carbon dioxide microbubbles generated in situ leads to dispersion of the extraction solvent throughout the aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disruption of the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min{sup −1} for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L{sup −1} of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L{sup −1}. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on a stepwise injection analysis manifold in a flow-batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.
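
    The 3σ blank-based LOD reported here follows a standard calculation, sketched below with invented blank readings and a hypothetical calibration slope (neither is from the paper).

```python
import statistics

# LOD = 3 * s_blank / slope. Blank absorbances (AU at 345 nm) and the
# calibration slope (AU per umol/L) below are invented for illustration.
blank_abs = [0.012, 0.010, 0.013, 0.011, 0.012, 0.009, 0.011]
slope = 0.0065

s_blank = statistics.stdev(blank_abs)   # sample standard deviation of blanks
lod = 3 * s_blank / slope               # detection limit, in umol/L
print(round(lod, 2))
```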

  17. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    International Nuclear Information System (INIS)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton donor solution (1 mol L⁻¹ CH₃COOH). Formation of carbon dioxide microbubbles generated in situ leads to dispersion of the extraction solvent throughout the aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disruption of the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on a stepwise injection analysis manifold in a flow-batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.

  18. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Full Text Available Atomic force microscopy (AFM) has now become a powerful technique for investigating, at the molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper focuses on the analysis of AFM force curves collected on biological systems, in particular bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between the AFM probe and the bacterium are accounted for, and mechanical interactions operating after contact are described in terms of a Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope, or changes of curvature) that mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted using a regression procedure that fits experiment to theory. The flexibility, accuracy, and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
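
    The critical-point detection and Hertz-type fitting described above can be sketched on synthetic data; this is a minimal illustration under simplifying assumptions (1-D curve, threshold-based contact detection, a pure F = a·d^1.5 law), not the authors' algorithm.

```python
import numpy as np

# Synthetic approach curve: zero force before contact at z0, then a
# Hertz-like F = a * d**1.5 law (d = indentation depth past contact).
z = np.linspace(0.0, 10.0, 200)
z0, a_true = 4.0, 0.8
f = a_true * np.clip(z - z0, 0.0, None) ** 1.5
f += np.random.default_rng(0).normal(0.0, 0.01, z.size)   # measurement noise

# Critical-point detection: first index where the force rises clearly
# above the pre-contact baseline noise.
baseline = f[:50]
contact = int(np.argmax(f > baseline.mean() + 5 * baseline.std()))

# Hertz-type fit on the post-contact region, F = a * d**1.5, solved as a
# one-parameter linear least-squares problem in a.
d = z[contact:] - z[contact]
a_fit = float(np.sum(f[contact:] * d ** 1.5) / np.sum(d ** 3))
print(round(a_fit, 2))
```

    The recovered contact point and stiffness coefficient come out close to the true values used to generate the curve, despite the noise.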

  19. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    Science.gov (United States)

    2016-09-23

    Cummings et al., 2007). Automation designed to assist operators in overload situations may promote operator disengagement during periods of low...Calhoun et al., 2011). This testbed offers several tasks designed to emulate the cognitive demands that an operator managing multiple UAVs is likely...reliable (Cronbach’s α = 0.94) measure of affective and cognitive components of trust in automation. Items gauge confidence in an automation and

  20. Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.

    Science.gov (United States)

    Peters, Carl N; Evans, Iain E J

    2016-12-01

    Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients, a process that is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering with reduced mixing at the gradient layers when compared with usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and the likelihood of error. Overall, this device creates a simple and effective solution to the formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.

  1. Inventory management and reagent supply for automated chemistry.

    Science.gov (United States)

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  2. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points, such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and the analytical time and speed per test.

  3. Automation of the radiation measuring facilities for samples in health physics - MA 9

    International Nuclear Information System (INIS)

    Martini, M.

    1980-12-01

    Routine radiation measurements of samples are performed by the HMI health physics department by means of test stations for individual samples and multiple samples (using a changing equipment). The basic device of these test stations is a SCALER/TIMER system (BF 22/25, BERTHOLD Corp.). This measuring facility has been extended by CAMAC instrumentation, which incorporates an autonomous CAMAC processor (CAPRO-1, INCAA B.V.) for monitoring and automatic control of the system. The programming language is BASIC. A DECwriter (LA 34) is used for user interaction and for printing the measurement results. This report describes the features of this system and presents some examples of the dialogue with the system and the printout of data. (orig.) [de

  4. A multiple sampling ionization chamber for the External Target Facility

    International Nuclear Information System (INIS)

    Zhang, X.H.; Tang, S.W.; Ma, P.; Lu, C.G.; Yang, H.R.; Wang, S.T.; Yu, Y.H.; Yue, K.; Fang, F.; Yan, D.; Zhou, Y.; Wang, Z.M.; Sun, Y.; Sun, Z.Y.; Duan, L.M.; Sun, B.H.

    2015-01-01

    A multiple sampling ionization chamber used as a particle identification device for high-energy heavy ions has been developed for the External Target Facility. The performance of this detector was tested with a 239Pu α source and RI beams. A Z resolution (FWHM) of 0.4–0.6 was achieved for nuclear fragments of 18O at 400 AMeV.

  5. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described.

  6. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.

  7. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training, and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire the necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  8. Automated multi-lesion detection for referable diabetic retinopathy in indigenous health care.

    Science.gov (United States)

    Pires, Ramon; Carvalho, Tiago; Spurling, Geoffrey; Goldenstein, Siome; Wainer, Jacques; Luckie, Alan; Jelinek, Herbert F; Rocha, Anderson

    2015-01-01

    Diabetic Retinopathy (DR) is a complication of diabetes mellitus that affects more than one-quarter of the population with diabetes and can lead to blindness if not discovered in time. Automated screening enables the identification of patients who need further medical attention. This study aimed to classify retinal images of Aboriginal and Torres Strait Islander peoples using an automated computer-based multi-lesion eye screening program for diabetic retinopathy. The multi-lesion classifier was trained on 1,014 images from the São Paulo Eye Hospital and tested on retinal images containing no DR-related lesion, single lesions, or multiple types of lesions from the Inala Aboriginal and Torres Strait Islander health care centre. The automated multi-lesion classifier has the potential to enhance the efficiency of clinical practice delivering diabetic retinopathy screening. Our program does not require training images from the specific ethnic group or population being assessed and is independent of image pre- or post-processing to identify retinal lesions. In this Aboriginal and Torres Strait Islander population, the program achieved 100% sensitivity and 88.9% specificity in identifying bright lesions, while detection of red lesions achieved a sensitivity of 67% and a specificity of 95%. When both bright and red lesions were present, 100% sensitivity with 88.9% specificity was obtained. All results obtained with this automated screening program meet WHO standards for diabetic retinopathy screening.
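
    The reported screening metrics follow the standard definitions, illustrated below; the confusion-matrix counts are invented to reproduce the red-lesion figures and are not taken from the study.

```python
# Standard definitions behind the reported results; counts are invented.
def sensitivity(tp, fn):
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical red-lesion confusion counts giving 67% sensitivity, 95% specificity
print(sensitivity(tp=67, fn=33), specificity(tn=95, fp=5))
```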

  9. Improvement of Vivarium Biodecontamination through Data-acquisition Systems and Automation.

    Science.gov (United States)

    Devan, Shakthi Rk; Vasu, Suresh; Mallikarjuna, Yogesha; Ponraj, Ramkumar; Kamath, Gireesh; Poosala, Suresh

    2018-03-01

    Biodecontamination is important for eliminating pathogens at research animal facilities, thereby preventing contamination within barrier systems. We enhanced our facility's standard biodecontamination method to replace the traditional foggers, and the new system was used effectively after creating bypass ducts in HVAC units so that individual rooms could be isolated. The entire system was controlled by in-house-developed supervisory control and data-acquisition software that supported multiple cycles of decontamination by equipment that had different decontamination capacities, operated in parallel, and used different agents, including H2O2 vapor and ClO2 gas. The process was validated according to facility mapping, and effectiveness was assessed by using biologic (Geobacillus stearothermophilus) and chemical indicator strips, which were positioned before decontamination, and by sampling contact plates after the completion of each cycle. Biologic indicators showed a 6-log reduction in microbial counts after successful decontamination cycles with both agents, and the process was found to be compatible with clean-room panels and with materials commonly used in a vivarium, such as racks, cages, trolleys, cage-changing stations, biosafety cabinets, refrigerators, and other equipment in both procedure and animal rooms. In conclusion, the automated process enabled users to perform effective decontamination through multiple cycles with real-time documentation and provided additional capability to deal with potential outbreaks. Integrating automation through software improved quality-control systems in our vivarium.

  10. Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

    Directory of Open Access Journals (Sweden)

    Andrew Cron

    Full Text Available Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. We show that hierarchical modeling is a useful probabilistic approach that can provide a
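
    The model-based gating idea can be illustrated with a plain two-component EM fit on synthetic 1-D data; the paper's actual method is a hierarchical Dirichlet-process Gaussian mixture shared across samples, which this sketch does not reproduce.

```python
import numpy as np

# Toy data: a bulk "negative" population plus a rare bright subset whose
# true frequency is 0.5%. All values and the initialization are invented.
rng = np.random.default_rng(1)
neg = rng.normal(0.0, 1.0, 9950)          # bulk negative events
pos = rng.normal(5.0, 0.5, 50)            # rare subset, true frequency 0.5%
x = np.concatenate([neg, pos])

# Two-component EM, with the second component initialized near the brightest
# events so it can lock onto the rare population (initialization matters).
w = np.array([0.99, 0.01])
mu = np.array([np.median(x), x.max()])
sd = np.array([1.0, 0.5])
for _ in range(200):
    pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)     # E-step: responsibilities
    n = r.sum(axis=0)                     # M-step: weights, means, stddevs
    w = n / x.size
    mu = (r * x[:, None]).sum(axis=0) / n
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)

print(round(100 * w.min(), 2))            # estimated rare-subset frequency, %
```

    The fitted mixture weight of the minor component recovers the spiked-in frequency, which is the quantity of interest when enumerating antigen-specific cells.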

  11. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedures, both air and ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  12. System-wide versus component-specific trust using multiple aids.

    Science.gov (United States)

    Keller, David; Rice, Stephen

    2010-01-01

    Previous research in operator trust toward automated aids has focused primarily on single aids. The current study focuses on how operator trust is affected by the presence of multiple aids. Two competing theories of multiple-trust are presented. A component-specific trust theory predicts that operators will differentially place their trust in automated aids that vary in reliability. A system-wide trust theory predicts that operators will treat multiple imperfect aids as one "system" and merge their trust across aids despite differences in the aids' reliability. A simulated flight task was used to test these theories, whereby operators performed a pursuit tracking task while concurrently monitoring multiple system gauges that were augmented with perfect or imperfect automated aids. The data revealed that a system-wide trust theory best predicted the data; operators merged their trust across both aids, behaving toward a perfectly reliable aid in the same manner as they did towards unreliable aids.

  13. Localized Multiple Kernel Learning Via Sample-Wise Alternating Optimization.

    Science.gov (United States)

    Han, Yina; Yang, Kunde; Ma, Yuanliang; Liu, Guizhong

    2014-01-01

    Our objective is to train support vector machine (SVM)-based localized multiple kernel learning (LMKL), using alternating optimization between the standard SVM solvers with the local combination of base kernels and the sample-specific kernel weights. The advantage of alternating optimization, developed from state-of-the-art MKL, is the SVM-tied overall complexity and the simultaneous optimization of both the kernel weights and the classifier. Unfortunately, in LMKL, the sample-specific character makes the updating of kernel weights a difficult quadratic nonconvex problem. In this paper, starting from a new primal-dual equivalence, the canonical objective on which state-of-the-art methods are based is first decomposed into an ensemble of objectives corresponding to each sample, namely, sample-wise objectives. Then, the associated sample-wise alternating optimization method is conducted, in which the localized kernel weights can be independently obtained by solving their exclusive sample-wise objectives, either by linear programming (for the l1-norm) or with closed-form solutions (for the lp-norm). At test time, the learnt kernel weights for the training data are deployed based on the nearest-neighbor rule. Hence, to guarantee their generality on the test part, we introduce neighborhood information and incorporate it into the empirical loss when deriving the sample-wise objectives. Extensive experiments on four benchmark machine learning datasets and two real-world computer vision datasets demonstrate the effectiveness and efficiency of the proposed algorithm.
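
    The locally combined kernel at the heart of LMKL has the form k(x_i, x_j) = Σ_m η_m(x_i) k_m(x_i, x_j) η_m(x_j), with sample-specific weights η_m(x). The sketch below computes such a kernel with a softmax gating function and random (untrained) gating parameters, purely to illustrate the construction; it is not the paper's training procedure.

```python
import numpy as np

# Locally combined kernel with two base kernels (linear and RBF) and a
# softmax gating function; the gating parameters are random, i.e. untrained.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))                 # toy data: 6 samples, 2 features

def linear_kernel(X):
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Sample-specific kernel weights eta_m(x) from a softmax gating model
V = rng.normal(size=(2, 2))                 # gating parameters (untrained)
scores = X @ V.T
eta = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)   # (n, 2)

# k(x_i, x_j) = sum_m eta_m(x_i) * k_m(x_i, x_j) * eta_m(x_j)
kernels = np.stack([linear_kernel(X), rbf_kernel(X)])
K = sum(np.outer(eta[:, m], eta[:, m]) * kernels[m] for m in range(2))
print(K.shape)
```

    The resulting matrix stays symmetric because each weighted term is symmetric; training would adjust V (and the SVM dual variables) by the alternating scheme the abstract describes.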

  14. Streamlining and automation of radioanalytical methods at a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, J.T.; Dillard, J.W. [IT Corp., Knoxville, TN (United States)

    1993-12-31

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  15. Streamlining and automation of radioanalytical methods at a commercial laboratory

    International Nuclear Information System (INIS)

    Harvey, J.T.; Dillard, J.W.

    1993-01-01

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed

  16. Automated Liquibase Generator And ValidatorALGV

    Directory of Open Access Journals (Sweden)

    Manik Jain

    2015-08-01

    Full Text Available This paper presents an automation tool, ALGV (Automated Liquibase Generator and Validator), for the automated generation and verification of liquibase scripts. Liquibase is one of the most efficient ways of applying and persisting changes to a database schema. Since its invention by Nathan Voxland [1], it has become the de facto standard for database change management. The advantages of using liquibase scripts over traditional SQL queries range from version control to reusing the same scripts across multiple database platforms. Despite these advantages, manual creation of liquibase scripts takes considerable effort and is sometimes error-prone. ALGV helps to reduce the time-consuming script generation, manual typing effort, possible error occurrence, and manual verification process and time by 75%. Automating the liquibase generation process also removes the burden of recalling the specific tags to be used for a particular change. Moreover, developers can concentrate on the business logic and business data rather than spending their effort writing files.
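
    As a rough illustration of what such a generator emits, the sketch below builds a minimal liquibase <changeSet> from a table definition. The tag and attribute names follow the public liquibase XML schema, while the author, id, and table data are invented; this is not ALGV's own code.

```python
import xml.etree.ElementTree as ET

# Build a minimal liquibase <changeSet> containing one <createTable> change.
def create_table_changeset(author, change_id, table, columns):
    cs = ET.Element("changeSet", id=change_id, author=author)
    ct = ET.SubElement(cs, "createTable", tableName=table)
    for name, col_type in columns:
        ET.SubElement(ct, "column", name=name, type=col_type)
    return ET.tostring(cs, encoding="unicode")

xml = create_table_changeset("alice", "1", "employee",
                             [("id", "int"), ("name", "varchar(50)")])
print(xml)
```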

  17. A completely automated flow, heat-capacity, calorimeter for use at high temperatures and pressures

    Science.gov (United States)

    Rogers, P. S. Z.; Sandarusi, Jamal

    1990-11-01

    An automated, flow calorimeter has been constructed to measure the isobaric heat capacities of concentrated, aqueous electrolyte solutions using a differential calorimetry technique. The calorimeter is capable of operation to 700 K and 40 MPa with a measurement accuracy of 0.03% relative to the heat capacity of the pure reference fluid (water). A novel design encloses the calorimeter within a double set of separately controlled, copper, adiabatic shields that minimize calorimeter heat losses and precisely control the temperature of the inlet fluids. A multistage preheat train, used to efficiently heat the flowing fluid, includes a counter-current heat exchanger for the inlet and outlet fluid streams in tandem with two calorimeter preheaters. Complete system automation is accomplished with a distributed control scheme using multiple processors, allowing the major control tasks of calorimeter operation and control, data logging and display, and pump control to be performed simultaneously. A sophisticated pumping strategy for the two separate syringe pumps allows continuous fluid delivery. This automation system enables the calorimeter to operate unattended except for the reloading of sample fluids. In addition, automation has allowed the development and implementation of an improved heat loss calibration method that provides calorimeter calibration with absolute accuracy comparable to the overall measurement precision, even for very concentrated solutions.
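
    The differential measurement rests on the flow-calorimetry relation P = ṁ·cp·ΔT: comparing the heater power for the sample with that for the water reference at equal mass flow and temperature rise yields the sample heat capacity. The numbers below are illustrative, not the paper's working equations or data.

```python
# P = mdot * cp * dT for a fluid flowing through a heater. At equal mass
# flow and temperature rise, cp_sample = cp_water * P_sample / P_water.
# All numbers are invented for illustration.
cp_water = 4.18        # J g^-1 K^-1, reference fluid near room temperature
mdot = 0.02            # g s^-1, identical for sample and reference
dT = 1.0               # K, identical temperature rise

P_water = mdot * cp_water * dT     # power needed for the water reference, W
P_sample = 0.0753                  # measured heater power for the sample, W
cp_sample = cp_water * P_sample / P_water
print(round(cp_sample, 3))
```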

  18. Analysis of stationary power/amplitude distributions for multiple channels of sampled FBGs.

    Science.gov (United States)

    Xing, Ya; Zou, Xihua; Pan, Wei; Yan, Lianshan; Luo, Bin; Shao, Liyang

    2015-08-10

    Stationary power/amplitude distributions for multiple channels of a sampled fiber Bragg grating (SFBG) along the grating length are analyzed. Unlike a uniform FBG, the SFBG has multiple channels in its reflection spectrum rather than a single one. The stationary power/amplitude distributions for these channels are therefore analyzed using two different theoretical models. In the first model, the SFBG is regarded as a set of alternately stacked grating sections and non-grating sections; a step-like distribution is obtained for the corresponding power/amplitude of each channel along the grating length. In the second model, the SFBG is decomposed into multiple uniform "ghost" gratings, and a continuous distribution is obtained for each ghost grating (i.e., each channel). The distributions obtained in the two models are identical, demonstrating the equivalence between the two models. In addition, the impacts of the duty cycle on the power/amplitude distributions of the multiple channels of the SFBG are presented.
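
    The agreement between the two views can be checked numerically in a weak-grating (undepleted) limit, where the coupled amplitude of the central channel simply accumulates along the grating; this toy sketch is not the paper's full coupled-mode computation.

```python
import numpy as np

# Weak-grating toy model: the coupled amplitude of the central channel grows
# as kappa * integral of the sampling function s(z); the m = 0 "ghost"
# grating replaces s(z) by its average value, the duty cycle.
kappa, period, duty, L = 1.0, 1.0, 0.25, 8.0
z = np.linspace(0.0, L, 1601)
dz = z[1] - z[0]

s = ((z % period) < duty * period).astype(float)   # 1 inside grating sections
step_profile = kappa * np.cumsum(s) * dz           # step-like growth (model 1)
ghost_profile = kappa * duty * z                   # continuous growth (model 2)

# The two profiles agree at the sampling-period boundaries
idx = np.searchsorted(z, np.arange(0.0, L + 1e-9, period))
print(float(np.max(np.abs(step_profile[idx] - ghost_profile[idx]))))
```

    The step profile is flat in the non-grating sections and rises in the grating sections, while the ghost profile rises linearly; the two coincide at each period boundary, consistent with the equivalence the paper demonstrates.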

  19. Automated nutrient analyses in seawater

    Energy Technology Data Exchange (ETDEWEB)

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.; Wirick, C.D.

    1981-02-01

    This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and trouble shooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings. (JGB)

  20. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
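
    The pooled-covariance/Mahalanobis-distance step of the segmentation can be sketched as follows; this is a minimal illustration with synthetic two-dimensional features (e.g., correlation and arrival time), not the authors' implementation:

```python
import numpy as np

def pooled_covariance(class_samples):
    # Pooled sample covariance: per-class covariances weighted by (n_k - 1).
    dim = class_samples[0].shape[1]
    pooled = np.zeros((dim, dim))
    dof = 0
    for x in class_samples:
        pooled += (len(x) - 1) * np.cov(x, rowvar=False)
        dof += len(x) - 1
    return pooled / dof

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
artery = rng.normal([1.0, 0.2], 0.1, size=(50, 2))  # synthetic ROI features
vein = rng.normal([0.6, 0.8], 0.1, size=(50, 2))
cov_inv = np.linalg.inv(pooled_covariance([artery, vein]))

pixel = np.array([0.95, 0.25])
d_art = mahalanobis(pixel, artery.mean(axis=0), cov_inv)
d_vein = mahalanobis(pixel, vein.mean(axis=0), cov_inv)
label = "artery" if d_art < d_vein else "vein"
```

    A pixel is assigned to whichever class yields the smaller Mahalanobis distance; the paper applies this idea to features derived from the time-resolved series.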

  1. Automated oil spill detection with multispectral imagery

    Science.gov (United States)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
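
    The shape-isolation chain (selective band combination, contrast enhancement, thresholding into a raster mask) can be sketched roughly as below; the band index and threshold are hypothetical placeholders, not the authors' exact recipe:

```python
import numpy as np

def oil_mask(rgb, thresh=0.5):
    # Selective band combination: a normalized red-minus-blue index
    # (hypothetical; chosen here only to separate a bright sheen from water).
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    index = (r - b) / (r + b + 1e-9)
    # Contrast enhancement: stretch the index to [0, 1], then threshold.
    stretched = (index - index.min()) / (index.max() - index.min() + 1e-9)
    return stretched > thresh

img = np.zeros((8, 8, 3), dtype=np.uint8)
img[..., 2] = 100            # water: blue-dominant background
img[2:5, 2:6, 0] = 180       # bright "oil" patch in the red band
mask = oil_mask(img)
coverage_px = int(mask.sum())  # multiply by GSD^2 for a square-footage estimate
```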

  2. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    Science.gov (United States)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high resolution impervious cover data set is not only significant to urbanization studies but also needed by global carbon, hydrology, and energy balance research. A supervised classification method, regression tree, is applied in this project. A set of accurate training samples is the key to supervised classifications. Here we developed global scale training samples from fine resolution (about 1 m) satellite data (Quickbird and Worldview2), and then aggregated the fine resolution impervious cover map to 30 m resolution. In order to improve the classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally. For example, in Europe alone there are 174 training sites, with site sizes ranging from 4.5 km by 4.5 km to 8.1 km by 3.6 km, and the training samples number over six million. Therefore, we developed this automated statistics-based algorithm to screen the training samples at two levels: site and scene. At the site level, all the training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; samples falling within each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. The screening then escalates to the scene level, where a similar process with a looser threshold is applied to account for possible variance due to site differences. We do not perform the screening across scenes because the scenes might vary due to
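
    The site-level screen described above (bin samples by impervious percentage, then drop statistical outliers per group) can be sketched as follows; the z-score rule and data are illustrative stand-ins, and the multivariate step is omitted:

```python
import numpy as np

def screen_group(values, z=3.0):
    # Univariate outlier screen within one impervious-fraction group;
    # the multivariate screen would use Mahalanobis distances instead.
    if len(values) < 3:
        return values
    m, s = values.mean(), values.std()
    if s == 0:
        return values
    return values[np.abs(values - m) <= z * s]

def screen_site(fractions, features, n_bins=10):
    # Divide samples into 10 groups by impervious percentage
    # (one group per 10% interval), then screen each group.
    bins = np.clip((fractions * n_bins).astype(int), 0, n_bins - 1)
    return [screen_group(features[bins == g]) for g in range(n_bins)]

vals = np.array([0.4] * 20 + [5.0])     # one obvious outlier
screened = screen_group(vals)           # drops 5.0, keeps the 20 inliers
site_groups = screen_site(np.linspace(0.0, 0.999, 50), np.ones(50))
```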

  3. [DNA Extraction from Old Bones by AutoMate Express™ System].

    Science.gov (United States)

    Li, B; Lü, Z

    2017-08-01

    To establish a method for extracting DNA from old bones by the AutoMate Express™ system. Bones were ground into powder with a freeze-mill. After extraction by AutoMate Express™, DNA was amplified and genotyped with the Identifiler® Plus and MiniFiler™ kits. DNA was extracted within 3 hours by the AutoMate Express™ system from 10 old bone samples that had been kept in different environments, with postmortem intervals from 10 to 20 years. Complete STR typing results were obtained from 8 samples. The AutoMate Express™ system can quickly and efficiently extract DNA from old bones and can be applied in forensic practice. Copyright© by the Editorial Department of Journal of Forensic Medicine

  4. Extensive monitoring through multiple blood samples in professional soccer players

    DEFF Research Database (Denmark)

    Heisterberg, Mette F; Fahrenkrug, Jan; Krustrup, Peter

    2013-01-01

    ABSTRACT: The aim of this study was to make a comprehensive gathering of consecutive detailed blood samples from professional soccer players, and to analyze different blood parameters in relation to seasonal changes in training and match exposure. Blood samples were collected five times during a six...... months period and analyzed for 37 variables in 27 professional soccer players from the best Danish league. Additionally, players were tested for body composition, VO2max and physical performance by the Yo-Yo intermittent endurance sub-max test (IE2). Multiple variations in blood parameters occurred during...... of the season. Leucocytes decreased with increased physical training. Lymphocytes decreased at the end of the season. VO2max decreased towards the end of the season whereas no significant changes were observed in the IE2 test. The regular blood samples from elite soccer players reveal significant changes......

  5. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

    The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following are key areas that the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; and integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  6. Water-quality assessment of south-central Texas : comparison of water quality in surface-water samples collected manually and by automated samplers

    Science.gov (United States)

    Ging, Patricia B.

    1999-01-01

    Surface-water sampling protocols of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program specify samples for most properties and constituents to be collected manually in equal-width increments across a stream channel and composited for analysis. Single-point sampling with an automated sampler (autosampler) during storms was proposed in the upper part of the South-Central Texas NAWQA study unit, raising the question of whether property and constituent concentrations from automatically collected samples differ significantly from those in samples collected manually. Statistical (Wilcoxon signed-rank test) analyses of 3 to 16 paired concentrations for each of 26 properties and constituents from water samples collected using both methods at eight sites in the upper part of the study unit indicated that there were no significant differences in concentrations for dissolved constituents, other than calcium and organic carbon.
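
    The Wilcoxon signed-rank comparison used here involves few enough pairs (3 to 16 per constituent) that the exact null distribution can be enumerated directly; a self-contained sketch with hypothetical paired concentrations:

```python
from itertools import product

def wilcoxon_signed_rank(x, y):
    # Exact two-sided Wilcoxon signed-rank test for small paired samples:
    # drop zero differences, average-rank ties, enumerate all sign patterns.
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    total = n * (n + 1) / 2
    w_plus = sum(r for dd, r in zip(d, ranks) if dd > 0)
    w = min(w_plus, total - w_plus)
    hits = sum(1 for signs in product([0, 1], repeat=n)
               if min(sp := sum(r for s, r in zip(signs, ranks) if s),
                      total - sp) <= w)
    return w, hits / 2 ** n

manual = [52, 48, 51, 63, 45, 59, 50, 47]   # hypothetical paired values
autos = [50, 49, 52, 60, 44, 58, 52, 47]
w, p = wilcoxon_signed_rank(manual, autos)  # large p: no significant difference
```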

  7. UAV : Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  8. Automatic remote sampling and delivery system incorporating decontamination and disposal of sample bottles

    International Nuclear Information System (INIS)

    Savarkar, V.K.; Mishra, A.K.; Bajpai, D.D.; Nair, M.K.T.

    1990-01-01

    The present generation of reprocessing plants have sampling and delivery systems that must be operated manually, with the associated problems. The complete automation and remotisation of the sampling system has hence been considered to reduce manual intervention and personnel exposure. As a part of this scheme, an attempt to automate and remotise various steps in the sampling system has been made. This paper discusses in detail the development work carried out in this area as well as the tests conducted to incorporate the same in the existing plants. (author). 3 figs

  9. Automated biosurveillance data from England and Wales, 1991-2011.

    Science.gov (United States)

    Enki, Doyo G; Noufaily, Angela; Garthwaite, Paul H; Andrews, Nick J; Charlett, André; Lane, Chris; Farrington, C Paddy

    2013-01-01

    Outbreak detection systems for use with very large multiple surveillance databases must be suited both to the data available and to the requirements of full automation. To inform the development of more effective outbreak detection algorithms, we analyzed 20 years of data (1991-2011) from a large laboratory surveillance database used for outbreak detection in England and Wales. The data relate to 3,303 distinct types of infectious pathogens, with a frequency range spanning 6 orders of magnitude. Several hundred organism types were reported each week. We describe the diversity of seasonal patterns, trends, artifacts, and extra-Poisson variability to which an effective multiple laboratory-based outbreak detection system must adjust. We provide empirical information to guide the selection of simple statistical models for automated surveillance of multiple organisms, in the light of the key requirements of such outbreak detection systems, namely, robustness, flexibility, and sensitivity.
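
    A minimal exceedance rule of the kind such systems build on can be sketched as below; production systems use Farrington-type regression with trend and seasonality terms, so treat this purely as an illustration of overdispersion-aware thresholding with hypothetical counts:

```python
import statistics

def exceedance_flag(history, current, z=2.58):
    # Flag the current weekly count if it exceeds the baseline mean plus
    # z standard deviations; taking var = max(sample variance, mean)
    # allows for extra-Poisson (overdispersed) variability.
    mu = statistics.mean(history)
    var = max(statistics.variance(history), mu)
    return current > mu + z * var ** 0.5

baseline = [4, 6, 5, 7, 3, 5, 6, 4, 5, 6]     # hypothetical weekly counts
flag_normal = exceedance_flag(baseline, 7)     # within noise
flag_outbreak = exceedance_flag(baseline, 30)  # clear exceedance
```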

  10. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.; Beugelsdijk, T.J.

    1992-01-01

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, analysis, and data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs

  11. Automated dispersive liquid-liquid microextraction coupled to high performance liquid chromatography - cold vapour atomic fluorescence spectroscopy for the determination of mercury species in natural water samples.

    Science.gov (United States)

    Liu, Yao-Min; Zhang, Feng-Ping; Jiao, Bao-Yu; Rao, Jin-Yu; Leng, Geng

    2017-04-14

    An automated, home-constructed, and low-cost dispersive liquid-liquid microextraction (DLLME) device directly coupled to a high performance liquid chromatography (HPLC) - cold vapour atomic fluorescence spectroscopy (CVAFS) system was designed and developed for the determination of trace concentrations of methylmercury (MeHg(+)), ethylmercury (EtHg(+)) and inorganic mercury (Hg(2+)) in natural waters. With a simple, miniaturized and efficient automated DLLME system, nanogram amounts of these mercury species were extracted from natural water samples and injected into a hyphenated HPLC-CVAFS for quantification. The complete analytical procedure, including chelation, extraction, phase separation, collection and injection of the extracts, as well as HPLC-CVAFS quantification, was automated. Key parameters, such as the type and volume of the chelation, extraction and dispersive solvents, aspiration speed, sample pH, salt effect and matrix effect, were thoroughly investigated. Under the optimum conditions, the linear range was 10-1200 ng L(-1) for EtHg(+) and 5-450 ng L(-1) for MeHg(+) and Hg(2+). Limits of detection were 3.0 ng L(-1) for EtHg(+) and 1.5 ng L(-1) for MeHg(+) and Hg(2+). Reproducibility and recoveries were assessed by spiking three natural water samples with different Hg concentrations, giving recoveries from 88.4-96.1% and relative standard deviations <5.1%. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.
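
    The mass-calibration idea (regress signal on absolute analyte mass rather than concentration, so the preconcentration volume can vary) reduces to an ordinary least-squares fit plus a 3-sigma detection limit; a sketch with hypothetical numbers:

```python
import numpy as np

mass_ng = np.array([0.5, 1, 5, 10, 25, 50])              # standards, ng
signal = np.array([120, 238, 1190, 2410, 5980, 12010])   # detector response
slope, intercept = np.polyfit(mass_ng, signal, 1)        # mass calibration curve

blank_sd = 2.4                 # std dev of blank signal (hypothetical)
lod_ng = 3 * blank_sd / slope  # 3-sigma limit of detection, in ng

unknown_signal = 950.0
mass_found = (unknown_signal - intercept) / slope        # ng on the column
```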

  13. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against the ICP, considered the gold standard for automated rigid registration. Furthermore...... to point distance. T-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed rank test). The performance influence of sampling density, sampling quantity, and norms is analyzed using a similar method.

  14. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  15. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    Energy Technology Data Exchange (ETDEWEB)

    Burge, Scott R. [Burge Environmental, Inc., 6100 South Maple Avenue, no. 114, Tempe, AZ, 85283 (United States); O'Hara, Matthew J. [Pacific Northwest National Laboratory, 902 Battelle Blvd., Richland, WA, 99352 (United States)

    2013-07-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. Benefits of using the automated

  16. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    International Nuclear Information System (INIS)

    Burge, Scott R.; O'Hara, Matthew J.

    2013-01-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. 
Benefits of using the automated system as an

  17. Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2007-01-01

    A quality check for an automated system of analyzing large sets of neutron activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse-height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to the expected results as a QA/QC check. (author)
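
    The final comparison step amounts to checking each automated result against the concentration expected from the reference spectrum; a minimal sketch with hypothetical isotopes and a 10% tolerance:

```python
def qa_flags(measured, reference, tol=0.10):
    # Return True for each isotope whose automated result deviates from
    # the reference value by more than tol (as a fraction of the reference).
    return {iso: abs(measured[iso] - ref) / ref > tol
            for iso, ref in reference.items()}

reference = {"Na-24": 100.0, "K-42": 50.0, "La-140": 10.0}  # expected (hypothetical)
measured = {"Na-24": 103.0, "K-42": 44.0, "La-140": 10.2}   # automated results
flags = qa_flags(measured, reference)   # only K-42 fails the 10% check
```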

  18. Quantification of multiple elements in dried blood spot samples

    DEFF Research Database (Denmark)

    Pedersen, Lise; Andersen-Ranberg, Karen; Hollergaard, Mads

    2017-01-01

    BACKGROUND: Dried blood spots (DBS) is a unique matrix that offers advantages compared to conventional blood collection making it increasingly popular in large population studies. We here describe development and validation of a method to determine multiple elements in DBS. METHODS: Elements were...... in venous blood. Samples with different hematocrit were spotted onto filter paper to assess hematocrit effect. RESULTS: The established method was precise and accurate for measurement of most elements in DBS. There was a significant but relatively weak correlation between measurement of the elements Mg, K...

  19. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
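
    The combination step rests on the balance heuristic of multiple importance sampling; the weighting can be illustrated on a one-dimensional toy integral (the integrand and both sampling strategies below are hypothetical stand-ins for the occlusion estimate):

```python
import math
import random

def balance_weight(pdfs, i):
    # Balance heuristic: weight for strategy i given every strategy's pdf at x.
    return pdfs[i] / sum(pdfs)

def mis_estimate(f, strategies, n, rng):
    # Each strategy supplies a sampler and a function returning the pdfs of
    # the sample point under *all* strategies.
    total = 0.0
    for i, (draw, pdf_all) in enumerate(strategies):
        for _ in range(n):
            x = draw(rng)
            pdfs = pdf_all(x)
            total += balance_weight(pdfs, i) * f(x) / pdfs[i]
    return total / n

# Toy target: integral of x^2 over [0, 1] = 1/3.
f = lambda x: x * x
uniform = (lambda rng: rng.random(), lambda x: [1.0, 2.0 * x])
linear = (lambda rng: math.sqrt(rng.random()), lambda x: [1.0, 2.0 * x])
est = mis_estimate(f, [uniform, linear], 20000, random.Random(42))
```

    Combining a uniform strategy with an importance strategy this way keeps the estimator unbiased while letting each strategy cover the region where it performs best.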

  20. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs

    Directory of Open Access Journals (Sweden)

    Min-Kyu Kim

    2015-12-01

    This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4-bit after the first 12-bit A/D conversion, reducing noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform the complex calculations required for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB.
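
    The noise claim (averaging N samplings divides random read noise by the square root of N) is easy to check numerically; a toy simulation, not a model of the actual readout chain:

```python
import random
import statistics

def readout_noise(n_samples, n_trials=4000, sigma=1.0):
    # Each trial averages n_samples noisy readouts of the same pixel value;
    # the spread of those averages is the effective read noise.
    rng = random.Random(7)
    means = [statistics.fmean(rng.gauss(0.0, sigma) for _ in range(n_samples))
             for _ in range(n_trials)]
    return statistics.stdev(means)

single = readout_noise(1)
multi = readout_noise(9)   # expect roughly single / 3
ratio = single / multi
```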

  1. Computerized automated remote inspection system

    International Nuclear Information System (INIS)

    The automated inspection system utilizes a computer to control the location of the ultrasonic transducer, the actual inspection process, the display of the data, and the storage of the data on IBM magnetic tape. This automated inspection equipment provides two major advantages. First, it provides a cost savings because of the reduced inspection time made possible by the automation of the data acquisition, processing, and storage equipment. This reduced inspection time is also made possible by a computerized data evaluation aid which speeds data interpretation. In addition, the computer control of the transducer location drive allows the exact duplication of a previously located position or flaw. The second major advantage is that the use of automated inspection equipment also allows a higher-quality inspection, because of the automated data acquisition, processing, and storage. This storage of data, in accurate digital form on IBM magnetic tape, facilitates retrieval for comparison with previous inspection data. The equipment provides a multiplicity of scan data which will provide statistical information on any questionable volume or flaw. An automatic alarm for the location of all reportable flaws reduces the probability of operator error. This system has the ability to present data on a cathode ray tube as numerical information, a three-dimensional picture, or a "hard-copy" sheet. One important advantage of this system is the ability to store large amounts of data in compact magnetic tape reels

  2. Peripheral refractive correction and automated perimetric profiles.

    Science.gov (United States)

    Wild, J M; Wood, J M; Crews, S J

    1988-06-01

    The effect of peripheral refractive error correction on the automated perimetric sensitivity profile was investigated in a sample of 10 clinically normal, experienced observers. Peripheral refractive error was determined at eccentricities of 0 degrees, 20 degrees and 40 degrees along the temporal meridian of the right eye using the Canon Autoref R-1, an infra-red automated refractor, under the parametric conditions of the Octopus automated perimeter. Perimetric sensitivity was then measured at these eccentricities (stimulus sizes 0 and III) with and without the appropriate peripheral refractive correction using the Octopus 201 automated perimeter. Within the measurement limits of the experimental procedures employed, perimetric sensitivity was not influenced by peripheral refractive correction.

  3. The hybrid model for sampling multiple elastic scattering angular deflections based on Goudsmit-Saunderson theory

    Directory of Open Access Journals (Sweden)

    Wasaye Muhammad Abdul

    2017-01-01

    Full Text Available An algorithm for the Monte Carlo simulation of electron multiple elastic scattering based on the framework of SuperMC (Super Monte Carlo simulation program for nuclear and radiation process) is presented. This paper describes efficient and accurate methods by which the multiple scattering angular deflections are sampled. The Goudsmit-Saunderson theory of multiple scattering has been used for sampling angular deflections. Differential cross-sections of electrons and positrons by neutral atoms have been calculated using the Dirac partial wave program ELSEPA. The Legendre coefficients are accurately computed using the Gauss-Legendre integration method. Finally, a novel hybrid method for sampling the angular distribution has been developed. The model uses an efficient rejection sampling method for low energy electrons (<500 keV) and long path lengths (>500 mean free paths). For small path lengths, a simple, efficient and accurate analytical distribution function has been proposed; the latter uses adjustable parameters determined from fitting the Goudsmit-Saunderson angular distribution. A discussion of the sampling efficiency and accuracy of this newly developed algorithm is given. The efficiency of the rejection sampling algorithm is at least 50% for electron kinetic energies less than 500 keV and longer path lengths (>500 mean free paths). Monte Carlo simulation results are then compared with the measured angular distributions of Ross et al. The comparison shows that our results are in good agreement with experimental measurements.
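The rejection step the abstract describes can be sketched generically. This is not the SuperMC implementation; it uses a hypothetical screened-Rutherford-like distribution in mu = cos(theta) as a stand-in for the Goudsmit-Saunderson distribution, purely to illustrate the rejection sampling mechanics:

```python
import math
import random

def screened_rutherford_pdf(mu, eta):
    """Unnormalized forward-peaked angular distribution in mu = cos(theta).
    eta is a screening parameter; this is a hypothetical stand-in for the
    Goudsmit-Saunderson distribution used in the paper."""
    return 1.0 / (1.0 + 2.0 * eta - mu) ** 2

def sample_mu_rejection(eta, rng=random.random):
    """Rejection sampling: propose mu uniformly on [-1, 1] and accept with
    probability pdf(mu) / pdf_max. The pdf peaks at mu = +1 (forward
    scattering), so pdf_max is evaluated there."""
    pdf_max = screened_rutherford_pdf(1.0, eta)
    while True:
        mu = 2.0 * rng() - 1.0
        if rng() * pdf_max <= screened_rutherford_pdf(mu, eta):
            return mu
```

For strongly forward-peaked distributions the acceptance rate of a uniform envelope is low, which is one reason a hybrid scheme (analytical sampling where a good closed form exists, rejection elsewhere) pays off.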

  4. Formal Test Automation: A Simple Experiment

    NARCIS (Netherlands)

    Belinfante, Axel; Feenstra, J.; de Vries, R.G.; Tretmans, G.J.; Goga, N.; Feijs, Loe; Mauw, Sjouke; Heerink, A.W.; Csopaki, Gyula; Dibuz, Sarolta; Tarnay, Katalin

    1999-01-01

    In this paper we study the automation of test derivation and execution in the area of conformance testing. The test scenarios are derived from multiple specification languages: LOTOS, Promela and SDL. A central theme of this study is the usability of batch-oriented and on-the-fly testing approaches.

  5. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  6. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    Science.gov (United States)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
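The throttling idea described above (cap concurrent jobs so I/O-bound work does not saturate the cluster) can be sketched with a semaphore. This is a minimal illustration in Python, not the Java/Perl CATALYST implementation; job callables and the cap of 3 are hypothetical:

```python
import threading

def run_throttled(jobs, max_concurrent=4):
    """Minimal throttling workload manager: at most `max_concurrent` jobs
    run at once. `jobs` is a list of zero-argument callables; results are
    collected in submission order."""
    gate = threading.Semaphore(max_concurrent)
    results = [None] * len(jobs)

    def worker(i, job):
        with gate:  # blocks while max_concurrent jobs are already running
            results[i] = job()

    threads = [threading.Thread(target=worker, args=(i, j))
               for i, j in enumerate(jobs)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Hypothetical usage: ten I/O-bound "processing stream" jobs, three at a time.
out = run_throttled([(lambda i=i: i * i) for i in range(10)], max_concurrent=3)
```

A real workflow manager would add dependency tracking and failure recovery on top of this gating primitive.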

  7. New automated pellet/powder assay system

    International Nuclear Information System (INIS)

    Olsen, R.N.

    1975-01-01

    This paper discusses an automated, high precision, pellet/powder assay system. The system is an active assay system using a small isotopic neutron source and a coincidence detection system. The handling of the pellet/powder samples has been automated and a programmable calculator has been integrated into the system to provide control and data analysis. The versatile system can assay uranium or plutonium in either active or passive modes.

  8. Accelerated solvent extraction (ASE) - a fast and automated technique with low solvent consumption for the extraction of solid samples (T12)

    International Nuclear Information System (INIS)

    Hoefler, F.

    2002-01-01

    Full text: Accelerated solvent extraction (ASE) is a modern extraction technique that significantly streamlines sample preparation. A common organic solvent as well as water is used as extraction solvent at elevated temperature and pressure to increase extraction speed and efficiency. The entire extraction process is fully automated and performed within 15 minutes with a solvent consumption of 18 ml for a 10 g sample. For many matrices and for a variety of solutes, ASE has proven to be equivalent or superior to sonication, Soxhlet, and reflux extraction techniques while requiring less time, solvent and labor. ASE was first applied to the extraction of environmental hazards from solid matrices. Within a very short time ASE was approved by the U.S. EPA for the extraction of BNAs, PAHs, PCBs, pesticides, herbicides, TPH, and dioxins from solid samples in method 3545. For the extraction of dioxins in particular, the extraction time with ASE is reduced to 20 minutes, compared with 18 h using Soxhlet. In food analysis ASE is used for the extraction of pesticide and mycotoxin residues from fruits and vegetables, fat determination and the extraction of vitamins. Time-consuming and solvent-intensive methods for the extraction of additives from polymers, as well as for the extraction of marker compounds from herbal supplements, can be performed with higher efficiency using ASE. For the analysis of chemical weapons the extraction process and sample clean-up, including derivatization, can be automated and combined with GC-MS using an online ASE-APEC-GC system. (author)

  9. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
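The compiled "state program" described above amounts to a transition table driven by external events. A minimal sketch of that execution model (in Python rather than the generated C, with a hypothetical shutter controller as the example) might look like:

```python
# Hypothetical sketch of an event-driven state program: a transition table
# maps (state, event) -> (action, next_state). Execution is driven entirely
# by external events, as in the SNL-to-C approach described in the abstract.
def make_state_program(table, start):
    state = {"current": start, "log": []}

    def dispatch(event):
        key = (state["current"], event)
        if key not in table:      # events with no defined transition are ignored
            return state["current"]
        action, nxt = table[key]
        if action:
            state["log"].append(action)   # stand-in for executing the action
        state["current"] = nxt
        return nxt

    return dispatch, state

# Example transition table for a two-state beamline-shutter controller.
table = {
    ("closed", "open_cmd"):  ("drive_open",  "open"),
    ("open",   "close_cmd"): ("drive_close", "closed"),
    ("open",   "fault"):     ("drive_close", "closed"),
}
dispatch, st = make_state_program(table, "closed")
```

Because each dispatch returns control immediately, many such state programs can share one host processor, as the abstract notes.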

  10. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  11. Automated analysis of autoradiographic imagery

    International Nuclear Information System (INIS)

    Bisignani, W.T.; Greenhouse, S.C.

    1975-01-01

    A research programme is described which has as its objective the automated characterization of neurological tissue regions from autoradiographs by utilizing hybrid-resolution image processing techniques. An experimental system is discussed which includes raw imagery, scanning and digitizing equipment, feature-extraction algorithms, and regional characterization techniques. The parameters extracted by these algorithms are presented, as well as the regional characteristics which are obtained by operating on the parameters with statistical sampling techniques. An approach is presented for validating the techniques, and initial experimental results are obtained from an analysis of an autoradiograph of a region of the hypothalamus. An extension of these automated techniques to other biomedical research areas is discussed, as well as the implications of applying automated techniques to biomedical research problems. (author)

  12. A Simple Method for Automated Solid Phase Extraction of Water Samples for Immunological Analysis of Small Pollutants.

    Science.gov (United States)

    Heub, Sarah; Tscharner, Noe; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2016-01-01

    A new method for solid phase extraction (SPE) of environmental water samples is proposed. The developed prototype is cost-efficient and user-friendly, and enables rapid, automated and simple SPE. The pre-concentrated solution is compatible with analysis by immunoassay, with a low organic solvent content. A method is described for the extraction and pre-concentration of the natural hormone 17β-estradiol in 100 ml water samples. Reverse-phase SPE is performed with octadecyl-silica sorbent and elution is done with 200 µl of methanol 50% v/v. The eluent is then diluted with deionized water to lower the methanol content. After manual preparation of the SPE column, the overall procedure is performed automatically within 1 hr. At the end of the process, the estradiol concentration is measured using a commercial enzyme-linked immunosorbent assay (ELISA). 100-fold pre-concentration is achieved and the methanol content is only 10% v/v. Full recoveries of the molecule are achieved with 1 ng/L spiked de-ionized and synthetic sea water samples.
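The two headline figures are consistent with the stated volumes. A quick check, assuming a final diluted volume of 1 ml (the abstract does not state it, but that value reproduces both the 100-fold factor and the 10% methanol content):

```python
sample_ml = 100.0   # water sample loaded on the SPE column
eluent_ml = 0.2     # 200 ul elution volume
meoh_frac = 0.50    # methanol fraction of the eluent (50% v/v)
final_ml  = 1.0     # final volume after dilution with water
                    # (assumed; inferred from the reported figures)

preconcentration = sample_ml / final_ml            # volume reduction factor
methanol_content = meoh_frac * eluent_ml / final_ml  # v/v fraction after dilution

print(preconcentration)   # 100-fold, as reported
print(methanol_content)   # 0.10, i.e. 10% v/v, as reported
```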

  13. Automated fault-management in a simulated spaceflight micro-world

    Science.gov (United States)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examines the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) by a computerized fault-finding guide, at a medium LOA by an automated diagnosis and recovery advisory, and at a high LOA by automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification time, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources by automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.

  14. Automated multi-lesion detection for referable diabetic retinopathy in indigenous health care.

    Directory of Open Access Journals (Sweden)

    Ramon Pires

    Full Text Available Diabetic Retinopathy (DR) is a complication of diabetes mellitus that affects more than one-quarter of the population with diabetes, and can lead to blindness if not discovered in time. An automated screening enables the identification of patients who need further medical attention. This study aimed to classify retinal images of Aboriginal and Torres Strait Islander peoples utilizing an automated computer-based multi-lesion eye screening program for diabetic retinopathy. The multi-lesion classifier was trained on 1,014 images from the São Paulo Eye Hospital and tested on retinal images containing no DR-related lesion, single lesions, or multiple types of lesions from the Inala Aboriginal and Torres Strait Islander health care centre. The automated multi-lesion classifier has the potential to enhance the efficiency of clinical practice delivering diabetic retinopathy screening. Our program does not necessitate image samples for training from any specific ethnic group or population being assessed and is independent of image pre- or post-processing to identify retinal lesions. In this Aboriginal and Torres Strait Islander population, the program achieved 100% sensitivity and 88.9% specificity in identifying bright lesions, while detection of red lesions achieved a sensitivity of 67% and specificity of 95%. When both bright and red lesions were present, 100% sensitivity with 88.9% specificity was obtained. All results obtained with this automated screening program meet WHO standards for diabetic retinopathy screening.
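For readers comparing such figures across screening studies, the two metrics come straight from confusion-matrix counts. A small sketch (the counts below are hypothetical and merely consistent with the red-lesion result; the abstract does not give raw counts):

```python
def sensitivity(tp, fn):
    """Fraction of truly diseased cases the screen flags: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of truly healthy cases the screen passes: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical counts consistent with 67% sensitivity / 95% specificity.
print(round(sensitivity(tp=67, fn=33), 2))   # 0.67
print(round(specificity(tn=95, fp=5), 2))    # 0.95
```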

  15. Automated Layup of Sheet Prepregs on Complex Moulds

    OpenAIRE

    Elkington, Michael P; Ward, Carwyn; Potter, Kevin D

    2016-01-01

    A new two-stage method for the automated manufacture of high performance composites components is presented which aims to combine the capacity for forming complex shapes of Hand Layup with the speed of existing automated systems. In the first stage of the new process plies are formed into the approximate shape of the mould using a press mechanism. They are then passed onto a layup stage which uses multiple end effectors controlled by a single six axis robot to stick the plies down onto the mo...

  16. Automated hardwood lumber grading utilizing a multiple sensor machine vision technology

    Science.gov (United States)

    D. Earl Kline; Chris Surak; Philip A. Araman

    2003-01-01

    Over the last 10 years, scientists at the Thomas M. Brooks Forest Products Center, the Bradley Department of Electrical and Computer Engineering, and the USDA Forest Service have been working on lumber scanning systems that can accurately locate and identify defects in hardwood lumber. Current R&D efforts are targeted toward developing automated lumber grading...

  17. Next frontier in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Robu, Valentin

    2015-01-01

    This book focuses on automated negotiations based on multi-agent systems. It is intended for researchers and students in various fields involving autonomous agents and multi-agent systems, such as e-commerce tools, decision-making and negotiation support systems, and collaboration tools. The contents will help them to understand the concept of automated negotiations, negotiation protocols, negotiating agents’ strategies, and the applications of those strategies. In this book, some negotiation protocols focusing on the multiple interdependent issues in negotiations are presented, making it possible to find high-quality solutions for the complex agents’ utility functions. This book is a compilation of the extended versions of the very best papers selected from the many that were presented at the International Workshop on Agent-Based Complex Automated Negotiations.

  18. Parallelize Automated Tests in a Build and Test Environment

    OpenAIRE

    Durairaj, Selva Ganesh

    2016-01-01

    This thesis investigates the possibilities of finding solutions, in order to reduce the total time spent for testing and waiting times for running multiple automated test cases in a test framework. The “Automated Test Framework”, developed by Axis Communications AB, is used to write the functional tests to test both hardware and software of a resource. The functional tests that test the software are considered in this thesis work. In the current infrastructure, tests are executed sequentially...

  19. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    Science.gov (United States)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  20. Hierarchical sampling of multiple strata: an innovative technique in exposure characterization

    International Nuclear Information System (INIS)

    Ericson, J.E.; Gonzalez, Elisabeth J.

    2003-01-01

    Sampling of multiple strata, or hierarchical sampling of various exposure sources and activity areas, has been tested and is suggested as a method to sample (or to locate) areas with a high prevalence of elevated blood lead in children. Hierarchical sampling was devised to supplement traditional soil lead sampling of a single stratum, either residential or fixed point source, using a multistep strategy. Blood lead (n=1141) and soil lead (n=378) data collected under the USEPA/UCI Tijuana Lead Project (1996-1999) were analyzed to evaluate the usefulness of sampling soil lead from background sites, schools and parks, point sources, and residences. Results revealed that industrial emissions have been a contributing factor to soil lead contamination in Tijuana. At the regional level, point source soil lead was associated with mean blood lead levels and concurrent high background, and point source soil lead levels were predictive of a high percentage of subjects with blood lead equal to or greater than 10 μg/dL (pe 10). Significant relationships were observed between mean blood lead level and fixed point source soil lead (r=0.93; R²=0.72 using a quadratic model) and between residential soil lead and fixed point source soil lead (r=0.90; R²=0.86 using a cubic model). This study suggests that point sources alone are not sufficient for predicting the relative risk of exposure to lead in the urban environment. These findings will be useful in defining regions for targeted or universal soil lead sampling by site type. Point sources have been observed to be predictive of mean blood lead at the regional level; however, this relationship alone was not sufficient to predict pe 10. It is concluded that when apparently undisturbed sites reveal high soil lead levels in addition to local point sources, dispersion of lead is widespread and will be associated with a high prevalence of elevated blood lead in children. Multiple strata sampling was shown to be useful in
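Fitting a polynomial exposure model and reporting R², as done above, is routine with least squares. A sketch with invented data (the study's raw soil/blood pairs are not reproduced in the abstract, so these numbers are purely illustrative):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical soil-lead (ppm) vs. mean blood-lead (ug/dL) pairs,
# NOT the study's data.
soil = np.array([50.0, 120.0, 300.0, 600.0, 900.0, 1200.0])
blood = (2.0 + 0.01 * soil - 2e-6 * soil ** 2
         + np.array([0.1, -0.2, 0.15, -0.1, 0.05, 0.0]))  # small noise

coeffs = np.polyfit(soil, blood, deg=2)   # quadratic model, as in the study
fit = np.polyval(coeffs, soil)
print(round(r_squared(blood, fit), 3))    # near 1 for this near-quadratic data
```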

  1. EXPERIMENTS TOWARDS DETERMINING BEST TRAINING SAMPLE SIZE FOR AUTOMATED EVALUATION OF DESCRIPTIVE ANSWERS THROUGH SEQUENTIAL MINIMAL OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Sunil Kumar C

    2014-01-01

    Full Text Available With the number of students growing each year, there is a strong need for automated systems capable of evaluating descriptive answers. Unfortunately, there aren’t many systems capable of performing this task. In this paper, we use a machine learning tool called LightSIDE to accomplish auto evaluation and scoring of descriptive answers. Our experiments are designed to cater to our primary goal of identifying the optimum training sample size so as to get optimum auto scoring. Besides the technical overview and the experiment design, the paper also covers the challenges and benefits of the system. We also discuss interdisciplinary areas for future research on this topic.

  2. Development of an automated desktop procedure for defining macro ...

    African Journals Online (AJOL)

    2006-07-03

    'break points' such as ... An automated desktop procedure was developed for computing statistically defensible, multiple change ... from source to mouth. ... the calculated value was less than the test statistic given in Owen.

  3. Homogeneous immunosubtraction integrated with sample preparation is enabled by a microfluidic format

    Science.gov (United States)

    Apori, Akwasi A.; Herr, Amy E.

    2011-01-01

    Immunosubtraction is a powerful and resource-intensive laboratory medicine assay that reports both protein mobility and binding specificity. To expedite and automate this electrophoretic assay, we report on advances to the electrophoretic immunosubtraction assay by introducing a homogeneous, not heterogeneous, format with integrated sample preparation. To accomplish homogeneous immunosubtraction, a step-decrease in separation matrix pore-size at the head of a polyacrylamide gel electrophoresis (PAGE) separation channel enables ‘subtraction’ of target analyte when capture antibody is present (as the large immune-complex is excluded from PAGE), but no subtraction when capture antibody is absent. Inclusion of sample preparation functionality via small pore size polyacrylamide membranes is also key to automated operation (i.e., sample enrichment, fluorescence sample labeling, and mixing of sample with free capture antibody). Homogeneous sample preparation and assay operation allows on-the-fly, integrated subtraction of one to multiple protein targets and reuse of each device. Optimization of the assay is detailed, which allowed for ~95% subtraction of target with 20% non-specific extraction of large species at the optimal antibody-antigen ratio, providing conditions needed for selective target identification. We demonstrate the assay on putative markers of injury and inflammation in cerebrospinal fluid (CSF), an emerging area of diagnostics research, by rapidly reporting protein mobility and binding specificity within the sample matrix. We simultaneously detect S100B and C-reactive protein, suspected biomarkers for traumatic brain injury (TBI), in ~2 min. Lastly, we demonstrate S100B detection (65 nM) in raw human CSF with a lower limit of detection of ~3.25 nM, within the clinically relevant concentration range for detecting TBI in CSF. Beyond the novel CSF assay introduced here, a fully automated immunosubtraction assay would impact a spectrum of routine but labor

  4. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Milliard, Alex; Durand-Jezequel, Myriam [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada); Lariviere, Dominic, E-mail: dominic.lariviere@chm.ulaval.ca [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada)

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs), and experimental data show no evidence of cross-contamination of crucibles when LiBO{sub 2}/LiBr melts were used. The use of a M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg{sup -1} for 5-300 mg of sample.

  5. Automation of registration of sample weights for high-volume neutron activation analysis at the IBR-2 reactor of FLNP, JINR

    International Nuclear Information System (INIS)

    Dmitriev, A.Yu.; Dmitriev, F.A.

    2015-01-01

    The 'Weight' software tool was created at FLNP JINR to automate the reading of analytical balance readouts and the saving of these values in the NAA database. An analytical balance connected to a personal computer is used to measure weight values. The 'Weight' software tool controls the reading of weight values and the exchange of information with the NAA database. This reliably supports the weighing of large numbers of samples during high-volume neutron activation analysis.

  6. Identification of multiple mRNA and DNA sequences from small tissue samples isolated by laser-assisted microdissection.

    Science.gov (United States)

    Bernsen, M R; Dijkman, H B; de Vries, E; Figdor, C G; Ruiter, D J; Adema, G J; van Muijen, G N

    1998-10-01

    Molecular analysis of small tissue samples has become increasingly important in biomedical studies. Using a laser dissection microscope and modified nucleic acid isolation protocols, we demonstrate that multiple mRNA as well as DNA sequences can be identified from a single-cell sample. In addition, we show that the specificity of procurement of tissue samples is not compromised by smear contamination resulting from scraping of the microtome knife during sectioning of lesions. The procedures described herein thus allow for efficient RT-PCR or PCR analysis of multiple nucleic acid sequences from small tissue samples obtained by laser-assisted microdissection.

  7. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Directory of Open Access Journals (Sweden)

    Casey Olives

    Full Text Available Originally a binary classifier, Lot Quality Assurance Sampling (LQAS has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa.We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa.Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87. In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50, the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error.This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
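The operating characteristics of a three-category LQAS rule reduce to binomial tail probabilities. A sketch with hypothetical decision thresholds (d1, d2 below are illustrative design choices, not the thresholds derived in the paper):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def mc_lqas_class_probs(n, d1, d2, p):
    """Probability of each class under a three-category LQAS rule:
    classify 'low' if positives <= d1, 'high' if positives > d2,
    'medium' otherwise. The triple (n, d1, d2) is the design; the values
    used here are hypothetical, for illustration only."""
    p_low = binom_cdf(d1, n, p)
    p_med = binom_cdf(d2, n, p) - p_low
    p_high = 1.0 - p_low - p_med
    return p_low, p_med, p_high

# Operating characteristic at two true prevalences for a design with n=15.
print(mc_lqas_class_probs(15, 2, 7, 0.05))   # mostly 'low' at 5% prevalence
print(mc_lqas_class_probs(15, 2, 7, 0.60))   # mostly 'high' at 60% prevalence
```

Sweeping p from 0 to 1 traces the operating characteristic curves; semi-curtailed sampling additionally stops a school early once a classification becomes inevitable.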

  8. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at ~60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at ~70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with accuracy of 98%.

  9. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less.

  10. First evaluation of automated specimen inoculation for wound swab samples by use of the Previ Isola system compared to manual inoculation in a routine laboratory: finding a cost-effective and accurate approach.

    Science.gov (United States)

    Mischnik, Alexander; Mieth, Markus; Busch, Cornelius J; Hofer, Stefan; Zimmermann, Stefan

    2012-08-01

    Automation of plate streaking is ongoing in clinical microbiological laboratories, but evaluation for routine use is mostly open. In the present study, the recovery of microorganisms from the Previ Isola system plated polyurethane (PU) swab samples is compared to manually plated control viscose swab samples from wounds according to the CLSI procedure M40-A (quality control of microbiological transport systems). One hundred twelve paired samples (224 swabs) were analyzed. In 80/112 samples (71%), concordant culture results were obtained with the two methods. In 32/112 samples (29%), CFU recovery of microorganisms from the two methods was discordant. In 24 (75%) of the 32 paired samples with a discordant result, Previ Isola plated PU swabs were superior. In 8 (25%) of the 32 paired samples with a discordant result, control viscose swabs were superior. The quality of colony growth on culture media for further investigations was superior with Previ Isola inoculated plates compared to manual plating techniques. Gram stain results were concordant between the two methods in 62/112 samples (55%). In 50/112 samples (45%), the results of Gram staining were discordant between the two methods. In 34 (68%) of the 50 paired samples with discordant results, Gram staining of PU swabs was superior to that of control viscose swabs. In 16 (32%) of the 50 paired samples, Gram staining of control viscose swabs was superior to that of PU swabs. We report the first clinical evaluation of Previ Isola automated specimen inoculation for wound swab samples. This study suggests that use of an automated specimen inoculation system has good results with regard to CFU recovery, quality of Gram staining, and accuracy of diagnosis.

  11. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  12. A neutron multiplicity analysis method for uranium samples with liquid scintillators

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Hao, E-mail: zhouhao_ciae@126.com [China Institute of Atomic Energy, P.O.BOX 275-8, Beijing 102413 (China); Lin, Hongtao [Xi'an Research Institute of High-tech, Xi'an, Shaanxi 710025 (China); Liu, Guorong; Li, Jinghuai; Liang, Qinglei; Zhao, Yonggang [China Institute of Atomic Energy, P.O.BOX 275-8, Beijing 102413 (China)

    2015-10-11

    A new neutron multiplicity analysis method for uranium samples with liquid scintillators is introduced. An active well-type fast neutron multiplicity counter has been built, which consists of four BC501A liquid scintillators, an n/γ discrimination module (MPD-4), a multi-stop time-to-digital converter (MCS6A), and two Am–Li sources. A mathematical model is built to represent the detection processes of fission neutrons. Based on this model, equations of the form R = F*P*Q*T can be obtained, where F indicates the induced fission rate from the interrogation sources, P indicates the transfer matrix determined by the multiplication process, Q indicates the transfer matrix determined by detection efficiency, and T indicates the transfer matrix determined by the signal recording process and crosstalk in the counter. Unknown parameters of the item are determined from the solutions of these equations. A ²⁵²Cf source and some low-enriched uranium items have been measured. The feasibility of the method is proven by its application to the data analysis of the experiments.
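The matrix relation above lends itself to a compact numerical treatment. The sketch below illustrates the R = F*P*Q*T structure with small invented 3×3 matrices; the real transfer matrices come from the counter's physics model, and the simple least-squares inversion here stands in for the paper's actual parameter estimation.

```python
import numpy as np

# Illustrative sketch of the R = F*P*Q*T structure described above.
# P, Q, T are hypothetical 3x3 transfer matrices (multiplication,
# detection efficiency, signal recording); their real entries come
# from the counter's physics model, not from this example.
P = np.array([[1.0, 0.0, 0.0],
              [0.3, 0.7, 0.0],
              [0.1, 0.3, 0.6]])
Q = np.diag([1.0, 0.8, 0.6])      # detection-efficiency weighting
T = np.diag([1.0, 0.95, 0.9])     # recording/crosstalk weighting

def forward(F, source):
    """Predict the measured multiplicity vector R for fission rate F."""
    return F * (T @ Q @ P @ source)

def infer_rate(R, source):
    """Recover F from a measured R by linear least squares."""
    model = T @ Q @ P @ source
    return float(model @ R / (model @ model))

source = np.array([0.5, 0.3, 0.2])    # assumed emission multiplicity distribution
R = forward(2.0, source)              # simulate a measurement with F = 2.0
print(infer_rate(R, source))          # 2.0
```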

  13. A novel sampling method for multiple multiscale targets from scattering amplitudes at a fixed frequency

    Science.gov (United States)

    Liu, Xiaodong

    2017-08-01

    A sampling method using the scattering amplitude is proposed for shape and location reconstruction in inverse acoustic scattering problems. Only matrix multiplication is involved in the computation, so the novel sampling method is very easy and simple to implement. With the help of the factorization of the far-field operator, we establish an inf-criterion for characterization of the underlying scatterers. This result is then used to give a lower bound on the proposed indicator functional for sampling points inside the scatterers. For sampling points outside the scatterers, we show that the indicator functional decays like the Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functional depends continuously on the scattering amplitude, which further implies that the novel sampling method is extremely stable with respect to errors in the data. Unlike classical sampling methods such as the linear sampling method or the factorization method, from the numerical point of view the novel indicator takes its maximum near the boundary of the underlying target and decays like the Bessel functions as the sampling points move away from the boundary. The numerical simulations also show that the proposed sampling method can handle the multiple multiscale case, even when the different components are close to each other.
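A generic direct-sampling indicator of the kind described, computed purely by matrix multiplication of far-field data against plane-wave test functions, can be sketched as follows. This is an illustrative stand-in, not the paper's exact inf-criterion indicator; the wavenumber, observation geometry, and synthetic point-scatterer data are all assumptions.

```python
import numpy as np

# Generic direct-sampling indicator in the spirit of the abstract:
# only inner products of the far-field data with plane-wave test
# functions are needed. Illustrative sketch, not the paper's exact
# indicator functional.
k = 2 * np.pi                                   # assumed wavenumber
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
xhat = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # observation dirs

def indicator(far_field, z):
    """|<u_inf, e^{-ik xhat.z}>| averaged over the observation directions."""
    phi = np.exp(-1j * k * (xhat @ z))
    return abs(np.vdot(phi, far_field)) / len(angles)

# Synthetic far field of a point scatterer at z0: u_inf = e^{-ik xhat.z0}
z0 = np.array([0.3, -0.2])
u_inf = np.exp(-1j * k * (xhat @ z0))
# Indicator peaks (≈ 1) at the true location and decays away from it
print(indicator(u_inf, z0) > indicator(u_inf, z0 + np.array([1.0, 1.0])))  # True
```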

  14. Designing an automated blood fractionation system.

    Science.gov (United States)

    McQuillan, Adrian C; Sales, Sean D

    2008-04-01

    UK Biobank will be collecting blood samples from a cohort of 500 000 volunteers, and it is expected that the rate of collection will peak at approximately 3000 blood collection tubes per day. These samples need to be prepared for long-term storage. It is not considered practical to process this quantity of samples manually, so an automated blood fractionation system is required. Principles of industrial automation were applied to the blood fractionation process, leading to the requirement for a vision system to identify the blood fractions within the blood collection tube so that the fractions can be accurately aspirated and dispensed into micro-tubes. A prototype was manufactured and tested on a range of human blood samples collected in different tube types. A specially designed vision system was capable of accurately measuring the position of the plasma meniscus, the plasma/buffy coat interface and the red cells/buffy coat interface within a vacutainer, and a rack of 24 vacutainers could be processed in one batch. The blood fractionation system offers a solution to the problem of processing human blood samples collected in vacutainers in a consistent manner and provides a means of ensuring data and sample integrity.
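The interface-detection idea behind such a vision system can be caricatured in a few lines: scan a vertical intensity profile of the tube image for sharp brightness jumps. The profile values and threshold below are invented for illustration; the real system works on calibrated camera images.

```python
# Hypothetical sketch of locating fraction interfaces from a vertical
# intensity profile of a tube image. Profile values and the threshold
# are invented for illustration only.
def find_interfaces(profile, threshold=30):
    """Return indices where brightness jumps by more than `threshold`."""
    return [i for i in range(1, len(profile))
            if abs(profile[i] - profile[i - 1]) > threshold]

# Bright plasma (200), thin buffy coat (120), dark red cells (40)
profile = [200] * 5 + [120] * 2 + [40] * 5
print(find_interfaces(profile))  # [5, 7] -> plasma/buffy and buffy/red cell
```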

  15. Automated diagnostic kiosk for diagnosing diseases

    Science.gov (United States)

    Regan, John Frederick; Birch, James Michael

    2014-02-11

    An automated and autonomous diagnostic apparatus that is capable of dispensing collection vials and collection kits to users interested in collecting a biological sample, and of accepting a collected sample contained within a collection vial for automated diagnostic services. The user communicates with the apparatus through a touch-screen monitor. A user is able to enter personal information into the apparatus, including medical history, insurance information and co-payment, and answer a series of questions regarding their illness, which is used to determine the assay most likely to yield a positive result. Remotely located physicians can communicate with users of the apparatus using video telemedicine and request specific assays to be performed. The apparatus archives submitted samples for additional testing. Users may receive their assay results electronically. Users may allow the uploading of their diagnoses into a central databank for disease surveillance purposes.

  16. Automated system for noise-measurements on low-ohmic samples and magnetic sensors

    NARCIS (Netherlands)

    Jonker, R.J.W.; Briaire, J.; Vandamme, L.K.J.

    1999-01-01

    An automated system for electronic noise measurements on metal films is presented. This new system, controlled by a personal computer which utilizes National Instruments' LabVIEW software, is designed to measure low frequency noise as a function of an externally imposed magnetic field and as a

  17. Adaptation of a T3-uptake test and of radioimmunoassays for serum digoxin, thyroxine, and triiodothyronine to an automated radioimmunoassay system: "Centria"

    International Nuclear Information System (INIS)

    Ertingshausen, G.; Shapiro, S.I.; Green, G.; Zborowski, G.

    1975-01-01

    We report the adaptation of four radioassays to the prototype of an automated radioimmunoassay system ("Centria," Union Carbide). The system consists of three integrated modules: an automated pipettor, which dispenses samples and reagents; the key module, an incubator/separator, in which centrifugal force is used to initiate and terminate multiple radioassay incubations and separations simultaneously; and a gamma counter/computer, which counts three tubes simultaneously and converts counts into concentration units. Radioimmunoassays for thyroxine, triiodothyronine, and digoxin were developed using well-characterized antibodies and prepackaged Sephadex-containing columns to separate bound and free radioactive ligand. A triiodothyronine-uptake test using the same kind of columns was also adapted to the instrument. Results for clinical samples compared favorably with those obtained by manual procedures. We report data on correlation between different methods and preliminary data on the precision of the prototype system.

  18. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  19. Automation of diagnostic genetic testing: mutation detection by cyclic minisequencing.

    Science.gov (United States)

    Alagrund, Katariina; Orpana, Arto K

    2014-01-01

    The rising role of nucleic acid testing in clinical decision making is creating a need for efficient and automated diagnostic nucleic acid test platforms. Clinical use of nucleic acid testing demands shorter turnaround times (TATs), lower production costs, and robust, reliable methods that can easily adopt new test panels and run rare tests on a random-access principle. Here we present a novel home-brew laboratory automation platform for diagnostic mutation testing. The platform is based on cyclic minisequencing (cMS) and two-color near-infrared (NIR) detection. Pipetting is automated using Tecan Freedom EVO pipetting robots, and all assays are performed in 384-well microplate format. The automation platform includes a data processing system that controls all procedures and automatically reports patient results to the hospital information system. We have found automated cMS to be a reliable, inexpensive and robust method for nucleic acid testing across a wide variety of diagnostic tests. The platform is currently in clinical use for over 80 mutations or polymorphisms. In addition to tests performed on blood samples, the system also performs an epigenetic test for methylation of the MGMT gene promoter, and companion diagnostic tests for analysis of KRAS and BRAF gene mutations from formalin-fixed, paraffin-embedded tumor samples. Automation of genetic test reporting was found to be reliable and efficient, decreasing the workload of academic personnel.

  20. 3S-R10 automated RBS system

    International Nuclear Information System (INIS)

    Norton, G.A.; Schroeder, J.B.; Klody, G.M.; Strathman, M.D.

    1989-01-01

    The NEC 3S-R10 automated RBS spectrometer system includes the features required for routine application of Rutherford backscattering (RBS) and related techniques for materials analysis in both research and industrial settings. The NEC Model 3SDH Pelletron accelerator system provides stable, monoenergetic beams of helium ions up to 3.3 MeV and protons to 2.2 MeV and has heavy ion capability. The analytical end station is the fully computerized Charles Evans and Associates Model RBS-400. Automated features include sample positioning (precision 4-axis goniometer), channeling alignment, polar plot generation, and data acquisition and reduction. Computer automation of accelerator and chamber functions includes storage and recall of operating parameters. Unattended data acquisition, e.g., overnight or over a weekend, is possible for up to 100 samples per batch for random orientation, rotating random or channeling analyses at any sample location. Single samples may be up to 50 cm in diameter. A laser for sample alignment and a TV for video monitoring are included. Simultaneous detection (up to 4 detectors) at normal and grazing angles, external control of grazing angle detector position, and transmission scattering capability enhance system flexibility. The system is also compatible with PIXE, NRA, and hydrogen forward-backscattering analyses. Data reduction is part of the computer system, which features displays (several formats) and manipulation of up to five spectra at one time using constant multipliers or point by point operations between spectra. (orig.)
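One standard piece of the RBS data reduction such a system automates is the kinematic factor, which maps the backscattered ion's energy to the target atom's mass. The formula below is the textbook one; the helium-on-silicon example is illustrative.

```python
import math

# The RBS kinematic factor K relates the backscattered ion energy to
# the incident energy, E1 = K * E0, for projectile mass m1, target
# mass m2 (amu) and laboratory scattering angle theta. Standard formula.
def kinematic_factor(m1, m2, theta_deg):
    t = math.radians(theta_deg)
    root = math.sqrt(m2**2 - (m1 * math.sin(t))**2)
    return ((root + m1 * math.cos(t)) / (m1 + m2))**2

# 4He (m = 4) backscattered from Si (m = 28) at 170 degrees
print(round(kinematic_factor(4.0, 28.0, 170.0), 3))  # 0.565
```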

  1. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks have led to further efforts towards improving the existing automation based software production methodology. A large number of control system upgrades were successfully performed for th...

  2. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Full Text Available Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and, among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so recoveries decline over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, derivatised samples spent no waiting time on the autosamplers, reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices: a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and achieved excellent results, specifically for sugars, sugar alcohols, and some organic acids. To the best of our knowledge, this is the first time the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found the method highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  3. Standards for space automation and robotics

    Science.gov (United States)

    Kader, Jac B.; Loftin, R. B.

    1992-01-01

    The AIAA's Committee on Standards for Space Automation and Robotics (COS/SAR) is charged with the identification of key functions and critical technologies applicable to multiple missions that reflect fundamental consideration of environmental factors. COS/SAR's standards/practices/guidelines implementation methods will be based on reliability, performance, and operations, as well as economic viability and life-cycle costs, simplicity, and modularity.

  4. M073: Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2004-01-01

    A quality check for an automated system for analyzing large sets of neutron-activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a standard quality assurance/quality control (QA/QC) test. A select group of sample spectra are also visually reviewed to check the peak-fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contains a radioisotope matrix similar to what is expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to the expected results for QA/QC assurance.
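The final comparison step can be reduced to a simple tolerance check of the automated results against the expected concentrations for the reference spectrum. The isotope names, values, and 10% tolerance below are invented for illustration.

```python
# Sketch of the QA/QC comparison step: automated results for the
# MCNP-generated reference spectrum are checked against the expected
# concentrations within a fractional tolerance. Values are illustrative.
def qa_check(measured, expected, tolerance=0.10):
    """Map each isotope to True/False: within `tolerance` of expected?"""
    return {iso: abs(measured[iso] - expected[iso]) / expected[iso] <= tolerance
            for iso in expected}

expected = {"Na-24": 100.0, "K-42": 50.0}
measured = {"Na-24": 103.0, "K-42": 41.0}
print(qa_check(measured, expected))  # {'Na-24': True, 'K-42': False}
```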

  5. A simple and automated sample preparation system for subsequent halogens determination: Combustion followed by pyrohydrolysis.

    Science.gov (United States)

    Pereira, L S F; Pedrotti, M F; Vecchia, P Dalla; Pereira, J S F; Flores, E M M

    2018-06-20

    A simple and automated system based on combustion followed by a pyrohydrolysis reaction was proposed for subsequent halogens determination. The system was applied to the digestion of soils containing high (90%) and low (10%) organic matter content for further halogens determination. The following parameters were evaluated: sample mass, use of microcrystalline cellulose, and heating time. For analyte absorption, a diluted alkaline solution (6 mL of 25 mmol L⁻¹ NH₄OH) was used in all experiments. Up to 400 mg of soil with high organic matter content and 100 mg of soil with low organic matter content (mixed with 400 mg of cellulose) could be completely digested using the proposed system. Quantitative results for all halogens were obtained with less than 12 min of sample preparation (about 1.8 min for sample combustion and 10 min for pyrohydrolysis). The accuracy was evaluated using a certified reference material of coal and spiked samples. No statistical difference was observed between the certified values and the results obtained by the proposed method. Additionally, the recoveries obtained using spiked samples were in the range of 98-103%, with relative standard deviation values lower than 5%. The limits of quantification obtained for F, Cl, Br and I for soil with high (400 mg of soil) and low (100 mg of soil) organic matter were in the range of 0.01-2 μg g⁻¹ and 0.07-59 μg g⁻¹, respectively. The proposed system was considered a simple and suitable alternative for soil digestion prior to halogens determination by ion chromatography and inductively coupled plasma mass spectrometry. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Development of INAA automation at CARR

    International Nuclear Information System (INIS)

    Yao Yonggang; Xiao Caijin; Hua Long; Jin Xiangchun; Wang Xinghua; Li Jue; Cao Lei; Wang Pingsheng; Ni Bangfa

    2016-01-01

    A new INAA (instrumental neutron activation analysis) automatic measurement system has been established at the China Advanced Research Reactor. Three HPGe spectrometers can operate simultaneously, 24 h a day. Each sample rabbit contains an RFID (radio frequency identification) tag, which is used for identifying and/or tracing the sample route or location. The automation system, controlled by Kingview 6.55 software and a programmable logic controller, is mainly composed of four parts: a sample counting system, a sample driving system, a sample holder and a sample exchanger. (author)

  7. Turbidity-controlled sampling for suspended sediment load estimation

    Science.gov (United States)

    Jack Lewis

    2003-01-01

    Abstract - Automated data collection is essential to effectively measure suspended sediment loads in storm events, particularly in small basins. Continuous turbidity measurements can be used, along with discharge, in an automated system that makes real-time sampling decisions to facilitate sediment load estimation. The Turbidity Threshold Sampling method distributes...

  8. Extending laboratory automation to the wards: effect of an innovative pneumatic tube system on diagnostic samples and transport time.

    Science.gov (United States)

    Suchsland, Juliane; Winter, Theresa; Greiser, Anne; Streichert, Thomas; Otto, Benjamin; Mayerle, Julia; Runge, Sören; Kallner, Anders; Nauck, Matthias; Petersmann, Astrid

    2017-02-01

    The innovative pneumatic tube system (iPTS) transports one sample at a time without the use of cartridges and allows rapid sending of samples directly into the bulk loader of a laboratory automation system (LAS). We investigated the effects of the iPTS on samples and on turn-around time (TAT). During transport, a mini data logger recorded the accelerations in three dimensions and reported them in arbitrary area under the curve (AUC) units. In addition, representative clinical chemistry, hematology and coagulation quantities were measured and compared in 20 blood sample pairs transported by iPTS and by courier. Samples transported by iPTS were brought to the laboratory (300 m) within 30 s without adverse effects on the samples. The information retrieved from the data logger showed a median AUC of 7 and 310 arbitrary units for courier and iPTS transport, respectively. This is considerably below the reported limit for noticeable hemolysis of 500 arbitrary units. The iPTS reduces TAT through reduced hands-on time and fast transport. No differences in the measurement results were found between courier and iPTS transport for any of the 36 investigated analytes. Based on these findings, the iPTS was cleared for clinical use in our hospital.
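One plausible way to reduce a logger's three-axis accelerations to a single AUC figure is to integrate the acceleration magnitude over the transport time. The commercial logger's exact algorithm is not described in the abstract, so the sketch below, including the sampling rate and data, is an assumption.

```python
import math

# Hypothetical reduction of three-axis logger data to an AUC figure:
# trapezoidal area under the acceleration-magnitude curve. The real
# logger's algorithm and units are not specified in the abstract.
def acceleration_auc(ax, ay, az, dt):
    """Trapezoidal area under the acceleration-magnitude curve."""
    mag = [math.sqrt(x*x + y*y + z*z) for x, y, z in zip(ax, ay, az)]
    return sum((mag[i] + mag[i + 1]) * dt / 2 for i in range(len(mag) - 1))

# Two seconds of invented samples at 2 Hz (dt = 0.5 s)
ax, ay, az = [0.0, 3.0, 0.0, 0.0], [0.0, 4.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0]
print(acceleration_auc(ax, ay, az, 0.5))  # 2.5
```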

  9. The automated sample preparation system MixMaster for investigation of volatile organic compounds with mid-infrared evanescent wave spectroscopy.

    Science.gov (United States)

    Vogt, F; Karlowatz, M; Jakusch, M; Mizaikoff, B

    2003-04-01

    For efficient development, assessment, and calibration of new chemical analyzers, a large number of independently prepared samples of target analytes is necessary. Whereas mixing units for gas analysis are readily available, there is a lack of instrumentation for accurate preparation of liquid samples containing volatile organic compounds (VOCs). Manual preparation of liquid samples containing VOCs at trace concentration levels is a particularly challenging and time-consuming task. Furthermore, regularly scheduled calibration of sensors and analyzer systems demands computer-controlled, automated sample preparation systems. In this paper we present a novel liquid mixing device enabling extensive measurement series focused on volatile organic compounds, facilitating analysis of water polluted by traces of volatile hydrocarbons. After discussing the mixing system and control software, first results obtained by coupling with an FT-IR spectrometer are reported. Properties of the mixing system are assessed by mid-infrared attenuated total reflection (ATR) spectroscopy of methanol-acetone mixtures and by investigation of multicomponent samples containing volatile hydrocarbons such as 1,2,4-trichlorobenzene and tetrachloroethylene. The obtained ATR spectra are evaluated by principal component regression (PCR) algorithms. It is demonstrated that the presented sample mixing device provides reliable multicomponent mixtures with sufficient accuracy and reproducibility at trace concentration levels.
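Principal component regression of the kind used for the ATR spectra can be sketched with plain linear algebra: project mean-centred spectra onto the leading singular vectors, then fit concentrations to the scores by least squares. The synthetic one-component data below are an assumption, not instrument data.

```python
import numpy as np

# Minimal principal component regression (PCR), the evaluation method
# named in the abstract. Synthetic data stand in for the ATR spectra.
def pcr_fit(X, y, n_components):
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T          # project onto leading PCs
    coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    return Vt[:n_components], coef, X.mean(axis=0), y.mean()

def pcr_predict(model, X_new):
    Vt_k, coef, x_mean, y_mean = model
    return (X_new - x_mean) @ Vt_k.T @ coef + y_mean

rng = np.random.default_rng(0)
t = rng.normal(size=20)                        # latent concentration factor
loading = rng.normal(size=50)                  # spectral signature
spectra = np.outer(t, loading) + 0.01 * rng.normal(size=(20, 50))
conc = 3.0 * t + 1.0                           # synthetic linear relationship
model = pcr_fit(spectra, conc, 2)
print(np.allclose(pcr_predict(model, spectra), conc, atol=0.1))  # True
```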

  10. Sample Set (SE): SE57 [Metabolonote[Archive

    Lifescience Database Archive (English)

    Full Text Available Metabolite accumulation patterns in plants. We optimized the MRM conditions for specific compounds by performing automate...applied to high-throughput automated analysis of biological samples using TQMS coupled with ultra performanc...ies, and family-specific metabolites could be predicted using a batch-learning self-organizing map analysis. Thus, the automate

  11. Compensation Methods for Non-uniform and Incomplete Data Sampling in High Resolution PET with Multiple Scintillation Crystal Layers

    International Nuclear Information System (INIS)

    Lee, Jae Sung; Kim, Soo Mee; Lee, Dong Soo; Hong, Jong Hong; Sim, Kwang Souk; Rhee, June Tak

    2008-01-01

    To establish methods for sinogram formation and correction in order to appropriately apply the filtered backprojection (FBP) reconstruction algorithm to data acquired using a PET scanner with multiple scintillation crystal layers. Methods for raw PET data storage and for conversion from list-mode data to histogram and sinogram were optimized. To solve the various problems that occurred while the raw histogram was converted into a sinogram, an optimal sampling strategy and a sampling efficiency correction method were investigated. Gap compensation methods that are unique to this system were also investigated. All the sinogram data were reconstructed using a 2D filtered backprojection algorithm and compared to estimate the improvements from the correction algorithms. The optimal radial sampling interval and number of angular samples, in terms of the sampling theorem and the sampling efficiency correction algorithm, were pitch/2 and 120, respectively. By applying the sampling efficiency correction and gap compensation, artifacts and background noise in the reconstructed image could be reduced. A conversion method from histogram to sinogram was established for the FBP reconstruction of data acquired using multiple scintillation crystal layers. This method will be useful for fast 2D reconstruction of multiple-crystal-layer PET data.
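The two corrections described, sampling-efficiency normalization and gap compensation, can be sketched on a toy sinogram: divide each bin by the number of contributing crystal pairs, then fill gap bins by radial interpolation. The geometry and counts below are invented, and the actual system's interpolation scheme may differ.

```python
import numpy as np

# Toy illustration of the two corrections named in the abstract:
# per-bin sampling-efficiency normalization followed by gap
# compensation via linear interpolation along the radial axis.
def correct_sinogram(sino, efficiency):
    out = sino.astype(float).copy()
    valid = efficiency > 0
    out[valid] /= efficiency[valid]            # sampling-efficiency correction
    for ang in range(out.shape[0]):            # gap compensation per angle
        row, gaps = out[ang], ~valid[ang]
        if gaps.any() and (~gaps).any():
            idx = np.arange(row.size)
            row[gaps] = np.interp(idx[gaps], idx[~gaps], row[~gaps])
    return out

sino = np.array([[8.0, 4.0, 0.0, 6.0]])        # one angular row, 4 radial bins
eff = np.array([[2, 1, 0, 2]])                 # bin 2 is a detector gap
print(correct_sinogram(sino, eff).tolist())    # [[4.0, 4.0, 3.5, 3.0]]
```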

  12. Automating Energy Bandgap Measurements in Semiconductors Using LabVIEW

    Science.gov (United States)

    Garg, Amit; Sharma, Reena; Dhingra, Vishal

    2010-01-01

    In this paper, we report the development of an automated system for energy bandgap and resistivity measurement of a semiconductor sample using the four-probe method, for use in the undergraduate laboratory of Physics and Electronics students. The automated data acquisition and analysis system has been developed using a National Instruments USB-6008 DAQ…
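The physics behind the analysis step is standard: for an intrinsic semiconductor, ln ρ is linear in 1/T with slope E_g/(2k_B), so the bandgap follows from resistivities at two temperatures. The germanium-like numbers below are synthetic, not data from the reported system.

```python
import math

# Standard four-probe bandgap analysis: Eg = 2*k_B * d(ln rho)/d(1/T).
# The data below are synthetic, generated with Eg = 0.67 eV.
K_B = 8.617e-5  # Boltzmann constant, eV/K

def bandgap_eV(T1, rho1, T2, rho2):
    """Bandgap from resistivities at two temperatures (intrinsic regime)."""
    return 2 * K_B * (math.log(rho1) - math.log(rho2)) / (1 / T1 - 1 / T2)

T1, T2 = 300.0, 350.0
rho1 = math.exp(0.67 / (2 * K_B * T1))   # synthetic germanium-like data
rho2 = math.exp(0.67 / (2 * K_B * T2))
print(round(bandgap_eV(T1, rho1, T2, rho2), 2))  # 0.67
```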

  13. Dockomatic - automated ligand creation and docking.

    Science.gov (United States)

    Bullock, Casey W; Jacob, Reed B; McDougal, Owen M; Hampikian, Greg; Andersen, Tim

    2010-11-08

    The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to setup and run jobs, and collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high throughput screening of ligand to receptor interactions. DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.
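The residue-expansion step behind building peptide .pdb files from one-letter strings can be sketched with the standard IUPAC mapping; the actual coordinate generation DockoMatic performs is omitted here.

```python
# Sketch of the one-letter-to-three-letter residue expansion behind
# generating peptide .pdb files from amino acid strings. The mapping
# is the standard IUPAC one; .pdb geometry generation is omitted.
ONE_TO_THREE = {
    "A": "ALA", "R": "ARG", "N": "ASN", "D": "ASP", "C": "CYS",
    "Q": "GLN", "E": "GLU", "G": "GLY", "H": "HIS", "I": "ILE",
    "L": "LEU", "K": "LYS", "M": "MET", "F": "PHE", "P": "PRO",
    "S": "SER", "T": "THR", "W": "TRP", "Y": "TYR", "V": "VAL",
}

def residues(sequence):
    """Expand a one-letter peptide string to three-letter residue names."""
    return [ONE_TO_THREE[aa] for aa in sequence.upper()]

print(residues("GRGDS"))  # ['GLY', 'ARG', 'GLY', 'ASP', 'SER']
```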

  14. Dockomatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Abstract. Background: The application of computational modeling to rationally design drugs and characterize macromolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand-to-receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs and to collect results. This paper presents DockoMatic, a user-friendly graphical user interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand-receptor interactions. Results: DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates the creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions: DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  15. Capacity analysis of an automated kit transportation system

    NARCIS (Netherlands)

    Zijm, W.H.M.; Adan, I.J.B.F.; Buitenhek, R.; Houtum, van G.J.J.A.N.

    2000-01-01

    In this paper, we present a capacity analysis of an automated transportation system in a flexible assembly factory. The transportation system, together with the workstations, is modeled as a network of queues with multiple job classes. Due to its complex nature, the steady-state behavior of this

  16. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among the distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini-column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expanding the radiochemical applications of FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements.
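
    Quantification from the transient detector signal, by peak area or peak height as the record notes, can be sketched generically. The Gaussian elution profile and baseline below are hypothetical; this is not the analyzer's actual software.

```python
import numpy as np

def quantify_transient(t, signal, baseline=0.0):
    """Peak area (trapezoidal rule) and peak height of a background-corrected transient."""
    corrected = np.asarray(signal, dtype=float) - baseline
    area = np.sum(0.5 * (corrected[1:] + corrected[:-1]) * np.diff(t))
    return area, corrected.max()

# Hypothetical elution peak: a Gaussian on a flat background of 2 counts/s.
t = np.linspace(0, 60, 601)
sig = 100.0 * np.exp(-0.5 * ((t - 30.0) / 5.0) ** 2) + 2.0
area, height = quantify_transient(t, sig, baseline=2.0)
```

    For low-activity samples, where the record suggests stopped-flow counting, the same area measure simply accumulates over a longer dwell time.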

  17. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    Science.gov (United States)

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and, among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so recoveries decline over time; derivatisation is therefore carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, derivatised samples did not sit waiting on the autosamplers, reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices: a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality-control plasma samples and showed that this online method was highly reproducible for most of the metabolites detected and identified. The automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.
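
    Reproducibility in the record above is assessed through the relative standard deviation (RSD) of metabolite signals across QC injections; the computation itself is simple. The peak-area values below are hypothetical.

```python
import numpy as np

def rsd_percent(values):
    """Relative standard deviation (%) of replicate peak intensities."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical peak areas of one metabolite across five QC samples.
print(round(rsd_percent([980, 1010, 995, 1005, 1010]), 1))  # → 1.3
```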

  18. Novel automated biomarker discovery work flow for urinary peptidomics

    DEFF Research Database (Denmark)

    Balog, Crina I.; Hensbergen, Paul J.; Derks, Rico

    2009-01-01

    samples from Schistosoma haematobium-infected individuals to evaluate clinical applicability. RESULTS: The automated RP-SCX sample cleanup and fractionation system exhibits a high qualitative and quantitative reproducibility, with both BSA standards and urine samples. Because of the relatively high...

  19. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Khoualdi, Kamel

    1994-01-01

    Nowadays, the supervision of industrial installations is increasingly complex, involving the automation of their control. A malfunction can generate an avalanche of alarms. The operator in charge of supervision must deal with the incident and take the right actions to recover a normal situation, but is generally drowned under the sheer number of alarms. Our aim is to build an alarm filtering system for an automated metro line that helps the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system, but using two different approaches. The first part was developed in the frame of the SARA project (an operator assistance system for an automated metro line), an expert-system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach has shown its limits. In the second part of our work, we considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach was motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge. Thus, each piece of equipment was considered an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author) [fr]

  20. Comparison Between Conventional and Automated Techniques for Blood Grouping and Crossmatching: Experience from a Tertiary Care Centre.

    Science.gov (United States)

    Bhagwat, Swarupa Nikhil; Sharma, Jayashree H; Jose, Julie; Modi, Charusmita J

    2015-01-01

    Routine immunohematological tests can be performed by automated as well as manual techniques, each with inherent advantages and disadvantages. The present study aims to compare the results of manual and automated techniques for blood grouping and crossmatching so as to validate the automated system effectively. A total of 1000 samples were subjected to blood grouping by the conventional tube technique (CTT) and by the automated microplate LYRA system on the Techno TwinStation. A total of 269 samples (from multitransfused patients and multigravida females) were compared across 927 crossmatches by the CTT in the indirect antiglobulin phase against the column agglutination technique (CAT) performed on the Techno TwinStation. For blood grouping, the study showed concordant results for 942/1000 samples (94.2%), discordant results for 4/1000 (0.4%), and uninterpretable results for 54/1000 (5.4%). On resolution, the uninterpretable results reduced to 49/1000 samples (4.9%), with 951/1000 samples (95.1%) showing concordant results. For crossmatching, the automated CAT showed concordant results in 887/927 (95.6%) and discordant results in 3/927 (0.32%) crossmatches as compared to the CTT; 37/927 (3.9%) crossmatches were not interpretable by the automated technique. The automated system shows high concordance of results with the CTT and hence can be brought into routine use. However, the high proportion of uninterpretable results emphasizes that proper training and standardization are needed prior to its use.

  1. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Bender, M.A.

    1976-01-01

    Though some, at least, of the goals of automation systems for analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or at least likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, await practical methods of automation. Important as the clinical diagnostic applications are, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed.

  2. Automation of potentiometric titration with a personal computer using ...

    African Journals Online (AJOL)

    sampling was designed and tested for automation of potentiometric titrations with personal ... automation permits us to carry out new types of experiments, such as those requiring ... have proved to be very useful in routine tasks but not in research, due to their ... This is done by a simple delay sub-routine in data acquisition.

  3. Automated gamma counters

    International Nuclear Information System (INIS)

    Regener, M.

    1977-01-01

    This is a report on the most recent developments in the full automation of gamma counting in RIA, in particular by Messrs. Kontron. The development targets were flexibility in sample capacity and test-tube shape, the possibility of using different radioisotopes for labelling thanks to an optimized detector system, and the use of microprocessors to substitute software for hardware. (ORU) [de]

  4. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off equipment or changing comfort set points at each switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies; they refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.
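
    The per-area figures in the record follow directly from the reported totals (about 1 to 2 MW of demand reduction over roughly two million square feet); the arithmetic can be stated explicitly:

```python
def demand_reduction_w_per_ft2(total_watts, floor_area_ft2):
    """Average demand-reduction intensity across the participating sites."""
    return total_watts / floor_area_ft2

print(demand_reduction_w_per_ft2(1e6, 2e6))  # average case → 0.5
print(demand_reduction_w_per_ft2(2e6, 2e6))  # simultaneous maximum → 1.0
```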

  5. Automated atomic absorption spectrophotometer, utilizing a programmable desk calculator

    International Nuclear Information System (INIS)

    Futrell, T.L.; Morrow, R.W.

    1977-01-01

    A commercial, double-beam atomic absorption spectrophotometer has been interfaced with a sample changer and a Hewlett-Packard 9810A calculator to yield a completely automated analysis system. The interface electronics can be easily constructed and should be adaptable to any double-beam atomic absorption instrument. The calculator is easily programmed and can be used for general laboratory purposes when not operating the instrument. The automated system has been shown to perform very satisfactorily when operated unattended to analyze a large number of samples. Performance statistics agree well with a manually operated instrument
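
    Unattended batch analysis of this kind rests on a linear absorbance-versus-concentration calibration in the Beer-Lambert regime. A generic least-squares sketch follows; the standards are synthetic and this is not the HP 9810A calculator program.

```python
import numpy as np

def calibrate(conc, absorbance):
    """Fit the calibration line A = m*c + b through the standards (least squares)."""
    m, b = np.polyfit(conc, absorbance, 1)
    return m, b

def concentration_of(absorbance, m, b):
    """Invert the calibration line for an unknown sample."""
    return (absorbance - b) / m

# Synthetic, exactly linear standards: sensitivity 0.1 absorbance units per ppm.
m, b = calibrate([0.0, 1.0, 2.0, 4.0], [0.0, 0.1, 0.2, 0.4])
unknown_ppm = concentration_of(0.25, m, b)  # ≈ 2.5
```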

  6. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  7. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified cross a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  8. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods.

    Science.gov (United States)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels

    2013-05-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (the PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, and the QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or with an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex), with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express, while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%), while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%), while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    Science.gov (United States)

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to use these tools efficiently to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples, which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable for processing even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  10. Automated flow cytometric analysis across large numbers of samples and cell types.

    Science.gov (United States)

    Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno

    2015-04-01

    Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
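
    FlowGM's core strategy described above, Gaussian Mixture Model clustering with the component count chosen by the Bayesian Information Criterion, can be sketched with scikit-learn. The two-population synthetic data are illustrative; this is not the FlowGM pipeline itself.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm_bic(X, max_k=6):
    """Fit GMMs for k = 1..max_k and keep the model minimizing BIC."""
    fits = [GaussianMixture(n_components=k, random_state=0).fit(X)
            for k in range(1, max_k + 1)]
    return min(fits, key=lambda g: g.bic(X))

# Two well-separated synthetic "cell populations" in a 2-marker space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),
               rng.normal(8.0, 1.0, (200, 2))])
best = fit_gmm_bic(X)
```

    BIC trades goodness of fit against model complexity, so it selects the two-component model here rather than over-splitting either population.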

  11. The gingival vein as a minimally traumatic site for multiple blood sampling in guinea pigs and hamsters.

    Science.gov (United States)

    Rodrigues, Mariana Valotta; de Castro, Simone Oliveira; de Albuquerque, Cynthia Zaccanini; Mattaraia, Vânia Gomes de Moura; Santoro, Marcelo Larami

    2017-01-01

    Laboratory animals are still necessary in scientific investigation and vaccine testing, and while novel methodological approaches are not yet available to replace them, the search for new, humane, easy, and painless methods is necessary to diminish their stress and pain. When multiple blood samples are to be collected from hamsters and guinea pigs, the scarcity of venipuncture sites, greatly reduced in these species compared with other rodents because they lack a long tail, challenges animal caregivers and researchers. This study therefore aimed to evaluate whether gingival vein puncture could be used as an additional route for obtaining multiple blood samples from anesthetized hamsters and guinea pigs without altering animal behavior, well-being, or hematological parameters. Twelve anesthetized Syrian golden hamsters and English guinea pigs were randomly allocated to two groups: a control group, whose blood samples were not collected, and an experimental group in which blood samples (200 microliters) were collected by gingival vein puncture at weekly intervals over six weeks. Clinical assessment, body weight gain, and complete blood cell counts were evaluated weekly, and control and experimental animals were euthanized at week seven, when the mentolabial region was processed for histological analyses. Multiple blood sampling from the gingival vein evoked no clinical manifestations or alterations in the behavioral repertoire, nor a statistically significant difference in weight gain, in either species. Guinea pigs showed no alteration in red blood cell, leukocyte, or platelet parameters over time. Hamsters developed a characteristic pattern of age-related physiological changes, which were considered normal. Histological analyses showed no difference in morphological structures in the interdental gingiva, confirming that multiple blood sampling is minimally traumatic. These results evidence that blood collection from multiple

  12. Technology Transfer Opportunities: Automated Ground-Water Monitoring

    Science.gov (United States)

    Smith, Kirk P.; Granato, Gregory E.

    1997-01-01

    Introduction: A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry has increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include (but are not limited to): monitoring ground-water quality for research; monitoring known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored; and serving as an early warning system monitoring ground-water quality near public water-supply wells.

  13. Life Sciences Research Facility automation requirements and concepts for the Space Station

    Science.gov (United States)

    Rasmussen, Daryl N.

    1986-01-01

    An evaluation is made of the methods and preliminary results of a study on prospects for the automation of the NASA Space Station's Life Sciences Research Facility. In order to remain within current Space Station resource allocations, approximately 85 percent of planned life science experiment tasks must be automated; these tasks encompass specimen care and feeding, cage and instrument cleaning, data acquisition and control, sample analysis, waste management, instrument calibration, materials inventory and management, and janitorial work. Task automation will free crews for specimen manipulation, tissue sampling, data interpretation and communication with ground controllers, and experiment management.

  14. Manual for the Epithermal Neutron Multiplicity Detector (ENMC) for Measurement of Impure MOX and Plutonium Samples

    International Nuclear Information System (INIS)

    Menlove, H. O.; Rael, C. D.; Kroncke, K. E.; DeAguero, K. J.

    2004-01-01

    We have designed a high-efficiency neutron detector for passive neutron coincidence and multiplicity counting of dirty scrap and bulk samples of plutonium. The counter will be used for the measurement of impure plutonium samples at the JNC MOX fabrication facility in Japan. The counter can also be used to create working standards from bulk process MOX. The detector uses advanced-design ³He tubes to increase the efficiency and to shorten the neutron die-away time. The efficiency is 64% and the die-away time is 19.1 µs. The Epithermal Neutron Multiplicity Counter (ENMC) is designed for high-precision measurements of bulk plutonium samples with diameters of less than 200 mm. The average neutron energy from the sample can be measured using the ratio of counts in the inner ring of ³He tubes to those in the outer ring. This report describes the hardware, performance, and calibration for the ENMC.
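
    For context on why a short die-away time matters in coincidence counting: for a single-exponential die-away, the doubles gate fraction is commonly modeled as f_d = exp(-P/tau) * (1 - exp(-G/tau)), where P is the predelay and G the coincidence gate width. Only tau = 19.1 µs comes from the record; the P and G values below are illustrative assumptions.

```python
import math

def doubles_gate_fraction(predelay_us, gate_us, die_away_us):
    """Gate utilization f_d = exp(-P/tau) * (1 - exp(-G/tau)), single-exponential die-away."""
    tau = die_away_us
    return math.exp(-predelay_us / tau) * (1.0 - math.exp(-gate_us / tau))

# ENMC die-away time from the record (19.1 µs); P and G are hypothetical settings.
print(round(doubles_gate_fraction(1.5, 24.0, 19.1), 3))  # → 0.661
```

    A shorter die-away time lets a gate of a given width capture a larger fraction of correlated pairs, which is the design motivation stated in the record.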

  15. Implementing New-age Authentication Techniques using OpenID for Security Automation

    OpenAIRE

    Dharmendra Choukse; Umesh Kumar Singh; Deepak Sukheja; Rekha Shahapurkar

    2010-01-01

    Security of any software can be enhanced many-fold if multiple factors for authorization and authentication are used. The main aim of this work was to design and implement an Academy Automation Software for IPS Academy that uses OpenID and Windows CardSpace as authentication techniques in addition to a Role-Based Authentication (RBA) system, to ensure that only authentic users can access the predefined roles as per their authorization level. The automation covers different computing hardware an...

  16. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
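
    Steps (2) through (5) of the linked pipeline above can be outlined generically; the signals are synthetic and the functions are an illustrative sketch, not the published algorithm.

```python
import numpy as np

def band_waveform(x, fs, lo, hi):
    # Step 2: crude band-limited waveform via FFT masking outside [lo, hi] Hz.
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=len(x))

def detect_spikes(x, thresh):
    # Steps 3-4: indices of upward threshold crossings (a simple spike count).
    above = x > thresh
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

def psd(x, fs):
    # Step 5: periodogram estimate of the power spectral density.
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))

fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t)     # synthetic 10 Hz oscillation
eeg[[100, 500, 900]] += 5.0          # three artificial "spikes"
spikes = detect_spikes(eeg, 2.0)
freqs, power = psd(band_waveform(eeg, fs, 8.0, 12.0), fs)
```

    Chaining the functions, as the record's algorithm links its processing steps, is what removes the per-recording manual handling.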

  17. Murine Automated Urine Sampler (MAUS), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal outlines planned development for a low-power, low-mass automated urine sample collection and preservation system for small mammals, capable of...

  18. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for small-angle scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction... and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  19. NASA Systems Autonomy Demonstration Project - Development of Space Station automation technology

    Science.gov (United States)

    Bull, John S.; Brown, Richard; Friedland, Peter; Wong, Carla M.; Bates, William

    1987-01-01

    A 1984 Congressional expansion of the 1958 National Aeronautics and Space Act mandated that NASA conduct programs, as part of the Space Station program, which will yield the U.S. material benefits, particularly in the areas of advanced automation and robotics systems. Demonstration programs are scheduled for automated systems such as the thermal control, expert system coordination of Station subsystems, and automation of multiple subsystems. The programs focus the R&D efforts and provide a gateway for transfer of technology to industry. The NASA Office of Aeronautics and Space Technology is responsible for directing, funding and evaluating the Systems Autonomy Demonstration Project, which will include simulated interactions between novice personnel and astronauts and several automated, expert subsystems to explore the effectiveness of the man-machine interface being developed. Features and progress on the TEXSYS prototype thermal control system expert system are outlined.

  20. Sample Size Calculation for Estimating or Testing a Nonzero Squared Multiple Correlation Coefficient

    Science.gov (United States)

    Krishnamoorthy, K.; Xia, Yanping

    2008-01-01

    The problems of hypothesis testing and interval estimation of the squared multiple correlation coefficient of a multivariate normal distribution are considered. It is shown that available one-sided tests are uniformly most powerful, and the one-sided confidence intervals are uniformly most accurate. An exact method of calculating sample size to…

  1. Miniaturized bead-beating device to automate full DNA sample preparation processes for gram-positive bacteria.

    Science.gov (United States)

    Hwang, Kyu-Youn; Kwon, Sung Hong; Jung, Sun-Ok; Lim, Hee-Kyun; Jung, Won-Jong; Park, Chin-Sung; Kim, Joon-Ho; Suh, Kahp-Yang; Huh, Nam

    2011-11-07

We have developed a miniaturized bead-beating device to automate nucleic acids extraction from Gram-positive bacteria for molecular diagnostics. The microfluidic device was fabricated by sandwiching a monolithic flexible polydimethylsiloxane (PDMS) membrane between two glass wafers (i.e., glass-PDMS-glass), which acted as an actuator for bead collision via its pneumatic vibration without additional lysis equipment. The Gram-positive bacteria, S. aureus and methicillin-resistant S. aureus, were captured on surface-modified glass beads from 1 mL of initial sample solution and lysed in situ by the bead-beating operation. Then, 10 μL or 20 μL of bacterial DNA solution was eluted and amplified successfully by real-time PCR. It was found that liquid volume fraction played a crucial role in determining the cell lysis efficiency in a confined chamber by facilitating membrane deflection and bead motion. The miniaturized bead-beating operation disrupted most of the S. aureus within 3 min, which turned out to be as efficient as the conventional benchtop vortexing machine or the enzyme-based lysis technique. The effective cell concentration was significantly enhanced with the reduction of initial sample volume by 50 or 100 times. Combination of such analyte enrichment and in situ bead-beating lysis provided an excellent PCR detection sensitivity amounting to ca. 46 CFU even for the Gram-positive bacteria. The proposed bead-beating microdevice is potentially useful as a nucleic acid extraction method toward a PCR-based sample-to-answer system. This journal is © The Royal Society of Chemistry 2011
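The 50- to 100-fold enrichment quoted above is simply the ratio of input volume to elution volume; a one-line check (helper name is ours):

```python
# Analyte enrichment when capturing from 1 mL and eluting into 10-20 uL:
# the effective cell density scales with the volume ratio.

def enrichment_factor(input_volume_ul, elution_volume_ul):
    """Fold-concentration from capture volume to elution volume."""
    return input_volume_ul / elution_volume_ul

print(enrichment_factor(1000, 10), enrichment_factor(1000, 20))  # 100.0 50.0
```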

  2. AST: an automated sequence-sampling method for improving the taxonomic diversity of gene phylogenetic trees.

    Science.gov (United States)

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it to solve four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php.
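The core idea, choosing a subset of homologs that maximizes the number of taxa represented, can be sketched as a greedy pass over the pool. The real AST tool uses more elaborate criteria; all names and data here are illustrative:

```python
# Hedged sketch of taxonomy-aware subsampling: prefer one representative
# per taxon before allowing any taxon to repeat.

def sample_by_taxonomy(seqs, k):
    """seqs: list of (sequence_id, taxon) pairs. Returns up to k ids,
    covering as many distinct taxa as possible before repeating any."""
    chosen, seen_taxa = [], set()
    # First pass: one representative per taxon.
    for sid, taxon in seqs:
        if len(chosen) == k:
            return chosen
        if taxon not in seen_taxa:
            chosen.append(sid)
            seen_taxa.add(taxon)
    # Second pass: fill remaining slots from over-represented taxa.
    for sid, taxon in seqs:
        if len(chosen) == k:
            break
        if sid not in chosen:
            chosen.append(sid)
    return chosen

pool = [("s1", "E. coli"), ("s2", "E. coli"), ("s3", "B. subtilis"),
        ("s4", "A. thaliana"), ("s5", "E. coli")]
picked = sample_by_taxonomy(pool, k=3)
print(picked)  # one sequence from each of the three taxa
```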

  3. A fully automated mass spectrometer for the analysis of organic solids

    International Nuclear Information System (INIS)

    Hillig, H.; Kueper, H.; Riepe, W.

    1979-01-01

    Automation of a mass spectrometer-computer system makes it possible to process up to 30 samples without attention after sample loading. An automatic sample changer introduces the samples successively into the ion source by means of a direct inlet probe. A process control unit determines the operation sequence. Computer programs are available for the hardware support, system supervision and evaluation of the spectrometer signals. The most essential precondition for automation - automatic evaporation of the sample material by electronic control of the total ion current - is confirmed to be satisfactory. The system operates routinely overnight in an industrial laboratory, so that day work can be devoted to difficult analytical problems. The cost of routine analyses is halved. (Auth.)

  4. Automated Registration Of Images From Multiple Sensors

    Science.gov (United States)

    Rignot, Eric J. M.; Kwok, Ronald; Curlander, John C.; Pang, Shirley S. N.

    1994-01-01

    Images of terrain scanned in common by multiple Earth-orbiting remote sensors registered automatically with each other and, where possible, on geographic coordinate grid. Simulated image of terrain viewed by sensor computed from ancillary data, viewing geometry, and mathematical model of physics of imaging. In proposed registration algorithm, simulated and actual sensor images matched by area-correlation technique.
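The area-correlation matching mentioned above can be illustrated in one dimension: slide a template over a signal and keep the offset with the highest normalized correlation. All names and data are ours, not from the algorithm described:

```python
# 1-D sketch of area-correlation registration: exhaustive search for the
# offset maximizing normalized cross-correlation (NCC).
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db) if da and db else 0.0

def best_offset(signal, template):
    """Offset of the window in signal best matching template."""
    n, m = len(signal), len(template)
    scores = [ncc(signal[i:i + m], template) for i in range(n - m + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

signal = [0, 0, 0, 1, 3, 7, 3, 1, 0, 0]
template = [1, 3, 7, 3, 1]
print(best_offset(signal, template))  # template matches exactly at offset 3
```

In the 2-D sensor-image case the same search runs over row and column shifts, typically on a simulated image generated from the terrain model.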

  5. Automated protein structure calculation from NMR data

    International Nuclear Information System (INIS)

    Williamson, Mike P.; Craven, C. Jeremy

    2009-01-01

    Current software is almost at the stage to permit completely automatic structure determination of small proteins of <15 kDa, from NMR spectra to structure validation with minimal user interaction. This goal is welcome, as it makes structure calculation more objective and therefore more easily validated, without any loss in the quality of the structures generated. Moreover, it releases expert spectroscopists to carry out research that cannot be automated. It should not take much further effort to extend automation to ca 20 kDa. However, there are technological barriers to further automation, of which the biggest are identified as: routines for peak picking; adoption and sharing of a common framework for structure calculation, including the assembly of an automated and trusted package for structure validation; and sample preparation, particularly for larger proteins. These barriers should be the main target for development of methodology for protein structure determination, particularly by structural genomics consortia

  6. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    OpenAIRE

    K. Kitamura; N. Kochi; S. Kaneko

    2012-01-01

    In this paper we present a novel method for the registration of point cloud data obtained using terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, individual point cloud need to be registered to the same coordinate system. We propose in this paper an automated feature based registration procedure. Our proposed method does not re...

  7. The Accutension Stetho, an automated auscultatory device to validate automated sphygmomanometer readings in individual patients.

    Science.gov (United States)

    Alpert, Bruce S

    2018-04-06

The aim of this report is to describe a new device that can validate, by automated auscultation, individual blood pressure (BP) readings taken by automated sphygmomanometers. The Accutension Stetho utilizes a smartphone application in conjunction with a specially designed stethoscope that interfaces directly into the smartphone via the earphone jack. The Korotkoff sounds are recorded by the application and are analyzed by the operator on the screen of the smartphone simultaneously with the images from the sphygmomanometer screen during BP estimation. Current auscultatory validation standards require at least 85 subjects and strict statistical criteria for passage. A device that passes can make no guarantee of accuracy on individual patients. The Accutension Stetho is an inexpensive smartphone/stethoscope kit combination that estimates precise BP values by auscultation to confirm the accuracy of an automated sphygmomanometer's readings on individual patients. This should be of great value for both professional and, in certain circumstances, self-measurement BP. Patients will avoid both unnecessary treatment and errors of underestimation of BP, in which the patient requires therapy. The Stetho's software has been validated in an independent ANSI/AAMI/ISO standard study. The Stetho has been shown to perform without difficulty with multiple deflation-based devices by many manufacturers.

  8. Analysis of multiple single nucleotide polymorphisms (SNP) on DNA traces from plasma and dried blood samples

    NARCIS (Netherlands)

    Catsburg, Arnold; van der Zwet, Wil C.; Morre, Servaas A.; Ouburg, Sander; Vandenbroucke-Grauls, Christina M. J. E.; Savelkoul, Paul H. M.

    2007-01-01

    Reliable analysis of single nucleotide polymorphisms (SNPs) in DNA derived from samples containing low numbers of cells or from suboptimal sources can be difficult. A new procedure to characterize multiple SNPs in traces of DNA from plasma and old dried blood samples was developed. Six SNPs in the

  9. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    Science.gov (United States)

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculations of spiking solutions and the matrix solutions preparation scheme, the actual spiking and matrix solutions preparations, as well as the flexible sample extraction procedures after incubation. In addition, the platform also automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process the whole class of assays of varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
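The curve fitting and IC50 computation that the platform automates typically rest on the four-parameter logistic (4PL) model. A minimal sketch using the generic parameter conventions (bottom, top, ic50, hill), which are not taken from the ICECAP paper:

```python
# 4PL dose-response model and its inverse; at the half-maximal response
# the inverse recovers the IC50 parameter itself.
import math

def four_pl(conc, bottom, top, ic50, hill):
    """Response at a given concentration under the 4PL model."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

def inverse_four_pl(resp, bottom, top, ic50, hill):
    """Concentration producing a given response (inverse of four_pl)."""
    return ic50 * ((top - bottom) / (resp - bottom) - 1.0) ** (1.0 / hill)

bottom, top, ic50, hill = 0.0, 100.0, 2.5, 1.2
half = four_pl(ic50, bottom, top, ic50, hill)  # = 50.0, the midpoint
print(half, inverse_four_pl(half, bottom, top, ic50, hill))
```

In an automated pipeline these four parameters would come from nonlinear regression over the measured concentration-response points rather than being set by hand.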

  10. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    Science.gov (United States)

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation process were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good accordance. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample. © The Author(s) 2016.

  11. Smart management of sample dilution using an artificial neural network to achieve streamlined processes and saving resources: the automated nephelometric testing of serum free light chain as case study.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

Saving resources is a paramount issue for the modern laboratory, and new trainable as well as smart technologies can be used to allow the automated instrumentation to manage samples more efficiently in order to achieve streamlined processes. In this regard the serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires a number of assays before achieving an acceptable result within the analytical range. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) wasted tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although it was restricted to follow-up samples, the MLP-ANN showed good predictive performance, which alongside the possibility to implement it in any automated system, made it a suitable solution for achieving streamlined laboratory processes and saving resources.
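The decision the MLP-ANN learns can be pictured as choosing the smallest starting dilution whose reading falls inside the analytical range, given a concentration expected from the patient's LIS history. The dilution factors and range below are invented for illustration; the savings mechanism, not the network itself, is what this sketch shows:

```python
# Dilution selection versus the naive stepwise scheme. Range and factors
# are hypothetical, not the N Latex FLC kit's specifications.

ANALYTICAL_RANGE = (5.0, 190.0)      # mg/L, hypothetical on-board range
DILUTIONS = [1, 5, 25, 125, 625]     # hypothetical stepwise factors

def starting_dilution(expected_conc):
    """Smallest dilution bringing the expected reading into range."""
    high = ANALYTICAL_RANGE[1]
    for d in DILUTIONS:
        if expected_conc / d <= high:
            return d
    return DILUTIONS[-1]

def wasted_tests_naive(true_conc):
    """Assays burned by always starting at the lowest dilution."""
    high = ANALYTICAL_RANGE[1]
    return sum(1 for d in DILUTIONS if true_conc / d > high)

# A follow-up sample expected near 3000 mg/L skips two wasted assays.
print(starting_dilution(3000.0), wasted_tests_naive(3000.0))
```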

  12. A replica exchange transition interface sampling method with multiple interface sets for investigating networks of rare events

    Science.gov (United States)

    Swenson, David W. H.; Bolhuis, Peter G.

    2014-07-01

    The multiple state transition interface sampling (TIS) framework in principle allows the simulation of a large network of complex rare event transitions, but in practice suffers from convergence problems. To improve convergence, we combine multiple state TIS [J. Rogal and P. G. Bolhuis, J. Chem. Phys. 129, 224107 (2008)] with replica exchange TIS [T. S. van Erp, Phys. Rev. Lett. 98, 268301 (2007)]. In addition, we introduce multiple interface sets, which allow more than one order parameter to be defined for each state. We illustrate the methodology on a model system of multiple independent dimers, each with two states. For reaction networks with up to 64 microstates, we determine the kinetics in the microcanonical ensemble, and discuss the convergence properties of the sampling scheme. For this model, we find that the kinetics depend on the instantaneous composition of the system. We explain this dependence in terms of the system's potential and kinetic energy.
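In TIS-based methods, the rate constant factorizes into the flux through the first interface times the product of conditional interface-crossing probabilities, which is what the interface sets above are sampling. A sketch with invented numbers, not values from the dimer model system:

```python
# k = flux0 * prod_i P(lambda_{i+1} | lambda_i), the standard TIS rate
# expression; the flux and probabilities below are illustrative.
from functools import reduce

def tis_rate(flux0, crossing_probs):
    """Rate constant from first-interface flux and conditional crossings."""
    total = reduce(lambda acc, p: acc * p, crossing_probs, 1.0)
    return flux0 * total

flux0 = 0.02                     # crossings of lambda_0 per unit time
probs = [0.5, 0.2, 0.1, 0.05]    # conditional crossing probabilities
print(tis_rate(flux0, probs))    # 0.02 * 5e-4 = 1e-05
```

Replica exchange between interface ensembles improves the sampling of each conditional probability; it does not change this expression.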

  13. Decision Making In A High-Tech World: Automation Bias and Countermeasures

    Science.gov (United States)

    Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management systems computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples. Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and

  14. A semi-automated magnetic capture probe based DNA extraction and real-time PCR method applied in the Swedish surveillance of Echinococcus multilocularis in red fox (Vulpes vulpes) faecal samples.

    Science.gov (United States)

    Isaksson, Mats; Hagström, Åsa; Armua-Fernandez, Maria Teresa; Wahlström, Helene; Ågren, Erik Olof; Miller, Andrea; Holmberg, Anders; Lukacs, Morten; Casulli, Adriano; Deplazes, Peter; Juremalm, Mikael

    2014-12-19

Following the first finding of Echinococcus multilocularis in Sweden in 2011, 2985 red foxes (Vulpes vulpes) were analysed by the segmental sedimentation and counting technique. This is a labour intensive method and requires handling of the whole carcass of the fox, resulting in a costly analysis. In an effort to reduce the cost of labour and sample handling, an alternative method has been developed. The method is sensitive and partially automated for detection of E. multilocularis in faecal samples. The method has been used in the Swedish E. multilocularis monitoring program for 2012-2013 on more than 2000 faecal samples. We describe a new semi-automated magnetic capture probe DNA extraction method and real-time hydrolysis probe polymerase chain reaction assay (MC-PCR) for the detection of E. multilocularis DNA in faecal samples from red fox. The diagnostic sensitivity was determined by validating the new method against the sedimentation and counting technique in fox samples collected in Switzerland where E. multilocularis is highly endemic. Of 177 foxes analysed by the sedimentation and counting technique, E. multilocularis was detected in 93 animals. Eighty-two (88%, 95% C.I. 79.8-93.9) of these were positive in the MC-PCR. In foxes with more than 100 worms, the MC-PCR was positive in 44 out of 46 (95.7%) cases. The two MC-PCR negative samples originated from foxes with only immature E. multilocularis worms. In foxes with 100 worms or less (n = 47), 38 (80.9%) were positive in the MC-PCR. The diagnostic specificity of the MC-PCR was evaluated using fox scats collected within the Swedish screening. Of 2158 samples analysed, two were positive. This implies that the specificity is at least 99.9% (C.I. = 99.7-100). The MC-PCR proved to have a high sensitivity and a very high specificity. The test is partially automated but also possible to perform manually if desired. The test is well suited for nationwide E. multilocularis surveillance programs where sampling
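The headline figures can be re-derived from the counts in the record. Treating the two screening positives as the only potential false positives is an assumption for illustration (the abstract gives specificity as a lower bound):

```python
# Sensitivity: 82 of 93 sedimentation-positive foxes detected by MC-PCR.
# Specificity: at most 2 of 2158 screening samples falsely positive.

def sensitivity(true_pos, condition_pos):
    return true_pos / condition_pos

def specificity(true_neg, condition_neg):
    return true_neg / condition_neg

sens = sensitivity(82, 93)          # ~0.88, as reported
spec = specificity(2158 - 2, 2158)  # ~0.999, "at least 99.9%"
print(round(sens, 3), round(spec, 4))
```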

  15. Open-Source Automated Mapping Four-Point Probe

    Directory of Open Access Journals (Sweden)

    Handy Chandra

    2017-01-01

Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results of resistors from 10 to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.

  16. Open-Source Automated Mapping Four-Point Probe.

    Science.gov (United States)

    Chandra, Handy; Allen, Spencer W; Oberloier, Shane W; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M

    2017-01-26

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results of resistors from 10 to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.
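Sheet resistance from a collinear four-point probe follows the standard thin-sheet formula Rs = (π / ln 2) · V/I ≈ 4.532 · V/I. A minimal sketch of that conversion, not code from the OS4PP project:

```python
# Ideal infinite-thin-sheet four-point probe conversion: force current
# through the outer pins, read voltage across the inner pins.
import math

def sheet_resistance(voltage, current):
    """Sheet resistance (ohms per square) for an ideal thin sheet."""
    return (math.pi / math.log(2)) * (voltage / current)

# 1 mA forced, 10 mV measured -> ~45.32 ohm/sq.
rs = sheet_resistance(10e-3, 1e-3)
print(round(rs, 2))
```

Real measurements on finite samples apply additional geometric correction factors, which is one reason mapping tools compare against known resistors.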

  17. A Chip-Capillary Hybrid Device for Automated Transfer of Sample Pre-Separated by Capillary Isoelectric Focusing to Parallel Capillary Gel Electrophoresis for Two-Dimensional Protein Separation

    Science.gov (United States)

    Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong

    2012-01-01

    In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate – polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584

  18. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

This book describes how to develop an automation plan and design automation facilities. It covers automation of cutting and chip processes, including the basics of cutting, NC machining and chip handling; automation units such as drilling, tapping, boring, milling and slide units; hydraulic applications, including the characteristics of oil pressure and basic hydraulic circuits; pneumatic applications; and the types and applications of automation in processes such as assembly, transportation, automatic machines and factory automation.

  19. Comparison of semiautomated bird song recognition with manual detection of recorded bird song samples

    Directory of Open Access Journals (Sweden)

    Lisa A. Venier

    2017-12-01

Automated recording units are increasingly being used to sample wildlife populations. These devices can produce large amounts of data that are difficult to process manually. However, the information in the recordings can be summarized with semiautomated sound recognition software. Our objective was to assess the utility of semiautomated bird song recognizers to produce data useful for conservation and sustainable forest management applications. We compared detection data generated from expert-interpreted recordings of bird songs collected with automated recording units and data derived from a semiautomated recognition process. We recorded bird songs at 109 sites in boreal forest in 2013 and 2014 using automated recording units. We developed bird-song recognizers for 10 species using Song Scope software (Wildlife Acoustics), and each recognizer was used to scan a set of recordings that was also interpreted manually by an expert in birdsong identification. We used occupancy models to estimate the detection probability associated with each method. Based on these detection probability estimates we produced cumulative detection probability curves. In a second analysis we estimated detection probability of bird song recognizers using multiple 10-minute recordings for a single station and visit (35-63 10-minute recordings in each of four one-week periods). Results show that the detection probability of most species from single 10-minute recordings is substantially higher using expert-interpreted bird song recordings than using the song recognizer software. However, our results also indicate that detection probabilities for song recognizers can be significantly improved by using more than a single 10-minute recording, which can be easily done with little additional cost with the automated procedure. Based on these results we suggest that automated recording units and song recognizer software can be valuable tools to estimate detection probability and
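The gain from scanning several recordings per visit follows directly from the cumulative detection probability 1 − (1 − p)^n, assuming independent recordings. The values below are illustrative, not the paper's estimates:

```python
# Cumulative detection probability over n independent recordings.

def cumulative_detection(p_single, n_recordings):
    """Probability of at least one detection in n independent recordings."""
    return 1.0 - (1.0 - p_single) ** n_recordings

# A recognizer detecting a species in 15% of single 10-minute recordings
# exceeds 80% cumulative detection once ten recordings are scanned.
p10 = cumulative_detection(0.15, 10)
print(round(p10, 3))
```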

  20. Sample size requirements for one-year treatment effects using deep gray matter volume from 3T MRI in progressive forms of multiple sclerosis.

    Science.gov (United States)

    Kim, Gloria; Chu, Renxin; Yousuf, Fawad; Tauhid, Shahamat; Stazzone, Lynn; Houtchens, Maria K; Stankiewicz, James M; Severson, Christopher; Kimbrough, Dorlan; Quintana, Francisco J; Chitnis, Tanuja; Weiner, Howard L; Healy, Brian C; Bakshi, Rohit

    2017-11-01

    The subcortical deep gray matter (DGM) develops selective, progressive, and clinically relevant atrophy in progressive forms of multiple sclerosis (PMS). This patient population is the target of active neurotherapeutic development, requiring the availability of outcome measures. We tested a fully automated MRI analysis pipeline to assess DGM atrophy in PMS. Consistent 3D T1-weighted high-resolution 3T brain MRI was obtained over one year in 19 consecutive patients with PMS [15 secondary progressive, 4 primary progressive, 53% women, age (mean±SD) 50.8±8.0 years, Expanded Disability Status Scale (median, range) 5.0, 2.0-6.5)]. DGM segmentation applied the fully automated FSL-FIRST pipeline ( http://fsl.fmrib.ox.ac.uk ). Total DGM volume was the sum of the caudate, putamen, globus pallidus, and thalamus. On-study change was calculated using a random-effects linear regression model. We detected one-year decreases in raw [mean (95% confidence interval): -0.749 ml (-1.455, -0.043), p = 0.039] and annualized [-0.754 ml/year (-1.492, -0.016), p = 0.046] total DGM volumes. A treatment trial for an intervention that would show a 50% reduction in DGM brain atrophy would require a sample size of 123 patients for a single-arm study (one-year run-in followed by one-year on-treatment). For a two-arm placebo-controlled one-year study, 242 patients would be required per arm. The use of DGM fraction required more patients. The thalamus, putamen, and globus pallidus, showed smaller effect sizes in their on-study changes than the total DGM; however, for the caudate, the effect sizes were somewhat larger. DGM atrophy may prove efficient as a short-term outcome for proof-of-concept neurotherapeutic trials in PMS.
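The per-arm figure quoted above comes from a standard two-sample power calculation of the form n = 2 · ((z_{α/2} + z_β) · σ / δ)². The standard deviation below is invented for illustration; the study's variance estimates are not given in the abstract:

```python
# Two-arm sample size for a continuous endpoint at alpha = 0.05
# (two-sided) and 80% power. sd and delta are assumed values.
import math

Z_ALPHA_2 = 1.96   # two-sided alpha = 0.05
Z_BETA = 0.84      # power = 0.80

def n_per_arm(sd, delta):
    """Subjects per arm to detect a mean difference delta given sd."""
    return math.ceil(2 * ((Z_ALPHA_2 + Z_BETA) * sd / delta) ** 2)

# Detecting 50% slowing of a -0.754 ml/year atrophy rate (delta =
# 0.377 ml/year) with an assumed between-subject sd of 1.5 ml/year:
print(n_per_arm(sd=1.5, delta=0.377))
```

The required n grows with the square of the sd-to-effect ratio, which is why the noisier DGM-fraction measure needed more patients than raw volume.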

  1. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Automation of multiple neutral beam injector controls at Lawrence Livermore Laboratory

    International Nuclear Information System (INIS)

    Pollock, G.G.

    1977-01-01

    The computer control system used on the twelve Neutral Beams of the 2XIIB experiment at the Lawrence Livermore Laboratory (LLL) has evolved over the last three years. It is now in its final form and in regular use. It provides automatic data collection, reduction, and graphics presentation, as well as automatic conditioning, automatic normal operation, and processing of calorimeter data. This paper presents an overview of the capabilities and implementation of the current system, a detailed discussion of the automatic conditioning algorithm, and the future directions for neutral beam automation.

  3. Quality-assurance techniques used with automated analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Killian, E.W.; Koeppen, L.D.; Femec, D.A.

    1994-01-01

    In the course of developing gamma-ray spectrum analysis algorithms for use by the Radiation Measurements Laboratory at the Idaho National Engineering Laboratory (INEL), several techniques have been developed that enhance and verify the quality of the analytical results. The use of these quality-assurance techniques is critical when gamma-ray analysis results from low-level environmental samples are used in risk assessment or site restoration and cleanup decisions. This paper describes four of the quality-assurance techniques that are in routine use at the laboratory. They are used for all types of samples, from reactor effluents to environmental samples. The techniques include: (1) the use of precision pulsers (with subsequent removal) to validate the correct operation of the spectrometer electronics for each and every spectrum acquired, (2) the use of naturally occurring and cosmically induced radionuclides in samples to help verify that the data acquisition and analysis were performed properly, (3) the use of an ambient background correction technique that involves superimposing ("mapping") sample photopeak fitting parameters onto multiple background spectra for accurate and more consistent quantification of the background activities, and (4) the use of interactive, computer-driven graphics to review the automated locating and fitting of photopeaks and to allow for manual fitting of photopeaks.

  4. Automated Quantum Mechanical Predictions of Enantioselectivity in a Rhodium-Catalyzed Asymmetric Hydrogenation.

    Science.gov (United States)

    Guan, Yanfei; Wheeler, Steven E

    2017-07-24

    A computational toolkit (AARON: An automated reaction optimizer for new catalysts) is described that automates the density functional theory (DFT) based screening of chiral ligands for transition-metal-catalyzed reactions with well-defined reaction mechanisms but multiple stereocontrolling transition states. This is demonstrated for the Rh-catalyzed asymmetric hydrogenation of (E)-β-aryl-N-acetyl enamides, for which a new C2-symmetric phosphorus ligand is designed. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Automated waste canister docking and emplacement using a sensor-based intelligent controller

    International Nuclear Information System (INIS)

    Drotning, W.D.

    1992-08-01

    A sensor-based intelligent control system is described that utilizes a multiple degree-of-freedom robotic system for the automated remote manipulation and precision docking of large payloads such as waste canisters. Computer vision and ultrasonic proximity sensing are used to control the automated precision docking of a large object with a passive target cavity. Real-time sensor processing and model-based analysis are used to control payload position to a precision of ±0.5 millimeter.
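The docking behavior described, repeatedly correcting the payload position from sensor-measured offsets until it lies within tolerance, can be illustrated by a simple proportional control loop. This is a minimal one-dimensional sketch, not the controller used in the paper; the `gain`, `tol`, and sensing model are illustrative assumptions.

```python
def dock(target, start, gain=0.5, tol=0.5, max_steps=100):
    """Proportional correction loop: each cycle, move a fraction of the
    offset reported by vision/proximity sensing until within tol (mm)."""
    pos = start
    for step in range(max_steps):
        error = target - pos          # measured offset to the target cavity
        if abs(error) < tol:
            return pos, step          # docked within the ±0.5 mm band
        pos += gain * error           # commanded correction
    raise RuntimeError("failed to converge")
```

With a gain below 1 the error shrinks geometrically, so convergence into a ±0.5 mm band from a 100 mm initial offset takes only a handful of correction cycles.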

  6. Automated extraction of radiation dose information from CT dose report images.

    Science.gov (United States)

    Li, Xinhua; Zhang, Da; Liu, Bob

    2011-06-01

    The purpose of this article is to describe the development of an automated tool for retrieving texts from CT dose report images. Optical character recognition was adopted to perform text recognitions of CT dose report images. The developed tool is able to automate the process of analyzing multiple CT examinations, including text recognition, parsing, error correction, and exporting data to spreadsheets. The results were precise for total dose-length product (DLP) and were about 95% accurate for CT dose index and DLP of scanned series.
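The recognize-parse-correct step can be sketched as follows. The report layout, field labels, and the O→0 / l→1 substitutions are assumptions for illustration only, not the tool's actual dose-report format or error-correction rules.

```python
import re

# Hypothetical OCR output for one scanned series; note the "O" misread in 12.04.
ocr_text = """
Series 2  Helical  CTDIvol (mGy): 12.O4  DLP (mGy-cm): 481.6
Total DLP (mGy-cm): 523.9
"""

def fix_ocr_digits(s):
    """Correct common character-recognition errors inside numbers."""
    return s.replace("O", "0").replace("l", "1")

def extract_dose(text):
    """Pull CTDIvol, per-series DLP, and total DLP out of recognized text."""
    values = {}
    for key, pattern in [("CTDIvol", r"CTDIvol \(mGy\):\s*([\dOl.]+)"),
                         ("DLP", r"(?<!Total )DLP \(mGy-cm\):\s*([\dOl.]+)"),
                         ("Total DLP", r"Total DLP \(mGy-cm\):\s*([\dOl.]+)")]:
        m = re.search(pattern, text)
        if m:
            values[key] = float(fix_ocr_digits(m.group(1)))
    return values
```

Exporting `values` rows across multiple examinations to a spreadsheet would then complete the pipeline the abstract describes.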

  7. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in performing analytical characterization of high-level mixed waste from Hanford's single-shell and double-shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively coupled plasma atomic emission spectroscopy, total organic carbon methods and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample-preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from the different treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single-shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed.

  8. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    Allied Signal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented.

  9. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    Science.gov (United States)

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided the flexibility in adapting to various LBA formats and the significant time saving in script writing and scientist training. Data generated by the automated process were comparable to those by manual process while the bioanalytical productivity was significantly improved using the modular robotic scripts.
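As an illustration of the kind of logic a "minimum required dilution" module encapsulates, the helper below splits one large dilution factor into serial steps a liquid handler can pipette accurately. It is a hypothetical sketch, not the published Tecan Freedom EVO scripts; the step limit and volumes are illustrative.

```python
def dilution_steps(start_dilution, target_dilution, max_step=10):
    """Split one large dilution into serial factors no greater than
    max_step, since a single huge dilution transfer is imprecise."""
    total = target_dilution / start_dilution
    steps = []
    while total > max_step:
        steps.append(max_step)
        total /= max_step
    steps.append(total)
    return steps

def transfer_volumes(final_volume, steps):
    """Sample volume (µL) to transfer into each step's diluent-filled well."""
    return [round(final_volume / f, 2) for f in steps]
```

For example, a 1:200 minimum required dilution into 200 µL wells becomes three pipettable steps (1:10, 1:10, 1:2), each well receiving the listed sample volume topped up with diluent.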

  10. Challenges in Gaining Large Scale Carbon Reductions through Wireless Home Automation Systems

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Rovsing, Poul Ejnar; Toftegaard, Thomas Skjødeberg

    2010-01-01

    Buildings account for more than 35% of the energy consumption in Europe. Therefore a step towards a more sustainable lifestyle is to use home automation to optimize the energy consumption “automatically”. This paper reports on the usage and some of the remaining challenges of especially wireless but also powerline communication in a home automation setting. For many years, home automation has been visible to many, but accessible to only a few, because of inadequate integration of systems. A vast number of both standard and proprietary communication protocols are used, and systems are often difficult to install and configure, so professional assistance is needed. In this paper we report on our experience in constructing an open universal home automation framework enabling interoperability of multiple communication protocols. The framework can easily be expanded in order to support...

  11. A fully automated fast analysis system for capillary gas chromatography. Part 1. Automation of system control

    NARCIS (Netherlands)

    Snijders, H.M.J.; Rijks, J.P.E.M.; Bombeeck, A.J.; Rijks, J.A.; Sandra, P.; Lee, M.L.

    1992-01-01

    This paper deals with the design, automation and evaluation of a high-speed capillary gas chromatographic system. A combination of software and hardware was developed for a new cold trap/reinjection device that allows selective solvent elimination and on-column sample enrichment and an

  12. Automated statistical matching of multiple tephra records exemplified using five long maar sequences younger than 75 ka, Auckland, New Zealand

    Science.gov (United States)

    Green, Rebecca M.; Bebbington, Mark S.; Cronin, Shane J.; Jones, Geoff

    2014-09-01

    Detailed tephrochronologies are built to underpin probabilistic volcanic hazard forecasting, and to understand the dynamics and history of diverse geomorphic, climatic, soil-forming and environmental processes. Complicating factors include highly variable tephra distribution over time; difficulty in correlating tephras from site to site based on physical and chemical properties; and uncertain age determinations. Multiple sites permit construction of more accurate composite tephra records, but correctly merging individual site records by recognizing common events and site-specific gaps is complex. We present an automated procedure for matching tephra sequences between multiple deposition sites using stochastic local optimization techniques. If individual tephra age determinations are not significantly different between sites, they are matched and a more precise age is assigned. Known stratigraphy and mineralogical or geochemical compositions are used to constrain tephra matches. We apply this method to match tephra records from five long sediment cores (≤ 75 cal ka BP) in Auckland, New Zealand. Sediments at these sites preserve basaltic tephras from local eruptions of the Auckland Volcanic Field as well as distal rhyolitic and andesitic tephras from the Okataina, Taupo, Egmont, Tongariro, and Tuhua (Mayor Island) volcanic centers. The newly compiled composite record is statistically more likely than previously published arrangements from this area.
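One way to read the matching rule above (merge two tephras only when their age determinations are not significantly different, then assign a more precise pooled age) is as a z-test followed by inverse-variance weighting. This is an illustrative interpretation, not the authors' stochastic optimization code; ages and uncertainties are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def match_ages(age1, sd1, age2, sd2, alpha=0.05):
    """If two tephra ages (e.g. cal yr BP with 1-sigma errors) are
    statistically indistinguishable, pool them by inverse-variance
    weighting; otherwise report no match."""
    z = abs(age1 - age2) / sqrt(sd1 ** 2 + sd2 ** 2)
    if z > NormalDist().inv_cdf(1 - alpha / 2):
        return None                          # significantly different ages
    w1, w2 = 1 / sd1 ** 2, 1 / sd2 ** 2
    pooled = (w1 * age1 + w2 * age2) / (w1 + w2)
    pooled_sd = sqrt(1 / (w1 + w2))          # smaller than either input sd
    return pooled, pooled_sd
```

Two determinations of 10,000 ± 100 and 10,100 ± 100 yr would merge to 10,050 ± ~71 yr, while 10,000 ± 100 versus 11,000 ± 100 would be left unmatched.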

  13. Automation of cDNA Synthesis and Labelling Improves Reproducibility

    Directory of Open Access Journals (Sweden)

    Daniel Klevebring

    2009-01-01

    Full Text Available Background. Several technologies, such as in-depth sequencing and microarrays, enable large-scale interrogation of genomes and transcriptomes. In this study, we assess reproducibility and throughput by moving all laboratory procedures to a robotic workstation capable of handling superparamagnetic beads. Here, we describe a fully automated procedure for cDNA synthesis and labelling for microarrays, in which the purification steps prior to and after labelling are based on precipitation of DNA on carboxylic acid-coated paramagnetic beads. Results. The fully automated procedure allows samples arrayed on a microtiter plate to be processed in parallel without manual intervention, ensuring high reproducibility. We compare our results to a manual sample preparation procedure and, in addition, use a comprehensive reference dataset to show that the protocol described performs better than similar manual procedures. Conclusions. We demonstrate, in an automated gene expression microarray experiment, a reduced variance between replicates, resulting in an increase in the statistical power to detect differentially expressed genes, thus allowing smaller differences between samples to be identified. This protocol can, with minor modifications, be used to create cDNA libraries for other applications such as in-depth analysis using next-generation sequencing technologies.

  14. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  15. Multiple surveys employing a new sample-processing protocol reveal the genetic diversity of placozoans in Japan.

    Science.gov (United States)

    Miyazawa, Hideyuki; Nakano, Hiroaki

    2018-03-01

    Placozoans, flat free-living marine invertebrates, possess an extremely simple bauplan lacking neurons and muscle cells and represent one of the earliest-branching metazoan phyla. They are widely distributed from temperate to tropical oceans. Based on mitochondrial 16S rRNA sequences, 19 haplotypes forming seven distinct clades have been reported in placozoans to date. In Japan, placozoans have been found at nine locations, but 16S genotyping has been performed at only two of these locations. Here, we propose a new processing protocol, "ethanol-treated substrate sampling," for collecting placozoans from natural environments. We also report the collection of placozoans from three new locations, the islands of Shikine-jima, Chichi-jima, and Haha-jima, and we present the distribution of the 16S haplotypes of placozoans in Japan. Multiple surveys conducted at multiple locations yielded five haplotypes that were not reported previously, revealing high genetic diversity in Japan, especially at Shimoda and Shikine-jima Island. The observed geographic distribution patterns were different among haplotypes; some were widely distributed, while others were sampled only from a single location. However, samplings conducted on different dates at the same sites yielded different haplotypes, suggesting that placozoans of a given haplotype do not inhabit the same site constantly throughout the year. Continued sampling efforts conducted during all seasons at multiple locations worldwide and the development of molecular markers within the haplotypes are needed to reveal the geographic distribution pattern and dispersal history of placozoans in greater detail.

  16. Comparison of manual and automated pretreatment methods for AMS radiocarbon dating of plant fossils

    Science.gov (United States)

    Bradley, L.A.; Stafford, Thomas W.

    1994-01-01

    A new automated pretreatment system for the preparation of materials submitted for accelerator mass spectrometry (AMS) analysis is less time-consuming and results in a higher sample yield. The new procedure was tested using two groups of plant fossils: one group was pretreated using the traditional method, and the second, using the automated pretreatment apparatus. The time it took to complete the procedure and the amount of sample material remaining were compared. The automated pretreatment apparatus proved to be more than three times faster and, in most cases, produced a higher yield. A darker discoloration of the KOH solutions was observed indicating that the automated system is more thorough in removing humates from the specimen compared to the manual method. -Authors

  17. Multiple sample, radioactive particle counting apparatus

    International Nuclear Information System (INIS)

    Reddy, R.R.V.; Kelso, D.M.

    1978-01-01

    An apparatus is described for determining the respective radioactive particle count emitted from samples containing radioactive particles. It includes means for modulating the information on the radioactive particles being emitted from the samples, coded detecting means for sequentially detecting different respective coded combinations of the radioactive particles emitted from more than one but less than all of the samples, and means for processing the modulated information to derive the count for each sample. The apparatus comprises a single light-emitting crystal adjacent to a number of samples, an encoder belt sequentially movable between the crystal and the samples, and a photomultiplier tube. The encoder belt has a coded array of apertures that produce correspondingly modulated light pulses from the crystal, which the photomultiplier tube converts to decodable electrical signals for deriving the respective sample counts.

  18. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello

    2012-01-01

    Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), but the statistical properties of multiple category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
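The operating characteristic curves mentioned above follow directly from the binomial distribution. In the sketch below, the decision rules d1 and d2 (fewer than d1 positives among n children classifies a school as low prevalence, d2 or more as high) are hypothetical illustrative cutoffs, not the ones derived in the paper.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def mc_lqas_probs(n, d1, d2, p):
    """Probability of each classification (low/moderate/high prevalence)
    given true prevalence p: low if x < d1, moderate if d1 <= x < d2,
    high if x >= d2, for x positives among n sampled children."""
    low = binom_cdf(d1 - 1, n, p)
    mod = binom_cdf(d2 - 1, n, p) - low
    high = 1 - low - mod
    return low, mod, high
```

Evaluating these probabilities over a grid of p values traces out the operating characteristic curves, showing where misclassification risk concentrates near the category boundaries.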

  19. Automation in structural biology beamlines of the Photon Factory

    International Nuclear Information System (INIS)

    Igarashi, Noriyuki; Hiraki, Masahiko; Matsugaki, Naohiro; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

    The Photon Factory currently operates four synchrotron beamlines for protein crystallography, and two more beamlines are scheduled for construction in the coming years. Over the last years these beamlines have been upgraded and equipped with a fully automated beamline control system based on a robotic sample changer. The current system allows for remote operation, controlled from the user's area, of sample mounting, centering and data collection of pre-frozen crystals mounted in Hampton-type cryo-loops on the goniometer head. New intuitive graphical user interfaces have been developed to control the complete beamline operation. Furthermore, algorithms for automatic sample centering based on pattern matching and X-ray beam scanning are being developed and combined with newly developed diffraction evaluation programs in order to achieve complete automation of data collection. (author)

  20. Evaluation of an automated microplate technique in the Galileo system for ABO and Rh(D) blood grouping.

    Science.gov (United States)

    Xu, Weiyi; Wan, Feng; Lou, Yufeng; Jin, Jiali; Mao, Weilin

    2014-01-01

    A number of automated devices for pretransfusion testing have recently become available. This study evaluated the Immucor Galileo System, a fully automated device based on the microplate hemagglutination technique, for ABO/Rh(D) determinations. Routine ABO/Rh typing tests were performed on 13,045 samples using the Immucor automated instruments. The manual tube method was used to resolve ABO forward and reverse grouping discrepancies. D-negative test results were investigated and confirmed manually by the indirect antiglobulin test (IAT). The system rejected 70 tests for sample inadequacy. Eighty-seven samples were read as "no type determined" because of forward and reverse grouping discrepancies; 25 of these results were due to sample hemolysis. On further testing, 34 were caused by weakened RBC antibodies, 5 were attributable to weak A and/or B antigens, 4 were due to mixed-field reactions, and 8 involved high-titer cold agglutinins that react only at temperatures below 34 degrees C. In the remaining 11 cases, irregular RBC antibodies were identified in 9 samples (seven anti-M and two anti-P) and subgroups were identified in 2 samples (one A1 and one A2) by a reference laboratory. As for D typing, 2 weak D+ samples missed by the automated system gave negative results but showed weak-positive reactions in the IAT. The Immucor Galileo System is reliable and well suited for ABO and D blood grouping, although several factors can cause discrepancies in ABO/D typing with a fully automated system. Standardization of sample collection may improve the performance of the fully automated system.

  1. ANA IIF Automation: Moving towards Harmonization? Results of a Multicenter Study

    Directory of Open Access Journals (Sweden)

    Stefanie Van den Bremt

    2017-01-01

    Full Text Available Background. Our study aimed to investigate whether the introduction of automated anti-nuclear antibody (ANA) indirect immunofluorescence (IIF) analysis decreases the interlaboratory variability of ANA titer results. Method. Three serum samples were sent to 10 laboratories using the QUANTA-Lyser® in combination with the NOVA View®. Each laboratory performed the ANA IIF analysis 10x in 1 run and 1x in 10 different runs and determined the endpoint titer by dilution. One of the three samples had been sent in 2012, before the era of ANA IIF automation, by the Belgian National External Quality Assessment (EQA) Scheme. Harmonization was evaluated in terms of variability in fluorescence intensity (LIU) and ANA IIF titer. Results. The evaluation of the intra- and interrun LIU variability revealed a larger variability for 2 laboratories, due to preanalytical and analytical problems. Reanalysis of the EQA sample resulted in a lower titer variability. Diluted endpoint titers were similar to the estimated single-well titer and the overall median titer as reported by the EQA in 2012. Conclusion. The introduction of automated microscopic analysis allows more harmonized ANA IIF reporting, provided that this totally automated process is controlled by a thorough quality assurance program covering the total ANA IIF process.

  2. Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.

    Science.gov (United States)

    Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo

    2015-11-17

    Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand without limit in an undifferentiated state and can be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we designed a new fully automated cell culture system for human iPS cell maintenance. Using the automated culture system, hiPS cells maintained their undifferentiated state for 60 days. Automatically prepared hiPS cells retained the potential to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.

  3. Recent trends in laboratory automation in the pharmaceutical industry.

    Science.gov (United States)

    Rutherford, M L; Stinger, T

    2001-05-01

    The impact of robotics and automation on the pharmaceutical industry over the last two decades has been significant. In the last ten years, the emphasis of laboratory automation has shifted from the support of manufactured products and quality control of laboratory applications, to research and development. This shift has been the direct result of an increased emphasis on the identification, development and eventual marketing of innovative new products. In this article, we will briefly identify and discuss some of the current trends in laboratory automation in the pharmaceutical industry as they apply to research and development, including screening, sample management, combinatorial chemistry, ADME/Tox and pharmacokinetics.

  4. Automated radiometric detection of bacteria

    International Nuclear Information System (INIS)

    Waters, J.R.

    1974-01-01

    A new radiometric method called BACTEC, used for the detection of bacteria in cultures or in supposedly sterile samples, was discussed from the standpoint of methodology, both automated and semi-automated. Some of the results obtained so far were reported, and some future applications and development possibilities were described. In this new method, the test sample is incubated in a sealed vial with a liquid culture medium containing a ¹⁴C-labeled substrate. If bacteria are present, they break down the substrate, producing ¹⁴CO₂, which is periodically extracted from the vial as a gas and tested for radioactivity. If this gaseous radioactivity exceeds a threshold level, it is evidence of bacterial presence and growth in the test vial. The first application was the detection of bacteria in the blood cultures of hospital patients; data were presented showing typical results. Also discussed were future applications, such as rapid screening for bacteria in urine, industrial sterility testing, and the disposal of used ¹⁴C substrates. (Mukohata, S.)

  5. A Study on Integrated Control Network for Multiple Automation Services-1st year report

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, D.H.; Park, B.S.; Kim, M.S.; Lim, Y.H.; Ahn, S.K. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    This report describes the development of the Integrated and Intelligent Gateway (IIG), which is under development. The network operating technique described in this report can identify the causes of communication faults and can avoid communication network faults in advance. Utility companies invest substantial money and time in supplying stable power. Since this is deeply related to the reliability of automation systems, it is natural to employ a fault-tolerant communication network for automation systems. Use of the network system developed in this report is not limited to DAS; it can be extended to many kinds of data services for customers. Thus this report suggests a direction for communication network development. This 1st-year report comprises the following: 1) the introduction and problems of DAS, 2) the configuration and functions of the IIG, 3) the protocols. (author). 27 refs., 73 figs., 6 tabs.

  6. Multiple double cross-section transmission electron microscope sample preparation of specific sub-10 nm diameter Si nanowire devices.

    Science.gov (United States)

    Gignac, Lynne M; Mittal, Surbhi; Bangsaruntip, Sarunya; Cohen, Guy M; Sleight, Jeffrey W

    2011-12-01

    The ability to prepare multiple cross-section transmission electron microscope (XTEM) samples from one XTEM sample of specific sub-10 nm features was demonstrated. Sub-10 nm diameter Si nanowire (NW) devices were initially cross-sectioned using a dual-beam focused ion beam system in a direction running parallel to the device channel. From this XTEM sample, both low- and high-resolution transmission electron microscope (TEM) images were obtained from six separate, specific site Si NW devices. The XTEM sample was then re-sectioned in four separate locations in a direction perpendicular to the device channel: 90° from the original XTEM sample direction. Three of the four XTEM samples were successfully sectioned in the gate region of the device. From these three samples, low- and high-resolution TEM images of the Si NW were taken and measurements of the NW diameters were obtained. This technique demonstrated the ability to obtain high-resolution TEM images in directions 90° from one another of multiple, specific sub-10 nm features that were spaced 1.1 μm apart.

  7. Refurbishment and Automation of Thermal Vacuum Facilities at NASA/GSFC

    Science.gov (United States)

    Dunn, Jamie; Gomez, Carlos; Donohue, John; Johnson, Chris; Palmer, John; Sushon, Janet

    1999-01-01

    The thermal vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the eleven facilities, currently ten of the systems are scheduled for refurbishment or replacement as part of a five-year implementation. Expected return on investment includes the reduction in test schedules, improvements in safety of facility operations, and reduction in the personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and for the automation of thermal vacuum facilities and tests. Automation of the thermal vacuum facilities includes the utilization of Programmable Logic Controllers (PLCs), the use of Supervisory Control and Data Acquisition (SCADA) systems, and the development of a centralized Test Data Management System. These components allow the computer control and automation of mechanical components such as valves and pumps. The project of refurbishment and automation began in 1996 and has resulted in complete computer control of one facility (Facility 281), and the integration of electronically controlled devices and PLCs in multiple others.

  8. Automated data collection based on RoboDiff at the ESRF beamline MASSIF-1

    Energy Technology Data Exchange (ETDEWEB)

    Nurizzo, Didier, E-mail: Didier.nurizzo@esrf.fr; Guichard, Nicolas; McSweeney, Sean; Theveneau, Pascal; Guijarro, Matias; Svensson, Olof; Mueller-Dieckmann, Christoph; Leonard, Gordon [ESRF, The European Synchrotron, 71, Avenue des Martyrs,CS 40220, 38043 Grenoble (France); Bowler, Matthew W. [EMBL Grenoble Outstation, 71 Avenue des Martyrs, CS90181, 38042 Grenoble Cedex 9 (France)

    2016-07-27

    The European Synchrotron Radiation Facility has a long-standing history of automation of experiments in macromolecular crystallography. MASSIF-1 (Massively Automated Sample Screening and evaluation Integrated Facility), a beamline constructed as part of the ESRF Upgrade Phase I program, has been open to the external user community since July 2014 and offers a unique, completely automated data collection service to both academic and industrial structural biologists.

  9. Automated design of degenerate codon libraries.

    Science.gov (United States)

    Mena, Marco A; Daugherty, Patrick S

    2005-12-01

    Degenerate codon libraries are frequently used in protein engineering and evolution studies but are often limited to targeting a small number of positions to adequately limit the search space. To mitigate this, codon degeneracy can be limited using heuristics or previous knowledge of the targeted positions. To automate design of libraries given a set of amino acid sequences, an algorithm (LibDesign) was developed that generates a set of possible degenerate codon libraries, their resulting size, and their score relative to a user-defined scoring function. A gene library of a specified size can then be constructed that is representative of the given amino acid distribution or that includes specific sequences or combinations thereof. LibDesign provides a new tool for automated design of high-quality protein libraries that more effectively harness existing sequence-structure information derived from multiple sequence alignment or computational protein design data.
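The combinatorial core that LibDesign automates — picking a degenerate codon whose translation set covers a target amino-acid set while keeping the number of encoded codons (and hence wasted library members) small — can be sketched as follows. This is an illustrative reimplementation under a simple codon-count score, not LibDesign's actual algorithm or its user-defined scoring function:

```python
from itertools import product

# Standard IUPAC degenerate nucleotide codes.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

# Standard genetic code, codons enumerated in TCAG order.
BASES = "TCAG"
AAS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {b1 + b2 + b3: aa
               for (b1, b2, b3), aa in zip(product(BASES, repeat=3), AAS)}

def amino_acids(deg_codon):
    """Amino acids (and '*' for stop) encoded by a 3-letter degenerate codon."""
    return {CODON_TABLE["".join(c)]
            for c in product(*(IUPAC[x] for x in deg_codon))}

def best_degenerate_codon(targets):
    """Exhaustively search all 15**3 degenerate codons for one covering
    every target amino acid with the fewest underlying codons."""
    targets = set(targets)
    best, best_size = None, None
    for dc in product(IUPAC, repeat=3):
        codons = list(product(*(IUPAC[x] for x in dc)))
        encoded = {CODON_TABLE["".join(c)] for c in codons}
        if targets <= encoded and (best_size is None or len(codons) < best_size):
            best, best_size = "".join(dc), len(codons)
    return best, best_size

# Example: cover aspartate and glutamate with a minimal degenerate codon
# (two-codon solutions such as GAS or GAW exist).
best, size = best_degenerate_codon({"D", "E"})
```

The exhaustive search is feasible because the space is tiny (3375 degenerate codons); per-position results would then be combined and scored against a target library size, as the abstract describes.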

  10. Extensive monitoring through multiple blood samples in professional soccer players.

    Science.gov (United States)

    Heisterberg, Mette F; Fahrenkrug, Jan; Krustrup, Peter; Storskov, Anders; Kjær, Michael; Andersen, Jesper L

    2013-05-01

    The aim of this study was to make a comprehensive gathering of consecutive detailed blood samples from professional soccer players and to analyze different blood parameters in relation to seasonal changes in training and match exposure. Blood samples were collected 5 times during a 6-month period and analyzed for 37 variables in 27 professional soccer players from the best Danish league. Additionally, the players were tested for body composition, V̇O2max and physical performance by the Yo-Yo intermittent endurance submax test (IE2). Multiple variations in blood parameters occurred during the observation period, including a decrease in hemoglobin and an increase in hematocrit as the competitive season progressed. Iron and transferrin were stable, whereas ferritin showed a decrease at the end of the season. Immunoglobulin A (IgA) and IgM increased in the period with basal physical training and at the end of the season. Leucocytes decreased with increased physical training. Lymphocytes decreased at the end of the season. The V̇O2max decreased toward the end of the season, whereas no significant changes were observed in the IE2 test. The regular blood samples from elite soccer players reveal significant changes that may be related to changes in training pattern, match exposure, or length of the match season. Especially the end of the preparation season and the end of the competitive season seem to be time points where the blood-derived values indicate that the players are under excessive physical strain and might be more subject to possible overreaching or overtraining conditions. We suggest that regular analyses of blood samples could be an important initiative to optimize training adaptation, training load, and game participation, but sampling has to be regular, and a database has to be built for each individual player.

  11. Feasibility study for the computerized automation of the Annapolis Field Office of EPA region III

    International Nuclear Information System (INIS)

    Ames, H.S.; Barton, G.W. Jr.; Bystroff, R.I.; Crawford, R.W.; Kray, A.M.; Maples, M.D.

    1976-08-01

    This report describes a feasibility study for computerized automation of the Annapolis Field Office (AFO) of EPA's Region III. The AFO laboratory provides analytical support for a number of EPA divisions; its primary function at present is analysis of water samples from rivers, estuaries, and the ocean in the Chesapeake Bay area. Automation of the AFO laboratory is found to be not only feasible but also highly desirable. An automation system is proposed which will give major improvements in analytical capacity, quality control, sample management, and reporting capabilities. This system is similar to the LLL-developed automation systems already installed at other EPA laboratories, with modifications specific to the needs of the AFO laboratory and the addition of sample file control. It is estimated that the initial cost of the system, nearly $300,000, would be recouped in about three years by virtue of the increased capacity and efficiency of operation

  12. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Jay W. Grate; Timothy A. DeVol

    2006-01-01

    The objectives of our research were to develop the first automated radiochemical process analyzer including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors

  13. [The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].

    Science.gov (United States)

    Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh

    2012-10-01

    The article discusses the possibilities of the automated microscopy complexes manufactured by Cellavision and MEKOS for performing medical analyses of blood films and other biomaterials. The joint work of the complex and the physician — with automated loading stages, screening, sampling and sorting of types with simple morphology, and visual sorting of the sub-sample with complex morphology — provides a significant increase in method sensitivity, a decrease in workload, and improved working conditions for the physician. The information technologies involved, including virtual slides and laboratory telemedicine, make it possible to develop representative samples of rare types and pathologies, advancing both automation methods and medical research targets.

  14. Portfolio of automated trading systems: complexity and learning set size issues.

    Science.gov (United States)

    Raudys, Sarunas

    2013-03-01

    In this paper, we consider using profit/loss histories of multiple automated trading systems (ATSs) as N input variables in portfolio management. By means of multivariate statistical analysis and simulation studies, we analyze the influences of sample size (L) and input dimensionality on the accuracy of determining the portfolio weights. We find that degradation in portfolio performance due to inexact estimation of N means and N(N - 1)/2 correlations is proportional to N/L; however, estimation of N variances does not worsen the result. To reduce unhelpful sample size/dimensionality effects, we perform a clustering of N time series and split them into a small number of blocks. Each block is composed of mutually correlated ATSs. It generates an expert trading agent based on a nontrainable 1/N portfolio rule. To increase the diversity of the expert agents, we use training sets of different lengths for clustering. In the output of the portfolio management system, the regularized mean-variance framework-based fusion agent is developed in each walk-forward step of an out-of-sample portfolio validation experiment. Experiments with the real financial data (2003-2012) confirm the effectiveness of the suggested approach.
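The block-based pipeline described above — cluster correlated ATSs, run a 1/N rule inside each block, then fuse the block experts with a regularized mean-variance rule — can be illustrated with a toy simulation. The synthetic data, correlation threshold and ridge term below are invented for illustration; the paper's clustering, training-set-length diversification and walk-forward validation are considerably more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate L days of profit/loss for N = 15 ATSs in 3 correlated groups.
L, n_groups, per_group = 500, 3, 5
factors = rng.normal(size=(L, n_groups))
returns = np.hstack([
    factors[:, [g]] + 0.5 * rng.normal(size=(L, per_group))
    for g in range(n_groups)
])

# Greedy correlation clustering: each unassigned series joins the block of
# the first seed it correlates with above a threshold.
corr = np.corrcoef(returns, rowvar=False)
blocks, assigned = [], set()
for i in range(returns.shape[1]):
    if i in assigned:
        continue
    members = [i] + [j for j in range(i + 1, returns.shape[1])
                     if j not in assigned and corr[i, j] > 0.5]
    assigned.update(members)
    blocks.append(members)

# Each block acts as an "expert agent" trading its members by the 1/N rule.
experts = np.column_stack([returns[:, m].mean(axis=1) for m in blocks])

# Fuse the experts with a ridge-regularized mean-variance rule; the ridge
# term tames the N/L estimation noise discussed in the abstract.
mu = experts.mean(axis=0)
cov = np.cov(experts, rowvar=False) + 0.1 * np.eye(experts.shape[1])
w = np.linalg.solve(cov, mu)
w = w / np.abs(w).sum()        # normalize to unit gross exposure
portfolio = experts @ w
```

The point of the construction is that only a small covariance matrix (here 3x3 rather than 15x15) must be estimated from the L observations, which is where the N/L degradation bites.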

  15. A New Attribute Control Chart using Multiple Dependent State Repetitive Sampling

    KAUST Repository

    Aldosari, Mansour Sattam; Aslam, Muhammad; Jun, Chi-Hyuck

    2017-01-01

    In this manuscript, a new attribute control chart using multiple dependent state repetitive sampling is designed. The operational procedure and structure of the proposed control chart is given. The required measures to determine the average run length (ARL) for in-control and out-of-control processes are given. Tables of ARLs are reported for various control chart parameters. The proposed control chart is more sensitive in detecting a small shift in the process as compared to the existing attribute control charts. The simulation study shows the efficiency of the proposed chart over the existing charts. An example is given for the illustration purpose.
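The ARL tables mentioned above are typically produced by numerical evaluation or Monte Carlo simulation. The sketch below estimates in-control and out-of-control ARLs for a plain np-chart, not the proposed multiple dependent state repetitive scheme — the parameters are arbitrary illustrations, but the same machinery extends to more elaborate charts:

```python
import numpy as np

rng = np.random.default_rng(1)

def arl(p, n, lcl, ucl, runs=1000, max_len=100_000):
    """Monte Carlo average run length: subgroups of n items are inspected
    until the defective count d falls outside [lcl, ucl]."""
    lengths = []
    for _ in range(runs):
        for t in range(1, max_len + 1):
            d = rng.binomial(n, p)
            if d < lcl or d > ucl:
                lengths.append(t)
                break
        else:
            lengths.append(max_len)
    return float(np.mean(lengths))

# 3-sigma limits for an np-chart at in-control fraction defective p0.
p0, n = 0.05, 50
center, sd = n * p0, (n * p0 * (1 - p0)) ** 0.5
lcl, ucl = max(0.0, center - 3 * sd), center + 3 * sd

arl0 = arl(p0, n, lcl, ucl)        # in-control ARL: want this large
arl1 = arl(2 * p0, n, lcl, ucl)    # ARL after a shift to 2*p0: want this small
```

A chart design is judged by exactly this trade-off: a large ARL when the process is in control versus a small ARL once the fraction defective shifts, which is the comparison the paper's tables report.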

  17. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  18. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time designed and implemented software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  19. Intention to use a fully automated car: attitudes and a priori acceptability

    OpenAIRE

    PAYRE, William; CESTAC, Julien; DELHOMME, Patricia

    2014-01-01

    While previous research has studied the acceptability of partially or highly automated driving, few studies have focused on fully automated driving (FAD), including the ability to master longitudinal control, lateral control and maneuvers. The present study analyzes a priori acceptability, attitudes, personality traits and intention to use a fully automated vehicle. 421 French drivers (153 males, M = 40.2 years, age range 19-73) answered an online questionnaire. 68.1% of the sample a priori accepted FAD. P...

  20. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018. Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, Keio University. Final report, covering 26 May 2015 to 25 Nov 2016. The project developed a system to analyze malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program

  1. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operations. Data processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)

  2. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate that takes the advantages of automation as the estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate, the greater the reduction in working time. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Such expressions of the inclusion proportion of automation are straightforward, and they might be expected to express the degree of enhancement of human performance as well. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  3. Automated Scoring of Constructed-Response Science Items: Prospects and Obstacles

    Science.gov (United States)

    Liu, Ou Lydia; Brew, Chris; Blackmore, John; Gerard, Libby; Madhok, Jacquie; Linn, Marcia C.

    2014-01-01

    Content-based automated scoring has been applied in a variety of science domains. However, many prior applications involved simplified scoring rubrics without considering rubrics representing multiple levels of understanding. This study tested a concept-based scoring tool for content-based scoring, c-rater™, for four science items with rubrics…

  4. Quantification of Parvovirus B19 DNA Using COBAS AmpliPrep Automated Sample Preparation and LightCycler Real-Time PCR

    Science.gov (United States)

    Schorling, Stefan; Schalasta, Gunnar; Enders, Gisela; Zauke, Michael

    2004-01-01

    The COBAS AmpliPrep instrument (Roche Diagnostics GmbH, D-68305 Mannheim, Germany) automates the entire sample preparation process of nucleic acid isolation from serum or plasma for polymerase chain reaction analysis. We report the analytical performance of the LightCycler Parvovirus B19 Quantification Kit (Roche Diagnostics) using nucleic acids isolated with the COBAS AmpliPrep instrument. Nucleic acids were extracted using the Total Nucleic Acid Isolation Kit (Roche Diagnostics) and amplified with the LightCycler Parvovirus B19 Quantification Kit. The kit combination processes 72 samples per 8-hour shift. The lower detection limit is 234 IU/ml at a 95% hit-rate, the linear range approximately 10⁴-10¹⁰ IU/ml, and overall precision 16 to 40%. Relative sensitivity and specificity in routine samples from pregnant women are 100% and 93%, respectively. Identification of a persistent parvovirus B19-infected individual by the polymerase chain reaction among 51 anti-parvovirus B19 IgM-negative samples underlines the importance of additional nucleic acid testing in pregnancy and its superiority to serology in identifying the risk of parvovirus B19 transmission via blood or blood products. Combination of the Total Nucleic Acid Isolation Kit on the COBAS AmpliPrep instrument with the LightCycler Parvovirus B19 Quantification Kit provides a reliable and time-saving tool for sensitive and accurate detection of parvovirus B19 DNA. PMID:14736825

  5. Multiple sampling ionization chamber (MUSIC) for measuring the charge of relativistic heavy ions

    Energy Technology Data Exchange (ETDEWEB)

    Christie, W.B.; Romero, J.L.; Brady, F.P.; Tull, C.E.; Castaneda, C.M.; Barasch, E.F.; Webb, M.L.; Drummond, J.R.; Crawford, H.J.; Flores, I.

    1987-04-01

    A large area (1 m x 2 m) multiple sampling ionization chamber (MUSIC) has been constructed and tested. The MUSIC detector makes multiple measurements of energy 'loss', dE/dx, for a relativistic heavy ion. Given the velocity, the charge of the ion can be extracted from the energy loss distributions. The widths of the distributions we observe are much narrower than predicted by Vavilov's theory for energy loss, while agreeing well with the theory of Badhwar, which deals with the energy deposited. The versatile design of MUSIC allows a variety of anode configurations, which results in a large dynamic range of charge. In our tests to date we have observed charge resolutions of 0.25e fwhm for 727 MeV/nucleon ⁴⁰Ar and 0.30e fwhm for 1.08 GeV/nucleon ¹³⁹La and ¹³⁹La fragments. Vertical position and multiple track determination are obtained by using time projection chamber electronics. Preliminary tests indicate that the position resolution is also very good, with σ ≈ 100 μm.
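The averaging idea behind MUSIC can be sketched with a toy model: at fixed velocity, the energy deposit per anode segment scales as Z², so averaging many independent straggling-smeared samples before inverting the quadratic dependence narrows the charge estimate. The Gaussian smearing and the 15% per-sample width below are invented illustration figures, not the Badhwar energy-deposit theory used in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def estimate_charge(z_true, n_samples, straggling=0.15, k=1.0):
    """Average n_samples noisy energy deposits (mean ~ k*Z**2) and invert
    the quadratic dependence to estimate the charge."""
    mean_deposit = k * z_true**2
    deposits = rng.normal(mean_deposit, straggling * mean_deposit, n_samples)
    return np.sqrt(deposits.mean() / k)

# Repeat the measurement many times for an argon-like Z = 18 ion sampled
# on 16 anode segments, and look at the spread of the charge estimates.
z_hat = np.array([estimate_charge(18, 16) for _ in range(5000)])
fwhm = 2.355 * z_hat.std()   # FWHM of the charge distribution, in units of e
```

Doubling the number of independent samples shrinks the width roughly as 1/sqrt(n), which is why a multiple-sampling chamber outperforms a single thick gap.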

  6. Detection of Salmonella spp. in veterinary samples by combining selective enrichment and real-time PCR.

    Science.gov (United States)

    Goodman, Laura B; McDonough, Patrick L; Anderson, Renee R; Franklin-Guild, Rebecca J; Ryan, James R; Perkins, Gillian A; Thachil, Anil J; Glaser, Amy L; Thompson, Belinda S

    2017-11-01

    Rapid screening for enteric bacterial pathogens in clinical environments is essential for biosecurity. Salmonella found in veterinary hospitals, particularly Salmonella enterica serovar Dublin, can pose unique challenges for culture and testing because of its poor growth. Multiple Salmonella serovars including Dublin are emerging threats to public health given increasing prevalence and antimicrobial resistance. We adapted an automated food testing method to veterinary samples and evaluated the performance of the method in a variety of matrices including environmental samples (n = 81), tissues (n = 52), feces (n = 148), and feed (n = 29). A commercial kit was chosen as the basis for this approach in view of extensive performance characterizations published by multiple independent organizations. A workflow was established for efficiently and accurately testing veterinary matrices and environmental samples by use of real-time PCR after selective enrichment in Rappaport-Vassiliadis soya (RVS) medium. Using this method, the detection limit for S. Dublin improved by 100-fold over subculture on selective agars (eosin-methylene blue, brilliant green, and xylose-lysine-deoxycholate). Overall, the procedure was effective in detecting Salmonella spp. and provided next-day results.

  7. Automated ensemble assembly and validation of microbial genomes

    Science.gov (United States)

    2014-01-01

    Background The continued democratization of DNA sequencing has sparked a new wave of development of genome assembly and assembly validation methods. As individual research labs, rather than centralized centers, begin to sequence the majority of new genomes, it is important to establish best practices for genome assembly. However, recent evaluations such as GAGE and the Assemblathon have concluded that there is no single best approach to genome assembly. Instead, it is preferable to generate multiple assemblies and validate them to determine which is most useful for the desired analysis; this is a labor-intensive process that is often impossible or infeasible. Results To encourage best practices supported by the community, we present iMetAMOS, an automated ensemble assembly pipeline; iMetAMOS encapsulates the process of running, validating, and selecting a single assembly from multiple assemblies. iMetAMOS packages several leading open-source tools into a single binary that automates parameter selection and execution of multiple assemblers, scores the resulting assemblies based on multiple validation metrics, and annotates the assemblies for genes and contaminants. We demonstrate the utility of the ensemble process on 225 previously unassembled Mycobacterium tuberculosis genomes as well as a Rhodobacter sphaeroides benchmark dataset. On these real data, iMetAMOS reliably produces validated assemblies and identifies potential contamination without user intervention. In addition, intelligent parameter selection produces assemblies of R. sphaeroides comparable to or exceeding the quality of those from the GAGE-B evaluation, affecting the relative ranking of some assemblers. Conclusions Ensemble assembly with iMetAMOS provides users with multiple, validated assemblies for each genome. Although computationally limited to small or mid-sized genomes, this approach is the most effective and reproducible means for generating high-quality assemblies and enables users to

  8. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Bourret, S.C.

    1974-01-01

    Two systems resulted from the need for the study of the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, speed of chemical separation after irradiation and for protection from the high radiation fields of the samples. A MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit. This approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  9. THE USE OF MULTIPLE DISPLACEMENT AMPLIFICATION TO INCREASE THE DETECTION AND GENOTYPING OF TRYPANOSOMA SPECIES SAMPLES IMMOBILISED ON FTA FILTERS

    Science.gov (United States)

    MORRISON, LIAM J.; McCORMACK, GILLIAN; SWEENEY, LINDSAY; LIKEUFACK, ANNE C. L.; TRUC, PHILIPPE; TURNER, C. MICHAEL; TAIT, ANDY; MacLEOD, ANNETTE

    2007-01-01

    Whole genome amplification methods are a recently developed tool for amplifying DNA from limited template. We report its application in trypanosome infections, characterised by low parasitaemias. Multiple Displacement Amplification (MDA) amplifies DNA with a simple in vitro step, and was evaluated on mouse blood samples on FTA filter cards with known numbers of Trypanosoma brucei parasites. The data showed a twenty-fold increase in the number of PCRs possible per sample, using primers diagnostic for the multi-copy ribosomal ITS region or 177 bp repeats, and a twenty-fold increase in sensitivity over nested PCR against a single copy microsatellite. Using MDA for microsatellite genotyping caused allele dropout at low DNA concentrations, which was overcome by pooling multiple MDA reactions. The validity of using MDA was established with samples from Human African Trypanosomiasis patients. The use of MDA allows maximal use of finite DNA samples and may prove a valuable tool in studies where multiple reactions are necessary, such as population genetic analyses. PMID:17556624

  10. Automated Speech Rate Measurement in Dysarthria

    Science.gov (United States)

    Martens, Heidi; Dekens, Tomas; Van Nuffelen, Gwen; Latacz, Lukas; Verhelst, Werner; De Bodt, Marc

    2015-01-01

    Purpose: In this study, a new algorithm for automated determination of speech rate (SR) in dysarthric speech is evaluated. We investigated how reliably the algorithm calculates the SR of dysarthric speech samples when compared with calculation performed by speech-language pathologists. Method: The new algorithm was trained and tested using Dutch…

  11. Devices used by automated milking systems are similarly accurate in estimating milk yield and in collecting a representative milk sample compared with devices used by farms with conventional milk recording

    NARCIS (Netherlands)

    Kamphuis, Claudia; Dela Rue, B.; Turner, S.A.; Petch, S.

    2015-01-01

    Information on accuracy of milk-sampling devices used on farms with automated milking systems (AMS) is essential for development of milk recording protocols. The hypotheses of this study were (1) devices used by AMS units are similarly accurate in estimating milk yield and in collecting

  12. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested. It appears that the 10 μl volume has produced data that had much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  13. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established in Malaysian Nuclear Agency (Nuclear Malaysia) since 1980s. Most of the procedures established from sample registration to analysis are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, system automation is developed in order to provide an effective method to replace redundant manual data entries and produce faster sample analysis and calculation process. This report explains NAA process in Nuclear Malaysia and describes the automation development in detail which includes sample registration software, automatic sample changer system which consists of hardware and software; and sample analysis software. (author)

  14. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  15. The cobas® 6800/8800 System: a new era of automation in molecular diagnostics.

    Science.gov (United States)

    Cobb, Bryan; Simon, Christian O; Stramer, Susan L; Body, Barbara; Mitchell, P Shawn; Reisch, Natasa; Stevens, Wendy; Carmona, Sergio; Katz, Louis; Will, Stephen; Liesenfeld, Oliver

    2017-02-01

    Molecular diagnostics is a key component of laboratory medicine. Here, the authors review key triggers of ever-increasing automation in nucleic acid amplification testing (NAAT) with a focus on specific automated Polymerase Chain Reaction (PCR) testing and platforms such as the recently launched cobas® 6800 and cobas® 8800 Systems. The benefits of such automation for different stakeholders including patients, clinicians, laboratory personnel, hospital administrators, payers, and manufacturers are described. Areas Covered: The authors describe how molecular diagnostics has achieved total laboratory automation over time, rivaling clinical chemistry to significantly improve testing efficiency. Finally, the authors discuss how advances in automation decrease the development time for new tests enabling clinicians to more readily provide test results. Expert Commentary: The advancements described enable complete diagnostic solutions whereby specific test results can be combined with relevant patient data sets to allow healthcare providers to deliver comprehensive clinical recommendations in multiple fields ranging from infectious disease to outbreak management and blood safety solutions.

  16. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    Science.gov (United States)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-02-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failures. Thus, to ensure normal production and the safety of machine operators, it is essential to detect the failures as soon as possible. However, it is a major challenge to balance detection efficiency against large-scale data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist rate, which eases the pressure generated by large-scale data. The vibration signals of a faulty roller bearing are first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may become weaker, since noise stronger than the vibration signal can be mistaken for peaks, making the fault features impossible to extract by commonly used envelope analysis. We therefore employ compressive sensing theory to overcome this problem, which enhances the signal and further reduces the sample size. Moreover, it is capable of detecting fault features from a small number of samples based on the orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults.

  17. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    International Nuclear Information System (INIS)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-01-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failures. Thus, to ensure normal production and the safety of machine operators, it is essential to detect the failures as soon as possible. However, it is a major challenge to balance detection efficiency against large-scale data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist rate, which eases the pressure generated by large-scale data. The vibration signals of a faulty roller bearing are first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may become weaker, since noise stronger than the vibration signal can be mistaken for peaks, making the fault features impossible to extract by commonly used envelope analysis. We therefore employ compressive sensing theory to overcome this problem, which enhances the signal and further reduces the sample size. Moreover, it is capable of detecting fault features from a small number of samples based on the orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults. (paper)
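    The sparse-recovery step in the two records above relies on orthogonal matching pursuit (OMP). The papers' own pipeline is not reproduced here; the following is a generic sketch of the greedy OMP loop, where the Gaussian measurement matrix, dimensions, and test signal are illustrative assumptions rather than values from the study:

```python
import numpy as np

def omp(Phi, y, k):
    """Recover a k-sparse signal x from compressed measurements y = Phi @ x
    by greedily selecting the dictionary column most correlated with the
    residual, then refitting by least squares on the chosen support."""
    residual = y.copy()
    support = []
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(k):
        # Column most correlated with the current residual.
        idx = int(np.argmax(np.abs(Phi.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares fit on the selected support; update the residual.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat[support] = coef
    return x_hat

# Illustrative demo: a 3-sparse signal of length 256 recovered from 128
# random Gaussian measurements (half the ambient dimension).
rng = np.random.default_rng(0)
n, m, k = 256, 128, 3
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[[10, 100, 200]] = [1.5, -2.0, 0.7]
x_rec = omp(Phi, Phi @ x, k)
```

Once the true support is identified, the least-squares step recovers the nonzero coefficients essentially exactly, which is why OMP can work from far fewer samples than the Nyquist rate would suggest.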

  18. Addressing small sample size bias in multiple-biomarker trials: Inclusion of biomarker-negative patients and Firth correction.

    Science.gov (United States)

    Habermehl, Christina; Benner, Axel; Kopp-Schneider, Annette

    2018-03-01

    In recent years, numerous approaches for biomarker-based clinical trials have been developed. One of these developments are multiple-biomarker trials, which aim to investigate multiple biomarkers simultaneously in independent subtrials. For low-prevalence biomarkers, small sample sizes within the subtrials have to be expected, as well as many biomarker-negative patients at the screening stage. The small sample sizes may make it unfeasible to analyze the subtrials individually. This imposes the need to develop new approaches for the analysis of such trials. With an expected large group of biomarker-negative patients, it seems reasonable to explore options to benefit from including them in such trials. We consider advantages and disadvantages of the inclusion of biomarker-negative patients in a multiple-biomarker trial with a survival endpoint. We discuss design options that include biomarker-negative patients in the study and address the issue of small sample size bias in such trials. We carry out a simulation study for a design where biomarker-negative patients are kept in the study and are treated with standard of care. We compare three different analysis approaches based on the Cox model to examine if the inclusion of biomarker-negative patients can provide a benefit with respect to bias and variance of the treatment effect estimates. We apply the Firth correction to reduce the small sample size bias. The results of the simulation study suggest that for small sample situations, the Firth correction should be applied to adjust for the small sample size bias. Additional to the Firth penalty, the inclusion of biomarker-negative patients in the analysis can lead to further but small improvements in bias and standard deviation of the estimates. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
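    The abstract above applies the Firth correction to Cox models; as a hedged illustration of the same penalty in the simpler logistic-regression setting, the bias-reducing score adjustment h_i(1/2 − p_i) can be sketched as below. The toy data and iteration settings are assumptions for illustration, not from the trial simulations:

```python
import numpy as np

def firth_logistic(X, y, n_iter=50):
    """Logistic regression with Firth's penalty (Jeffreys-prior bias
    reduction). Each residual gets the correction h_i * (0.5 - p_i),
    where h_i is the leverage from the weighted hat matrix; this keeps
    estimates finite even under complete separation, where the plain
    maximum-likelihood estimate diverges."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)
        H = (X * w[:, None]).T @ X            # Fisher information X' W X
        Hinv = np.linalg.inv(H)
        # Leverages: h_i = w_i * x_i' H^{-1} x_i
        h = w * np.einsum('ij,jk,ik->i', X, Hinv, X)
        score = X.T @ (y - p + h * (0.5 - p))  # Firth-adjusted score
        step = Hinv @ score
        beta = beta + step
        if np.max(np.abs(step)) < 1e-8:
            break
    return beta

# Completely separated toy data: the unpenalized MLE slope is infinite,
# but the Firth estimate remains finite.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
beta = firth_logistic(X, y)
```

The same penalty idea carries over to the partial likelihood of the Cox model discussed in the abstract, though the hat-matrix algebra differs there.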

  19. MZDASoft: a software architecture that enables large-scale comparison of protein expression levels over multiple samples based on liquid chromatography/tandem mass spectrometry.

    Science.gov (United States)

    Ghanat Bari, Mehrab; Ramirez, Nelson; Wang, Zhiwei; Zhang, Jianqiu Michelle

    2015-10-15

    Without accurate peak linking/alignment, only the expression levels of a small percentage of proteins can be compared across multiple samples in Liquid Chromatography/Mass Spectrometry/Tandem Mass Spectrometry (LC/MS/MS) due to the selective nature of tandem MS peptide identification. This greatly hampers biomedical research that aims at finding biomarkers for disease diagnosis, treatment, and the understanding of disease mechanisms. A recent algorithm, PeakLink, has allowed the accurate linking of LC/MS peaks without tandem MS identifications to their corresponding ones with identifications across multiple samples collected from different instruments, tissues and labs, which greatly enhanced the ability of comparing proteins. However, PeakLink cannot be implemented practically for large numbers of samples based on existing software architectures, because it requires access to peak elution profiles from multiple LC/MS/MS samples simultaneously. We propose a new architecture based on parallel processing, which extracts LC/MS peak features and saves them in database files to enable the implementation of PeakLink for multiple samples. The software has been deployed in High-Performance Computing (HPC) environments. The core part of the software, MZDASoft Parallel Peak Extractor (PPE), can be downloaded with a user and developer's guide, and it can be run on HPC centers directly. The quantification applications, MZDASoft TandemQuant and MZDASoft PeakLink, are written in Matlab and compiled with a Matlab runtime compiler. A sample script that incorporates all necessary processing steps of MZDASoft for LC/MS/MS quantification in a parallel processing environment is available. The project webpage is http://compgenomics.utsa.edu/zgroup/MZDASoft. The proposed architecture enables the implementation of PeakLink for multiple samples. Significantly more (100%-500%) proteins can be compared over multiple samples with better quantification accuracy in test cases.

  20. A Centrifugal Microfluidic Platform That Separates Whole Blood Samples into Multiple Removable Fractions Due to Several Discrete but Continuous Density Gradient Sections

    Science.gov (United States)

    Moen, Scott T.; Hatcher, Christopher L.; Singh, Anup K.

    2016-01-01

    We present a miniaturized centrifugal platform that uses density centrifugation for separation and analysis of biological components in small volume samples (~5 μL). We demonstrate the ability to enrich leukocytes for on-disk visualization via microscopy, as well as recovery of viable cells from each of the gradient partitions. In addition, we simplified the traditional Modified Wright-Giemsa staining by decreasing the time, volume, and expertise involved in the procedure. From a whole blood sample, we were able to extract 95.15% of leukocytes while excluding 99.8% of red blood cells. This platform has great potential in both medical diagnostics and research applications as it offers a simpler, automated, and inexpensive method for biological sample separation, analysis, and downstream culturing. PMID:27054764

  1. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  2. Symmetry relationships for multiple scattering of polarized light in turbid spherical samples: theory and a Monte Carlo simulation.

    Science.gov (United States)

    Otsuki, Soichi

    2016-02-01

    This paper presents a theory describing totally incoherent multiple scattering in turbid spherical samples. It is proved that if reciprocity and mirror symmetry hold for single scattering by a particle, they also hold for multiple scattering in spherical samples. Monte Carlo simulations generate a reduced effective scattering Mueller matrix, which virtually satisfies reciprocity and mirror symmetry. The scattering matrix was factorized approximately into a product of a pure depolarizer and vertically oriented linear retarding diattenuators, using the symmetric decomposition in a predefined form as well as the Lu-Chipman polar decomposition. The parameters of these components were calculated as a function of the polar angle. While the turbid spherical sample acts as a pure depolarizer at low polar angles, it increasingly takes on the character of a retarding diattenuator as the polar angle grows.

  3. Double sampling with multiple imputation to answer large sample meta-research questions: Introduction and illustration by evaluating adherence to two simple CONSORT guidelines

    Directory of Open Access Journals (Sweden)

    Patrice L. Capers

    2015-03-01

    Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower-rigor, higher-throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher-rigor, lower-throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151). As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large-sample RLOTHI estimates. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. 
This method is flexible and may provide a practical way to examine large corpora of

  4. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    Science.gov (United States)

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and medical and pharmaceutical fields, as well as research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be checked regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  5. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    Science.gov (United States)

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and VQT samples, respectively. Overall, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
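    The cLHS and scLHS machinery above builds on plain Latin hypercube sampling. A minimal sketch of that unconditioned building block (not the authors' stratified, variance-quadtree version) might look like this:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Plain Latin hypercube sample on [0, 1]^d: each dimension is split
    into n_samples equal strata and exactly one point lands in each
    stratum, giving good marginal coverage with few samples."""
    samples = np.empty((n_samples, n_dims))
    for j in range(n_dims):
        # One uniform point per stratum, strata in random order so the
        # dimensions are not correlated with each other.
        strata = rng.permutation(n_samples)
        samples[:, j] = (strata + rng.random(n_samples)) / n_samples
    return samples

rng = np.random.default_rng(42)
pts = latin_hypercube(10, 2, rng)
```

Conditioned variants such as cLHS then search for a subset of real candidate locations (here, NDVI pixels) whose values best reproduce this stratified coverage, rather than sampling the unit cube directly.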

  6. Automated ion-exchange system for the radiochemical separation of the noble metals

    International Nuclear Information System (INIS)

    Parry, S.J.

    1980-01-01

    Ion-exchange separation is particularly suitable for mechanisation and automated ion exchange has been applied to the activation analysis of biological and environmental samples. In this work a system has been designed for experimental studies, which can be adapted for different modes of operation. The equipment is based on a large-volume sampler for the automatic presentation of 500 ml of liquid to a sampling probe. The sample is delivered to the ion-exchange column by means of a peristaltic pump. The purpose of this work was to automate a procedure for separating the noble metals from irradiated geological samples, for neutron-activation analysis. The process of digesting the rock sample is carried out manually in 30 min and is not suited to unattended operation. The volume of the resulting liquid sample may be 100 ml and so the manual separation step may take as long as 1.25 h per sample. The reason for automating this part of the procedure is to reduce the separation time for a group of five samples and consequently to improve the sensitivity of the analysis for radionuclides with short half-lives. This paper describes the automatic ion-exchange system and the ways in which it can be used. The mode of operation for the separation of the noble metals is given in detail. The reproducibility of the system has been assessed by repeated measurements on a standard reference matte. (author)

  7. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  8. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  9. Psychosocial risks associated with multiple births resulting from assisted reproduction: a Spanish sample.

    Science.gov (United States)

    Roca de Bes, Montserrat; Gutierrez Maldonado, José; Gris Martínez, José M

    2009-09-01

    To determine the psychosocial risks associated with multiple births (twins or triplets) resulting from assisted reproductive technology (ART). Cross-sectional study. Infertility units of a university hospital and a private hospital. Mothers and fathers of children between 6 months and 4 years of age conceived by ART (n = 123). The sample was divided into three groups: parents of singletons (n = 77), twins (n = 37), and triplets (n = 9). The questionnaire was self-administered by patients. It was either completed at the hospital or mailed to participants' homes. Scales measured material needs, quality of life, social stigma, depression, stress, and marital satisfaction. Logistic regression models were applied. Significant odds ratios were obtained for the number of children, material needs, social stigma, quality of life, and marital satisfaction. The results were more significant for data provided by mothers than by fathers. The informed consent form handed out at the beginning of ART should include information on the high risk of conceiving twins and triplets and on the possible psychosocial consequences of multiple births. As soon as a multiple pregnancy is confirmed, it would be useful to provide information on support groups and institutions. Psychological advice should also be given to the parents.

  10. Delegation control of multiple unmanned systems

    Science.gov (United States)

    Flaherty, Susan R.; Shively, Robert J.

    2010-04-01

    Maturing technologies and complex payloads coupled with a future objective to reduce the logistics burden of current unmanned aerial systems (UAS) operations require a change to the 2-crew employment paradigm. Increased automation and operator supervisory control of unmanned systems have been advocated to meet the objective of reducing the crew requirements, while managing future technologies. Specifically, a delegation control employment strategy has resulted in reduced workload and higher situation awareness for single operators controlling multiple unmanned systems in empirical studies [1,2]. Delegation control is characterized by the ability for an operator to call a single "play" that initiates prescribed default actions for each vehicle and associated sensor related to a common mission goal. Based upon the effectiveness of delegation control in simulation, the U.S. Army Aeroflightdynamics Directorate (AFDD) developed a Delegation Control (DelCon) operator interface with voice recognition implementation for play selection, real-time play modification, and play status with automation transparency to enable single operator control of multiple unmanned systems in flight. AFDD successfully demonstrated delegation control in a Troops-in-Contact mission scenario at Ft. Ord in 2009. This summary showcases the effort as a beneficial advance in single operator control of multiple UAS.

  11. Microfluidic devices for sample clean-up and screening of biological samples

    NARCIS (Netherlands)

    Tetala, K.K.R.

    2009-01-01

    Analytical chemistry plays an important role in the separation and identification of analytes from raw samples (e.g. plant extracts, blood), but the whole analytical process is tedious, difficult to automate and time consuming. To overcome these drawbacks, the concept of μTAS (miniaturized total

  12. A systematic engineering tool chain approach for self-organizing building automation systems

    NARCIS (Netherlands)

    Mc Gibney, A.; Rea, S.; Lehmann, M.; Thior, S.; Lesecq, S.; Hendriks, M.; Guyon-Gardeux, C.; Mai, Linh Tuan; Pacull, F.; Ploennigs, J.; Basten, T.; Pesch, D.

    2013-01-01

    There is a strong push towards smart buildings that aim to achieve comfort, safety and energy efficiency, through building automation systems (BAS) that incorporate multiple subsystems such as heating and air-conditioning, lighting, access control etc. The design, commissioning and operation of BAS

  13. Use of an automated bolus calculator in MDI-treated type 1 diabetes

    DEFF Research Database (Denmark)

    Schmidt, Signe; Meldgaard, Merete; Serifovski, Nermin

    2012-01-01

    To investigate the effect of flexible intensive insulin therapy (FIIT) and an automated bolus calculator (ABC) in a Danish type 1 diabetes population treated with multiple daily injections. Furthermore, to test the feasibility of teaching FIIT in a 3-h structured course.

  14. Automated Photoreceptor Cell Identification on Nonconfocal Adaptive Optics Images Using Multiscale Circular Voting.

    Science.gov (United States)

    Liu, Jianfei; Jung, HaeWon; Dubra, Alfredo; Tam, Johnny

    2017-09-01

    Adaptive optics scanning light ophthalmoscopy (AOSLO) has enabled quantification of the photoreceptor mosaic in the living human eye using metrics such as cell density and average spacing. These rely on the identification of individual cells. Here, we demonstrate a novel approach for computer-aided identification of cone photoreceptors on nonconfocal split detection AOSLO images. Algorithms for identification of cone photoreceptors were developed, based on multiscale circular voting (MSCV) in combination with a priori knowledge that split detection images resemble Nomarski differential interference contrast images, in which dark and bright regions are present on the two sides of each cell. The proposed algorithm locates dark and bright region pairs, iteratively refining the identification across multiple scales. Identification accuracy was assessed in data from 10 subjects by comparing automated identifications with manual labeling, followed by computation of density and spacing metrics for comparison to histology and published data. There was good agreement between manual and automated cone identifications with overall recall, precision, and F1 score of 92.9%, 90.8%, and 91.8%, respectively. On average, computed density and spacing values using automated identification were within 10.7% and 11.2% of the expected histology values across eccentricities ranging from 0.5 to 6.2 mm. There was no statistically significant difference between MSCV-based and histology-based density measurements (P = 0.96, Kolmogorov-Smirnov 2-sample test). MSCV can accurately detect cone photoreceptors on split detection images across a range of eccentricities, enabling quick, objective estimation of photoreceptor mosaic metrics, which will be important for future clinical trials utilizing adaptive optics.
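    The multiscale circular voting above builds on Hough-style voting for circular structures. A single-scale sketch of the voting step, run on synthetic points rather than AOSLO data, might look like this (the accumulator scheme and parameters are illustrative assumptions, not the authors' MSCV implementation):

```python
import numpy as np

def circular_vote(points, radius, shape):
    """Hough-style circular voting: each edge point casts votes on every
    candidate center lying 'radius' away from it, so true circle centers
    accumulate peaks in the vote image."""
    acc = np.zeros(shape)
    thetas = np.linspace(0.0, 2.0 * np.pi, 90, endpoint=False)
    for x, y in points:
        cx = np.rint(x + radius * np.cos(thetas)).astype(int)
        cy = np.rint(y + radius * np.sin(thetas)).astype(int)
        ok = (cx >= 0) & (cx < shape[0]) & (cy >= 0) & (cy < shape[1])
        # np.add.at accumulates correctly even for repeated (cx, cy) pairs.
        np.add.at(acc, (cx[ok], cy[ok]), 1)
    return acc

# Synthetic example: 60 points on a circle of radius 10 centered at (32, 32);
# the vote accumulator should peak at (or next to) the true center.
angles = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
pts = np.column_stack([32 + 10 * np.cos(angles), 32 + 10 * np.sin(angles)])
acc = circular_vote(pts, 10.0, (64, 64))
peak = np.unravel_index(np.argmax(acc), acc.shape)
```

A multiscale scheme like MSCV would repeat this voting across a range of radii and refine candidate centers from coarse to fine; split-detection images additionally require pairing the dark and bright halves of each cell, which this sketch does not attempt.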

  15. High-throughput mouse genotyping using robotics automation.

    Science.gov (United States)

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.

  16. A real-time data acquisition and processing system for the analytical laboratory automation of a HTR spent fuel reprocessing facility

    International Nuclear Information System (INIS)

    Watzlawik, K.H.

    1979-12-01

    A real-time data acquisition and processing system for the analytical laboratory of an experimental HTR spent fuel reprocessing facility is presented. The on-line open-loop system combines in-line and off-line analytical measurement procedures including data acquisition and evaluation as well as analytical laboratory organisation under the control of a computer-supported laboratory automation system. In-line measurements are performed for density, volume and temperature in process tanks and registration of samples for off-line measurements. Off-line computer-coupled experiments are potentiometric titration, gas chromatography and X-ray fluorescence analysis. Organisational sections like sample registration, magazining, distribution and identification, multiple data assignment and especially calibrations of analytical devices are performed by the data processing system. (orig.) [de]

  17. AUTOMATION OF IN FEED CENTERLESS GRINDING MACHINE

    OpenAIRE

    Piyusha P. Jadhav*, Sachin V. Lomte, Sudhir Surve

    2017-01-01

    In-feed centerless grinding offers a major contribution to industry. This is an alternative in-feed centerless grinding technique using a regulating wheel. Centerless grinding is mainly divided into three types: end-feed, in-feed, and through-feed. This paper deals with low-cost automation of an in-feed centerless grinding machine using a regulating wheel, suitable for multiple in-feed type jobs. It deals with the development of a centerless grin...

  18. Simple automated system for simultaneous production of 11C-labeled tracers by solid supported methylation

    International Nuclear Information System (INIS)

    Quincoces, Gemma; Penuelas, Ivan; Valero, Marta; Serra, Patricia; Collantes, Maria; Marti-Climent, Josep; Arbizu, Javier; Jose Garcia-Velloso, Maria; Angel Richter, Jose

    2006-01-01

We herein describe a simple setup for the automated simultaneous synthesis of L-[methyl-11C]methionine and N-[methyl-11C]choline by solid-supported methylation. The setup is extremely simple and easy to adapt to other automated systems, and due to its versatility the method can be utilized for the production of other radiopharmaceuticals requiring a simple [11C]methylation step. Furthermore, it can be used for multiple simultaneous syntheses.

  19. Development and evaluation of a multiple-plate fraction collector for sample processing: application to radioprofiling in drug metabolism studies.

    Science.gov (United States)

    Barros, Anthony; Ly, Van T; Chando, Theodore J; Ruan, Qian; Donenfeld, Scott L; Holub, David P; Christopher, Lisa J

    2011-04-05

Microplate scintillation counters are utilized routinely in drug metabolism laboratories for the off-line radioanalysis of fractions collected during HPLC radioprofiling. In this process, the current fraction collection technology is limited by the number of plates that can be used per injection as well as the potential for sample loss due to dripping or spraying as the fraction collector head moves from well to well or between plates. More importantly, sample throughput is limited in the conventional process, since the collection plates must be manually exchanged after each injection. The Collect PAL, an innovative multiple-plate fraction collector, was developed to address these deficiencies and improve overall sample throughput. It employs a zero-loss design and has sub-ambient temperature control. Operation of the system is completely controlled with software, and up to 24 (96- or 384-well) fraction collection plates can be loaded in a completely automated run. The system may also be configured for collection into various-sized tubes or vials. At flow rates of 0.5 or 1.0 mL/min and at collection times of 10 or 15 s, the system precisely delivered 83-μL fractions (within 4.1% CV) and 250-μL fractions (within 1.4% CV), respectively, of three different mobile phases into 12 mm × 32 mm vials. Similarly, at a flow rate of 1 mL/min and 10 s collection times, the system precisely dispensed mobile phase containing a [14C]-radiolabeled compound across an entire 96-well plate (% CV within 5.3%). Triplicate analyses of metabolism test samples containing [14C]buspirone and its metabolites, derived from three different matrices (plasma, urine and bile), indicated that the Collect PAL produced radioprofiles that were reproducible and comparable to the current technology; the % CVs for 9 selected peaks in the radioprofiles generated with the Collect PAL were within 9.3%. Radioprofiles generated by collecting into 96- and 384-well plates were qualitatively comparable.
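The precision figures quoted above are percent coefficients of variation (%CV) over replicate fractions. A minimal sketch of that computation, using invented replicate volumes rather than the paper's data:

```python
import statistics

def percent_cv(values):
    """Coefficient of variation as a percentage: 100 * stdev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate fraction volumes (uL) at 0.5 mL/min, 10 s collection.
fractions = [82.1, 83.4, 84.0, 82.8, 83.6, 83.0]
print(f"mean = {statistics.mean(fractions):.1f} uL, CV = {percent_cv(fractions):.2f}%")
```

A dispensing run would pass the 4.1% CV criterion above if this value stayed below that bound across all tested mobile phases.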

  20. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.

  1. A comparison of automated and manual radioimmunoassays for the estimation of serum digoxin

    International Nuclear Information System (INIS)

    Tuttlebee, J.W.

    1984-01-01

Digoxin was assayed manually (SPAC kit) and by an automated radioimmunoassay instrument (ARIA II). There was good correlation between the methods (r=0.964; n=120). Carryover by the automated method was marginally significant from low- to high-level samples (5.2%), but insignificant from high- to low-level samples (1.4%). The precision on the ARIA II was far better, despite the fact that assays were performed only in singlets compared with duplicates in the manual procedure. Best precision was obtained at the high level (3.76 nmol/l; between-batch CV: 2.4% ARIA; 5.5% SPAC). All values for control samples (manual and automated) fell within 6% of the all-method means. Using the ARIA II is marginally quicker and significantly cheaper. However, some back-up procedure is required for machine 'down time', since this can be an urgent investigation. Patient results correlated well with clinical data. (orig.)

  2. Automated analysis for nitrate by hydrazine reduction

    Energy Technology Data Exchange (ETDEWEB)

    Kamphake, L J; Hannah, S A; Cohen, J M

    1967-01-01

    An automated procedure for the simultaneous determinations of nitrate and nitrite in water is presented. Nitrite initially present in the sample is determined by a conventional diazotization-coupling reaction. Nitrate in another portion of sample is quantitatively reduced with hydrazine sulfate to nitrite which is then determined by the same diazotization-coupling reaction. Subtracting the nitrite initially present in the sample from that after reduction yields nitrite equivalent to nitrate initially in the sample. The rate of analysis is 20 samples/hr. Applicable range of the described method is 0.05-10 mg/l nitrite or nitrate nitrogen; however, increased sensitivity can be obtained by suitable modifications.
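The nitrate result in this method is obtained by difference: nitrite measured after hydrazine reduction minus the nitrite initially present. A sketch of that arithmetic, with hypothetical readings:

```python
def nitrate_from_difference(nitrite_initial, nitrite_after_reduction):
    """Nitrate-N equals the nitrite measured after quantitative hydrazine
    reduction minus the nitrite originally in the sample (mg/L as N)."""
    nitrate = nitrite_after_reduction - nitrite_initial
    if nitrate < 0:
        raise ValueError("reduced-sample reading below initial nitrite")
    return nitrate

# Hypothetical readings: 0.12 mg/L nitrite-N before reduction,
# 1.57 mg/L total nitrite-N after reduction of nitrate to nitrite.
print(round(nitrate_from_difference(0.12, 1.57), 2))  # nitrate-N, mg/L
```

Both readings come from the same diazotization-coupling reaction, which is why the subtraction is valid without any cross-calibration.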

  3. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  4. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessment to be made without the expenses associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in over 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once out of the reactor. This system retrieves samples from the tool, then dries, weighs and places them in labelled vials which are directed into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  5. An automated approach for annual layer counting in ice cores

    Directory of Open Access Journals (Sweden)

    M. Winstrup

    2012-11-01

A novel method for automated annual layer counting in seasonally resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of hidden Markov models (HMMs), which was originally developed for use in machine speech recognition. The strength of the layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on statistical criteria for annual layer identification. The most likely positions of multiple layer boundaries in a section of ice core data are determined simultaneously, and a probabilistic uncertainty estimate of the resulting layer count is provided, ensuring an objective treatment of ambiguous layers in the data. Furthermore, multiple data series can be incorporated and used simultaneously. In this study, the automated layer counting algorithm has been applied to two ice core records from Greenland: one displaying a distinct annual signal and one which is more challenging. The algorithm shows high skill in reproducing the results from manual layer counts, and the resulting timescale compares well to absolute-dated volcanic marker horizons where these exist.
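The HMM machinery behind this kind of layer detection can be illustrated with a toy two-state model (the paper's actual model is far richer): a "within-layer" state and a "boundary" state emit binned signal values, and the standard Viterbi algorithm recovers the most likely state path. All probabilities below are invented for the sketch:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state path for a discrete-observation HMM (log domain)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for t in range(1, len(obs)):
        V.append({})
        new_path = {}
        for s in states:
            prob, prev = max(
                (V[t-1][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][obs[t]]), p)
                for p in states)
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Toy setup: 'H'/'L' are binned high/low values of an annual signal;
# boundaries ('B') tend to coincide with low values, layer interiors ('W') with high.
states = ('W', 'B')
start_p = {'W': 0.9, 'B': 0.1}
trans_p = {'W': {'W': 0.8, 'B': 0.2}, 'B': {'W': 0.9, 'B': 0.1}}
emit_p  = {'W': {'H': 0.8, 'L': 0.2}, 'B': {'H': 0.1, 'L': 0.9}}
obs = ['H', 'H', 'L', 'H', 'H', 'L', 'H']
print(viterbi(obs, states, start_p, trans_p, emit_p))
# Counting 'B' states in the decoded path gives a crude annual-layer count.
```

Because decoding is probabilistic, ambiguous sections do not have to be forced into a single manual judgment; the posterior over paths supplies the uncertainty estimate the abstract mentions.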

  6. Rapid and automated on-line solid phase extraction HPLC-MS/MS with peak focusing for the determination of ochratoxin A in wine samples.

    Science.gov (United States)

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca

    2018-04-01

This study reports a fast and automated analytical procedure based on an on-line SPE-HPLC-MS/MS method for the automatic pre-concentration, clean-up and sensitive determination of OTA in wine. The amount of OTA contained in 100 μL of sample (pH ≅ 5.5) was retained and concentrated on an Oasis MAX SPE cartridge. After a washing step to remove matrix interferents, the analyte was eluted in back-flush mode, and the eluent from the SPE column was diluted through a mixing tee with an aqueous solution before the chromatographic separation, achieved on a monolithic column. The developed method has been validated according to EU regulation N. 519/2014 and applied for the analysis of 41 red and 17 white wines. The developed method features minimal sample handling, low solvent consumption, high sample throughput and low analysis cost, and provides accurate and highly selective results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. DEP-On-Go for Simultaneous Sensing of Multiple Heavy Metals Pollutants in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Madhu Biyani

    2016-12-01

We describe a simple and affordable “Disposable Electrode Printed (DEP)-On-Go” sensing platform for the rapid on-site monitoring of trace heavy metal pollutants in environmental samples for early warning, by developing a mobile electrochemical device composed of a palm-sized potentiostat and disposable unmodified screen-printed electrode chips. We present the analytical performance of our device for the sensitive detection of major heavy metal ions, namely, mercury, cadmium, lead, arsenic, zinc, and copper, with detection limits of 1.5, 2.6, 4.0, 5.0, 14.4, and 15.5 μg·L−1, respectively. Importantly, the utility of this device is extended to detect multiple heavy metals simultaneously with well-defined voltammograms and similar sensitivity. Finally, “DEP-On-Go” was successfully applied to detect heavy metals in real environmental samples from groundwater, tap water, house dust, soil, and industry-processed rice and noodle foods. We evaluated the efficiency of this system through a linear correlation with inductively coupled plasma mass spectrometry, and the results suggested that this system can be relied on for on-site screening purposes. On-field applications using real samples of groundwater for drinking in the northern parts of India support the easy-to-use, low-cost (<1 USD), rapid (within 5 min), and reliable detection limit (ppb levels) performance of our device for the on-site detection and monitoring of multiple heavy metals in resource-limited settings.

  8. Automated Radioanalytical Chemistry: Applications For The Laboratory And Industrial Process Monitoring

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Farawila, Anne F.; Grate, Jay W.

    2009-01-01

    The identification and quantification of targeted α- and β-emitting radionuclides via destructive analysis in complex radioactive liquid matrices is highly challenging. Analyses are typically accomplished at on- or off-site laboratories through laborious sample preparation steps and extensive chemical separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, alpha energy spectroscopy, mass spectrometry). Analytical results may take days or weeks to report. When an industrial-scale plant requires periodic or continuous monitoring of radionuclides as an indication of the composition of its feed stream, diversion of safeguarded nuclides, or of plant operational conditions (for example), radiochemical measurements should be rapid, but not at the expense of precision and accuracy. Scientists at Pacific Northwest National Laboratory have developed and characterized a host of automated radioanalytical systems designed to perform reproducible and rapid radioanalytical processes. Platforms have been assembled for (1) automation and acceleration of sample analysis in the laboratory and (2) automated monitors for monitoring industrial scale nuclear processes on-line with near-real time results. These methods have been applied to the analysis of environmental-level actinides and fission products to high-level nuclear process fluids. Systems have been designed to integrate a number of discrete sample handling steps, including sample pretreatment (e.g., digestion and valence state adjustment) and chemical separations. The systems have either utilized on-line analyte detection or have collected the purified analyte fractions for off-line measurement applications. One PNNL system of particular note is a fully automated prototype on-line radioanalytical system designed for the Waste Treatment Plant at Hanford, WA, USA. This system demonstrated nearly continuous destructive analysis of the soft β-emitting radionuclide 99Tc in nuclear

  9. Significant increase in cultivation of Gardnerella vaginalis, Alloscardovia omnicolens, Actinotignum schaalii, and Actinomyces spp. in urine samples with total laboratory automation.

    Science.gov (United States)

    Klein, Sabrina; Nurjadi, Dennis; Horner, Susanne; Heeg, Klaus; Zimmermann, Stefan; Burckhardt, Irene

    2018-04-13

While total laboratory automation (TLA) is well established in laboratory medicine, only a few microbiological laboratories are using TLA systems. Especially in terms of speed and accuracy, working with TLA is expected to be superior to conventional microbiology. We retrospectively compared a total of 35,564 microbiological urine cultures with and without incubation and processing with BD Kiestra TLA, for a 6-month period each. Sixteen thousand three hundred thirty-eight urine samples were analyzed in the pre-TLA period and 19,226 with TLA. Sixty-two percent (n = 10,101/16,338) of the cultures processed without TLA and 68% (n = 13,102/19,226) of the cultures processed with TLA showed growth. There were significantly more samples with two or more species per sample and with low numbers of colony-forming units (CFU) after incubation with TLA. Regarding the type of bacteria, there were comparable amounts of Enterobacteriaceae in the samples, slightly fewer non-fermenting Gram-negative bacteria, but significantly more Gram-positive cocci and Gram-positive rods. Especially Alloscardovia omnicolens, Gardnerella vaginalis, Actinomyces spp., and Actinotignum schaalii were significantly more abundant in the samples incubated and processed with TLA. The time to report was significantly lower for the TLA-processed samples, by 1.5 h. We provide the first report in Europe of a large number of urine samples processed with TLA. TLA showed enhanced growth of non-classical and rarely cultured bacteria from urine samples. Our findings suggest that previously underestimated bacteria may be relevant pathogens for urinary tract infections. Further studies are needed to confirm our findings.

  10. Voxel-based morphometry and automated lobar volumetry: The trade-off between spatial scale and statistical correction

    Science.gov (United States)

    Voormolen, Eduard H.J.; Wei, Corie; Chow, Eva W.C.; Bassett, Anne S.; Mikulis, David J.; Crawley, Adrian P.

    2011-01-01

Voxel-based morphometry (VBM) and automated lobar region of interest (ROI) volumetry are comprehensive and fast methods to detect differences in overall brain anatomy on magnetic resonance images. However, VBM and automated lobar ROI volumetry have detected dissimilar gray matter differences within identical image sets in our own experience and in previous reports. To gain more insight into how diverging results arise and to attempt to establish whether one method is superior to the other, we investigated how differences in spatial scale and in the need to statistically correct for multiple spatial comparisons influence the relative sensitivity of either technique to group differences in gray matter volumes. We assessed the performance of both techniques on a small dataset containing simulated gray matter deficits and additionally on a dataset of 22q11-deletion syndrome patients with schizophrenia (22q11DS-SZ) vs. matched controls. VBM was more sensitive to simulated focal deficits compared to automated ROI volumetry, and could detect global cortical deficits equally well. Moreover, theoretical calculations of VBM and ROI detection sensitivities to focal deficits showed that at increasing ROI size, ROI volumetry suffers more from loss in sensitivity than VBM. Furthermore, VBM and automated ROI volumetry found corresponding GM deficits in 22q11DS-SZ patients, except in the parietal lobe. Here, automated lobar ROI volumetry found a significant deficit only after a smaller subregion of interest was employed. Thus, sensitivity to focal differences is impaired relatively more by averaging over larger volumes in automated ROI methods than by the correction for multiple comparisons in VBM. These findings indicate that VBM is to be preferred over automated lobar-scale ROI volumetry for assessing gray matter volume differences between groups. PMID:19619660
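The dilution argument above can be illustrated with a back-of-the-envelope calculation (not the paper's actual model): if voxel noise is roughly independent, a fixed focal deficit averaged into an ever-larger ROI yields an effect size that falls off as the square root of the ROI's voxel count, while VBM pays only a (roughly fixed) multiple-comparison penalty. All numbers are invented:

```python
import math

def roi_effect_size(focal_deficit, voxel_sd, n_voxels):
    """Effect size of a fixed focal deficit measured at ROI level.
    Summing n independent voxels grows the noise as sqrt(n), so the
    effect size shrinks once the ROI outgrows the deficit's extent."""
    return focal_deficit / (voxel_sd * math.sqrt(n_voxels))

# Hypothetical numbers: a deficit spanning ~100 voxels, unit voxel noise.
for n in (100, 1_000, 10_000, 100_000):  # ROI sizes in voxels
    print(f"ROI of {n:>7} voxels -> effect size {roi_effect_size(5.0, 1.0, n):.3f}")
```

With these assumptions, growing the ROI from the deficit's own extent (100 voxels) to a lobe-sized region (100,000 voxels) cuts the detectable effect by a factor of sqrt(1000) ≈ 32, which is why the parietal deficit only surfaced with a smaller subregion.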

  11. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    Tobin, K.W.; Karnowski, T.P.; Gleason, S.S.; Jensen, D.; Lakhani, F.

    1999-01-01

To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. © 1999 American Vacuum Society

  12. JWST Associations overview: automated generation of combined products

    Science.gov (United States)

    Alexov, Anastasia; Swade, Daryl; Bushouse, Howard; Diaz, Rosa; Eisenhamer, Jonathan; Hack, Warren; Kyprianou, Mark; Levay, Karen; Rahmani, Christopher; Swam, Mike; Valenti, Jeff

    2018-01-01

We present the design of the James Webb Space Telescope (JWST) Data Management System (DMS) automated processing of Associations. An Association captures the relationship between exposures and higher-level data products, such as combined mosaics created from dithered and tiled observations. The astronomer's intent is captured within the Proposal Planning System (PPS) and provided to DMS as candidate associations. These candidates are converted into Association Pools and Association Generator Tables that serve as input to automated processing, which creates the combined data products. Association Pools are generated to capture a list of exposures that could potentially form associations and to provide relevant information about those exposures. The Association Generator, using grouping definitions, creates one or more Association Tables from a single input Association Pool. Each Association Table defines a set of exposures to be combined and the ruleset of the combination to be performed; the calibration software creates Associated data products based on these input tables. The initial design produces automated Associations within a proposal. Additionally, this overall JWST design is conducive to eventually producing Associations for observations from multiple proposals, similar to the Hubble Legacy Archive (HLA).

  13. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    Science.gov (United States)

    2014-01-01

    Background Computational methods for mining of biomedical literature can be useful in augmenting manual searches of the literature using keywords for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER, and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine significance of putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer
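The z-score ranking step described above can be sketched as a two-proportion test of how often a tagged entity appears in the positive (disease + biofluid) abstract set versus the negative set. The exact statistic used in the study is not specified in the abstract, and the counts below are invented for illustration:

```python
import math

def proportion_z(pos_hits, pos_total, neg_hits, neg_total):
    """Two-proportion z-score: is a candidate marker mentioned more often
    in disease+biofluid abstracts than in the matched negative set?"""
    p1 = pos_hits / pos_total
    p2 = neg_hits / neg_total
    p_pooled = (pos_hits + neg_hits) / (pos_total + neg_total)
    se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / pos_total + 1 / neg_total))
    return (p1 - p2) / se

# Hypothetical counts: a gene tagged in 120 of the 34,296 positive abstracts
# vs. 310 of the 2,653,396 negatives for a breast cancer + serum query.
z = proportion_z(120, 34_296, 310, 2_653_396)
print(f"z = {z:.1f}")
```

Ranking candidates by a score of this kind pushes terms that are merely common in all biomedical text (and thus frequent in both sets) down the list, which is the point of carrying the large negative corpus.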

  14. MITIE: Simultaneous RNA-Seq-based transcript identification and quantification in multiple samples.

    Science.gov (United States)

    Behr, Jonas; Kahles, André; Zhong, Yi; Sreedharan, Vipin T; Drewe, Philipp; Rätsch, Gunnar

    2013-10-15

    High-throughput sequencing of mRNA (RNA-Seq) has led to tremendous improvements in the detection of expressed genes and reconstruction of RNA transcripts. However, the extensive dynamic range of gene expression, technical limitations and biases, as well as the observed complexity of the transcriptional landscape, pose profound computational challenges for transcriptome reconstruction. We present the novel framework MITIE (Mixed Integer Transcript IdEntification) for simultaneous transcript reconstruction and quantification. We define a likelihood function based on the negative binomial distribution, use a regularization approach to select a few transcripts collectively explaining the observed read data and show how to find the optimal solution using Mixed Integer Programming. MITIE can (i) take advantage of known transcripts, (ii) reconstruct and quantify transcripts simultaneously in multiple samples, and (iii) resolve the location of multi-mapping reads. It is designed for genome- and assembly-based transcriptome reconstruction. We present an extensive study based on realistic simulated RNA-Seq data. When compared with state-of-the-art approaches, MITIE proves to be significantly more sensitive and overall more accurate. Moreover, MITIE yields substantial performance gains when used with multiple samples. We applied our system to 38 Drosophila melanogaster modENCODE RNA-Seq libraries and estimated the sensitivity of reconstructing omitted transcript annotations and the specificity with respect to annotated transcripts. Our results corroborate that a well-motivated objective paired with appropriate optimization techniques lead to significant improvements over the state-of-the-art in transcriptome reconstruction. MITIE is implemented in C++ and is available from http://bioweb.me/mitie under the GPL license.

  15. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve that process. The paper summarises a description of what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  16. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo

    2012-07-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".


  18. Virtual instrument automation of ion channeling setup for 1.7 MV tandetron accelerator

    International Nuclear Information System (INIS)

    Suresh, K.; Sundaravel, B.; Panigrahi, B.K.; Nair, K.G.M.; Viswanathan, B.

    2004-01-01

A virtual instrument based automated ion channeling experimental setup has been developed and implemented on a 1.7 MV tandetron accelerator. Automation of the PC-based setup is done using Windows-based virtual instrument software, allowing the setup to be easily ported between different computer operating systems. The virtual instrument software has been chosen to achieve quick and easy development of a versatile, multi-purpose, user-friendly graphical interface for carrying out channeling experiments. The software has a modular design to provide independent control of the stepper motors for fixing the sample at any user-defined orientation, running and on-line display of azimuthal and tilt angular scans, and auto storage of the angular scan data. Using this automated setup, the crystallographic axis of the sample can be aligned with the incident ion beam rapidly, minimizing beam damage to the sample. A standard single-crystalline GaAs(100) sample has been characterized with this setup using a 2 MeV He++ ion beam. The crystalline quality (χ_min) and channeling half angle (ψ_1/2) are measured to be 3.7% and 0.48°, respectively, which are close to the theoretical values. Salient features, working principles and design details of the automated setup are discussed in this paper.

  19. Feasibility Study of Neutron Multiplicity Assay for a Heterogeneous Sludge Sample containing Na, Pu and other Impurities

    International Nuclear Information System (INIS)

    Nakamura, H.; Nakamichi, H.; Mukai, Y.; Yoshimoto, K.; Beddingfield, D.H.

    2010-01-01

    To reduce the radioactivity of liquid waste generated at PCDF, radioactive nuclides are precipitated by neutralization with sodium hydroxide. We call the calcined precipitate 'sludge'. The Pu mass in the sludge is normally determined by sampling and DA within the required uncertainty on DIQ. The annual yield of the mass is small, but it accumulates and reaches a few kilograms, so it is declared as retained waste and verified at PIV. An HM-5-based verification is applied for sludge verification. The sludge contains many chemical components: for example, Pu (~10 wt%), U, Am, stainless-steel components, halogens, NaNO₃ (the main component), residual NaOH, and moisture, mixed together as an impure heterogeneous sludge sample. As a result, there is a large uncertainty in the sampling and DA currently used at PCDF. In order to improve the material accounting, we performed a feasibility study using neutron multiplicity assay for impure sludge samples. We measured selected sludge samples using a multiplicity counter called FCAS (Fast Carton Assay System), designed by JAEA and Canberra. The PCDF sludge materials fall into the category of 'difficult to measure' because of the high levels of impurities, high alpha values and somewhat small Pu mass. For the sludge measurements, it was confirmed that good consistency could be obtained between the Pu mass in a pure sludge standard (PuO₂-Na₂U₂O₇, alpha = 7) and the DA. For unknown samples, using 14-hour measurements, we could obtain quite low statistical uncertainty on the Doubles (~1%) and Triples (~10%) count rates, although the alpha value was extremely high (15-25) and the FCAS efficiency (40%) was relatively low for a typical multiplicity counter. Despite the detector efficiency challenges and the material challenges (high alpha, low Pu mass, heterogeneous matrix), we have been able to obtain assay results that greatly exceed the accountancy requirements for retained waste materials.
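    The 14-hour runs quoted above trade counting time for precision: for counting statistics, the relative uncertainty of a rate scales as 1/sqrt(t). A small helper (illustrative numbers only, not the paper's data) estimates how long one would need to count to reach a target precision on the Doubles or Triples rate:

```python
def required_time(t_ref_h, rel_unc_ref, rel_unc_target):
    """Measurement time (hours) needed so that the relative statistical
    uncertainty falls from rel_unc_ref, achieved in t_ref_h hours, to
    rel_unc_target, assuming 1/sqrt(t) scaling of counting statistics."""
    return t_ref_h * (rel_unc_ref / rel_unc_target) ** 2

# Triples at ~10% after 14 h -> time needed to reach 5%:
print(required_time(14.0, 0.10, 0.05))   # 56.0 hours (4x longer for 2x better)
```

    The quadratic cost of extra precision is why the ~10% Triples uncertainty at 14 hours was accepted for these high-alpha, low-mass samples.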

  20. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)