WorldWideScience

Sample records for multiple automated sample

  1. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    Science.gov (United States)

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the quantity of cultures feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  2. Microassay for interferon, using [3H]uridine, microculture plates, and a multiple automated sample harvester.

    Science.gov (United States)

    Richmond, J Y; Polatnick, J; Knudsen, R C

    1980-01-01

    A microassay for interferon is described which uses target cells grown in microculture wells, [3H]uridine to measure vesicular stomatitis virus replication in target cells, and a multiple automated sample harvester to collect the radioactively labeled viral ribonucleic acid onto glass fiber filter disks. The disks were placed in minivials, and radioactivity was counted in a liquid scintillation spectrophotometer. Interferon activity was calculated as the reciprocal of the highest dilution which inhibited the incorporation of [3H]uridine into viral ribonucleic acid by 50%. Interferon titers determined by the microassay were similar to those obtained by the plaque reduction assay when 100 plaque-forming units of challenge vesicular stomatitis virus were used. However, it was found that the interferon titers decreased approximately 2-fold for each 10-fold increase in the concentration of challenge vesicular stomatitis virus when tested in the range of 10² to 10⁵ plaque-forming units. Interferon titers determined by the microassay show a high degree of repeatability, and the assay can be used to measure small and large numbers of interferon samples. PMID:6155105
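
    The endpoint arithmetic in this abstract is simple enough to sketch. The fragment below is an illustration, not the authors' code: the well layout, counts, and names are assumed. It reports the titer as the reciprocal of the highest dilution that still suppresses [3H]uridine incorporation by at least 50% relative to the virus-only and uninfected controls.

```python
# A minimal sketch of the 50%-inhibition endpoint calculation, assuming
# counts-per-minute (cpm) readings from the harvested filter disks.
# Dilution values and variable names are illustrative.

def interferon_titer(dilutions, sample_cpm, virus_cpm, background_cpm):
    """Return the reciprocal of the highest interferon dilution that
    inhibits [3H]uridine incorporation into viral RNA by at least 50%.

    dilutions      -- reciprocal dilution factors, e.g. [10, 100, 1000]
    sample_cpm     -- cpm of interferon-treated, virus-challenged wells
    virus_cpm      -- cpm of the virus-only control (0% inhibition)
    background_cpm -- cpm of the uninfected-cell control (100% inhibition)
    """
    titer = None
    for dilution, cpm in zip(dilutions, sample_cpm):
        inhibition = (virus_cpm - cpm) / (virus_cpm - background_cpm)
        if inhibition >= 0.5:
            titer = max(titer or 0, dilution)
    return titer

# Example: inhibition drops below 50% between the 1:100 and 1:1000
# dilutions, so the titer is reported as 100.
print(interferon_titer([10, 100, 1000], [500, 900, 4500], 5000, 200))
```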

  3. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogeneous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid-controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than those from tissue specimens prepared by the inkjet deposition method.

  4. Automated multiple failure FMEA

    International Nuclear Information System (INIS)

    Price, C.J.; Taylor, N.S.

    2002-01-01

    Failure mode and effects analysis (FMEA) is typically performed by a team of engineers working together. In general, they will only consider single point failures in a system. Consideration of all possible combinations of failures is impractical for all but the simplest example systems. Even if the task of producing the FMEA report for the full multiple failure scenario were automated, it would still be impractical for the engineers to read, understand and act on all of the results. This paper shows how approximate failure rates for components can be used to select the most likely combinations of failures for automated investigation using simulation. The important information can be automatically identified from the resulting report, making it practical for engineers to study and act on the results. The strategy described in the paper has been applied to a range of electrical subsystems, and the results have confirmed that the strategy described here works well for realistically complex systems.
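
    The selection step described here is essentially a probability-ranked cut through the combination space. A hedged sketch of the idea, assuming independent failures and made-up component rates (the paper does not publish its data), looks like this:

```python
from itertools import combinations

# Illustrative pruning of multiple-failure combinations: rank candidate
# failure sets by approximate joint probability and keep only those above
# a cutoff for simulation. Rates, names, and the cutoff are assumptions.

failure_rates = {"relay_K1": 1e-4, "fuse_F2": 5e-5,
                 "motor_M1": 2e-4, "switch_S3": 1e-3}

def likely_combinations(rates, max_order=2, cutoff=1e-8):
    """Return failure combinations (up to max_order simultaneous faults),
    ranked by joint probability under an independence assumption."""
    candidates = []
    for order in range(1, max_order + 1):
        for combo in combinations(rates, order):
            p = 1.0
            for part in combo:
                p *= rates[part]
            if p >= cutoff:
                candidates.append((p, combo))
    return sorted(candidates, reverse=True)

for p, combo in likely_combinations(failure_rates):
    print(f"{p:.2e}  {combo}")  # feed each combo to the FMEA simulator
```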

  5. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics, as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  6. Possibilities for automating coal sampling

    Energy Technology Data Exchange (ETDEWEB)

    Helekal, J; Vankova, J

    1987-11-01

    Outlines sampling equipment in use (AVR-, AVP-, AVN- and AVK-series samplers and RDK- and RDH-series separators produced by the Coal Research Institute, Ostrava; extractors, crushers and separators produced by ORGREZ). The Ostrava equipment covers bituminous coal needs while ORGREZ provides equipment for energy coal requirements. This equipment is designed to handle coal up to 200 mm in size at a throughput of up to 1200 t/h. Automation of sampling equipment is foreseen.

  7. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  8. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Keyser, John

    2013-01-01

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation...

  9. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.
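
    As an illustration of sampling the overall flow at a user-chosen scale, one can box-average the velocity field onto a coarse grid and blend successive samples in time. This is a toy sketch under stated assumptions (2D field, fixed block averaging, exponential blending), not the paper's algorithm:

```python
import numpy as np

# Capture the "overall flow" of a 2D velocity field at a user-chosen
# scale by averaging over scale x scale blocks, then smooth samples over
# time with temporal blending. Grid shapes and alpha are assumptions.

def sample_flow(velocity, scale):
    """Average a (H, W, 2) velocity field over scale x scale blocks."""
    h, w, _ = velocity.shape
    coarse = velocity[:h - h % scale, :w - w % scale]
    coarse = coarse.reshape(h // scale, scale, w // scale, scale, 2)
    return coarse.mean(axis=(1, 3))

def blend(prev_sample, new_sample, alpha=0.2):
    """Exponential temporal blending for smooth keyframe matching."""
    return (1 - alpha) * prev_sample + alpha * new_sample

field = np.random.rand(64, 64, 2)      # stand-in for a simulation frame
coarse = sample_flow(field, scale=16)  # 4 x 4 x 2 summary of overall flow
```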

  10. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar Brunborg

    2014-10-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  11. Automated blood-sample handling in the clinical laboratory.

    Science.gov (United States)

    Godolphin, W; Bodtker, K; Uyeno, D; Goh, L O

    1990-09-01

    The only significant advances in blood-taking in 25 years have been the disposable needle and evacuated blood-drawing tube. With the exception of a few isolated barcode experiments, most sample-tracking is performed through handwritten or computer-printed labels. Attempts to reduce the hazards of centrifugation have resulted in air-tight lids or chambers, the use of which is time-consuming and cumbersome. Most commonly used clinical analyzers require serum or plasma, distributed into specialized containers, unique to that analyzer. Aliquots for different tests are prepared by handpouring or pipetting. Moderate to large clinical laboratories perform so many different tests that even multi-analyzers performing multiple analyses on a single sample may account for only a portion of all tests ordered for a patient. Thus several aliquots of each specimen are usually required. We have developed a proprietary serial centrifuge and blood-collection tube suitable for incorporation into an automated or robotic sample-handling system. The system we propose is (a) safe--avoids or prevents biological danger to the many "handlers" of blood; (b) small--minimizes the amount of sample taken and space required to adapt to the needs of satellite and mobile testing, and direct interfacing with analyzers; (c) serial--permits each sample to be treated according to its own "merits," optimizes throughput, and facilitates flexible automation; and (d) smart--ensures quality results through monitoring and intelligent control of patient identification, sample characteristics, and separation process.

  12. Microcomputer automation of the sample probe of an ESCA spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J W; Downey, R M; Schrawyer, L R; Meisenheimer, R G [California Univ., Livermore (USA). Lawrence Livermore Lab.]

    1979-06-01

    Many of the currently available ESCA spectrometers can be obtained with fully automated control and data acquisition systems as well as a variety of algorithms for data processing. The majority of these computer-controlled functions, however, are directed toward the analysis of a single physical sample. Since many analyses require the collection of data for several hours for the purpose of ensemble averaging to enhance the signal-to-noise ratio, overnight operation is a common practice. It is during these periods that a capability for multiple sample analyses would be most useful. The subject instrument for the automation was a Hewlett-Packard Model 5950A ESCA spectrometer which was supplied with a Hewlett-Packard 5952A Data System consisting of a Hewlett-Packard 2100A small computer and a Hewlett-Packard 85001A Magnetic Tape Cassette I/O unit. A microprocessor-based system was used to implement this automation. The system uses the high-level language BASIC for virtually all control functions, thus providing the obvious advantages of rapid programming and ease of software modification over an assembly language, although more ROM is required.

  13. Mindboggle: Automated brain labeling with multiple atlases

    International Nuclear Information System (INIS)

    Klein, Arno; Mensh, Brett; Ghosh, Satrajit; Tourville, Jason; Hirsch, Joy

    2005-01-01

    To make inferences about brain structures or activity across multiple individuals, one first needs to determine the structural correspondences across their image data. We have recently developed Mindboggle as a fully automated, feature-matching approach to assign anatomical labels to cortical structures and activity in human brain MRI data. Label assignment is based on structural correspondences between labeled atlases and unlabeled image data, where an atlas consists of a set of labels manually assigned to a single brain image. In the present work, we study the influence of using variable numbers of individual atlases to nonlinearly label human brain image data. Each brain image voxel of each of 20 human subjects is assigned a label by each of the remaining 19 atlases using Mindboggle. The most common label is selected and is given a confidence rating based on the number of atlases that assigned that label. The automatically assigned labels for each subject brain are compared with the manual labels for that subject (its atlas). Unlike recent approaches that transform subject data to a labeled, probabilistic atlas space (constructed from a database of atlases), Mindboggle labels a subject by each atlas in a database independently. When Mindboggle labels a human subject's brain image with at least four atlases, the resulting label agreement with coregistered manual labels is significantly higher than when only a single atlas is used. Different numbers of atlases provide significantly higher label agreements for individual brain regions. Increasing the number of reference brains used to automatically label a human subject brain improves labeling accuracy with respect to manually assigned labels. Mindboggle software can provide confidence measures for labels based on probabilistic assignment of labels and could be applied to large databases of brain images.
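
    The voting scheme described here (the most common label per voxel, with a confidence rating from the number of agreeing atlases) reduces to a few lines. The sketch below is an illustrative reimplementation under assumed array shapes, not Mindboggle itself:

```python
import numpy as np

# Majority-vote label fusion: each atlas proposes a label per voxel; the
# most common label wins, and the fraction of agreeing atlases serves as
# the confidence rating described in the abstract.

def fuse_labels(atlas_labels):
    """atlas_labels: (n_atlases, n_voxels) integer label assignments.
    Returns (majority_label, confidence) per voxel."""
    n_atlases, n_voxels = atlas_labels.shape
    majority = np.empty(n_voxels, dtype=atlas_labels.dtype)
    confidence = np.empty(n_voxels)
    for v in range(n_voxels):
        values, counts = np.unique(atlas_labels[:, v], return_counts=True)
        winner = counts.argmax()
        majority[v] = values[winner]
        confidence[v] = counts[winner] / n_atlases
    return majority, confidence

votes = np.array([[1, 2], [1, 2], [3, 2]])  # 3 atlases, 2 voxels
print(fuse_labels(votes))  # labels [1 2], confidence [0.67 1.0]
```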

  14. Automated bar coding of air samples at Hanford (ABCASH)

    International Nuclear Information System (INIS)

    Troyer, G.L.; Brayton, D.D.; McNeece, S.G.

    1992-10-01

    This article describes the basis, main features and benefits of an automated system for tracking and reporting radioactive air particulate samples. The system was developed due to a recognized need for improving the quality and integrity of air sample data related to personnel and environmental protection. The capture, storage, and retrieval of air sample data are described. The automation of the associated data input eliminates a large potential for human error. The system utilizes personal computers, handheld computers, a commercial personal computer database package, commercial programming languages, and complete documentation to satisfy the system's automation objective.

  15. Sample handling and transport for the Secure Automated Fabrication line

    International Nuclear Information System (INIS)

    Sherrell, D.L.; Jensen, J.D.; Genoway, G.G.; Togesen, H.J.

    1983-06-01

    A totally automated system is described which packages, transports, receives, and unpackages sintered plutonium/uranium oxide fuel pellet samples to support automated chemical analysis equipment for the Secure Automated Fabrication (SAF) line. Samples are transferred 100 meters from the fuel production line to a different floor of the facility where automatic determinations are made for purposes of process control and fuel quality certification. The system automatically records identification numbers, net weights sent and received, and all other pertinent information such as fuel lot number, sample point, date, and time of day.

  16. Automated Registration Of Images From Multiple Sensors

    Science.gov (United States)

    Rignot, Eric J. M.; Kwok, Ronald; Curlander, John C.; Pang, Shirley S. N.

    1994-01-01

    Images of terrain scanned in common by multiple Earth-orbiting remote sensors registered automatically with each other and, where possible, on geographic coordinate grid. Simulated image of terrain viewed by sensor computed from ancillary data, viewing geometry, and mathematical model of physics of imaging. In proposed registration algorithm, simulated and actual sensor images matched by area-correlation technique.
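
    A minimal sketch of the area-correlation matching step follows. It is illustrative only: the actual algorithm matches a simulated image derived from ancillary data and imaging physics against the sensor image, while this toy does a brute-force search for the offset maximizing normalized cross-correlation.

```python
import numpy as np

# Brute-force area correlation: slide a (simulated) patch over the sensor
# image and pick the offset with the highest normalized cross-correlation.

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def register(simulated_patch, sensor_image):
    ph, pw = simulated_patch.shape
    ih, iw = sensor_image.shape
    best = (0.0, (0, 0))
    for r in range(ih - ph + 1):
        for c in range(iw - pw + 1):
            score = ncc(simulated_patch, sensor_image[r:r + ph, c:c + pw])
            if score > best[0]:
                best = (score, (r, c))
    return best  # (correlation, (row, col) offset)
```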

  17. Computer automated mass spectrometer for isotope analysis on gas samples

    International Nuclear Information System (INIS)

    Pamula, A.; Kaucsar, M.; Fatu, C.; Ursu, D.; Vonica, D.; Bendea, D.; Muntean, F.

    1998-01-01

    A low resolution, high precision instrument was designed and realized in the mass spectrometry laboratory of the Institute of Isotopic and Molecular Technology, Cluj-Napoca. The paper presents the vacuum system, the sample inlet system, the ion source, the magnetic analyzer and the ion collector. The instrument is almost completely automated. The analog-to-digital conversion circuits, the local control microcomputer, the automation systems and the performance checking are described. (authors)

  18. On-line Automated Sample Preparation-Capillary Gas Chromatography for the Analysis of Plasma Samples.

    NARCIS (Netherlands)

    Louter, A.J.H.; van der Wagt, R.A.C.A.; Brinkman, U.A.T.

    1995-01-01

    An automated sample preparation module, (the automated sample preparation with extraction columns, ASPEC), was interfaced with a capillary gas chromatograph (GC) by means of an on-column interface. The system was optimised for the determination of the antidepressant trazodone in plasma. The clean-up

  19. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
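
    Here is a hypothetical sketch of the trigger logic this abstract describes. The thresholds, the simulated probes, and the step names are all assumptions for illustration; only the low/high/high conductivity sequence comes from the text.

```python
from itertools import chain, repeat

LOW, HIGH = 5.0, 200.0  # conductivity thresholds, arbitrary units (assumed)

def run_protocol(read_inlet, read_outlet):
    """Advance through the desalting steps; each step completes only when
    both probe readings satisfy its conductivity condition."""
    steps = [
        ("neutralize (water flush)", lambda c: c < LOW),
        ("acidify (strong acid)",    lambda c: c > HIGH),
        ("elute (high-pH base)",     lambda c: c > HIGH),
    ]
    for name, done in steps:
        while True:
            cin, cout = read_inlet(), read_outlet()
            if done(cin) and done(cout):
                break  # a real controller would actuate valves here
        print(f"trigger: '{name}' complete, starting next step")

# Simulated probes that converge to the commanded state after a few reads:
inlet = chain([150.0, 40.0, 2.0], repeat(300.0))
outlet = chain([180.0, 60.0, 3.0], repeat(300.0))
run_protocol(lambda: next(inlet), lambda: next(outlet))
```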

  20. An Automated Sample Processing System for Planetary Exploration

    Science.gov (United States)

    Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther

    2012-01-01

    An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate TRL level 4. This paper describes the function of the system, mechanism design, lessons learned, and several challenges that were overcome.

  1. Automated platform for designing multiple robot work cells

    Science.gov (United States)

    Osman, N. S.; Rahman, M. A. A.; Rahman, A. A. Abdul; Kamsani, S. H.; Bali Mohamad, B. M.; Mohamad, E.; Zaini, Z. A.; Rahman, M. F. Ab; Mohamad Hatta, M. N. H.

    2017-06-01

    Designing multiple robot work cells is a knowledge-intensive, intricate, and time-consuming process. This paper elaborates the development of a computer-aided design program for generating multiple robot work cells through a user-friendly interface. The primary purpose of this work is to provide a fast and easy platform that reduces cost and human involvement, with minimal trial-and-error adjustment. The automated platform is constructed based on the variant-shaped configuration concept with its mathematical model. A robot work cell layout, system components, and the construction procedure of the automated platform are discussed in this paper, where integration of these items automatically provides the optimum robot work cell design according to the information set by the user. This system is implemented on top of CATIA V5 software and utilises its Part Design, Assembly Design, and Macro tools. The current outcomes of this work provide a basis for future investigation into developing a flexible configuration system for multiple robot work cells.

  2. An automated blood sampling system used in positron emission tomography

    International Nuclear Information System (INIS)

    Eriksson, L.; Bohm, C.; Kesselberg, M.

    1988-01-01

    Fast dynamic function studies with positron emission tomography (PET), has the potential to give accurate information of physiological functions of the brain. This capability can be realised if the positron camera system accurately quantitates the tracer uptake in the brain with sufficiently high efficiency and in sufficiently short time intervals. However, in addition, the tracer concentration in blood, as a function of time, must be accurately determined. This paper describes and evaluates an automated blood sampling system. Two different detector units are compared. The use of the automated blood sampling system is demonstrated in studies of cerebral blood flow, in studies of the blood-brain barrier transfer of amino acids and of the cerebral oxygen consumption. 5 refs.; 7 figs

  3. Automating Groundwater Sampling At Hanford, The Next Step

    International Nuclear Information System (INIS)

    Connell, C.W.; Conley, S.F.; Hildebrand, R.D.; Cunningham, D.E.

    2010-01-01

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very 'people intensive.' Approximately 1500 wells are sampled each year by field personnel or 'samplers.' These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from two official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the metadata associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process where the data is transferred to the database and the electronic form is filed in managed records - thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  4. Indigenous development of automated metallographic sample preparation system

    International Nuclear Information System (INIS)

    Kulkarni, A.P.; Pandit, K.M.; Deshmukh, A.G.; Sahoo, K.C.

    2005-01-01

    Surface preparation of specimens for metallographic studies on irradiated material involves a lot of remote handling of radioactive material by skilled manpower. These are laborious and man-rem intensive activities and put limitations on the number of samples that can be prepared for metallographic studies. To overcome these limitations, automated systems have been developed for surface preparation of specimens in the PIE Division. The system includes (i) grinding and polishing stations, (ii) a water jet cleaning station, (iii) ultrasonic cleaning stations, (iv) a drying station, (v) a sample loading and unloading station, (vi) a dispenser for slurries and diluents, and (vii) an automated head for movement of the sample holder disc from one station to another. The system allows the operator to program or change the sequence of sample preparation, including remote changing of grinding/polishing discs at the stations. Two such systems have been installed and commissioned in the hot cells of the PIE Division. These are being used for preparation of irradiated samples from nuclear fuels and structural components. This development has increased the throughput of metallography work and yields savings in radiation exposure (man-sieverts) to operators. This presentation will provide details of the challenges in undertaking this developmental work. (author)

  5. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    Science.gov (United States)

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
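
    For context, the conventional EDS reference state that this abstract builds on (the accelerated variant modifies how this reference is constructed and parameterized) is the smooth envelope of the N end-state Hamiltonians H_i with energy offsets E_i and smoothness parameter s. A numerically stable evaluation of that standard formula, sketched here as an illustration rather than the paper's A-EDS construction:

```python
import numpy as np
from scipy.special import logsumexp

# Conventional EDS reference-state Hamiltonian:
#   H_ref = -(1 / (beta * s)) * ln( sum_i exp(-beta * s * (H_i - E_i)) )
# H and E_off hold per-end-state energies and offsets for one configuration.

def eds_reference_energy(H, E_off, beta, s=1.0):
    H = np.asarray(H, dtype=float)
    E_off = np.asarray(E_off, dtype=float)
    return -logsumexp(-beta * s * (H - E_off)) / (beta * s)

# Near the minimum of end state 0, the envelope tracks H_0 - E_0:
print(eds_reference_energy([-50.0, 120.0], [0.0, 30.0], beta=0.4))  # ~ -50
```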

  6. Monte Carlo sampling of fission multiplicity.

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J. S. (John S.)

    2004-01-01

    Two new methods have been developed for fission multiplicity modeling in Monte Carlo calculations. The traditional method of sampling neutron multiplicity from fission is to sample the number of neutrons above or below the average. For example, if there are 2.7 neutrons per fission, three would be chosen 70% of the time and two would be chosen 30% of the time. For many applications, particularly ³He coincidence counting, a better estimate of the true number of neutrons per fission is required. Generally, this number is estimated by sampling a Gaussian distribution about the average. However, because the tail of the Gaussian distribution is negative and negative neutrons cannot be produced, a slight positive bias can be found in the average value. For criticality calculations, the result of rejecting the negative neutrons is an increase in k_eff of 0.1% in some cases. For spontaneous fission, where the average number of neutrons emitted from fission is low, the error also can be unacceptably large. If the Gaussian width approaches the average number of fissions, 10% too many fission neutrons are produced by not treating the negative Gaussian tail adequately. The first method to treat the Gaussian tail is to determine a correction offset, which then is subtracted from all sampled values of the number of neutrons produced. This offset depends on the average value for any given fission at any energy and must be computed efficiently at each fission from the non-integrable error function. The second method is to determine a corrected zero point so that all neutrons sampled between zero and the corrected zero point are killed to compensate for the negative Gaussian tail bias. Again, the zero point must be computed efficiently at each fission. Both methods give excellent results with a negligible computing time penalty. It is now possible to include the full effects of fission multiplicity without the negative Gaussian tail bias.
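
    The second correction (the corrected zero point) can be made concrete. The sketch below is an illustration of the idea, not the production Monte Carlo code: it solves for the kill threshold z so that zeroing all samples below z restores the intended mean multiplicity.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

# Corrected-zero-point sampling: draw x ~ N(nu_bar, sigma^2) and kill
# (set to zero) every sample below z, where z is chosen so that
# E[x * 1{x > z}] = nu_bar, cancelling the negative-tail bias.

def corrected_zero_point(nu_bar, sigma):
    def bias(z):
        a = (z - nu_bar) / sigma
        return nu_bar * (1.0 - norm.cdf(a)) + sigma * norm.pdf(a) - nu_bar
    return brentq(bias, nu_bar - 10 * sigma, nu_bar)

def sample_multiplicity(nu_bar, sigma, rng, size=1):
    z = corrected_zero_point(nu_bar, sigma)
    x = rng.normal(nu_bar, sigma, size)
    x[x < z] = 0.0  # kill the low tail, not just the negative samples
    return np.rint(x).astype(int)

rng = np.random.default_rng(7)
draws = sample_multiplicity(2.7, 1.0, rng, 100000)
print(draws.mean())  # approximately 2.7, bias removed
```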

  7. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    Science.gov (United States)

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression capturing global and local information to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.

  8. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    International Nuclear Information System (INIS)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F.; Prasanna, P.G.S.

    2007-01-01

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and medical

  9. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    Energy Technology Data Exchange (ETDEWEB)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States); Prasanna, P.G.S. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States)], E-mail: prasanna@afrri.usuhs.mil

    2007-07-15

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and
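
    The chain-of-custody mechanism described above (barcode scans at critical steps flowing into a central LIMS) can be caricatured in a few lines. Everything in this sketch, the schema, the step names, and the barcode format, is a hypothetical stand-in for the customized LIMS:

```python
import sqlite3
import time

# Toy chain-of-custody log: every barcode scan at a critical processing
# step appends a timestamped row keyed by the sample's barcode.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE IF NOT EXISTS custody (
    barcode TEXT, step TEXT, station TEXT, scanned_at REAL)""")

def log_scan(barcode, step, station):
    conn.execute("INSERT INTO custody VALUES (?, ?, ?, ?)",
                 (barcode, step, station, time.time()))
    conn.commit()

log_scan("SAMPLE-000123", "blood_receipt", "intake-1")
log_scan("SAMPLE-000123", "metaphase_harvest", "harvester-1")
for row in conn.execute("SELECT * FROM custody WHERE barcode = ?",
                        ("SAMPLE-000123",)):
    print(row)  # full processing history for a triage audit
```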

  10. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π₁, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π₁ is replaced by a Harris ergodic Markov chain with invariant density π₁, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π₁, …, πₖ, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
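
    For readers unfamiliar with the estimator this abstract builds on, here is the single-density iid case in miniature. It is a toy with assumed densities (standard-normal proposal, shifted-normal target), not the multiple-chain regenerative construction of the paper:

```python
import numpy as np

# Self-normalized importance sampling: use draws from pi_1 to estimate an
# expectation under pi, weighting each draw by the density ratio pi/pi_1
# (known here only up to normalizing constants).

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100000)  # sample from pi_1 = N(0, 1)

def w(x):
    log_pi = -0.5 * (x - 1.0) ** 2  # pi proportional to N(1, 1)
    log_pi1 = -0.5 * x ** 2
    return np.exp(log_pi - log_pi1)

f = lambda x: x ** 2
estimate = np.sum(w(x) * f(x)) / np.sum(w(x))
print(estimate)  # E_pi[X^2] = 2 for N(1, 1): variance 1 plus mean^2 1
```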

  11. Automated high-resolution NMR with a sample changer

    International Nuclear Information System (INIS)

    Wade, C.G.; Johnson, R.D.; Philson, S.B.; Strouse, J.; McEnroe, F.J.

    1989-01-01

    Within the past two years, it has become possible to obtain high-resolution NMR spectra using automated commercial instrumentation. Software control of all spectrometer functions has reduced most of the tedious manual operations to typing a few computer commands or even making selections from a menu. Addition of an automatic sample changer is the next natural step in improving efficiency and sample throughput; it has a significant (and even unexpected) impact on how NMR laboratories are run and how NMR is taught. Such an instrument makes even sophisticated experiments routine, so that people with no previous exposure to NMR can run these experiments after a training session of an hour or less. This A/C Interface examines the impact of such instrumentation on both the academic and the industrial laboratory.

  12. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions where technical variations can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intraday and interday coefficients of variation (CVs) were determined for 5 serum and 5 plasma samples over 5 days; samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. An automated trypsin digestion workflow yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.
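
    The reproducibility metric quoted throughout, the coefficient of variation across replicate preparations, is computed as below; the peak-area values are made up for illustration:

```python
import numpy as np

# Coefficient of variation (CV, %) of a peptide's peak areas across
# replicate sample preparations, the acceptance metric in the abstract.

def cv_percent(values):
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

replicate_areas = [1.02e6, 0.97e6, 1.05e6, 0.99e6, 1.01e6]
print(f"CV = {cv_percent(replicate_areas):.1f}%")  # well below 20%
```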

  13. Portable Automation of Static Chamber Sample Collection for Quantifying Soil Gas Flux

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Morgan P.; Groh, Tyler A.; Parkin, Timothy B.; Williams, Ryan J.; Isenhart, Thomas M.; Hofmockel, Kirsten S.

    2018-01-01

    Quantification of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled is limited by the spacing between chambers and the availability of trained research technicians. An automated system for collecting gas samples from chambers in the field would eliminate the need for personnel to return to the chamber during a flux measurement period and would allow a single technician to sample multiple chambers simultaneously. This study describes Chamber Automated Sampling Equipment (FluxCASE) to collect and store chamber headspace gas samples at assigned time points for the measurement of soil gas flux. The FluxCASE design and operation is described, and the accuracy and precision of the FluxCASE system is evaluated. In laboratory measurements of nitrous oxide (N2O), carbon dioxide (CO2), and methane (CH4) concentrations of a standardized gas mixture, coefficients of variation associated with automated and manual sample collection were comparable, indicating no loss of precision. In the field, soil gas fluxes measured from FluxCASEs were in agreement with manual sampling for both N2O and CO2. Slopes of regression equations were 1.01 for CO2 and 0.97 for N2O. The 95% confidence limits of the slopes of the regression lines included the value of one, indicating no bias. Additionally, an expense analysis found a cost recovery ranging from 0.6 to 2.2 yr. Implementing the FluxCASE system is an alternative to improve the efficiency of the static chamber method for measuring soil gas flux while maintaining the accuracy and precision of manual sampling.
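
    The fluxes compared here come from regressing headspace concentration against time for each chamber. A generic sketch of that calculation, with illustrative constants and units rather than the FluxCASE firmware:

```python
import numpy as np

# Static-chamber flux estimate: regress gas concentration on time and
# scale the slope by chamber volume over footprint area. Conversion to a
# mass flux (e.g., via the ideal gas law) is omitted for brevity.

def chamber_flux(times_min, conc_ppm, volume_L, area_m2):
    """Return flux in ppm * L / (m^2 * min)."""
    slope, _ = np.polyfit(times_min, conc_ppm, 1)  # d[conc]/dt
    return slope * volume_L / area_m2

t = [0, 15, 30, 45]             # sampling times set on the FluxCASE, min
n2o = [0.33, 0.41, 0.48, 0.56]  # headspace N2O, ppm (made-up readings)
print(chamber_flux(t, n2o, volume_L=12.0, area_m2=0.049))
```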

  14. Automated detection of multiple sclerosis lesions in serial brain MRI

    International Nuclear Information System (INIS)

    Llado, Xavier; Ganiler, Onur; Oliver, Arnau; Marti, Robert; Freixenet, Jordi; Valls, Laia; Vilanova, Joan C.; Ramio-Torrenta, Lluis; Rovira, Alex

    2012-01-01

    Multiple sclerosis (MS) is a serious disease typically occurring in the brain whose diagnosis and efficacy of treatment monitoring are vital. Magnetic resonance imaging (MRI) is frequently used in serial brain imaging due to the rich and detailed information provided. Time-series analysis of images is widely used for MS diagnosis and patient follow-up. However, conventional manual methods are time-consuming, subjective, and error-prone. Thus, the development of automated techniques for the detection and quantification of MS lesions is a major challenge. This paper presents an up-to-date review of the approaches which deal with the time-series analysis of brain MRI for detecting active MS lesions and quantifying lesion load change. We provide a comprehensive reference source for researchers in which several approaches to change detection and quantification of MS lesions are investigated and classified. We also analyze the results provided by the approaches, discuss open problems, and point out possible future trends. Lesion detection approaches are required for the detection of static lesions and for diagnostic purposes, while either quantification of detected lesions or change detection algorithms are needed to follow up MS patients. However, there is not yet a single approach that can emerge as a standard for the clinical practice, automatically providing an accurate MS lesion evolution quantification. Future trends will focus on combining the lesion detection in single studies with the analysis of the change detection in serial MRI. (orig.)

  15. Automated detection of multiple sclerosis lesions in serial brain MRI

    Energy Technology Data Exchange (ETDEWEB)

    Llado, Xavier; Ganiler, Onur; Oliver, Arnau; Marti, Robert; Freixenet, Jordi [University of Girona, Computer Vision and Robotics Group, Girona (Spain); Valls, Laia [Dr. Josep Trueta University Hospital, Department of Radiology, Girona (Spain); Vilanova, Joan C. [Girona Magnetic Resonance Center, Girona (Spain); Ramio-Torrenta, Lluis [Dr. Josep Trueta University Hospital, Institut d' Investigacio Biomedica de Girona, Multiple Sclerosis and Neuroimmunology Unit, Girona (Spain); Rovira, Alex [Vall d' Hebron University Hospital, Magnetic Resonance Unit, Department of Radiology, Barcelona (Spain)

    2012-08-15

    Multiple sclerosis (MS) is a serious disease typically occurring in the brain whose diagnosis and efficacy of treatment monitoring are vital. Magnetic resonance imaging (MRI) is frequently used in serial brain imaging due to the rich and detailed information provided. Time-series analysis of images is widely used for MS diagnosis and patient follow-up. However, conventional manual methods are time-consuming, subjective, and error-prone. Thus, the development of automated techniques for the detection and quantification of MS lesions is a major challenge. This paper presents an up-to-date review of the approaches which deal with the time-series analysis of brain MRI for detecting active MS lesions and quantifying lesion load change. We provide a comprehensive reference source for researchers in which several approaches to change detection and quantification of MS lesions are investigated and classified. We also analyze the results provided by the approaches, discuss open problems, and point out possible future trends. Lesion detection approaches are required for the detection of static lesions and for diagnostic purposes, while either quantification of detected lesions or change detection algorithms are needed to follow up MS patients. However, there is not yet a single approach that can emerge as a standard for the clinical practice, automatically providing an accurate MS lesion evolution quantification. Future trends will focus on combining the lesion detection in single studies with the analysis of the change detection in serial MRI. (orig.)

  16. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  17. Multiple steroid radioimmunoassays and automation. Versatile techniques for reproductive endocrinology

    International Nuclear Information System (INIS)

    Vihko, R.; Hammond, G.L.; Pakarinen, A.; Viinikka, L.

    1978-01-01

    The combination of the efficient steroid-separating properties of a lipophilic Sephadex derivative, Lipidex-5000™, and the use of antibodies with carefully selected specificity allows the quantitative determination of pregnenolone, progesterone, 17α-hydroxyprogesterone, androstenedione, testosterone, 5α-dihydrotestosterone, 5α-androstanedione, androsterone and 5α-androstane-3α,17β-diol in 1- to 2-ml samples of both blood serum and amniotic fluid as well as in 300- to 600-mg pieces of prostatic tissue. The adaptation of the pipetting unit and incubator of a discrete clinical chemical analyser, System Olli 3000, for the automation of the radioimmunoassays has resulted in a greatly increased throughput and has decreased the experimental error of the procedure. In studies on reproductive endocrinology, the methodology developed has allowed the detection of a sex difference in androgen composition of the amniotic fluid early in pregnancy. Further, it is very likely that the decline in steroid production by the testis seen during the first year of life and then in senescence is affected by basically different mechanisms. There are also important differences in the steroid content of normal, hyperplastic and carcinomatous prostate. (author)

  18. Multiple steroid radioimmunoassays and automation: versatile techniques for reproductive endocrinology

    International Nuclear Information System (INIS)

    Vihko, R.; Hammond, G.L.; Pakarinen, A.; Viinikka, L.

    1977-01-01

    The combination of the efficient steroid-separating properties of a lipophilic Sephadex derivative, Lipidex-5000™, with the use of antibodies with carefully selected specificity allows the quantitative determination of pregnenolone, progesterone, 17α-hydroxyprogesterone, androstenedione, testosterone, 5α-dihydrotestosterone, 5α-androstanedione, androsterone and 5α-androstane-3α,17β-diol from 1-2 ml samples of blood serum, amniotic fluid or 300-600 mg pieces of prostatic tissue. The adaptation of the pipetting unit and incubator of a discrete clinical chemical analyzer, System Olli 3000, for the automation of the radioimmunoassays has resulted in a greatly increased throughput and decreased experimental error of the procedure. In studies on reproductive endocrinology, the methodology developed has allowed the detection of a sex difference in androgen composition of the amniotic fluid early in pregnancy. Further, it is very likely that the decline in steroid production by the testis seen during the first year of life and then in senescence is affected by basically different mechanisms. There are also important differences in the steroid content of normal, hyperplastic and carcinomatous prostate. (orig.)

  19. An Automated Approach to Reasoning Under Multiple Perspectives

    Science.gov (United States)

    deBessonet, Cary

    2004-01-01

    This is the final report, with emphasis on research during the last term. The context for the research has been the development of an automated reasoning technology for use in SMS (Symbolic Manipulation System), a system used to build and query knowledge bases (KBs) using a special knowledge representation language, SL (Symbolic Language). SMS interprets assertive SL input and enters the results as components of its universe. The system operates in two basic modes: 1) constructive mode (for building KBs); and 2) query/search mode (for querying KBs). Query satisfaction consists of matching query components with KB components. The system allows "penumbral matches," that is, matches that do not exactly meet the specifications of the query, but which are deemed relevant for the conversational context. If the user wants to know whether SMS has information that holds, say, for "any chow," the scope of relevancy might be set so that the system would respond based on a finding that it has information that holds for "most dogs," although this is not exactly what was called for by the query. The response would be qualified accordingly, as would normally be the case in ordinary human conversation. The general goal of the research was to develop an approach by which assertive content could be interpreted from multiple perspectives so that reasoning operations could be successfully conducted over the results. The interpretation of an SL statement such as "{person believes [captain (asserted (perhaps)) (astronaut saw (comet (bright)))]}," which in English amounts to asserting something to the effect that "Some person believes that a captain perhaps asserted that an astronaut saw a bright comet," would require the recognition of multiple perspectives, including some that are: a) epistemically-based (focusing on "believes"); b) assertion-based (focusing on "asserted"); c) perception-based (focusing on "saw"); d) adjectivally-based (focusing on "bright"); and e) modally

  20. Making MUSIC: A multiple sampling ionization chamber

    International Nuclear Information System (INIS)

    Shumard, B.; Henderson, D.J.; Rehm, K.E.; Tang, X.D.

    2007-01-01

    A multiple sampling ionization chamber (MUSIC) was developed for use in conjunction with the Atlas scattering chamber (ATSCAT). This chamber was developed to study the (α, p) reaction in stable and radioactive beams. The gas filled ionization chamber is used as a target and detector for both particles in the outgoing channel (p + beam particles for elastic scattering or p + residual nucleus for (α, p) reactions). The MUSIC detector is followed by a Si array to provide a trigger for anode events. The anode events are gated by a gating grid so that only (α, p) reactions where the proton reaches the Si detector result in an anode event. The MUSIC detector is a segmented ionization chamber. The active length of the chamber is 11.95 in. and is divided into 16 equal anode segments (3.5 in. × 0.70 in. with 0.3 in. spacing between pads). The dead area of the chamber was reduced by the addition of a Delrin snout that extends 0.875 in. into the chamber from the front face, to which a mylar window is affixed. 0.5 in. above the anode is a Frisch grid that is held at ground potential. 0.5 in. above the Frisch grid is a gating grid. The gating grid functions as a drift electron barrier, effectively halting the gathering of signals. Setting two sets of alternating wires at differing potentials creates a lateral electric field which traps the drift electrons, stopping the collection of anode signals. The chamber also has a reinforced mylar exit window separating the Si array from the target gas. This allows protons from the (α, p) reaction to be detected. The detection of these protons opens the gating grid to allow the drift electrons released from the ionizing gas during the (α, p) reaction to reach the anode segment below the reaction.

  1. Making MUSIC: A multiple sampling ionization chamber

    Science.gov (United States)

    Shumard, B.; Henderson, D. J.; Rehm, K. E.; Tang, X. D.

    2007-08-01

    A multiple sampling ionization chamber (MUSIC) was developed for use in conjunction with the Atlas scattering chamber (ATSCAT). This chamber was developed to study the (α, p) reaction in stable and radioactive beams. The gas filled ionization chamber is used as a target and detector for both particles in the outgoing channel (p + beam particles for elastic scattering or p + residual nucleus for (α, p) reactions). The MUSIC detector is followed by a Si array to provide a trigger for anode events. The anode events are gated by a gating grid so that only (α, p) reactions where the proton reaches the Si detector result in an anode event. The MUSIC detector is a segmented ionization chamber. The active length of the chamber is 11.95 in. and is divided into 16 equal anode segments (3.5 in. × 0.70 in. with 0.3 in. spacing between pads). The dead area of the chamber was reduced by the addition of a Delrin snout that extends 0.875 in. into the chamber from the front face, to which a mylar window is affixed. 0.5 in. above the anode is a Frisch grid that is held at ground potential. 0.5 in. above the Frisch grid is a gating grid. The gating grid functions as a drift electron barrier, effectively halting the gathering of signals. Setting two sets of alternating wires at differing potentials creates a lateral electric field which traps the drift electrons, stopping the collection of anode signals. The chamber also has a reinforced mylar exit window separating the Si array from the target gas. This allows protons from the (α, p) reaction to be detected. The detection of these protons opens the gating grid to allow the drift electrons released from the ionizing gas during the (α, p) reaction to reach the anode segment below the reaction.

  2. Making MUSIC: A multiple sampling ionization chamber

    Energy Technology Data Exchange (ETDEWEB)

    Shumard, B. [Argonne National Laboratory, Building 203 H-113, Argonne, IL 60439 (United States)]. E-mail: shumard@phy.anl.gov; Henderson, D.J. [Argonne National Laboratory, Building 203 H-113, Argonne, IL 60439 (United States); Rehm, K.E. [Argonne National Laboratory, Building 203 H-113, Argonne, IL 60439 (United States); Tang, X.D. [Argonne National Laboratory, Building 203 H-113, Argonne, IL 60439 (United States)

    2007-08-15

    A multiple sampling ionization chamber (MUSIC) was developed for use in conjunction with the Atlas scattering chamber (ATSCAT). This chamber was developed to study the (α, p) reaction in stable and radioactive beams. The gas filled ionization chamber is used as a target and detector for both particles in the outgoing channel (p + beam particles for elastic scattering or p + residual nucleus for (α, p) reactions). The MUSIC detector is followed by a Si array to provide a trigger for anode events. The anode events are gated by a gating grid so that only (α, p) reactions where the proton reaches the Si detector result in an anode event. The MUSIC detector is a segmented ionization chamber. The active length of the chamber is 11.95 in. and is divided into 16 equal anode segments (3.5 in. × 0.70 in. with 0.3 in. spacing between pads). The dead area of the chamber was reduced by the addition of a Delrin snout that extends 0.875 in. into the chamber from the front face, to which a mylar window is affixed. 0.5 in. above the anode is a Frisch grid that is held at ground potential. 0.5 in. above the Frisch grid is a gating grid. The gating grid functions as a drift electron barrier, effectively halting the gathering of signals. Setting two sets of alternating wires at differing potentials creates a lateral electric field which traps the drift electrons, stopping the collection of anode signals. The chamber also has a reinforced mylar exit window separating the Si array from the target gas. This allows protons from the (α, p) reaction to be detected. The detection of these protons opens the gating grid to allow the drift electrons released from the ionizing gas during the (α, p) reaction to reach the anode segment below the reaction.
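
Since the same MUSIC design appears in all three records above, a compact illustration of the trigger logic may help. The sketch below gates anode readout on the Si-array proton signal and locates the reaction from the per-segment energy losses; the event layout and the jump heuristic are illustrative assumptions, not the experiment's actual electronics.

```python
# Rough sketch of the MUSIC gating logic: anode segments are read out only
# when the Si array registers a proton (the gating grid opens), and a
# sudden rise in energy loss between adjacent segments marks where the
# (alpha, p) reaction occurred along the beam axis.
N_SEGMENTS = 16

def process_event(segment_signals, si_triggered):
    """Return (signals, reaction segment) for gated events, else None."""
    if not si_triggered:
        return None  # gating grid stays closed; drift electrons are trapped
    assert len(segment_signals) == N_SEGMENTS
    jumps = [b - a for a, b in zip(segment_signals, segment_signals[1:])]
    # 1-based index of the first segment showing the increased energy loss
    reaction_segment = max(range(len(jumps)), key=jumps.__getitem__) + 2
    return segment_signals, reaction_segment

# Beam-like event: flat energy loss, then a jump starting at segment 9.
signals = [10.0] * 8 + [16.0] * 8
print(process_event(signals, si_triggered=True)[1])  # 9
```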

  3. Multiple sample, radioactive particle counting apparatus

    International Nuclear Information System (INIS)

    Reddy, R.R.V.; Kelso, D.M.

    1978-01-01

    An apparatus is described for determining the respective radioactive particle counts emitted from samples containing radioactive particles. It includes means for modulating the information on the radioactive particles being emitted from the samples; coded detecting means for sequentially detecting different respective coded combinations of the radioactive particles emitted from more than one, but less than all, of the samples; and processing means for deriving the sample count for each sample from the modulated information. The apparatus includes a single light-emitting crystal next to a number of samples and an encoder belt sequentially movable between the crystal and the samples. The encoder belt has a coded array of apertures to provide corresponding modulated light pulses from the crystal, and a photomultiplier tube converts the modulated light pulses to decodable electrical signals for deriving the respective sample counts
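
The decoding step this patent describes amounts to solving a linear system: each belt position sums the activity of a coded subset of samples, and the individual counts are recovered from the combined readings. A minimal sketch follows; the mask matrix and counts are illustrative assumptions.

```python
# Minimal sketch of decoding per-sample counts from coded combined
# measurements. Rows: sequential belt positions; columns: samples;
# 1 = aperture open for that sample at that position.
import numpy as np

A = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
    [1, 0, 1, 0],
])

true_counts = np.array([120.0, 45.0, 300.0, 80.0])
readings = A @ true_counts            # what the photomultiplier reports

decoded, *_ = np.linalg.lstsq(A, readings, rcond=None)
print(np.round(decoded, 1))           # [120.  45. 300.  80.]
```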

  4. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management.

  5. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Full Text Available Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for, and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
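
The critical-point detection at the heart of these algorithms can be illustrated compactly. The sketch below finds rupture "jumps" in a retraction curve as outliers in the point-to-point force difference; the MAD-based threshold and the synthetic curve are illustrative choices, not the paper's exact criterion.

```python
# Minimal sketch of jump detection in an AFM retraction force curve.
import numpy as np

def detect_jumps(force, n_sigma=5.0):
    """Return indices where the force changes abruptly between points."""
    df = np.diff(force)
    sigma = 1.4826 * np.median(np.abs(df - np.median(df)))  # robust noise level
    return np.where(np.abs(df) > n_sigma * sigma)[0]

# Synthetic retraction curve: smooth adhesive stretch with one rupture event.
z = np.linspace(0.0, 100.0, 500)                 # tip-sample separation, nm
f = -0.05 * z * np.exp(-z / 30.0)                # smooth unbinding force, nN
f[300:] += 2.0                                   # rupture: force snaps back
f += np.random.default_rng(0).normal(0.0, 0.01, z.size)

print(detect_jumps(f))                           # -> [299]
```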

  6. Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

    OpenAIRE

    Loke Mun Sei

    2015-01-01

    Software testing has become a mandatory process in assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are being managed and automated using several test automation tools, i.e. Jira, ...

  7. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
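
The core transformation the abstract describes, sample rows (identification, sequence, dilution factor) becoming a CSV worklist with the dilution routed to one of three ranges, is easy to sketch. Column names, the step labels, and the file layout below are assumptions for illustration, not the validated RSPP/Freedom EVO (.gwl) format.

```python
# Minimal sketch of turning sample information into a CSV worklist with the
# three dilution ranges from the abstract (1-10, 11-100, 101-1000 fold).
import csv

def dilution_step(factor):
    if 1 <= factor <= 10:
        return "step1"
    if 11 <= factor <= 100:
        return "step2"
    if 101 <= factor <= 1000:
        return "step3"
    raise ValueError(f"dilution factor {factor} outside the 1-1000 range")

samples = [
    {"id": "S001", "sequence": 1, "dilution": 5},
    {"id": "S002", "sequence": 2, "dilution": 50},
    {"id": "S003", "sequence": 3, "dilution": 400},
]

with open("worklist.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample_id", "sequence", "dilution_factor", "dilution_step"])
    for s in samples:
        writer.writerow([s["id"], s["sequence"], s["dilution"],
                         dilution_step(s["dilution"])])
```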

  8. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  9. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  10. Adaptive multiple importance sampling for Gaussian processes

    Czech Academy of Sciences Publication Activity Database

    Xiong, X.; Šmídl, Václav; Filippone, M.

    2017-01-01

    Vol. 87, No. 8 (2017), pp. 1644-1665. ISSN 0094-9655. R&D Projects: GA MŠk(CZ) 7F14287. Institutional support: RVO:67985556. Keywords: Gaussian Process * Bayesian estimation * Adaptive importance sampling. Subject RIV: BB - Applied Statistics, Operational Research. OECD field: Statistics and probability. Impact factor: 0.757, year: 2016. http://library.utia.cas.cz/separaty/2017/AS/smidl-0469804.pdf

  11. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    Science.gov (United States)

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the
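
The normalization step the abstract describes, computing a sample-specific dilution from imported quantitation data, reduces to simple volume arithmetic. The sketch below is a hedged illustration; the target DNA mass, delivery volume, and the handling of dilute extracts are assumptions, not the Normalization Wizard's actual parameters.

```python
# Minimal sketch of quantitation-driven DNA normalization: compute the
# extract and diluent volumes needed to reach a target amplification input.
TARGET_NG = 1.0          # DNA mass wanted in the amplification, ng (assumed)
FINAL_UL = 10.0          # volume delivered to the PCR tube, uL (assumed)

def normalize(conc_ng_per_ul):
    """Return (sample_uL, diluent_uL) for one extract."""
    if conc_ng_per_ul <= TARGET_NG / FINAL_UL:
        return FINAL_UL, 0.0            # too dilute: use the neat extract
    sample_ul = TARGET_NG / conc_ng_per_ul
    return round(sample_ul, 2), round(FINAL_UL - sample_ul, 2)

for conc in (0.05, 0.5, 2.0):           # ng/uL quantitation results
    print(conc, normalize(conc))        # (10.0, 0.0), (2.0, 8.0), (0.5, 9.5)
```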

  12. Sample preparation automation for dosing plutonium in urine

    International Nuclear Information System (INIS)

    Jeanmaire, Lucien; Ballada, Jean; Ridelle Berger, Ariane

    1969-06-01

    After having indicated that dosing urinary plutonium by using the Henry technique can be divided into three stages (plutonium concentration by precipitation, passing the solution on an anionic resin column and plutonium elution, and eluate evaporation to obtain a source of which the radioactivity is measured), and recalled that the automation of the second stage has been reported in another document, this document describes the automation of the first stage, i.e. obtaining from urine a residue containing the plutonium, and sufficiently mineralized to be analyzed by means of ion exchanging resins. Two techniques are proposed, leading to slightly different devices. The different operations to be performed are indicated. The different components of the apparatus are described: beakers, hot plate stirrers, reagent circuits, a system for supernatant suction, and a control-command circuit. The operation and use are then described, and results are given

  13. Estimates of Radionuclide Loading to Cochiti Lake from Los Alamos Canyon Using Manual and Automated Sampling

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Christopher T. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-07-01

    Los Alamos National Laboratory has a long-standing program of sampling storm water runoff inside the Laboratory boundaries. In 1995, the Laboratory started collecting the samples using automated storm water sampling stations; prior to this time the samples were collected manually. The Laboratory has also been periodically collecting sediment samples from Cochiti Lake. This paper presents the data for Pu-238 and Pu-239 bound to the sediments for Los Alamos Canyon storm water runoff and compares the sampling types by mass loading and as a percentage of the sediment deposition to Cochiti Lake. The data for both manual and automated sampling are used to calculate mass loads from Los Alamos Canyon on a yearly basis. The automated samples show mass loading 200-500 percent greater for Pu-238 and 300-700 percent greater for Pu-239 than the manual samples. Using the mean manual flow volume for mass loading calculations, the automated samples are over 900 percent greater for Pu-238 and over 1800 percent greater for Pu-239. Evaluating the Pu-238 and Pu-239 activities as a percentage of deposition to Cochiti Lake indicates that the automated samples are 700-1300 percent greater for Pu-238 and 200-500 percent greater for Pu-239. The variance was calculated by two methods. The first method calculates the variance for each sample event. The second method calculates the variances by the total volume of water discharged in Los Alamos Canyon for the year.
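
The yearly mass-loading calculation at the center of this comparison can be illustrated briefly. The sketch below sums activity concentration times discharged volume over runoff events, and also shows the alternative of applying a mean concentration to the year's total discharge (the basis of the second variance method). All numbers are illustrative, not data from the report.

```python
# Minimal sketch of a yearly mass-loading estimate from runoff events.
events = [
    # (Pu-239 activity concentration, pCi/L; event flow volume, L) -- assumed
    (0.12, 4.0e6),
    (0.30, 1.5e7),
    (0.08, 2.2e6),
]

per_event_load = sum(conc * vol for conc, vol in events)   # pCi per year
print(f"per-event estimate:    {per_event_load:.3e} pCi")

mean_conc = sum(c for c, _ in events) / len(events)
total_vol = sum(v for _, v in events)
print(f"total-volume estimate: {mean_conc * total_vol:.3e} pCi")
```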

  14. Automated washing of FTA Card punches and PCR setup for reference samples using a LIMS-controlled Sias Xantus automated liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Olsen, Addie Nina; Frøslev, Tobias G.

    2009-01-01

    We have implemented and validated automated methods for washing FTA Card punches containing buccal samples and subsequent PCR setup using a Sias Xantus automated liquid handler. The automated methods were controlled by worklists generated by our LabWare Laboratory Information Management System...

  15. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  16. Automated dried blood spots standard and QC sample preparation using a robotic liquid handler.

    Science.gov (United States)

    Yuan, Long; Zhang, Duxi; Aubry, Anne-Francoise; Arnold, Mark E

    2012-12-01

    A dried blood spot (DBS) bioanalysis assay involves many steps, such as the preparation of standard (STD) and QC samples in blood, the spotting onto DBS cards, and the cutting-out of the spots. These steps are labor intensive and time consuming if done manually, which makes automation very desirable in DBS bioanalysis. A robotic liquid handler was successfully applied to the preparation of STD and QC samples in blood and to spotting the blood samples onto DBS cards, using buspirone as the model compound. This automated preparation was demonstrated to be accurate and consistent. However, the accuracy and precision of automated preparation were similar to those of manual preparation. The effect of spotting volume on accuracy was evaluated, and a trend of increasing concentrations of buspirone with increasing spotting volumes was observed. The automated STD and QC sample preparation process significantly improved the efficiency, robustness and safety of DBS bioanalysis.

  17. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    Full text: The analysis of active samples on a regular basis, for ambient air activity and floor contamination from the radiochemical lab, accounts for a major chunk of the operational activity in the Health Physicist's responsibility. The requirement for daily air sample analysis, with immediate and delayed counting of samples from various labs in addition to smear swipe checks of the labs, led to the need for a system that could carry out multiple sample analyses in a time-programmed manner from a single sample loading. A multiple alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in slots, in order, to be counted in a time-programmed manner, with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has resulted in a reduction of man-hours consumed in counting and recording the results

  18. Influencing Trust for Human-Automation Collaborative Scheduling of Multiple Unmanned Vehicles.

    Science.gov (United States)

    Clare, Andrew S; Cummings, Mary L; Repenning, Nelson P

    2015-11-01

    We examined the impact of priming on operator trust and system performance when supervising a decentralized network of heterogeneous unmanned vehicles (UVs). Advances in autonomy have enabled a future vision of single-operator control of multiple heterogeneous UVs. Real-time scheduling for multiple UVs in uncertain environments requires the computational ability of optimization algorithms combined with the judgment and adaptability of human supervisors. Because of system and environmental uncertainty, appropriate operator trust will be instrumental to maintain high system performance and prevent cognitive overload. Three groups of operators experienced different levels of trust priming prior to conducting simulated missions in an existing, multiple-UV simulation environment. Participants who play computer and video games frequently were found to have a higher propensity to overtrust automation. By priming gamers to lower their initial trust to a more appropriate level, system performance was improved by 10% as compared to gamers who were primed to have higher trust in the automation. Priming was successful at adjusting the operator's initial and dynamic trust in the automated scheduling algorithm, which had a substantial impact on system performance. These results have important implications for personnel selection and training for futuristic multi-UV systems under human supervision. Although gamers may bring valuable skills, they may also be potentially prone to automation bias. Priming during training and regular priming throughout missions may be one potential method for overcoming this propensity to overtrust automation. © 2015, Human Factors and Ergonomics Society.

  19. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    Science.gov (United States)

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569

  20. Sample size determination in clinical trials with multiple endpoints

    CERN Document Server

    Sozu, Takashi; Hamasaki, Toshimitsu; Evans, Scott R

    2015-01-01

    This book integrates recent methodological developments for calculating the sample size and power in trials with more than one endpoint considered as multiple primary or co-primary, offering an important reference work for statisticians working in this area. The determination of sample size and the evaluation of power are fundamental and critical elements in the design of clinical trials. If the sample size is too small, important effects may go unnoticed; if the sample size is too large, it represents a waste of resources and unethically puts more participants at risk than necessary. Recently many clinical trials have been designed with more than one endpoint considered as multiple primary or co-primary, creating a need for new approaches to the design and analysis of these clinical trials. The book focuses on the evaluation of power and sample size determination when comparing the effects of two interventions in superiority clinical trials with multiple endpoints. Methods for sample size calculation in clin...

  1. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Hebel, S.; Hands, J.; Goering, F.; Kirchner, G.; Purtschert, R.

    2015-01-01

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. Prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and the application of Kr-85 measurements to novel Safeguards procedures. (author)

  2. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) for various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi’s Linear systematic samplin...
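
Linear systematic sampling with multiple random starts is simple to state: instead of one random start and every k-th unit, draw m distinct starts and take every (m·k)-th unit from each. The sketch below follows one common formulation of Gautschi's scheme; the population values are illustrative.

```python
# Minimal sketch of linear systematic sampling with m random starts:
# m distinct starts drawn from the first m*k units, each followed at a
# step of m*k through the population.
import random

def lss_multiple_starts(population, m, k, seed=None):
    """Sample from a population of N = m*k*t units using m random starts."""
    rng = random.Random(seed)
    step = m * k
    starts = rng.sample(range(step), m)     # m distinct starts in [0, m*k)
    sample = []
    for s in starts:
        sample.extend(population[s::step])
    return sample

pop = list(range(1, 61))                                    # N = 60
print(sorted(lss_multiple_starts(pop, m=2, k=3, seed=1)))   # n = 20 units
```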

  3. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    International Nuclear Information System (INIS)

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-01

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs

  4. SASSI: Subsystems for Automated Subsurface Sampling Instruments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Future robotic planetary exploration missions will benefit greatly from the ability to capture rock and/or regolith core samples that deliver the stratigraphy of the...

  5. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat...

  6. Development and Evaluation of a Pilot Prototype Automated Online Sampling System

    International Nuclear Information System (INIS)

    Whitaker, M.J.

    2000-01-01

    An automated online sampling system has been developed for the BNFL-Hanford Technetium Monitoring Program. The system was designed to be flexible and allows for the collection and delivery of samples to a variety of detection devices that may be used

  7. Automated modal parameter estimation using correlation analysis and bootstrap sampling

    Science.gov (United States)

    Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.

    2018-02-01

    The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences by the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
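
The correlation-based clustering this abstract describes can be illustrated with the Modal Assurance Criterion (MAC), a standard eigenvector correlation metric. The sketch below fakes the identification step with synthetic mode shapes and clusters bootstrap replicates greedily; the threshold and the greedy scheme are illustrative simplifications of the paper's method.

```python
# Minimal sketch: cluster bootstrap-identified mode shapes by MAC.
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two (possibly complex) mode shapes."""
    return np.abs(np.vdot(phi_a, phi_b)) ** 2 / (
        np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real)

def cluster_modes(mode_shapes, threshold=0.9):
    """Greedy clustering: join the first cluster whose representative
    correlates with the shape above the MAC threshold."""
    clusters = []
    for phi in mode_shapes:
        for cluster in clusters:
            if mac(cluster[0], phi) > threshold:
                cluster.append(phi)
                break
        else:
            clusters.append([phi])
    return clusters

rng = np.random.default_rng(0)
true_modes = [np.sin(np.pi * np.arange(8) / 7 * j) for j in (1, 2)]
# "Bootstrap" replicates: the same physical modes plus identification noise.
shapes = [m + rng.normal(0.0, 0.05, 8) for m in true_modes for _ in range(5)]
print([len(c) for c in cluster_modes(shapes)])   # [5, 5]
```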

  8. Automated injection of slurry samples in flow-injection analysis

    NARCIS (Netherlands)

    Hulsman, M.H.F.M.; Hulsman, M.; Bos, M.; van der Linden, W.E.

    1996-01-01

    Two types of injectors are described for introducing solid samples as slurries in flow analysis systems. A time-based and a volume-based injector based on multitube solenoid pinch valves were built, both can be characterized as hydrodynamic injectors. Reproducibility of the injections of dispersed

  9. Automated Blood Sample Preparation Unit (ABSPU) for Portable Microfluidic Flow Cytometry.

    Science.gov (United States)

    Chaturvedi, Akhil; Gorthi, Sai Siva

    2017-02-01

    Portable microfluidic diagnostic devices, including flow cytometers, are being developed for point-of-care settings, especially in conjunction with inexpensive imaging devices such as mobile phone cameras. However, two pervasive drawbacks of these have been the lack of automated sample preparation processes and cells settling out of sample suspensions, leading to inaccurate results. We report an automated blood sample preparation unit (ABSPU) to prevent blood samples from settling in a reservoir during loading of samples in flow cytometers. This apparatus automates the preanalytical steps of dilution and staining of blood cells prior to microfluidic loading. It employs an assembly with a miniature vibration motor to drive turbulence in a sample reservoir. To validate performance of this system, we present experimental evidence demonstrating prevention of blood cell settling, cell integrity, and staining of cells prior to flow cytometric analysis. This setup is further integrated with a microfluidic imaging flow cytometer to investigate cell count variability. With no need for prior sample preparation, a drop of whole blood can be directly introduced to the setup without premixing with buffers manually. Our results show that integration of this assembly with microfluidic analysis provides a competent automation tool for low-cost point-of-care blood-based diagnostics.

  10. Automated trajectory planning for multiple-flyby interplanetary missions

    Science.gov (United States)

    Englander, Jacob

    Many space mission planning problems may be formulated as hybrid optimal control problems (HOCP), i.e. problems that include both real-valued variables and categorical variables. In interplanetary trajectory design problems the categorical variables will typically specify the sequence of planets at which to perform flybys, and the real-valued variables will represent the launch date, flight times between planets, magnitudes and directions of thrust, flyby altitudes, etc. The contribution of this work is a framework for the autonomous optimization of multiple-flyby interplanetary trajectories. The trajectory design problem is converted into a HOCP with two nested loops: an "outer-loop" that finds the sequence of flybys and an "inner-loop" that optimizes the trajectory for each candidate flyby sequence. The problem of choosing a sequence of flybys is posed as an integer programming problem and solved using a genetic algorithm (GA). This is an especially difficult problem to solve because GAs normally operate on a fixed-length set of decision variables. Since in interplanetary trajectory design the number of flyby maneuvers is not known a priori, it was necessary to devise a method of parameterizing the problem such that the GA can evolve a variable-length sequence of flybys. A novel "null gene" transcription was developed to meet this need. Then, for each candidate sequence of flybys, a trajectory must be found that visits each of the flyby targets and arrives at the final destination while optimizing some cost metric, such as minimizing Δv or maximizing the final mass of the spacecraft. Three different classes of trajectory are described in this work, each of which required a different physical model and optimization method. The choice of a trajectory model and optimization method is especially challenging because of the nature of the hybrid optimal control problem. Because the trajectory optimization problem is generated in real time by the outer-loop, the inner
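
The "null gene" transcription is the key trick here: a fixed-length chromosome in which a sentinel value means "no flyby" lets a standard GA evolve variable-length flyby sequences. A minimal sketch follows; the planet table and the random generator are illustrative, and the real framework couples this encoding to a full trajectory optimizer as the inner loop.

```python
# Minimal sketch of null-gene encoding for variable-length flyby sequences.
import random

PLANETS = ["Venus", "Earth", "Mars", "Jupiter"]
NULL = -1                 # null gene: this slot contributes no flyby
MAX_FLYBYS = 4

def random_chromosome(rng):
    return [rng.choice([NULL] + list(range(len(PLANETS))))
            for _ in range(MAX_FLYBYS)]

def decode(chromosome):
    """Drop null genes to obtain the actual (variable-length) flyby sequence."""
    return [PLANETS[g] for g in chromosome if g != NULL]

rng = random.Random(42)
for _ in range(3):
    c = random_chromosome(rng)
    print(c, "->", decode(c))
```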

  11. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high......-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  12. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.

  13. Automated sample mounting and alignment system for biological crystallography at a synchrotron source

    International Nuclear Information System (INIS)

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C.; Earnest, Thomas

    2004-01-01

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples, stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on 'puck-shaped' cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The capabilities of a robotic crystal mounting and alignment system, with instrumentation control software and a relational database, allow automated screening and data collection to be developed

  14. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    Science.gov (United States)

    Fischer, Jesse R.; Quist, Michael C.

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., two to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  15. Automated injection of a radioactive sample for preparative HPLC with feedback control

    International Nuclear Information System (INIS)

    Iwata, Ren; Yamazaki, Shigeki

    1990-01-01

    The injection of a radioactive reaction mixture into a preparative HPLC column has been automated with computer control for rapid purification of routinely prepared positron emitting radiopharmaceuticals. Using pneumatic valves, a motor-driven pump and a liquid level sensor, two intelligent injection methods for the automation were compared with regard to efficient and rapid sample loading into a 2 mL loop of the 6-way valve. One, a precise but rather slow method, was demonstrated to be suitable for purification of 18F-radiopharmaceuticals, while the other, due to its rapid operation, was more suitable for 11C-radiopharmaceuticals. A sample volume of approx 0.5 mL can be injected onto a preparative HPLC column with over 90% efficiency with the present automated system. (author)

  16. Multiple biopsy probe sampling enabled minimally invasive electrical impedance tomography

    International Nuclear Information System (INIS)

    Shini, Mohanad; Rubinsky, Boris

    2008-01-01

    Biopsies are a reliable method for examining tissues and organs inside the body, in particular for detection of tumors. However, a single biopsy produces only limited information on the site from which it is taken. Therefore, tumor detection now employs multiple biopsy samplings to examine larger volumes of tissue. Nevertheless, even with multiple biopsies, the information remains discrete, while the costs of biopsy increase. Here we propose and evaluate the feasibility of using minimally invasive medical imaging as a means to overcome the limitations of discrete biopsy sampling. The minimally invasive medical imaging technique employs the biopsy probe as electrodes for measurements of electrical impedance tomography relevant data during each biopsy sampling. The data from multiple samplings are combined and used to produce an EIT image of the tissue. Two- and three-dimensional mathematical simulations confirm that the minimally invasive medical imaging technique can produce electrical impedance tomography images of the tissues between the biopsy probe insertion sites. We show that these images can detect tumors that would be missed with multiple biopsy samplings only, and that the technique may facilitate the detection of tumors with fewer biopsies, thereby reducing the cost of cancer detection

  17. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back propagation neural network approach to compute the reliability of a system. Since states of failure occurrence are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
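
The Markovian half of the approach can be illustrated with a small state-transition computation: reliability at time t is the probability of not having reached the absorbing failed state. The transition matrix below is an illustrative assumption, not the paper's AGV model, and the neural-network half is omitted.

```python
# Minimal sketch of discrete-time Markov reliability over
# (working, degraded, failed) states.
import numpy as np

# P[i, j] = probability of moving from state i to state j per time step.
P = np.array([
    [0.95, 0.04, 0.01],   # working
    [0.00, 0.90, 0.10],   # degraded
    [0.00, 0.00, 1.00],   # failed (absorbing)
])

state = np.array([1.0, 0.0, 0.0])     # start in the working state
for _ in range(50):
    state = state @ P
# Reliability at t = 50: probability of not being in the failed state.
print(f"R(50) = {1.0 - state[2]:.4f}")
```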

  18. A bench-top automated workstation for nucleic acid isolation from clinical sample types.

    Science.gov (United States)

    Thakore, Nitu; Garber, Steve; Bueno, Arial; Qu, Peter; Norville, Ryan; Villanueva, Michael; Chandler, Darrell P; Holmberg, Rebecca; Cooney, Christopher G

    2018-04-18

    Systems that automate extraction of nucleic acid from cells or viruses in complex clinical matrices have tremendous value even in the absence of an integrated downstream detector. We describe our bench-top automated workstation that integrates our previously-reported extraction method - TruTip - with our newly-developed mechanical lysis method. This is the first report of this method for homogenizing viscous and heterogeneous samples and lysing difficult-to-disrupt cells using "MagVor": a rotating magnet that rotates a miniature stir disk amidst glass beads confined inside of a disposable tube. Using this system, we demonstrate automated nucleic acid extraction from methicillin-resistant Staphylococcus aureus (MRSA) in nasopharyngeal aspirate (NPA), influenza A in nasopharyngeal swabs (NPS), human genomic DNA from whole blood, and Mycobacterium tuberculosis in NPA. The automated workstation yields nucleic acid with comparable extraction efficiency to manual protocols, which include commercially-available Qiagen spin column kits, across each of these sample types. This work expands the scope of applications beyond previous reports of TruTip to include difficult-to-disrupt cell types and automates the process, including a method for removal of organics, inside a compact bench-top workstation. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Feasibility of surface sampling in automated inspection of concrete aggregates during bulk transport on a conveyor

    NARCIS (Netherlands)

    Bakker, M.C.M.; Di Maio, F.; Lotfi, S.; Bakker, M.; Hu, M.; Vahidi, A.

    2017-01-01

    Automated optic inspection of concrete aggregates for pollutants (e.g. wood, plastics, gypsum and brick) is required to establish the suitability for reuse in new concrete products. Inspection is more efficient when directly sampling the materials on the conveyor belt instead of feeding them in a

  20. Novel diffusion cell for in vitro transdermal permeation, compatible with automated dynamic sampling

    NARCIS (Netherlands)

    Bosman, I.J; Lawant, A.L; Avegaart, S.R.; Ensing, K; de Zeeuw, R.A

    The development of a new diffusion cell for in vitro transdermal permeation is described. The so-called Kelder cells were used in combination with the ASPEC system (Automatic Sample Preparation with Extraction Columns), which is designed for the automation of solid-phase extractions (SPE). Instead

  1. Field evaluation of personal sampling methods for multiple bioaerosols.

    Science.gov (United States)

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  2. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Full Text Available Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  3. Application of bar codes to the automation of analytical sample data collection

    International Nuclear Information System (INIS)

    Jurgensen, H.A.

    1986-01-01

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling, accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC; data collection, done on a central VAX 11/730 (Digital Equipment Corp.); and data storage, on the VAX 11/730 with backup on the plant's central computer. Bar code readers are used to log in samples to be analyzed on liquid scintillation counters, and the VAX 11/730 processes the data and generates reports. A brief description of several other bar code applications at the Savannah River Plant is also presented

  4. Sampling plans in attribute mode with multiple levels of precision

    International Nuclear Information System (INIS)

    Franklin, M.

    1986-01-01

    This paper describes a method for deriving sampling plans for nuclear material inventory verification. The method presented is different from the classical approach which envisages two levels of measurement precision corresponding to NDA and DA. In the classical approach the precisions of the two measurement methods are taken as fixed parameters. The new approach is based on multiple levels of measurement precision. The design of the sampling plan consists of choosing the number of measurement levels, the measurement precision to be used at each level and the sample size to be used at each level

  5. Sample size determination for equivalence assessment with multiple endpoints.

    Science.gov (United States)

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from a joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest sample size required for each endpoint. However, such a method ignores the correlation among endpoints. When the objective is to reject all endpoints and the endpoints are uncorrelated, the overall power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for such a correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method and the correlation-adjusted methods, and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
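
    To make the uncorrelated-endpoints logic concrete, here is a minimal sketch (not the authors' exact method) that treats the overall power as the product of per-endpoint TOST powers under a normal approximation and searches for the smallest per-arm sample size; all margins, SDs and true differences below are hypothetical.

```python
# Sketch: smallest n per arm so that equivalence is shown on ALL endpoints,
# assuming independent endpoints and a normal approximation to TOST power.
from math import sqrt
from scipy.stats import norm

def tost_power(n, delta, sigma, theta, alpha=0.05):
    """Approximate TOST power for one endpoint, parallel design, n per arm,
    true difference delta, within-group SD sigma, margin (-theta, theta)."""
    se = sigma * sqrt(2.0 / n)
    z = norm.ppf(1 - alpha)
    # P(reject both one-sided tests) under the normal approximation
    return max(0.0, norm.cdf((theta - delta) / se - z)
                    + norm.cdf((theta + delta) / se - z) - 1.0)

def n_for_all_endpoints(endpoints, target=0.8, alpha=0.05):
    """Smallest n such that the product of per-endpoint powers (valid only
    for uncorrelated endpoints) reaches the target overall power."""
    n = 2
    while True:
        power = 1.0
        for delta, sigma, theta in endpoints:
            power *= tost_power(n, delta, sigma, theta, alpha)
        if power >= target:
            return n
        n += 1

# Two hypothetical log-scale endpoints, e.g. log-AUC and log-Cmax
print(n_for_all_endpoints([(0.05, 0.25, 0.223), (0.05, 0.30, 0.223)]))
```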

  6. Feasibility of automated speech sample collection with stuttering children using interactive voice response (IVR) technology.

    Science.gov (United States)

    Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena

    2015-04-01

    To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability in terms of intra-class correlation between the video- and telephone-acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.

  7. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample, due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of these newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line, near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities.

  8. Automated cellular sample preparation using a Centrifuge-on-a-Chip.

    Science.gov (United States)

    Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino

    2011-09-07

    The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL/min scale, followed by fluorescent labeling of intra- and extra-cellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource-poor settings.

  9. Automated SEM and TEM sample preparation applied to copper/low k materials

    Science.gov (United States)

    Reyes, R.; Shaapur, F.; Griffiths, D.; Diebold, A. C.; Foran, B.; Raz, E.

    2001-01-01

    We describe the use of automated microcleaving for preparation of both SEM and TEM samples as done by SELA's new MC500 and TEMstation tools. The MC500 is an automated microcleaving tool that is capable of producing cleaves with 0.25 μm accuracy resulting in SEM-ready samples. The TEMstation is capable of taking a sample output from the MC500 (or from SELA's earlier MC200 tool) and producing a FIB ready slice of 25±5 μm, mounted on a TEM-washer and ready for FIB thinning to electron transparency for TEM analysis. The materials selected for the tool set evaluation mainly included the Cu/TaN/HOSP low-k system. The paper is divided into three sections, experimental approach, SEM preparation and analysis of HOSP low-k, and TEM preparation and analysis of Cu/TaN/HOSP low-k samples. For the samples discussed, data is presented to show the quality of preparation provided by these new automated tools.

  10. Comparison and clinical utility evaluation of four multiple allergen simultaneous tests including two newly introduced fully automated analyzers

    Directory of Open Access Journals (Sweden)

    John Hoon Rim

    2016-04-01

    Full Text Available Background: We compared the diagnostic performances of two newly introduced fully automated multiple allergen simultaneous test (MAST) analyzers with two conventional MAST assays. Methods: The serum samples from a total of 53 and 104 patients were tested for food panels and inhalant panels, respectively, in four analyzers including AdvanSure AlloScreen (LG Life Science, Korea), AdvanSure Allostation Smart II (LG Life Science), PROTIA Allergy-Q (ProteomeTech, Korea), and RIDA Allergy Screen (R-Biopharm, Germany). We compared not only the total agreement percentages but also positive propensities among the four analyzers. Results: Evaluation of AdvanSure Allostation Smart II as an upgraded version of AdvanSure AlloScreen revealed good concordance, with total agreement percentages of 93.0% and 92.2% in the food and inhalant panels, respectively. Comparisons of AdvanSure Allostation Smart II or PROTIA Allergy-Q with RIDA Allergy Screen also showed good concordance, with positive propensities of the two new analyzers for common allergens (Dermatophagoides farinae and Dermatophagoides pteronyssinus). Changes of the cut-off level resulted in various total agreement percentage fluctuations among allergens by different analyzers, although the current cut-off level of class 2 appeared to be generally suitable. Conclusions: AdvanSure Allostation Smart II and PROTIA Allergy-Q presented favorable agreement performances with RIDA Allergy Screen, although positive propensities were noticed for common allergens. Keywords: Multiple allergen simultaneous test, Automated analyzer

  11. Current status and future prospects of an automated sample exchange system PAM for protein crystallography

    Science.gov (United States)

    Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.

    2013-03-01

    To achieve fully automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) sample exchange robots at the PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.

  12. A longitudinal field multiple sampling ionization chamber for RIBLL2

    International Nuclear Information System (INIS)

    Tang Shuwen; Ma Peng; Lu Chengui; Duan Limin; Sun Zhiyu; Yang Herun; Zhang Jinxia; Hu Zhengguo; Xu Shanhu

    2012-01-01

    A longitudinal field MUltiple Sampling Ionization Chamber (MUSIC), which makes multiple measurements of the energy loss of very high energy heavy ions at RIBLL2, has been constructed and tested with a three-component α source (239Pu: 3.435 MeV, 241Am: 3.913 MeV, 244Cm: 4.356 MeV). The voltage plateau curve was plotted, and −500 V was determined to be a proper working voltage. The energy resolution is 271.4 keV FWHM for the sampling unit when 3.435 MeV is deposited. A Geant4 Monte Carlo simulation indicates that the detector can provide unique particle identification for ions with Z ≥ 4. (authors)

  13. A multiple sampling ionization chamber for the External Target Facility

    International Nuclear Information System (INIS)

    Zhang, X.H.; Tang, S.W.; Ma, P.; Lu, C.G.; Yang, H.R.; Wang, S.T.; Yu, Y.H.; Yue, K.; Fang, F.; Yan, D.; Zhou, Y.; Wang, Z.M.; Sun, Y.; Sun, Z.Y.; Duan, L.M.; Sun, B.H.

    2015-01-01

    A multiple sampling ionization chamber used as a particle identification device for high energy heavy ions has been developed for the External Target Facility. The performance of this detector was tested with a 239Pu α source and RI beams. A Z resolution (FWHM) of 0.4–0.6 was achieved for nuclear fragments of 18O at 400 A MeV.

  14. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible and ashed inside a muffle furnace at 450 °C. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for the initial handling of fecal samples, intended to automate the above procedure. The system, once developed, will help eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  15. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digest and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol had performance equal to or better than a manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process. The sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
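
    For illustration only, the control flow such a changer implements might look like the sketch below; the actual software is written in LabVIEW, and every function name here is hypothetical, with hardware I/O and the call into the measurement software left as stubs.

```python
# Hypothetical control-loop sketch of an automatic sample changer:
# move each sample to the detector, trigger one acquisition, repeat.
import time

COUNT_TIME_S = 3600          # one-hour counting time per sample
NUM_SAMPLES = 30             # up to 30 samples consecutively

def move_to_detector(position: int) -> None:
    """Drive the changer mechanics to place sample `position` on the detector."""
    ...  # hardware I/O (motor control, limit switches) would go here

def run_measurement(sample_id: str, live_time_s: int) -> None:
    """Invoke the spectrum-acquisition software for one measurement."""
    ...  # e.g. via the measurement program's automation/scripting interface

for pos in range(1, NUM_SAMPLES + 1):
    move_to_detector(pos)
    run_measurement(f"sample_{pos:02d}", COUNT_TIME_S)
    time.sleep(1)  # brief settle time before moving the next sample
```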

  17. Localized Multiple Kernel Learning Via Sample-Wise Alternating Optimization.

    Science.gov (United States)

    Han, Yina; Yang, Kunde; Ma, Yuanliang; Liu, Guizhong

    2014-01-01

    Our objective is to train support vector machines (SVM)-based localized multiple kernel learning (LMKL), using the alternating optimization between the standard SVM solvers with the local combination of base kernels and the sample-specific kernel weights. The advantage of alternating optimization developed from the state-of-the-art MKL is the SVM-tied overall complexity and the simultaneous optimization on both the kernel weights and the classifier. Unfortunately, in LMKL, the sample-specific character makes the updating of kernel weights a difficult quadratic nonconvex problem. In this paper, starting from a new primal-dual equivalence, the canonical objective on which state-of-the-art methods are based is first decomposed into an ensemble of objectives corresponding to each sample, namely, sample-wise objectives. Then, the associated sample-wise alternating optimization method is conducted, in which the localized kernel weights can be independently obtained by solving their exclusive sample-wise objectives, either linear programming (for l1-norm) or with closed-form solutions (for lp-norm). At test time, the learnt kernel weights for the training data are deployed based on the nearest-neighbor rule. Hence, to guarantee their generality among the test part, we introduce the neighborhood information and incorporate it into the empirical loss when deriving the sample-wise objectives. Extensive experiments on four benchmark machine learning datasets and two real-world computer vision datasets demonstrate the effectiveness and efficiency of the proposed algorithm.
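
    As an illustration of the kernel combination the abstract describes, the sketch below builds a localized kernel K(i,j) = Σ_m η_m(x_i) η_m(x_j) K_m(x_i,x_j) from per-sample weights η and deploys the weights of the nearest training neighbor at test time. The weights are set uniform here as a stand-in for the paper's alternating optimization, which is omitted, and the toy data are invented.

```python
# Localized multiple-kernel combination with nearest-neighbor weight
# deployment at test time (simplified; weight learning omitted).
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

base_kernels = [lambda A, B: linear_kernel(A, B),
                lambda A, B: rbf_kernel(A, B, gamma=0.5)]
M, n = len(base_kernels), len(X)

# Per-sample kernel weights eta (n x M); uniform as a placeholder for the
# weights the sample-wise alternating optimization would produce.
eta = np.full((n, M), 1.0 / M)

# K(i, j) = sum_m eta[i, m] * eta[j, m] * K_m(i, j)
K = sum(np.outer(eta[:, m], eta[:, m]) * k(X, X)
        for m, k in enumerate(base_kernels))
clf = SVC(kernel="precomputed").fit(K, y)

# Test time: each test point borrows the weights of its nearest neighbor.
Xt = rng.normal(size=(10, 5))
nearest = np.argmin(((Xt[:, None, :] - X[None, :, :]) ** 2).sum(-1), axis=1)
eta_t = eta[nearest]
Kt = sum(np.outer(eta_t[:, m], eta[:, m]) * k(Xt, X)
         for m, k in enumerate(base_kernels))
print(clf.predict(Kt))
```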

  18. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    Science.gov (United States)

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attracts continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    Science.gov (United States)

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6'-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In two swine manure samples and one soil sample 21.6, 12.3, and 2.5% of the cells were positive with an archaeal probe (S-D-Arch-0915-a-A-20), respectively. Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
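
    The classification step lends itself to a compact sketch: cluster the per-cell intensities with fuzzy c-means and label the brighter cluster as probe-positive. The implementation below is a generic fuzzy c-means on synthetic 1-D intensities, not the authors' Visilog pipeline.

```python
# Fuzzy c-means classification of per-cell FISH intensities (synthetic data).
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, iters=100, seed=0):
    """Basic fuzzy c-means for 1-D data; returns centers and memberships."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(x))          # memberships (n x c)
    for _ in range(iters):
        w = u ** m
        centers = (w * x[:, None]).sum(0) / w.sum(0)    # weighted means
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))                  # inverse-distance update
        u /= u.sum(1, keepdims=True)
    return centers, u

intens = np.concatenate([
    np.random.default_rng(1).normal(40, 8, 300),    # background cells
    np.random.default_rng(2).normal(120, 15, 200),  # probe-positive cells
])
centers, u = fuzzy_cmeans_1d(intens)
positive_cluster = int(np.argmax(centers))          # brighter cluster = target
labels = u.argmax(1) == positive_cluster
print(f"{100 * labels.mean():.1f}% positive cells")
```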

  20. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Ling-Yu Duan

    2010-01-01

    Full Text Available Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take into account intraclass diversity for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmarking datasets of different characteristics including Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, which has outperformed a canonical MKL.

  1. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Tian Yonghong

    2010-01-01

    Full Text Available Abstract Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take into account intraclass diversity for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmarking datasets of different characteristics including Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, which has outperformed a canonical MKL.

  2. Automated aerosol sampling and analysis for the Comprehensive Test Ban Treaty

    International Nuclear Information System (INIS)

    Miley, H.S.; Bowyer, S.M.; Hubbard, C.W.; McKinnon, A.D.; Perkins, R.W.; Thompson, R.C.; Warner, R.A.

    1998-01-01

    Detecting nuclear debris from a nuclear weapon exploded in or substantially vented to the Earth's atmosphere constitutes the most certain indication that a violation of the Comprehensive Test Ban Treaty has occurred. For this reason, a radionuclide portion of the International Monitoring System is being designed and implemented. The IMS will monitor aerosols and gaseous xenon isotopes to detect atmospheric and underground tests, respectively. An automated system, the Radionuclide Aerosol Sampler/Analyzer (RASA), has been developed at Pacific Northwest National Laboratory to meet CTBT aerosol measurement requirements. This is achieved by the use of a novel sampling apparatus, a high-resolution germanium detector, and very sophisticated software. This system draws a large volume of air (~20,000 m³/day), performs automated gamma-ray spectral measurements (MDC(140Ba) < 30 µBq/m³), and communicates this and other data to a central data facility. Automated systems offer the added benefit of rigid controls, easily implemented QA/QC procedures, and centralized depot maintenance and operation. Other types of automated communication include pull or push transmission of State-Of-Health data, commands, and configuration data. In addition, a graphical user interface, Telnet, and other interactive communications are supported over ordinary phone or network lines. This system has been the subject of a USAF commercialization effort to meet US CTBT monitoring commitments. It will also be available to other CTBT signatories and the monitoring community for various governmental, environmental, or commercial needs. The current status of the commercialization is discussed.

  3. Extensive monitoring through multiple blood samples in professional soccer players

    DEFF Research Database (Denmark)

    Heisterberg, Mette F; Fahrenkrug, Jan; Krustrup, Peter

    2013-01-01

    ABSTRACT: The aim of this study was to make a comprehensive gathering of consecutive detailed blood samples from professional soccer players, and to analyze different blood parameters in relation to seasonal changes in training and match exposure. Blood samples were collected five times during a six months period and analyzed for 37 variables in 27 professional soccer players from the best Danish league. Additionally, players were tested for body composition, VO2max and physical performance by the Yo-Yo intermittent endurance sub-max test (IE2). Multiple variations in blood parameters occurred during the observation period, including a decrease in hemoglobin and an increase in hematocrit as the competitive season progressed; iron and transferrin were stable, whereas ferritin showed a decrease at the end of the season. Leucocytes decreased with increased physical training. Lymphocytes decreased at the end of the season. VO2max decreased towards the end of the season whereas no significant changes were observed in the IE2 test. The regular blood samples from elite soccer players reveal significant changes that may be related to changes in training pattern, match exposure, or length of the match season.

  4. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-01-01

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment, removal of the gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrixes from the Hanford site

  5. Automated Registration of Images from Multiple Bands of Resourcesat-2 Liss-4 camera

    Science.gov (United States)

    Radhadevi, P. V.; Solanki, S. S.; Jyothi, M. V.; Varadan, G.

    2014-11-01

    Continuous and automated co-registration and geo-tagging of images from multiple bands of the Liss-4 camera is one of the interesting challenges of Resourcesat-2 data processing. The three arrays of the Liss-4 camera are physically separated in the focal plane in the along-track direction. Thus, the same line on the ground will be imaged by the extreme bands with a time interval of as much as 2.1 seconds. During this time, the satellite will have covered a distance of about 14 km on the ground and the earth will have rotated through an angle of 30". Yaw steering is done to compensate for the earth rotation effects, ensuring a first-level registration between the bands. But this will not achieve a perfect co-registration because of attitude fluctuations, satellite movement, terrain topography, PSM steering and small variations in the angular placement of the CCD lines (from the pre-launch values) in the focal plane. This paper describes an algorithm based on the viewing geometry of the satellite to perform automatic band-to-band registration of Liss-4 MX images of Resourcesat-2 at Level 1A. The algorithm uses the principles of the photogrammetric collinearity equations (see the sketch after this abstract). The model employs polynomial fitting of the orbit trajectory and attitude. Then, direct geo-referencing with a global DEM maps every pixel in the middle band to a particular position on the surface of the earth under the given attitude. Attitude is estimated by interpolating measurement data obtained from star sensors and gyros, which are sampled at low frequency. When the sampling rate of the attitude information is low compared to the frequency of jitter or micro-vibration, images processed by geometric correction suffer from distortion. Therefore, a set of conjugate points is identified between the bands to perform a relative attitude error estimation and correction, which ensures the internal accuracy and co-registration of the bands. Accurate calculation of the exterior orientation parameters with
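
    For reference, the textbook form of the collinearity equations relates an image point (x, y) to a ground point (X, Y, Z) through the perspective center (X_s, Y_s, Z_s), the principal point (x_0, y_0), the focal length f and a rotation matrix R = (r_ij) built from the attitude angles; the exact parametrization used by the authors may differ.

```latex
x - x_0 = -f \, \frac{r_{11}(X - X_s) + r_{12}(Y - Y_s) + r_{13}(Z - Z_s)}
                     {r_{31}(X - X_s) + r_{32}(Y - Y_s) + r_{33}(Z - Z_s)},
\qquad
y - y_0 = -f \, \frac{r_{21}(X - X_s) + r_{22}(Y - Y_s) + r_{23}(Z - Z_s)}
                     {r_{31}(X - X_s) + r_{32}(Y - Y_s) + r_{33}(Z - Z_s)}
```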

  6. Oak Ridge National Laboratory automated clean chemistry for bulk analysis of environmental swipe samples

    Energy Technology Data Exchange (ETDEWEB)

    Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples—currently performed by overburdened chemists—with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods led to a reduction in the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in a series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns, as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with 233U and 244Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs

  7. Simple and multiple linear regression: sample size considerations.

    Science.gov (United States)

    Hanley, James A

    2016-11-01

    The suggested "two subjects per variable" (2SPV) rule of thumb in the Austin and Steyerberg article is a chance to bring out some long-established and quite intuitive sample size considerations for both simple and multiple linear regression. This article distinguishes two of the major uses of regression models that imply very different sample size considerations, neither served well by the 2SPV rule. The first is etiological research, which contrasts mean Y levels at differing "exposure" (X) values and thus tends to focus on a single regression coefficient, possibly adjusted for confounders. The second research genre guides clinical practice. It addresses Y levels for individuals with different covariate patterns or "profiles." It focuses on the profile-specific (mean) Y levels themselves, estimating them via linear compounds of regression coefficients and covariates. By drawing on long-established closed-form variance formulae that lie beneath the standard errors in multiple regression, and by rearranging them for heuristic purposes, one arrives at quite intuitive sample size considerations for both research genres. Copyright © 2016 Elsevier Inc. All rights reserved.
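
    The closed-form variance the abstract alludes to is Var(β̂_j) = σ² / (n · var(x_j) · (1 − R²_j)), where R²_j measures how well the other covariates predict x_j; rearranged for n, it yields a simple sample-size rule. The sketch below is illustrative, with invented values.

```python
# Sample size from the closed-form variance of one regression coefficient:
# Var(beta_j) ~= sigma^2 / (n * var(x_j) * (1 - R2_j)), solved for n.
from math import ceil

def n_for_coefficient_se(sigma, var_x, r2_other, se_target):
    """Smallest n so the SE of one regression coefficient meets se_target."""
    return ceil(sigma**2 / (se_target**2 * var_x * (1.0 - r2_other)))

# e.g. residual SD 10, exposure variance 4, 30% of the exposure's variance
# shared with confounders, and a target SE(beta) of 0.5
print(n_for_coefficient_se(sigma=10, var_x=4, r2_other=0.3, se_target=0.5))
```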

  8. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling

    Directory of Open Access Journals (Sweden)

    Sergi Valverde

    2015-01-01

    Full Text Available Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w Multiple Sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the % of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. First of all, images were processed using the LST and SLS toolkits with different pipeline combinations that differed in either automated or manual lesion segmentation, and lesion filling or masking out lesions. Then, images processed following each of the pipelines were segmented into gray matter (GM) and white matter (WM) using SPM8, and compared with the same images where expert lesion annotations were filled before segmentation. Our results showed that fully automated lesion segmentation and filling pipelines reduced significantly the % of error in GM and WM volume on images of MS patients, and performed similarly to the images where expert lesion annotations were masked before segmentation. In all the pipelines, the amount of misclassified lesion voxels was the main cause of the observed error in GM and WM volume. However, the % of error was significantly lower when automatically estimated lesions were filled and not masked before segmentation. These results are relevant and suggest that the LST and SLS toolboxes allow the performance of accurate brain tissue volume measurements without any kind of manual intervention, which can be convenient not only in terms of time and economic costs, but also to avoid the inherent intra/inter variability between manual annotations.

  9. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    Science.gov (United States)

    Habershon, Scott

    2016-04-12

    In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles.
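
    A toy illustration of the "direct kinetic modeling" step: mass-action ODEs for a miniature two-step catalytic cycle with transition-state-theory (Eyring) rate constants. The cycle, barriers and concentrations below are invented for the example and are not the hydroformylation network from the paper.

```python
# Integrate a tiny kinetic network: Cat + A -> Int -> Cat + P,
# with rate constants from the Eyring equation (barriers invented).
import numpy as np
from scipy.integrate import solve_ivp

KB_H = 2.0837e10   # k_B / h in 1/(s*K)
R = 8.314e-3       # gas constant in kJ/(mol*K); barriers in kJ/mol
T = 298.15

def tst_rate(dg_kjmol):
    """Eyring equation: k = (k_B T / h) exp(-dG‡ / RT)."""
    return KB_H * T * np.exp(-dg_kjmol / (R * T))

k1 = tst_rate(70.0)   # Cat + A -> Int   (concentrations treated as dimensionless)
k2 = tst_rate(65.0)   # Int -> Cat + P

def rhs(t, c):
    cat, a, inter, p = c
    r1 = k1 * cat * a
    r2 = k2 * inter
    return [-r1 + r2, -r1, r1 - r2, r2]

sol = solve_ivp(rhs, (0, 500), [1e-3, 1.0, 0.0, 0.0])
print("product at t = 500 s:", sol.y[3, -1])
```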

  10. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Full Text Available Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well-handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.

  11. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.
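
    As a rough plausibility check on the quoted numbers, the activity-to-amount conversion A = λN links the 0.65 mBq/mL limit to the stated 0.67 amol: with a <2 μL sample, the arithmetic below gives roughly 0.6 amol of 14C label, the same order as quoted (the exact volume and constants used by the authors may differ).

```python
# Convert a 14C specific activity into an amount of label via A = lambda * N.
from math import log

T_HALF_14C = 5730 * 365.25 * 24 * 3600      # half-life of 14C in seconds
lam = log(2) / T_HALF_14C                   # decay constant, 1/s

activity = 0.65e-3 * 2e-3                   # Bq: 0.65 mBq/mL in a 2 uL sample
atoms = activity / lam                      # N = A / lambda
amol = atoms / 6.02214e23 / 1e-18           # attomoles of 14C label
print(f"{amol:.2f} amol")                   # ~0.6 amol, same order as 0.67 amol
```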

  12. Quantification of multiple elements in dried blood spot samples

    DEFF Research Database (Denmark)

    Pedersen, Lise; Andersen-Ranberg, Karen; Hollergaard, Mads

    2017-01-01

    BACKGROUND: Dried blood spots (DBS) is a unique matrix that offers advantages compared to conventional blood collection making it increasingly popular in large population studies. We here describe development and validation of a method to determine multiple elements in DBS. METHODS: Elements were...... in venous blood. Samples with different hematocrit were spotted onto filter paper to assess hematocrit effect. RESULTS: The established method was precise and accurate for measurement of most elements in DBS. There was a significant but relatively weak correlation between measurement of the elements Mg, K...

  13. Automation of Sample Transfer and Counting on Fast Neutron Activation System

    International Nuclear Information System (INIS)

    Dewita; Budi-Santoso; Darsono

    2000-01-01

    The automation of sample transfer and counting covers the transfer of the sample to the activation and counting positions, which previously had been done manually by switches and has now been developed with automatically programmed logic instructions. The development consisted of constructing the electronics hardware and the software for that communication. Transfer times are on the order of seconds and are measured automatically, with an error of 1.6 ms. The counting and activation times are set by the user in seconds and minutes; the execution error on minutes was 8.2 ms. The developed system will make it possible to measure short half-life elements and to run cyclic activation processes. (author)

  14. A data-driven multiplicative fault diagnosis approach for automation processes.

    Science.gov (United States)

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.

  15. Automated facility for analysis of soil samples by neutron activation, counting, and data control

    International Nuclear Information System (INIS)

    Voegele, A.L.; Jesse, R.H.; Russell, W.L.; Baker, J.

    1978-01-01

    An automated facility remotely and automatically analyzes soil, water, and sediment samples for uranium. The samples travel through pneumatic tubes and switches to be first irradiated by neutrons and then counted for resulting neutron and gamma emission. Samples are loaded into special carriers, or rabbits, which are then automatically loaded into the pneumatic transfer system. The sample carriers have been previously coded with an identification number, which can be automatically read in the system. This number is used for correlating and filing data about the samples. The transfer system, counters, and identification system are controlled by a network of microprocessors. A master microprocessor initiates routines in other microprocessors assigned to specific tasks. The software in the microprocessors is unique for this type of application and lends flexibility to the system

  16. Fully automated gamma spectrometry gauge observing possible radioactive contamination of melting-shop samples

    International Nuclear Information System (INIS)

    Kroos, J.; Westkaemper, G.; Stein, J.

    1999-01-01

    At Salzgitter AG, several monitoring systems have been installed to check scrap transport by rail and by car. At the moment, scrap transported by ship is reloaded onto wagons for monitoring afterwards. In the future, a detection system will be mounted onto a crane for a direct check on scrap upon the departure of the ship. Furthermore, at the Salzgitter AG Central Chemical Laboratory, a fully automated gamma spectrometry gauge is installed in order to detect possible radioactive contamination of the products. The gamma spectrometer is integrated into the automated OE spectrometry line for testing melting-shop samples after performing the OE spectrometry. With this technique the specific activity of selected nuclides and the dose rate are determined. The activity observation is part of the release procedure. The corresponding measurement data are stored in a database for quality management reasons. (author)

  17. A real-time automated quality control of rain gauge data based on multiple sensors

    Science.gov (United States)

    qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness is dependent on gauge densities and precipitation regimes. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product, and the new product agrees much better statistically with the independent gauges.
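
    In the spirit of the abstract, a radar-consistency gauge check might look like the simplified sketch below: flag an hourly gauge value when it disagrees with the collocated radar QPE by more than a tolerance that is widened where radar sampling is poor (far range, or beam near/above the freezing level). All thresholds are invented for illustration.

```python
# Simplified radar-consistency QC for one hourly gauge observation.
def gauge_passes_qc(gauge_mm, radar_mm, range_km, beam_height_km,
                    frz_level_km, base_tol_mm=5.0):
    tol = base_tol_mm
    if range_km > 150:                  # degraded radar sampling at far range
        tol *= 2.0
    if beam_height_km > frz_level_km:   # bright-band / ice contamination risk
        tol *= 2.0
    # accept if absolute or relative disagreement is within tolerance
    return abs(gauge_mm - radar_mm) <= max(tol, 0.5 * radar_mm)

print(gauge_passes_qc(12.0, 10.5, range_km=80,
                      beam_height_km=1.2, frz_level_km=3.0))
```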

  18. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  19. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in the detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns, each corresponding to a different known thickness. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two-beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
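
    The matching step reduces to a compact computation: compare the experimental pattern against a stack of patterns simulated for known thicknesses and take the thickness of the most similar one. The sketch below uses normalized cross-correlation on synthetic stand-ins; disk detection and coordinate unification are assumed to have been done already.

```python
# Pick the simulated CBED pattern (and hence thickness) most similar to the
# experimental one via normalized cross-correlation.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally-sized patterns."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

def estimate_thickness(experimental, simulated, thicknesses):
    """simulated: one pattern per entry in `thicknesses` (e.g. in nm)."""
    scores = [ncc(experimental, s) for s in simulated]
    best = int(np.argmax(scores))
    return thicknesses[best], scores[best]

# synthetic stand-ins for real patterns
rng = np.random.default_rng(0)
sims = [rng.random((64, 64)) for _ in range(5)]
exp_pat = sims[2] + 0.05 * rng.standard_normal((64, 64))
print(estimate_thickness(exp_pat, sims, [50, 75, 100, 125, 150]))
```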

  20. OASIS is Automated Statistical Inference for Segmentation, with applications to multiple sclerosis lesion segmentation in MRI.

    Science.gov (United States)

    Sweeney, Elizabeth M; Shinohara, Russell T; Shiee, Navid; Mateen, Farrah J; Chudgar, Avni A; Cuzzocreo, Jennifer L; Calabresi, Peter A; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2013-01-01

    Magnetic resonance imaging (MRI) can be used to detect lesions in the brains of multiple sclerosis (MS) patients and is essential for diagnosing the disease and monitoring its progression. In practice, lesion load is often quantified by either manual or semi-automated segmentation of MRI, which is time-consuming, costly, and associated with large inter- and intra-observer variability. We propose OASIS is Automated Statistical Inference for Segmentation (OASIS), an automated statistical method for segmenting MS lesions in MRI studies. We use logistic regression models incorporating multiple MRI modalities to estimate voxel-level probabilities of lesion presence. Intensity-normalized T1-weighted, T2-weighted, fluid-attenuated inversion recovery and proton density volumes from 131 MRI studies (98 MS subjects, 33 healthy subjects) with manual lesion segmentations were used to train and validate our model. Within this set, OASIS detected lesions with a partial area under the receiver operating characteristic curve for clinically relevant false positive rates of 1% and below of 0.59% (95% CI: [0.50%, 0.67%]) at the voxel level. An experienced MS neuroradiologist compared these segmentations to those produced by LesionTOADS, an image segmentation software that provides segmentation of both lesions and normal brain structures. For lesions, OASIS out-performed LesionTOADS in 74% (95% CI: [65%, 82%]) of cases for the 98 MS subjects. To further validate the method, we applied OASIS to 169 MRI studies acquired at a separate center. The neuroradiologist again compared the OASIS segmentations to those from LesionTOADS. For lesions, OASIS ranked higher than LesionTOADS in 77% (95% CI: [71%, 83%]) of cases. For a randomly selected subset of 50 of these studies, one additional radiologist and one neurologist also scored the images. Within this set, the neuroradiologist ranked OASIS higher than LesionTOADS in 76% (95% CI: [64%, 88%]) of cases, the neurologist in 66% (95% CI: [52%, 78%]) of cases.
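
    The core of the OASIS idea can be sketched compactly: a voxel-level logistic regression on intensity-normalized multimodal MRI features yields lesion probabilities that are then thresholded into a mask. The feature construction below is reduced to raw modality intensities, and the data are synthetic stand-ins for T1/T2/FLAIR/PD volumes.

```python
# Voxel-level logistic regression for lesion probability maps (toy data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_vox = 20000
X = rng.normal(size=(n_vox, 4))              # T1, T2, FLAIR, PD per voxel
w_true = np.array([-1.0, 1.5, 2.5, 0.5])     # invented generative weights
y = (X @ w_true + rng.logistic(size=n_vox) > 4.0).astype(int)  # sparse lesions

model = LogisticRegression(max_iter=1000).fit(X, y)
prob = model.predict_proba(X)[:, 1]          # voxel-level lesion probabilities
mask = prob > 0.5                            # threshold into a binary mask
print(f"lesion fraction: {mask.mean():.4f}")
```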

  1. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    International Nuclear Information System (INIS)

    El-Alaily, T.M.; El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M.; Assar, S.T.

    2015-01-01

    A low cost vibrating sample magnetometer (VSM) has been constructed by using an electromagnet and an audio loudspeaker, where both are controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested by using magnetic hysteresis data of some ferrite samples measured by two scientifically calibrated magnetometers: model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested by using some measured ferrite samples. • Our new lab-built VSM design proved successful and reliable

  2. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of some ferrite samples measured by two calibrated commercial magnetometers, model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). The new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using some measured ferrite samples. • The lab-built VSM design proved successful and reliable.

  3. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  4. Extensive monitoring through multiple blood samples in professional soccer players.

    Science.gov (United States)

    Heisterberg, Mette F; Fahrenkrug, Jan; Krustrup, Peter; Storskov, Anders; Kjær, Michael; Andersen, Jesper L

    2013-05-01

    The aim of this study was to make a comprehensive gathering of consecutive detailed blood samples from professional soccer players and to analyze different blood parameters in relation to seasonal changes in training and match exposure. Blood samples were collected 5 times during a 6-month period and analyzed for 37 variables in 27 professional soccer players from the best Danish league. Additionally, the players were tested for body composition, V̇O2max and physical performance by the Yo-Yo intermittent endurance submax test (IE2). Multiple variations in blood parameters occurred during the observation period, including a decrease in hemoglobin and an increase in hematocrit as the competitive season progressed. Iron and transferrin were stable, whereas ferritin showed a decrease at the end of the season. Immunoglobulin A (IgA) and IgM increased in the period with basal physical training and at the end of the season. Leucocytes decreased with increased physical training. Lymphocytes decreased at the end of the season. V̇O2max decreased toward the end of the season, whereas no significant changes were observed in the IE2 test. The regular blood samples from elite soccer players reveal significant changes that may be related to changes in training pattern, match exposure, or length of the match season. The end of the preparation season and the end of the competitive season in particular seem to be time points where the blood-derived values indicate that the players are under excessive physical strain and might be more subject to possible overreaching or overtraining conditions. We suggest that regular analyses of blood samples could be an important initiative to optimize training adaptation, training load, and game participation, but sampling has to be regular, and a database has to be built for each individual player.

  5. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    Science.gov (United States)

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time-consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance analytical precision, a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min, about 30 times faster than manual solvent extraction or single-cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography/mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  6. A new and standardized method to sample and analyse vitreous samples by the Cellient automated cell block system.

    Science.gov (United States)

    Van Ginderdeuren, Rita; Van Calster, Joachim; Stalmans, Peter; Van den Oord, Joost

    2014-08-01

    In this prospective study, a universal protocol for sampling and analysing vitreous material was investigated. Vitreous biopsies are difficult to handle because of the paucity of cells and the gelatinous structure of the vitreous. Histopathological analysis of the vitreous is useful in difficult uveitis cases to differentiate uveitis from lymphoma or infection and to define the type of cellular reaction. One hundred consecutive vitreous samples were analysed with the Cellient tissue processor (Hologic), a fully automated processor that takes cells from a specified container of PreservCyt (fixative fluid) through to paraffin. Cytology was compared between the fixatives Cytolyt (which contains a mucolytic) and PreservCyt. Routine histochemical and immunostainings were evaluated. In 92% of the cases, sufficient material was found for diagnosis. In 14%, a Cytolyt wash was necessary to prevent clogging of the tubes in the Cellient due to the viscosity of the sample. In 23%, the diagnosis was an acute inflammation (presence of granulocytes); in 33%, chronic active inflammation (presence of T lymphocytes); in 33%, low-grade inflammation (presence of CD68 cells, without T lymphocytes); and in 3%, a malignant process. A standardized protocol for sampling and handling vitreous biopsies, fixing in PreservCyt and processing by the Cellient gives satisfactory results in terms of morphology, cell yield and suitability for immunohistochemical staining. The diagnosis can be established or confirmed in more than 90% of cases. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  7. Automated cell counts on CSF samples: A multicenter performance evaluation of the GloCyte system.

    Science.gov (United States)

    Hod, E A; Brugnara, C; Pilichowska, M; Sandhaus, L M; Luu, H S; Forest, S K; Netterwald, J C; Reynafarje, G M; Kratz, A

    2018-02-01

    Automated cell counters have replaced manual enumeration of cells in blood and most body fluids. However, due to the unreliability of automated methods at very low cell counts, most laboratories continue to perform labor-intensive manual counts on many or all cerebrospinal fluid (CSF) samples. This multicenter clinical trial investigated whether the GloCyte System (Advanced Instruments, Norwood, MA), a recently FDA-approved automated cell counter which concentrates and enumerates red blood cells (RBCs) and total nucleated cells (TNCs), is sufficiently accurate and precise at very low cell counts to replace all manual CSF counts. The GloCyte System concentrates CSF and stains RBCs with fluorochrome-labeled antibodies and TNCs with nucleic acid dyes. RBCs and TNCs are then counted by digital image analysis. Residual adult and pediatric CSF samples obtained for clinical analysis at five different medical centers were used for the study. Cell counts were performed by the manual hemocytometer method and with the GloCyte System following the same protocol at all sites. The limits of the blank, detection, and quantitation, as well as the precision and accuracy of the GloCyte, were determined. The GloCyte detected as few as 1 TNC/μL and 1 RBC/μL, and reliably counted as low as 3 TNCs/μL and 2 RBCs/μL. The total coefficient of variation was less than 20%. Comparison with cell counts obtained with a hemocytometer showed good correlation (>97%) between the GloCyte and the hemocytometer, including at very low cell counts. The GloCyte instrument is a precise, accurate, and stable system to obtain red cell and nucleated cell counts in CSF samples. It allows for the automated enumeration of even very low cell numbers, which is crucial for CSF analysis. These results suggest that GloCyte is an acceptable alternative to the manual method for all CSF samples, including those with normal cell counts. © 2017 John Wiley & Sons Ltd.
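
    For context, limits of blank (LoB) and detection (LoD) in evaluations of this kind are commonly estimated with CLSI EP17-style formulas; the sketch below is a generic illustration under that assumption, not the study's exact protocol.

```python
# Hedged sketch of common LoB/LoD estimation (CLSI EP17-style, normal
# approximation); the GloCyte study's exact statistical protocol may differ.
import statistics

def limit_of_blank(blank_counts):
    # LoB = mean_blank + 1.645 * SD_blank (95th percentile of blanks)
    return statistics.mean(blank_counts) + 1.645 * statistics.stdev(blank_counts)

def limit_of_detection(blank_counts, low_sample_counts):
    # LoD = LoB + 1.645 * SD of a low-concentration sample
    return limit_of_blank(blank_counts) + 1.645 * statistics.stdev(low_sample_counts)

print(limit_of_blank([0, 1, 0, 0, 1, 0]))          # e.g., counts per microliter
print(limit_of_detection([0, 1, 0, 0, 1, 0], [2, 3, 2, 4, 3, 2]))
```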

  8. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips, and different strategies were employed to check for cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence, given properly designed tip washing routines, to process casework samples using an adapted magnetic bead extraction protocol.

  9. SAMPL4 & DOCK3.7: lessons for automated docking procedures

    Science.gov (United States)

    Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.

    2014-03-01

    The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, pose and affinity prediction with the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those of any other method used in SAMPL4, including methods that were similarly fast; (2) HIV integrase is a challenging target, and automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared to other methods), while affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important: serious errors were discovered with default settings, which have been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects will be discussed.

  10. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    Fisk, J.F.; Leasure, C.; Sauter, A.D.

    1993-01-01

    In its role as regulator, EPA is the recipient of enormous volumes of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to deliver the data contained on reporting forms also on diskette, for uploading into databases used for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, which serve the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples; data entry, checking and transmission; data assessment and qualification for use; and report generation that ties the analytical data quality back to the performance criteria defined prior to sample collection. Industry standard products (e.g., ORACLE, Microsoft Excel) will be used to ensure support for users, prevent dependence on proprietary software, and protect LANL's investment for the future

  11. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    DEFF Research Database (Denmark)

    Qiao, Jixin

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development...... and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples...... for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including...

  12. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  13. Automated otolith image classification with multiple views: an evaluation on Sciaenidae.

    Science.gov (United States)

    Wong, J Y; Chu, C; Chong, V C; Dhillon, S K; Loh, K H

    2016-08-01

    Combined multiple 2D views (proximal, anterior and ventral aspects) of the sagittal otolith are proposed here as a method to capture shape information for fish classification. Comparison of single-view and combined 2D-view classification shows improved accuracy for the latter, for nine species of Sciaenidae. The effects of shape description methods (shape indices, Procrustes analysis and elliptical Fourier analysis) on classification performance were evaluated. Procrustes analysis and elliptical Fourier analysis perform better than shape indices when a single view is considered, but all perform equally well with combined views. A generic content-based image retrieval (CBIR) system that ranks dissimilarity (Procrustes distance) of otolith images was built to search query images without the need for detailed information on the side (left or right), aspect (proximal or distal) and direction (positive or negative) of the otolith. Methods for the development of this automated classification system are discussed. © 2016 The Fisheries Society of the British Isles.
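
    A minimal sketch of the retrieval idea, assuming otolith outlines are available as equal-length landmark arrays: scipy's Procrustes fit yields a disparity that can rank database entries against a query. The database structure here is hypothetical.

```python
# Sketch of Procrustes-distance ranking for content-based otolith
# retrieval. Assumes all outlines have the same number of landmark points.
import numpy as np
from scipy.spatial import procrustes

def rank_by_procrustes(query_outline, database):
    """query_outline: (n_points, 2) array; database: dict name -> outline."""
    results = []
    for name, outline in database.items():
        _, _, disparity = procrustes(query_outline, outline)
        results.append((disparity, name))
    return sorted(results)  # smallest disparity = most similar shape
```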

  14. Automated whole-genome multiple alignment of rat, mouse, and human

    Energy Technology Data Exchange (ETDEWEB)

    Brudno, Michael; Poliakov, Alexander; Salamov, Asaf; Cooper, Gregory M.; Sidow, Arend; Rubin, Edward M.; Solovyev, Victor; Batzoglou, Serafim; Dubchak, Inna

    2004-07-04

    We have built a whole genome multiple alignment of the three currently available mammalian genomes using a fully automated pipeline which combines the local/global approach of the Berkeley Genome Pipeline and the LAGAN program. The strategy is based on progressive alignment and consists of two main steps: (1) alignment of the mouse and rat genomes; and (2) alignment of human to either the mouse-rat alignments from step 1, or the remaining unaligned mouse and rat sequences. The resulting alignments demonstrate high sensitivity, with 87% of all human gene-coding areas aligned in both mouse and rat. The specificity is also high: <7% of the rat contigs are aligned to multiple places in human, and 97% of all alignments with human sequence >100 kb agree with a three-way synteny map built independently using predicted exons in the three genomes. At the nucleotide level, <1% of the rat nucleotides are mapped to multiple places in the human sequence in the alignment, and 96.5% of human nucleotides within all alignments agree with the synteny map. The alignments are publicly available online, with visualization through the novel Multi-VISTA browser that we also present.

  15. Improving neutron multiplicity counting for the spatial dependence of multiplication: Results for spherical plutonium samples

    Energy Technology Data Exchange (ETDEWEB)

    Göttsche, Malte, E-mail: malte.goettsche@physik.uni-hamburg.de; Kirchner, Gerald

    2015-10-21

    The fissile mass deduced from a neutron multiplicity counting measurement of high mass dense items is underestimated if the spatial dependence of the multiplication is not taken into account. It is shown that an appropriate physics-based correction successfully removes the bias. It depends on four correction coefficients which can only be exactly determined if the sample geometry and composition are known. In some cases, for example in warhead authentication, available information on the sample will be very limited. MCNPX-PoliMi simulations have been performed to obtain the correction coefficients for a range of spherical plutonium metal geometries, with and without polyethylene reflection placed around the spheres. For hollow spheres, the analysis shows that the correction coefficients can be approximated with high accuracy as a function of the sphere's thickness depending only slightly on the radius. If the thickness remains unknown, less accurate estimates of the correction coefficients can be obtained from the neutron multiplication. The influence of isotopic composition is limited. The correction coefficients become somewhat smaller when reflection is present.

  16. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    Science.gov (United States)

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Compared on the same dataset with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA saves 5.31% more battery energy.
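
    A conceptual sketch of such a data-driven rule (the paper's exact update law is not reproduced here): shrink the sampling interval when the monitored parameter changes quickly, and relax it when readings are stable. All thresholds and bounds below are illustrative.

```python
# Toy data-driven adaptive sampling rule in the spirit of DDASA: the
# sampling interval adapts to the observed rate of change, saving energy.
def next_interval(recent_values, interval, t_min=60.0, t_max=3600.0,
                  threshold=0.05):
    """recent_values: last few sensor readings (e.g., dissolved oxygen);
    interval: current sampling interval in seconds (illustrative units)."""
    if len(recent_values) < 2:
        return interval
    rel_change = abs(recent_values[-1] - recent_values[-2]) / (
        abs(recent_values[-2]) + 1e-9)
    if rel_change > threshold:      # rapid change: sample more often
        interval /= 2.0
    else:                           # stable readings: back off to save power
        interval *= 1.5
    return min(max(interval, t_min), t_max)

print(next_interval([7.9, 8.0], 600.0))   # stable -> 900 s
print(next_interval([8.0, 6.5], 600.0))   # changing -> 300 s
```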

  17. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Tongxin Shu

    2017-11-01

    Full Text Available Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Compared on the same dataset with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA saves 5.31% more battery energy.

  18. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets the protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  19. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    Energy Technology Data Exchange (ETDEWEB)

    Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McBay, Eddy H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-30

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  20. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached into a cassette); the airflow through the filter is 800 m³/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1–10 x 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of CTBTO for aerosol monitoring. The concept suits well for nuclear material safeguards, too

  1. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions, about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm x 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant surfaces.

  2. MICROBIOLOGICAL MONITORING AND AUTOMATED EVENT SAMPLING AT KARST SPRINGS USING LEO-SATELLITES

    Science.gov (United States)

    Stadler, Hermann; Skritek, Paul; Sommer, Regina; Mach, Robert L.; Zerobin, Wolfgang; Farnleitner, Andreas H.

    2010-01-01

    Data communication via Low-Earth-Orbit satellites between portable hydro-meteorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments, also for E. coli field analysis. All activities in the course of the event sampling can be observed on an internet platform based on a Linux server. Samples taken conventionally by hand yielded results corresponding to those of the auto-sampling procedure and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific die-off rates (0.10–0.14 day−1), compensating for losses due to sample storage at spring temperature in the auto sampler. Two large summer events in 2005/2006 at a large alpine karst spring (LKAS2) were monitored, including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approx. 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite based system, it is a helpful tool for early warning systems in the field of drinking water protection. PMID:18776628
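
    The die-off correction mentioned above amounts to a first-order decay model; a minimal sketch, assuming the correction is applied as N_true = N_measured · exp(k·t) with an event-specific rate k:

```python
# Hedged sketch of the storage-loss correction: counts measured after
# storage in the auto sampler are corrected upward with a first-order
# die-off model, using an event-specific rate k (0.10-0.14 per day here).
import math

def correct_for_die_off(measured_count, storage_days, k_per_day=0.12):
    """Return the count corrected for die-off during storage."""
    return measured_count * math.exp(k_per_day * storage_days)

print(correct_for_die_off(500.0, 1.5))  # ~598 after 1.5 days of storage
```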

  3. Artificial Neural Network for Total Laboratory Automation to Improve the Management of Sample Dilution.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show a simple artificial neural network that can be used to decide, from the information available through the laboratory information system, whether a sample needs to be prediluted. In particular, the Multilayer Perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Therefore, such an artificial neural network can be easily implemented into a total automation framework to substantially reduce the turnaround time of critical orders delayed by the operations required to retrieve, dilute, and retest the sample.
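
    A toy sketch of the approach, assuming a few illustrative LIS-derived features (the study's exact inputs and network configuration are not reproduced): a small scikit-learn multilayer perceptron is trained to flag orders that need predilution.

```python
# Illustrative MLP for predilution prediction; features and data are
# hypothetical stand-ins for LIS-derived inputs, not the study's dataset.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Features per order: [previous_result, hours_since_previous, cardiac_event_flag]
X_train = np.array([[0.02, 6.0, 0], [48.0, 3.0, 1], [0.5, 12.0, 0], [95.0, 2.0, 1]])
y_train = np.array([0, 1, 0, 1])   # 1 = predilution required

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict([[60.0, 4.0, 1]]))   # e.g., flag this order for predilution
```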

  4. Rapid and automated determination of plutonium and neptunium in environmental samples

    International Nuclear Information System (INIS)

    Qiao, J.

    2011-03-01

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development in this work consists of 5 subjects stated as follows: (1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with inductively coupled plasma mass spectrometry detection (Paper II); (2) Methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) Investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) Exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  5. Rapid and automated determination of plutonium and neptunium in environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, J.

    2011-03-15

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development in this work consists of 5 subjects stated as follows: (1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with inductively coupled plasma mass spectrometry detection (Paper II); (2) Methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) Investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) Exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the developed methods in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium as demanded in different situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  6. Automated, feature-based image alignment for high-resolution imaging mass spectrometry of large biological samples

    NARCIS (Netherlands)

    Broersen, A.; Liere, van R.; Altelaar, A.F.M.; Heeren, R.M.A.; McDonnell, L.A.

    2008-01-01

    High-resolution imaging mass spectrometry of large biological samples is the goal of several research groups. In mosaic imaging, the most common method, the large sample is divided into a mosaic of small areas that are then analyzed with high resolution. Here we present an automated alignment

  7. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  8. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    Science.gov (United States)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen the outliers from massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP is to produce a global 30 m spatial resolution impervious cover data set for years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high resolution impervious cover data set is not only significant to urbanization studies but also desired by global carbon, hydrology, and energy balance research. A supervised classification method, regression tree, is applied in this project, and a set of accurate training samples is the key to such supervised classifications. Here we developed the global-scale training samples from fine resolution (about 1 m) satellite data (Quickbird and Worldview2) and then aggregated the fine resolution impervious cover map to 30 m resolution. In order to improve the classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally. For example, in Europe alone, there are 174 training sites; the size of the sites ranges from 4.5 km by 4.5 km to 8.1 km by 3.6 km, and the total number of training samples is over six million. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: site and scene. At the site level, all the training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling within each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. Then the screening process escalates to the scene level. A similar screening process, but with a looser threshold, is applied at the scene level to allow for possible variance due to site differences. We do not perform the screening process across scenes because the scenes might vary due to
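
    A compact sketch of the per-group screening, assuming a z-score test for the univariate level and a Mahalanobis-distance test for the multivariate level; thresholds and feature layout are illustrative, not the project's settings.

```python
# Sketch of two-stage outlier screening for one 10% imperviousness bin:
# univariate z-score screen first, then a multivariate Mahalanobis screen.
import numpy as np

def screen_bin(samples, z_thresh=3.0, maha_thresh=3.0):
    """samples: (n, d) array of spectral features for one 10% bin."""
    z = np.abs((samples - samples.mean(axis=0)) / (samples.std(axis=0) + 1e-12))
    keep = (z < z_thresh).all(axis=1)            # univariate screen

    mu = samples[keep].mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(samples[keep], rowvar=False))
    diff = samples - mu
    d = np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
    keep &= d < maha_thresh                      # multivariate screen
    return samples[keep]
```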

  9. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-01-01

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper

  10. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. Automated patterning and probing with multiple nanoscale tools for single-cell analysis.

    Science.gov (United States)

    Li, Jiayao; Kim, Yeonuk; Liu, Boyin; Qin, Ruwen; Li, Jian; Fu, Jing

    2017-10-01

    The nano-manipulation approach that combines Focused Ion Beam (FIB) milling with various imaging and probing techniques enables researchers to investigate cellular structures in three dimensions. Such a fusion approach, however, requires extensive effort to locate and examine randomly distributed targets, owing to the limited Field of View (FOV) when high magnification is desired. In the present study, we present a development that automates 'pattern and probe' particularly for single-cell analysis, achieved by computer-aided tools including feature recognition and geometric planning algorithms. Scheduling of serial FOVs for imaging and probing of multiple cells was treated as a rectangle covering problem, and optimal or near-optimal solutions were obtained with the heuristics developed. FIB milling was then employed automatically, followed by downstream analysis using Atomic Force Microscopy (AFM) to probe the cellular interior. Our strategy was applied to examine bacterial cells (Klebsiella pneumoniae) and achieved high efficiency with limited human interference. The developed algorithms can be easily adapted and integrated with different imaging platforms towards high-throughput imaging analysis of single cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
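
    The FOV scheduling can be illustrated with a toy greedy heuristic for the rectangle covering problem (not the authors' algorithm): repeatedly place a fixed-size field of view where it covers the most remaining targets.

```python
# Toy greedy rectangle-covering heuristic for scheduling fields of view
# over randomly distributed cell targets. Purely illustrative.
def greedy_fov_cover(targets, fov_w, fov_h):
    """targets: list of (x, y) cell coordinates; returns chosen FOV centers."""
    remaining = set(targets)
    fovs = []
    while remaining:
        # Try centering a FOV on each remaining target; count its coverage.
        best_center, best_cov = None, set()
        for (x, y) in remaining:
            cov = {(u, v) for (u, v) in remaining
                   if abs(u - x) <= fov_w / 2 and abs(v - y) <= fov_h / 2}
            if len(cov) > len(best_cov):
                best_center, best_cov = (x, y), cov
        fovs.append(best_center)
        remaining -= best_cov
    return fovs

print(greedy_fov_cover([(0, 0), (1, 1), (9, 9)], fov_w=4, fov_h=4))
```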

  12. Archival Bone Marrow Samples: Suitable for Multiple Biomarker Analysis?

    DEFF Research Database (Denmark)

    Lund, Bendik; Najmi, A. Laeya; Wesolowska, Agata

    2015-01-01

    biopsies from 18 Danish and Norwegian childhood acute lymphoblastic leukemia patients were included and compared with corresponding blood samples. Samples were grouped according to the age of sample and whether WGA was performed or not. We found that measurements of DNA concentration after DNA extraction...

  13. A simple and automated sample preparation system for subsequent halogens determination: Combustion followed by pyrohydrolysis.

    Science.gov (United States)

    Pereira, L S F; Pedrotti, M F; Vecchia, P Dalla; Pereira, J S F; Flores, E M M

    2018-06-20

    A simple and automated system based on combustion followed by a pyrohydrolysis reaction was proposed for subsequent halogens determination. This system was applied to the digestion of soils containing high (90%) and also low (10%) organic matter content for further halogens determination. The following parameters were evaluated: sample mass, use of microcrystalline cellulose and heating time. For analyte absorption, a diluted alkaline solution (6 mL of 25 mmol L⁻¹ NH₄OH) was used in all experiments. Up to 400 mg of soil with high organic matter content and 100 mg of soil with low organic matter content (mixed with 400 mg of cellulose) could be completely digested using the proposed system. Quantitative results for all halogens were obtained in less than 12 min of sample preparation (about 1.8 min for sample combustion and 10 min for pyrohydrolysis). The accuracy was evaluated using a certified reference material of coal and spiked samples. No statistical difference was observed between the certified values and results obtained by the proposed method. Additionally, the recoveries obtained using spiked samples were in the range of 98-103%, with relative standard deviation values lower than 5%. The limits of quantification obtained for F, Cl, Br and I for soil with high (400 mg of soil) and low (100 mg of soil) organic matter were in the range of 0.01–2 μg g⁻¹ and 0.07–59 μg g⁻¹, respectively. The proposed system was considered a simple and suitable alternative for soil digestion prior to halogens determination by ion chromatography and inductively coupled plasma mass spectrometry. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    Science.gov (United States)

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device that uses 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) and the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potentially mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
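
    The core check is a comparison of two name strings with a fail-safe default; a minimal sketch using Python's standard difflib, with a purely illustrative normalization and threshold:

```python
# Minimal sketch of the name-discrepancy check: the OCR-read label name is
# compared with the LIS name, and anything short of a match is routed to a
# human. Normalization and threshold are illustrative, not the system's.
from difflib import SequenceMatcher

def normalize(name):
    return " ".join(name.upper().replace(",", " ").split())

def needs_human_review(lis_name, ocr_name, threshold=1.0):
    ratio = SequenceMatcher(None, normalize(lis_name), normalize(ocr_name)).ratio()
    return ratio < threshold   # any discrepancy fails safe to inspection

print(needs_human_review("DOE, JOHN", "DOE JOHN"))   # False: names agree
print(needs_human_review("DOE, JOHN", "DOE, JON"))   # True: flag for review
```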

  15. Retention model for sorptive extraction-thermal desorption of aqueous samples : application to the automated analysis of pesticides and polyaromatic hydrocarbons in water samples

    NARCIS (Netherlands)

    Baltussen, H.A.; David, F.; Sandra, P.J.F.; Janssen, J.G.M.; Cramers, C.A.M.G.

    1998-01-01

    In this report, an automated method for sorptive enrichment of aqueous samples is presented. It is based on sorption of the analytes of interest into a packed bed containing 100% polydimethylsiloxane (PDMS) particles followed by thermal desorption for complete transfer of the enriched solutes onto

  16. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Science.gov (United States)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  17. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at approximately 60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis are simultaneously accomplished at a cross assembly set at approximately 70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  18. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    Science.gov (United States)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, benefiting the entire Structural Biology community.

  19. Improving automated multiple sclerosis lesion segmentation with a cascaded 3D convolutional neural network approach.

    Science.gov (United States)

    Valverde, Sergi; Cabezas, Mariano; Roura, Eloy; González-Villà, Sandra; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Rovira, Àlex; Oliver, Arnau; Lladó, Xavier

    2017-07-15

    In this paper, we present a novel automated method for White Matter (WM) lesion segmentation of Multiple Sclerosis (MS) patient images. Our approach is based on a cascade of two 3D patch-wise convolutional neural networks (CNN). The first network is trained to be highly sensitive, revealing possible candidate lesion voxels, while the second network is trained to reduce the number of misclassified voxels coming from the first network. This cascaded CNN architecture tends to learn well from a small (n≤35) set of labeled data of the same MRI contrast, which can be very interesting in practice, given the difficulty of obtaining manual label annotations and the large amount of available unlabeled Magnetic Resonance Imaging (MRI) data. We evaluate the accuracy of the proposed method on the public MS lesion segmentation challenge MICCAI2008 dataset, comparing it with other state-of-the-art MS lesion segmentation tools. Furthermore, the proposed method is also evaluated on two private MS clinical datasets, where its performance is compared with that of different recent publicly available state-of-the-art MS lesion segmentation methods. At the time of writing, our method is the best ranked approach on the MICCAI2008 challenge, outperforming the other 60 participant methods when using all the available input modalities (T1-w, T2-w and FLAIR), while still in the top rank (3rd position) when using only T1-w and FLAIR modalities. On clinical MS data, our approach exhibits a significant increase in the accuracy of WM lesion segmentation compared with the rest of the evaluated methods, while also correlating highly (r≥0.97) with the expected lesion volume. Copyright © 2017 Elsevier Inc. All rights reserved.
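
    A schematic PyTorch sketch of the cascade (layer sizes and patch shapes are illustrative, not the paper's architecture): a first, sensitive network proposes candidate patches, and a second network of the same shape re-classifies only those candidates.

```python
# Schematic two-stage 3D patch-wise CNN cascade: stage 1 is run on all
# patches for high sensitivity; stage 2 re-scores only the candidates to
# prune false positives. Illustrative sizes, not the published model.
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    def __init__(self, in_channels=3):   # e.g., T1-w, T2-w, FLAIR patches
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv3d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(64, 2),
        )

    def forward(self, x):                # x: (batch, channels, d, h, w)
        return self.features(x)

net1, net2 = PatchCNN(), PatchCNN()
patches = torch.randn(8, 3, 11, 11, 11)
cand = torch.softmax(net1(patches), dim=1)[:, 1] > 0.5   # sensitive stage
refined = net2(patches[cand])                            # pruning stage
```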

  20. AST: an automated sequence-sampling method for improving the taxonomic diversity of gene phylogenetic trees.

    Science.gov (United States)

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand on computing resources; (2) applications analyzing a collection of gene trees may prefer trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present AST, an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, which obtains representative sequences that maximize the taxonomic diversity of the sampled set. To demonstrate the effectiveness of AST, we have tested it on four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16S ribosomal RNAs and glycosyltransferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful for phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php.
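    Selecting a taxonomically diverse subset can be approached greedily: always pick the sequence whose taxon is currently least represented in the sample. The following is a minimal illustration of that general idea in Python, not AST's exact algorithm.

```python
# Illustrative greedy subsampling that keeps taxonomic diversity high: pick
# sequences so that under-represented taxa are preferred. A sketch of the
# general idea, not AST's exact algorithm.
from collections import Counter

def diversity_sample(seq_taxa, k):
    """seq_taxa: list of (sequence_id, taxon) pairs; returns k sequence ids."""
    chosen, counts = [], Counter()
    remaining = list(seq_taxa)
    for _ in range(min(k, len(remaining))):
        # Prefer the sequence whose taxon is currently least represented.
        seq_id, taxon = min(remaining, key=lambda st: counts[st[1]])
        chosen.append(seq_id)
        counts[taxon] += 1
        remaining.remove((seq_id, taxon))
    return chosen

pool = [("s1", "E. coli"), ("s2", "E. coli"), ("s3", "B. subtilis"),
        ("s4", "E. coli"), ("s5", "A. thaliana")]
print(diversity_sample(pool, 3))   # one sequence per taxon before any repeats
```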

  1. Automation of the radiation measuring facilities for samples in health physics - MA 9

    International Nuclear Information System (INIS)

    Martini, M.

    1980-12-01

    Routine radiation measurements of samples are performed by the HMI health physics department by means of test stations for individual samples and for multiple samples (using a sample-changing device). The basic device of these test stations is a SCALER/TIMER system (BF 22/25, BERTHOLD Corp.). This measuring facility has been extended by CAMAC instrumentation, which incorporates an autonomous CAMAC processor (CAPRO-1, INCAA B.V.) for monitoring and automatic control of the system. The programming language is BASIC. A DECwriter (LA 34) is used for user interaction and for printing the measurement results. This report describes the features of the system and presents some examples of the dialogue with the system and the printout of data. (orig.) [de

  2. Automated vessel shadow segmentation of fovea-centered spectral-domain images from multiple OCT devices

    Science.gov (United States)

    Wu, Jing; Gerendas, Bianca S.; Waldstein, Sebastian M.; Simader, Christian; Schmidt-Erfurth, Ursula

    2014-03-01

    Spectral-domain Optical Coherence Tomography (SD-OCT) is a non-invasive modality for acquiring high-resolution, three-dimensional (3D) cross-sectional volumetric images of the retina and the subretinal layers. SD-OCT also allows the detailed imaging of retinal pathology, aiding clinicians in the diagnosis of sight-degrading diseases such as age-related macular degeneration (AMD) and glaucoma. Disease diagnosis, assessment, and treatment require a patient to undergo multiple OCT scans, possibly using different scanning devices, to accurately and precisely gauge disease activity, progression and treatment success. However, the use of OCT imaging devices from different vendors, combined with patient movement, may result in poor scan spatial correlation, potentially leading to incorrect patient diagnosis or treatment analysis. Image registration can be used to precisely compare disease states by registering differing 3D scans to one another. In order to align 3D scans from different time-points and vendors using registration, landmarks are required, the most obvious being the retinal vasculature. Presented here is a fully automated cross-vendor method to acquire retinal vessel locations for OCT registration from fovea-centred 3D SD-OCT scans based on vessel shadows. Noise-filtered OCT scans are flattened based on vendor retinal layer segmentation to extract the retinal pigment epithelium (RPE) layer of the retina. Voxel-based layer profile analysis and k-means clustering are used to extract candidate vessel shadow regions from the RPE layer. In conjunction, the extracted RPE layers are combined to generate a projection image featuring all candidate vessel shadows. Image processing methods for vessel segmentation of the OCT-constructed projection image are then applied to optimize the accuracy of OCT vessel shadow segmentation through the removal of false-positive shadow regions such as those caused by exudates and cysts. Validation of segmented vessel shadows uses
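    The clustering step can be illustrated compactly: cluster the en-face intensity of the flattened RPE layer with k-means and treat the darker cluster as shadow candidates. A sketch on synthetic data, assuming scikit-learn; the image size, intensities and cluster count are invented for illustration.

```python
# Sketch of the candidate-extraction step: cluster per-column intensities of
# a flattened RPE layer with k-means and treat the darkest cluster as
# candidate vessel shadow. Synthetic data; parameters illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
rpe = rng.normal(180, 10, size=(64, 64))       # bright en-face RPE intensities
rpe[:, 20] -= 90                               # a dark vessel-shadow column
rpe[:, 41] -= 80

features = rpe.reshape(-1, 1)                  # one mean intensity per A-scan
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# The cluster with the lower mean intensity holds the shadow candidates.
means = [features[labels == c].mean() for c in (0, 1)]
shadow_mask = (labels == int(np.argmin(means))).reshape(rpe.shape)
print("candidate shadow columns:", np.unique(np.where(shadow_mask)[1]))
```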

  3. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    International Nuclear Information System (INIS)

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-01-01

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants

  4. A novel flow injection chemiluminescence method for automated and miniaturized determination of phenols in smoked food samples.

    Science.gov (United States)

    Vakh, Christina; Evdokimova, Ekaterina; Pochivalov, Aleksei; Moskvin, Leonid; Bulatov, Andrey

    2017-12-15

    An easily performed, fully automated and miniaturized flow injection chemiluminescence (CL) method for the determination of phenols in smoked food samples has been proposed. The method includes ultrasound-assisted solid-liquid extraction coupled with gas-diffusion separation of phenols from the smoked food sample and absorption of the analytes into a NaOH solution in a specially designed gas-diffusion cell. The flow system was designed with a focus on automation and miniaturization, with minimal sample and reagent consumption and inexpensive instrumentation. The luminol-N-bromosuccinimide system in an alkaline medium was used for the CL determination of phenols. The limit of detection of the proposed procedure was 3·10⁻⁸ mol L⁻¹ (0.01 mg kg⁻¹) in terms of phenol. The presented method proved to be a good tool for easy, rapid and cost-effective point-of-need screening of phenols in smoked food samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and of the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support the evaluation and design of automated aids for flight management and airspace management, and to predict the required changes in both air and ground procedures in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  6. Automated identification and quantification of glycerophospholipid molecular species by multiple precursor ion scanning

    DEFF Research Database (Denmark)

    Ejsing, Christer S.; Duchoslav, Eva; Sampaio, Julio

    2006-01-01

    We report a method for the identification and quantification of glycerophospholipid molecular species that is based on the simultaneous automated acquisition and processing of 41 precursor ion spectra, specific for acyl anions of common fatty acid moieties and several lipid class-specific fragment… of glycerophospholipids. The automated analysis of total lipid extracts was powered by a robotic nanoflow ion source and produced what is currently the most detailed description of the glycerophospholipidome…

  7. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small-molecule detection and quantification from a complex… sample matrix. The main result is a working prototype of a microfluidic system, integrating centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, and in-situ electrochemical detection. As a case study…

  8. Separation of pigment formulations by high-performance thin-layer chromatography with automated multiple development.

    Science.gov (United States)

    Stiefel, Constanze; Dietzel, Sylvia; Endress, Marc; Morlock, Gertrud E

    2016-09-02

    Food packaging is designed to provide sufficient protection for the respective filling, legally binding information for consumers, such as nutritional facts, and an attractive appearance to promote the sale. For the quality and safety of the package, regular quality control of the printing materials used is necessary to obtain consistently good print results, to avoid migration of undesired ink components into the food, and to identify potentially faulty ink batches. Analytical approaches, however, have hardly been considered for quality assurance so far, due to the lack of robust methods suitable for the analysis of poorly soluble pigment formulations. Thus, a simple and generic high-performance thin-layer chromatography (HPTLC) method for the separation of differently colored pigment formulations was developed on silica gel 60 HPTLC plates by automated multiple development. The gradient system provided sharp resolution of differently soluble pigment constituents such as additives and coating materials. The results of multi-detection allowed a first assignment of the detectable bands to particular chemical substance classes (e.g., lipophilic components), enabled the comparison of different commercially available pigment batches, and revealed substantial variations in the composition of the batches. Hyphenation of HPTLC with high-resolution mass spectrometry and infrared spectroscopy allowed the characterization of single unknown pigment constituents, which may partly be responsible for known quality problems during printing. The newly developed, precise and selective HPTLC method can be used as part of routine quality control for both incoming pigment batches and the monitoring of internal pigment production processes, to secure a consistent pigment composition resulting in consistent ink quality, a faultless print image and safe products. Hyphenation of HPTLC with the A. fischeri bioassay gave first information on the bioactivity or rather

  9. Determining Sample Size for Accurate Estimation of the Squared Multiple Correlation Coefficient.

    Science.gov (United States)

    Algina, James; Olejnik, Stephen

    2000-01-01

    Discusses determining sample size for estimation of the squared multiple correlation coefficient and presents regression equations that permit determination of the sample size for estimating this parameter for up to 20 predictor variables. (SLD)
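    The underlying question can also be illustrated by brute force: simulate data with a known population squared multiple correlation and watch how the spread of the sample estimate shrinks with n. The following Monte Carlo sketch is an illustration of the problem, not the article's regression equations.

```python
# Brute-force illustration: how large must n be so that the sample squared
# multiple correlation is, with high probability, close to the population
# value? Monte Carlo sketch, not the article's regression equations.
import numpy as np

def r2_halfwidth(n, rho2, p=5, reps=2000, seed=1):
    rng = np.random.default_rng(seed)
    beta = np.full(p, np.sqrt(rho2 / p))        # predictors jointly explain rho2
    est = []
    for _ in range(reps):
        X = rng.standard_normal((n, p))
        y = X @ beta + rng.standard_normal(n) * np.sqrt(1 - rho2)
        yhat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
        est.append(np.corrcoef(yhat, y)[0, 1] ** 2)
    lo, hi = np.percentile(est, [2.5, 97.5])
    return (hi - lo) / 2                         # empirical 95% half-width

for n in (50, 100, 200, 400):
    print(n, round(r2_halfwidth(n, rho2=0.25), 3))
```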

  10. ASPIRE: An automated sample positioning and irradiation system for radiation biology experiments at Inter University Accelerator Centre, New Delhi

    International Nuclear Information System (INIS)

    Kothari, Ashok; Barua, P.; Archunan, M.; Rani, Kusum; Subramanian, E.T.; Pujari, Geetanjali; Kaur, Harminder; Satyanarayanan, V.V.V.; Sarma, Asitikantha; Avasthi, D.K.

    2015-01-01

    An automated irradiation setup for biology samples has been built at the Inter University Accelerator Centre (IUAC), New Delhi, India. It can automatically load and unload 20 biology samples in one experimental run, and it takes about 20 min [2% of the cell doubling time] to irradiate all 20 samples. The cell doubling time is the time taken by the cells (kept in the medium) to double in number. The cells in the samples keep growing throughout the experiment. The fluence delivered to the samples is measured with two silicon surface-barrier detectors. Tests show that the uniformity of fluence and dose of heavy ions reaches 2% over a sample area 40 mm in diameter. The accuracy of the mean fluence at the center of the target area is within 1%. The irradiation setup can be used for studies of radiation therapy, radiation dosimetry and molecular biology at the heavy-ion accelerator. - Highlights: • An automated positioning and irradiation setup for biology samples at IUAC has been built. • Loading and unloading of 20 biology samples can be carried out automatically. • Biological cells keep growing during the entire experiment. • Fluence and dose of heavy ions are measured by two silicon barrier detectors. • Uniformity of fluence and dose of heavy ions at the sample position reaches 2%

  11. Effects of (α,n) contaminants and sample multiplication on statistical neutron correlation measurements

    International Nuclear Information System (INIS)

    Dowdy, E.J.; Hansen, G.E.; Robba, A.A.; Pratt, J.C.

    1980-01-01

    The complete formalism for the use of statistical neutron fluctuation measurements for the nondestructive assay of fissionable materials has been developed. This formalism includes the effect of detector deadtime, neutron multiplicity, random neutron pulse contributions from (α,n) contaminants in the sample, and the sample multiplication of both fission-related and background neutrons
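    The dilution effect of uncorrelated (α,n) singles on a correlation measurement can be demonstrated with a toy variance-to-mean (Feynman-Y) simulation. All rates, multiplicities and the detection model below are invented for the demonstration and do not reproduce the report's formalism.

```python
# Toy illustration of a statistical neutron fluctuation ("Feynman-Y")
# measurement: correlated fission neutrons raise the variance-to-mean ratio
# of gated counts above 1, while uncorrelated (alpha,n) singles dilute the
# excess. All numbers are invented for the demonstration.
import numpy as np

rng = np.random.default_rng(2)

def gated_counts(fission_rate, alpha_n_rate, eff, gates, gate_width):
    counts = np.zeros(gates, dtype=int)
    # Fission events: each emits a random multiplicity of correlated neutrons.
    n_fissions = rng.poisson(fission_rate * gate_width * gates)
    gate_of_fission = rng.integers(0, gates, n_fissions)
    for g in gate_of_fission:
        nu = rng.choice([0, 1, 2, 3, 4], p=[0.03, 0.16, 0.34, 0.31, 0.16])
        counts[g] += rng.binomial(nu, eff)   # detect each neutron with prob eff
    # (alpha,n) contaminants: uncorrelated Poisson singles.
    counts += rng.poisson(alpha_n_rate * gate_width * eff, gates)
    return counts

for alpha_n in (0.0, 5000.0):                # (alpha,n) neutrons per second
    c = gated_counts(fission_rate=1000.0, alpha_n_rate=alpha_n,
                     eff=0.3, gates=50000, gate_width=1e-3)
    y = c.var() / c.mean() - 1.0             # Feynman-Y excess variance
    print(f"(a,n) rate {alpha_n:7.0f}/s  ->  Y = {y:.3f}")
```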

  12. UAV : Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  13. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes, at a high level, the network-form game framework (based on Bayesian networks and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  14. Automated hardwood lumber grading utilizing a multiple sensor machine vision technology

    Science.gov (United States)

    D. Earl Kline; Chris Surak; Philip A. Araman

    2003-01-01

    Over the last 10 years, scientists at the Thomas M. Brooks Forest Products Center, the Bradley Department of Electrical and Computer Engineering, and the USDA Forest Service have been working on lumber scanning systems that can accurately locate and identify defects in hardwood lumber. Current R&D efforts are targeted toward developing automated lumber grading...

  15. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  16. Performance of optimized McRAPD in identification of 9 yeast species frequently isolated from patient samples: potential for automation.

    Science.gov (United States)

    Trtkova, Jitka; Pavlicek, Petr; Ruskova, Lenka; Hamal, Petr; Koukalova, Dagmar; Raclavsky, Vladislav

    2009-11-10

    Rapid, easy, economical and accurate species identification of yeasts isolated from clinical samples remains an important challenge for routine microbiological laboratories, because susceptibility to antifungal agents, the probability of developing resistance and the ability to cause disease vary among species. To overcome the drawbacks of the currently available techniques, we have recently proposed an innovative approach to yeast species identification based on RAPD genotyping, termed McRAPD (Melting curve of RAPD). Here we have evaluated its performance on a broader spectrum of clinically relevant yeast species and also examined the potential of automated and semi-automated interpretation of McRAPD data for yeast species identification. A simple, fully automated algorithm based on normalized melting data identified 80% of the isolates correctly. When this algorithm was supplemented by semi-automated matching of decisive peaks in first-derivative plots, 87% of the isolates were identified correctly. However, computer-aided visual matching of derivative plots showed the best performance, with an average of 98.3% of isolates identified accurately, almost matching the 99.4% performance of traditional RAPD fingerprinting. Since the McRAPD technique omits gel electrophoresis and can be performed in a rapid, economical and convenient way, we believe that it can find its place in the routine identification of medically important yeasts in advanced diagnostic laboratories that are able to adopt this technique. It can also serve as a broad-range high-throughput technique for epidemiological surveillance.

  17. Automated pose estimation of objects using multiple ID devices for handling and maintenance task in nuclear fusion reactor

    International Nuclear Information System (INIS)

    Umetani, Tomohiro; Morioka, Jun-ichi; Tamura, Yuichi; Inoue, Kenji; Arai, Tatsuo; Mae, Yasusi

    2011-01-01

    This paper describes a method for the automated estimation of the three-dimensional pose (position and orientation) of objects by autonomous robots, using multiple identification (ID) devices. Our goal is to estimate object pose for assembly or maintenance tasks in a real nuclear fusion reactor system, with autonomous robots cooperating in a virtual assembly system. The paper discusses a method of motion generation for ID acquisition, using sensory data acquired by the measurement system attached to the robots and from the environment. Experimental results show the feasibility of the proposed method. (author)

  18. Advantages of automation in plasma sample preparation prior to HPLC/MS/MS quantification: application to the determination of cilazapril and cilazaprilat in a bioequivalence study.

    Science.gov (United States)

    Kolocouri, Filomila; Dotsikas, Yannis; Apostolou, Constantinos; Kousoulos, Constantinos; Soumelas, Georgios-Stefanos; Loukas, Yannis L

    2011-01-01

    An HPLC/MS/MS method characterized by complete automation and high throughput was developed for the determination of cilazapril and its active metabolite cilazaprilat in human plasma. All sample preparation and analysis steps were performed by using 2.2 mL 96 deep-well plates, while robotic liquid handling workstations were utilized for all liquid transfer steps, including liquid-liquid extraction. The whole procedure was very fast compared to a manual procedure with vials and no automation. The method also had a very short chromatographic run time of 1.5 min. Sample analysis was performed by RP-HPLC/MS/MS with positive electrospray ionization using multiple reaction monitoring. The calibration curve was linear in the range of 0.500-300 and 0.250-150 ng/mL for cilazapril and cilazaprilat, respectively. The proposed method was fully validated and proved to be selective, accurate, precise, reproducible, and suitable for the determination of cilazapril and cilazaprilat in human plasma. Therefore, it was applied to a bioequivalence study after per os administration of 2.5 mg tablet formulations of cilazapril.

  19. Automated system for noise-measurements on low-ohmic samples and magnetic sensors

    NARCIS (Netherlands)

    Jonker, R.J.W.; Briaire, J.; Vandamme, L.K.J.

    1999-01-01

    An automated system for electronic noise measurements on metal films is presented. This new system, controlled by a personal computer which utilizes National Instruments' LabVIEW software, is designed to measure low frequency noise as a function of an externally imposed magnetic field and as a

  20. A Study on Integrated Control Network for Multiple Automation Services-1st year report

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, D.H.; Park, B.S.; Kim, M.S.; Lim, Y.H.; Ahn, S.K. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    This report describes the development of an Integrated and Intelligent Gateway (IIG), which is under development. The network operating technique in this report can identify the causes of communication faults and can avoid communication network faults in advance. Utility companies spend large financial investments and much time on supplying stabilized power. Since this is deeply related to the reliability of automation systems, it is natural to employ a fault-tolerant communication network for automation systems. Use of the network system developed in this report is not limited to DAS; it can be extended to many kinds of data services for customers. Thus this report suggests the direction of communication network development. This 1st-year report is composed of the following contents: 1) the introduction and problems of DAS; 2) the configuration and functions of the IIG; 3) the protocols. (author). 27 refs., 73 figs., 6 tabs.

  1. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks have led to further efforts towards improving the existing automation based software production methodology. A large number of control system upgrades were successfully performed for th...

  2. Biological Agent Sample Preparation for the Detection and Identification of Multiple Agents by Nucleic Acid-Based Analysis

    National Research Council Canada - National Science Library

    Fields, Robert

    2002-01-01

    … The AutoLyser instrument, which we have developed, provides fully automated purification of viral, bacterial and human genomic DNA and RNA from clinical samples, cell culture and swabs in as little…

  3. Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.

    Science.gov (United States)

    Peters, Carl N; Evans, Iain E J

    2016-12-01

    Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients, which is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering with reduced mixing at the gradient layers when compared with usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and the likelihood of error. Overall, this device is a simple and effective solution for the formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.

  4. Automated processing of whole blood samples into microliter aliquots of plasma.

    Science.gov (United States)

    Burtis, C A; Johnson, W W; Walker, W A

    1988-01-01

    A rotor that accepts and automatically processes a bulk aliquot of a single blood sample into multiple aliquots of plasma has been designed and built. The rotor consists of a central processing unit, which includes a disk containing eight precision-bore capillaries. By varying the internal diameters of the capillaries, aliquot volumes ranging from 1 to 10 µl can be prepared. In practice, an unmeasured volume of blood is placed in a centre well and, as the rotor begins to spin, is moved radially into a central annular ring where it is distributed into a series of processing chambers. The rotor is then spun at 3000 rpm for 10 min. When the centrifugal field is removed by slowly decreasing the rotor speed, an aliquot of plasma is withdrawn by capillary action into each of the capillary tubes. The disk containing the eight measured aliquots of plasma is subsequently removed and placed in a modified rotor for conventional centrifugal analysis. Initial evaluation of the new rotor indicates that it is capable of producing discrete, microliter volumes of plasma with a degree of accuracy and precision approaching that of mechanical pipettes.

  5. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Khoualdi, Kamel

    1994-01-01

    Nowadays, the supervision of industrial installations is more and more complex, involving the automation of their control. A malfunction generates an avalanche of alarms. The operator in charge of supervision must face the incident and execute the right actions to recover a normal situation; generally, he is drowned under the great number of alarms. Our aim is to develop an alarm filtering system for an automated metro line, to help the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system but using two different approaches. The first part was developed in the frame of the SARA project (an operator assistance system for an automated metro line), an expert system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach has shown its limits. In the second part of our work, we considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach was motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge. Thus, each piece of equipment has been considered as an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author) [fr

  6. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    Science.gov (United States)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status of the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes, 6" and 16" OTAs housed in two separate domes, while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of their location in strategically separated time zones. Ultimately we wish to investigate the applicability of shock-in-jet and geometric models, which try to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.

  7. Automation of multiple neutral beam injector controls at Lawrence Livermore Laboratory

    International Nuclear Information System (INIS)

    Pollock, G.G.

    1977-01-01

    The computer control system used on the twelve Neutral Beams of the 2XIIB experiment at the Lawrence Livermore Laboratory (LLL) has evolved over the last three years. It is now in its final form and in regular use. It provides automatic data collection, reduction, and graphics presentation, as well as automatic conditioning, automatic normal operation, and processing of calorimeter data. This paper presents an overview of the capabilities and implementation of the current system, a detailed discussion of the automatic conditioning algorithm, and discusses the future directions for neutral beam automation

  8. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminum turntable disk which can accommodate 19 polyethylene capsules containing the samples to be sent, via the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated from a control switchboard that sends and returns capsules with a variable preset time and by two different paths, allowing the determination of short-, medium- and long-lived radionuclides. Another mechanism, called the 'exchange valve', changes the travel paths (pipelines), allowing irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described and presented in this article. (authors).

  9. MaxBin 2.0: an automated binning algorithm to recover genomes from multiple metagenomic datasets

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yu-Wei [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Simmons, Blake A. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Steven W. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-10-29

    The recovery of genomes from metagenomic datasets is a critical step to defining the functional roles of the underlying uncultivated populations. We previously developed MaxBin, an automated binning approach for high-throughput recovery of microbial genomes from metagenomes. Here, we present an expanded binning algorithm, MaxBin 2.0, which recovers genomes from co-assembly of a collection of metagenomic datasets. Tests on simulated datasets revealed that MaxBin 2.0 is highly accurate in recovering individual genomes, and the application of MaxBin 2.0 to several metagenomes from environmental samples demonstrated that it could achieve two complementary goals: recovering more bacterial genomes compared to binning a single sample as well as comparing the microbial community composition between different sampling environments. Availability and implementation: MaxBin 2.0 is freely available at http://sourceforge.net/projects/maxbin/ under BSD license. Supplementary information: Supplementary data are available at Bioinformatics online.

  10. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood, followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision-bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor, where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  11. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  12. Analysis of Product Sampling for New Product Diffusion Incorporating Multiple-Unit Ownership

    Directory of Open Access Journals (Sweden)

    Zhineng Hu

    2014-01-01

    Full Text Available Multiple-unit ownership of nondurable products is an important component of sales in many product categories. Based on the Bass model, this paper develops a new model that treats multiple-unit adoption as a diffusion process under the influence of product sampling. The analysis determines the optimal dynamic sampling effort for a firm; the results demonstrate that experience sampling can accelerate the diffusion process and that the best time to send free samples is just before the product is launched. Multiple-unit purchasing behavior can increase sales and profit for a firm, but it requires more samples to make the product better known. The local sensitivity analysis shows that an increase in either the external or the internal coefficient has a negative influence on the sampling level, while the internal influence on subsequent multiple-unit adoptions has little significant influence on sampling. Using logistic regression along with linear regression, the global sensitivity analysis examines the interaction of all factors, showing that external influence and the multiple-unit purchase rate are the two most important factors influencing the sampling level and the net present value of the new product, and it presents a two-stage method to determine the sampling level.
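    The interplay of external influence, internal influence and sampling can be sketched numerically with a Bass-type hazard augmented by a sampling term. The coefficients and the form of the sampling term below are illustrative assumptions; the paper's full model additionally separates first purchases from multiple-unit adoptions.

```python
# Numerical sketch of a Bass-type diffusion augmented with a sampling-effort
# term. Coefficients and the sampling term are illustrative, not the paper's
# exact model, which also distinguishes first and multiple-unit adoptions.
import numpy as np

M = 10000.0        # market potential
p, q = 0.03, 0.38  # external (innovation) and internal (imitation) coefficients

def diffuse(sampling, steps=120, dt=0.25):
    n, path = 0.0, []
    for t in range(steps):
        s = sampling(t * dt)                   # free-sample effort at time t
        adopt = (p + s + q * n / M) * (M - n)  # Bass hazard plus sampling boost
        n += adopt * dt
        path.append(n)
    return np.array(path)

no_samples = diffuse(lambda t: 0.0)
pre_launch = diffuse(lambda t: 0.05 if t < 2.0 else 0.0)  # early free samples
print("adopters at t=10:", round(no_samples[39]), "vs", round(pre_launch[39]))
```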

  13. Immunosuppressant therapeutic drug monitoring by LC-MS/MS: workflow optimization through automated processing of whole blood samples.

    Science.gov (United States)

    Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario

    2013-11-01

    Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted to on-line solid-phase extraction. The only manual steps in the entire process were de-capping of the tubes and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and with automated sample preparation was optimal (n=390, r=0.96). In daily routine use (100 patient samples), the typical overall turnaround time was less than 6 h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.

  14. Carotid Catheterization and Automated Blood Sampling Induce Systemic IL-6 Secretion and Local Tissue Damage and Inflammation in the Heart, Kidneys, Liver and Salivary Glands in NMRI Mice

    DEFF Research Database (Denmark)

    Teilmann, Anne Charlotte; Rozell, Björn; Kalliokoski, Otto

    2016-01-01

    Automated blood sampling through a vascular catheter is a frequently utilized technique in laboratory mice. The potential immunological and physiological implications associated with this technique have, however, not been investigated in detail. The present study compared plasma levels of the cyt… and embolized to distant sites. Thus, catheterization and subsequent automated blood sampling may have a physiological impact. Possible confounding effects of visceral damage should be assessed and considered when using catheterized mouse models.

  15. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    Science.gov (United States)

    2014-01-01

    Background: Computational methods for mining the biomedical literature can augment manual keyword searches for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology: A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine the significance of the putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results: Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer
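    The ranking step can be sketched as a two-proportion z-score comparing a marker's tagging frequency in positive versus negative abstracts. The exact statistic used in the paper may differ, and apart from the breast-cancer corpus sizes quoted above, the counts below are invented for illustration.

```python
# Sketch of the ranking step: score each candidate marker by how much more
# often it is tagged in disease-positive abstracts than chance would predict.
# Two-proportion z-score; hit counts are invented, corpus sizes from the text.
import math

def marker_z(pos_hits, pos_total, neg_hits, neg_total):
    p1, p2 = pos_hits / pos_total, neg_hits / neg_total
    p = (pos_hits + neg_hits) / (pos_total + neg_total)   # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / pos_total + 1 / neg_total))
    return (p1 - p2) / se

candidates = {"HER2": (410, 34296, 950, 2653396),
              "ACTB": (600, 34296, 46000, 2653396)}       # housekeeping control
for name, counts in sorted(candidates.items(),
                           key=lambda kv: -marker_z(*kv[1])):
    print(f"{name:5s} z = {marker_z(*counts):7.1f}")
```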

  16. EXPERIMENTS TOWARDS DETERMINING BEST TRAINING SAMPLE SIZE FOR AUTOMATED EVALUATION OF DESCRIPTIVE ANSWERS THROUGH SEQUENTIAL MINIMAL OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Sunil Kumar C

    2014-01-01

    Full Text Available With the number of students growing each year, there is a strong need for automated systems capable of evaluating descriptive answers. Unfortunately, there are not many systems capable of performing this task. In this paper, we use a machine learning tool called LightSIDE to accomplish automated evaluation and scoring of descriptive answers. Our experiments are designed around our primary goal of identifying the optimum training sample size for optimum auto scoring. Besides the technical overview and the experiment design, the paper also covers the challenges and benefits of the system. We also discuss interdisciplinary areas for future research on this topic.

  17. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    Science.gov (United States)

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method, but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited to long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review examines recent developments in automated water sampler technology and uses selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  18. Automated flow cytometric analysis across large numbers of samples and cell types.

    Science.gov (United States)

    Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno

    2015-04-01

    Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
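    The GMM-plus-BIC selection at the heart of the initial clustering can be reproduced in a few lines, with scikit-learn standing in for the pipeline's own implementation; the two-marker events below are synthetic.

```python
# Core idea in a few lines: fit Gaussian mixtures with increasing numbers of
# components to (synthetic) two-marker flow data and keep the BIC-optimal
# model, as FlowGM does for initial clustering.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
events = np.vstack([                       # three synthetic cell populations
    rng.normal((2.0, 6.0), 0.3, (500, 2)),
    rng.normal((5.5, 5.0), 0.4, (300, 2)),
    rng.normal((4.0, 1.5), 0.3, (200, 2)),
])

fits = [GaussianMixture(n_components=k, random_state=0).fit(events)
        for k in range(1, 8)]
best = min(fits, key=lambda m: m.bic(events))   # Bayesian Information Criterion
print("BIC-selected clusters:", best.n_components)
print("cluster sizes:", np.bincount(best.predict(events)))
```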

  19. Multiple products management system with sensors array in automated storage and retrieval systems

    Science.gov (United States)

    Vongbunyong, Supachai; Roengritronnachai, Perawat; Kongsanit, Savanut; Chanok-owat, Chawisa; Polchankajorn, Pongsakorn

    2018-01-01

    Automated Storage and Retrieval Systems (AS/RS) are now widely used in a number of industries due to their capability to manage the storage of products automatically and effectively. One of the key features of AS/RS is that racks are not assigned to specific products, which benefits space utilization and logistics. In this research, each rack is equipped with a sensor array in order to enhance this feature. As a result, various products can be identified and mixed in each rack, increasing space utilization efficiency. To prove the concept, a prototype system was built, consisting of a Cartesian robot that manages the storage and retrieval of products with 9 variations based on size and color. The concepts of cyber-physical systems and system self-awareness are also implemented in this concept prototype.

  20. Analysis of multiple single nucleotide polymorphisms (SNP) on DNA traces from plasma and dried blood samples

    NARCIS (Netherlands)

    Catsburg, Arnold; van der Zwet, Wil C.; Morre, Servaas A.; Ouburg, Sander; Vandenbroucke-Grauls, Christina M. J. E.; Savelkoul, Paul H. M.

    2007-01-01

    Reliable analysis of single nucleotide polymorphisms (SNPs) in DNA derived from samples containing low numbers of cells or from suboptimal sources can be difficult. A new procedure to characterize multiple SNPs in traces of DNA from plasma and old dried blood samples was developed. Six SNPs in the

  1. A large-capacity sample-changer for automated gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Andeweg, A.H.

    1980-01-01

    An automatic sample-changer has been developed at the National Institute for Metallurgy for use in gamma-ray spectroscopy with a lithium-drifted germanium detector. The sample-changer features remote storage, which prevents cross-talk and reduces background. It has a capacity for 200 samples and a sample container that takes liquid or solid samples. The rotation and vibration of samples during counting ensure that powdered samples are compacted, and improve the precision and reproducibility of the counting geometry [af

  2. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)

    2015-04-29

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration: samples were recorded manually in a logbook and given an ID number, and all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.
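    The core data-entry step being automated, generating batch sample codes and a printable registration record, can be sketched as follows. The code format and field names are assumptions for illustration, not the software's actual output.

```python
# Minimal sketch of automated batch sample registration: generate sequential
# sample codes for one irradiation batch and a printable record. The code
# format is an invented convention for illustration only.
from datetime import date

def register_batch(batch_no, items):
    """items: list of (description, sample_type) tuples for one batch."""
    records = []
    for i, (desc, stype) in enumerate(items, start=1):
        code = f"NAA-{date.today():%Y%m%d}-B{batch_no:02d}-{i:03d}"
        records.append({"code": code, "description": desc, "type": stype})
    return records

batch = register_batch(7, [("soil, site A", "sample"),
                           ("IAEA-Soil-7", "SRM"),
                           ("empty vial", "blank")])
for r in batch:                       # printable registration form lines
    print(f'{r["code"]}  {r["type"]:6s}  {r["description"]}')
```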

  3. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Science.gov (United States)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration: samples were recorded manually in a logbook and given an ID number, and all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on `Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.

  4. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive and fast system for the simultaneous online isolation and preconcentration of lead and strontium, by sorption on a microcolumn packed with Sr-resin and detection by inductively coupled plasma mass spectrometry (ICP-MS), was developed by hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are retained directly on the sorbent column and eluted with a solution of 0.05 mol L⁻¹ ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used, since the proposed system allows the use of different sample volumes for preconcentration. The linear mass working ranges were 0.13-50 ng and 0.1-50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters, as well as the certified reference material SLRS-4 (river water), were satisfactorily analyzed, with recoveries between 90 and 110% for both elements. The main features of the proposed LOV-MSFIA-ICP-MS system are the capability to renew the solid-phase extraction column at will in a fully automated way, the remarkable stability of the column, which can be reused up to 160 times, and the potential to perform isotopic analysis.
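    Why mass (rather than concentration) calibration suits variable preconcentration volumes can be shown with a tiny linear-fit example. Only the working-range masses come from the text; the detector counts below are invented.

```python
# With variable preconcentration volumes the detector response tracks the
# absolute analyte mass loaded on the column, so the calibration is built in
# mass units. Minimal illustration with invented ICP-MS count rates.
import numpy as np

mass_ng = np.array([0.13, 0.5, 1.0, 5.0, 10.0, 50.0])       # Pb working range
counts = np.array([310, 1180, 2350, 11800, 23500, 117000])  # invented signals

slope, intercept = np.polyfit(mass_ng, counts, 1)

def pb_mass(signal):                 # invert the calibration for an unknown
    return (signal - intercept) / slope

# The same mass gives the same signal whether it came from 1 mL or 10 mL of
# sample; divide by the preconcentrated volume to recover the concentration.
print(f"{pb_mass(5900):.2f} ng loaded -> {pb_mass(5900)/10:.3f} ng/mL in 10 mL")
```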

  5. Automated optimal coordination of multiple-DOF neuromuscular actions in feedforward neuroprostheses.

    Science.gov (United States)

    Lujan, J Luis; Crago, Patrick E

    2009-01-01

    This paper describes a new method for designing feedforward controllers for multiple-muscle, multiple-DOF motor-system neural prostheses. The design process is based on experimental measurement of the forward input/output properties of the neuromechanical system and numerical optimization of stimulation patterns to meet muscle coactivation criteria, thus resolving the muscle redundancy (i.e., overcontrol) and the coupled DOF problems inherent in neuromechanical systems. We designed feedforward controllers to control the isometric forces at the tip of the thumb in two directions during stimulation of three thumb muscles as a model system. We tested the method experimentally in ten able-bodied individuals and one patient with spinal cord injury. Good control of isometric force in both DOFs was observed, with rms errors less than 10% of the force range in seven experiments and statistically significant correlations between the actual and target forces in all ten experiments. Systematic bias and slope errors were observed in a few experiments, likely due to neuromuscular fatigue. Overall, the tests demonstrated the ability of a general design approach to satisfy both control and coactivation criteria in multiple-muscle, multiple-axis neuromechanical systems, which is applicable to a wide range of neuromechanical systems and stimulation electrodes.
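
    The numerical design step can be pictured as a bounded least-squares problem: choose muscle activations that reproduce the target two-axis force while a regularization term spreads effort across the redundant muscles. A minimal sketch under assumed values (the 2x3 force matrix, weight and bounds are illustrative, not the paper's data):

      # Feedforward pattern design as bounded least squares with a simple
      # coactivation (Tikhonov) term; all numbers are illustrative.
      import numpy as np
      from scipy.optimize import lsq_linear

      F = np.array([[1.0, 0.4, 0.1],    # N per unit activation, force axis 1
                    [0.2, 0.9, 0.8]])   # N per unit activation, force axis 2
      target = np.array([0.8, 0.6])     # desired isometric tip forces (N)

      lam = 0.1                          # weight of the coactivation penalty
      A = np.vstack([F, lam * np.eye(3)])
      b = np.concatenate([target, np.zeros(3)])
      res = lsq_linear(A, b, bounds=(0.0, 1.0))   # activations constrained to [0, 1]
      print("activations:", res.x, "achieved force:", F @ res.x)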

  6. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure...... control, UV absorbance measurements and automated data analysis. As little as 15 µl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis...

  7. Pharmacokinetic Studies of Chinese Medicinal Herbs Using an Automated Blood Sampling System and Liquid Chromatography-mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Yu-Tse Wu

    2012-01-01

    Full Text Available The safety of herbal products is one of the major concerns in the modernization of traditional Chinese medicine, and pharmacokinetic data on medicinal herbs guide the rational use of herbal formulas. This article reviews the advantages of automated blood sampling (ABS) systems for pharmacokinetic studies. In addition, three commonly used sample preparation methods, protein precipitation, liquid-liquid extraction and solid-phase extraction, are introduced. Furthermore, the definition, causes and evaluation of matrix effects in liquid chromatography-mass spectrometry (LC/MS) analysis are described. Finally, we present our previous work as practical examples of the application of ABS systems and LC/MS to the pharmacokinetic study of Chinese medicinal herbs.

  8. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Milliard, Alex; Durand-Jezequel, Myriam [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada); Lariviere, Dominic, E-mail: dominic.lariviere@chm.ulaval.ca [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada)

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs), and experimental data show no evidence of cross-contamination of crucibles when LiBO{sub 2}/LiBr melts were used. The use of an M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg{sup -1} for 5-300 mg of sample.

  9. Extending laboratory automation to the wards: effect of an innovative pneumatic tube system on diagnostic samples and transport time.

    Science.gov (United States)

    Suchsland, Juliane; Winter, Theresa; Greiser, Anne; Streichert, Thomas; Otto, Benjamin; Mayerle, Julia; Runge, Sören; Kallner, Anders; Nauck, Matthias; Petersmann, Astrid

    2017-02-01

    The innovative pneumatic tube system (iPTS) transports one sample at a time without the use of cartridges and allows rapid sending of samples directly into the bulk loader of a laboratory automation system (LAS). We investigated the effects of the iPTS on samples and turn-around time (TAT). During transport, a mini data logger recorded the accelerations in three dimensions and reported them in arbitrary area under the curve (AUC) units. In addition, representative clinical chemistry, hematology and coagulation quantities were measured and compared in 20 blood sample pairs transported by iPTS and courier. Samples transported by iPTS were brought to the laboratory (300 m) within 30 s without adverse effects on the samples. The information retrieved from the data logger showed a median AUC of 7 and 310 arbitrary units for courier and iPTS transport, respectively. This is considerably below the reported limit for noticeable hemolysis of 500 arbitrary units. The iPTS reduces TAT through reduced hands-on time and fast transport. No differences in the measurement results were found for any of the 36 investigated analytes between courier and iPTS transport. Based on these findings the iPTS was cleared for clinical use in our hospital.
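
    The data-logger comparison boils down to integrating the acceleration record into a single area-under-the-curve figure that can be checked against the reported hemolysis limit. A minimal sketch, with an assumed sampling rate and baseline handling (the abstract does not give the logger's specifics):

      # AUC of acceleration magnitude above the 1 g resting baseline,
      # in arbitrary units comparable to the reported 500 a.u. limit.
      import numpy as np

      def transport_auc(ax, ay, az, fs=100.0, baseline=1.0):
          """Rectangle-rule integral of excess acceleration (arbitrary units)."""
          mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
          excess = np.clip(mag - baseline, 0.0, None)
          return excess.sum() / fs

      # compare courier vs. iPTS recordings against the 500 a.u. hemolysis limit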

  10. An automated synthesis-purification-sample-management platform for the accelerated generation of pharmaceutical candidates.

    Science.gov (United States)

    Sutherland, J David; Tu, Noah P; Nemcek, Thomas A; Searle, Philip A; Hochlowski, Jill E; Djuric, Stevan W; Pan, Jeffrey Y

    2014-04-01

    A flexible and integrated flow-chemistry synthesis-purification, compound-generation, and sample-management platform has been developed to accelerate the production of small-molecule organic-compound drug candidates in pharmaceutical research. Central to the integrated system is a Mitsubishi robot, which hands samples off to the next station throughout the process: synthesis and purification, sample dispensing for purity and quantification analysis, dry-down, and aliquot generation.

  11. A novel sampling method for multiple multiscale targets from scattering amplitudes at a fixed frequency

    Science.gov (United States)

    Liu, Xiaodong

    2017-08-01

    A sampling method using the scattering amplitude is proposed for shape and location reconstruction in inverse acoustic scattering problems. Only matrix multiplication is involved in the computation, so the novel sampling method is very easy and simple to implement. With the help of the factorization of the far field operator, we establish an inf-criterion for the characterization of the underlying scatterers. This result is then used to give a lower bound of the proposed indicator functional for sampling points inside the scatterers, while for sampling points outside the scatterers we show that the indicator functional decays like the Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functional depends continuously on the scattering amplitude, which further implies that the novel sampling method is extremely stable with respect to errors in the data. Unlike classical sampling methods such as the linear sampling method or the factorization method, from the numerical point of view the novel indicator takes its maximum near the boundary of the underlying target and decays like the Bessel functions as the sampling points move away from the boundary. The numerical simulations also show that the proposed sampling method can handle the multiple multiscale case, even when the different components are close to each other.
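
    For orientation, a factorization-method indicator of the general kind alluded to above can be written as follows; this is a standard textbook form and an assumption here, not necessarily the exact functional of the paper. With (λ_j, ψ_j) the eigensystem of the far field operator and φ_z the test function attached to a sampling point z:

      \[
        I(z) \;=\; \Biggl( \sum_{j} \frac{\lvert \langle \phi_z, \psi_j \rangle \rvert^{2}}{\lvert \lambda_j \rvert} \Biggr)^{-1},
        \qquad
        \phi_z(\hat{x}) = e^{-\mathrm{i} k \, \hat{x} \cdot z}.
      \]

    Such an indicator stays bounded below for z inside the scatterer and tends to zero outside, matching the inside/outside dichotomy described in the abstract.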

  12. Trace element analysis of environmental samples by multiple prompt gamma-ray analysis method

    International Nuclear Information System (INIS)

    Oshima, Masumi; Matsuo, Motoyuki; Shozugawa, Katsumi

    2011-01-01

    The multiple γ-ray detection method has proven to be a high-resolution and high-sensitivity method for nuclide quantification. The neutron prompt γ-ray analysis method is successfully extended by combining it with multiple γ-ray detection, an approach called multiple prompt γ-ray analysis (MPGA). In this review we present the principle of this method and its characteristics. Several examples of its application to environmental samples, especially river sediments from urban areas and sea sediments, are also described. (author)

  13. Low-sampling-rate M-ary multiple access UWB communications in multipath channels

    KAUST Repository

    Alkhodary, Mohammad T.

    2015-08-31

    The desirable characteristics of ultra-wideband (UWB) technology are challenged by a formidable sampling frequency, performance degradation in the presence of multi-user interference, and receiver complexity due to the channel estimation process. In this paper, a low-rate-sampling technique is used to implement M-ary multiple access UWB communications in both the detection and channel estimation stages. A novel approach is used for multiple-access-interference (MAI) cancellation for the purpose of channel estimation. Results show reasonable performance of the proposed receiver for different numbers of users operating at sampling rates many times below the Nyquist rate.

  14. Low-sampling-rate M-ary multiple access UWB communications in multipath channels

    KAUST Repository

    Alkhodary, Mohammad T.; Ballal, Tarig; Al-Naffouri, Tareq Y.; Muqaibel, Ali H.

    2015-01-01

    The desirable characteristics of ultra-wideband (UWB) technology are challenged by a formidable sampling frequency, performance degradation in the presence of multi-user interference, and receiver complexity due to the channel estimation process. In this paper, a low-rate-sampling technique is used to implement M-ary multiple access UWB communications in both the detection and channel estimation stages. A novel approach is used for multiple-access-interference (MAI) cancellation for the purpose of channel estimation. Results show reasonable performance of the proposed receiver for different numbers of users operating at sampling rates many times below the Nyquist rate.

  15. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    Full Text Available The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.

  16. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    Science.gov (United States)

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10(5) W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10(5) W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
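
    For orientation, the mean energy dissipation rate in a capillary can be estimated from the flow rate and internal diameter under laminar Hagen-Poiseuille assumptions; the sketch below is a back-of-the-envelope estimate with illustrative values, not the paper's CFD calculation:

      # Mean EDR (W/kg) = pressure-drop power per unit fluid mass for laminar
      # pipe flow: EDR = 512*mu*Q^2 / (rho*pi^2*d^6). Values are illustrative.
      import math

      def mean_edr(flow_ml_min, d_m, mu=1.0e-3, rho=1000.0):
          q = flow_ml_min * 1e-6 / 60.0          # volumetric flow, m^3/s
          return 512.0 * mu * q**2 / (rho * math.pi**2 * d_m**6)

      print(f"{mean_edr(30.0, 0.25e-3):.2e} W/kg")   # 30 mL/min, 0.25 mm capillary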

  17. Automated spectrometer interface for measurement of short half-life samples for neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lapolli, André L.; Secco, Marcello; Genezini, Frederico A.; Zahn, Guilherme S.; Moreira, Edson G., E-mail: alapolli@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In this paper a source positioning system was developed, based on an HPGe detector coupled to a Canberra DAS 1000 data acquisition system and Canberra's GENIE2K software and libraries. The system is composed of a step motor coupled to an Arduino Uno microcontroller, programmed in C to allow a source-detector distance between 0.3 and 20 cm; both components are connected to a PC via the USB interface. In order to allow automated data acquisition, two additional pieces of software were developed: a Human-Machine Interface (HMI) programmed in Visual Basic 6, which allows the programming and monitoring of the data acquisition process, and a program in the REXX language, which controls the data acquisition process in the background. The HMI is user-friendly and versatile, so that even rather complex data acquisition processes may be easily programmed. When the experiment scheme is saved, two files are created and used by the REXX code to control the acquisition process, so that data acquisition is automatically stopped and saved after a user-defined time, after which the source is repositioned and data acquisition is cleared and restarted. While in the present stage the system offers only three distinct source positions, finer source-position adjustment is under development. In its present configuration the system has been tested for stability and repeatability in all three positions, with excellent performance. (author)
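
    A hedged sketch of what host-side control of such a positioner can look like: the serial command strings, port name and acquisition hooks below are hypothetical, since the abstract does not document the Arduino firmware protocol or the GENIE2K calls.

      # Host-side loop: move the source, count for a fixed time, save, repeat.
      import time
      import serial  # pyserial

      def start_acquisition():   # placeholder for the GENIE2K start call
          pass

      def stop_and_save():       # placeholder: stop, save spectrum, clear
          pass

      def acquire_at_positions(port="/dev/ttyACM0",
                               positions_cm=(0.3, 10.0, 20.0), count_s=300):
          with serial.Serial(port, 9600, timeout=2) as link:
              time.sleep(2)                            # allow the Arduino to reset
              for pos in positions_cm:
                  link.write(f"MOVE {pos}\n".encode())  # hypothetical command
                  link.readline()                       # wait for acknowledgment
                  start_acquisition()
                  time.sleep(count_s)
                  stop_and_save()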

  18. Automated spectrometer interface for measurement of short half-life samples for neutron activation analysis

    International Nuclear Information System (INIS)

    Lapolli, André L.; Secco, Marcello; Genezini, Frederico A.; Zahn, Guilherme S.; Moreira, Edson G.

    2017-01-01

    In this paper a source positioning system was developed, based on an HPGe detector coupled to a Canberra DAS 1000 data acquisition system and Canberra's GENIE2K software and libraries. The system is composed of a step motor coupled to an Arduino Uno microcontroller, programmed in C to allow a source-detector distance between 0.3 and 20 cm; both components are connected to a PC via the USB interface. In order to allow automated data acquisition, two additional pieces of software were developed: a Human-Machine Interface (HMI) programmed in Visual Basic 6, which allows the programming and monitoring of the data acquisition process, and a program in the REXX language, which controls the data acquisition process in the background. The HMI is user-friendly and versatile, so that even rather complex data acquisition processes may be easily programmed. When the experiment scheme is saved, two files are created and used by the REXX code to control the acquisition process, so that data acquisition is automatically stopped and saved after a user-defined time, after which the source is repositioned and data acquisition is cleared and restarted. While in the present stage the system offers only three distinct source positions, finer source-position adjustment is under development. In its present configuration the system has been tested for stability and repeatability in all three positions, with excellent performance. (author)

  19. Automated statistical matching of multiple tephra records exemplified using five long maar sequences younger than 75 ka, Auckland, New Zealand

    Science.gov (United States)

    Green, Rebecca M.; Bebbington, Mark S.; Cronin, Shane J.; Jones, Geoff

    2014-09-01

    Detailed tephrochronologies are built to underpin probabilistic volcanic hazard forecasting, and to understand the dynamics and history of diverse geomorphic, climatic, soil-forming and environmental processes. Complicating factors include highly variable tephra distribution over time; difficulty in correlating tephras from site to site based on physical and chemical properties; and uncertain age determinations. Multiple sites permit construction of more accurate composite tephra records, but correctly merging individual site records by recognizing common events and site-specific gaps is complex. We present an automated procedure for matching tephra sequences between multiple deposition sites using stochastic local optimization techniques. If individual tephra age determinations are not significantly different between sites, they are matched and a more precise age is assigned. Known stratigraphy and mineralogical or geochemical compositions are used to constrain tephra matches. We apply this method to match tephra records from five long sediment cores (≤ 75 cal ka BP) in Auckland, New Zealand. Sediments at these sites preserve basaltic tephras from local eruptions of the Auckland Volcanic Field as well as distal rhyolitic and andesitic tephras from Okataina, Taupo, Egmont, Tongariro, and Tuhua (Mayor Island) volcanic centers. The new correlated record compiled is statistically more likely than previously published arrangements from this area.
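
    The core of such matching is a significance test on pairs of age determinations: tephras from different cores are candidates for merging only when their ages are statistically indistinguishable, after which an inverse-variance weighted age is assigned. A minimal sketch of that step with illustrative numbers (the full procedure also applies the stratigraphic and compositional constraints described above):

      # Age-overlap test and pooled age for two tephra determinations.
      import math

      def ages_compatible(a1, s1, a2, s2, z=1.96):
          """True if two ages (mean, 1-sigma, cal a BP) overlap at ~95%."""
          return abs(a1 - a2) <= z * math.hypot(s1, s2)

      def pooled_age(a1, s1, a2, s2):
          """Inverse-variance weighted age and its (smaller) uncertainty."""
          w1, w2 = 1.0 / s1**2, 1.0 / s2**2
          return (w1*a1 + w2*a2) / (w1 + w2), (w1 + w2) ** -0.5

      if ages_compatible(25300, 400, 24900, 350):
          print(pooled_age(25300, 400, 24900, 350))   # more precise composite age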

  20. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    Science.gov (United States)

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation process were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good accordance. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample. © The Author(s) 2016.

  1. Automated multi-dimensional liquid chromatography : sample preparation and identification of peptides from human blood filtrate

    NARCIS (Netherlands)

    Machtejevas, Egidijus; John, Harald; Wagner, Knut; Standker, Ludger; Marko-Varga, Gyorgy; Georg Forssmann, Wolf; Bischoff, Rainer; K. Unger, Klaus

    2004-01-01

    A comprehensive on-line sample clean-up with an integrated two-dimensional HPLC system was developed for the analysis of natural peptides. Samples comprising endogenous peptides with molecular weights up to 20 kDa were generated from human hemofiltrate (HF) obtained from patients with chronic

  2. Manual for the Epithermal Neutron Multiplicity Detector (ENMC) for Measurement of Impure MOX and Plutonium Samples

    International Nuclear Information System (INIS)

    Menlove, H. O.; Rael, C. D.; Kroncke, K. E.; DeAguero, K. J.

    2004-01-01

    We have designed a high-efficiency neutron detector for passive neutron coincidence and multiplicity counting of dirty scrap and bulk samples of plutonium. The counter will be used for the measurement of impure plutonium samples at the JNC MOX fabrication facility in Japan. The counter can also be used to create working standards from bulk process MOX. The detector uses advanced-design 3He tubes to increase the efficiency and to shorten the neutron die-away time. The efficiency is 64% and the die-away time is 19.1 µs. The Epithermal Neutron Multiplicity Counter (ENMC) is designed for high-precision measurements of bulk plutonium samples with diameters of less than 200 mm. The average neutron energy from the sample can be measured using the ratio of the inner ring of 3He tubes to the outer ring. This report describes the hardware, performance, and calibration for the ENMC.

  3. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.

  4. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques.

    Science.gov (United States)

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J; Nobukawa, Kazutoshi; Pan, Christopher S

    2017-03-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved acceleration rate is around 2000 to 20,000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV to challenging scenarios that would take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs.
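
    The statistical core is importance sampling: rare unsafe cut-ins are drawn from a skewed distribution, and each outcome is reweighted by the likelihood ratio so the estimated crash rate remains unbiased for the real-world model. A toy sketch (the exponential gap model, crash criterion and parameters are illustrative, not the Safety Pilot data):

      # Importance sampling of rare crash events with exponential tilting.
      import numpy as np

      rng = np.random.default_rng(0)
      mu_real, mu_skew = 100.0, 4.0        # mean cut-in gap (m): real vs. skewed

      n = 100_000
      gaps = rng.exponential(mu_skew, n)   # oversample dangerous (small) gaps
      crash = gaps < 2.0                   # toy criterion: crash if gap < 2 m

      # likelihood ratio f_real(x)/f_skew(x) for exponential densities
      lr = (mu_skew / mu_real) * np.exp(gaps * (1.0/mu_skew - 1.0/mu_real))
      p_crash = np.mean(crash * lr)        # unbiased estimate under the real model
      print(f"crash probability per cut-in: {p_crash:.3e}")   # ~1 - exp(-2/100)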

  5. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    Science.gov (United States)

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Casework samples with low DNA content submitted for short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, sophisticated equipment, and transport time, and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample

  6. The hybrid model for sampling multiple elastic scattering angular deflections based on Goudsmit-Saunderson theory

    Directory of Open Access Journals (Sweden)

    Wasaye Muhammad Abdul

    2017-01-01

    Full Text Available An algorithm for the Monte Carlo simulation of electron multiple elastic scattering based on the framework of SuperMC (Super Monte Carlo simulation program for nuclear and radiation processes) is presented. This paper describes efficient and accurate methods by which the multiple scattering angular deflections are sampled. The Goudsmit-Saunderson theory of multiple scattering has been used for sampling angular deflections. Differential cross-sections of electrons and positrons by neutral atoms have been calculated using the Dirac partial wave program ELSEPA. The Legendre coefficients are accurately computed using the Gauss-Legendre integration method. Finally, a novel hybrid method for sampling the angular distribution has been developed: the model uses an efficient rejection sampling method for low energy electrons and longer path lengths (>500 mean free paths), while for small path lengths a simple, efficient and accurate analytical distribution function is proposed, with adjustable parameters determined from fitting the Goudsmit-Saunderson angular distribution. A discussion of the sampling efficiency and accuracy of this newly developed algorithm is given. The efficiency of the rejection sampling algorithm is at least 50% for electron kinetic energies less than 500 keV and longer path lengths (>500 mean free paths). Monte Carlo simulation results are then compared with the measured angular distributions of Ross et al. The comparison shows that our results are in good agreement with experimental measurements.
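
    As a generic illustration of the rejection step (not the paper's actual implementation), the sketch below draws polar deflections mu = cos(theta) from a forward-peaked angular distribution using a uniform envelope; the toy distribution and screening parameter are assumptions:

      # Rejection sampling of mu = cos(theta) from an unnormalized pdf on [-1, 1].
      import numpy as np

      rng = np.random.default_rng(1)

      def sample_mu(pdf, n):
          grid = np.linspace(-1.0, 1.0, 2001)
          pmax = pdf(grid).max()                     # uniform envelope height
          out, filled = np.empty(n), 0
          while filled < n:
              mu = rng.uniform(-1.0, 1.0, n)
              keep = rng.uniform(0.0, pmax, n) < pdf(mu)
              take = min(int(keep.sum()), n - filled)
              out[filled:filled + take] = mu[keep][:take]
              filled += take
          return out

      eta = 0.1                                       # screening parameter (toy)
      pdf = lambda mu: 1.0 / (1.0 + eta - mu) ** 2    # forward-peaked, Rutherford-like
      mus = sample_mu(pdf, 100_000)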

  7. High-throughput automated microfluidic sample preparation for accurate microbial genomics.

    Science.gov (United States)

    Kim, Soohong; De Jonghe, Joachim; Kulesa, Anthony B; Feldman, David; Vatanen, Tommi; Bhattacharyya, Roby P; Berdy, Brittany; Gomez, James; Nolan, Jill; Epstein, Slava; Blainey, Paul C

    2017-01-27

    Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are so effective that sample preparation is now the key limiting factor. Here, we introduce a microfluidic sample preparation platform that integrates the key steps of cells-to-sequence-library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing the sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also leveraged the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully-integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications.

  8. Automated sample preparation station for studying self-diffusion in porous solids with NMR spectroscopy

    Science.gov (United States)

    Hedin, Niklas; DeMartin, Gregory J.; Reyes, Sebastián C.

    2006-03-01

    In studies of gas diffusion in porous solids with nuclear magnetic resonance (NMR) spectroscopy, the sample preparation procedure becomes very important. An apparatus is presented here that pretreats the sample ex situ and accurately sets the desired pressure and temperature within the NMR tube prior to its introduction in the spectrometer. The gas manifold that supplies the NMR tube is also connected to a microbalance containing another portion of the same sample, which is kept at the same temperature as the sample in the NMR tube. This arrangement permits the simultaneous measurement of the adsorption loading on the sample, which is required for the interpretation of the NMR diffusion experiments. Furthermore, to ensure a good seal of the NMR tube, a hybrid valve design composed of titanium, a Teflon® seat, and Kalrez® O-rings is utilized. A computer controlled algorithm ensures the accuracy and reproducibility of all the procedures, enabling the NMR diffusion experiments to be performed at well controlled conditions of pressure, temperature, and amount of gas adsorbed on the porous sample.

  9. Development of an automated method for determination of thorium in soil samples and aerosols

    International Nuclear Information System (INIS)

    Stuart, J.E.; Robertson, R.

    1986-09-01

    Methodology for determining trace thorium levels in a variety of sample types was further developed. Thorium in filtered water samples is concentrated by ferric hydroxide precipitation followed by dissolution and co-precipitation with lanthanum fluoride. Aerosols on glass fibre, cellulose ester, or Teflon filters and solid soil and sediment samples are acid digested. Subsequently thorium is concentrated by lanthanum fluoride co-precipitation. Chemical separation and measurement are then done on a Technicon AA11-C autoanalyzer, using solvent extraction into thenoyltrifluoroacetone in kerosene followed by back extraction into 2 N HNO 3 , and colourimetric measurement of the thorium arsenazo III complex. Chemical yields are determined by the addition of thorium-234 tracer using gamma-ray spectrometry. The sensitivities of the methods for water, aerosol and solid samples are approximately 1.0 μg/L, 0.5 μg/g and 1.0 μg/g respectively. At thorium levels about ten times the detection limit, accuracy is estimated to be ± 10% for liquids and aerosols and ± 15% for solid samples, and precision ± 5% for all samples.

  10. Sample Size Calculation for Estimating or Testing a Nonzero Squared Multiple Correlation Coefficient

    Science.gov (United States)

    Krishnamoorthy, K.; Xia, Yanping

    2008-01-01

    The problems of hypothesis testing and interval estimation of the squared multiple correlation coefficient of a multivariate normal distribution are considered. It is shown that available one-sided tests are uniformly most powerful, and the one-sided confidence intervals are uniformly most accurate. An exact method of calculating sample size to…

  11. Prioritizing multiple therapeutic targets in parallel using automated DNA-encoded library screening

    Science.gov (United States)

    Machutta, Carl A.; Kollmann, Christopher S.; Lind, Kenneth E.; Bai, Xiaopeng; Chan, Pan F.; Huang, Jianzhong; Ballell, Lluis; Belyanskaya, Svetlana; Besra, Gurdyal S.; Barros-Aguirre, David; Bates, Robert H.; Centrella, Paolo A.; Chang, Sandy S.; Chai, Jing; Choudhry, Anthony E.; Coffin, Aaron; Davie, Christopher P.; Deng, Hongfeng; Deng, Jianghe; Ding, Yun; Dodson, Jason W.; Fosbenner, David T.; Gao, Enoch N.; Graham, Taylor L.; Graybill, Todd L.; Ingraham, Karen; Johnson, Walter P.; King, Bryan W.; Kwiatkowski, Christopher R.; Lelièvre, Joël; Li, Yue; Liu, Xiaorong; Lu, Quinn; Lehr, Ruth; Mendoza-Losana, Alfonso; Martin, John; McCloskey, Lynn; McCormick, Patti; O'Keefe, Heather P.; O'Keeffe, Thomas; Pao, Christina; Phelps, Christopher B.; Qi, Hongwei; Rafferty, Keith; Scavello, Genaro S.; Steiginga, Matt S.; Sundersingh, Flora S.; Sweitzer, Sharon M.; Szewczuk, Lawrence M.; Taylor, Amy; Toh, May Fern; Wang, Juan; Wang, Minghui; Wilkins, Devan J.; Xia, Bing; Yao, Gang; Zhang, Jean; Zhou, Jingye; Donahue, Christine P.; Messer, Jeffrey A.; Holmes, David; Arico-Muendel, Christopher C.; Pope, Andrew J.; Gross, Jeffrey W.; Evindar, Ghotas

    2017-07-01

    The identification and prioritization of chemically tractable therapeutic targets is a significant challenge in the discovery of new medicines. We have developed a novel method that rapidly screens multiple proteins in parallel using DNA-encoded library technology (ELT). Initial efforts were focused on the efficient discovery of antibacterial leads against 119 targets from Acinetobacter baumannii and Staphylococcus aureus. The success of this effort led to the hypothesis that the relative number of ELT binders alone could be used to assess the ligandability of large sets of proteins. This concept was further explored by screening 42 targets from Mycobacterium tuberculosis. Active chemical series for six targets from our initial effort as well as three chemotypes for DHFR from M. tuberculosis are reported. The findings demonstrate that parallel ELT selections can be used to assess ligandability and highlight opportunities for successful lead and tool discovery.

  12. Automated detection of age-related macular degeneration in OCT images using multiple instance learning

    Science.gov (United States)

    Sun, Weiwei; Liu, Xiaoming; Yang, Zhou

    2017-07-01

    Age-related Macular Degeneration (AMD) is a macular disease that mostly occurs in elderly people and may cause decreased vision or even lead to permanent blindness. Drusen is an important clinical indicator for AMD, which can help doctors diagnose the disease and decide on a treatment strategy. Optical Coherence Tomography (OCT) is widely used in the diagnosis of ophthalmic diseases, including AMD. In this paper, we propose a classification method based on Multiple Instance Learning (MIL) to detect AMD. Because drusen may appear in only a few slices of an OCT volume, MIL is well suited to this task. The method has two phases: training and testing. In the training phase, initial features are extracted and clustered to create a codebook; in the testing phase, the trained classifier is applied to the test set. Experimental results show that our method achieves high accuracy and effectiveness.
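
    For intuition, the multiple-instance assumption treats an OCT volume as a bag of slice instances: the volume is positive if at least one slice shows drusen, so a simple bag score is the maximum over per-slice classifier scores. A minimal sketch with illustrative scores (the paper's codebook-based classifier is more elaborate):

      # Max-pooling MIL rule: bag probability = max over instance probabilities.
      import numpy as np

      def bag_score(slice_scores):
          return float(np.max(slice_scores))

      def classify_volume(slice_scores, threshold=0.5):
          return "AMD" if bag_score(slice_scores) >= threshold else "normal"

      # a 12-slice volume in which only two slices show drusen-like features
      print(classify_volume([0.05, 0.1, 0.08, 0.7, 0.9, 0.1,
                             0.07, 0.1, 0.04, 0.06, 0.1, 0.02]))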

  13. Evaluating the quality of medical multiple-choice items created with automated processes.

    Science.gov (United States)

    Gierl, Mark J; Lai, Hollis

    2013-07-01

    Computerised assessment raises formidable challenges because it requires large numbers of test items. Automatic item generation (AIG) can help address this test development problem because it yields large numbers of new items both quickly and efficiently. To date, however, the quality of the items produced using a generative approach has not been evaluated. The purpose of this study was to determine whether automatic processes yield items that meet standards of quality that are appropriate for medical testing. Quality was evaluated firstly by subjecting items created using both AIG and traditional processes to rating by a four-member expert medical panel using indicators of multiple-choice item quality, and secondly by asking the panellists to identify which items were developed using AIG in a blind review. Fifteen items from the domain of therapeutics were created in three different experimental test development conditions. The first 15 items were created by content specialists using traditional test development methods (Group 1 Traditional). The second 15 items were created by the same content specialists using AIG methods (Group 1 AIG). The third 15 items were created by a new group of content specialists using traditional methods (Group 2 Traditional). These 45 items were then evaluated for quality by a four-member panel of medical experts and were subsequently categorised as either Traditional or AIG items. Three outcomes were reported: (i) the items produced using traditional and AIG processes were comparable on seven of eight indicators of multiple-choice item quality; (ii) AIG items can be differentiated from Traditional items by the quality of their distractors, and (iii) the overall predictive accuracy of the four expert medical panellists was 42%. Items generated by AIG methods are, for the most part, equivalent to traditionally developed items from the perspective of expert medical reviewers. While the AIG method produced comparatively fewer plausible

  14. The T-lock: automated compensation of radio-frequency induced sample heating

    International Nuclear Information System (INIS)

    Hiller, Sebastian; Arthanari, Haribabu; Wagner, Gerhard

    2009-01-01

    Modern high-field NMR spectrometers can stabilize the nominal sample temperature at a precision of less than 0.1 K. However, the actual sample temperature may differ from the nominal value by several degrees because the sample heating caused by high-power radio frequency pulses is not readily detected by the temperature sensors. Without correction, transfer of chemical shifts between different experiments causes problems in the data analysis. In principle, the temperature differences can be corrected by manual procedures but this is cumbersome and not fully reliable. Here, we introduce the concept of a 'T-lock', which automatically maintains the sample at the same reference temperature over the course of different NMR experiments. The T-lock works by continuously measuring the resonance frequency of a suitable spin and simultaneously adjusting the temperature control, thus locking the sample temperature at the reference value. For three different nuclei, 13 C, 17 O and 31 P in the compounds alanine, water, and phosphate, respectively, the T-lock accuracy was found to be <0.1 K. The use of dummy scan periods with variable lengths allows a reliable establishment of the thermal equilibrium before the acquisition of an experiment starts
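
    The feedback idea can be made concrete with a small controller loop: the resonance frequency of the reference spin serves as an in-sample thermometer, and the temperature setpoint is nudged until the measured shift returns to its reference value. The interface and the shift-per-kelvin coefficient below are hypothetical, for illustration only:

      # One integral-controller update of the temperature setpoint.
      def t_lock_step(measured_hz, reference_hz, setpoint_k,
                      hz_per_kelvin=-10.0, gain=0.5, max_step_k=0.2):
          temp_error_k = (measured_hz - reference_hz) / hz_per_kelvin
          step = max(-max_step_k, min(max_step_k, gain * temp_error_k))
          return setpoint_k - step

      setpoint = 298.0
      for measured in (1250.0, 1248.0, 1247.3, 1247.0):   # shift converging to ref
          setpoint = t_lock_step(measured, reference_hz=1247.0, setpoint_k=setpoint)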

  15. Mechanized sephadex LH-20 multiple column chromatography as a prerequisite to automated multi-steroid radioimmunoassays

    International Nuclear Information System (INIS)

    Sippell, W.G.; Bidlingmaier, F.; Knorr, D.

    1977-01-01

    In order to establish a procedure for the simultaneous determination of all major corticosteroid hormones and their immediate biological precursors in the same plasma sample, two different mechanized methods for the simultaneous isolation of aldosterone (A), corticosterone (B), 11-deoxycorticosterone (DOC), progesterone (P), 17-hydroxyprogesterone (17-OHP), 11-deoxycortisol (S), cortisol (F), and cortisone (E) from the methylene chloride extracts of 0.1 to 2.0 ml plasma samples have been developed. In both methods, eluate fractions of each of the isolated steroids are automatically pooled and collected from all parallel columns by one programmable linear fraction collector. Due to the high reproducibility of the elution patterns both between different parallel columns and between 30 to 40 consecutive elutions, mean recoveries of tritiated steroids including extraction are 60 to 84% after a single elution and still over 50% after an additional chromatography on 40-cm LH-20 columns, with coefficients of variation below 15%. Thus, the eight steroids can be completely isolated from each of ten plasma extracts within 3 to 4 hours, yielding 80 samples readily prepared for subsequent quantitation by radioimmunoassay. (orig./AJ)

  16. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Directory of Open Access Journals (Sweden)

    Casey Olives

    Full Text Available Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
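
    For intuition, a three-category LQAS rule classifies a school by comparing the number of positive children in a sample of n to two decision thresholds d1 < d2, and its operating characteristics follow directly from the binomial distribution. A minimal sketch with illustrative thresholds (not the paper's validated designs):

      # Three-category LQAS decision rule and its operating characteristics.
      from scipy.stats import binom

      def classify(positives, d1=2, d2=7):
          if positives <= d1:
              return "low (<=10%)"
          return "moderate" if positives < d2 else "high (>=50%)"

      def operating_characteristic(p, n=15, d1=2, d2=7):
          """P(classified low / moderate / high) at true prevalence p."""
          p_low = binom.cdf(d1, n, p)
          p_high = 1.0 - binom.cdf(d2 - 1, n, p)
          return p_low, 1.0 - p_low - p_high, p_high

      print(operating_characteristic(0.05))   # mostly "low"
      print(operating_characteristic(0.60))   # mostly "high"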

  17. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Directory of Open Access Journals (Sweden)

    Kamfai Chan

    Full Text Available Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.
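
    Because the modified printers execute ordinary G-code, the shuttling and dwell steps can be scripted from a host computer. A hedged sketch: the port, coordinates and timings are hypothetical, while G28/G0/G4/M190 are standard (Marlin-style) G-code commands:

      # Drive a modified 3D printer over serial to shuttle PCR tubes between baths.
      import time
      import serial  # pyserial

      def send(link, cmd):
          link.write((cmd + "\n").encode())
          link.readline()                      # wait for the firmware's "ok"

      with serial.Serial("/dev/ttyUSB0", 115200, timeout=10) as printer:
          time.sleep(2)                        # firmware resets on connect
          send(printer, "G28")                 # home all axes
          send(printer, "M190 S60")            # heated bed as one water bath (example)
          for _ in range(35):                  # thermal cycling by shuttling
              send(printer, "G0 X20 Y40 Z10")  # over the denaturation bath
              send(printer, "G4 S15")          # dwell 15 s
              send(printer, "G0 X120 Y40 Z10") # over the annealing/extension bath
              send(printer, "G4 S30")          # dwell 30 s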

  18. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424

  19. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Science.gov (United States)

    El-Alaily, T. M.; El-Nimr, M. K.; Saafan, S. A.; Kamel, M. M.; Meaz, T. M.; Assar, S. T.

    2015-07-01

    A low-cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured by two calibrated commercial magnetometers, a Lake Shore model 7410 and an LDJ Electronics Inc. (Troy, MI) instrument. The new lab-built VSM design proved successful and reliable.

  20. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be directly infused into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in the chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h{sup −1}). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
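
    The release-rate fitting mentioned above amounts to regressing a first-order model onto the temporal profile. A minimal sketch with illustrative data points (the reported ibuprofen value was k = 0.43 ± 0.01 per hour):

      # Fit a first-order release model to a normalized dissolution profile.
      import numpy as np
      from scipy.optimize import curve_fit

      t_h = np.array([0.5, 1, 2, 4, 6, 8, 10])                     # sampling times (h)
      frac = np.array([0.19, 0.35, 0.58, 0.82, 0.93, 0.97, 0.99])  # illustrative

      def release(t, k):
          return 1.0 - np.exp(-k * t)

      (k_fit,), cov = curve_fit(release, t_h, frac, p0=[0.5])
      print(f"k = {k_fit:.2f} ± {np.sqrt(cov[0, 0]):.2f} 1/h")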

  1. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    International Nuclear Information System (INIS)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L.

    2015-01-01

    Most real samples cannot be directly infused into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in the chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates the

  2. Robust, Sensitive, and Automated Phosphopeptide Enrichment Optimized for Low Sample Amounts Applied to Primary Hippocampal Neurons

    NARCIS (Netherlands)

    Post, Harm; Penning, Renske; Fitzpatrick, Martin; Garrigues, L.B.; Wu, W.; Mac Gillavry, H.D.; Hoogenraad, C.C.; Heck, A.J.R.; Altelaar, A.F.M.

    2017-01-01

    Because of the low stoichiometry of protein phosphorylation, targeted enrichment prior to LC–MS/MS analysis is still essential. The trend in phosphoproteome analysis is shifting toward an increasing number of biological replicates per experiment, ideally starting from very low sample amounts,

  3. Portable automation of static chamber sample collection for quantifying soil gas flux

    Science.gov (United States)

    The collection of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled in a given time period is limited by the spacing between chambers and the availability of trained research technicians. However, the static chamber method can limit spatial ...

  4. Automated sample-processing and titration system for determining uranium in nuclear materials

    International Nuclear Information System (INIS)

    Harrar, J.E.; Boyle, W.G.; Breshears, J.D.; Pomernacki, C.L.; Brand, H.R.; Kray, A.M.; Sherry, R.J.; Pastrone, J.A.

    1977-01-01

    The system is designed for accurate, precise, and selective determination of from 10 to 180 mg of uranium in 2 to 12 cm³ of solution. Samples, standards, and their solutions are handled on a weight basis. These weights, together with their appropriate identification numbers, are stored in computer memory and are used automatically in the assay calculations after each titration. The measurement technique (controlled-current coulometry) is based on the Davies-Gray and New Brunswick Laboratory method, in which U(VI) is reduced to U(IV) in strong H₃PO₄, followed by titration of the U(IV) with electrogenerated V(V). Solution pretreatment and titration are automatic. The analyzer is able to process 44 samples per loading of the sample changer, at a rate of 4 to 9 samples per hour. The system includes a comprehensive fault-monitoring system that detects analytical errors, guards against abnormal conditions which might cause errors, and prevents unsafe operation. A detailed description of the system, information on the reliability of the component subsystems, and a summary of its evaluation by the New Brunswick Laboratory are presented.
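
    In controlled-current coulometry the titrant is generated at a known constant current, so the charge passed maps directly to the amount of analyte through Faraday's law. A worked sketch of that arithmetic (numbers are illustrative, not the analyzer's software; the overall two-electron U(IV) to U(VI) stoichiometry is assumed):

```python
# Convert the charge passed while electrogenerating V(V) titrant into
# uranium mass, assuming two electron equivalents per uranium atom
# for the overall U(IV) -> U(VI) oxidation. Illustrative only.
F = 96485.0    # Faraday constant, C/mol
M_U = 238.03   # molar mass of uranium, g/mol

def uranium_mass_mg(current_ma: float, time_s: float) -> float:
    charge = (current_ma / 1000.0) * time_s   # coulombs
    moles_u = charge / (2.0 * F)              # 2 e- per U
    return moles_u * M_U * 1000.0             # mg

# e.g. 100 mA of generating current applied for 150 s:
print(f"{uranium_mass_mg(100, 150):.1f} mg U")   # ~18.5 mg
```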

  5. [Automated serial diagnosis of donor blood samples. Ergonomic and economic organization structure].

    Science.gov (United States)

    Stoll, T; Fischer-Fröhlich, C L; Mayer, G; Hanfland, P

    1990-01-01

    A comprehensive computer-aided administration system for blood donors is presented. Encoded information on barcode labels allows automatic yet selective pipetting of samples by pipetting robots. Analysis results are transferred automatically to a host computer in order to update the donor database.

  6. Mechanized Sephadex LH-20 multiple column chromatography as a prerequisite for automated multi-steroid radioimmunoassays

    International Nuclear Information System (INIS)

    Sippell, W.G.; Bidlingmaier, F.; Knorr, D.

    1978-01-01

    To establish a procedure for the simultaneous determination of all major corticosteroid hormones and their immediate biological precursors in the same plasma sample, two different mechanized methods for the simultaneous isolation of aldosterone (A), corticosterone (B), 11-deoxycorticosterone (DOC), progesterone (P), 17-hydroxyprogesterone (17-OHP), 11-deoxycortisol (S), cortisol (F) and cortisone (E) from the methylene chloride extracts of 0.1 to 2.0 ml plasma samples have been developed. In method I, steroids are separated with methylene chloride:methanol = 98:2 as solvent system on 60-cm Sephadex LH-20 columns, up to eight of which are eluted in parallel using a multi-channel peristaltic pump and individual flow-rate control (40 ml/h) by capillary valves and micro-flowmeters. Method II, on the other hand, utilizes the same solvent system on ten 75-cm LH-20 columns which are eluted in reversed flow simultaneously by a ten-channel, double-piston pump that precisely maintains an elution flow rate of 40 ml/h in every column. In both methods, eluate fractions of each of the isolated steroids are automatically pooled and collected from all parallel columns by one programmable linear fraction collector. As a result of the high reproducibility of the elution patterns, both between different parallel columns and between 30 to 40 consecutive elutions, mean recoveries of tritiated steroids including extraction are 60 to 84% after a single separation and still over 50% after an additional separation on 40-cm LH-20 columns, with coefficients of variation below 15% (method II). Thus, the eight steroids can be completely isolated from each of ten plasma extracts within 3 to 4 hours, yielding 80 samples readily prepared for subsequent quantitation by radioimmunoassay. (author)

  7. Fast automated segmentation of multiple objects via spatially weighted shape learning

    Science.gov (United States)

    Chandra, Shekhar S.; Dowling, Jason A.; Greer, Peter B.; Martin, Jarad; Wratten, Chris; Pichler, Peter; Fripp, Jurgen; Crozier, Stuart

    2016-11-01

    Active shape models (ASMs) have proved successful in automatic segmentation by using shape and appearance priors in a number of areas, such as prostate segmentation, where accurate contouring is important in treatment planning for prostate cancer. The ASM approach, however, is heavily reliant on a good initialisation for achieving high segmentation quality. This initialisation often requires algorithms with high computational complexity, such as three-dimensional (3D) image registration. In this work, we present a fast, self-initialised ASM approach that simultaneously fits multiple objects hierarchically controlled by spatially weighted shape learning. Prominent objects are targeted initially, and spatial weights are progressively adjusted so that the next (more difficult, less visible) object is simultaneously initialised using a series of weighted shape models. The scheme was validated and compared to a multi-atlas approach on 3D magnetic resonance (MR) images of 38 cancer patients and had the same (mean, median, inter-rater) Dice's similarity coefficients of (0.79, 0.81, 0.85), while having no registration error and a computational time of 12-15 min, nearly an order of magnitude faster than the multi-atlas approach.
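
    Validation above is reported as Dice's similarity coefficient; for reference, a minimal implementation on binary masks (the toy masks are invented):

```python
# Dice similarity coefficient between two binary segmentation masks,
# the overlap metric quoted in the abstract above.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.zeros((64, 64), dtype=bool);   auto[20:40, 20:40] = True
manual = np.zeros((64, 64), dtype=bool); manual[22:42, 22:42] = True
print(f"DSC = {dice(auto, manual):.2f}")
```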

  8. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    The signature of a person is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in many applications such as financial, commercial, and legal transactions. The objective of a signature verification system is to discriminate between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features: it contains diacritics, ligatures, and overlapping. Because no dynamic information is available from the Arabic signature's writing process, obtaining high verification accuracy is more difficult. This paper addresses this difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt the fuzzy set to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
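
    FAR and FRR are simple error rates over forged and genuine attempts at a chosen decision threshold. A small sketch with invented verifier scores (not the paper's features or classifier):

```python
# False Acceptance Rate (FAR) and False Rejection Rate (FRR) from
# verifier similarity scores; higher score = more likely genuine.
import numpy as np

genuine = np.array([0.91, 0.85, 0.78, 0.88, 0.73])   # genuine signatures
forged  = np.array([0.32, 0.55, 0.61, 0.28, 0.47])   # skilled forgeries
threshold = 0.6

frr = np.mean(genuine < threshold)    # genuine wrongly rejected
far = np.mean(forged >= threshold)    # forgeries wrongly accepted
print(f"FAR = {far:.0%}, FRR = {frr:.0%}")
```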

  9. A Simple Method for Automated Solid Phase Extraction of Water Samples for Immunological Analysis of Small Pollutants.

    Science.gov (United States)

    Heub, Sarah; Tscharner, Noe; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2016-01-01

    A new method for solid phase extraction (SPE) of environmental water samples is proposed. The developed prototype is cost-efficient and user-friendly, and enables rapid, automated, and simple SPE. The pre-concentrated solution is compatible with analysis by immunoassay, with a low organic solvent content. A method is described for the extraction and pre-concentration of the natural hormone 17β-estradiol in 100 ml water samples. Reverse-phase SPE is performed with octadecyl-silica sorbent, and elution is done with 200 µl of methanol 50% v/v. The eluent is diluted with de-ionized water to lower the methanol content. After the SPE column is prepared manually, the overall procedure is performed automatically within 1 hr. At the end of the process, the estradiol concentration is measured using a commercial enzyme-linked immunosorbent assay (ELISA). 100-fold pre-concentration is achieved, and the methanol content is only 10% v/v. Full recoveries of the molecule were achieved with 1 ng/L spiked de-ionized and synthetic sea water samples.
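
    The 100-fold figure follows directly from the volumes quoted above; a quick arithmetic check using only numbers from the abstract:

```python
# Pre-concentration factor: 100 mL of sample eluted in 0.2 mL gives
# 500x; diluting the eluate 5-fold (methanol 50% -> 10% v/v) leaves
# a 100x overall enrichment.
sample_ml, eluate_ml = 100.0, 0.2
dilution = 50.0 / 10.0          # methanol content 50% v/v -> 10% v/v
print((sample_ml / eluate_ml) / dilution)   # 100.0
```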

  10. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
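
    For readers unfamiliar with the term, multiple importance sampling combines estimates from several sampling strategies by weighting each sample according to the densities that could have produced it. A generic sketch of the Veach-style balance heuristic on a one-dimensional integral (not the authors' SSAO integration):

```python
# Balance-heuristic MIS: combine samples from two strategies, each
# weighted by its pdf's share, to estimate an integral with lower
# variance than either strategy alone.
import math, random

def mis_estimate(f, pdf_a, pdf_b, sample_a, sample_b, n=100_000):
    total = 0.0
    for _ in range(n):
        xa = sample_a()
        wa = pdf_a(xa) / (pdf_a(xa) + pdf_b(xa))    # balance heuristic
        total += wa * f(xa) / pdf_a(xa)
        xb = sample_b()
        wb = pdf_b(xb) / (pdf_a(xb) + pdf_b(xb))
        total += wb * f(xb) / pdf_b(xb)
    return total / n

# Integrate f(x) = x^2 on [0, 1]: uniform pdf vs. linear pdf p(x) = 2x.
est = mis_estimate(
    f=lambda x: x * x,
    pdf_a=lambda x: 1.0,
    pdf_b=lambda x: 2.0 * x,
    sample_a=random.random,
    sample_b=lambda: math.sqrt(random.random()),
)
print(est)   # ~0.333
```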

  11. Imaging a Large Sample with Selective Plane Illumination Microscopy Based on Multiple Fluorescent Microsphere Tracking

    Science.gov (United States)

    Ryu, Inkeon; Kim, Daekeun

    2018-04-01

    A typical selective plane illumination microscopy (SPIM) image size is basically limited by the field of view, which is a characteristic of the objective lens. If an image larger than the imaging area of the sample is to be obtained, image stitching, which combines step-scanned images into a single panoramic image, is required. However, accurately registering the step-scanned images is very difficult because the SPIM system uses a customized sample mount in which uncertainties in the translational and rotational motions exist. In this paper, an image registration technique based on multiple fluorescent microsphere tracking is proposed, which quantifies the constellations of, and measures the distances between, at least two fluorescent microspheres embedded in the sample. Image stitching results are demonstrated for optically cleared large tissue with various staining methods. Compensation for the effect of the sample rotation that occurs during translational motion in the sample mount is also discussed.

  12. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    Science.gov (United States)

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for coagulation analyses. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology, and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet-deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between the different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom-performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between the different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.

  13. Multiple Smaller Missions as a Direct Pathway to Mars Sample Return

    Science.gov (United States)

    Niles, P. B.; Draper, D. S.; Evans, C. A.; Gibson, E. K.; Graham, L. D.; Jones, J. H.; Lederer, S. M.; Ming, D.; Seaman, C. H.; Archer, P. D.; hide

    2012-01-01

    Recent discoveries by the Mars Exploration Rovers, Mars Express, Mars Odyssey, and Mars Reconnaissance Orbiter spacecraft include multiple, tantalizing astrobiological targets representing both past and present environments on Mars. The most desirable path to Mars Sample Return (MSR) would be to collect and return samples from the site that provides the clearest examples of the variety of rock types considered a high priority for sample return (pristine igneous, sedimentary, and hydrothermal). Here we propose an MSR architecture in which the next steps (potentially launched in 2018) would entail a series of smaller missions, including caching, to multiple landing sites to verify the presence of high-priority sample return targets through in situ analyses. This alternative architecture to one flagship-class sample caching mission to a single site would preserve a direct path to MSR as stipulated by the Planetary Decadal Survey, while permitting investigation of diverse deposit types and providing comparison of the site of returned samples to other aqueous environments on early Mars.

  14. Automated Image Sampling and Classification Can Be Used to Explore Perceived Naturalness of Urban Spaces.

    Directory of Open Access Journals (Sweden)

    Roger Hyam

    The psychological restorative effects of exposure to nature are well established and extend to the mere viewing of images of nature. A previous study has shown that the Perceived Naturalness (PN) of images correlates with their restorative value. This study tests whether it is possible to detect the degree of PN of images using an image classifier. It takes images that have been scored by humans for PN (including a subset that have been assessed for restorative value) and passes them through the Google Vision API image classification service. The resulting labels are assigned to broad semantic classes to create a Calculated Semantic Naturalness (CSN) metric for each image. It was found that CSN correlates with PN. CSN was then calculated for a geospatial sampling of Google Street View images across the city of Edinburgh. CSN was found to correlate with PN in this sample also, indicating that the technique may be useful in large-scale studies. Because CSN correlates with PN, which correlates with restorativeness, it is suggested that CSN or a similar measure may be useful in automatically detecting restorative images and locations. In an exploratory aside, CSN was not found to correlate with an indicator of socioeconomic deprivation.

  15. Genotyping of Bacillus anthracis strains based on automated capillary 25-loci Multiple Locus Variable-Number Tandem Repeats Analysis

    Directory of Open Access Journals (Sweden)

    Ciervo Alessandra

    2006-04-01

    Background: The genome of Bacillus anthracis, the etiological agent of anthrax, is highly monomorphic, which makes differentiation between strains difficult. A Multiple Locus Variable-number tandem repeat (VNTR) Analysis (MLVA) assay based on 20 markers was previously described. It has considerable discrimination power, reproducibility, and low cost, especially since the markers proposed can be typed by agarose-gel electrophoresis. However, in an emergency situation, faster genotyping and access to representative databases is necessary. Results: Genotyping of B. anthracis reference strains and isolates from France and Italy was done using a 25-loci MLVA assay combining 21 previously described loci and 4 new ones. DNA was amplified in 4 multiplex PCR reactions and the length of the resulting 25 amplicons was estimated by automated capillary electrophoresis. The results were reproducible and the data were consistent with other gel-based methods once differences in mobility patterns were taken into account. Some alleles previously unresolved by agarose gel electrophoresis could be resolved by capillary electrophoresis, thus further increasing the assay resolution. One particular locus, Bams30, is the result of a recombination between a 27 bp tandem repeat and a 9 bp tandem repeat. The analysis of the array illustrates the evolution process of tandem repeats. Conclusion: In a crisis situation of suspected bioterrorism, standardization, speed and accuracy, together with the availability of reference typing data, are important issues, as illustrated by the 2001 anthrax letters event. In this report we describe an upgrade of the previously published MLVA method for genotyping of B. anthracis and apply the method to the typing of French and Italian B. anthracis strain collections. The increased number of markers studied compared to reports using only 8 loci greatly improves the discrimination power of the technique. An Italian strain belonging to the

  16. Are Flow Injection-based Approaches Suitable for Automated Handling of Solid Samples?

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Cerdà, Victor

    Flow-based approaches were originally conceived for liquid-phase analysis, implying that constituents in solid samples generally had to be transferred into the liquid state, via appropriate batch pretreatment procedures, prior to analysis. Yet, in recent years, much effort has been focused … electrolytic or aqueous leaching, on-line dialysis/microdialysis, in-line filtration, and pervaporation-based procedures have been successfully implemented in continuous flow/flow injection systems. In this communication, the new generation of flow analysis, including sequential injection, multicommutated flow … (e.g., soils, sediments, sludges), and thus ascertaining the potential mobility, bioavailability and eventual impact of anthropogenic elements on biota [2]. In this context, the principles of sequential injection-microcolumn extraction (SI-MCE) for dynamic fractionation are explained in detail along …

  17. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    Science.gov (United States)

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes in organic-rich systems such as peatlands because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of a S::can Spectro::lyser™ to measure DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument, we have been able to accurately measure DOC up to 66 mg L⁻¹, which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used in the model at the same time. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands.
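
    The calibration described above amounts to ordinary multi-wavelength linear regression. A minimal sketch; the wavelengths and data values are invented, not the study's calibration set:

```python
# Predict DOC (mg/L) from UV-Vis absorbance at several wavelengths
# with a least-squares linear model, as in the abstract above.
import numpy as np

# rows = water samples; columns = absorbance at e.g. 254, 300, 350 nm
A = np.array([[0.21, 0.12, 0.05],
              [0.45, 0.28, 0.11],
              [0.88, 0.55, 0.23],
              [1.32, 0.86, 0.37]])
doc = np.array([8.0, 17.5, 34.0, 52.0])     # lab-measured DOC, mg/L

X = np.column_stack([np.ones(len(A)), A])   # add intercept column
coef, *_ = np.linalg.lstsq(X, doc, rcond=None)
print(np.round(X @ coef, 1))                # fitted DOC values
```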

  18. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format, including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  19. Application of In-Segment Multiple Sampling in Object-Based Classification

    Directory of Open Access Journals (Sweden)

    Nataša Đurić

    2014-12-01

    When object-based analysis is applied to very high-resolution imagery, pixels within the segments reveal large spectral inhomogeneity; their distribution can be considered complex rather than normal. When normality is violated, classification methods that rely on the assumption of normally distributed data are less successful and less accurate. It is hard to detect normality violations in small samples. The segmentation process produces segments that vary highly in size; samples can be very big or very small. This paper investigates whether the complexity within the segment can be addressed using multiple random sampling of segment pixels and multiple calculations of similarity measures. In order to analyze the effect sampling has on classification results, the statistics and probability value equations of the non-parametric two-sample Kolmogorov-Smirnov test and the parametric Student's t-test are selected as similarity measures in the classification process. The performance of both classifiers was assessed on a WorldView-2 image for four land cover classes (roads, buildings, grass and trees) and compared to two commonly used object-based classifiers, k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM). Both proposed classifiers showed a slight improvement in overall classification accuracy and produced more accurate classification maps when compared to the ground truth image.

  20. Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.

    Science.gov (United States)

    Price, W R; Olsen, R A; Hunter, J E

    1972-04-01

    A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
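
    The economics of pooling can be made concrete with the classic two-stage (Dorfman) group-testing calculation; this back-of-envelope sketch is generic arithmetic, not taken from the paper:

```python
# Expected tests per sample under two-stage pooling: test one pool of
# k samples; only if the pool is positive, retest each individually.
def tests_per_sample(prevalence: float, k: int) -> float:
    p_pool_positive = 1.0 - (1.0 - prevalence) ** k
    return 1.0 / k + p_pool_positive

for k in (5, 10, 25):
    print(f"k={k:2d}  tests/sample={tests_per_sample(0.01, k):.3f}")
# At 1% prevalence, pools of 25 need only ~0.26 tests per sample.
```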

  1. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility – High throughput sample evaluation and automation

    International Nuclear Information System (INIS)

    Theveneau, P; Baker, R; Barrett, R; Beteva, A; Bowler, M W; Carpentier, P; Caserotto, H; Sanctis, D de; Dobias, F; Flot, D; Guijarro, M; Giraud, T; Lentini, M; Leonard, G A; Mattenet, M; McSweeney, S M; Morawe, C; Nurizzo, D; McCarthy, A A; Nanao, M

    2013-01-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This 'first generation' of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  2. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    Directory of Open Access Journals (Sweden)

    Kottawattage S. A. Kottawatta

    2017-11-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  3. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples.

    Science.gov (United States)

    Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-11-29

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  4. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello

    2012-01-01

    Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), but the analytical properties of multiple category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
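
    The decision logic of a multiple-category LQAS design can be sketched in a few lines: count positives in a fixed sample and compare against two thresholds. The thresholds below are invented for illustration and are not the validated design from the study:

```python
# Three-category LQAS rule: sample N children, count positives, and
# classify prevalence by two decision thresholds (illustrative values).
from scipy.stats import binom

N, D1, D2 = 15, 2, 7   # sample size and illustrative thresholds

def classify(positives: int) -> str:
    if positives <= D1:
        return "low"
    return "moderate" if positives <= D2 else "high"

print(classify(1), classify(5), classify(9))   # low moderate high

# Operating characteristic: probability of a "low" call vs. prevalence
for p in (0.05, 0.20, 0.60):
    print(f"true prevalence {p:.2f}: P(low) = {binom.cdf(D1, N, p):.3f}")
```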

  5. Analysis of stationary power/amplitude distributions for multiple channels of sampled FBGs.

    Science.gov (United States)

    Xing, Ya; Zou, Xihua; Pan, Wei; Yan, Lianshan; Luo, Bin; Shao, Liyang

    2015-08-10

    Stationary power/amplitude distributions along the grating length are analyzed for the multiple channels of a sampled fiber Bragg grating (SFBG). Unlike a uniform FBG, the SFBG has multiple channels in its reflection spectrum rather than a single one. The stationary power/amplitude distributions for these multiple channels are therefore analyzed using two different theoretical models. In the first model, the SFBG is regarded as a set of grating sections and non-grating sections, which are alternately stacked; a step-like distribution is obtained for the corresponding power/amplitude of each channel along the grating length. In the second model, the SFBG is decomposed into multiple uniform "ghost" gratings, and a continuous distribution is obtained for each ghost grating (i.e., each channel). After a comparison, the distributions obtained in the two models are identical, and the equivalence between the two models is demonstrated. In addition, the impacts of the duty cycle on the power/amplitude distributions of the multiple channels of an SFBG are presented.

  6. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. Demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.
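
    As a toy illustration of the task-sequencing idea (module names follow the abstract; the behaviour is stubbed, and the real TSC also handles scheduling, resource management, and fault monitoring):

```python
# Toy task sequence controller: route each sample through the standard
# laboratory modules in order. Stubs only; not the CAA project's code.
from dataclasses import dataclass, field

@dataclass
class Sample:
    sample_id: str
    log: list = field(default_factory=list)

def soxhlet_extract(s): s.log.append("extracted"); return s
def concentrate(s):     s.log.append("concentrated"); return s
def column_cleanup(s):  s.log.append("cleaned"); return s
def gc_measure(s):      s.log.append("measured"); return s
def interpret(s):       s.log.append("PCB result"); return s

PIPELINE = [soxhlet_extract, concentrate, column_cleanup, gc_measure, interpret]

def task_sequence_controller(samples):
    for s in samples:            # robot transport happens between steps
        for step in PIPELINE:
            s = step(s)
        print(s.sample_id, "->", ", ".join(s.log))

task_sequence_controller([Sample("soil-001"), Sample("soil-002")])
```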

  7. Efficient computation of the joint sample frequency spectra for multiple populations.

    Science.gov (United States)

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
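
    To make the "joint SFS" concrete: it is a table counting, for each polymorphic site, how many sampled chromosomes in each population carry the derived allele. A naive tabulation from invented data (this is only the summary statistic itself, not momi's model-based expected-SFS computation):

```python
# Tabulate a two-population joint site frequency spectrum from
# per-site derived-allele counts (synthetic data).
import numpy as np

n1, n2 = 4, 6    # haploid sample sizes in populations 1 and 2
pop1 = np.array([1, 0, 2, 4, 1, 3, 0, 2])   # derived counts, 8 sites
pop2 = np.array([0, 3, 2, 5, 1, 6, 2, 0])

sfs = np.zeros((n1 + 1, n2 + 1), dtype=int)
for i, j in zip(pop1, pop2):
    sfs[i, j] += 1
print(sfs)   # entry (i, j): sites with i copies in pop1, j in pop2
```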

  8. Miniaturized bead-beating device to automate full DNA sample preparation processes for gram-positive bacteria.

    Science.gov (United States)

    Hwang, Kyu-Youn; Kwon, Sung Hong; Jung, Sun-Ok; Lim, Hee-Kyun; Jung, Won-Jong; Park, Chin-Sung; Kim, Joon-Ho; Suh, Kahp-Yang; Huh, Nam

    2011-11-07

    We have developed a miniaturized bead-beating device to automate nucleic acid extraction from Gram-positive bacteria for molecular diagnostics. The microfluidic device was fabricated by sandwiching a monolithic flexible polydimethylsiloxane (PDMS) membrane between two glass wafers (i.e., glass-PDMS-glass); the membrane acted as an actuator for bead collision via its pneumatic vibration, without additional lysis equipment. The Gram-positive bacteria S. aureus and methicillin-resistant S. aureus were captured on surface-modified glass beads from 1 mL of initial sample solution and lyzed in situ by the bead-beating operation. Then, 10 μL or 20 μL of bacterial DNA solution was eluted and amplified successfully by real-time PCR. It was found that the liquid volume fraction played a crucial role in determining cell lysis efficiency in a confined chamber by facilitating membrane deflection and bead motion. The miniaturized bead-beating operation disrupted most of the S. aureus within 3 min, which turned out to be as efficient as a conventional benchtop vortexing machine or the enzyme-based lysis technique. The effective cell concentration was significantly enhanced by reducing the initial sample volume 50- or 100-fold. The combination of such analyte enrichment and in situ bead-beating lysis provided an excellent PCR detection sensitivity of ca. 46 CFU, even for Gram-positive bacteria. The proposed bead-beating microdevice is potentially useful as a nucleic acid extraction method for a PCR-based sample-to-answer system.

  9. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    Science.gov (United States)

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities were trained and tested in an automated match-to-samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  10. Single- versus multiple-sample method to measure glomerular filtration rate.

    Science.gov (United States)

    Delanaye, Pierre; Flamant, Martin; Dubourg, Laurence; Vidal-Petiot, Emmanuelle; Lemoine, Sandrine; Cavalier, Etienne; Schaeffner, Elke; Ebert, Natalie; Pottel, Hans

    2018-01-08

    There are many different ways to measure glomerular filtration rate (GFR) using various exogenous filtration markers, each with its own strengths and limitations. However, not only the marker, but also the methodology may vary in many ways, including the use of urinary or plasma clearance and, in the case of plasma clearance, the number of time points used to calculate the area under the concentration-time curve, ranging from only one (Jacobsson method) to eight (or more) blood samples. We collected the results obtained from 5106 plasma clearances (iohexol or ⁵¹Cr-ethylenediaminetetraacetic acid (EDTA)) using three to four time points, allowing GFR calculation using the slope-intercept method and the Bröchner-Mortensen correction. For each time point, the Jacobsson formula was applied to obtain the single-sample GFR. We used Bland-Altman plots to determine the accuracy of the Jacobsson method at each time point. The single-sample method showed within-10% concordances with the multiple-sample method of 66.4%, 83.6%, 91.4% and 96.0% at the time points 120, 180, 240 and ≥300 min, respectively. Concordance was poorer at lower GFR levels, and this trend parallels increasing age. Results were similar in males and females. Some discordance was found in obese subjects. Single-sample GFR is highly concordant with a multiple-sample strategy, except in the low GFR range (<30 mL/min).
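
    A sketch of the multiple-sample arithmetic referred to above: fit a mono-exponential to the late plasma samples, compute the slope-intercept clearance, then apply the Bröchner-Mortensen correction. The plasma data are synthetic, and the correction coefficients shown are the commonly cited adult values, assumed here for illustration:

```python
# Slope-intercept plasma clearance from late samples, followed by the
# Broechner-Mortensen one-pool correction (coefficients assumed).
import numpy as np

dose = 15_000_000                       # injected marker (arbitrary units)
t = np.array([120, 180, 240, 300])      # sampling times, min
c = np.array([410, 310, 235, 180])      # plasma concentrations (units/mL)

slope, intercept = np.polyfit(t, np.log(c), 1)
k, c0 = -slope, np.exp(intercept)
cl_si = dose * k / c0                   # one-pool clearance, mL/min
gfr = 0.990778 * cl_si - 0.001218 * cl_si**2
print(f"slope-intercept CL = {cl_si:.0f} mL/min, corrected GFR = {gfr:.0f}")
```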

  11. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in the analytical characterization of high-level mixed waste from Hanford's single-shell and double-shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively coupled plasma-atomic emission spectroscopy, total organic carbon methods, and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample-preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from different sample-preparation treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single-shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed.

  12. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    Science.gov (United States)

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra, and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that are likely mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine-related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples.

  13. Development and evaluation of a multiple-plate fraction collector for sample processing: application to radioprofiling in drug metabolism studies.

    Science.gov (United States)

    Barros, Anthony; Ly, Van T; Chando, Theodore J; Ruan, Qian; Donenfeld, Scott L; Holub, David P; Christopher, Lisa J

    2011-04-05

    Microplate scintillation counters are routinely utilized in drug metabolism laboratories for the off-line radioanalysis of fractions collected during HPLC radioprofiling. In this process, current fraction collection technology is limited by the number of plates that can be used per injection, as well as by the potential for sample loss due to dripping or spraying as the fraction collector head moves from well to well or between plates. More importantly, sample throughput is limited in the conventional process, since the collection plates must be manually exchanged after each injection. The Collect PAL, an innovative multiple-plate fraction collector, was developed to address these deficiencies and improve overall sample throughput. It employs a zero-loss design and has sub-ambient temperature control. Operation of the system is completely controlled by software, and up to 24 (96- or 384-well) fraction collection plates can be loaded in a completely automated run. The system may also be configured for collection into various-sized tubes or vials. At flow rates of 0.5 or 1.0 mL/min and at collection times of 10 or 15 s, the system precisely delivered 83-μL fractions (within 4.1% CV) and 250-μL fractions (within 1.4% CV), respectively, of three different mobile phases into 12 mm × 32 mm vials. Similarly, at a flow rate of 1 mL/min and 10 s collection times, the system precisely dispensed mobile phase containing a [¹⁴C]-radiolabeled compound across an entire 96-well plate (% CV within 5.3%). Triplicate analyses of metabolism test samples containing [¹⁴C]buspirone and its metabolites, derived from three different matrices (plasma, urine and bile), indicated that the Collect PAL produced radioprofiles that were reproducible and comparable to the current technology; the % CV for 9 selected peaks in the radioprofiles generated with the Collect PAL was within 9.3%. Radioprofiles generated by collecting into 96- and 384-well plates were qualitatively comparable.

  14. Simultaneous small-sample comparisons in longitudinal or multi-endpoint trials using multiple marginal models

    DEFF Research Database (Denmark)

    Pallmann, Philip; Ritz, Christian; Hothorn, Ludwig A

    2018-01-01

    Simultaneous inference in longitudinal, repeated-measures, and multi-endpoint designs can be onerous, especially when trying to find a reasonable joint model from which the interesting effects and covariances are estimated. A novel statistical approach known as multiple marginal models greatly simplifies the modelling process: the core idea is to "marginalise" the problem and fit multiple small models to different portions of the data, and then estimate the overall covariance matrix in a subsequent, separate step. Using these estimates guarantees strong control of the family-wise error rate, however only asymptotically. In this paper, we show how to make the approach also applicable to small-sample data problems. Specifically, we discuss the computation of adjusted P values and simultaneous confidence bounds for comparisons of randomised treatment groups as well as for levels …

  15. A multiple sampling time projection ionization chamber for nuclear fragment tracking and charge measurement

    International Nuclear Information System (INIS)

    Bauer, G.; Bieser, F.; Brady, F.P.; Chance, J.C.; Christie, W.F.; Gilkes, M.; Lindenstruth, V.; Lynen, U.; Mueller, W.F.J.; Romero, J.L.; Sann, H.; Tull, C.E.; Warren, P.

    1997-01-01

    A detector has been developed for the tracking and charge measurement of projectile fragment nuclei produced in relativistic nuclear collisions. This device, MUSIC II, is a second-generation Multiple Sampling Ionization Chamber (MUSIC) and employs the principles of ionization and time projection chambers. It provides unique charge determination for charges Z≥6 and excellent track position measurement. MUSIC II has been used most recently with the EOS (equation of state) TPC and other EOS collaboration detectors. Earlier it was used with other systems in experiments at the Heavy Ion Superconducting Spectrometer (HISS) facility at Lawrence Berkeley Laboratory and the ALADIN spectrometer at GSI. (orig.)

  16. A New Attribute Control Chart using Multiple Dependent State Repetitive Sampling

    KAUST Repository

    Aldosari, Mansour Sattam; Aslam, Muhammad; Jun, Chi-Hyuck

    2017-01-01

    In this manuscript, a new attribute control chart using multiple dependent state repetitive sampling is designed. The operational procedure and structure of the proposed control chart are given, together with the measures required to determine the average run length (ARL) for in-control and out-of-control processes. Tables of ARLs are reported for various control chart parameters. The proposed control chart is more sensitive in detecting a small shift in the process than the existing attribute control charts. A simulation study shows the efficiency of the proposed chart over the existing charts, and an example is given for illustration purposes.
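
    For orientation, when a chart signals independently from sample to sample, the run length is geometric and the ARL is just the reciprocal of the per-subgroup signal probability. A generic sketch with standard np-chart logic (not the paper's multiple dependent state repetitive sampling scheme):

```python
# Generic attribute-chart ARL: ARL = 1 / P(signal per subgroup).
# Illustrative limits; not the proposed chart's design values.
from scipy.stats import binom

n, lcl, ucl = 50, 0, 11     # subgroup size and count control limits

def arl(p: float) -> float:
    p_inside = binom.cdf(ucl, n, p) - binom.cdf(lcl - 1, n, p)
    return 1.0 / (1.0 - p_inside)

print(f"in-control (p=0.10) ARL = {arl(0.10):.0f}")
print(f"shifted    (p=0.20) ARL = {arl(0.20):.1f}")
```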

  17. Hierarchical sampling of multiple strata: an innovative technique in exposure characterization

    International Nuclear Information System (INIS)

    Ericson, J.E.; Gonzalez, Elisabeth J.

    2003-01-01

    Sampling of multiple strata, or hierarchical sampling of various exposure sources and activity areas, has been tested and is suggested as a method to sample (or to locate) areas with a high prevalence of elevated blood lead in children. Hierarchical sampling was devised to supplement traditional soil lead sampling of a single stratum, either residential or fixed point source, using a multistep strategy. Blood lead (n=1141) and soil lead (n=378) data collected under the USEPA/UCI Tijuana Lead Project (1996-1999) were analyzed to evaluate the usefulness of sampling soil lead from background sites, schools and parks, point sources, and residences. Results revealed that industrial emissions have been a contributing factor to soil lead contamination in Tijuana. At the regional level, point source soil lead was associated with mean blood lead levels and concurrent high background, and point source soil lead levels were predictive of a high percentage of subjects with blood lead equal to or greater than 10 μg/dL (pe10). Significant relationships were observed between mean blood lead level and fixed point source soil lead (r=0.93; R²=0.72 using a quadratic model) and between residential soil lead and fixed point source soil lead (r=0.90; R²=0.86 using a cubic model). This study suggests that point sources alone are not sufficient for predicting the relative risk of exposure to lead in the urban environment. These findings will be useful in defining regions for targeted or universal soil lead sampling by site type. Point sources have been observed to be predictive of mean blood lead at the regional level; however, this relationship alone was not sufficient to predict pe10. It is concluded that when apparently undisturbed sites reveal high soil lead levels in addition to local point sources, dispersion of lead is widespread and will be associated with a high prevalence of elevated blood lead in children. Multiple strata sampling was shown to be useful in

  18. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    International Nuclear Information System (INIS)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-01-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failures. Thus, to ensure normal production and the safety of machine operators, it is essential to detect the failures as soon as possible. However, it is a major challenge to maintain a balance between detection efficiency and big data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which can ease the pressure generated by the large-scale data. The big data of a faulty roller bearing’s vibration signals is firstly reduced by a down-sample strategy while preserving the fault features by selecting peaks to represent the data segments in time domain. However, a problem arises in that the fault features may be weaker than before, since the noise may be mistaken for the peaks when the noise is stronger than the vibration signals, which makes the fault features unable to be extracted by commonly-used envelope analysis. Here we employ compressive sensing theory to overcome this problem, which can make a signal enhancement and reduce the sample sizes further. Moreover, it is capable of detecting fault features from a small number of samples based on orthogonal matching pursuit approach, which can overcome the shortcomings of the multiple down-sample algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults. (paper)
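
    The recovery step named above can be demonstrated generically: reconstruct a sparse vector from far fewer random projections than its length using orthogonal matching pursuit. This is a textbook compressed-sensing demo, not the authors' bearing-specific pipeline:

```python
# Sparse recovery with orthogonal matching pursuit (OMP).
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5     # signal length, measurements, sparsity

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)   # sparse signal
Phi = rng.normal(0, 1.0 / np.sqrt(m), (m, n))              # sensing matrix
y = Phi @ x                                                # m measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
print("max reconstruction error:", np.abs(omp.coef_ - x).max())
```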

  19. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    Science.gov (United States)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-02-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failure. Thus, to ensure normal production and the safety of machine operators, it is essential to detect failures as soon as possible. However, it is a major challenge to maintain a balance between detection efficiency and the volume of data acquired, given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist rate, which eases the pressure created by large-scale data. The vibration signal of a faulty roller bearing is first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may be weaker than before: when the noise is stronger than the vibration signal, noise may be mistaken for the peaks, so that the fault features can no longer be extracted by commonly used envelope analysis. Here we employ compressive sensing theory to overcome this problem; it enhances the signal and further reduces the sample size. Moreover, it is capable of detecting fault features from a small number of samples based on the orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults.

  1. A multiple sampling ionization chamber (MUSIC) for measuring the charge of relativistic heavy ions

    International Nuclear Information System (INIS)

    Christie, W.B.; Romero, J.L.; Brady, F.P.; Tull, C.E.; Castaneda, C.M.; Barasch, E.F.; Webb, M.L.; Drummond, J.R.; Sann, H.; Young, J.C.

    1987-01-01

    A large area (1 m x 2 m) multiple sampling ionization chamber (MUSIC) has been constructed and tested. The MUSIC detector makes multiple measurements of energy 'loss', dE/dx, for a relativistic heavy ion. Given the velocity, the charge of the ion can be extracted from the energy loss distributions. The widths of the distributions we observe are much narrower than predicted by Vavilov's theory for energy loss while agreeing well with the theory of Badhwar which deals with the energy deposited. The versatile design of MUSIC allows a variety of anode configurations which results in a large dynamic range of charge. In our tests to date we have observed charge resolutions of 0.25e fwhm for 727 MeV/nucleon ⁴⁰Ar and 0.30e fwhm for 1.08 GeV/nucleon ¹³⁹La and ¹³⁹La fragments. Vertical position and multiple track determination are obtained by using time projection chamber electronics. Preliminary tests indicate that the position resolution is also very good with σ ≈ 100 μm. (orig.)

  2. Psychosocial risks associated with multiple births resulting from assisted reproduction: a Spanish sample.

    Science.gov (United States)

    Roca de Bes, Montserrat; Gutierrez Maldonado, José; Gris Martínez, José M

    2009-09-01

    To determine the psychosocial risks associated with multiple births (twins or triplets) resulting from assisted reproductive technology (ART). Transverse study. Infertility units of a university hospital and a private hospital. Mothers and fathers of children between 6 months and 4 years conceived by ART (n = 123). The sample was divided into three groups: parents of singletons (n = 77), twins (n = 37), and triplets (n = 9). The questionnaire was self-administered by patients. It was either completed at the hospital or mailed to participants' homes. Scales measured material needs, quality of life, social stigma, depression, stress, and marital satisfaction. Logistic regression models were applied. Significant odds ratios were obtained for the number of children, material needs, social stigma, quality of life, and marital satisfaction. The results were more significant for data provided by mothers than by fathers. The informed consent form handed out at the beginning of ART should include information on the high risk of conceiving twins and triplets and on the possible psychosocial consequences of multiple births. As soon as a multiple pregnancy is confirmed, it would be useful to provide information on support groups and institutions. Psychological advice should also be given to the parents.

  3. A neutron multiplicity analysis method for uranium samples with liquid scintillators

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Hao, E-mail: zhouhao_ciae@126.com [China Institute of Atomic Energy, P.O.BOX 275-8, Beijing 102413 (China); Lin, Hongtao [Xi' an Reasearch Institute of High-tech, Xi' an, Shaanxi 710025 (China); Liu, Guorong; Li, Jinghuai; Liang, Qinglei; Zhao, Yonggang [China Institute of Atomic Energy, P.O.BOX 275-8, Beijing 102413 (China)

    2015-10-11

    A new neutron multiplicity analysis method for uranium samples with liquid scintillators is introduced. An active well-type fast neutron multiplicity counter has been built, which consists of four BC501A liquid scintillators, an n/γ discrimination module MPD-4, a multi-stop time-to-digital converter MCS6A, and two Am–Li sources. A mathematical model is built to describe the detection process of fission neutrons. Based on this model, equations of the form R = F·P·Q·T can be obtained, where F is the induced fission rate due to the interrogation sources, P is the transfer matrix determined by the multiplication process, Q is the transfer matrix determined by the detection efficiency, and T is the transfer matrix determined by the signal recording process and crosstalk in the counter. The unknown parameters of the item are determined by solving these equations. A ²⁵²Cf source and some low enriched uranium items have been measured. The feasibility of the method is proven by its application to the data analysis of the experiments.
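
    To make the transfer-matrix formulation concrete, the sketch below solves a toy instance of R = F·P·Q·T by least squares. The matrices and the "true" fission term are made-up placeholders standing in for the counter's calibration, which the abstract does not provide.

```python
# Minimal sketch of solving R = (T @ Q @ P) f for the unknown fission term,
# in the spirit of the transfer-matrix formulation above. P, Q, T and the
# "true" term are placeholders, not the calibration of the actual counter.
import numpy as np

rng = np.random.default_rng(1)
k = 5                                      # number of multiplicity bins (assumed)

P = np.triu(rng.random((k, k)))            # multiplication transfer (placeholder)
Q = np.diag(np.linspace(0.6, 0.2, k))      # detection-efficiency transfer (placeholder)
T = np.eye(k) + 0.05 * rng.random((k, k))  # recording/crosstalk transfer (placeholder)

A = T @ Q @ P                              # combined response matrix
f_true = np.array([0.5, 0.3, 0.12, 0.06, 0.02])   # induced-fission term (placeholder)
R = A @ f_true + 1e-3 * rng.standard_normal(k)    # measured rates with noise

# Recover the fission term from the measured multiplicity rates.
f_est, *_ = np.linalg.lstsq(A, R, rcond=None)
print("estimated fission term:", np.round(f_est, 3))
```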

  4. Multiple sampling ionization chamber (MUSIC) for measuring the charge of relativistic heavy ions

    Energy Technology Data Exchange (ETDEWEB)

    Christie, W.B.; Romero, J.L.; Brady, F.P.; Tull, C.E.; Castaneda, C.M.; Barasch, E.F.; Webb, M.L.; Drummond, J.R.; Crawford, H.J.; Flores, I.

    1987-04-01

    A large area (1 m x 2 m) multiple sampling ionization chamber (MUSIC) has been constructed and tested. The MUSIC detector makes multiple measurements of energy 'loss', dE/dx, for a relativistic heavy ion. Given the velocity, the charge of the ion can be extracted from the energy loss distributions. The widths of the distributions we observe are much narrower than predicted by Vavilov's theory for energy loss while agreeing well with the theory of Badhwar which deals with the energy deposited. The versatile design of MUSIC allows a variety of anode configurations which results in a large dynamic range of charge. In our tests to date we have observed charge resolutions of 0.25e fwhm for 727 MeV/nucleon ⁴⁰Ar and 0.30e fwhm for 1.08 GeV/nucleon ¹³⁹La and ¹³⁹La fragments. Vertical position and multiple track determination are obtained by using time projection chamber electronics. Preliminary tests indicate that the position resolution is also very good with σ ≈ 100 μm.

  5. Use of multiple tobacco products in a national sample of persons enrolled in addiction treatment.

    Science.gov (United States)

    Guydish, Joseph; Tajima, Barbara; Pramod, Sowmya; Le, Thao; Gubner, Noah R; Campbell, Barbara; Roman, Paul

    2016-09-01

    To explore use of tobacco products in relationship to marketing exposure among persons in addiction treatment. A random sample of treatment programs was drawn from the National Institute on Drug Abuse (NIDA) Clinical Trials Network (CTN). Participants in each program completed surveys concerning use of tobacco products (N=1113). Exposure to tobacco marketing and counter-marketing, advertising receptivity, and perceived health risks of smoking were tested for their association with use of multiple tobacco products. Prevalence of combustible cigarette use was 77.9%. Weekly or greater use of other products was: e-cigarettes (17.7%), little filtered cigars (8.6%), smokeless tobacco (5.2%), and standard cigars (4.6%), with 24.4% using multiple tobacco products. Compared to single product users, multiple product users smoked more cigarettes per day (OR=1.03, 95% CI 1.01-1.05, p<0.05), reported greater exposure to advertising for products other than combustible cigarettes (OR=1.93, CI 1.35-2.75, p<0.05), and reported greater exposure to tobacco counter-marketing (OR=1.70, 95% CI: 1.09-2.63, p=0.019). Heavier smokers and those trying to quit may be more likely to use e-cigarettes, little filtered cigars, or smokeless tobacco and have greater susceptibility to their advertising. This highlights the importance of regulating advertising related to smoking cessation, as the effectiveness of these products for that purpose has not been demonstrated.

  6. Automated identification of best-quality coronary artery segments from multiple-phase coronary CT angiography (cCTA) for vessel analysis

    Science.gov (United States)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-03-01

    We are developing an automated method to identify the best quality segment among the corresponding segments in multiple-phase cCTA. The coronary artery trees are automatically extracted from different cCTA phases using our multi-scale vessel segmentation and tracking method. An automated registration method is then used to align the multiple-phase artery trees. The corresponding coronary artery segments are identified in the registered vessel trees and are straightened by curved planar reformation (CPR). Four features are extracted from each segment in each phase as quality indicators in the original CT volume and the straightened CPR volume. Each quality indicator is used as a voting classifier to vote the corresponding segments. A newly designed weighted voting ensemble (WVE) classifier is finally used to determine the best-quality coronary segment. An observer preference study is conducted with three readers to visually rate the quality of the vessels in 1 to 6 rankings. Six and 10 cCTA cases are used as training and test set in this preliminary study. For the 10 test cases, the agreement between automatically identified best-quality (AI-BQ) segments and radiologist's top 2 rankings is 79.7%, and between AI-BQ and the other two readers are 74.8% and 83.7%, respectively. The results demonstrated that the performance of our automated method was comparable to those of experienced readers for identification of the best-quality coronary segments.

  7. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and assessed samples to be suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  8. Estimating Dbh of Trees Employing Multiple Linear Regression of the best Lidar-Derived Parameter Combination Automated in Python in a Natural Broadleaf Forest in the Philippines

    Science.gov (United States)

    Ibanez, C. A. G.; Carcellar, B. G., III; Paringit, E. C.; Argamosa, R. J. L.; Faelga, R. A. G.; Posilero, M. A. V.; Zaragosa, G. P.; Dimayacyac, N. A.

    2016-06-01

    Diameter-at-breast-height (DBH) estimation is a prerequisite to various allometric equations estimating important forestry indices like stem volume, basal area, biomass, and carbon stock. LiDAR technology provides a means of directly obtaining different forest parameters, except DBH, from the behavior and characteristics of the point cloud, which are unique to different forest classes. An extensive tree inventory was done on a two-hectare established sample plot in Mt. Makiling, Laguna for a natural growth forest. Coordinates, height, and canopy cover were measured, and species were identified for comparison with LiDAR derivatives. Multiple linear regression was used to obtain LiDAR-derived DBH by integrating field-derived DBH and 27 LiDAR-derived parameters at 20 m, 10 m, and 5 m grid resolutions. To find the best combination of parameters for DBH estimation, all possible combinations of parameters were generated and scored, automated with Python scripts using regression-related libraries such as NumPy, SciPy, and scikit-learn. The combination yielding the highest r-squared (coefficient of determination) and the lowest AIC (Akaike's Information Criterion) and BIC (Bayesian Information Criterion) was taken as the best equation. The best equation used 11 parameters at the 10 m grid size, with an r-squared of 0.604, an AIC of 154.04, and a BIC of 175.08. The combination of parameters may differ among forest classes in further studies. Additional statistical tests, such as the Kaiser-Meyer-Olkin (KMO) coefficient and Bartlett's Test of Sphericity (BTS), can be supplemented to help determine the correlation among parameters.
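
    The exhaustive search the authors automated can be condensed to a few lines of Python. The sketch below scores every predictor subset by R², AIC, and BIC with scikit-learn; the column names are hypothetical, the AIC/BIC use the standard least-squares forms (which may differ by constants from the authors' implementation), and a `max_size` cap is included because naively enumerating all 2²⁷ subsets of 27 predictors would be impractical.

```python
# Condensed sketch of the exhaustive combination search described above:
# fit a linear model for every subset of LiDAR-derived predictors and keep
# the one with the best R^2 / AIC / BIC. Column names are hypothetical.
from itertools import combinations
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def aic_bic(rss, n, k):
    # Standard least-squares forms; constants may differ between packages.
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

def best_subset(df, predictors, target="field_dbh", max_size=4):
    y = df[target].to_numpy()
    n = len(df)
    best = None
    for size in range(1, max_size + 1):
        for combo in combinations(predictors, size):
            X = df[list(combo)].to_numpy()
            model = LinearRegression().fit(X, y)
            resid = y - model.predict(X)
            rss = float(resid @ resid)
            r2 = model.score(X, y)
            aic, bic = aic_bic(rss, n, size + 1)  # +1 for the intercept
            if best is None or (r2 > best[1] and aic < best[2]):
                best = (combo, r2, aic, bic)
    return best

# Usage (hypothetical CSV of plot-level LiDAR metrics and field DBH):
# df = pd.read_csv("lidar_metrics_10m.csv")
# combo, r2, aic, bic = best_subset(df, [c for c in df.columns if c != "field_dbh"])
```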

  9. Multiple sample setup for testing the hydrothermal stability of adsorbents in thermal energy storage applications

    International Nuclear Information System (INIS)

    Fischer, Fabian; Laevemann, Eberhard

    2015-01-01

    Thermal energy storage based on adsorption and desorption of water on an adsorbent can achieve high energy storage densities. Many adsorbents lose adsorption capacity when operated under unfavourable hydrothermal conditions during adsorption and desorption. The stability of an adsorbent against stressing hydrothermal conditions is therefore a key issue for its usability in adsorption thermal energy storage. We built an experimental setup that simultaneously controls the hydrothermal conditions of 16 samples arranged in a matrix of four temperatures and four water vapour pressures. This setup allows the testing of potential adsorbents at temperatures between 50 °C and 350 °C and water vapour pressures of up to 32 kPa. A measurement procedure that detects the hydrothermal stability of an adsorbent after defined time spans has been designed. We verified the functionality of the multiple sample measurements with a microporous adsorbent, a zeolite NaMSX, whose hydrothermal stability was tested by water uptake measurements. A standard deviation below 1% across the 16 samples in the stability measurements makes it possible to set different conditions in each sample cell. Further, we validated the water uptake measurements against adsorption isotherms measured with the volumetric device BELSORP Aqua 3 from Bel Japan. (paper)

  10. Detecting Renibacterium salmoninarum in wild brown trout by use of multiple organ samples and diagnostic methods

    Science.gov (United States)

    Guomundsdottir, S.; Applegate, Lynn M.; Arnason, I.O.; Kristmundsson, A.; Purcell, Maureen K.; Elliott, Diane G.

    2017-01-01

    Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease (BKD), is endemic in many wild trout species in northerly regions. The aim of the present study was to determine the optimal R. salmoninarum sampling/testing strategy for wild brown trout (Salmo trutta L.) populations in Iceland. Fish were netted in a lake and multiple organs—kidney, spleen, gills, oesophagus and mid-gut—were sampled and subjected to five detection tests: culture, polyclonal enzyme-linked immunosorbent assay (pELISA), and three different PCR tests. The results showed that every fish had encountered R. salmoninarum, but there were marked differences between results depending on organ and test. The bacterium was not cultured from any kidney sample, while all kidney samples were positive by pELISA. At least one organ from 92.9% of the fish tested positive by PCR. The results demonstrated that the choice of tissue and diagnostic method can dramatically influence the outcome of R. salmoninarum surveys.

  11. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, had not been studied before. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys, conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved.
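
    As a point of reference for the allocation problem, the sketch below implements the classical Neyman allocation, which minimizes the variance of a stratified mean for a fixed total sample size by allocating n_h proportional to N_h·S_h. This is the textbook benchmark, not the IST-specific optimum derived in the paper.

```python
# Sketch of classical Neyman allocation: for a fixed total sample size n,
# allocate n_h proportional to N_h * S_h (stratum size times stratum SD)
# to minimize the variance of the stratified mean. Rounding may make the
# parts differ from n_total by one or two units.
def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    weights = [N * S for N, S in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Usage with made-up strata (e.g., three faculties in a student survey):
print(neyman_allocation(600, [5000, 3000, 2000], [12.0, 20.0, 8.0]))
# larger and more variable strata receive proportionally more samples
```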

  12. Modular time division multiplexer: Efficient simultaneous characterization of fast and slow transients in multiple samples

    Science.gov (United States)

    Kim, Stephan D.; Luo, Jiajun; Buchholz, D. Bruce; Chang, R. P. H.; Grayson, M.

    2016-09-01

    A modular time division multiplexer (MTDM) device is introduced to enable parallel measurement of multiple samples with both fast and slow decay transients spanning from millisecond to month-long time scales. This is achieved by dedicating a single high-speed measurement instrument to rapid data collection at the start of a transient, and by multiplexing a second low-speed measurement instrument for slow data collection of several samples in parallel during the later transients. The MTDM is a high-level design concept that can in principle measure an arbitrary number of samples, and the low-cost implementation here allows up to 16 samples to be measured in parallel over several months, reducing the total ensemble measurement duration and equipment usage by as much as an order of magnitude without sacrificing fidelity. The MTDM was successfully demonstrated by simultaneously measuring the photoconductivity of three amorphous indium-gallium-zinc-oxide thin films with 20 ms data resolution for fast transients and an uninterrupted parallel run time of over 20 days. The MTDM has potential applications in many areas of research that manifest response times spanning many orders of magnitude, such as photovoltaics, rechargeable batteries, and amorphous semiconductors like silicon and amorphous indium-gallium-zinc-oxide.

  13. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    Science.gov (United States)

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of processing virome sequencing data from multiple samples in parallel by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de novo assembly and taxonomic classification of viruses, as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hit tables, interactive population maps, alpha diversity measures and clustered heatmaps grouped in applicable custom sample categories. Known references such as the human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable in modern browsers. Vipie is a freely available web-based tool whose code is open source.

  14. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the use of an internal standard for normalization to diminish variation. 20 healthy volunteers donated one blood sample each, which was subsequently centrifuged, aliquoted, and stored at -80 °C prior to analysis. The analytical variation was determined on eight runs, where plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14% and 9-13% for PPP reagent. This variation can be reduced slightly by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and it is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, which is nevertheless considerably lower than the between-subject variation when using PPPlow as reagent.
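
    The two computations at issue, analytical CV from replicates and normalization against an internal-standard plasma run on every plate, can be sketched as follows. All numbers are invented for illustration; the normalization simply rescales each plate by the ratio of the assigned internal-standard value to the observed one.

```python
# Small sketch of the computations discussed above: analytical CV from
# replicate runs, and normalization of ETP against an internal-standard
# plasma run on every plate. All values are invented for illustration.
import numpy as np

# ETP (nM*min) triplicates of one donor across three runs/plates, plus the
# internal standard measured on each plate (hypothetical values).
etp_triplicates = np.array([[1510, 1475, 1538],
                            [1462, 1500, 1487],
                            [1550, 1521, 1563]])
internal_standard = np.array([1495, 1470, 1530])   # one IS value per plate
reference_value = 1500.0                           # assigned IS target

cv_within = etp_triplicates.std(axis=1, ddof=1) / etp_triplicates.mean(axis=1)
print("within-run CV per plate (%):", np.round(100 * cv_within, 1))

# Normalize each plate by the ratio of the assigned IS value to the
# observed IS value, then recompute the between-run variation.
normalized = etp_triplicates * (reference_value / internal_standard)[:, None]
for label, data in (("raw", etp_triplicates), ("normalized", normalized)):
    means = data.mean(axis=1)
    cv_between = means.std(ddof=1) / means.mean()
    print(f"between-run CV, {label}: {100 * cv_between:.1f}%")
```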

  15. DEP-On-Go for Simultaneous Sensing of Multiple Heavy Metals Pollutants in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Madhu Biyani

    2016-12-01

    We describe a simple and affordable “Disposable Electrode Printed (DEP)-On-Go” sensing platform for the rapid on-site monitoring of trace heavy metal pollutants in environmental samples for early warning, based on a mobile electrochemical device composed of a palm-sized potentiostat and disposable unmodified screen-printed electrode chips. We present the analytical performance of our device for the sensitive detection of major heavy metal ions, namely mercury, cadmium, lead, arsenic, zinc, and copper, with detection limits of 1.5, 2.6, 4.0, 5.0, 14.4, and 15.5 μg L⁻¹, respectively. Importantly, the utility of this device extends to detecting multiple heavy metals simultaneously, with well-defined voltammograms and similar sensitivity. Finally, “DEP-On-Go” was successfully applied to detect heavy metals in real environmental samples from groundwater, tap water, house dust, soil, and industrially processed rice and noodle foods. We evaluated the efficiency of this system through a linear correlation with inductively coupled plasma mass spectrometry, and the results suggested that this system can be relied on for on-site screening purposes. On-field applications using real samples of groundwater for drinking in the northern parts of India support the easy-to-use, low-cost (<1 USD), rapid (within 5 min), and reliable (ppb-level detection limits) performance of our device for on-site detection and monitoring of multiple heavy metals in resource-limited settings.

  16. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system, and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has achieved the main advantages of TLA: a workload increase (64%) with a reduction in cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7% to 0.4% of the tubes), and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  17. Digital Droplet Multiple Displacement Amplification (ddMDA) for Whole Genome Sequencing of Limited DNA Samples.

    Directory of Open Access Journals (Sweden)

    Minsoung Rhee

    Multiple displacement amplification (MDA) is a widely used technique for amplification of DNA from samples containing limited amounts of DNA (e.g., uncultivable microbes or clinical samples) before whole genome sequencing. Despite its advantages of high yield and fidelity, it suffers from high amplification bias and non-specific amplification when amplifying sub-nanogram quantities of template DNA. Here, we present a microfluidic digital droplet MDA (ddMDA) technique in which partitioning of the template DNA into thousands of sub-nanoliter droplets, each containing a small number of DNA fragments, greatly reduces the competition among DNA fragments for primers and polymerase, thereby greatly reducing amplification bias. Consequently, the ddMDA approach enabled more uniform coverage of amplification over the entire length of the genome, with significantly lower bias and non-specific amplification than conventional MDA. For a sample containing 0.1 pg/μL of E. coli DNA (equivalent to ~3/1000 of an E. coli genome per droplet), ddMDA achieves a 65-fold increase in coverage in de novo assembly, and a more than 20-fold increase in specificity (percentage of reads mapping to E. coli) compared to conventional tube MDA. ddMDA offers a powerful method for many applications including medical diagnostics, forensics, and environmental microbiology.

  18. Automated gravimetric sample pretreatment using an industrial robot for the high-precision determination of plutonium by isotope dilution mass spectrometry

    International Nuclear Information System (INIS)

    Surugaya, Naoki; Hiyama, Toshiaki; Watahiki, Masaru

    2008-01-01

    A robotized sample-preparation method for the determination of Pu, which is recovered in the extraction reprocessing of spent nuclear fuel, by isotope dilution mass spectrometry (IDMS) is described. The automated system uses a six-axis industrial robot, whose motion is fast, accurate, and flexible, installed in a glove box. The automation of the weighing and dilution steps enables operator-unattended sample pretreatment for the high-precision analysis of Pu in aqueous solutions. Using the developed system, the Pu concentration in a HNO₃ medium was successfully determined from a set of subsequent mass spectrometric measurements. The relative uncertainty in determining the Pu concentration by IDMS using this system was estimated to be less than 0.1% (k=2), equal to that expected of a skilled analyst, and the operation time required was the same as that for a skilled operator. (author)

  19. Automation of registration of sample weights for high-volume neutron activation analysis at the IBR-2 reactor of FLNP, JINR

    International Nuclear Information System (INIS)

    Dmitriev, A.Yu.; Dmitriev, F.A.

    2015-01-01

    The 'Weight' software tool was created at FLNP JINR to automate the reading of analytical balance readouts and the saving of these values in the NAA database. An analytical balance connected to a personal computer is used to measure the weight values. The 'Weight' software tool controls the reading of weight values and the exchange of information with the NAA database, so that the weighing of large numbers of samples is reliably supported during high-volume neutron activation analysis.

  20. MITIE: Simultaneous RNA-Seq-based transcript identification and quantification in multiple samples.

    Science.gov (United States)

    Behr, Jonas; Kahles, André; Zhong, Yi; Sreedharan, Vipin T; Drewe, Philipp; Rätsch, Gunnar

    2013-10-15

    High-throughput sequencing of mRNA (RNA-Seq) has led to tremendous improvements in the detection of expressed genes and reconstruction of RNA transcripts. However, the extensive dynamic range of gene expression, technical limitations and biases, as well as the observed complexity of the transcriptional landscape, pose profound computational challenges for transcriptome reconstruction. We present the novel framework MITIE (Mixed Integer Transcript IdEntification) for simultaneous transcript reconstruction and quantification. We define a likelihood function based on the negative binomial distribution, use a regularization approach to select a few transcripts collectively explaining the observed read data and show how to find the optimal solution using Mixed Integer Programming. MITIE can (i) take advantage of known transcripts, (ii) reconstruct and quantify transcripts simultaneously in multiple samples, and (iii) resolve the location of multi-mapping reads. It is designed for genome- and assembly-based transcriptome reconstruction. We present an extensive study based on realistic simulated RNA-Seq data. When compared with state-of-the-art approaches, MITIE proves to be significantly more sensitive and overall more accurate. Moreover, MITIE yields substantial performance gains when used with multiple samples. We applied our system to 38 Drosophila melanogaster modENCODE RNA-Seq libraries and estimated the sensitivity of reconstructing omitted transcript annotations and the specificity with respect to annotated transcripts. Our results corroborate that a well-motivated objective paired with appropriate optimization techniques lead to significant improvements over the state-of-the-art in transcriptome reconstruction. MITIE is implemented in C++ and is available from http://bioweb.me/mitie under the GPL license.
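
    The core of the MIP formulation can be illustrated with a deliberately tiny toy: binary variables select transcripts, continuous variables carry their abundances, and a sparsity penalty trades fit against the number of selected transcripts. The sketch below uses the PuLP modeling library with L1 deviations in place of MITIE's negative binomial likelihood, and invented exon-coverage data; it is not MITIE's actual formulation.

```python
# Toy version of the MIP idea behind simultaneous transcript selection and
# quantification: choose a few candidate transcripts (binary z_t) with
# abundances (continuous x_t) so that predicted exon coverage matches the
# observed coverage, plus a sparsity penalty. Uses L1 deviations instead of
# a negative binomial likelihood; all data are invented.
import pulp

exon_cov = [30.0, 50.0, 20.0]          # observed coverage per exon
candidates = [[1, 1, 0],               # candidate transcripts as
              [0, 1, 1],               # exon-indicator rows (hypothetical)
              [1, 1, 1]]
T, E = len(candidates), len(exon_cov)
lam, big_m = 5.0, 1e4                  # sparsity weight, big-M bound

prob = pulp.LpProblem("transcript_selection", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{t}", lowBound=0) for t in range(T)]
z = [pulp.LpVariable(f"z{t}", cat="Binary") for t in range(T)]
dev = [pulp.LpVariable(f"d{e}", lowBound=0) for e in range(E)]

for t in range(T):                     # abundance only if selected
    prob += x[t] <= big_m * z[t]
for e in range(E):                     # |predicted - observed| <= dev
    pred = pulp.lpSum(candidates[t][e] * x[t] for t in range(T))
    prob += pred - exon_cov[e] <= dev[e]
    prob += exon_cov[e] - pred <= dev[e]

prob += pulp.lpSum(dev) + lam * pulp.lpSum(z)   # objective
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([(pulp.value(z[t]), pulp.value(x[t])) for t in range(T)])
```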

  1. Automated dispersive liquid-liquid microextraction coupled to high performance liquid chromatography - cold vapour atomic fluorescence spectroscopy for the determination of mercury species in natural water samples.

    Science.gov (United States)

    Liu, Yao-Min; Zhang, Feng-Ping; Jiao, Bao-Yu; Rao, Jin-Yu; Leng, Geng

    2017-04-14

    An automated, home-constructed, and low cost dispersive liquid-liquid microextraction (DLLME) device directly coupled to a high performance liquid chromatography (HPLC) - cold vapour atomic fluorescence spectroscopy (CVAFS) system was designed and developed for the determination of trace concentrations of methylmercury (MeHg⁺), ethylmercury (EtHg⁺) and inorganic mercury (Hg²⁺) in natural waters. With a simple, miniaturized and efficient automated DLLME system, nanogram amounts of these mercury species were extracted from natural water samples and injected into a hyphenated HPLC-CVAFS for quantification. The complete analytical procedure, including chelation, extraction, phase separation, collection and injection of the extracts, as well as HPLC-CVAFS quantification, was automated. Key parameters, such as the type and volume of the chelation, extraction and dispersive solvents, aspiration speed, sample pH, salt effect and matrix effect, were thoroughly investigated. Under the optimum conditions, the linear range was 10-1200 ng L⁻¹ for EtHg⁺ and 5-450 ng L⁻¹ for MeHg⁺ and Hg²⁺. Limits of detection were 3.0 ng L⁻¹ for EtHg⁺ and 1.5 ng L⁻¹ for MeHg⁺ and Hg²⁺. Reproducibility and recoveries were assessed by spiking three natural water samples with different Hg concentrations, giving recoveries of 88.4-96.1% and relative standard deviations <5.1%.

  2. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    Energy Technology Data Exchange (ETDEWEB)

    Kertesz, Vilmos [ORNL; Van Berkel, Gary J [ORNL

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.

  3. Water-quality assessment of south-central Texas : comparison of water quality in surface-water samples collected manually and by automated samplers

    Science.gov (United States)

    Ging, Patricia B.

    1999-01-01

    Surface-water sampling protocols of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program specify samples for most properties and constituents to be collected manually in equal-width increments across a stream channel and composited for analysis. Single-point sampling with an automated sampler (autosampler) during storms was proposed in the upper part of the South-Central Texas NAWQA study unit, raising the question of whether property and constituent concentrations from automatically collected samples differ significantly from those in samples collected manually. Statistical (Wilcoxon signed-rank test) analyses of 3 to 16 paired concentrations for each of 26 properties and constituents from water samples collected using both methods at eight sites in the upper part of the study unit indicated that there were no significant differences in concentrations for dissolved constituents, other than calcium and organic carbon.
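
    The paired comparison used in the study is a one-liner with SciPy; the sketch below runs a Wilcoxon signed-rank test on invented paired concentrations from the two collection methods.

```python
# Minimal sketch of the paired comparison used in the study: a Wilcoxon
# signed-rank test on property/constituent concentrations from manually
# collected vs. autosampler samples. Values below are invented.
from scipy.stats import wilcoxon

manual      = [12.1, 8.4, 15.0, 9.7, 11.2, 13.8, 7.9, 10.5]  # mg/L, manual EWI
autosampler = [11.8, 8.9, 14.6, 9.9, 11.0, 14.1, 8.2, 10.1]  # mg/L, single-point

stat, p = wilcoxon(manual, autosampler)
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}")
# p > 0.05 -> no evidence that the two collection methods differ
```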

  4. ReseqChip: Automated integration of multiple local context probe data from the MitoChip array in mitochondrial DNA sequence assembly

    Directory of Open Access Journals (Sweden)

    Spang Rainer

    2009-12-01

    Background: The Affymetrix MitoChip v2.0 is an oligonucleotide tiling array for the resequencing of the human mitochondrial (mt) genome. For each of 16,569 nucleotide positions of the mt genome it holds two sets of four 25-mer probes each that match the heavy and the light strand of a reference mt genome and vary only at their central position to interrogate all four possible alleles. In addition, the MitoChip v2.0 carries alternative local context probes to account for known mtDNA variants. These probes have been neglected in most studies due to the lack of software for their automated analysis. Results: We provide ReseqChip, a free software that automates the process of resequencing mtDNA using multiple local context probes on the MitoChip v2.0. ReseqChip significantly improves base call rate and sequence accuracy. ReseqChip is available at http://code.open-bio.org/svnweb/index.cgi/bioperl/browse/bioperl-live/trunk/Bio/Microarray/Tools/. Conclusions: ReseqChip allows for the automated consolidation of base calls from alternative local mt genome context probes. It thereby improves the accuracy of resequencing, while reducing the number of non-called bases.

  5. Identification of multiple mRNA and DNA sequences from small tissue samples isolated by laser-assisted microdissection.

    Science.gov (United States)

    Bernsen, M R; Dijkman, H B; de Vries, E; Figdor, C G; Ruiter, D J; Adema, G J; van Muijen, G N

    1998-10-01

    Molecular analysis of small tissue samples has become increasingly important in biomedical studies. Using a laser dissection microscope and modified nucleic acid isolation protocols, we demonstrate that multiple mRNA as well as DNA sequences can be identified from a single-cell sample. In addition, we show that the specificity of procurement of tissue samples is not compromised by smear contamination resulting from scraping of the microtome knife during sectioning of lesions. The procedures described herein thus allow for efficient RT-PCR or PCR analysis of multiple nucleic acid sequences from small tissue samples obtained by laser-assisted microdissection.

  6. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    Science.gov (United States)

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  7. Presence and significant determinants of cognitive impairment in a large sample of patients with multiple sclerosis.

    Directory of Open Access Journals (Sweden)

    Martina Borghi

    OBJECTIVES: To investigate the presence and the nature of cognitive impairment in a large sample of patients with Multiple Sclerosis (MS), and to identify clinical and demographic determinants of cognitive impairment in MS. METHODS: 303 patients with MS and 279 healthy controls were administered the Brief Repeatable Battery of Neuropsychological tests (BRB-N); measures of pre-morbid verbal competence and neuropsychiatric measures were also administered. RESULTS: Patients and healthy controls were matched for age, gender, education and pre-morbid verbal Intelligence Quotient. Patients presenting with cognitive impairment numbered 108/303 (35.6%). In the overall group of participants, the significant predictors of the most sensitive BRB-N scores were: presence of MS, age, education, and vocabulary. The significant predictors when considering MS patients only were: course of MS, age, education, vocabulary, and depression. Using logistic regression analyses, significant determinants of the presence of cognitive impairment in relapsing-remitting MS patients were: duration of illness (OR = 1.053, 95% CI = 1.010-1.097, p = 0.015), Expanded Disability Status Scale score (OR = 1.247, 95% CI = 1.024-1.517, p = 0.028), and vocabulary (OR = 0.960, 95% CI = 0.936-0.984, p = 0.001), while in the smaller group of progressive MS patients these predictors did not play a significant role in determining the cognitive outcome. CONCLUSIONS: Our results corroborate the evidence on the presence and the nature of cognitive impairment in a large sample of patients with MS. Furthermore, our findings identify significant clinical and demographic determinants of cognitive impairment in a large sample of MS patients for the first time. Implications for further research and clinical practice are discussed.

  8. SIG: Multiple Views on Safety-Critical Automation: Aircraft, Autonomous Vehicles, Air Traffic Management and Satellite Ground Segments Perspectives

    Science.gov (United States)

    Feary, Michael; Palanque, Philippe; Martinie, Célia; Tscheligi, Manfred

    2016-01-01

    This SIG focuses on the engineering of automation in interactive critical systems. Automation has already been studied in a number of (sub-)disciplines and application fields: design, human factors, psychology, (software) engineering, aviation, health care, games. One distinguishing feature of the area we are focusing on is that in the field of interactive critical systems, properties such as reliability, dependability, and fault tolerance are as important as usability, user experience, or overall acceptance issues. The SIG targets two problem areas: first, the engineering of user interaction with (partly) autonomous systems: how to design, build, and assess autonomous behavior, especially in cases where there is a need to represent both autonomous and interactive objects on the user interface. An example of such integration is the representation of an unmanned aerial vehicle (UAV) (with which no direct interaction is possible) together with aircraft (which have to be instructed by an air traffic controller to avoid the UAV). Second, the design and engineering of user interaction in general for autonomous objects/systems (for example a cruise control in a car or an autopilot in an aircraft). The goal of the SIG is to raise interest in the CHI community in the general aspects of automation and to identify a community of researchers and practitioners interested in these increasingly prominent issues of interfaces to (semi-)autonomous systems. The expected audience should be interested in addressing the issues of integrating these largely unconnected research domains to formulate a new joint research agenda.

  9. Multiple Views on Safety-Critical Automation: Aircraft, Autonomous Vehicles, Air Traffic Management and Satellite Ground Segments Perspectives

    Science.gov (United States)

    Feary, Michael S.; Palanque, Philippe Andre Rolan; Martinie, De Almeida; Tscheligi, Manfred

    2016-01-01

    This SIG focuses on the engineering of automation in interactive critical systems. Automation has already been studied in a number of (sub-)disciplines and application fields: design, human factors, psychology, (software) engineering, aviation, health care, games. One distinguishing feature of the area we are focusing on is that in the field of interactive critical systems, properties such as reliability, dependability, and fault tolerance are as important as usability, user experience, or overall acceptance issues. The SIG targets two problem areas: first, the engineering of user interaction with (partly) autonomous systems: how to design, build, and assess autonomous behavior, especially in cases where there is a need to represent both autonomous and interactive objects on the user interface. An example of such integration is the representation of an unmanned aerial vehicle (UAV) (with which no direct interaction is possible) together with aircraft (which have to be instructed by an air traffic controller to avoid the UAV). Second, the design and engineering of user interaction in general for autonomous objects/systems (for example a cruise control in a car or an autopilot in an aircraft). The goal of the SIG is to raise interest in the CHI community in the general aspects of automation and to identify a community of researchers and practitioners interested in these increasingly prominent issues of interfaces to (semi-)autonomous systems. The expected audience should be interested in addressing the issues of integrating these largely unconnected research domains to formulate a new joint research agenda.

  10. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    Science.gov (United States)

    Data are presented on the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry (GC/MS) for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  11. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs

    Directory of Open Access Journals (Sweden)

    Min-Kyu Kim

    2015-12-01

    This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4 bits after the first 12-bit A/D conversion, reducing the noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform the complex calculations required for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated in a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB.
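
    The 1/√N noise benefit of multiple sampling is easy to verify numerically; note that the reported reduction from 848.3 μV to 270.4 μV matches the theoretical factor for roughly ten samplings (848.3/√10 ≈ 268 μV). The sketch below uses the reported single-sample noise figure with invented sample counts.

```python
# Quick numerical check of the noise benefit of multiple sampling: averaging
# N readings of the same pixel value reduces random noise by ~1/sqrt(N).
# The single-sample RMS noise is taken from the abstract; sample counts are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
true_level = 1.000      # pixel output (arbitrary units)
sigma = 848.3e-6        # single-sample RMS noise in volts (848.3 uV)

for n_samp in (1, 4, 10, 16):
    readings = true_level + sigma * rng.standard_normal((100_000, n_samp))
    rms = readings.mean(axis=1).std()
    print(f"N={n_samp:2d}: measured RMS {1e6*rms:7.1f} uV "
          f"(theory {1e6*sigma/np.sqrt(n_samp):7.1f} uV)")
```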

  12. Automated Classification of Seedlings Using Computer Vision

    DEFF Research Database (Denmark)

    Dyrmann, Mads; Christiansen, Peter

    The objective of this project is to investigate the possibilities of recognizing plant species at multiple growth stages based on RGB images. Plants and leaves are initially segmented from a database through a partly automated procedure providing samples of 2438 plants and 4767 leaves distributed...

  13. Language profiles in young children with autism spectrum disorder: A community sample using multiple assessment instruments.

    Science.gov (United States)

    Nevill, Rose; Hedley, Darren; Uljarević, Mirko; Sahin, Ensu; Zadek, Johanna; Butter, Eric; Mulick, James A

    2017-11-01

    This study investigated language profiles in a community-based sample of 104 children aged 1-3 years who had been diagnosed with autism spectrum disorder using Diagnostic and Statistical Manual of Mental Disorders (5th ed.) diagnostic criteria. Language was assessed with the Mullen scales, Preschool Language Scale, fifth edition, and Vineland-II parent-report. The study aimed to determine whether the receptive-to-expressive language profile is independent from the assessment instrument used, and whether nonverbal cognition, early communicative behaviors, and autism spectrum disorder symptoms predict language scores. Receptive-to-expressive language profiles differed between assessment instruments and reporters, and Preschool Language Scale, fifth edition profiles were also dependent on developmental level. Nonverbal cognition and joint attention significantly predicted receptive language scores, and nonverbal cognition and frequency of vocalizations predicted expressive language scores. These findings support the administration of multiple direct assessment and parent-report instruments when evaluating language in young children with autism spectrum disorder, for both research and in clinical settings. Results also support that joint attention is a useful intervention target for improving receptive language skills in young children with autism spectrum disorder. Future research comparing language profiles of young children with autism spectrum disorder to children with non-autism spectrum disorder developmental delays and typical development will add to our knowledge of early language development in children with autism spectrum disorder.

  14. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    International Nuclear Information System (INIS)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane), the mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton donor solution (1 mol L⁻¹ CH₃COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, in the case of EA-DLLME the addition of a dispersive solvent, as well as the time-consuming centrifugation step for disruption of the cloudy state, is avoided. The phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at the wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on a stepwise injection analysis manifold in a flow batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.

  15. Accelerated solvent extraction (ASE) - a fast and automated technique with low solvent consumption for the extraction of solid samples (T12)

    International Nuclear Information System (INIS)

    Hoefler, F.

    2002-01-01

    Full text: Accelerated solvent extraction (ASE) is a modern extraction technique that significantly streamlines sample preparation. A common organic solvent, or water, is used as the extraction solvent at elevated temperature and pressure to increase extraction speed and efficiency. The entire extraction process is fully automated and performed within 15 minutes, with a solvent consumption of 18 ml for a 10 g sample. For many matrices and for a variety of solutes, ASE has proven to be equivalent or superior to sonication, Soxhlet, and reflux extraction techniques while requiring less time, solvent, and labor. ASE was first applied to the extraction of environmental hazards from solid matrices. Within a very short time, ASE was approved by the U.S. EPA for the extraction of BNAs, PAHs, PCBs, pesticides, herbicides, TPH, and dioxins from solid samples in method 3545. For the extraction of dioxins in particular, the extraction time with ASE is reduced to 20 minutes, in comparison to 18 h using Soxhlet. In food analysis, ASE is used for the extraction of pesticide and mycotoxin residues from fruits and vegetables, fat determination, and the extraction of vitamins. Time-consuming and solvent-intensive methods for the extraction of additives from polymers, as well as for the extraction of marker compounds from herbal supplements, can be performed with higher efficiency using ASE. For the analysis of chemical weapons, the extraction process and sample clean-up, including derivatization, can be automated and combined with GC-MS using an online ASE-APEC-GC system. (author)

  16. Manual versus automated streaking system in clinical microbiology laboratory: Performance evaluation of Previ Isola for blood culture and body fluid samples.

    Science.gov (United States)

    Choi, Qute; Kim, Hyun Jin; Kim, Jong Wan; Kwon, Gye Cheol; Koo, Sun Hoe

    2018-01-04

    The process of plate streaking has been automated to improve the routine workflow of clinical microbiology laboratories. Although there have been many evaluation reports on the inoculation of various body fluid samples, few evaluations have been reported for blood. In this study, we evaluated the performance of the automated inoculating system Previ Isola for routine clinical samples including blood. Blood culture, body fluid, and urine samples were collected. All samples were inoculated on both sheep blood agar plates (BAP) and MacConkey agar plates (MCK) using Previ Isola and the manual method. We compared the two methods in terms of quality and quantity of cultures, and sample processing time. To ensure objective colony counting, an enumeration reading reference was made through a preliminary experiment. A total of 377 nonduplicate samples (102 blood culture, 203 urine, 72 body fluid) were collected and inoculated. The concordance rate of quality was 100%, 97.0%, and 98.6% in blood, urine, and other body fluids, respectively. In the quantitative aspect, it was 98.0%, 97.0%, and 95.8%, respectively. Previ Isola took a little longer to inoculate specimens than the manual method, but hands-on time decreased dramatically, by about 6 minutes per 10 samples. We demonstrated that Previ Isola shows high concordance with the manual method in the inoculation of various body fluids, especially blood culture samples. The use of Previ Isola in clinical microbiology laboratories is expected to save considerable time and human resources.

  17. A Tool for Multiple Targeted Genome Deletions that Is Precise, Scar-Free, and Suitable for Automation.

    Directory of Open Access Journals (Sweden)

    Wayne Aubrey

    Full Text Available Many advances in synthetic biology require the removal of a large number of genomic elements from a genome. Most existing deletion methods leave behind markers, and as there are a limited number of markers, such methods can only be applied a fixed number of times. Deletion methods that recycle markers generally are either imprecise (they remove untargeted sequences) or leave scar sequences, which can cause genome instability and rearrangements. No existing marker recycling method is automation-friendly. We have developed a novel, openly available deletion tool that consists of: (1) a method for deleting genomic elements that can be repeatedly used without limit, is precise, scar-free, and suitable for automation; and (2) software to design the method's primers. Our tool is sequence agnostic and could be used to delete large numbers of coding sequences, promoter regions, transcription factor binding sites, terminators, etc., in a single genome. We have validated our tool on the deletion of non-essential open reading frames (ORFs) from S. cerevisiae. The tool is applicable to arbitrary genomes, and we provide primer sequences for the deletion of: 90% of the ORFs from the S. cerevisiae genome, 88% of the ORFs from the S. pombe genome, and 85% of the ORFs from the L. lactis genome.

  18. A replica exchange transition interface sampling method with multiple interface sets for investigating networks of rare events

    Science.gov (United States)

    Swenson, David W. H.; Bolhuis, Peter G.

    2014-07-01

    The multiple state transition interface sampling (TIS) framework in principle allows the simulation of a large network of complex rare event transitions, but in practice suffers from convergence problems. To improve convergence, we combine multiple state TIS [J. Rogal and P. G. Bolhuis, J. Chem. Phys. 129, 224107 (2008)] with replica exchange TIS [T. S. van Erp, Phys. Rev. Lett. 98, 268301 (2007)]. In addition, we introduce multiple interface sets, which allow more than one order parameter to be defined for each state. We illustrate the methodology on a model system of multiple independent dimers, each with two states. For reaction networks with up to 64 microstates, we determine the kinetics in the microcanonical ensemble, and discuss the convergence properties of the sampling scheme. For this model, we find that the kinetics depend on the instantaneous composition of the system. We explain this dependence in terms of the system's potential and kinetic energy.

  19. Sample size requirements for one-year treatment effects using deep gray matter volume from 3T MRI in progressive forms of multiple sclerosis.

    Science.gov (United States)

    Kim, Gloria; Chu, Renxin; Yousuf, Fawad; Tauhid, Shahamat; Stazzone, Lynn; Houtchens, Maria K; Stankiewicz, James M; Severson, Christopher; Kimbrough, Dorlan; Quintana, Francisco J; Chitnis, Tanuja; Weiner, Howard L; Healy, Brian C; Bakshi, Rohit

    2017-11-01

    The subcortical deep gray matter (DGM) develops selective, progressive, and clinically relevant atrophy in progressive forms of multiple sclerosis (PMS). This patient population is the target of active neurotherapeutic development, requiring the availability of outcome measures. We tested a fully automated MRI analysis pipeline to assess DGM atrophy in PMS. Consistent 3D T1-weighted high-resolution 3T brain MRI was obtained over one year in 19 consecutive patients with PMS [15 secondary progressive, 4 primary progressive, 53% women, age (mean±SD) 50.8±8.0 years, Expanded Disability Status Scale (median 5.0, range 2.0-6.5)]. DGM segmentation applied the fully automated FSL-FIRST pipeline ( http://fsl.fmrib.ox.ac.uk ). Total DGM volume was the sum of the caudate, putamen, globus pallidus, and thalamus. On-study change was calculated using a random-effects linear regression model. We detected one-year decreases in raw [mean (95% confidence interval): -0.749 ml (-1.455, -0.043), p = 0.039] and annualized [-0.754 ml/year (-1.492, -0.016), p = 0.046] total DGM volumes. A treatment trial of an intervention that would produce a 50% reduction in DGM brain atrophy would require a sample size of 123 patients for a single-arm study (one-year run-in followed by one year on treatment). For a two-arm placebo-controlled one-year study, 242 patients would be required per arm. The use of DGM fraction required more patients. The thalamus, putamen, and globus pallidus showed smaller effect sizes in their on-study changes than the total DGM; for the caudate, the effect sizes were somewhat larger. DGM atrophy may prove efficient as a short-term outcome for proof-of-concept neurotherapeutic trials in PMS.
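
    The sample size arithmetic above can be reproduced approximately with the standard normal-approximation formula, using only the numbers reported in the abstract. This is a minimal sketch, not the authors' random-effects pipeline, so it will not exactly match the published 123/242 figures; the standard deviation is backed out of the reported 95% confidence interval, and the power and significance settings are assumptions.

    ```python
    import math

    # Reported annualized DGM volume change (ml/year), 95% CI, and cohort size
    mean_change = -0.754
    ci_low, ci_high = -1.492, -0.016
    n_pilot = 19

    # Back out the SD from the CI half-width (normal approximation; the paper
    # used a random-effects model, so results will differ somewhat)
    se = (ci_high - ci_low) / (2 * 1.96)
    sd = se * math.sqrt(n_pilot)

    # Effect to detect: a treatment that halves the atrophy rate
    delta = abs(mean_change) * 0.5
    z_alpha, z_beta = 1.96, 0.8416  # two-sided 5% significance, 80% power

    # One-arm (run-in vs. on-treatment) design
    n_single_arm = (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2
    # Two-arm placebo-controlled design (per arm)
    n_per_arm = 2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2

    print(f"SD of change: {sd:.2f} ml/year")
    print(f"single-arm n: {math.ceil(n_single_arm)}")
    print(f"two-arm n per arm: {math.ceil(n_per_arm)}")
    ```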

  20. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps in the course of RIA, i.e., preparation of the sample for reaction. Two types of instrumentation were described: a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of reagent or dilution of sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  1. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization, with specific attention to the automation of optimization, is given.

  2. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    Science.gov (United States)

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

    Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO ® ChromaTOF ® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO ® ChromaTOF ® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
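
    As a rough illustration of the kind of data reduction OCTpy performs, the sketch below filters a pair of peak tables by fold-change in peak area between comparative samples. It is not the OCTpy code; the file names, CSV column labels ("Name", "Area"), and the fold-change threshold are all hypothetical.

    ```python
    import csv

    # Hypothetical ChromaTOF-style peak-table exports; the layout is an
    # assumption, not the actual OCTpy input format.
    def load_peaks(path):
        """Map analyte name -> peak area from a CSV export."""
        with open(path, newline="") as fh:
            return {row["Name"]: float(row["Area"]) for row in csv.DictReader(fh)}

    def compare(pre, post, fold=5.0):
        """Keep analytes whose peak area changed more than `fold` between samples."""
        formed, degraded = [], []
        for name in set(pre) | set(post):
            a_pre, a_post = pre.get(name, 0.0), post.get(name, 0.0)
            if a_post > fold * max(a_pre, 1e-12):
                formed.append(name)      # appeared or grew after treatment
            elif a_pre > fold * max(a_post, 1e-12):
                degraded.append(name)    # shrank or disappeared after treatment
        return formed, degraded

    formed, degraded = compare(load_peaks("soil_pre_bioremediation.csv"),
                               load_peaks("soil_post_bioremediation.csv"))
    print(f"{len(formed)} analytes formed, {len(degraded)} degraded")
    ```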

  3. Compensation Methods for Non-uniform and Incomplete Data Sampling in High Resolution PET with Multiple Scintillation Crystal Layers

    International Nuclear Information System (INIS)

    Lee, Jae Sung; Kim, Soo Mee; Lee, Dong Soo; Hong, Jong Hong; Sim, Kwang Souk; Rhee, June Tak

    2008-01-01

    To establish methods for sinogram formation and correction so that the filtered backprojection (FBP) reconstruction algorithm can be appropriately applied to data acquired using a PET scanner with multiple scintillation crystal layers. Methods for raw PET data storage and for converting list-mode data to histograms and sinograms were optimized. To solve the various problems that occurred while the raw histogram was converted into a sinogram, an optimal sampling strategy and a sampling efficiency correction method were investigated. Gap compensation methods unique to this system were also investigated. All the sinogram data were reconstructed using a 2D filtered backprojection algorithm and compared to assess the improvements from the correction algorithms. The optimal radial sampling interval and number of angular samples, in terms of the sampling theorem and the sampling efficiency correction algorithm, were pitch/2 and 120, respectively. By applying the sampling efficiency correction and gap compensation, artifacts and background noise in the reconstructed image could be reduced. A conversion method from histogram to sinogram was established for FBP reconstruction of data acquired using multiple scintillation crystal layers. This method will be useful for fast 2D reconstruction of multiple crystal layer PET data.

  4. Rare events via multiple reaction channels sampled by path replica exchange

    NARCIS (Netherlands)

    Bolhuis, P.G.

    2008-01-01

    Transition path sampling (TPS) was developed for studying activated processes in complex systems with an unknown reaction coordinate. Transition interface sampling (TIS) allows efficient evaluation of the rate constants. However, when the transition can occur via more than one reaction channel

  5. The gingival vein as a minimally traumatic site for multiple blood sampling in guinea pigs and hamsters.

    Science.gov (United States)

    Rodrigues, Mariana Valotta; de Castro, Simone Oliveira; de Albuquerque, Cynthia Zaccanini; Mattaraia, Vânia Gomes de Moura; Santoro, Marcelo Larami

    2017-01-01

    Laboratory animals are still necessary in scientific investigation and vaccine testing, but while novel methodological approaches are not yet available to replace them, the search for new, humane, easy, and painless methods is necessary to diminish their stress and pain. When multiple blood samples are to be collected from hamsters and guinea pigs, the scarcity of available venipuncture sites, which are greatly diminished in these species in comparison with other rodents due to the absence of a long tail, troubles animal caregivers and researchers. Thus, this study aimed to evaluate whether gingival vein puncture could be used as an additional route to obtain multiple blood samples from anesthetized hamsters and guinea pigs in such a way that animal behavior, well-being, or hematological parameters would not be altered. Twelve anesthetized Syrian golden hamsters and English guinea pigs were randomly allocated to two groups: a control group, whose blood samples were not collected, and an experimental group in which blood samples (200 microliters) were collected by gingival vein puncture at weekly intervals over six weeks. Clinical assessment, body weight gain, and complete blood cell counts were evaluated weekly, and control and experimental animals were euthanized at week seven, when the mentolabial region was processed for histological analyses. Multiple blood sampling from the gingival vein evoked no clinical manifestations or alterations in the behavioral repertoire, nor a statistically significant difference in weight gain, in either species. Guinea pigs showed no alteration in red blood cell, leukocyte, or platelet parameters over time. Hamsters developed a characteristic pattern of age-related physiological changes, which were considered normal. Histological analyses showed no difference in morphological structures in the interdental gingiva, confirming that multiple blood sampling is barely traumatic. Thus, these results evidence that blood collection from multiple

  6. A novel scheme for the validation of an automated classification method for epileptic spikes by comparison with multiple observers.

    Science.gov (United States)

    Sharma, Niraj K; Pedreira, Carlos; Centeno, Maria; Chaudhary, Umair J; Wehner, Tim; França, Lucas G S; Yadee, Tinonkorn; Murta, Teresa; Leite, Marco; Vos, Sjoerd B; Ourselin, Sebastien; Diehl, Beate; Lemieux, Louis

    2017-07-01

    To validate the application of an automated neuronal spike classification algorithm, Wave_clus (WC), on interictal epileptiform discharges (IED) obtained from human intracranial EEG (icEEG) data. Five 10-min segments of icEEG recorded in 5 patients were used. WC and three expert EEG reviewers independently classified one hundred IED events into IED classes or non-IEDs. First, we determined whether WC-human agreement variability falls within inter-reviewer agreement variability by calculating the variation of information for each classifier pair and quantifying the overlap between all WC-reviewer and all reviewer-reviewer pairs. Second, we compared WC and EEG reviewers' spike identification and individual spike class labels visually and quantitatively. The overlap between all WC-human pairs and all human pairs was >80% for 3/5 patients and >58% for the other 2 patients, demonstrating that WC falls within inter-human variation. The average sensitivity of spike marking was 91% for WC and >87% for all three EEG reviewers. Finally, there was a strong visual and quantitative similarity between WC and the EEG reviewers. WC performance is indistinguishable from that of the EEG reviewers, suggesting it could be a valid clinical tool for the assessment of IEDs. WC can be used to provide quantitative analysis of epileptic spikes. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
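
    The variation of information used above to compare classifier pairs has a compact closed form, VI(X, Y) = H(X) + H(Y) − 2I(X; Y). A minimal sketch with toy label vectors (not the study's data):

    ```python
    import numpy as np

    def variation_of_information(x, y):
        """VI(X, Y) = H(X) + H(Y) - 2 I(X; Y) for two label assignments."""
        x, y = np.asarray(x), np.asarray(y)
        n = len(x)
        # Joint distribution from the contingency table of the two classifications
        _, xi = np.unique(x, return_inverse=True)
        _, yi = np.unique(y, return_inverse=True)
        joint = np.zeros((xi.max() + 1, yi.max() + 1))
        np.add.at(joint, (xi, yi), 1.0)
        joint /= n
        px, py = joint.sum(axis=1), joint.sum(axis=0)

        def H(p):
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        mi = H(px) + H(py) - H(joint.ravel())  # I(X;Y) = H(X)+H(Y)-H(X,Y)
        return H(px) + H(py) - 2 * mi

    # Example: algorithm vs. reviewer labels (0 = non-IED, 1/2 = IED classes)
    algo     = [1, 1, 2, 0, 1, 2, 2, 0, 1, 0]
    reviewer = [1, 1, 2, 0, 1, 2, 0, 0, 1, 0]
    print(f"VI = {variation_of_information(algo, reviewer):.3f} nats")
    ```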

  7. Column-Parallel Single Slope ADC with Digital Correlated Multiple Sampling for Low Noise CMOS Image Sensors

    NARCIS (Netherlands)

    Chen, Y.; Theuwissen, A.J.P.; Chae, Y.

    2011-01-01

    This paper presents a low noise CMOS image sensor (CIS) using 10/12 bit configurable column-parallel single slope ADCs (SS-ADCs) and digital correlated multiple sampling (CMS). The sensor used is a conventional 4T active pixel with a pinned-photodiode as photon detector. The test sensor was

  8. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    Poellaenen, R.; Ilander, T.; Lehtinen, J.; Leppaenen, A.; Nikkinen, M.; Toivonen, H.; Ylaetalo, S.; Smartt, H.; Garcia, R.; Martinez, R.; Glidewell, D.; Krantz, K.

    1999-01-01

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-time radiation monitoring and sample authentication, whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to the headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system was reliable. (orig.)

  9. Automated rapid method for the determination of Ni-63 in deconstruction of nuclear facilities- and environmental samples; Automatisiertes Schnellverfahren zur Bestimmung von Ni-63 in Rueckbau- und Umweltproben

    Energy Technology Data Exchange (ETDEWEB)

    Bach, M.; Flucht, R.; Burow, M.; Zoriy, M.V. [Forschungszentrum Juelich GmbH, Juelich (Germany)

    2012-07-01

    Because of the increasing demand for the deconstruction of nuclear facilities in recent years, a rapid method for the determination of Ni-63 in environmental samples is required. At the Division of Safety and Radiation Protection, Forschungszentrum Juelich, we have run a routine method for the determination of Ni-63 in environmental samples based on extraction chromatography separation since 2000 [1], and have now optimized the sample preparation. Samples of different matrices were prepared by wet chemical methods for the extraction chromatography separation. For the separation, the samples were introduced either manually via a peristaltic pump or using the newly developed automated separation system equipped with a Ni-Resin extraction chromatography column. Nickel [2] forms a dimethylglyoxime (DMG) complex and can thus be separated. The activity of Ni-63 was then measured by liquid scintillation counting (LSC); the recovery was determined using MC-ICP-MS. The yield of the Ni carrier for the developed procedure was higher than 90%, with an accuracy below 10%. The developed method allows the relatively simple and fast determination of Ni-63 in environmental samples with a detection limit of 0.1 Bq/l. (orig.)

  10. The automated sample preparation system MixMaster for investigation of volatile organic compounds with mid-infrared evanescent wave spectroscopy.

    Science.gov (United States)

    Vogt, F; Karlowatz, M; Jakusch, M; Mizaikoff, B

    2003-04-01

    For efficient development, assessment, and calibration of new chemical analyzers, a large number of independently prepared samples of target analytes is necessary. Whereas mixing units for gas analysis are readily available, there is a lack of instrumentation for the accurate preparation of liquid samples containing volatile organic compounds (VOCs). Manual preparation of liquid samples containing VOCs at trace concentration levels is a particularly challenging and time-consuming task. Furthermore, regularly scheduled calibration of sensors and analyzer systems demands computer-controlled, automated sample preparation systems. In this paper we present a novel liquid mixing device enabling extensive measurement series focused on volatile organic compounds, facilitating the analysis of water polluted by traces of volatile hydrocarbons. After discussing the mixing system and control software, first results obtained by coupling with an FT-IR spectrometer are reported. Properties of the mixing system are assessed by mid-infrared attenuated total reflection (ATR) spectroscopy of methanol-acetone mixtures and by investigation of multicomponent samples containing volatile hydrocarbons such as 1,2,4-trichlorobenzene and tetrachloroethylene. The obtained ATR spectra are evaluated by principal component regression (PCR) algorithms. It is demonstrated that the presented sample mixing device provides reliable multicomponent mixtures with sufficient accuracy and reproducibility at trace concentration levels.
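
    Principal component regression of the kind used to evaluate the ATR spectra reduces each spectrum to a few principal-component scores and regresses concentration on those scores. A self-contained sketch with synthetic spectra (all data simulated, not from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for ATR absorbance spectra: 40 mixtures, 200 channels,
    # absorbance assumed linear in two component concentrations (Beer's law)
    n_samples, n_channels = 40, 200
    conc = rng.uniform(0, 1, size=(n_samples, 2))        # two analytes
    pure = np.abs(rng.normal(size=(2, n_channels)))      # pure-component spectra
    spectra = conc @ pure + 0.01 * rng.normal(size=(n_samples, n_channels))

    # PCR: project mean-centered spectra onto the first k principal components,
    # then regress the concentrations on the scores
    k = 3
    Xc = spectra - spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T

    A = np.column_stack([np.ones(n_samples), scores])
    coef, *_ = np.linalg.lstsq(A, conc, rcond=None)
    pred = A @ coef

    rmse = np.sqrt(np.mean((pred - conc) ** 2))
    print(f"PCR calibration RMSE: {rmse:.4f}")
    ```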

  11. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    Energy Technology Data Exchange (ETDEWEB)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Andruch, Vasil, E-mail: vasil.andruch@upjs.sk [Department of Analytical Chemistry, University of P.J. Šafárik, SK-04154 Košice (Slovakia); Moskvin, Leonid [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Bulatov, Andrey, E-mail: bulatov_andrey@mail.ru [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation)

    2016-01-01

    A first attempt to automate effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and a mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton donor solution (1 mol L⁻¹ CH₃COOH). Carbon dioxide microbubbles generated in situ disperse the extraction solvent throughout the aqueous sample and extract the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disruption of the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at 345 nm obeys Beer's law in the range of 1.5–100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. - Highlights: • First attempt to automate effervescence assisted dispersive liquid–liquid microextraction. • Automation based on a stepwise injection analysis manifold in a flow batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.

  12. EPA Method 3135.2I: Cyanide, Total and Amenable in Aqueous and Solid Samples Automated Colorimetric With Manual Digestion

    Science.gov (United States)

    This method describes procedures for the preparation and analysis of solid, water, and wipe samples for the detection and measurement of cyanide amenable to chlorination, using acid digestion and spectrophotometry.

  13. Role of NAA in characterizations of sampling behaviors of multiple elements in CRMs

    International Nuclear Information System (INIS)

    Tian Weizhi; Ni Bangfa; Wang Pingsheng; Nie Huiling

    1997-01-01

    Taking advantage of the high precision and accuracy of neutron activation analysis (NAA), sampling constants have been determined for multiple elements in several international and Chinese reference materials. The suggested technique may be used for identifying elements in existing CRMs qualified for quality control (QC) of small-size samples (several mg or less), and for characterizing the sampling behavior of multiple elements in new CRMs made specifically for QC of microanalysis

  14. Role of NAA in determination and characterisation of sampling behaviours of multiple elements in CRMs

    International Nuclear Information System (INIS)

    Tian Weizhi; Ni Bangfa; Wang Pingsheng; Nie Huiling

    2002-01-01

    Taking advantage of the high precision and accuracy of neutron activation analysis (NAA), sampling constants have been determined for multiple elements in several international and Chinese reference materials. The suggested technique may be used for identifying elements in existing CRMs qualified for quality control (QC) of small-size samples (several mg or less), and for characterizing the sampling behavior of multiple elements in new CRMs made specifically for QC of microanalysis. (author)
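
    One common way to express such a sampling constant is Ingamells' definition, Ks = R²·m, where R is the relative standard deviation (in percent) of replicate results and m the subsample mass; Ks is then the mass giving roughly 1% sampling uncertainty. Whether this is the exact definition used by the authors is an assumption, and the replicate values below are invented:

    ```python
    import numpy as np

    # Hypothetical replicate NAA results (mg/kg) at a fixed subsample mass;
    # the Ingamells form Ks = R^2 * m is assumed here, with R in percent
    # and m in grams.
    replicates = np.array([10.2, 9.8, 10.5, 9.6, 10.1, 10.4])
    mass_g = 0.005  # 5 mg subsamples

    rsd_percent = 100 * replicates.std(ddof=1) / replicates.mean()
    Ks = rsd_percent ** 2 * mass_g  # mass (g) for ~1% sampling uncertainty
    print(f"RSD = {rsd_percent:.1f}%  ->  Ks = {Ks:.3f} g")
    ```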

  15. Symmetry relationships for multiple scattering of polarized light in turbid spherical samples: theory and a Monte Carlo simulation.

    Science.gov (United States)

    Otsuki, Soichi

    2016-02-01

    This paper presents a theory describing totally incoherent multiple scattering by turbid spherical samples. It is proved that if reciprocity and mirror symmetry hold for single scattering by a particle, they also hold for multiple scattering in spherical samples. Monte Carlo simulations generate a reduced effective scattering Mueller matrix that virtually satisfies reciprocity and mirror symmetry. The scattering matrix was factorized, using the symmetric decomposition in a predefined form as well as the Lu-Chipman polar decomposition, approximately into a product of a pure depolarizer and vertically oriented linear retarding diattenuators. The parameters of these components were calculated as a function of the polar angle. While the turbid spherical sample acts as a pure depolarizer at low polar angles, it increasingly takes on the character of a retarding diattenuator as the polar angle grows.

  16. Linking and Psychological Functioning in a Chinese Sample: The Multiple Mediation of Response to Positive Affect

    Science.gov (United States)

    Yang, Hongfei; Li, Juan

    2016-01-01

    The present study examined the associations between linking, response to positive affect, and psychological functioning in Chinese college students. The results of conducting multiple mediation analyses indicated that emotion- and self-focused positive rumination mediated the relationship between linking and psychological functioning, whereas…

  17. Development of a novel concept for performing multiple assays on clinical samples using fluidic device

    DEFF Research Database (Denmark)

    Søe, Martin Jensen

    RNA-130a compared with conventional techniques in which DNA and LNA probes are used. In addition, multiple miRNAs could be detected by using sequential incubations of TSA reagent. This allowed the detection of two miRNAs and one protein on the same tissue section. The second model involved the design and construction

  18. Multiple double cross-section transmission electron microscope sample preparation of specific sub-10 nm diameter Si nanowire devices.

    Science.gov (United States)

    Gignac, Lynne M; Mittal, Surbhi; Bangsaruntip, Sarunya; Cohen, Guy M; Sleight, Jeffrey W

    2011-12-01

    The ability to prepare multiple cross-section transmission electron microscope (XTEM) samples from one XTEM sample of specific sub-10 nm features was demonstrated. Sub-10 nm diameter Si nanowire (NW) devices were initially cross-sectioned using a dual-beam focused ion beam system in a direction running parallel to the device channel. From this XTEM sample, both low- and high-resolution transmission electron microscope (TEM) images were obtained from six separate, specific site Si NW devices. The XTEM sample was then re-sectioned in four separate locations in a direction perpendicular to the device channel: 90° from the original XTEM sample direction. Three of the four XTEM samples were successfully sectioned in the gate region of the device. From these three samples, low- and high-resolution TEM images of the Si NW were taken and measurements of the NW diameters were obtained. This technique demonstrated the ability to obtain high-resolution TEM images in directions 90° from one another of multiple, specific sub-10 nm features that were spaced 1.1 μm apart.

  19. A sample of high multiplicity pp reactions at 19 GeV/c

    International Nuclear Information System (INIS)

    Allan, J.; Blomqvist, G.

    1976-02-01

    This report describes the experimental procedure used in obtaining samples of high multiplicity pp reactions at 19 GeV/c. Various methods to improve the quality of the samples are tested. The analysis is part of the general study of pp collisions at 19 GeV/c performed within the Scandinavian Bubble Chamber Collaboration. (Auth.)

  20. Variability and reliability of POP concentrations in multiple breast milk samples collected from the same mothers.

    Science.gov (United States)

    Kakimoto, Risa; Ichiba, Masayoshi; Matsumoto, Akiko; Nakai, Kunihiko; Tatsuta, Nozomi; Iwai-Shimada, Miyuki; Ishiyama, Momoko; Ryuda, Noriko; Someya, Takashi; Tokumoto, Ieyasu; Ueno, Daisuke

    2018-01-13

    Risk assessment of infants using realistic persistent organic pollutant (POP) exposure through breast milk is essential for devising future POP regulation. However, recent investigations have demonstrated that POP levels in breast milk collected from the same mother vary widely from day to day and month to month. To estimate the sample size of breast milk from the same mother needed to obtain reliable POP concentrations, breast milk samples were collected from five mothers living in Japan between 2006 and 2012. Milk samples from each mother were collected 3 to 6 times a day over 3 to 7 consecutive days. Food samples, obtained by the duplicate diet method, were collected from two mothers during the period of breast milk sample collection. All samples were analyzed for POPs (PCBs, DDTs, chlordanes, and HCB). PCB concentrations in the breast milk samples varied widely, with relative standard deviations (RSD) of up to 63% and 60% on a lipid and wet weight basis, respectively. The time course of these variations showed no typical pattern among the mothers. A larger PCB intake through food appeared to affect the lipid-basis concentrations in breast milk about 10 h later. Intraclass correlation coefficient (ICC) analyses indicated that at least two samples are required for good reproducibility of POP concentrations in breast milk, on both a lipid and a wet weight basis.
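
    The ICC used here can be computed from a one-way ANOVA of the repeated measurements, ICC(1,1) = (MSB − MSW)/(MSB + (k − 1)·MSW). A minimal sketch with invented concentrations (rows are mothers, columns are repeated samples):

    ```python
    import numpy as np

    # Hypothetical lipid-basis PCB concentrations (ng/g) in repeated milk samples
    data = np.array([
        [55.0, 62.0, 58.0, 60.0],
        [30.0, 28.0, 33.0, 31.0],
        [80.0, 91.0, 85.0, 88.0],
        [42.0, 40.0, 45.0, 41.0],
        [66.0, 70.0, 64.0, 69.0],
    ])
    n, k = data.shape

    grand = data.mean()
    row_means = data.mean(axis=1)

    # One-way ANOVA mean squares: between mothers and within mothers
    ms_between = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((data - row_means[:, None]) ** 2) / (n * (k - 1))

    # One-way random-effects ICC(1,1): reliability of a single sample
    icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    print(f"ICC(1,1) = {icc:.3f}")
    ```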

  1. Significant increase in cultivation of Gardnerella vaginalis, Alloscardovia omnicolens, Actinotignum schaalii, and Actinomyces spp. in urine samples with total laboratory automation.

    Science.gov (United States)

    Klein, Sabrina; Nurjadi, Dennis; Horner, Susanne; Heeg, Klaus; Zimmermann, Stefan; Burckhardt, Irene

    2018-04-13

    While total laboratory automation (TLA) is well established in laboratory medicine, only a few microbiology laboratories use TLA systems. Especially in terms of speed and accuracy, working with TLA is expected to be superior to conventional microbiology. We retrospectively compared a total of 35,564 urine cultures incubated and processed with and without the BD Kiestra TLA, over a 6-month period each. In the pre-TLA period, 16,338 urine samples were analyzed, and 19,226 with TLA. Sixty-two percent (n = 10,101/16,338) of the cultures processed without TLA and 68% (n = 13,102/19,226) of the cultures processed with TLA showed growth. There were significantly more samples with two or more species per sample and with low numbers of colony-forming units (CFU) after incubation with TLA. Regarding the type of bacteria, there were comparable amounts of Enterobacteriaceae in the samples, slightly fewer non-fermenting Gram-negative bacteria, but significantly more Gram-positive cocci and Gram-positive rods. Alloscardovia omnicolens, Gardnerella vaginalis, Actinomyces spp., and Actinotignum schaalii in particular were significantly more abundant in the samples incubated and processed with TLA. The time to report was significantly lower, by 1.5 h, in the TLA-processed samples. We provide the first report in Europe of a large number of urine samples processed with TLA. TLA showed enhanced growth of non-classical and rarely cultured bacteria from urine samples. Our findings suggest that previously underestimated bacteria may be relevant pathogens for urinary tract infections. Further studies are needed to confirm our findings.

  2. A New Capability for Automated Target Selection and Sampling for use with Remote Sensing Instruments on the MER Rovers

    Science.gov (United States)

    Castano, R.; Estlin, T.; Anderson, R. C.; Gaines, D.; Bornstein, B.; de Granville, C.; Tang, B.; Thompson, D.; Judd, M.

    2008-12-01

    The Onboard Autonomous Science Investigation System (OASIS) evaluates geologic data gathered by a planetary rover. The system is designed to operate onboard a rover, identifying and reacting to serendipitous science opportunities, such as rocks with novel properties. OASIS operates by analyzing data the rover gathers and then, using machine learning techniques, prioritizing the data based on criteria set by the science team. This prioritization can be used to organize data for transmission back to Earth, and it can be used to search for specific targets the science team has told it to find. If one of these targets is found, it is identified as a new science opportunity and a "science alert" is sent to a planning and scheduling system. After reviewing the rover's current operational status to ensure that it has enough resources to complete its traverse and act on the new science opportunity, OASIS can change the command sequence of the rover in order to obtain additional science measurements. Currently, OASIS is being applied on a new front: providing a new rover mission technology that enables targeted remote-sensing science in an automated fashion during or after rover traverses. At present, targets for remote-sensing instruments, especially narrow field-of-view instruments (such as the MER Mini-TES spectrometer or the 2009 MSL ChemCam spectrometer), must be selected manually based on imagery already on the ground with the operations team. OASIS will enable the rover flight software to analyze imagery onboard in order to autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. We are in the process of scheduling an onboard MER experiment to demonstrate the OASIS capability in early 2009.

  3. THE USE OF MULTIPLE DISPLACEMENT AMPLIFICATION TO INCREASE THE DETECTION AND GENOTYPING OF TRYPANOSOMA SPECIES SAMPLES IMMOBILISED ON FTA FILTERS

    Science.gov (United States)

    MORRISON, LIAM J.; McCORMACK, GILLIAN; SWEENEY, LINDSAY; LIKEUFACK, ANNE C. L.; TRUC, PHILIPPE; TURNER, C. MICHAEL; TAIT, ANDY; MacLEOD, ANNETTE

    2007-01-01

    Whole genome amplification methods are a recently developed tool for amplifying DNA from limited template. We report its application in trypanosome infections, characterised by low parasitaemias. Multiple Displacement Amplification (MDA) amplifies DNA with a simple in vitro step, and was evaluated on mouse blood samples on FTA filter cards with known numbers of Trypanosoma brucei parasites. The data showed a twenty-fold increase in the number of PCRs possible per sample, using primers diagnostic for the multi-copy ribosomal ITS region or 177 bp repeats, and a twenty-fold increase in sensitivity over nested PCR against a single copy microsatellite. Using MDA for microsatellite genotyping caused allele dropout at low DNA concentrations, which was overcome by pooling multiple MDA reactions. The validity of using MDA was established with samples from Human African Trypanosomiasis patients. The use of MDA allows maximal use of finite DNA samples and may prove a valuable tool in studies where multiple reactions are necessary, such as population genetic analyses. PMID:17556624

  4. Multiple surveys employing a new sample-processing protocol reveal the genetic diversity of placozoans in Japan.

    Science.gov (United States)

    Miyazawa, Hideyuki; Nakano, Hiroaki

    2018-03-01

    Placozoans, flat free-living marine invertebrates, possess an extremely simple bauplan lacking neurons and muscle cells and represent one of the earliest-branching metazoan phyla. They are widely distributed from temperate to tropical oceans. Based on mitochondrial 16S rRNA sequences, 19 haplotypes forming seven distinct clades have been reported in placozoans to date. In Japan, placozoans have been found at nine locations, but 16S genotyping has been performed at only two of these locations. Here, we propose a new processing protocol, "ethanol-treated substrate sampling," for collecting placozoans from natural environments. We also report the collection of placozoans from three new locations, the islands of Shikine-jima, Chichi-jima, and Haha-jima, and we present the distribution of the 16S haplotypes of placozoans in Japan. Multiple surveys conducted at multiple locations yielded five haplotypes that were not reported previously, revealing high genetic diversity in Japan, especially at Shimoda and Shikine-jima Island. The observed geographic distribution patterns were different among haplotypes; some were widely distributed, while others were sampled only from a single location. However, samplings conducted on different dates at the same sites yielded different haplotypes, suggesting that placozoans of a given haplotype do not inhabit the same site constantly throughout the year. Continued sampling efforts conducted during all seasons at multiple locations worldwide and the development of molecular markers within the haplotypes are needed to reveal the geographic distribution pattern and dispersal history of placozoans in greater detail.

  5. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  6. Communication: Multiple atomistic force fields in a single enhanced sampling simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hoang Viet, Man [Department of Physics, North Carolina State University, Raleigh, North Carolina 27695-8202 (United States); Derreumaux, Philippe, E-mail: philippe.derreumaux@ibpc.fr [Laboratoire de Biochimie Théorique, UPR 9080, CNRS, Université Denis Diderot, Sorbonne Paris Cité IBPC, 13 rue Pierre et Marie Curie, 75005 Paris (France); Institut Universitaire de France, 103 Bvd Saint-Germain, 75005 Paris (France); Nguyen, Phuong H., E-mail: phuong.nguyen@ibpc.fr [Laboratoire de Biochimie Théorique, UPR 9080, CNRS, Université Denis Diderot, Sorbonne Paris Cité IBPC, 13 rue Pierre et Marie Curie, 75005 Paris (France)

    2015-07-14

    The main concerns of biomolecular dynamics simulations are the convergence of the conformational sampling and the dependence of the results on the force fields. While the first issue can be addressed by employing enhanced sampling techniques such as simulated tempering or replica exchange molecular dynamics, repeating these simulations with different force fields is very time consuming. Here, we propose an automatic method that includes different force fields in a single advanced sampling simulation. Conformational sampling using three all-atom force fields is enhanced by simulated tempering, and by formulating the weight parameters of the simulated tempering method in terms of the energy fluctuations, the system is able to perform a random walk in both temperature and force field spaces. The method is first demonstrated on a 1D system and then validated by the folding of the 10-residue chignolin peptide in explicit water.
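
    A toy version of this scheme can be written for a single coordinate with two model potentials standing in for force fields: the simulation random-walks over (force field, temperature) states, with weights set from pre-run mean energies. This is only a sketch of the general simulated-tempering machinery; the potentials, the crude weight recipe, and all parameters are assumptions, not the authors' fluctuation-based formulation.

    ```python
    import math
    import random

    random.seed(1)

    # Toy stand-ins for two "force fields" acting on one coordinate
    def U_ff0(x): return x * x
    def U_ff1(x): return x * x + 0.5 * math.cos(3.0 * x)

    forcefields = [U_ff0, U_ff1]
    betas = [1.0, 0.5]  # two temperatures
    # The random walk runs over all (force field, temperature) combinations
    states = [(f, b) for f in range(2) for b in range(2)]

    def mean_energy(f, b, steps=5000):
        """Short pre-run estimating <U> for weight setting (a crude stand-in
        for the paper's fluctuation-based weights)."""
        x, acc = 0.0, 0.0
        beta, U = betas[b], forcefields[f]
        for _ in range(steps):
            y = x + random.gauss(0.0, 0.5)
            if random.random() < math.exp(min(0.0, -beta * (U(y) - U(x)))):
                x = y
            acc += U(x)
        return acc / steps

    # Weight g_m ~ beta_m * <U>_m so all states are visited comparably often
    g = [betas[b] * mean_energy(f, b) for (f, b) in states]

    x, m = 0.0, 0
    visits = [0] * len(states)
    for _ in range(20000):
        f, b = states[m]
        beta, U = betas[b], forcefields[f]
        # Metropolis move in configuration space at the current state
        y = x + random.gauss(0.0, 0.5)
        if random.random() < math.exp(min(0.0, -beta * (U(y) - U(x)))):
            x = y
        # Attempt a state move (new force field and/or temperature) at fixed x
        n = random.randrange(len(states))
        fn, bn = states[n]
        dlog = -betas[bn] * forcefields[fn](x) + beta * U(x) + g[n] - g[m]
        if random.random() < math.exp(min(0.0, dlog)):
            m = n
        visits[m] += 1

    for (f, b), v in zip(states, visits):
        print(f"force field {f}, beta {betas[b]}: {v / sum(visits):.2%} of steps")
    ```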

  7. Using multiple criteria for fingerprinting unknown oil samples having very similar chemical composition

    International Nuclear Information System (INIS)

    Wang, Z.; Fingas, M.F.; Sigouin, L.

    2002-01-01

    A study was conducted in which three mystery oil samples from Quebec were fingerprinted using a multi-criterion approach. The three objectives of the study were to determine the nature and type of product, to obtain the detailed hydrocarbon composition of the samples, and to determine whether the samples came from the same source. The product type was first determined by identifying the hydrocarbon distribution patterns. Polycyclic aromatic hydrocarbon (PAH) profiles were then compared, and the conclusions were verified by quantifying biomarkers and by determining several diagnostic ratios of source-specific marker compounds. Additives in the oil were also identified. The samples were analyzed using gas chromatography with flame ionization detection (GC-FID) and by gas chromatography-mass spectrometry (GC-MS). It was determined that the three oils were probably hydraulic-fluid-type oils. They were very pure and composed mostly of saturated hydrocarbons, with total aromatics making up 4 to 10 per cent of the total petroleum hydrocarbons. Although it was determined that the oils were mixtures of two different hydraulic fluids, there was no clear indication whether they had been weathered. The PAH concentration was very low, while the biomarker concentration was very high. Three unknown compounds (antioxidants) were positively identified. Two of the samples came from the same source. The third sample had a similar group hydrocarbon composition but was not identical in chemical composition and did not come from the same source. 34 refs., 3 tabs., 6 figs

  8. Quantification of Parvovirus B19 DNA Using COBAS AmpliPrep Automated Sample Preparation and LightCycler Real-Time PCR

    Science.gov (United States)

    Schorling, Stefan; Schalasta, Gunnar; Enders, Gisela; Zauke, Michael

    2004-01-01

    The COBAS AmpliPrep instrument (Roche Diagnostics GmbH, D-68305 Mannheim, Germany) automates the entire sample preparation process of nucleic acid isolation from serum or plasma for polymerase chain reaction analysis. We report the analytical performance of the LightCycler Parvovirus B19 Quantification Kit (Roche Diagnostics) using nucleic acids isolated with the COBAS AmpliPrep instrument. Nucleic acids were extracted using the Total Nucleic Acid Isolation Kit (Roche Diagnostics) and amplified with the LightCycler Parvovirus B19 Quantification Kit. The kit combination processes 72 samples per 8-hour shift. The lower detection limit is 234 IU/ml at a 95% hit-rate, linear range approximately 10⁴–10¹⁰ IU/ml, and overall precision 16 to 40%. Relative sensitivity and specificity in routine samples from pregnant women are 100% and 93%, respectively. Identification of a persistent parvovirus B19-infected individual by the polymerase chain reaction among 51 anti-parvovirus B19 IgM-negative samples underlines the importance of additional nucleic acid testing in pregnancy and its superiority to serology in identifying the risk of parvovirus B19 transmission via blood or blood products. Combination of the Total Nucleic Acid Isolation Kit on the COBAS AmpliPrep instrument with the LightCycler Parvovirus B19 Quantification Kit provides a reliable and time-saving tool for sensitive and accurate detection of parvovirus B19 DNA. PMID:14736825

  9. Double sampling with multiple imputation to answer large sample meta-research questions: Introduction and illustration by evaluating adherence to two simple CONSORT guidelines

    Directory of Open Access Journals (Sweden)

    Patrice L. Capers

    2015-03-01

    Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151). As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of
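
    The DS+MI idea can be sketched end to end on simulated data: a cheap heuristic label for every entry, gold-standard ratings for a random subsample, multiple imputation of the missing gold labels, and pooling by Rubin's rules. Everything below (accuracy of the heuristic, sample sizes, the simple two-probability imputation model) is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Simulated double sampling: a noisy heuristic label (RLOTHI-like) for
    # every entry, gold-standard ratings (RHITLO-like) for a random subsample
    N, n_sub, M = 20000, 500, 20
    true = rng.random(N) < 0.6                               # latent compliance
    heuristic = np.where(rng.random(N) < 0.9, true, ~true)   # 90%-accurate filter
    gold = np.full(N, -1)                                    # -1 = not human-rated
    sub = rng.choice(N, n_sub, replace=False)
    gold[sub] = true[sub]

    est, var = [], []
    for _ in range(M):
        # Bootstrap the subsample to propagate imputation-model uncertainty
        boot = rng.choice(sub, n_sub, replace=True)
        gb, hb = gold[boot], heuristic[boot]
        p1 = gb[hb].mean()      # P(gold = 1 | heuristic positive)
        p0 = gb[~hb].mean()     # P(gold = 1 | heuristic negative)
        completed = gold.astype(float)
        miss = gold == -1
        p = np.where(heuristic[miss], p1, p0)
        completed[miss] = rng.random(miss.sum()) < p         # draw missing labels
        q = completed.mean()
        est.append(q)
        var.append(q * (1 - q) / N)

    # Rubin's rules: pool the M completed-data estimates
    qbar = np.mean(est)
    T = np.mean(var) + (1 + 1 / M) * np.var(est, ddof=1)
    print(f"DS+MI estimate of compliance: {qbar:.3f} +/- {1.96 * np.sqrt(T):.3f}")
    print(f"simulated truth:              {true.mean():.3f}")
    ```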

  10. Time optimization of 90Sr measurements: Sequential measurement of multiple samples during ingrowth of 90Y

    International Nuclear Information System (INIS)

    Holmgren, Stina; Tovedal, Annika; Björnham, Oscar; Ramebäck, Henrik

    2016-01-01

    The aim of this paper is to contribute to a more rapid determination of a series of samples containing 90Sr by making the Cherenkov measurement of the daughter nuclide 90Y more time efficient. There are many instances when an optimization of the measurement method might be favorable, such as situations requiring rapid results in order to make urgent decisions or, on the other hand, maximizing the throughput of samples in a limited available time span. In order to minimize the total analysis time, a mathematical model was developed which calculates the time of ingrowth as well as individual measurement times for n samples in a series. This work is focused on the measurement of 90Y during ingrowth, after an initial chemical separation of strontium, in which it is assumed that no other radioactive strontium isotopes are present. By using a fixed minimum detectable activity (MDA) and iterating the measurement time for each consecutive sample, the total analysis time will be less than when using the same measurement time for all samples. It was found that by optimization, the total analysis time for 10 samples can be decreased greatly, from 21 h to 6.5 h, when assuming an MDA of 1 Bq/L and a background count rate of approximately 0.8 cpm. - Highlights: • An approach roughly a factor of three more efficient than an un-optimized method. • The optimization gives a more efficient use of instrument time. • The efficiency increase ranges from a factor of three to 10, for 10 to 40 samples.
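
    The ingrowth-aware scheduling can be sketched numerically: each sample's counting time is solved from a Currie-style MDA condition, and while one sample is counted the rest continue to grow in, so later samples need shorter counts. The half-life, counting efficiency, initial waiting time, and averaged-ingrowth approximation below are assumptions; only the 1 Bq/L target and 0.8 cpm background come from the abstract.

    ```python
    import math

    HL_Y90_H = 64.0          # 90Y half-life in hours (assumed value)
    LAM = math.log(2) / HL_Y90_H
    EFF = 0.4                # assumed Cherenkov counting efficiency
    BKG_CPM = 0.8            # background count rate from the abstract (cpm)
    MDA_BQ = 1.0             # target MDA: 1 Bq/L, assuming a 1 L sample

    def avg_ingrowth(t_wait, t_meas):
        """Mean 90Y/90Sr activity ratio over a count started t_wait hours
        after separation and lasting t_meas hours."""
        return 1.0 - math.exp(-LAM * t_wait) * (1.0 - math.exp(-LAM * t_meas)) / (LAM * t_meas)

    def required_count_time(t_wait):
        """Solve the Currie-style condition 2.71 + 4.65*sqrt(B) = A*eff*f*t
        for the counting time t (hours) by fixed-point iteration."""
        t = 1.0
        for _ in range(200):
            bkg_counts = BKG_CPM * 60.0 * t
            f = avg_ingrowth(t_wait, t)
            t_new = (2.71 + 4.65 * math.sqrt(bkg_counts)) / (MDA_BQ * EFF * f * 3600.0)
            if abs(t_new - t) < 1e-6:
                break
            t = t_new
        return t

    # Sequential measurement: while sample i is counted, later samples keep
    # growing in, so each successive sample needs a shorter count.
    clock, total = 1.0, 0.0          # first count starts 1 h after separation
    for i in range(1, 11):
        tm = required_count_time(clock)
        print(f"sample {i:2d}: start {clock:5.1f} h after separation, count {60 * tm:5.1f} min")
        clock += tm
        total += tm
    print(f"total counting time: {total:.1f} h")
    ```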

  11. Addressing small sample size bias in multiple-biomarker trials: Inclusion of biomarker-negative patients and Firth correction.

    Science.gov (United States)

    Habermehl, Christina; Benner, Axel; Kopp-Schneider, Annette

    2018-03-01

    In recent years, numerous approaches for biomarker-based clinical trials have been developed. One of these developments are multiple-biomarker trials, which aim to investigate multiple biomarkers simultaneously in independent subtrials. For low-prevalence biomarkers, small sample sizes within the subtrials have to be expected, as well as many biomarker-negative patients at the screening stage. The small sample sizes may make it unfeasible to analyze the subtrials individually. This imposes the need to develop new approaches for the analysis of such trials. With an expected large group of biomarker-negative patients, it seems reasonable to explore options to benefit from including them in such trials. We consider advantages and disadvantages of the inclusion of biomarker-negative patients in a multiple-biomarker trial with a survival endpoint. We discuss design options that include biomarker-negative patients in the study and address the issue of small sample size bias in such trials. We carry out a simulation study for a design where biomarker-negative patients are kept in the study and are treated with standard of care. We compare three different analysis approaches based on the Cox model to examine if the inclusion of biomarker-negative patients can provide a benefit with respect to bias and variance of the treatment effect estimates. We apply the Firth correction to reduce the small sample size bias. The results of the simulation study suggest that for small sample situations, the Firth correction should be applied to adjust for the small sample size bias. Additional to the Firth penalty, the inclusion of biomarker-negative patients in the analysis can lead to further but small improvements in bias and standard deviation of the estimates. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Evaluation of Multiple-Sampling Function used with a Microtek flatbed scanner for Radiation Dosimetry Calibration of EBT2 Film

    International Nuclear Information System (INIS)

    Chang, Liyun; Ho, Sheng-Yow; Ding, Hueisch-Jy; Hwang, Ing-Ming; Chen, Pang-Yu; Lee, Tsair-Fwu

    2016-01-01

    The radiochromic EBT2 film is a widely used quality assurance device for radiation therapy. This study evaluated the film calibration performance of the multiple-sampling function, a feature of the ScanWizard Pro scanning software provided by the manufacturer, when used with the Microtek 9800XL plus (9800XL+) flatbed scanner. Using the PDD method, each of eight EBT2 films, four irradiated with 290 monitor units (MU) and four with 88 MU via 6-MV photon beams, was tightly sandwiched in a 30³-cm³ water-equivalent polystyrene phantom prior to irradiation. Before and after irradiation, all films were scanned using the Microtek 9800XL+ scanner with five different modes of the multiple-sampling function, which generates an image averaged over multiple sampling passes. The net optical densities (netOD) on the beam central axis of each film were assigned to the corresponding depth doses for calibration. For each sampling mode and either delivered MU, the depth-dose uncertainty of a single film from repeated scans and that of a single scan of the four films were analyzed. Finally, the calibration error and the combined calibration uncertainty between film-determined depth doses and delivered depth doses were calculated and evaluated for each sampling mode. All standard deviations and the calibration error were shown to be unrelated to the number of sampling lines. The calibration error of the 2-line and 16-line modes was within 3 cGy and better than that of the other modes. The combined uncertainty of the 2-line mode was the lowest, generally less than 6 cGy except for delivered doses around 100 cGy. The evaluation described herein revealed that EBT2 film calibrated with the 2-line mode has relatively lower error, scanning time, and combined uncertainty. It is therefore recommended for routine EBT2 film calibration and verification of treatment plans.
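
    The netOD calibration step itself is compact: netOD = log10(PVbefore/PVafter) per dose point, followed by a fit of an assumed response function. The pixel values and the D = a·netOD + b·netODⁿ form below are illustrative choices, not the study's data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical scanner pixel values before/after irradiation; doses in cGy
    doses = np.array([0., 25., 50., 100., 150., 200., 250., 290.])
    pv_before = np.full_like(doses, 42000.0)
    pv_after = np.array([42000., 39000., 36500., 33000.,
                         30500., 28600., 27000., 25800.])

    # Net optical density from the scanned transmission values
    netOD = np.log10(pv_before / pv_after)

    # A commonly assumed EBT-type calibration form: D = a*netOD + b*netOD^n
    def cal(x, a, b, n):
        return a * x + b * np.power(x, n)

    popt, _ = curve_fit(cal, netOD, doses, p0=[1000.0, 3000.0, 2.0],
                        bounds=(0, np.inf))
    resid = doses - cal(netOD, *popt)
    print("fit parameters (a, b, n):", np.round(popt, 2))
    print(f"max residual: {np.abs(resid).max():.1f} cGy")
    ```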

  13. Simultaneous analysis of organochlorinated pesticides (OCPs) and polychlorinated biphenyls (PCBs) from marine samples using automated pressurized liquid extraction (PLE) and Power Prep™ clean-up.

    Science.gov (United States)

    Helaleh, Murad I H; Al-Rashdan, Amal; Ibtisam, A

    2012-05-30

    An automated pressurized liquid extraction (PLE) method followed by Power Prep™ clean-up was developed for organochlorinated pesticide (OCP) and polychlorinated biphenyl (PCB) analysis in environmental marine samples of fish, squid, bivalves, shells, octopus and shrimp. OCPs and PCBs were simultaneously determined in a single chromatographic run using gas chromatography-mass spectrometry-negative chemical ionization (GC-MS-NCI). About 5 g of each biological marine sample was mixed with anhydrous sodium sulphate and placed in the extraction cell of the PLE system. PLE is controlled by means of a PC using DMS 6000 software. Purification of the extract was accomplished using automated Power Prep™ clean-up with a pre-packed disposable silica column (6 g) supplied by Fluid Management Systems (FMS). All OCPs and PCBs were eluted from the silica column using two types of solvent: 80 mL of hexane and a 50 mL mixture of hexane and dichloromethane (1:1). A wide variety of fish and shellfish were collected from the fish market and analyzed using this method. The total PCB concentrations were 2.53, 0.25, 0.24, 0.24, 0.17 and 1.38 ng g(-1) (w/w) for fish, squid, bivalves, shells, octopus and shrimp, respectively, and the corresponding total OCP concentrations were 30.47, 2.86, 0.92, 10.72, 5.13 and 18.39 ng g(-1) (w/w). Lipids were removed using an SX-3 Bio-Beads gel permeation chromatography (GPC) column. Analytical criteria such as recovery, reproducibility and repeatability were evaluated through a range of biological matrices. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Data-driven soft sensor design with multiple-rate sampled data

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Knudsen, Jørgen K.H.

    2007-01-01

    Multi-rate systems are common in industrial processes where quality measurements have a slower sampling rate than other process variables. Since inter-sample information is desirable for effective quality control, different approaches have been reported to estimate the quality between samples, including numerical interpolation, polynomial transformation, data lifting, and weighted partial least squares (WPLS). Two modifications to the original data lifting approach are proposed in this paper: reformulating the extraction of a fast model as an optimization problem, and ensuring the desired model properties through Tikhonov regularization. A comparative investigation of the four approaches is performed in this paper. Their applicability, accuracy, and robustness to process noise are evaluated on a single-input single-output (SISO) system. The regularized data lifting and WPLS approaches
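
    As a static stand-in for the regularized fast-model extraction, the sketch below fits a Tikhonov-regularized (ridge) least-squares model on slow-rate quality samples and then predicts the quality at every fast-rate instant. The data, model order, and regularization weight are all invented; the real data-lifting formulation is dynamic.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Fast-rate process data x(t) at every step; quality y only every r-th step
    r, T = 5, 400
    x = rng.normal(size=(T, 3))
    w_true = np.array([0.8, -0.5, 0.3])
    y_fast = x @ w_true + 0.05 * rng.normal(size=T)
    y_slow = y_fast[::r]                      # slow-rate quality measurements
    X_slow = x[::r]

    # Tikhonov-regularized least squares: w = argmin ||Xw - y||^2 + lam*||w||^2
    lam = 1.0
    A = X_slow.T @ X_slow + lam * np.eye(3)
    w_hat = np.linalg.solve(A, X_slow.T @ y_slow)

    # Soft sensor: predict quality between the slow samples from fast-rate data
    y_pred = x @ w_hat
    rmse = np.sqrt(np.mean((y_pred - y_fast) ** 2))
    print("estimated weights:", np.round(w_hat, 3))
    print(f"inter-sample RMSE: {rmse:.3f}")
    ```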

  15. Multiple stage MS in analysis of plasma, serum, urine and in vitro samples relevant to clinical and forensic toxicology.

    Science.gov (United States)

    Meyer, Golo M; Maurer, Hans H; Meyer, Markus R

    2016-01-01

    This paper reviews MS approaches applied to metabolism studies, structure elucidation and qualitative or quantitative screening of drugs (of abuse) and/or their metabolites. Applications in clinical and forensic toxicology were included using blood plasma or serum, urine, in vitro samples, liquids, solids or plant material. Techniques covered are liquid chromatography coupled to low-resolution and high-resolution multiple stage mass analyzers. Only PubMed listed studies published in English between January 2008 and January 2015 were considered. Approaches are discussed focusing on sample preparation and mass spectral settings. Comments on advantages and limitations of these techniques complete the review.

  16. A high-pressure thermal gradient block for investigating microbial activity in multiple deep-sea samples

    DEFF Research Database (Denmark)

    Kallmeyer, J.; Ferdelman, TG; Jansen, KH

    2003-01-01

    Details about the construction and use of a high-pressure thermal gradient block for the simultaneous incubation of multiple samples are presented. Most parts used are moderately priced off-the-shelf components that are easily obtainable. In order to keep the pressure independent of thermal expansion.... Sulfate reduction rates increase with increasing pressure and show maximum values at pressures higher than in situ. (C) 2003 Elsevier Science B.V. All rights reserved.

  17. Evaluation of two automated enzyme-immunoassays for detection of thermophilic campylobacters in faecal samples from cattle and swine

    DEFF Research Database (Denmark)

    Hoorfar, Jeffrey; Nielsen, E.M.; Stryhn, H.

    1999-01-01

    We evaluated the performance of two enzyme-immunoassays (EIA) for the detection of naturally occurring, thermophilic Campylobacter spp. found in faecal samples from cattle (n = 21 and n = 26) and swine (n = 43) relative to the standard culture method, and also assuming that none of the tests...

  18. Isotope Enrichment Detection by Laser Ablation - Laser Absorption Spectrometry: Automated Environmental Sampling and Laser-Based Analysis for HEU Detection

    International Nuclear Information System (INIS)

    Anheier, Norman C.; Bushaw, Bruce A.

    2010-01-01

    The global expansion of nuclear power, and consequently the uranium enrichment industry, requires the development of new safeguards technology to mitigate proliferation risks. Current enrichment monitoring instruments provide only yes/no detection of highly enriched uranium (HEU) production. More accurate accountancy measurements are typically restricted to gamma-ray and weight measurements taken in cylinder storage yards. Analysis of environmental and cylinder content samples has much higher effectiveness, but this approach requires onsite sampling, shipping, and time-consuming laboratory analysis and reporting. Given that large modern gaseous centrifuge enrichment plants (GCEPs) can quickly produce a significant quantity (SQ) of HEU, these limitations in verification suggest the need for more timely detection of potential facility misuse. The Pacific Northwest National Laboratory (PNNL) is developing an unattended safeguards instrument concept, combining continuous aerosol particulate collection with uranium isotope assay, to provide timely analysis of enrichment levels within low enriched uranium facilities. This approach is based on laser vaporization of aerosol particulate samples, followed by wavelength-tuned laser diode spectroscopy to characterize the uranium isotopic ratio through subtle differences in atomic absorption wavelengths. Environmental sampling (ES) media from an integrated aerosol collector is introduced into a small, reduced-pressure chamber, where a focused pulsed laser vaporizes material from a 10 to 20 μm diameter spot on the surface of the sampling media. The plume of ejected material begins as high-temperature plasma that yields ions and atoms, as well as molecules and molecular ions. We concentrate on the plume of atomic vapor that remains after the plasma has expanded and then been cooled by the surrounding cover gas. Tunable diode lasers are directed through this plume and each isotope is detected by monitoring absorbance...
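    As a rough illustration of the measurement principle, if one assumes (hypothetically) equal line strengths and path lengths for the two isotopic absorption lines, the 235U atom fraction follows directly from the ratio of integrated absorbances via Beer-Lambert reasoning. The numbers below are invented, not instrument data.

```python
# Hedged sketch: estimating a 235U atom fraction from two absorbance
# measurements, assuming absorbance is proportional to number density
# with identical line strengths and path lengths (a simplification).
def enrichment_from_absorbance(a_235, a_238):
    """Atom fraction of 235U under the equal-line-strength assumption."""
    return a_235 / (a_235 + a_238)

# Hypothetical integrated absorbances from tunable-diode-laser scans
print(f"{enrichment_from_absorbance(0.072, 9.928):.3%}")  # ~0.72%, near natural U
```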

  19. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
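    For background, the way clustering inflates a required sample size is usually captured by the design effect 1 + (m - 1)ρ, where m is the cluster size and ρ the intraclass correlation. The sketch below shows only this generic cluster-randomization calculation; it is not the paper's SMART-specific calculator, and the inputs are illustrative.

```python
# Generic cluster-randomized sample size inflation via the design effect.
# Not the paper's SMART derivation; numbers are illustrative only.
import math

def clusters_needed(n_individual, m_per_cluster, icc):
    """Clusters per arm after inflating an individually randomized sample
    size n_individual by the design effect 1 + (m - 1) * icc."""
    design_effect = 1 + (m_per_cluster - 1) * icc
    return math.ceil(n_individual * design_effect / m_per_cluster)

print(clusters_needed(n_individual=128, m_per_cluster=20, icc=0.05))  # -> 13
```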

  20. Serum sample containing endogenous antibodies interfering with multiple hormone immunoassays. Laboratory strategies to detect interference

    Directory of Open Access Journals (Sweden)

    Elena García-González

    2016-04-01

    Full Text Available Objectives: Endogenous antibodies (EA) may interfere with immunoassays, causing erroneous results for hormone analyses. As this interference arises (in most cases) from the assay format, and most immunoassays, even from different manufacturers, are constructed in a similar way, it is possible for a single type of EA to interfere with different immunoassays. Here we describe the case of a patient whose serum sample contained EA that interfered with several hormone tests. We also discuss the strategies deployed to detect the interference. Subjects and methods: Over a period of four years, a 30-year-old man was subjected to a plethora of laboratory and imaging diagnostic procedures as a consequence of elevated hormone results, mainly of pituitary origin, which did not correlate with the overall clinical picture. Results: Once analytical interference was suspected, the best laboratory approaches to investigate it were sample reanalysis on an alternative platform and sample incubation with antibody-blocking tubes. Construction of an in-house ‘nonsense’ sandwich assay was also a valuable strategy to confirm interference. In contrast, serial sample dilutions were of no value in our case, while polyethylene glycol (PEG) precipitation gave inconclusive results, probably due to the use of inappropriate PEG concentrations for several of the tests assayed. Conclusions: Clinicians and laboratorians must be aware of the drawbacks of immunometric assays, and alert to the possibility of EA interference when results do not fit the clinical pattern. Keywords: Endogenous antibodies, Immunoassay, Interference, Pituitary hormones, Case report

  1. Dependability of Data Derived from Time Sampling Methods with Multiple Observation Targets

    Science.gov (United States)

    Johnson, Austin H.; Chafouleas, Sandra M.; Briesch, Amy M.

    2017-01-01

    In this study, generalizability theory was used to examine the extent to which (a) time-sampling methodology, (b) number of simultaneous behavior targets, and (c) individual raters influenced variance in ratings of academic engagement for an elementary-aged student. Ten graduate-student raters, with an average of 7.20 hr of previous training in…

  2. Multiplicity and contiguity of ablation mechanisms in laser-assisted analytical micro-sampling

    International Nuclear Information System (INIS)

    Bleiner, Davide; Bogaerts, Annemie

    2006-01-01

    Laser ablation is implemented in several scientific and technological fields, and serves as a rapid sample introduction technique in elemental and trace analysis. At high laser fluence, the ejection of micro-sized droplets enhances the surface recession speed, degrades depth resolution and alters the sampling stoichiometry. The origin of such large particles appears to lie in at least two different processes, phase explosion and melt splashing. Experimental evidence for both was found in metallic matrices, whereas non-metallic samples showed more complex phenomena such as cracking. The spatial distribution of the beam energy profile is responsible for significant differences in the ablation mechanism across the irradiated region and for heterogeneous sampling. Under a Gaussian irradiance distribution, the center of the crater, where the irradiance is highest, experienced fast heating with rapid ejection of a mixture of particles and vapor (spinodal breakdown). The crater periphery was subjected to more modest irradiation, with melt mobilization and wall formation. The overall resulting particle size distribution was composed of an abundant nano-sized fraction, produced by vapor condensation, and a micro-sized fraction produced during melt expulsion.

  3. Feasibility Study of Neutron Multiplicity Assay for a Heterogeneous Sludge Sample containing Na, Pu and other Impurities

    International Nuclear Information System (INIS)

    Nakamura, H.; Nakamichi, H.; Mukai, Y.; Yoshimoto, K.; Beddingfield, D.H.

    2010-01-01

    To reduce the radioactivity of liquid waste generated at PCDF, a neutralization-precipitation process of radioactive nuclides by sodium hydroxide is used. We call the precipitate, after calcination, a 'sludge'. The Pu mass in the sludge is normally determined by sampling and DA within the required uncertainty on DIQ. The annual yield of the mass is small, but it accumulates and reaches a few kilograms, so it is declared as retained waste and verified at PIV. An HM-5-based verification is applied for sludge verification. The sludge contains many chemical components: for example, Pu (~10 wt%), U, Am, SUS components, halogens, NaNO3 (the main component), residual NaOH, and moisture. They are mixed together as an impure heterogeneous sludge sample. As a result, there is a large uncertainty in the sampling and DA that is currently used at PCDF. In order to improve the material accounting, we performed a feasibility study using neutron multiplicity assay for impure sludge samples. We have measured selected sludge samples using a multiplicity counter called FCAS (Fast Carton Assay System), which was designed by JAEA and Canberra. The PCDF sludge materials fall into the category of 'difficult to measure' because of the high levels of impurities, high alpha values and somewhat small Pu mass. For the sludge measurements, it was confirmed that good consistency between the Pu mass in a pure sludge standard (PuO2-Na2U2O7, alpha = 7) and the DA could be obtained. For unknown samples, using 14-hour measurements, we could obtain quite low statistical uncertainty on the Doubles (~1%) and Triples (~10%) count rates, although the alpha value was extremely high (15-25) and the FCAS efficiency (40%) was relatively low for typical multiplicity counters. Despite the detector efficiency challenges and the material challenges (high alpha, low Pu mass, heterogeneous matrix), we have been able to obtain assay results that greatly exceed the accountancy requirements for retained waste materials. We have...

  4. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    Science.gov (United States)

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that the permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliabilities, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.
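    One standard way to reason about the reported link between power and positive predictive value (though not the resampling procedure this study itself used) is the pre-study-odds formula PPV = power·R / (power·R + α). A minimal sketch, with an assumed prior odds R:

```python
# Hedged illustration of why low power depresses PPV, via the generic
# pre-study-odds formula. The prior odds are an assumption, and this is
# not the study's empirical subsampling computation.
def ppv(power, alpha=0.05, prior_odds=0.25):
    """Probability a significant finding is true, given power and prior odds."""
    return power * prior_odds / (power * prior_odds + alpha)

for power in (0.02, 0.30, 0.80):   # very low power (small n), medium, high
    print(f"power={power:.2f} -> PPV={ppv(power):.2f}")
```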

  5. Rapid and automated on-line solid phase extraction HPLC-MS/MS with peak focusing for the determination of ochratoxin A in wine samples.

    Science.gov (United States)

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca

    2018-04-01

    This study reports a fast and automated analytical procedure based on an on-line SPE-HPLC-MS/MS method for the automatic pre-concentration, clean-up and sensitive determination of OTA in wine. The amount of OTA contained in 100 μL of sample (pH ≅ 5.5) was retained and concentrated on an Oasis MAX SPE cartridge. After a washing step to remove matrix interferents, the analyte was eluted in back-flush mode, and the eluent from the SPE column was diluted through a mixing tee with an aqueous solution before the chromatographic separation, achieved on a monolithic column. The developed method has been validated according to EU regulation No. 519/2014 and applied to the analysis of 41 red and 17 white wines. The developed method features minimal sample handling, low solvent consumption, high sample throughput and low analysis cost, and provides accurate and highly selective results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. A Large-Sample Test of a Semi-Automated Clavicle Search Engine to Assist Skeletal Identification by Radiograph Comparison.

    Science.gov (United States)

    D'Alonzo, Susan S; Guyomarc'h, Pierre; Byrd, John E; Stephan, Carl N

    2017-01-01

    In 2014, a morphometric capability to search chest radiograph databases by quantified clavicle shape was published to assist skeletal identification. Here, we extend the validation tests conducted by increasing the search universe 18-fold, from 409 to 7361 individuals to determine whether there is any associated decrease in performance under these more challenging circumstances. The number of trials and analysts were also increased, respectively, from 17 to 30 skeletons, and two to four examiners. Elliptical Fourier analysis was conducted on clavicles from each skeleton by each analyst (shadowgrams trimmed from scratch in every instance) and compared to the search universe. Correctly matching individuals were found in shortlists of 10% of the sample 70% of the time. This rate is similar to, although slightly lower than, rates previously found for much smaller samples (80%). Accuracy and reliability are thereby maintained, even when the comparison system is challenged by much larger search universes. © 2016 American Academy of Forensic Sciences.
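    A much-simplified sketch of shape matching by Fourier descriptors, in the spirit of the elliptical Fourier analysis described above: it uses a plain complex-contour FFT rather than the full elliptical Fourier machinery, and the outlines are synthetic, not clavicle shadowgrams.

```python
# Simplified Fourier-descriptor shape comparison (not full elliptical
# Fourier analysis). Outlines are synthetic closed curves.
import numpy as np

def fourier_descriptor(contour_xy, n_harmonics=10):
    """Translation/scale/rotation-invariant descriptor of a closed outline."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                            # drop centroid: translation invariant
    coeffs = coeffs / np.abs(coeffs[1])        # normalize: scale invariant
    return np.abs(coeffs[1:n_harmonics + 1])   # magnitudes: rotation invariant

def shape_distance(c1, c2):
    return np.linalg.norm(fourier_descriptor(c1) - fourier_descriptor(c2))

# Two synthetic closed outlines sampled at 128 points
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])
blob = np.column_stack([(2 + 0.2 * np.cos(5 * t)) * np.cos(t),
                        (1 + 0.2 * np.cos(5 * t)) * np.sin(t)])
# Smaller distance = more similar outline; a search engine would rank a
# database by this distance to build a shortlist of candidates.
print(shape_distance(ellipse, blob))
```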

  7. Occurrence of multiple mental health or substance use outcomes among bisexuals: a respondent-driven sampling study

    Directory of Open Access Journals (Sweden)

    Greta R. Bauer

    2016-06-01

    Full Text Available Abstract Background Bisexual populations have higher prevalence of depression, anxiety, suicidality and substance use than heterosexuals, and often higher than gay men or lesbians. The co-occurrence of multiple outcomes has rarely been studied. Methods Data were collected from 405 bisexuals using respondent-driven sampling. Weighted analyses were conducted for the 387 with outcome data. Multiple outcomes were defined as 3 or more of: depression, anxiety, suicide ideation, problematic alcohol use, or polysubstance use. Results Among bisexuals, 19.0 % had multiple outcomes. We did not find variation in the raw frequency of multiple outcomes across sociodemographic variables (e.g., gender, age). After adjustment, gender and sexual orientation identity were associated, with transgender women and those identifying as bisexual only more likely to have multiple outcomes. Social equity factors had a strong impact in both crude and adjusted analyses: controlling for other factors, high mental health/substance use burden was associated with greater discrimination (prevalence risk ratio (PRR) = 5.71; 95 % CI: 2.08, 15.63) and lower education (PRR = 2.41; 95 % CI: 1.06, 5.49), while a higher income-to-needs ratio was protective (PRR = 0.44; 95 % CI: 0.20, 1.00). Conclusions Mental health and substance use outcomes with high prevalence among bisexuals frequently co-occurred. We find some support for the theory that these multiple outcomes represent a syndemic, defined as co-occurring and mutually reinforcing adverse outcomes driven by social inequity.

  8. A method for multiple sequential analyses of macrophage functions using a small single cell sample

    Directory of Open Access Journals (Sweden)

    F.R.F. Nascimento

    2003-09-01

    Full Text Available Microbial pathogens such as bacillus Calmette-Guérin (BCG) induce the activation of macrophages. Activated macrophages can be characterized by the increased production of reactive oxygen and nitrogen metabolites, generated via NADPH oxidase and inducible nitric oxide synthase, respectively, and by the increased expression of major histocompatibility complex class II molecules (MHC II). Multiple microassays have been developed to measure these parameters. Usually each assay requires 2-5 x 10^5 cells per well. In some experimental conditions the number of cells is the limiting factor for the phenotypic characterization of macrophages. Here we describe a method whereby this limitation can be circumvented. Using a single 96-well microassay and a very small number of peritoneal cells obtained from C3H/HePas mice, containing as little as ≤2 x 10^5 macrophages per well, we determined sequentially the oxidative burst (H2O2), nitric oxide production and MHC II (IAk) expression of BCG-activated macrophages. More specifically, with 100 µl of cell suspension it was possible to quantify H2O2 release and nitric oxide production after 1 and 48 h, respectively, and IAk expression after 48 h of cell culture. In addition, this microassay is easy to perform, highly reproducible and more economical.

  9. Numeracy of multiple sclerosis patients: A comparison of patients from the PERCEPT study to a German probabilistic sample.

    Science.gov (United States)

    Gaissmaier, Wolfgang; Giese, Helge; Galesic, Mirta; Garcia-Retamero, Rocio; Kasper, Juergen; Kleiter, Ingo; Meuth, Sven G; Köpke, Sascha; Heesen, Christoph

    2018-01-01

    A shared decision-making approach is suggested for multiple sclerosis (MS) patients. To properly evaluate benefits and risks of different treatment options accordingly, MS patients require sufficient numeracy - the ability to understand quantitative information. It is unknown whether MS affects numeracy. Therefore, we investigated whether patients' numeracy was impaired compared to a probabilistic national sample. As part of the larger prospective, observational, multicenter study PERCEPT, we assessed numeracy for a clinical study sample of German MS patients (N=725) with a standard test and compared them to a German probabilistic sample (N=1001), controlling for age, sex, and education. Within patients, we assessed whether disease variables (disease duration, disability, annual relapse rate, cognitive impairment) predicted numeracy beyond these demographics. MS patients showed a comparable level of numeracy as the probabilistic national sample (68.9% vs. 68.5% correct answers, P=0.831). In both samples, numeracy was higher for men and the highly educated. Disease variables did not predict numeracy beyond demographics within patients, and predictability was generally low. This sample of MS patients understood quantitative information on the same level as the general population. There is no reason to withhold quantitative information from MS patients. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Testing sequential extraction methods for the analysis of multiple stable isotope systems from a bone sample

    Science.gov (United States)

    Sahlstedt, Elina; Arppe, Laura

    2017-04-01

    The stable isotope composition of bones, analysed either from the mineral phase (hydroxyapatite) or from the organic phase (mainly collagen), carries important climatological and ecological information and is therefore widely used in paleontological and archaeological research. For the analysis of the stable isotope compositions, both phases, hydroxyapatite and collagen, have their more or less well-established separation and analytical techniques. Recent developments in IRMS and wet chemical extraction methods have facilitated the analysis of very small bone fractions (500 μg or less starting material) for the oxygen isotope composition of phosphate (PO4). However, the uniqueness and (pre-)historical value of each archaeological and paleontological finding leave preciously little material available for stable isotope analyses, encouraging further development of microanalytical methods for stable isotope analyses. Here we present the first results in developing extraction methods for combining collagen C- and N-isotope analyses with PO4 O-isotope analyses from a single bone sample fraction. We tested sequential extraction starting with dilute acid demineralization and collection of both the collagen and PO4 fractions, followed by a further purification step with H2O2 (PO4 fraction). First results show that bone sample separates as small as 2 mg may be analysed for their δ15N, δ13C and δ18O(PO4) values. The method may be incorporated in detailed investigations of sequentially developing skeletal material such as teeth, potentially allowing for the investigation of interannual variability in climatological/environmental signals or of the early life history of an individual.

  11. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Shoji Kawahito

    2016-11-01

    Full Text Available This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels, based on an analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction as a function of the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms).
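    The core noise-reduction mechanism can be illustrated with a few lines of simulation: averaging M samples each of the reset and signal levels suppresses uncorrelated read noise roughly as sqrt(2/M). The single-sample noise figure below is chosen only to echo the scale of the reported values; this is not the paper's measurement chain.

```python
# Minimal simulation of correlated multiple sampling (CMS): average M reset
# samples and M signal samples, then difference them. Uncorrelated read
# noise shrinks roughly as read_noise * sqrt(2/M). Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
signal_e, read_noise_e, n_trials = 100.0, 2.4, 20_000

for m in (2, 16, 128):
    reset = rng.normal(0.0, read_noise_e, size=(n_trials, m)).mean(axis=1)
    sig = signal_e + rng.normal(0.0, read_noise_e, size=(n_trials, m)).mean(axis=1)
    cms_out = sig - reset                  # correlated multiple sampling output
    print(m, round(cms_out.std(), 2))      # ~ read_noise_e * sqrt(2/m)
```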

  12. The effect of albedo neutrons on the neutron multiplication of small plutonium oxide samples in a PNCC chamber

    CERN Document Server

    Bourva, L C A; Weaver, D R

    2002-01-01

    This paper describes how to evaluate the effect of neutrons reflected from parts of a passive neutron coincidence chamber on the neutron leakage self-multiplication, M_L, of a fissile sample. It is shown that albedo neutrons contribute, in the case of small plutonium-bearing samples, to a significant part of M_L, and that their effect has to be taken into account in the relationship between the measured coincidence count rates and the 240Pu effective mass of the sample. A simple one-interaction model has been used to write the balance of neutron gains and losses in the material when exposed to the re-entrant neutron flux. The energy and intensity profiles of the re-entrant flux have been parameterised using Monte Carlo MCNP(TM) calculations. This technique has been implemented for the On Site Laboratory neutron/gamma counter within the existing MEPL 1.0 code for the determination of the neutron leakage self-multiplication. Benchmark tests of the resulting MEPL 2.0 code with MC...

  13. Using the Direct Sampling Multiple-Point Geostatistical Method for Filling Gaps in Landsat 7 ETM+ SLC-off Imagery

    KAUST Repository

    Yin, Gaohong

    2016-05-01

    Since the failure of the Scan Line Corrector (SLC) instrument on Landsat 7, observable gaps occur in the acquired Landsat 7 imagery, impacting its spatial continuity. Because of the high geometric and radiometric accuracy provided by Landsat 7, a number of approaches have been proposed to fill the gaps. However, all proposed approaches have evident constraints for universal application. The main issues in gap-filling are an inability to describe continuity features such as meandering streams or roads, and difficulty maintaining the shape of small objects when filling gaps in heterogeneous areas. The aim of the study is to validate the feasibility of using the Direct Sampling multiple-point geostatistical method, which has been shown to reconstruct complicated geological structures satisfactorily, to fill Landsat 7 gaps. The Direct Sampling method uses a conditional stochastic resampling of known locations within a target image to fill gaps, and can generate multiple reconstructions for one simulation case. The method was examined across a range of land cover types including deserts, sparse rural areas, dense farmlands, urban areas, braided rivers and coastal areas to demonstrate its capacity to recover gaps accurately. Its prediction accuracy was also compared with other gap-filling approaches, which have previously been demonstrated to offer satisfactory results, in both homogeneous and heterogeneous areas. The results show that the Direct Sampling method provides sufficiently accurate predictions for a variety of land cover types, from homogeneous to heterogeneous. Likewise, it exhibits superior performance when used to fill gaps in heterogeneous land cover types without an input image, or with an input image that is temporally far from the target image, in comparison with other gap-filling approaches.
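    A toy sketch of the Direct Sampling idea just described: for each gap pixel, randomly visit known pixels in the same image, compare the two local neighborhoods, and paste the value of the first candidate whose pattern distance drops below a threshold. All parameters are illustrative, and real implementations add multigrid search, variable templates and other refinements.

```python
# Toy Direct Sampling gap filler on a synthetic 2D image. Illustrative only;
# not a production multiple-point geostatistics implementation.
import numpy as np

def pattern_distance(img, known, p, q, radius):
    """Mean |difference| over neighborhood offsets informed at both p and q."""
    h, w = img.shape
    diffs = []
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            pi, pj = p[0] + di, p[1] + dj
            qi, qj = q[0] + di, q[1] + dj
            if (0 <= pi < h and 0 <= pj < w and 0 <= qi < h and 0 <= qj < w
                    and known[pi, pj] and known[qi, qj]):
                diffs.append(abs(img[pi, pj] - img[qi, qj]))
    return np.mean(diffs) if diffs else np.inf

def direct_sampling_fill(img, mask, radius=2, threshold=0.05, n_scan=200, seed=0):
    """img: 2D float array; mask: True where the gap is."""
    rng = np.random.default_rng(seed)
    out, known = img.copy(), ~mask
    candidates = np.argwhere(known)
    for p in np.argwhere(mask):
        best_val, best_d = img[candidates[0][0], candidates[0][1]], np.inf
        for idx in rng.choice(len(candidates), size=min(n_scan, len(candidates)),
                              replace=False):
            q = candidates[idx]
            d = pattern_distance(img, known, p, q, radius)
            if d < best_d:
                best_val, best_d = img[q[0], q[1]], d
                if d <= threshold:          # accept first good-enough match
                    break
        out[p[0], p[1]] = best_val
    return out

# Tiny demo: fill a vertical gap in a smooth synthetic image
x, y = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
img = np.sin(6 * x) * np.cos(4 * y)
mask = np.zeros_like(img, dtype=bool)
mask[:, 18:21] = True
filled = direct_sampling_fill(np.where(mask, 0.0, img), mask)
print(np.abs(filled - img)[mask].mean())    # mean absolute reconstruction error
```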

  14. Sequence-based analysis of the bacterial and fungal compositions of multiple kombucha (tea fungus) samples.

    Science.gov (United States)

    Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D

    2014-04-01

    Kombucha is a sweetened tea beverage that, as a consequence of fermentation, contains ethanol, carbon dioxide, a high concentration of acids (gluconic, acetic and lactic) as well as a number of other metabolites, and is thought to contain a number of health-promoting components. The sucrose-tea solution is fermented by a symbiosis of bacteria and yeast embedded within a cellulosic pellicle, which forms a floating mat in the tea and generates a new layer with each successful fermentation. The specific identity of the microbial populations present has been the focus of attention but, to date, the majority of studies have relied on culture-based analyses. To gain a more comprehensive insight into the kombucha microbiota we have carried out the first culture-independent, high-throughput sequencing analysis of the bacterial and fungal populations of 5 distinct pellicles as well as the resultant fermented kombucha at two time points. Following the analysis it was established that the major bacterial genus present was Gluconacetobacter, present at >85% in most samples, with only trace populations of Acetobacter detected (<2%), and a number of subdominant genera, not previously associated with kombucha, also being revealed. The yeast populations were found to be dominated by Zygosaccharomyces at >95% in the fermented beverage, with a greater fungal diversity present in the cellulosic pellicle, including numerous species not identified in kombucha previously. Ultimately, this study represents the most accurate description of the microbiology of kombucha to date. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Reducing bias in population and landscape genetic inferences: the effects of sampling related individuals and multiple life stages.

    Science.gov (United States)

    Peterman, William; Brocato, Emily R; Semlitsch, Raymond D; Eggert, Lori S

    2016-01-01

    In population or landscape genetics studies, an unbiased sampling scheme is essential for generating accurate results, but logistics may lead to deviations from the sample design. Such deviations may come in the form of sampling multiple life stages. Presently, it is largely unknown what effect sampling different life stages can have on population or landscape genetic inference, or how mixing life stages can affect the parameters being measured. Additionally, the removal of siblings from a data set is considered best-practice, but direct comparisons of inferences made with and without siblings are limited. In this study, we sampled embryos, larvae, and adult Ambystoma maculatum from five ponds in Missouri, and analyzed them at 15 microsatellite loci. We calculated allelic richness, heterozygosity and effective population sizes for each life stage at each pond and tested for genetic differentiation (FST and DC) and isolation-by-distance (IBD) among ponds. We tested for differences in each of these measures between life stages, and in a pooled population of all life stages. All calculations were done with and without sibling pairs to assess the effect of sibling removal. We also assessed the effect of reducing the number of microsatellites used to make inference. No statistically significant differences were found among ponds or life stages for any of the population genetic measures, but patterns of IBD differed among life stages. There was significant IBD when using adult samples, but tests using embryos, larvae, or a combination of the three life stages were not significant. We found that increasing the ratio of larval or embryo samples in the analysis of genetic distance weakened the IBD relationship, and when using DC, the IBD was no longer significant when larvae and embryos exceeded 60% of the population sample. Further, power to detect an IBD relationship was reduced when fewer microsatellites were used in the analysis.
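    Isolation-by-distance tests of the kind described above are commonly run as Mantel-style permutation tests correlating pairwise genetic and geographic distance matrices. A minimal sketch with random stand-in matrices (not the study's microsatellite data):

```python
# Minimal Mantel-style permutation test for isolation-by-distance.
# Distance matrices here are random placeholders, purely illustrative.
import numpy as np

def mantel(gen_d, geo_d, n_perm=999, seed=0):
    """Return (observed correlation, one-sided permutation p-value)."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(gen_d, k=1)          # upper-triangle pairs
    r_obs = np.corrcoef(gen_d[iu], geo_d[iu])[0, 1]
    n, count = gen_d.shape[0], 0
    for _ in range(n_perm):
        perm = rng.permutation(n)                  # permute rows and columns together
        r = np.corrcoef(gen_d[perm][:, perm][iu], geo_d[iu])[0, 1]
        count += r >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

# Five ponds with random symmetric distance matrices
rng = np.random.default_rng(1)
a = rng.random((5, 5)); gen = (a + a.T) / 2; np.fill_diagonal(gen, 0)
b = rng.random((5, 5)); geo = (b + b.T) / 2; np.fill_diagonal(geo, 0)
print(mantel(gen, geo))
```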

  16. Reducing bias in population and landscape genetic inferences: the effects of sampling related individuals and multiple life stages

    Directory of Open Access Journals (Sweden)

    William Peterman

    2016-03-01

    Full Text Available In population or landscape genetics studies, an unbiased sampling scheme is essential for generating accurate results, but logistics may lead to deviations from the sample design. Such deviations may come in the form of sampling multiple life stages. Presently, it is largely unknown what effect sampling different life stages can have on population or landscape genetic inference, or how mixing life stages can affect the parameters being measured. Additionally, the removal of siblings from a data set is considered best-practice, but direct comparisons of inferences made with and without siblings are limited. In this study, we sampled embryos, larvae, and adult Ambystoma maculatum from five ponds in Missouri, and analyzed them at 15 microsatellite loci. We calculated allelic richness, heterozygosity and effective population sizes for each life stage at each pond and tested for genetic differentiation (FST and DC) and isolation-by-distance (IBD) among ponds. We tested for differences in each of these measures between life stages, and in a pooled population of all life stages. All calculations were done with and without sibling pairs to assess the effect of sibling removal. We also assessed the effect of reducing the number of microsatellites used to make inference. No statistically significant differences were found among ponds or life stages for any of the population genetic measures, but patterns of IBD differed among life stages. There was significant IBD when using adult samples, but tests using embryos, larvae, or a combination of the three life stages were not significant. We found that increasing the ratio of larval or embryo samples in the analysis of genetic distance weakened the IBD relationship, and when using DC, the IBD was no longer significant when larvae and embryos exceeded 60% of the population sample. Further, power to detect an IBD relationship was reduced when fewer microsatellites were used in the analysis.

  17. Automated on-line solid phase extraction coupled to HPLC-APCI-MS detection as a versatile tool for the analysis of phenols in water samples

    International Nuclear Information System (INIS)

    Wissiack, R.

    2001-05-01

    The method enables the determination of the entire US EPA phenol range within a single chromatographic run with only one MSD interface, and could be easily adapted for the analysis of further phenolic compounds. This represents a significant improvement over methods reported so far for the analysis of phenolic compounds by on-line SPE HPLC-MS. For the on-line SPE of phenols from water samples, the recently introduced Hysphere GP and the Waters Oasis adsorbent materials were found to be most satisfactory. Their application resulted in quantitative recoveries for sample volumes up to 100 ml, excellent elution behavior (enabling fast elution and thus narrower peaks) and relative standard deviations for the overall analysis system below 8 percent for all phenols. Typical enrichment factors for automated on-line SPE were estimated to be about one thousand compared to autosampler injections. Thus, LODs ranging between 40-280 ng/l in SCAN mode could be achieved even when only 10 ml of spiked distilled or river water sample were processed, which attests to the excellent screening capabilities of the optimized method. When using the SIM mode, the sensitivity could be further increased by about one order of magnitude. The applicability of the proposed method to environmental analysis was demonstrated by preconcentrating phenols from spiked river water samples or waste water treatment effluents via automated on-line SPE HPLC-MS. Due to the very high concentration of matrix in the case of waste water treatment effluents, the sample volume preconcentrated had to be decreased to only 1 ml. Still, the sensitivity is high enough to monitor phenols at levels relevant for waste water monitoring. As a further example of the general applicability of the HPLC-MS method for the tentative structural elucidation of phenolic compounds, it was also used for the analysis of diesel exhaust condensate samples, where a number of phenolic compounds could be tentatively identified. (author)

  18. Evaluation of Multiple-Sampling Function used with a Microtek flatbed scanner for Radiation Dosimetry Calibration of EBT2 Film

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Liyun [Department of Medical Imaging and Radiological Sciences, I-Shou University, Kaohsiung 82445, Taiwan (China); Ho, Sheng-Yow [Department of Nursing, Chang Jung Christian University, Tainan 71101, Taiwan (China); Department of Radiation Oncology, Chi Mei Medical Center, Liouying, Tainan 73657, Taiwan (China); Ding, Hueisch-Jy [Department of Medical Imaging and Radiological Sciences, I-Shou University, Kaohsiung 82445, Taiwan (China); Hwang, Ing-Ming [Department of Medical Imaging and Radiology, Shu Zen College of Medicine and Management, Kaohsiung 82144, Taiwan (China); Chen, Pang-Yu, E-mail: pangyuchen@yahoo.com.tw [Department of Radiation Oncology, Sinlau Christian Hospital, Tainan 70142, Taiwan (China); Lee, Tsair-Fwu, E-mail: tflee@kuas.edu.tw [Medical Physics and Informatics Laboratory, Department of Electronics Engineering, National Kaohsiung University of Applied Sciences, Kaohsiung 80778, Taiwan (China)

    2016-10-01

    The radiochromic EBT2 film is a widely used quality assurance device for radiation therapy. This study evaluated the film calibration performance of the multiple-sampling function, a function of the ScanWizard Pro scanning software provided by the manufacturer, when used with the Microtek 9800XL plus (9800XL+) flatbed scanner. Using the PDD method, each of eight EBT2 films, four irradiated with 290 monitor units (MU) and four with 88 MU via 6-MV photon beams, was tightly sandwiched in a 30 × 30 × 30 cm³ water-equivalent polystyrene phantom prior to irradiation. Before and after irradiation, all films were scanned using the Microtek 9800XL+ scanner with five different modes of the multiple-sampling function, which generates an image from the averaged result of multiple sampling. The net optical densities (netOD) on the beam central axis of each film were assigned to the corresponding depth doses for calibration. For each sampling mode with either delivered MU, the depth-dose uncertainty of a single film from repeated scans and that of a single scan of the four films were analyzed. Finally, the calibration error and the combined calibration uncertainty between film-determined depth doses and delivered depth doses were calculated and evaluated for each sampling mode. All standard deviations and the calibration error were demonstrated to be unrelated to the number of sampling lines. The calibration error of the 2-line and 16-line modes was within 3 cGy and better than that of the other modes. The combined uncertainty of the 2-line mode was the lowest, generally less than 6 cGy except for delivered doses around 100 cGy. The evaluation described herein revealed that EBT2 film calibrated with the 2-line mode has relatively lower error, scanning time and combined uncertainty. It is therefore recommended for routine EBT2 film calibration and verification of treatment plans.

  19. 32P-postlabeling assay for carcinogen-DNA adducts: description of beta shielding apparatus and semi-automatic spotting and washing devices that facilitate the handling of multiple samples

    International Nuclear Information System (INIS)

    Reddy, M.V.; Blackburn, G.R.

    1990-01-01

    The utilization of the 32P-postlabeling assay in combination with TLC for the sensitive detection and estimation of aromatic DNA adducts has been increasing. The procedure consists of 32P-labeling of carcinogen-adducted 3'-nucleotides in the DNA digests using γ-32P ATP and polynucleotide kinase, separation of 32P-labeled adducts by TLC, and their detection by autoradiography. During both 32P-labeling and the initial phases of TLC, a relatively high amount of γ-32P ATP is handled when 30 samples are processed simultaneously. We describe the design of acrylic shielding apparatus, semi-automatic TLC spotting devices, and devices for the development and washing of multiple TLC plates, which not only provide substantial protection from exposure to 32P beta radiation, but also allow quick and easy handling of a large number of samples. Specifically, the equipment includes: (i) a multi-tube carousel rack having 15 wells to hold capless Eppendorf tubes and a rotatable lid with an aperture to access individual tubes; (ii) a pipette shielder; (iii) two semi-automatic spotting devices to apply radioactive solutions to TLC plates; (iv) a multi-plate holder for TLC plates; and (v) a mechanical device for washing multiple TLC plates. Item (i) is small enough to be held in one hand, vortexed, and centrifuged to mix the solutions in each tube while beta radiation is shielded. Items (iii) to (iv) aid in the automation of the assay. (author)

  20. Mixture effects in samples of multiple contaminants - An inter-laboratory study with manifold bioassays.

    Science.gov (United States)

    Altenburger, Rolf; Scholze, Martin; Busch, Wibke; Escher, Beate I; Jakobs, Gianina; Krauss, Martin; Krüger, Janet; Neale, Peta A; Ait-Aissa, Selim; Almeida, Ana Catarina; Seiler, Thomas-Benjamin; Brion, François; Hilscherová, Klára; Hollert, Henner; Novák, Jiří; Schlichting, Rita; Serra, Hélène; Shao, Ying; Tindall, Andrew; Tolefsen, Knut-Erik; Umbuzeiro, Gisela; Williams, Tim D; Kortenkamp, Andreas

    2018-05-01

    Effect-based endpoints produced mixture responses in agreement with the additivity expectation of concentration addition. In exceptional cases, the expected (additive) mixture response did not occur, due to masking effects such as general toxicity from other compounds. Generally, deviations from an additivity expectation could be explained by experimental factors, specific limitations of the effect endpoint, or masking side effects such as cytotoxicity in in vitro assays. The majority of bioassays were able to quantitatively detect the predicted non-interactive, additive combined effect of the specifically bioactive compounds against a background of a complex mixture of other chemicals in the sample. This supports the use of a combination of chemical and bioanalytical monitoring tools for the identification of chemicals that drive a specific mixture effect. Furthermore, we demonstrated that a panel of bioassays can provide a diverse profile of effect responses to a complex contaminated sample. This could be extended towards representing mixture adverse outcome pathways. Our findings support the ongoing development of bioanalytical tools for (i) compiling comprehensive effect-based batteries for water quality assessment, (ii) designing tailored surveillance methods to safeguard specific water uses, and (iii) devising strategies for effect-based diagnosis of complex contamination. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Development of a real-time multiplex PCR assay for the detection of multiple Salmonella serotypes in chicken samples

    Directory of Open Access Journals (Sweden)

    Whyte Paul

    2008-09-01

    Full Text Available Abstract Background A real-time multiplex PCR assay was developed for the detection of multiple Salmonella serotypes in chicken samples. Poultry-associated serotypes detected in the assay include Enteritidis, Gallinarum, Typhimurium, Kentucky and Dublin. The traditional cultural method according to EN ISO 6579:2002 for the detection of Salmonella in food was performed in parallel. The real-time PCR based method comprised a pre-enrichment step in Buffered Peptone Water (BPW) overnight, followed by a shortened selective enrichment in Rappaport-Vassiliadis Soya Broth (RVS) for 6 hours and subsequent DNA extraction. Results The real-time multiplex PCR assay and the traditional cultural method showed 100% inclusivity and 100% exclusivity on all strains tested. The real-time multiplex PCR assay was as sensitive as the traditional cultural method in detecting Salmonella in artificially contaminated chicken samples and correctly identified the serotype. Artificially contaminated chicken samples resulted in a detection limit of between 1 and 10 CFU per 25 g sample for both methods. A total of sixty-three naturally contaminated chicken samples were investigated by both methods, and the relative accuracy, relative sensitivity and relative specificity of the real-time PCR method were determined to be 89, 94 and 87%, respectively. Thirty cultures blind tested were correctly identified by the real-time multiplex PCR method. Conclusion Real-time PCR methodology can contribute to meet the need for rapid identification and detection methods in food testing laboratories.
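    The "relative" performance measures quoted above are standard agreement statistics computed against the reference culture method. A minimal sketch follows; the 2x2 counts below are hypothetical values chosen so the outputs land near the quoted 89/94/87%, not figures taken from the paper.

```python
# Relative accuracy/sensitivity/specificity of a candidate method (PCR)
# versus a reference method (culture), from a 2x2 agreement table.
# Counts are made up to illustrate the arithmetic, not the study's data.
def relative_metrics(pos_agree, neg_agree, pcr_only_pos, culture_only_pos):
    total = pos_agree + neg_agree + pcr_only_pos + culture_only_pos
    accuracy = (pos_agree + neg_agree) / total
    sensitivity = pos_agree / (pos_agree + culture_only_pos)  # vs. culture-positives
    specificity = neg_agree / (neg_agree + pcr_only_pos)      # vs. culture-negatives
    return accuracy, sensitivity, specificity

print([f"{m:.0%}" for m in relative_metrics(pos_agree=17, neg_agree=39,
                                            pcr_only_pos=6, culture_only_pos=1)])
```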

  2. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australia antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum is dispensed by an automated sampler into test tubes and incubated under controlled time and temperature; the first counting is omitted; labelled antibody is dispensed into the serum after washing; samples are incubated and then centrifuged; radioactivities in the precipitate are counted by an auto-well counter; and measurements are tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  3. AutoGNI, the Robot Under the Aircraft Floor: An Automated System for Sampling Giant Aerosol Particles by Impaction in the Free Airstream Outside a Research Aircraft

    Science.gov (United States)

    Jensen, J. B.; Schwenz, K.; Aquino, J.; Carnes, J.; Webster, C.; Munnerlyn, J.; Wissman, T.; Lugger, T.

    2017-12-01

    Giant sea-salt aerosol particles, also called Giant Cloud Condensation Nuclei (GCCN), have been proposed as a means of rapidly forming precipitation-sized drizzle drops in warm marine clouds (e.g., Jensen and Nugent, 2017). Such rare particles are best sampled from aircraft in air below cloud base, where normal laser optical instruments have sample volumes too low to give statistically significant samples of the large-particle tail. An automated sampling system (the AutoGNI) has been built to operate from inside a pressurized aircraft. Under the aircraft floor, a pressurized vessel contains 32 custom-built polycarbonate microscope slides. Using robotics with 5 motor drives and 18 positioning switches, the AutoGNI can take slides from their holding cassettes and pass them onto a caddy in an airfoil that extends 200 mm outside the aircraft, where they are exposed in the free airstream, thus avoiding the usual problems with large-particle losses in air intakes. Slides are typically exposed for 10-30 s in the marine boundary layer, giving sample volumes of about 100-300 L or more. Subsequently the slides are retracted into the pressure vessel, stored and transported for laboratory microscope image analysis, in order to derive size-distribution histograms. While the aircraft is flying, the AutoGNI system is remotely controlled from a laptop on the ground, using an encrypted commercial satellite connection to the NSF/NCAR GV research aircraft's main server and onto the AutoGNI microprocessor. The sampling of such GCCN is becoming increasingly important in order to provide complete input data for model calculations of aerosol-cloud interactions and their feedbacks in climate prediction. The AutoGNI has so far been sampling sea-salt GCCN in the Magellan Strait during the 2016 ORCAS project and over the NW Pacific during the 2017 ARISTO project, both from the NSF/NCAR GV research aircraft. Sea-salt particle sizes of 1.4-32 μm dry diameter have been observed.

  4. A high-pressure thermal gradient block for investigating microbial activity in multiple deep-sea samples

    DEFF Research Database (Denmark)

    Kallmeyer, J.; Ferdelman, TG; Jansen, KH

    2003-01-01

    Details about the construction and use of a high-pressure thermal gradient block for the simultaneous incubation of multiple samples are presented. Most parts used are moderately priced off-the-shelf components that are easily obtainable. In order to keep the pressure independent of thermal expansion... range of temperatures and pressures and can easily be modified to accommodate different experiments, either biological or chemical. As an application, we present measurements of bacterial sulfate reduction rates in hydrothermal sediments from Guaymas Basin over a wide range of temperatures and pressures...

  5. MicroRNA expression profiles of multiple system atrophy from formalin-fixed paraffin-embedded samples.

    Science.gov (United States)

    Wakabayashi, Koichi; Mori, Fumiaki; Kakita, Akiyoshi; Takahashi, Hitoshi; Tanaka, Shinya; Utsumi, Jun; Sasaki, Hidenao

    2016-12-02

    MicroRNAs (miRNAs) are small noncoding RNAs that regulate gene expression. Recently, we have shown that informative miRNA data can be derived from archived formalin-fixed paraffin-embedded (FFPE) samples from postmortem cases of amyotrophic lateral sclerosis and normal controls. miRNA analysis has now been performed on FFPE samples from affected brain regions in patients with multiple system atrophy (MSA) and the same areas in neurologically normal controls. We evaluated 50 samples from patients with MSA (n=13) and controls (n=13). Twenty-six samples were selected for miRNA analysis on the basis of the criteria reported previously: (i) a formalin fixation time of less than 4 weeks, (ii) a total RNA yield per sample of more than 500ng, and (iii) sufficient quality of the RNA electrophoresis pattern. These included 11 cases of MSA and 5 controls. Thus, the success rate for analysis of RNA from FFPE samples was 52% (26 of 50). For MSA, a total of 395 and 383 miRNAs were identified in the pons and cerebellum, respectively; 5 were up-regulated and 33 were down-regulated in the pons and 5 were up-regulated and 18 were down-regulated in the cerebellum. Several miRNAs down-regulated in the pons (miR-129-2-3p and miR-129-5p) and cerebellum (miR-129-2-3p, miR-129-5p and miR-132-3p) had already been identified in frozen cerebellum from MSA patients. These findings suggest that archived FFPE postmortem samples can be a valuable source for miRNA profiling in MSA. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Devices used by automated milking systems are similarly accurate in estimating milk yield and in collecting a representative milk sample compared with devices used by farms with conventional milk recording

    NARCIS (Netherlands)

    Kamphuis, Claudia; Dela Rue, B.; Turner, S.A.; Petch, S.

    2015-01-01

    Information on the accuracy of milk-sampling devices used on farms with automated milking systems (AMS) is essential for the development of milk recording protocols. The hypotheses of this study were (1) that devices used by AMS units are similarly accurate in estimating milk yield and in collecting a representative milk sample, compared with devices used on farms with conventional milk recording...

  7. A multiple linear regression analysis of factors affecting the simulated Basic Life Support (BLS) performance with Automated External Defibrillator (AED) in Flemish lifeguards.

    Science.gov (United States)

    Iserbyt, Peter; Schouppe, Gilles; Charlier, Nathalie

    2015-04-01

    Research investigating lifeguards' performance of Basic Life Support (BLS) with Automated External Defibrillator (AED) is limited. Assessing simulated BLS/AED performance in Flemish lifeguards and identifying factors affecting this performance. Six hundred and sixteen (217 female and 399 male) certified Flemish lifeguards (aged 16-71 years) performed BLS with an AED on a Laerdal ResusciAnne manikin simulating an adult victim of drowning. Stepwise multiple linear regression analysis was conducted with BLS/AED performance as outcome variable and demographic data as explanatory variables. Mean BLS/AED performance for all lifeguards was 66.5%. Compression rate and depth adhered closely to ERC 2010 guidelines. Ventilation volume and flow rate exceeded the guidelines. A significant regression model, F(6, 415)=25.61, p<.001, ES=.38, explained 27% of the variance in BLS performance (R2=.27). Significant predictors were age (beta=-.31, p<.001), years of certification (beta=-.41, p<.001), time on duty per year (beta=-.25, p<.001), practising BLS skills (beta=.11, p=.011), and being a professional lifeguard (beta=-.13, p=.029). 71% of lifeguards reported not practising BLS/AED. Being young, recently certified, few days of employment per year, practising BLS skills and not being a professional lifeguard are factors associated with higher BLS/AED performance. Measures should be taken to prevent BLS/AED performances from decaying with age and longer certification. Refresher courses could include a formal skills test and lifeguards should be encouraged to practise their BLS/AED skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
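    A minimal sketch of the kind of multiple linear regression reported above, fitted by ordinary least squares on synthetic data; the predictor names and effect sizes are assumptions standing in for the study's lifeguard variables, not its dataset.

```python
# OLS multiple linear regression on synthetic stand-in data for the
# lifeguard study's predictors. Coefficients and R^2 here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 616
X = np.column_stack([
    rng.uniform(16, 71, n),      # age (hypothetical predictor)
    rng.uniform(0, 30, n),       # years of certification
    rng.uniform(0, 120, n),      # days on duty per year
    rng.integers(0, 2, n),       # practises BLS skills (0/1)
    rng.integers(0, 2, n),       # professional lifeguard (0/1)
])
# Synthetic outcome: BLS/AED performance score in percent
y = 80 - 0.2 * X[:, 0] - 0.5 * X[:, 1] + 5 * X[:, 3] + rng.normal(0, 8, n)

Xd = np.column_stack([np.ones(n), X])             # add intercept column
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)     # OLS fit
y_hat = Xd @ beta
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(beta.round(2), round(r2, 2))                # coefficients and R^2
```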

  8. A column exchange chromatographic procedure for the automated purification of analytical samples in nuclear spent fuel reprocessing and plutonium fuel fabrication

    International Nuclear Information System (INIS)

    Zahradnik, P.; Swietly, H.; Doubek, N.; Bagliano, G.

    1992-11-01

    A Column Exchange Chromatographic procedure using Tri-n-Octyl-Phosphine-Oxide (TOPO) as the stationary phase has been in routine use at SAL since 1984 on nuclear spent fuel reprocessing and Pu product samples, prior to alpha and mass spectrometric analysis. This standard procedure was subsequently modified in view of its automation in a glove box; the resulting new procedure is described in this paper. Laboratory Robot Compatible (LRC) disposable columns were selected because their dimensions are particularly favorable and reproducible. A less corrosive HNO3-HI mixture substituted for the former HCl-HI plutonium eluant. The inorganic support of the stationary phase used to test the above-mentioned changes was unexpectedly withdrawn from the market, so another support had to be selected and the procedure reoptimized accordingly. The resulting procedure was tested with the robot and validated against the manual procedure taken as reference: the comparison showed that the modified procedure meets the analytical requirements and has the same performance as the original procedure. (author). Refs, figs and tabs

  9. Intensified Sampling in Response to a Salmonella Heidelberg Outbreak Associated with Multiple Establishments Within a Single Poultry Corporation.

    Science.gov (United States)

    Green, Alice; Defibaugh-Chavez, Stephanie; Douris, Aphrodite; Vetter, Danah; Atkinson, Richard; Kissler, Bonnie; Khroustalev, Allison; Robertson, Kis; Sharma, Yudhbir; Becker, Karen; Dessai, Uday; Antoine, Nisha; Allen, Latasha; Holt, Kristin; Gieraltowski, Laura; Wise, Matthew; Schwensohn, Colin

    2018-03-01

    On June 28, 2013, the Food Safety and Inspection Service (FSIS) was notified by the Centers for Disease Control and Prevention (CDC) of an investigation of a multistate cluster of illnesses caused by Salmonella enterica serovar Heidelberg. Since case-patients in the cluster reported consumption of a variety of chicken products, FSIS used a simple likelihood-based approach built on traceback information to focus its intensified sampling efforts. This article describes the multiphased product sampling approach taken by FSIS when epidemiologic evidence implicated chicken products from multiple establishments operating under one corporation. The objectives of sampling were to (1) assess process control of chicken slaughter and further processing and (2) determine whether outbreak strains were present in products from these implicated establishments. As part of the sample collection process, data collected by FSIS personnel to characterize product included category (whole chicken and type of chicken parts), brand, organic or conventional product, injection with salt solutions or flavorings, and whether product was skinless or skin-on. From September 9, 2013, through October 31, 2014, 3164 samples were taken as part of this effort. The Salmonella percent positive declined from 19.7% to 5.3% during this timeframe as a result of regulatory and company efforts. The results of intensified sampling for this outbreak investigation informed an FSIS regulatory response and corrective actions taken by the implicated establishments. The company noted that a multihurdle approach to reduce Salmonella in products was taken, including on-farm efforts such as environmental testing, depopulation of affected flocks, disinfection of affected houses, vaccination, and use of various interventions within the establishments over the course of several months.

  10. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique; Diseno y construccion de un prototipo de intercambiador para la automatizacion de la tecnica de analisis por activacion neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia [Direccion de Investigacion y Desarrollo, Instituto Peruano de Energia Nuclear, Lima (Peru); Lopez, Yon [Universidad Nacional de Ingenieria, Lima (Peru); Urquizo, Rafael [Universidad Tecnologica del Peru, Lima (Peru)

    2014-07-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminum turntable disk that accommodates 19 polyethylene capsules containing the samples, which are sent via the pneumatic transfer system from the laboratory to the irradiation position. The system is operated from a control switchboard that sends and returns capsules after a variable preset time and via two different routes, allowing the determination of short-, medium- and long-lived radionuclides. Another mechanism, called an 'exchange valve', changes the travel path (pipeline) so that irradiated samples can be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst; a control-loop sketch is given below. The design, construction and operation of the device are described and presented in this article. (authors).
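
    The send/irradiate/return cycle described above lends itself to a simple control loop. The sketch below is a hypothetical illustration only: hardware I/O (pneumatic valves, exchange valve) is mocked with print statements, and the function names and timings are assumptions, not taken from the published design.

```python
import time
from dataclasses import dataclass

@dataclass
class Capsule:
    slot: int              # turntable position, 1-19
    irradiation_s: float   # variable preset irradiation time, seconds
    long_lived: bool       # True -> store in the reactor hall after irradiation

def send_capsule(slot):
    # stand-in for actuating the pneumatic transfer system
    print(f"pneumatic send: slot {slot} -> irradiation position")

def set_exchange_valve(to_storage):
    # stand-in for switching the 'exchange valve' between return paths
    print("exchange valve ->", "reactor-hall storage" if to_storage else "laboratory")

def run_batch(capsules):
    for c in capsules:
        send_capsule(c.slot)
        time.sleep(c.irradiation_s)          # preset dwell time in the beam
        set_exchange_valve(c.long_lived)     # choose return path (pipeline)
        print(f"slot {c.slot}: capsule returned")

run_batch([Capsule(1, 0.1, False), Capsule(2, 0.2, True)])
```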

  11. Speciation analysis of arsenic in biological matrices by automated hydride generation-cryotrapping-atomic absorption spectrometry with multiple microflame quartz tube atomizer (multiatomizer).

    Science.gov (United States)

    This paper describes an automated system for the oxidation state specific speciation of inorganic and methylated arsenicals by selective hydride generation - cryotrapping- gas chromatography - atomic absorption spectrometry with the multiatomizer. The corresponding arsines are ge...

  12. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project, designed and implemented to automate a house's electrical appliances and to provide a security system that detects the presence of unexpected behavior.

  13. First evaluation of automated specimen inoculation for wound swab samples by use of the Previ Isola system compared to manual inoculation in a routine laboratory: finding a cost-effective and accurate approach.

    Science.gov (United States)

    Mischnik, Alexander; Mieth, Markus; Busch, Cornelius J; Hofer, Stefan; Zimmermann, Stefan

    2012-08-01

    Automation of plate streaking is ongoing in clinical microbiological laboratories, but evaluations for routine use are largely lacking. In the present study, the recovery of microorganisms from polyurethane (PU) swab samples plated by the Previ Isola system is compared to that from manually plated control viscose swab samples from wounds, according to CLSI procedure M40-A (quality control of microbiological transport systems). One hundred twelve paired samples (224 swabs) were analyzed. In 80/112 samples (71%), concordant culture results were obtained with the two methods. In 32/112 samples (29%), CFU recovery of microorganisms from the two methods was discordant. In 24 (75%) of the 32 paired samples with a discordant result, Previ Isola plated PU swabs were superior. In 8 (25%) of the 32 paired samples with a discordant result, control viscose swabs were superior. The quality of colony growth on culture media for further investigations was superior with Previ Isola-inoculated plates compared to manual plating techniques. Gram stain results were concordant between the two methods in 62/112 samples (55%). In 50/112 samples (45%), the results of Gram staining were discordant between the two methods. In 34 (68%) of the 50 paired samples with discordant results, Gram staining of PU swabs was superior to that of control viscose swabs. In 16 (32%) of the 50 paired samples, Gram staining of control viscose swabs was superior to that of PU swabs. We report the first clinical evaluation of Previ Isola automated specimen inoculation for wound swab samples. This study suggests that use of an automated specimen inoculation system gives good results with regard to CFU recovery, quality of Gram staining, and accuracy of diagnosis.

  14. Testing of an automated online EA-IRMS method for fast and simultaneous carbon content and stable isotope measurement of aerosol samples

    Science.gov (United States)

    Major, István; Gyökös, Brigitta; Túri, Marianna; Futó, István; Filep, Ágnes; Hoffer, András; Molnár, Mihály

    2016-04-01

    Comprehensive atmospheric studies have demonstrated that carbonaceous aerosol is one of the main components of atmospheric particulate matter over Europe. Various methods, considering optical or thermal properties, have been developed for accurate quantification of both the organic and elemental carbon constituents of atmospheric aerosol. The aim of our work was to develop an alternative fast and easy method for determining the total carbon content of individual aerosol samples collected on prebaked quartz filters, from which the mass and surface concentrations become simply computable. We applied the conventional "elemental analyzer (EA) coupled online with an isotope ratio mass spectrometer (IRMS)" technique, which is ubiquitously used in mass spectrometry. Using this technique we are also able to measure the carbon stable isotope ratio of the samples simultaneously. During development, we compared the EA-IRMS technique with an off-line catalytic combustion method worked out previously at the Hertelendi Laboratory of Environmental Studies (HEKAL). We tested the combined online total carbon content and stable isotope ratio measurement on both standard materials and real aerosol samples. The test results show that the novel method assures, on the one hand, a carbon recovery yield of at least 95% over a broad total carbon mass range (between 100 and 3000 µg) and, on the other hand, good reproducibility of the stable isotope measurements, with an uncertainty of ±0.2 per mille. Comparing the total carbon results obtained by the EA-IRMS and the off-line catalytic combustion methods, we found a very good correlation (R2=0.94), which proves the applicability of both preparation methods. Advantages of the novel method are the fast and simplified sample preparation steps and the fully automated, simultaneous carbon stable isotope ratio measurement. Furthermore, the stable isotope ratio results can effectively be applied in source apportionment

  15. Determine Multiple Elements Simultaneously in the Sera of Umbilical Cord Blood Samples-a Very Simple Method.

    Science.gov (United States)

    Liang, Chunmei; Li, Zhijuan; Xia, Xun; Wang, Qunan; Tao, Ruiwen; Tao, Yiran; Xiang, Haiyun; Tong, Shilu; Tao, Fangbiao

    2017-05-01

    Analyzing the concentrations of heavy metals in the sera of umbilical cord blood samples can provide useful information about prenatal exposure to environmental agents. An analytical method based on ICP-MS was developed to simultaneously determine multiple elements in umbilical cord blood samples for assessing in utero exposure to metallic and metalloid elements. The method required as little as 100 μL of serum, diluted 1:25 for direct analysis. A matrix-matched protocol was used to eliminate matrix interference, and kinetic energy discrimination mode was used to eliminate polyatomic ion interference. The assay was completed on average within 4 min, with detection limits ranging from 0.0002 to 44.4 μg/L across the targeted elements. Detection rates were 100% for most elements, the exceptions being cadmium (Cd), lead (Pb), and mercury (Hg). Results for the certified reference materials were satisfactory. The method is simple and sensitive, making it suitable for monitoring large quantities of samples.

  16. Elevated body temperature is linked to fatigue in an Italian sample of relapsing-remitting multiple sclerosis patients.

    Science.gov (United States)

    Leavitt, V M; De Meo, E; Riccitelli, G; Rocca, M A; Comi, G; Filippi, M; Sumowski, J F

    2015-11-01

    Elevated body temperature was recently reported for the first time in patients with relapsing-remitting multiple sclerosis (RRMS) relative to healthy controls. In addition, warmer body temperature was associated with worse fatigue. These findings are highly novel and may indicate a new pathophysiology for MS fatigue; they therefore warrant replication in a geographically separate sample. Here, we investigated body temperature and its association with fatigue in an Italian sample of 44 RRMS patients and 44 age- and sex-matched healthy controls. Consistent with our original report, we found elevated body temperature in the RRMS sample compared to healthy controls. Warmer body temperature was associated with worse fatigue, thereby supporting the notion of endogenous temperature elevation in patients with RRMS as a novel pathophysiological factor underlying fatigue. Our findings highlight a paradigm shift in our understanding of the effect of heat in RRMS, from exogenous (i.e., Uhthoff's phenomenon) to endogenous. Although randomized controlled trials of cooling treatments (e.g., aspirin, cooling garments) to reduce fatigue in RRMS have been successful, consideration of endogenously elevated body temperature as the underlying target will enhance the development of novel treatments.

  17. Using multiple sampling approaches to measure sexual risk-taking among young people in Haiti: programmatic implications.

    Science.gov (United States)

    Speizer, Ilene S; Beauvais, Harry; Gómez, Anu Manchikanti; Outlaw, Theresa Finn; Roussel, Barbara

    2009-12-01

    No previously published research has examined the applicability of varying methods for identifying young people who are at high risk of experiencing unintended pregnancy and acquiring HIV infection. This study compares three surveys of young people aged 15-24 in Port-au-Prince, Haiti, in terms of their sociodemographic characteristics and sexual behaviors, and the surveys' usefulness for identifying young people at high risk and for program planning. The surveys consist of responses from a representative sample of young people in the 2005-06 Haiti Demographic and Health Survey (HDHS), a 2004 facility-based study, and a 2006-07 venue-based study that used the Priorities for Local AIDS Control Efforts (PLACE) method. The facility-based and PLACE studies included larger proportions of single, sexually experienced young people and people who knew someone with HIV/AIDS than did the HDHS. More respondents in the PLACE sample had multiple sex partners in the past year and received money or gifts in return for sex, compared with respondents in the facility study. At first and last sex, more PLACE respondents used contraceptives, including condoms. Experience of pregnancy was most commonly reported in the data from the facility-based sample; however, more ever-pregnant PLACE respondents than others reported ever having terminated a pregnancy. Program managers seeking to implement prevention activities should consider using facility- or venue-based methods to identify and understand the behaviors of young people at high risk.

  18. Complex, non-monotonic dose-response curves with multiple maxima: Do we (ever) sample densely enough?

    Science.gov (United States)

    Cvrčková, Fatima; Luštinec, Jiří; Žárský, Viktor

    2015-01-01

    We usually expect the dose-response curves of biological responses to quantifiable stimuli to be simple, either monotonic or exhibiting a single maximum or minimum. Deviations are often viewed as experimental noise. However, detailed measurements in plant primary tissue cultures (stem pith explants of kale and tobacco) exposed to varying doses of sucrose, cytokinins (BA or kinetin) or auxins (IAA or NAA) revealed that growth and several biochemical parameters exhibit multiple reproducible, statistically significant maxima over a wide range of exogenous substance concentrations. This results in complex, non-monotonic dose-response curves, reminiscent of previous reports of analogous observations in both metazoan and plant systems responding to diverse pharmacological treatments. These findings suggest the existence of a hitherto neglected class of biological phenomena resulting in dose-response curves exhibiting periodic patterns of maxima and minima, whose causes remain so far uncharacterized, partly due to insufficient sampling frequency used in many studies.
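
    The sampling-density point can be made concrete with a small numerical sketch. The bimodal log-dose response below is synthetic (not the kale/tobacco measurements): dense sampling resolves both maxima, while sparse subsampling of the same curve collapses them to a single apparent maximum.

```python
import numpy as np
from scipy.signal import argrelextrema

# Synthetic dose-response with two maxima over a log-dose range.
log_dose = np.linspace(-3, 1, 200)                     # dense sampling
response = (np.exp(-((log_dose + 2) / 0.4) ** 2)
            + 0.8 * np.exp(-((log_dose - 0) / 0.5) ** 2))
print("dense maxima:", len(argrelextrema(response, np.greater)[0]))   # -> 2

sparse_resp = response[::40]                           # every 40th dose only
print("sparse maxima:", len(argrelextrema(sparse_resp, np.greater)[0]))  # -> 1
```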

  19. Multiple drug resistance of Aeromonas hydrophila isolates from Chicken samples collected from Mhow and Indore city of Madhyapradesh

    Directory of Open Access Journals (Sweden)

    Kaskhedikar

    2009-02-01

    Full Text Available Fourteen antibacterial agents belonging to 9 different groups of antibiotics, viz. aminoglycosides, cephalosporins, nitrofurantoin, fluoroquinolones, chloramphenicol, sulphonamides, tetracyclines, penicillins and polymyxins, were used for in vitro sensitivity testing of Aeromonas hydrophila isolated from fifteen samples of chicken collected from retail shops in Mhow city. Full sensitivity (100%) was observed for ciprofloxacin, cefuroxime, ceftriaxone, cephotaxime, chloramphenicol, gentamicin, kanamycin, nitrofurantoin, nalidixic acid and ofloxacin, followed by oxytetracycline (50%). All the isolates were resistant to ampicillin and colistin; that is, none of the isolates was sensitive to the penicillin and polymyxin groups of antibiotics. Multiple drug resistance was also observed in all A. hydrophila isolates: 100% of isolates were resistant to two antimicrobial drugs and 50% to three drugs. [Vet. World 2009; 2(1): 31-32]

  20. New sampling electronics using CCD for DIOGENE: a high multiplicity, 4 π detector for relativistic heavy ions

    International Nuclear Information System (INIS)

    Babinet, R.P.

    1987-01-01

    DIOGENE is a small time projection chamber which has been developed to study central collisions of relativistic heavy ions. The maximum multiplicity (up to 40 charged particles) that can be accepted by this detector is limited by the present electronics. In view of the heavier-mass ions that should become readily available at the Saturne national facility (France), a new sampling electronics has been tested. The first part of this talk gives a brief description of the present detector, emphasizing the performance effectively obtained with α-particles and neon beams; the motivation for, and characteristics of, a renewed electronic set-up should thus appear more clearly. The second part of the talk is devoted to the results of tests performed using charge-coupled devices (CCDs). The talk concludes with the future perspectives opened by these developments

  1. Improved detection of multiple environmental antibiotics through an optimized sample extraction strategy in liquid chromatography-mass spectrometry analysis.

    Science.gov (United States)

    Yi, Xinzhu; Bayen, Stéphane; Kelly, Barry C; Li, Xu; Zhou, Zhi

    2015-12-01

    A solid-phase extraction/liquid chromatography/electrospray ionization/multi-stage mass spectrometry (SPE-LC-ESI-MS/MS) method was optimized in this study for sensitive and simultaneous detection of multiple antibiotics in urban surface waters and soils. Among the seven classes of tested antibiotics, extraction efficiencies of macrolides, lincosamide, chloramphenicol, and polyether antibiotics were significantly improved under optimized sample extraction pH. Instead of only using acidic extraction, as in many existing studies, the results indicated that antibiotics with low pKa values (<7) were extracted more efficiently under acidic conditions, while antibiotics with high pKa values (>7) were extracted more efficiently under neutral conditions. The effects of pH were more obvious on polar compounds than on non-polar compounds. Optimization of extraction pH resulted in significantly improved sample recovery and better detection limits. Compared with reported values in the literature, the average reduction of minimal detection limits obtained in this study was 87.6% in surface waters (0.06-2.28 ng/L) and 67.1% in soils (0.01-18.16 ng/g dry wt). This method was subsequently applied to detect antibiotics in environmental samples in a heavily populated urban city, and macrolides, sulfonamides, and lincomycin were frequently detected. The antibiotics with the highest detected concentrations were sulfamethazine (82.5 ng/L) in surface waters and erythromycin (6.6 ng/g dry wt) in soils. The optimized sample extraction strategy can be used to improve the detection of a variety of antibiotics in environmental surface waters and soils.

  2. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    Science.gov (United States)

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and VQT samples, respectively. Furthermore, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
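
    The Latin hypercube idea underlying cLHS/scLHS can be illustrated in a reduced form. The sketch below handles a single covariate (one NDVI image) with no spatial stratification, so it is an illustrative simplification rather than the published algorithm: one pixel is drawn from each equal-probability NDVI stratum so the sample reproduces the image's marginal distribution.

```python
import numpy as np

def clhs_1d(ndvi, n, seed=None):
    """Pick n pixels, one per equal-probability NDVI quantile stratum."""
    rng = np.random.default_rng(seed)
    flat = ndvi.ravel()
    edges = np.quantile(flat, np.linspace(0, 1, n + 1))   # n strata
    picks = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        idx = np.flatnonzero((flat >= lo) & (flat <= hi))
        picks.append(rng.choice(idx))                     # one pixel per stratum
    return np.array(picks)                                # flat pixel indices

ndvi = np.random.default_rng(0).random((100, 100))        # synthetic NDVI image
sample_idx = clhs_1d(ndvi, n=30, seed=1)
print(np.unravel_index(sample_idx, ndvi.shape))           # (row, col) locations
```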

  3. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  4. Marijuana use and sex with multiple partners among lesbian, gay and bisexual youth: results from a national sample.

    Science.gov (United States)

    Zhang, Xiaoyun; Wu, Li-Tzy

    2017-01-05

    Sex with multiple partners (SMP) is one of the important contributing factors for contracting sexually transmitted infections (STIs) among adolescents and young adults, especially among Lesbian, Gay, and Bisexual (LGB) youth. Past studies have mainly focused on examining associations of alcohol or club drug use with unprotected sexual behaviors among adult homo/bisexual men, while little is known about the temporal association between marijuana use (MU) and SMP among LGB youth. This study examined the relationship between MU and SMP among LGB adolescents and young adults. Generalized estimating equations (GEE) logistic regression analyses were utilized to analyze four waves' public-use Add Health data (N = 694, youth who reported a homo/bisexual status at any wave; Wave 1: aged 11-21; Wave 4: aged 24-32). After adjusting for other substance use, current depression, mother-child relationship quality at Wave 1, and socioeconomic variables, past-year MU was both concurrently and prospectively associated with past-year SMP. No moderating effect of age was found. MU is concurrently and prospectively associated with increased odds of SMP in the adolescent sample and in the young adult sample. Findings imply that prevention/intervention on HIV risk behaviors may benefit from MU reduction not only in LGB adolescents but also in young adults.

  5. Marijuana use and sex with multiple partners among lesbian, gay and bisexual youth: results from a national sample

    Directory of Open Access Journals (Sweden)

    Xiaoyun Zhang

    2017-01-01

    Full Text Available Abstract Background Sex with multiple partners (SMP) is one of the important contributing factors for contracting sexually transmitted infections (STIs) among adolescents and young adults, especially among Lesbian, Gay, and Bisexual (LGB) youth. Past studies have mainly focused on examining associations of alcohol or club drug use with unprotected sexual behaviors among adult homo/bisexual men, while little is known about the temporal association between marijuana use (MU) and SMP among LGB youth. Methods This study examined the relationship between MU and SMP among LGB adolescents and young adults. Generalized estimating equations (GEE) logistic regression analyses were utilized to analyze four waves' public-use Add Health data (N = 694, youth who reported a homo/bisexual status at any wave; Wave 1: aged 11–21; Wave 4: aged 24–32). Results After adjusting for other substance use, current depression, mother-child relationship quality at Wave 1, and socioeconomic variables, past-year MU was both concurrently and prospectively associated with past-year SMP. No moderating effect of age was found. Conclusion MU is concurrently and prospectively associated with increased odds of SMP in the adolescent sample and in the young adult sample. Findings imply that prevention/intervention on HIV risk behaviors may benefit from MU reduction not only in LGB adolescents but also in young adults.

  6. GENESIS 1.1: A hybrid-parallel molecular dynamics simulator with enhanced sampling algorithms on multiple computational platforms.

    Science.gov (United States)

    Kobayashi, Chigusa; Jung, Jaewoon; Matsunaga, Yasuhiro; Mori, Takaharu; Ando, Tadashi; Tamura, Koichi; Kamiya, Motoshi; Sugita, Yuji

    2017-09-30

    GENeralized-Ensemble SImulation System (GENESIS) is a software package for molecular dynamics (MD) simulation of biological systems. It is designed to overcome limitations in system size and accessible time scale by adopting highly parallelized schemes and enhanced conformational sampling algorithms. In this new version, GENESIS 1.1, new functions and advanced algorithms have been added. The all-atom and coarse-grained potential energy functions used in the AMBER and GROMACS packages are now available in addition to the CHARMM energy functions. The performance of MD simulations has been greatly improved by further optimization, multiple time-step integration, and hybrid (CPU + GPU) computing. The string method and replica-exchange umbrella sampling with flexible collective variable choice are used for finding the minimum free-energy pathway and obtaining free-energy profiles for conformational changes of a macromolecule. These new features increase the usefulness and power of GENESIS for modeling and simulation in biological research. © 2017 Wiley Periodicals, Inc.
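
    Of the enhanced-sampling families mentioned, temperature replica exchange has a particularly compact core: a Metropolis test on swapping configurations between two replicas run at different temperatures. The sketch below shows that textbook acceptance rule in generic form; it is not GENESIS source code, and the unit conventions are assumptions for illustration.

```python
import math
import random

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def exchange_accepted(E_i, T_i, E_j, T_j):
    """Metropolis test for swapping configurations between replicas i and j.

    Acceptance probability: min(1, exp[(beta_i - beta_j) * (E_i - E_j)]),
    which follows from detailed balance on the product ensemble.
    """
    beta_i, beta_j = 1.0 / (KB * T_i), 1.0 / (KB * T_j)
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or random.random() < math.exp(delta)

# Example: a low-energy configuration in the hot replica readily swaps down,
# letting cold replicas escape local minima over many attempts.
print(exchange_accepted(E_i=-1200.0, T_i=300.0, E_j=-1210.0, T_j=330.0))
```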

  7. Analysis of nitrosamines in water by automated SPE and isotope dilution GC/HRMS Occurrence in the different steps of a drinking water treatment plant, and in chlorinated samples from a reservoir and a sewage treatment plant effluent.

    Science.gov (United States)

    Planas, Carles; Palacios, Oscar; Ventura, Francesc; Rivera, Josep; Caixach, Josep

    2008-08-15

    A method based on automated solid-phase extraction (SPE) and isotope dilution gas chromatography/high resolution mass spectrometry (GC/HRMS) has been developed for the analysis of nine nitrosamines in water samples. The combination of automated SPE and GC/HRMS for the analysis of nitrosamines has not been reported previously. The method combines the selectivity and sensitivity of GC/HRMS analysis with the high efficiency of automated SPE on coconut charcoal EPA 521 cartridges. Low method detection limits (MDLs) were achieved, along with a simpler procedure and less dependence on the operator compared with methods based on manual SPE. Quality requirements for isotope dilution-based methods were met for most analysed nitrosamines with regard to trueness (80-120%) and method precision. Nineteen water samples (16 samples from a drinking water treatment plant (DWTP), 2 chlorinated samples from a sewage treatment plant (STP) effluent, and 1 chlorinated sample from a reservoir) were analysed. Concentrations of nitrosamines in the STP effluent were 309.4 and 730.2 ng/L, being higher when higher doses of chlorine were applied. N-Nitrosodimethylamine (NDMA) and N-nitrosodiethylamine (NDEA) were the main compounds identified in the STP effluent, and NDEA was detected above 200 ng/L, the regulatory level for NDMA in effluents in Ontario (Canada). Lower concentrations of nitrosamines were found in the reservoir (20.3 ng/L) and in the DWTP samples (n.d.-28.6 ng/L). NDMA and NDEA were respectively found in the reservoir and in treated and highly chlorinated DWTP samples at concentrations above 10 ng/L (the guide value established in different countries). The highest concentrations of nitrosamines were found after chlorination and ozonation processes (ozonated, treated and highly chlorinated water) in DWTP samples.
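
    The quantification principle behind isotope dilution is a simple ratio calculation: the analyte amount follows from the native-to-labelled peak-area ratio and the known spike of the isotope-labelled standard. A minimal sketch is given below; the relative response factor and all numbers are made up for illustration, not values from the study.

```python
def idms_conc_ng_per_L(area_native, area_labeled, spike_ng,
                       sample_volume_L, rrf=1.0):
    """Isotope-dilution concentration from peak areas and a known spike.

    rrf is the relative response factor (native vs. labelled) obtained
    from calibration; 1.0 assumes equal response, an illustrative default.
    """
    mass_ng = (area_native / area_labeled) * spike_ng / rrf
    return mass_ng / sample_volume_L

# e.g. 0.5 L of water spiked with 10 ng of a labelled internal standard:
print(idms_conc_ng_per_L(area_native=8.2e4, area_labeled=5.1e4,
                         spike_ng=10.0, sample_volume_L=0.5))  # ~32 ng/L
```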

  8. Less is more? Assessing the validity of the ICD-11 model of PTSD across multiple trauma samples

    Science.gov (United States)

    Hansen, Maj; Hyland, Philip; Armour, Cherie; Shevlin, Mark; Elklit, Ask

    2015-01-01

    Background: In the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the symptom profile of posttraumatic stress disorder (PTSD) was expanded to include 20 symptoms. An alternative model of PTSD is outlined in the proposed 11th edition of the International Classification of Diseases (ICD-11) that includes just six symptoms. Objectives and method: The objectives of the current study are: 1) to independently investigate the fit of the ICD-11 model of PTSD, and three DSM-5-based models of PTSD, across seven different trauma samples (N=3,746) using confirmatory factor analysis; 2) to assess the concurrent validity of the ICD-11 model of PTSD; and 3) to determine if there are significant differences in diagnostic rates between the ICD-11 guidelines and the DSM-5 criteria. Results: The ICD-11 model of PTSD was found to provide excellent model fit in six of the seven trauma samples, and tests of factorial invariance showed that the model performs equally well for males and females. DSM-5 models provided poor fit of the data. Concurrent validity was established as the ICD-11 PTSD factors were all moderately to strongly correlated with scores of depression, anxiety, dissociation, and aggression. Levels of association were similar for ICD-11 and DSM-5 suggesting that explanatory power is not affected due to the limited number of items included in the ICD-11 model. Diagnostic rates were significantly lower according to ICD-11 guidelines compared to the DSM-5 criteria. Conclusions: The proposed factor structure of the ICD-11 model of PTSD appears valid across multiple trauma types, possesses good concurrent validity, and is more stringent in terms of diagnosis compared to the DSM-5 criteria. PMID:26450830

  9. Isotopic analysis of calcium in blood plasma and bone from mouse samples by multiple collector-ICP-mass spectrometry

    International Nuclear Information System (INIS)

    Hirata, Takafumi; Tanoshima, Mina; Suga, Akinobu; Tanaka, Yu-ki; Nagata, Yuichi; Shinohara, Atsuko; Chiba, Momoko

    2008-01-01

    The biological processing of Ca produces significant stable isotope fractionation. The level of isotopic fractionation can provide key information about variation in dietary consumption or Ca metabolism. To investigate this, we measured the 43Ca/42Ca and 44Ca/42Ca ratios of bone and blood plasma samples collected from mice of various ages using multiple collector-ICP-mass spectrometry (MC-ICP-MS). The 44Ca/42Ca ratio in bones was significantly (0.44-0.84 per mille) lower than the corresponding ratio in the diet, suggesting that Ca was isotopically fractionated during Ca metabolism for bone formation. The resulting 44Ca/42Ca ratios for blood plasma showed almost identical, or slightly higher, values (0.03-0.2 per mille) than found in the corresponding diet. This indicates that a significant amount of Ca in the blood plasma was of dietary origin. Unlike the case of Fe, there were no significant differences in the measured 44Ca/42Ca ratios between female and male specimens (for either bone or blood plasma samples). Similarly, the 44Ca/42Ca ratios suggest that there were no significant differences in Ca dietary consumption or Ca metabolism between female and male specimens. In contrast, the 44Ca/42Ca ratios of blood plasma from mother mice during the lactation period were significantly higher than those of all other adult specimens. This suggests that Ca supplied to infants through lactation was isotopically lighter, and the preferential supply of isotopically lighter Ca resulted in isotopically heavier Ca in the blood plasma of mother mice during the lactation period. The data obtained here clearly demonstrate that the Ca isotopic ratio has the potential to become a new tool for evaluating changes in the dietary consumption or Ca metabolism of animals. (author)
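
    The per-mille values quoted above follow the usual delta convention, in which a sample's 44Ca/42Ca ratio is expressed relative to a reference (here the diet). A minimal sketch of that arithmetic, with illustrative ratios rather than measured values from the study:

```python
def delta_per_mille(r_sample, r_reference):
    """Delta notation: relative deviation of an isotope ratio, in per mille."""
    return (r_sample / r_reference - 1.0) * 1000.0

# Illustrative 44Ca/42Ca ratios: a bone ratio 0.44 per mille below the diet.
print(delta_per_mille(r_sample=3.1486, r_reference=3.1500))  # ~ -0.44
```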

  10. Less is more? Assessing the validity of the ICD-11 model of PTSD across multiple trauma samples

    Directory of Open Access Journals (Sweden)

    Maj Hansen

    2015-10-01

    Full Text Available Background: In the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the symptom profile of posttraumatic stress disorder (PTSD) was expanded to include 20 symptoms. An alternative model of PTSD is outlined in the proposed 11th edition of the International Classification of Diseases (ICD-11) that includes just six symptoms. Objectives and method: The objectives of the current study are: 1) to independently investigate the fit of the ICD-11 model of PTSD, and three DSM-5-based models of PTSD, across seven different trauma samples (N=3,746) using confirmatory factor analysis; 2) to assess the concurrent validity of the ICD-11 model of PTSD; and 3) to determine if there are significant differences in diagnostic rates between the ICD-11 guidelines and the DSM-5 criteria. Results: The ICD-11 model of PTSD was found to provide excellent model fit in six of the seven trauma samples, and tests of factorial invariance showed that the model performs equally well for males and females. DSM-5 models provided poor fit of the data. Concurrent validity was established as the ICD-11 PTSD factors were all moderately to strongly correlated with scores of depression, anxiety, dissociation, and aggression. Levels of association were similar for ICD-11 and DSM-5 suggesting that explanatory power is not affected due to the limited number of items included in the ICD-11 model. Diagnostic rates were significantly lower according to ICD-11 guidelines compared to the DSM-5 criteria. Conclusions: The proposed factor structure of the ICD-11 model of PTSD appears valid across multiple trauma types, possesses good concurrent validity, and is more stringent in terms of diagnosis compared to the DSM-5 criteria.

  11. Less is more? Assessing the validity of the ICD-11 model of PTSD across multiple trauma samples.

    Science.gov (United States)

    Hansen, Maj; Hyland, Philip; Armour, Cherie; Shevlin, Mark; Elklit, Ask

    2015-01-01

    In the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the symptom profile of posttraumatic stress disorder (PTSD) was expanded to include 20 symptoms. An alternative model of PTSD is outlined in the proposed 11th edition of the International Classification of Diseases (ICD-11) that includes just six symptoms. The objectives of the current study are: 1) to independently investigate the fit of the ICD-11 model of PTSD, and three DSM-5-based models of PTSD, across seven different trauma samples (N=3,746) using confirmatory factor analysis; 2) to assess the concurrent validity of the ICD-11 model of PTSD; and 3) to determine if there are significant differences in diagnostic rates between the ICD-11 guidelines and the DSM-5 criteria. The ICD-11 model of PTSD was found to provide excellent model fit in six of the seven trauma samples, and tests of factorial invariance showed that the model performs equally well for males and females. DSM-5 models provided poor fit of the data. Concurrent validity was established as the ICD-11 PTSD factors were all moderately to strongly correlated with scores of depression, anxiety, dissociation, and aggression. Levels of association were similar for ICD-11 and DSM-5 suggesting that explanatory power is not affected due to the limited number of items included in the ICD-11 model. Diagnostic rates were significantly lower according to ICD-11 guidelines compared to the DSM-5 criteria. The proposed factor structure of the ICD-11 model of PTSD appears valid across multiple trauma types, possesses good concurrent validity, and is more stringent in terms of diagnosis compared to the DSM-5 criteria.

  12. A Centrifugal Microfluidic Platform That Separates Whole Blood Samples into Multiple Removable Fractions Due to Several Discrete but Continuous Density Gradient Sections

    Science.gov (United States)

    Moen, Scott T.; Hatcher, Christopher L.; Singh, Anup K.

    2016-01-01

    We present a miniaturized centrifugal platform that uses density centrifugation for separation and analysis of biological components in small volume samples (~5 μL). We demonstrate the ability to enrich leukocytes for on-disk visualization via microscopy, as well as recovery of viable cells from each of the gradient partitions. In addition, we simplified the traditional Modified Wright-Giemsa staining by decreasing the time, volume, and expertise involved in the procedure. From a whole blood sample, we were able to extract 95.15% of leukocytes while excluding 99.8% of red blood cells. This platform has great potential in both medical diagnostics and research applications as it offers a simpler, automated, and inexpensive method for biological sample separation, analysis, and downstream culturing. PMID:27054764

  13. COMPARISON OF IMMUNOASSAY AND GAS CHROMATOGRAPHY/MASS SPECTROMETRY METHODS FOR MEASURING 3,5,6-TRICHLORO-2-PYRIDINOL IN MULTIPLE SAMPLE MEDIA

    Science.gov (United States)

    Two enzyme-linked immunosorbent assay (ELISA) methods were evaluated for the determination of 3,5,6-trichloro-2-pyridinol (3,5,6-TCP) in multiple sample media (dust, soil, food, and urine). The dust and soil samples were analyzed by a commercial RaPID immunoassay testing kit. ...

  14. [Correlation between demyelinating lesions and executive function decline in a sample of Mexican patients with multiple sclerosis].

    Science.gov (United States)

    Aldrete Cortez, V R; Duriez-Sotelo, E; Carrillo-Mora, P; Pérez-Zuno, J A

    2013-09-01

    Multiple Sclerosis (MS) is characterised by several neurological symptoms including cognitive impairment, which has recently been the subject of considerable study. At present, evidence pointing to a correlation between lesion characteristics and specific cognitive impairment is not conclusive. To investigate the presence of a correlation between the characteristics of demyelinating lesions and performance of basic executive functions in a sample of MS patients. We included 21 adult patients with scores of 0 to 5 on the Kurtzke scale and no exacerbations of the disease in at least 3 months prior to the evaluation date. They completed the Stroop test and the Wisconsin Card Sorting Test (WCST). The location of the lesions was determined using magnetic resonance imaging (MRI) performed by a blinded expert in neuroimaging. Demyelinating lesions were more frequently located in the frontal and occipital lobes. The Stroop test showed that as cognitive demand increased on each of the sections in the test, reaction time and number of errors increased. On the WCST, 33.33% of patients registered as having moderate cognitive impairment. No correlation could be found between demyelinating lesion characteristics (location, size, and number) and patients' scores on the tests. Explanations of the causes of cognitive impairment in MS should examine a variety of biological, psychological, and social factors instead of focusing solely on demyelinating lesions. Copyright © 2012 Sociedad Española de Neurología. Published by Elsevier Espana. All rights reserved.

  15. Multiple-image authentication with a cascaded multilevel architecture based on amplitude field random sampling and phase information multiplexing.

    Science.gov (United States)

    Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Pan, Xuemei; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2015-04-10

    A multiple-image authentication method with a cascaded multilevel architecture in the Fresnel domain is proposed, in which a synthetic encoded complex amplitude is first fabricated, and its real amplitude component is generated by iterative amplitude encoding, random sampling, and space multiplexing for the low-level certification images, while the phase component of the synthetic encoded complex amplitude is constructed by iterative phase information encoding and multiplexing for the high-level certification images. Then the synthetic encoded complex amplitude is iteratively encoded into two phase-type ciphertexts located in two different transform planes. During high-level authentication, when the two phase-type ciphertexts and the high-level decryption key are presented to the system and then the Fresnel transform is carried out, a meaningful image with good quality and a high correlation coefficient with the original certification image can be recovered in the output plane. Similar to the procedure of high-level authentication, in the case of low-level authentication with the aid of a low-level decryption key, no significant or meaningful information is retrieved, but it can result in a remarkable peak output in the nonlinear correlation coefficient of the output image and the corresponding original certification image. Therefore, the method realizes different levels of accessibility to the original certification image for different authority levels with the same cascaded multilevel architecture.
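
    The low-level verification step described above relies on a nonlinear (kth-law) correlation between the decoded output and the certification image: a sharp, dominant peak in the correlation map indicates authenticity even when the output itself looks like noise. The sketch below implements a generic kth-law correlation; the strength parameter k = 0.3 and the synthetic images are assumptions for illustration, not values from the paper.

```python
import numpy as np

def nonlinear_correlation(decoded, reference, k=0.3):
    """kth-law nonlinear correlation map of two images."""
    spectrum = np.fft.fft2(decoded) * np.conj(np.fft.fft2(reference))
    weighted = (np.abs(spectrum) + 1e-12) ** (k - 1) * spectrum  # kth-law weighting
    return np.abs(np.fft.ifft2(weighted)) ** 2

rng = np.random.default_rng(0)
cert = rng.random((64, 64))                      # certification image
noisy = cert + 0.1 * rng.random((64, 64))        # imperfect low-level output
peak_map = nonlinear_correlation(noisy, cert)
print("peak-to-mean ratio:", peak_map.max() / peak_map.mean())
```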

  16. Automating ActionScript Projects with Eclipse and Ant

    CERN Document Server

    Koning, Sidney

    2011-01-01

    Automating repetitive programming tasks is easier than many Flash/AS3 developers think. With the Ant build tool, the Eclipse IDE, and this concise guide, you can set up your own "ultimate development machine" to code, compile, debug, and deploy projects faster. You'll also get started with versioning systems, such as Subversion and Git. Create a consistent workflow for multiple machines, or even complete departments, with the help of extensive Ant code samples. If you want to work smarter and take your skills to a new level, this book will get you on the road to automation-with Ant. Set up y

  17. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO... ...handler, leading to a reduction of manual work and increased quality and throughput.

  18. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka : Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    NARCIS (Netherlands)

    Kottawatta, Kottawattage S A; van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-01-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on

  19. Clinically significant fatigue: prevalence and associated factors in an international sample of adults with multiple sclerosis recruited via the internet.

    Directory of Open Access Journals (Sweden)

    Tracey J Weiland

    Full Text Available Fatigue contributes a significant burden of disease for people with multiple sclerosis (PwMS. Modifiable lifestyle factors have been recognized as having a role in a range of morbidity outcomes in PwMS. There is significant potential to prevent and treat fatigue in PwMS by addressing modifiable risk factors.To explore the associations between clinically significant fatigue and demographic factors, clinical factors (health-related quality of life, disability and relapse rate and modifiable lifestyle, disease-modifying drugs (DMD and supplement use in a large international sample of PwMS.PwMS were recruited to the study via Web 2.0 platforms and completed a comprehensive survey measuring demographic, lifestyle and clinical characteristics, including health-related quality of life, disability, and relapse rate.Of 2469 participants with confirmed MS, 2138 (86.6% completed a validated measure of clinically significant fatigue, the Fatigue Severity Scale. Participants were predominantly female from English speaking countries, with relatively high levels of education, and due to recruitment methods may have been highly pro-active about engaging in lifestyle management and self-help. Approximately two thirds of our sample (1402/2138; 65.6% (95% CI 63.7-67.7 screened positive for clinically significant fatigue. Bivariate associations were present between clinically significant fatigue and several demographic, clinical, lifestyle, and medication variables. After controlling for level of disability and a range of stable socio-demographic variables, we found increased odds of fatigue associated with obesity, DMD use, poor diet, and reduced odds of fatigue with exercise, fish consumption, moderate alcohol use, and supplementation with vitamin D and flaxseed oil.This study supports strong and significant associations between clinically significant fatigue and modifiable lifestyle factors. Longitudinal follow-up of this sample may help clarify the contribution

  20. Direct sampling during multiple sediment density flows reveals dynamic sediment transport and depositional environment in Monterey submarine canyon

    Science.gov (United States)

    Maier, K. L.; Gales, J. A.; Paull, C. K.; Gwiazda, R.; Rosenberger, K. J.; McGann, M.; Lundsten, E. M.; Anderson, K.; Talling, P.; Xu, J.; Parsons, D. R.; Barry, J.; Simmons, S.; Clare, M. A.; Carvajal, C.; Wolfson-Schwehr, M.; Sumner, E.; Cartigny, M.

    2017-12-01

    Sediment density flows were directly sampled with a coupled sediment trap-ADCP-instrument mooring array to evaluate the character and frequency of turbidity current events through Monterey Canyon, offshore California. This novel experiment aimed to provide links between globally significant sediment density flow processes and their resulting deposits. Eight to ten Anderson sediment traps were repeatedly deployed at 10 to 300 meters above the seafloor on six moorings anchored at 290 to 1850 meters water depth in the Monterey Canyon axial channel during 6-month deployments (October 2015 - April 2017). Anderson sediment traps include a funnel and intervalometer (discs released at set time intervals) above a meter-long tube, which preserves fine-scale stratigraphy and chronology. Photographs, multi-sensor logs, CT scans, and grain size analyses reveal layers from multiple sediment density flow events that carried sediment ranging from fine sand to granules. More sediment accumulation from sediment density flows, and from between flows, occurred in the upper canyon (~300-800 m water depth) compared to the lower canyon (~1300-1850 m water depth). Sediment accumulated in the traps during sediment density flows is sandy and becomes finer down-canyon. In the lower canyon, where sediments directly sampled from density flows are clearly distinguished within the trap tubes, sands have sharp basal contacts, normal grading, and muddy tops that exhibit late-stage pulses. In at least two of the sediment density flows, the simultaneous low velocity and high backscatter measured by the ADCPs suggest that the trap only captured the collapsing end of a sediment density flow event. In the upper canyon, accumulation between sediment density flow events is twice as fast compared to the lower canyon; it is characterized by sub-cm-scale layers in muddy sediment that appear to have accumulated with daily to sub-daily frequency, likely related to known internal tidal dynamics also measured

  1. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  2. A spreadsheet template compatible with Microsoft Excel and iWork Numbers that returns the simultaneous confidence intervals for all pairwise differences between multiple sample means.

    Science.gov (United States)

    Brown, Angus M

    2010-04-01

    The objective of the method described in this paper is to develop a spreadsheet template for the purpose of comparing multiple sample means. An initial analysis of variance (ANOVA) test on the data returns F, the test statistic. If F is larger than the critical F value drawn from the F distribution at the appropriate degrees of freedom, convention dictates rejection of the null hypothesis and allows subsequent multiple comparison testing to determine where the inequalities between the sample means lie. A variety of multiple comparison methods are described that return the 95% confidence intervals for differences between means, using an inclusive pairwise comparison of the sample means. 2009 Elsevier Ireland Ltd. All rights reserved.
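
    The same ANOVA-then-pairwise-intervals workflow that the spreadsheet template implements can be reproduced programmatically. A minimal sketch using SciPy (Tukey HSD requires SciPy >= 1.8; the three groups below are made-up data, not from the paper):

```python
import numpy as np
from scipy import stats

groups = [np.array([4.1, 4.5, 4.3, 4.8]),
          np.array([5.2, 5.0, 5.6, 5.4]),
          np.array([4.9, 5.1, 4.7, 5.0])]

# Step 1: one-way ANOVA returns F, the test statistic, and its p-value.
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Step 2: if the null is rejected, Tukey HSD gives simultaneous 95% CIs
# for all pairwise differences between the sample means.
if p_value < 0.05:
    res = stats.tukey_hsd(*groups)
    ci = res.confidence_interval(confidence_level=0.95)
    print(ci.low)    # lower CI bounds for each pair (i, j)
    print(ci.high)   # upper CI bounds; a pair differs if its CI excludes 0
```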

  3. MZDASoft: a software architecture that enables large-scale comparison of protein expression levels over multiple samples based on liquid chromatography/tandem mass spectrometry.

    Science.gov (United States)

    Ghanat Bari, Mehrab; Ramirez, Nelson; Wang, Zhiwei; Zhang, Jianqiu Michelle

    2015-10-15

    Without accurate peak linking/alignment, only the expression levels of a small percentage of proteins can be compared across multiple samples in Liquid Chromatography/Mass Spectrometry/Tandem Mass Spectrometry (LC/MS/MS), due to the selective nature of tandem MS peptide identification. This greatly hampers biomedical research that aims at finding biomarkers for disease diagnosis and treatment and at understanding disease mechanisms. A recent algorithm, PeakLink, has allowed the accurate linking of LC/MS peaks without tandem MS identifications to their corresponding peaks with identifications across multiple samples collected from different instruments, tissues and labs, which greatly enhances the ability to compare proteins. However, PeakLink cannot be implemented practically for large numbers of samples based on existing software architectures, because it requires access to peak elution profiles from multiple LC/MS/MS samples simultaneously. We propose a new architecture based on parallel processing, which extracts LC/MS peak features and saves them in database files, enabling the implementation of PeakLink for multiple samples. The software has been deployed in High-Performance Computing (HPC) environments. The core part of the software, MZDASoft Parallel Peak Extractor (PPE), can be downloaded with a user and developer's guide, and it can be run on HPC centers directly. The quantification applications, MZDASoft TandemQuant and MZDASoft PeakLink, are written in Matlab and compiled with a Matlab runtime compiler. A sample script that incorporates all necessary processing steps of MZDASoft for LC/MS/MS quantification in a parallel processing environment is available. The project webpage is http://compgenomics.utsa.edu/zgroup/MZDASoft. The proposed architecture enables the implementation of PeakLink for multiple samples. Significantly more (100%-500%) proteins can be compared over multiple samples with better quantification accuracy in test cases. MZDASoft
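
    The architectural idea, extracting features per sample in parallel and persisting them to database files so a later linking step never needs all raw runs in memory, can be sketched as below. This is a generic illustration, not MZDASoft code: extract_features is a hypothetical stand-in for the real peak extractor, and the file names are invented.

```python
from concurrent.futures import ProcessPoolExecutor
import sqlite3

def extract_features(sample_path):
    # Placeholder for the real extractor: would parse the raw LC/MS file
    # and return (mz, retention_time, intensity) peak features.
    return [(445.12, 31.8, 2.4e6), (512.30, 40.2, 9.1e5)]

def process_sample(sample_path):
    # One database file per sample; the linking step later reads these
    # instead of holding every run's elution profiles simultaneously.
    db_path = sample_path + ".features.sqlite"
    with sqlite3.connect(db_path) as db:
        db.execute("CREATE TABLE IF NOT EXISTS peaks (mz REAL, rt REAL, intensity REAL)")
        db.executemany("INSERT INTO peaks VALUES (?, ?, ?)",
                       extract_features(sample_path))
    return db_path

if __name__ == "__main__":
    samples = ["run01.mzML", "run02.mzML"]      # hypothetical file names
    with ProcessPoolExecutor() as pool:          # one worker per sample
        print(list(pool.map(process_sample, samples)))
```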

  4. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on a Distribution Automation (DA) system, which enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  5. Automated gamma counters

    International Nuclear Information System (INIS)

    Regener, M.

    1977-01-01

    This is a report on the most recent developments in the full automation of gamma counting in RIA, in particular by Messrs. Kontron. The development targets were flexibility in sample capacity and test tube shape, the possibility of using different radioisotopes for labelling thanks to an optimized detector system, and the use of microprocessors to substitute software for hardware. (ORU)

  6. Multiple Continental Radiations and Correlates of Diversification in Lupinus (Leguminosae): Testing for Key Innovation with Incomplete Taxon Sampling

    Science.gov (United States)

    Drummond, Christopher S.; Eastwood, Ruth J.; Miotto, Silvia T. S.; Hughes, Colin E.

    2012-01-01

    Replicate radiations provide powerful comparative systems to address questions about the interplay between opportunity and innovation in driving episodes of diversification and the factors limiting their subsequent progression. However, such systems have been rarely documented at intercontinental scales. Here, we evaluate the hypothesis of multiple radiations in the genus Lupinus (Leguminosae), which exhibits some of the highest known rates of net diversification in plants. Given that incomplete taxon sampling, background extinction, and lineage-specific variation in diversification rates can confound macroevolutionary inferences regarding the timing and mechanisms of cladogenesis, we used Bayesian relaxed clock phylogenetic analyses as well as MEDUSA and BiSSE birth–death likelihood models of diversification, to evaluate the evolutionary patterns of lineage accumulation in Lupinus. We identified 3 significant shifts to increased rates of net diversification (r) relative to background levels in the genus (r = 0.18–0.48 lineages/myr). The primary shift occurred approximately 4.6 Ma (r = 0.48–1.76) in the montane regions of western North America, followed by a secondary shift approximately 2.7 Ma (r = 0.89–3.33) associated with range expansion and diversification of allopatrically distributed sister clades in the Mexican highlands and Andes. We also recovered evidence for a third independent shift approximately 6.5 Ma at the base of a lower elevation eastern South American grassland and campo rupestre clade (r = 0.36–1.33). Bayesian ancestral state reconstructions and BiSSE likelihood analyses of correlated diversification indicated that increased rates of speciation are strongly associated with the derived evolution of perennial life history and invasion of montane ecosystems. Although we currently lack hard evidence for “replicate adaptive radiations” in the sense of convergent morphological and ecological trajectories among species in different

  7. A semi-automated magnetic capture probe based DNA extraction and real-time PCR method applied in the Swedish surveillance of Echinococcus multilocularis in red fox (Vulpes vulpes) faecal samples.

    Science.gov (United States)

    Isaksson, Mats; Hagström, Åsa; Armua-Fernandez, Maria Teresa; Wahlström, Helene; Ågren, Erik Olof; Miller, Andrea; Holmberg, Anders; Lukacs, Morten; Casulli, Adriano; Deplazes, Peter; Juremalm, Mikael

    2014-12-19

    Following the first finding of Echinococcus multilocularis in Sweden in 2011, 2985 red foxes (Vulpes vulpes) were analysed by the segmental sedimentation and counting technique. This is a labour-intensive method and requires handling of the whole fox carcass, resulting in a costly analysis. In an effort to reduce the cost of labour and sample handling, an alternative method has been developed. The method is sensitive and partially automated for detection of E. multilocularis in faecal samples. The method has been used in the Swedish E. multilocularis monitoring program for 2012-2013 on more than 2000 faecal samples. We describe a new semi-automated magnetic capture probe DNA extraction method and real-time hydrolysis probe polymerase chain reaction assay (MC-PCR) for the detection of E. multilocularis DNA in faecal samples from red fox. The diagnostic sensitivity was determined by validating the new method against the sedimentation and counting technique in fox samples collected in Switzerland, where E. multilocularis is highly endemic. Of 177 foxes analysed by the sedimentation and counting technique, E. multilocularis was detected in 93 animals. Eighty-two (88%, 95% C.I. 79.8-93.9) of these were positive in the MC-PCR. In foxes with more than 100 worms, the MC-PCR was positive in 44 out of 46 (95.7%) cases. The two MC-PCR-negative samples originated from foxes with only immature E. multilocularis worms. In foxes with 100 worms or less (n = 47), 38 (80.9%) were positive in the MC-PCR. The diagnostic specificity of the MC-PCR was evaluated using fox scats collected within the Swedish screening. Of 2158 samples analysed, two were positive. This implies that the specificity is at least 99.9% (C.I. = 99.7-100). The MC-PCR proved to have a high sensitivity and a very high specificity. The test is partially automated but also possible to perform manually if desired. The test is well suited for nationwide E. multilocularis surveillance programs where sampling
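
    The headline accuracy figures are simple binomial proportions with score intervals. The sketch below uses a Wilson score interval, which reproduces the 79.8-93.9% interval quoted above for 82/93; the paper's exact CI method is not stated, so Wilson is an assumption.

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    center = (p + z * z / (2 * n)) / (1 + z * z / n)
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / (1 + z * z / n)
    return center - half, center + half

lo, hi = wilson_ci(82, 93)          # sensitivity: 82 of 93 positives detected
print(f"sensitivity = {82/93:.1%} (95% CI {lo:.1%}-{hi:.1%})")

lo, hi = wilson_ci(2156, 2158)      # specificity: 2 positives in 2158 field scats
print(f"specificity = {2156/2158:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```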

  8. A Chip-Capillary Hybrid Device for Automated Transfer of Sample Pre-Separated by Capillary Isoelectric Focusing to Parallel Capillary Gel Electrophoresis for Two-Dimensional Protein Separation

    Science.gov (United States)

    Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong

    2012-01-01

    In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate – polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584

  9. Smart management of sample dilution using an artificial neural network to achieve streamlined processes and saving resources: the automated nephelometric testing of serum free light chain as case study.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Saving resources is a paramount issue for the modern laboratory, and new trainable, smart technologies can allow automated instrumentation to manage samples more efficiently and achieve streamlined processes. In this regard, serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires a number of assays before an acceptable result within the analytical range is achieved. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) wasted tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although it was restricted to follow-up samples, the MLP-ANN showed good predictive performance, which, alongside the possibility of implementing it in any automated system, makes it a suitable solution for achieving streamlined laboratory processes and saving resources.
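
    The abstract does not specify the network architecture or feature set, but the general approach can be sketched with scikit-learn: train a multi-layer perceptron on LIS-derived features (hypothetical here) to predict the starting dilution class of a follow-up sample.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)

      # Hypothetical LIS features per follow-up sample: last sFLC result,
      # last dilution used, days since the previous test (all scaled 0-1).
      X = rng.random((500, 3))
      y = (X[:, 0] * 4).astype(int)  # toy rule standing in for real history

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))

    Each correctly predicted starting dilution saves the intermediate assays that a naïve stepwise scheme would have consumed.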

  10. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information, and those generated by automated processes. We categorize these features into two classes, either an interpretation of the physical model of human interactions, or as behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
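
    As a toy sketch of the binary human-vs-automated step (with invented per-session features; the paper's feature set is far richer), any standard classifier can separate sessions once physical-model and behavioral features are computed:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n = 2000  # synthetic sessions: first half human, second half automated

      query_rate = np.concatenate([rng.gamma(2, 1, n // 2), rng.gamma(20, 1, n // 2)])
      click_rate = np.concatenate([rng.beta(5, 5, n // 2), rng.beta(1, 20, n // 2)])
      night_frac = np.concatenate([rng.beta(2, 6, n // 2), rng.beta(6, 2, n // 2)])

      X = np.column_stack([query_rate, click_rate, night_frac])
      y = np.repeat([0, 1], n // 2)  # 0 = human, 1 = automated

      scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
      print("cross-validated accuracy:", scores.mean())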

  11. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents the analysis of modern tools for automated testing of various web-based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution when compared to manual testing and overall cost reduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  12. A comparative proteomics method for multiple samples based on an 18O-reference strategy and a quantitation and identification-decoupled strategy.

    Science.gov (United States)

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy with a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy has greater accuracy and reliability than previously used comparison methods based on transferring comparison or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated to identify differentially expressed proteins according to retention time and accurate mass. This strategy made protein identification possible for all samples using a single pooled sample, therefore giving good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
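
    The decoupled matching step can be sketched as a tolerance join between LC-MS quantification features and LC-MS/MS identifications; the tolerances below are illustrative assumptions, not the paper's settings:

      def match_features(quant_features, identifications, rt_tol_min=0.5, ppm_tol=10.0):
          """Pair (rt, m/z, intensity) quantification features with
          (rt, m/z, peptide) identifications by retention time and accurate mass."""
          matches = []
          for rt_q, mz_q, intensity in quant_features:
              for rt_i, mz_i, peptide in identifications:
                  if (abs(rt_q - rt_i) <= rt_tol_min
                          and abs(mz_q - mz_i) / mz_i * 1e6 <= ppm_tol):
                      matches.append((peptide, rt_q, mz_q, intensity))
                      break
          return matches

      quant = [(12.3, 523.774, 1.8e6), (33.1, 871.402, 4.2e5)]
      ids = [(12.4, 523.776, "LVNELTEFAK"), (54.0, 653.361, "YLYEIAR")]
      print(match_features(quant, ids))  # only the first feature finds an ID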

  13. Interlaboratory study of DNA extraction from multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR for individual kernel detection system of genetically modified maize.

    Science.gov (United States)

    Akiyama, Hiroshi; Sakata, Kozue; Makiyma, Daiki; Nakamura, Kosuke; Teshima, Reiko; Nakashima, Akie; Ogawa, Asako; Yamagishi, Toru; Futo, Satoshi; Oguchi, Taichi; Mano, Junichi; Kitta, Kazumi

    2011-01-01

    In many countries, the labeling of grains, feed, and foodstuff is mandatory if the genetically modified (GM) organism content exceeds a certain level of approved GM varieties. We previously developed an individual kernel detection system consisting of grinding individual kernels, DNA extraction from the individually ground kernels, GM detection using multiplex real-time PCR, and GM event detection using multiplex qualitative PCR to analyze the precise commingling level and varieties of GM maize in real sample grains. We performed the interlaboratory study of the DNA extraction with multiple ground samples, multiplex real-time PCR detection, and multiplex qualitative PCR detection to evaluate its applicability, practicality, and ruggedness for the individual kernel detection system of GM maize. DNA extraction with multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR were evaluated by five laboratories in Japan, and all results from these laboratories were consistent with the expected results in terms of the commingling level and event analysis. Thus, the DNA extraction with multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR for the individual kernel detection system is applicable and practicable in a laboratory to regulate the commingling level of GM maize grain for GM samples, including stacked GM maize.

  14. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and for monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  15. Auxiliary variables in multiple imputation in regression with missing X: a warning against including too many in small sample research

    Directory of Open Access Journals (Sweden)

    Hardt Jochen

    2012-12-01

    Full Text Available Abstract. Background: Multiple imputation is becoming increasingly popular. Theoretical considerations as well as simulation studies have shown that the inclusion of auxiliary variables is generally of benefit. Methods: A simulation study of a linear regression with a response Y and two predictors X1 and X2 was performed on data with n = 50, 100 and 200 using complete cases or multiple imputation with 0, 10, 20, 40 and 80 auxiliary variables. Mechanisms of missingness were either 100% MCAR or 50% MAR + 50% MCAR. Auxiliary variables had low (r = .10) vs. moderate (r = .50) correlations with the X's and Y. Results: The inclusion of auxiliary variables can improve a multiple imputation model. However, inclusion of too many variables leads to downward bias of regression coefficients and decreases precision. When the correlations are low, inclusion of auxiliary variables is not useful. Conclusion: More research on auxiliary variables in multiple imputation should be performed. A preliminary rule of thumb could be that the ratio of variables to cases with complete data should not go below 1:3.
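
    The design of such a simulation is easy to reproduce in outline; a minimal sketch using scikit-learn (single stochastic imputation shown for brevity; proper multiple imputation would repeat the fit with sample_posterior=True and pool the estimates):

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(42)
      n = 100

      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      y = 0.5 * x1 + 0.5 * x2 + rng.normal(scale=0.5, size=n)
      aux1 = x1 + rng.normal(scale=1.7, size=n)  # r ~ .5 with X1
      aux2 = y + rng.normal(scale=1.7, size=n)   # roughly r ~ .5 with Y

      x1_missing = x1.copy()
      x1_missing[rng.random(n) < 0.3] = np.nan   # 30% MCAR missingness in X1

      data = np.column_stack([x1_missing, x2, y, aux1, aux2])
      completed = IterativeImputer(random_state=0).fit_transform(data)

      fit = LinearRegression().fit(completed[:, :2], completed[:, 2])
      print("estimated coefficients:", fit.coef_)  # true values are 0.5, 0.5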

  16. PTSD and Comorbid Disorders in a Representative Sample of Adolescents: The Risk Associated with Multiple Exposures to Potentially Traumatic Events

    Science.gov (United States)

    Macdonald, Alexandra; Danielson, Carla Kmett; Resnick, Heidi S.; Saunders, Benjamin E.; Kilpatrick, Dean G.

    2010-01-01

    Objective: This study compared the impact of multiple exposures to potentially traumatic events (PTEs), including sexual victimization, physical victimization, and witnessed violence, on posttraumatic stress disorder (PTSD) and comorbid conditions (i.e., major depressive episode [MDE], and substance use [SUD]). Methods: Participants were a…

  17. Multi-Locus Next-Generation Sequence Typing of DNA Extracted From Pooled Colonies Detects Multiple Unrelated Candida albicans Strains in a Significant Proportion of Patient Samples

    Directory of Open Access Journals (Sweden)

    Ningxin Zhang

    2018-06-01

    Full Text Available The yeast Candida albicans is an important opportunistic human pathogen. For C. albicans strain typing or drug susceptibility testing, a single colony recovered from a patient sample is normally used. This is insufficient when multiple strains are present at the site sampled. How often this is the case is unclear. Previous studies, confined to oral, vaginal and vulvar samples, have yielded conflicting results and have assessed too small a number of colonies per sample to reliably detect the presence of multiple strains. We developed a next-generation sequencing (NGS) modification of the highly discriminatory C. albicans MLST (multilocus sequence typing) method, 100+1 NGS-MLST, for detection and typing of multiple strains in clinical samples. In 100+1 NGS-MLST, DNA is extracted from a pool of colonies from a patient sample and also from one of the colonies. MLST amplicons from both DNA preparations are analyzed by high-throughput sequencing. Using base call frequencies, our bespoke DALMATIONS software determines the MLST type of the single colony. If base call frequency differences between pool and single colony indicate the presence of an additional strain, the differences are used to computationally infer the second MLST type without the need for MLST of additional individual colonies. In mixes of previously typed pairs of strains, 100+1 NGS-MLST reliably detected a second strain. Inferred MLST types of second strains were always more similar to their real MLST types than to those of any of 59 other isolates (22 of 31 inferred types were identical to the real type). Using 100+1 NGS-MLST we found that 7/60 human samples, including three superficial candidiasis samples, contained two unrelated strains. In addition, at least one sample contained two highly similar variants of the same strain. The probability of samples containing unrelated strains appears to differ considerably between body sites. Our findings indicate the need for wider surveys to
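
    The frequency-based inference can be illustrated with a toy example (this is not the DALMATIONS implementation): at a biallelic MLST site, subtracting the typed colony's expected contribution from the pooled base-call frequencies leaves the signal of the second strain.

      def infer_second_base(pool_freqs, colony_base, mix_fraction=0.5):
          """pool_freqs: dict base -> frequency at one site, summing to ~1.0.
          Assumes the typed colony's strain makes up mix_fraction of the pool."""
          residual = {b: pool_freqs.get(b, 0.0)
                         - (mix_fraction if b == colony_base else 0.0)
                      for b in "ACGT"}
          second = max(residual, key=residual.get)
          return second, residual[second]

      # Colony reads 'A', but the pool shows ~half 'G' calls: a second strain.
      print(infer_second_base({"A": 0.52, "G": 0.48}, "A"))  # ('G', 0.48)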

  18. Overcoming Matrix Effects in a Complex Sample: Analysis of Multiple Elements in Multivitamins by Atomic Absorption Spectroscopy

    Science.gov (United States)

    Arnold, Randy J.; Arndt, Brett; Blaser, Emilia; Blosser, Chris; Caulton, Dana; Chung, Won Sog; Fiorenza, Garrett; Heath, Wyatt; Jacobs, Alex; Kahng, Eunice; Koh, Eun; Le, Thao; Mandla, Kyle; McCory, Chelsey; Newman, Laura; Pithadia, Amit; Reckelhoff, Anna; Rheinhardt, Joseph; Skljarevski, Sonja; Stuart, Jordyn; Taylor, Cassie; Thomas, Scott; Tse, Kyle; Wall, Rachel; Warkentien, Chad

    2011-01-01

    A multivitamin tablet and liquid are analyzed for the elements calcium, magnesium, iron, zinc, copper, and manganese using atomic absorption spectrometry. Linear calibration and standard addition are used for all elements except calcium, allowing for an estimate of the matrix effects encountered for this complex sample. Sample preparation using…
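
    Standard addition, used here to cope with matrix effects, extrapolates a linear fit of signal versus added analyte back to zero signal; a minimal worked example with made-up absorbances:

      import numpy as np

      added_conc = np.array([0.0, 1.0, 2.0, 4.0])      # mg/L added (illustrative)
      absorbance = np.array([0.120, 0.165, 0.210, 0.300])

      slope, intercept = np.polyfit(added_conc, absorbance, 1)
      c_sample = intercept / slope   # magnitude of the x-intercept
      print(f"analyte in the prepared sample: {c_sample:.2f} mg/L")

    Comparing this value with a plain linear-calibration result gives an estimate of the size of the matrix effect.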

  19. Mockup of an automated material transport system for remote handling

    International Nuclear Information System (INIS)

    Porter, M.L.

    1992-01-01

    The automated material transport system (AMTS) was conceived for the transport of samples within the material and process control laboratory (MPCL), located in the plutonium processing building of the special isotope separation (SIS) facility. The MPCL was designed with a dry sample handling laboratory and a wet chemistry analysis laboratory. Each laboratory contained several processing glove boxes. The function of the AMTS was to automate the handling of materials, multiple process samples, and bulky items between process stations with a minimum of operator intervention and with a minimum of waiting periods and nonproductive activities. The AMTS design requirements, design verification mockup plan, and AMTS mockup procurement specification were established prior to cancellation of the SIS project. Due to the AMTS's flexibility, the need for technology development, and applicability to other US Department of Energy facilities, mockup of the AMTS continued. This paper discusses the system design features, capabilities, and results of initial testing

  20. Mockup of an automated material transport system for remote handling

    International Nuclear Information System (INIS)

    Porter, M.L.

    1992-01-01

    An Automated Material Transport System (AMTS) was identified for transport of samples within a Material and Process Control Laboratory (MPCL). The MPCL was designed with a dry sample handling laboratory and a wet chemistry analysis laboratory. Each laboratory contained several processing gloveboxes. The function of the AMTS was to automate the handling of materials, multiple process samples, and bulky items between process stations with a minimum of operator intervention and a minimum of waiting periods and nonproductive activities. This paper discusses the system design features, capabilities, and results of initial testing. The overall performance of the AMTS is very good. No major problems or concerns were identified. System commands are simple and logical, making the system user friendly. The operating principle and design of the individual components are simple. With the addition of various track modules, the system can be arranged in almost any configuration. The AMTS lends itself very well to integration with other automated systems or products. The AMTS is suited for applications involving light payloads that require multiple sample and material handling, lot tracking, and system integration with other products

  1. Using the Direct Sampling Multiple-Point Geostatistical Method for Filling Gaps in Landsat 7 ETM+ SLC-off Imagery

    KAUST Repository

    Yin, Gaohong

    2016-01-01

    simulation case. The Direct Sampling method was examined across a range of land cover types including deserts, sparse rural areas, dense farmlands, urban areas, braided rivers and coastal areas to demonstrate its capacity to recover gaps accurately

  2. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  3. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  4. Long-term monitoring of soil gas fluxes with closed chambers using automated and manual systems

    Energy Technology Data Exchange (ETDEWEB)

    Scott, A.; Crichton, I.; Ball, B.C.

    1999-10-01

    The authors describe two gas sample collection techniques, each of which is used in conjunction with custom-made automated or manually operated closed chambers. The automated system allows automatic collection of gas samples for simultaneous analysis of multiple trace gas effluxes from soils, permitting long-term monitoring. Since the manual system is cheaper to produce, it can be replicated more extensively than the automated system and used to estimate the spatial variability of soil fluxes. The automated chamber covers a soil area of 0.5 m2 and has a motor-driven lid that remains operational throughout a range of weather conditions. Both systems use gas-tight containers of robust metal construction, which give good sample retention, thereby allowing long-term storage and convenience of transport from remote locations. The containers in the automated system are filled by pumping gas from the closed chamber via a multiway rotary valve. Stored samples from both systems are analyzed simultaneously for N2O and CO2 using automated injection into laboratory-based gas chromatographs. The use of both collection systems is illustrated by results from a field experiment on sewage sludge disposal to land where N2O fluxes were high. The automated gas sampling system permitted quantification of the marked temporal variability of concurrent N2O and CO2 fluxes and allowed improved estimation of cumulative fluxes. The automated measurement approach yielded higher estimates of cumulative flux because integration of manual point-in-time observations missed a number of transient high-flux events.
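
    A closed-chamber flux is obtained by fitting the headspace concentration rise and converting with the ideal gas law; a minimal sketch with invented numbers (chamber volume and conditions below are assumptions, not the paper's values):

      import numpy as np

      def chamber_flux(times_s, conc_ppm, volume_m3, area_m2,
                       molar_mass_g=44.01, temp_k=283.15, pressure_pa=101325.0):
          """Closed-chamber soil efflux: fit dC/dt (ppm/s), convert to mol m-3
          via the ideal gas law, then to a mass flux. Defaults suit CO2 at 10 C."""
          dcdt = np.polyfit(times_s, conc_ppm, 1)[0]          # ppm per second
          mol_per_m3 = pressure_pa / (8.314 * temp_k)         # ideal gas law
          flux_mol = dcdt * 1e-6 * mol_per_m3 * volume_m3 / area_m2
          return flux_mol * molar_mass_g * 1e6                # ug m-2 s-1

      t = np.array([0, 300, 600, 900])              # s
      c = np.array([400.0, 412.0, 424.5, 436.0])    # ppm CO2, illustrative
      print(f"{chamber_flux(t, c, volume_m3=0.15, area_m2=0.5):.1f} ug CO2 m-2 s-1")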

  5. Automated determination of the stable carbon isotopic composition (δ13C) of total dissolved inorganic carbon (DIC) and total nonpurgeable dissolved organic carbon (DOC) in aqueous samples: RSIL lab codes 1851 and 1852

    Science.gov (United States)

    Révész, Kinga M.; Doctor, Daniel H.

    2014-01-01

    The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
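
    The measured isotope ratio is conventionally expressed in delta notation relative to the VPDB standard; a two-line sketch (the VPDB ratio below is a commonly cited literature value):

      R_VPDB = 0.0111802  # 13C/12C of the VPDB standard (commonly cited value)

      def delta13C(r_sample):
          """delta 13C in per mill relative to VPDB."""
          return (r_sample / R_VPDB - 1.0) * 1000.0

      # A sample whose 13C/12C ratio is 0.25% lower than VPDB:
      print(f"{delta13C(R_VPDB * 0.9975):.1f} per mill")  # -> -2.5 per mill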

  6. Sample Handling and Processing on Mars for Future Astrobiology Missions

    Science.gov (United States)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes especially when detecting low concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time, resource and manpower consuming in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, namely the Automated Sample Processing System (ASPS) receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers sample to multiple instruments for analysis (including for non-organic soluble species).

  7. Method for the radioimmunoassay of large numbers of samples using quantitative autoradiography of multiple-well plates

    International Nuclear Information System (INIS)

    Luner, S.J.

    1978-01-01

    A double antibody assay for thyroxine using 125I as the label was carried out on 10-μl samples in Microtiter V-plates. After an additional centrifugation to compact the precipitates, the plates were placed in contact with x-ray film overnight and the spots were scanned. In the 20 to 160 ng/ml range, the average coefficient of variation for thyroxine concentration determined on the basis of film spot optical density was 11 percent, compared to 4.8 percent obtained using a standard gamma counter. Eliminating the need for each sample to spend on the order of 1 min in a crystal well detector makes the method convenient for large-scale applications involving more than 3000 samples per day

  8. Identifying Shifts in Leaf-Litter Ant Assemblages (Hymenoptera: Formicidae) across Ecosystem Boundaries Using Multiple Sampling Methods.

    Directory of Open Access Journals (Sweden)

    Michal Wiezik

    Full Text Available Global or regional environmental changes in climate or land use have been increasingly implied in shifts in boundaries (ecotones) between adjacent ecosystems such as beech or oak-dominated forests and forest-steppe ecotones that frequently co-occur near the southern range limits of deciduous forest biome in Europe. Yet, our ability to detect changes in biological communities across these ecosystems, or to understand their environmental drivers, can be hampered when different sampling methods are required to characterize biological communities of the adjacent but ecologically different ecosystems. Ants (Hymenoptera: Formicidae) have been shown to be particularly sensitive to changes in temperature and vegetation and they require different sampling methods in closed vs. open habitats. We compared ant assemblages of closed-forests (beech- or oak-dominated) and open forest-steppe habitats in southwestern Carpathians using methods for closed-forest (litter sifting) and open habitats (pitfall trapping), and developed an integrated sampling approach to characterize changes in ant assemblages across these adjacent ecosystems. Using both methods, we collected 5,328 individual ant workers from 28 species. Neither method represented ant communities completely, but pitfall trapping accounted for more species (24) than litter sifting (16). Although pitfall trapping characterized differences in species richness and composition among the ecosystems better, with beech forest being most species poor and ecotone most species rich, litter sifting was more successful in identifying characteristic litter-dwelling species in oak-dominated forest. The integrated sampling approach using both methods yielded a more accurate characterization of species richness and composition, particularly in the species-rich forest-steppe habitat where the combined sample identified a significantly higher number of species compared to either of the two methods on their own. Thus, an integrated

  9. Identifying Shifts in Leaf-Litter Ant Assemblages (Hymenoptera: Formicidae) across Ecosystem Boundaries Using Multiple Sampling Methods.

    Science.gov (United States)

    Wiezik, Michal; Svitok, Marek; Wieziková, Adela; Dovčiak, Martin

    2015-01-01

    Global or regional environmental changes in climate or land use have been increasingly implied in shifts in boundaries (ecotones) between adjacent ecosystems such as beech or oak-dominated forests and forest-steppe ecotones that frequently co-occur near the southern range limits of deciduous forest biome in Europe. Yet, our ability to detect changes in biological communities across these ecosystems, or to understand their environmental drivers, can be hampered when different sampling methods are required to characterize biological communities of the adjacent but ecologically different ecosystems. Ants (Hymenoptera: Formicidae) have been shown to be particularly sensitive to changes in temperature and vegetation and they require different sampling methods in closed vs. open habitats. We compared ant assemblages of closed-forests (beech- or oak-dominated) and open forest-steppe habitats in southwestern Carpathians using methods for closed-forest (litter sifting) and open habitats (pitfall trapping), and developed an integrated sampling approach to characterize changes in ant assemblages across these adjacent ecosystems. Using both methods, we collected 5,328 individual ant workers from 28 species. Neither method represented ant communities completely, but pitfall trapping accounted for more species (24) than litter sifting (16). Although pitfall trapping characterized differences in species richness and composition among the ecosystems better, with beech forest being most species poor and ecotone most species rich, litter sifting was more successful in identifying characteristic litter-dwelling species in oak-dominated forest. The integrated sampling approach using both methods yielded more accurate characterization of species richness and composition, and particularly so in species-rich forest-steppe habitat where the combined sample identified significantly higher number of species compared to either of the two methods on their own. Thus, an integrated sampling

  10. Stratifying empiric risk of schizophrenia among first degree relatives using multiple predictors in two independent Indian samples.

    Science.gov (United States)

    Bhatia, Triptish; Gettig, Elizabeth A; Gottesman, Irving I; Berliner, Jonathan; Mishra, N N; Nimgaonkar, Vishwajit L; Deshpande, Smita N

    2016-12-01

    Schizophrenia (SZ) has an estimated heritability of 64-88%, with the higher values based on twin studies. Conventionally, family history of psychosis is the best individual-level predictor of risk, but reliable risk estimates are unavailable for Indian populations. Genetic, environmental, and epigenetic factors are equally important and should be considered when predicting risk in 'at risk' individuals. Our aim was to estimate risk based on an Indian schizophrenia participant's family history combined with selected demographic factors. To incorporate variables in addition to family history, and to stratify risk, we constructed a regression equation that included demographic variables in addition to family history. The equation was tested in two independent Indian samples: (i) an initial sample of SZ participants (N=128) with one sibling or offspring; (ii) a second, independent sample consisting of multiply affected families (N=138 families, with two or more sibs/offspring affected with SZ). The overall estimated risk was 4.31±0.27 (mean±standard deviation). There were 19 (14.8%) individuals in the high-risk group, 75 (58.6%) in the moderate-risk group and 34 (26.6%) in the above-average-risk group (in the initial sample). In the validation sample, risks were distributed as: high (45%), moderate (38%) and above average (17%). Consistent risk estimates were obtained from both samples using the regression equation. Familial risk can be combined with demographic factors to estimate risk for SZ in India. If replicated, the proposed stratification of risk may be easier and more realistic for family members. Copyright © 2016. Published by Elsevier B.V.
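
    The published regression equation is not reproduced in the abstract; purely as an illustration of the approach, a logistic score combining family history with demographic factors can be stratified into the three bands mentioned above (all coefficients and cut-offs below are invented):

      import math

      def sz_risk_percent(n_affected_relatives, onset_age, urban):
          """Toy logistic risk score; coefficients are NOT those of the study."""
          z = (-3.0 + 0.9 * n_affected_relatives
               - 0.02 * (onset_age - 25) + 0.3 * urban)
          return 100.0 / (1.0 + math.exp(-z))

      risk = sz_risk_percent(n_affected_relatives=2, onset_age=22, urban=True)
      band = "high" if risk >= 10 else "moderate" if risk >= 5 else "above average"
      print(f"estimated risk {risk:.1f}% -> {band} risk band")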

  11. Evaluation of the Multiple Sclerosis Walking Scale-12 (MSWS-12) in a Dutch sample: Application of item response theory.

    Science.gov (United States)

    Mokkink, Lidwine Brigitta; Galindo-Garre, Francisca; Uitdehaag, Bernard MJ

    2016-12-01

    The Multiple Sclerosis Walking Scale-12 (MSWS-12) measures walking ability from the patients' perspective. We examined the quality of the MSWS-12 using an item response theory model, the graded response model (GRM). A total of 625 unique Dutch multiple sclerosis (MS) patients were included. After testing for unidimensionality, monotonicity, and absence of local dependence, a GRM was fit and item characteristics were assessed. Differential item functioning (DIF) for the variables gender, age, duration of MS, type of MS and severity of MS, reliability, total test information, and standard error of the trait level (θ) were investigated. Confirmatory factor analysis showed a unidimensional structure of the 12 items of the scale, explaining 88% of the variance. Item 2 did not fit into the GRM model. Reliability was 0.93. Items 8 and 9 (of the 11 and 12 item version respectively) showed DIF on the variable severity, based on the Expanded Disability Status Scale (EDSS). However, the EDSS is strongly related to the content of both items. Our results confirm the good quality of the MSWS-12. The trait level (θ) scores and item parameters of both the 12- and 11-item versions were highly comparable, although we do not suggest to change the content of the MSWS-12. © The Author(s), 2016.
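
    For reference, the graded response model used here expresses the probability of responding in category k or above of item i as a two-parameter logistic in the latent trait theta (standard textbook form, not specific to this paper):

      % cumulative category probability and the category response probability
      P^{*}_{ik}(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_{ik})}},
      \qquad
      P_{ik}(\theta) = P^{*}_{ik}(\theta) - P^{*}_{i,k+1}(\theta)

    The item discriminations a_i and ordered thresholds b_ik are the parameters whose invariance across patient subgroups is probed by the DIF analyses above.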

  12. Perception of prescription medicine sample packs among Australian professional, government, industry, and consumer organizations, based on automated textual analysis of one-on-one interviews.

    Science.gov (United States)

    Kyle, Greg J; Nissen, Lisa; Tett, Susan

    2008-12-01

    Prescription medicine samples provided by pharmaceutical companies are predominantly newer and more expensive products. The range of samples provided to practices may not represent the drugs that the doctors desire to have available. Few studies have used a qualitative design to explore the reasons behind sample use. The aim of this study was to explore the opinions of a variety of Australian key informants about prescription medicine samples, using a qualitative methodology. Twenty-three organizations involved in quality use of medicines in Australia were identified, based on the authors' previous knowledge. Each organization was invited to nominate 1 or 2 representatives to participate in semistructured interviews utilizing seeding questions. Each interview was recorded and transcribed verbatim. Leximancer v2.25 text analysis software (Leximancer Pty Ltd., Jindalee, Queensland, Australia) was used for textual analysis. The top 10 concepts from each analysis group were interrogated back to the original transcript text to determine the main emergent opinions. A total of 18 key interviewees representing 16 organizations participated. Samples, patient, doctor, and medicines were the major concepts among general opinions about samples. The concept drug became more frequent and the concept companies appeared when marketing issues were discussed. The Australian Pharmaceutical Benefits Scheme and cost were more prevalent in discussions about alternative sample distribution models, indicating interviewees were cognizant of budgetary implications. Key interviewee opinions added richness to the single-word concepts extracted by Leximancer. Participants recognized that prescription medicine samples have an influence on quality use of medicines and play a role in the marketing of medicines. They also believed that alternative distribution systems for samples could provide benefits. The cost of a noncommercial system for distributing samples or starter packs was a concern

  13. Fast and accurate mutation detection in whole genome sequences of multiple isogenic samples with IsoMut

    DEFF Research Database (Denmark)

    Pipek, Orsolya; Ribli, Dezső; Molnar, Janos

    2017-01-01

    for testing purposes. Optimal values of the filtering parameters of IsoMut were determined in a thorough and strict optimization procedure based on these test sets. We show that IsoMut, when tuned correctly, decreases the false positive rate compared to conventional tools in a 30 sample experimental setup...

  14. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection methods, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe the recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  15. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  16. Future of Automated Insulin Delivery Systems

    NARCIS (Netherlands)

    Castle, Jessica R.; DeVries, J. Hans; Kovatchev, Boris

    2017-01-01

    Advances in continuous glucose monitoring (CGM) have brought on a paradigm shift in the management of type 1 diabetes. These advances have enabled the automation of insulin delivery, where an algorithm determines the insulin delivery rate in response to the CGM values. There are multiple automated

  17. An ecological method for the sampling of nonverbal signalling behaviours of young children with profound and multiple learning disabilities (PMLD).

    Science.gov (United States)

    Atkin, Keith; Lorch, Marjorie Perlman

    2016-08-01

    Profound and multiple learning disabilities (PMLD) are a complex range of disabilities that affect the general health and well-being of the individual and their capacity to interact and learn. We developed a new methodology to capture the non-symbolic signalling behaviours of children with PMLD within the context of a face-to-face interaction with a caregiver to provide analysis at a micro-level of descriptive detail incorporating the use of the ELAN digital video software. The signalling behaviours of participants in a natural, everyday interaction can be better understood with the use of this innovation in methodology, which is predicated on the ecology of communication. Recognition of the developmental ability of the participants is an integral factor within that ecology. The method presented establishes an advanced account of the modalities through which a child affected by PMLD is able to communicate.

  18. Validation of a semi-automated multi-component method using protein precipitation LC-MS-MS for the analysis of whole blood samples

    DEFF Research Database (Denmark)

    Slots, Tina

    BACKGROUND: Solid phase extraction (SPE) are one of many multi-component methods, but can be very time-consuming and labour-intensive. Protein precipitation is, on the other hand, a much simpler and faster sample pre-treatment than SPE, and protein precipitation also has the ability to cover a wi......-mortem whole blood sample preparation for toxicological analysis; from the primary sample tube to a 96-deepwell plate ready for injection on the liquid chromatography mass spectrometry (LC-MS/MS)....

  19. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    method opens a new avenue for automated DLLME that not only greatly expands the range of viable extractants, especially functional ILs but also enhances its application for various detection methods. Furthermore, multiple samples can be processed simultaneously, which accelerates the sample preparation and allows the examination of a large number of samples.

  20. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    for automated DLLME that not only greatly expands the range of viable extractants, especially functional ILs but also enhances its application for various detection methods. Furthermore, multiple samples can be processed simultaneously, which accelerates the sample preparation and allows the examination of a large number of samples

  1. Memory and selective attention in multiple sclerosis: cross-sectional computer-based assessment in a large outpatient sample.

    Science.gov (United States)

    Adler, Georg; Lembach, Yvonne

    2015-08-01

    Cognitive impairments may have a severe impact on everyday functioning and quality of life of patients with multiple sclerosis (MS). However, there are some methodological problems in their assessment, and only a few studies allow a representative estimate of the prevalence and severity of cognitive impairments in MS patients. We applied a computer-based method, the memory and attention test (MAT), in 531 outpatients with MS, who were assessed at nine neurological practices or specialized outpatient clinics. The findings were compared with those obtained in an age-, sex- and education-matched control group of 84 healthy subjects. Episodic short-term memory was substantially decreased in the MS patients. About 20% of them scored more than two standard deviations below the mean of the control group. The episodic short-term memory score was negatively correlated with the EDSS score. Minor but also significant impairments in the MS patients were found for verbal short-term memory, episodic working memory and selective attention. The computer-based MAT was found to be useful for routine assessment of cognition in MS outpatients.

  2. Validation of a fully automated solid‐phase extraction and ultra‐high‐performance liquid chromatography–tandem mass spectrometry method for quantification of 30 pharmaceuticals and metabolites in post‐mortem blood and brain samples

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Nedahl, Michael; Johansen, Sys Stybe

    2018-01-01

    In this study, we present the validation of an analytical method capable of quantifying 30 commonly encountered pharmaceuticals and metabolites in whole blood and brain tissue from forensic cases. Solid‐phase extraction was performed by a fully automated robotic system, thereby minimising manual...... labour and human error while increasing sample throughput, robustness, and traceability. The method was validated in blood in terms of selectivity, linear range, matrix effect, extraction recovery, process efficiency, carry‐over, stability, precision, and accuracy. Deuterated analogues of each analyte....../kg. Thus, the linear range covered both therapeutic and toxic levels. The method showed acceptable accuracy and precision, with accuracies ranging from 80 to 118% and precision below 19% for the majority of the analytes. Linear range, matrix effect, extraction recovery, process efficiency, precision...

  3. Automation of sample processing for ICP-MS determination of 90Sr radionuclide at ppq level for nuclear technology and environmental purposes.

    Science.gov (United States)

    Kołacińska, Kamila; Chajduk, Ewelina; Dudek, Jakub; Samczyński, Zbigniew; Łokas, Edyta; Bojanowska-Czajka, Anna; Trojanowicz, Marek

    2017-07-01

    90Sr is a widely determined radionuclide for environmental purposes and nuclear waste control, and it can also be monitored in coolants in nuclear reactor plants. In the developed method, ICP-MS detection was employed together with sample processing in a sequential injection analysis (SIA) setup equipped with a lab-on-valve with mechanized renewal of the sorbent bed for solid-phase extraction. The optimized conditions of determination included preconcentration of 90Sr on a cation-exchange column and removal of different types of interferences using extraction Sr-resin. The limit of detection of the developed procedure depends essentially on the configuration of the employed ICP-MS spectrometer and on the available volume of the sample to be analyzed. For a 1 L initial sample volume, the method detection limit (MDL) was evaluated as 2.9 ppq (14.5 Bq L-1). The developed method was applied to analyze spiked river water samples, water reference materials, and also simulated and real samples of nuclear reactor coolant. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Relationship between single and multiple perpetrator rape perpetration in South Africa: A comparison of risk factors in a population-based sample.

    Science.gov (United States)

    R, Jewkes; Y, Sikweyiya; K, Dunkle; R, Morrell

    2015-07-07

    Studies of rape of women seldom distinguish between men's participation in acts of single and multiple perpetrator rape. Multiple perpetrator rape (MPR) occurs globally with serious consequences for women. In South Africa it is a cultural practice with defined circumstances in which it commonly occurs. Prevention requires an understanding of whether it is a context specific intensification of single perpetrator rape, or a distinctly different practice of different men. This paper aims to address this question. We conducted a cross-sectional household study with a multi-stage, randomly selected sample of 1686 men aged 18-49 who completed a questionnaire administered using an Audio-enhanced Personal Digital Assistant. We attempted to fit an ordered logistic regression model for factors associated with rape perpetration. 27.6 % of men had raped and 8.8 % had perpetrated multiple perpetrator rape (MPR). Thus 31.9 % of men who had ever raped had done so with other perpetrators. An ordered regression model was fitted, showing that the same associated factors, albeit at higher prevalence, are associated with SPR and MPR. Multiple perpetrator rape appears as an intensified form of single perpetrator rape, rather than a different form of rape. Prevention approaches need to be mainstreamed among young men.

  5. Full second order chromatographic/spectrometric data matrices for automated sample identification and component analysis by non-data-reducing image analysis

    DEFF Research Database (Denmark)

    Nielsen, Niles-Peter Vest; Smedsgaard, Jørn; Frisvad, Jens Christian

    1999-01-01

    A data analysis method is proposed for identification and for confirmation of classification schemes, based on single- or multiple-wavelength chromatographic profiles. The proposed method works directly on the chromatographic data without data reduction procedures such as peak area or retention...... classes from the reference chromatograms. This feature is a valuable aid in selecting components for further analysis. The identification method is demonstrated on two data sets: 212 isolates from 41 food-borne Penicillium species and 61 isolates from 6 soil-borne Penicillium species. Both data sets

  6. Development of an Analysis Pipeline Characterizing Multiple Hypervariable Regions of 16S rRNA Using Mock Samples.

    Directory of Open Access Journals (Sweden)

    Jennifer J Barb

    Full Text Available There is much speculation on which hypervariable region provides the highest bacterial specificity in 16S rRNA sequencing. The optimum solution to prevent bias and to obtain a comprehensive view of complex bacterial communities would be to sequence the entire 16S rRNA gene; however, this is not possible with second-generation standard library design and short-read next-generation sequencing technology. This paper examines a new process using seven hypervariable or V regions of the 16S rRNA (six amplicons: V2, V3, V4, V6-7, V8, and V9) processed simultaneously on the Ion Torrent Personal Genome Machine (Life Technologies, Grand Island, NY). Four mock samples were amplified using the 16S Ion Metagenomics Kit™ (Life Technologies) and their sequencing data were subjected to a novel analytical pipeline. Results are presented at the family and genus level. The Kullback-Leibler divergence (DKL), a measure of the departure of the computed from the nominal bacterial distribution in the mock samples, was used to infer which region performed best at the family and genus levels. Three different hypervariable regions, V2, V4, and V6-7, produced the lowest divergence compared to the known mock sample. The V9 region gave the highest (worst) average DKL while the V4 gave the lowest (best) average DKL. In addition to having a high DKL, the V9 region in both the forward and reverse directions performed the worst, finding only 17% and 53% of the known family-level and 12% and 47% of the genus-level bacteria, while results from the forward and reverse V4 region identified all 17 family-level bacteria. The results of our analysis have shown that our sequencing methods using 6 hypervariable regions of the 16S rRNA and subsequent analysis are valid. This method also allowed for the assessment of how well each of the variable regions might perform simultaneously. Our findings will provide the basis for future work intended to assess microbial abundance at different time points
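
    The divergence used for the rankings is straightforward to compute from relative abundances; a minimal sketch with invented numbers:

      import numpy as np
      from scipy.stats import entropy

      nominal = np.array([0.25, 0.25, 0.25, 0.25])    # known mock composition
      computed = np.array([0.30, 0.20, 0.27, 0.23])   # pipeline output

      # scipy's entropy(p, q) returns the Kullback-Leibler divergence D_KL(p||q)
      print(f"D_KL = {entropy(nominal, computed):.4f} nats")

    Lower values indicate a computed distribution closer to the nominal mock composition, which is the sense in which V4 "performed best" above.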

  7. Does Alcoholics Anonymous work differently for men and women? A moderated multiple-mediation analysis in a large clinical sample.

    Science.gov (United States)

    Kelly, John F; Hoeppner, Bettina B

    2013-06-01

    Alcoholics Anonymous (AA) began as a male organization, but about one third is now female. Studies have found that women participate at least as much as men and benefit equally from AA, but it is unclear whether women benefit from AA in the same or different ways as men. This study tested whether gender moderated the mechanisms through which AA aids recovery. A cohort study of alcohol dependent adults (N=1726; 24% female; Project MATCH) was assessed on AA attendance during treatment; with mediators at 9 months; outcomes (Percent Days Abstinent [PDA] and Drinks per Drinking Day [DDD]) at 15 months. Multiple mediator models tested whether purported mechanisms (i.e., self-efficacy, depression, social networks, spirituality/religiosity) explained AA's effects differently for men and women controlling for baseline values, mediators, treatment, and other confounders. For PDA, the proportion of AA's effect accounted for by the mediators was similar for men (53%) and women (49%). Both men and women were found to benefit from changes in social factors but these mechanisms were more important among men. For DDD, the mediators accounted for 70% of the effect of AA for men and 41% for women. Again, men benefitted mostly from social changes. Independent of AA's effects, negative affect self-efficacy was shown to have a strong relationship to outcome for women but not men. The recovery benefits derived from AA differ in nature and magnitude between men and women and may reflect differing needs based on recovery challenges related to gender-based social roles and drinking contexts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    Science.gov (United States)

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-12-02

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. © 2015 by John Wiley & Sons, Inc. Copyright © 2015 John Wiley & Sons, Inc.

  9. An Automated Sample Preparation Instrument to Accelerate Positive Blood Cultures Microbial Identification by MALDI-TOF Mass Spectrometry (Vitek®MS)

    Directory of Open Access Journals (Sweden)

    Patrick Broyer

    2018-05-01

    Full Text Available Sepsis is the leading cause of death among patients in intensive care units (ICUs), requiring an early diagnosis to introduce efficient therapeutic intervention. Rapid identification (ID) of a causative pathogen is key to guide directed antimicrobial selection and was recently shown to reduce hospitalization length in ICUs. Direct processing of positive blood cultures by MALDI-TOF MS technology is one of several currently available tools used to generate rapid microbial ID. However, all recently published protocols are still manual and time-consuming, requiring dedicated technician availability and specific strategies for batch processing. We present here a new prototype instrument for automated preparation of Vitek®MS slides directly from positive blood culture broth based on an “all-in-one” extraction strip. This benchtop instrument was evaluated on 111 and 22 organisms processed using artificially inoculated blood culture bottles in the BacT/ALERT® 3D system (SA/SN blood culture bottles) or the BacT/ALERT Virtuo™ system (FA/FN Plus bottles), respectively. Overall, this new preparation station provided reliable and accurate Vitek MS species-level identification of 87% (Gram-negative bacteria = 85%, Gram-positive bacteria = 88%, and yeast = 100%) when used with BacT/ALERT® 3D and of 84% (Gram-negative bacteria = 86%, Gram-positive bacteria = 86%, and yeast = 75%) with Virtuo® instruments, respectively. The prototype was then evaluated in a clinical microbiology laboratory on 102 clinical blood culture bottles and compared to routine laboratory ID procedures. Overall, the correlation of ID on monomicrobial bottles was 83% (Gram-negative bacteria = 89%, Gram-positive bacteria = 79%, and yeast = 78%), demonstrating roughly equivalent performance between manual and automated extraction methods. This prototype instrument exhibited a high level of performance regardless of bottle type or BacT/ALERT system. Furthermore, blood culture workflow could

  10. Efficient Determination of Free Energy Landscapes in Multiple Dimensions from Biased Umbrella Sampling Simulations Using Linear Regression.

    Science.gov (United States)

    Meng, Yilin; Roux, Benoît

    2015-08-11

    The weighted histogram analysis method (WHAM) is a standard protocol for postprocessing the information from biased umbrella sampling simulations to construct the potential of mean force with respect to a set of order parameters. By virtue of the WHAM equations, the unbiased density of state is determined by satisfying a self-consistent condition through an iterative procedure. While the method works very effectively when the number of order parameters is small, its computational cost grows rapidly in higher dimension. Here, we present a simple and efficient alternative strategy, which avoids solving the self-consistent WHAM equations iteratively. An efficient multivariate linear regression framework is utilized to link the biased probability densities of individual umbrella windows and yield an unbiased global free energy landscape in the space of order parameters. It is demonstrated with practical examples that free energy landscapes that are comparable in accuracy to WHAM can be generated at a small fraction of the cost.
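
    The paper's exact regression formulation is not reproduced here, but the general idea can be sketched: each window's biased histogram fixes its local free energy up to an additive constant, and the constants are obtained in one shot by least squares over all pairwise overlaps instead of by WHAM iteration.

      import numpy as np

      kT = 0.596  # kcal/mol near 300 K

      def stitch_windows(hists, biases):
          """hists[w][k]: biased histogram of window w on a common grid;
          biases[w][k]: umbrella potential of window w on that grid.
          Local estimate: F_w = -kT ln rho_w - U_w + C_w; solve for the C_w."""
          n_w = len(hists)
          local = [-kT * np.log(np.where(h > 0, h, np.nan)) - b
                   for h, b in zip(hists, biases)]
          rows, rhs = [], []
          for i in range(n_w):
              for j in range(i + 1, n_w):
                  overlap = ~np.isnan(local[i]) & ~np.isnan(local[j])
                  for k in np.where(overlap)[0]:
                      row = np.zeros(n_w)
                      row[i], row[j] = 1.0, -1.0      # C_i - C_j = F_j - F_i
                      rows.append(row)
                      rhs.append(local[j][k] - local[i][k])
          C, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
          return np.nanmean([f + c for f, c in zip(local, C)], axis=0)

    The least-squares system is rank-deficient by one, since free energies are defined only up to a global constant; lstsq returns the minimum-norm solution, which simply pins that constant.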

  11. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described.

  12. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day.

  13. Analysis of microcontaminants in aqueous samples by fully automated on-line solid-phase extraction-gas chromatography-mass selective detection.

    NARCIS (Netherlands)

    Louter, A.J.H.; van Beekvelt, C.A.; Cid Montanes, P.; Slobodník, J.; Vreuls, J.J.; Brinkman, U.A.T.

    1996-01-01

    The trace-level analysis of unknown organic pollutants in water requires the use of fast and sensitive methods which also provide structural information. In the present study, an on-line technique was used which combines sample preparation by means of solid-phase extraction (SPE) on a small …

  14. Improving the characterization of fish assemblage structure through the use of multiple sampling methods: a case study in a subtropical tidal flat ecosystem.

    Science.gov (United States)

    Contente, Riguel Feltrin; Del Bianco Rossi-Wongtschowski, Carmen Lucia

    2017-06-01

    The use of multiple sampling gears is indispensable for obtaining robust characterizations of fish assemblage structure in species-rich subtropical ecosystems. In this study, such a dependence was demonstrated by characterizing the structure of the high-tide fish assemblage in a subtropical tidal flat ecosystem (the Araçá Bay, southeastern Brazil) using eight different gears across five seasonal surveys, and by estimating the bay's fish species richness by combining these data with those from local tide pool fish surveys. The high-tide fish assemblage was spatially structured, contained five threatened species, and was dominated by persistent and large populations of Eucinostomus argenteus and of the fisheries species Mugil curema and Diapterus rhombeus, which use the bay intensively throughout their life cycles. Large populations of small-bodied fishes supported regular use of the bay by piscivores. The autumn-winter peak in abundance of juvenile fishes caused a subsequent increase in piscivore abundance, and both events explained the bulk of the seasonal variability of the fish assemblage. The estimated richness revealed that the combination of sampling methods was sufficient to sample the bulk of the local richness, and the bay may hold a surprisingly high richness compared to other coastal ecosystems of the region. This faunal characterization, only feasible using multiple gears, will be critical for supporting a future study monitoring the impacts of an imminent port expansion over the tidal flat on local fish biodiversity.
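
    The abstract does not name the richness estimator; for incidence data pooled across gears and surveys, a common nonparametric choice is Chao2, sketched below on an invented incidence matrix purely to show the analysis shape.

```python
import numpy as np

def chao2(incidence):
    """Chao2 species-richness estimate from a units-by-species incidence matrix."""
    detections = incidence.sum(axis=0)   # number of units detecting each species
    s_obs = int((detections > 0).sum())  # observed species richness
    q1 = int((detections == 1).sum())    # uniques: found in exactly one unit
    q2 = int((detections == 2).sum())    # duplicates: found in exactly two units
    if q2 > 0:
        return s_obs + q1 * q1 / (2.0 * q2)
    return s_obs + q1 * (q1 - 1) / 2.0   # bias-corrected form when q2 == 0

# Invented example: rows = gear-survey combinations, columns = species.
rng = np.random.default_rng(0)
incidence = rng.random((40, 120)) < 0.08
print(chao2(incidence))
```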

  15. Social attribution test--multiple choice (SAT-MC) in schizophrenia: comparison with community sample and relationship to neurocognitive, social cognitive and symptom measures.

    Science.gov (United States)

    Bell, Morris D; Fiszdon, Joanna M; Greig, Tamasine C; Wexler, Bruce E

    2010-09-01

    This is the first report on the use of the Social Attribution Task - Multiple Choice (SAT-MC) to assess social cognitive impairments in schizophrenia. The SAT-MC was originally developed for autism research, and consists of a 64-second animation showing geometric figures enacting a social drama, with 19 multiple-choice questions about the interactions. Responses from 85 community-dwelling participants and 66 participants with SCID-confirmed schizophrenia or schizoaffective disorder (Scz) revealed highly significant group differences. When the two samples were combined, SAT-MC scores were significantly correlated with other social cognitive measures, including measures of affect recognition, theory of mind, self-reported egocentricity, and the Social Cognition Index from the MATRICS battery. Using a cut-off score, 53% of Scz participants were significantly impaired on the SAT-MC compared with 9% of the community sample. Most Scz participants with impairment on the SAT-MC also had impairment on affect recognition. Significant correlations were also found with neurocognitive measures, but with less dependence on verbal processes than other social cognitive measures. Logistic regression using SAT-MC scores correctly classified 75% of both samples. Results suggest that this measure may have promise, but alternative versions will be needed before it can be used in pre-post or longitudinal designs. (c) 2009 Elsevier B.V. All rights reserved.
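
    The reported 75% classification rate comes from a logistic regression on a single predictor, the SAT-MC score. The sketch below reproduces only the analysis shape on simulated stand-in data; the group means, spread, and labels are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Simulated stand-in scores: 66 patients (label 1) and 85 community
# controls (label 0); the means and standard deviations are invented.
rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(9.0, 3.0, 66),
                         rng.normal(14.0, 3.0, 85)]).reshape(-1, 1)
groups = np.concatenate([np.ones(66), np.zeros(85)])

# Cross-validated classification accuracy from the single predictor.
acc = cross_val_score(LogisticRegression(), scores, groups, cv=5)
print(acc.mean())
```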

  16. Systematic approach to optimize a pretreatment method for ultrasensitive liquid chromatography with tandem mass spectrometry analysis of multiple target compounds in biological samples.

    Science.gov (United States)

    Togashi, Kazutaka; Mutaguchi, Kuninori; Komuro, Setsuko; Kataoka, Makoto; Yamazaki, Hiroshi; Yamashita, Shinji

    2016-08-01

    In current approaches to new drug development, highly sensitive and robust analytical methods for the determination of test compounds in biological samples are essential. These analytical methods should be optimized for every target compound. However, for biological samples that contain multiple compounds, such as new drug candidates obtained in cassette-dosing tests, it would be preferable to develop a single method that allows the determination of all compounds at once. This study aims to establish a systematic approach that enables selection of the most appropriate pretreatment method for multiple target compounds without using their chemical information. We investigated the retention times of 27 known compounds under different mobile phase conditions and determined recoveries from human plasma samples pretreated by several solid-phase and liquid-liquid extractions. From the relationship between retention time and recovery in a principal component analysis, appropriate pretreatments were categorized into several types. Based on these categories, we optimized a pretreatment method for the identification of three calcium channel blockers in human plasma. Plasma concentrations of these drugs in a cassette-dose clinical study at microdose level were successfully determined, with a lower limit of quantitation of 0.2 pg/mL for diltiazem, 1 pg/mL for nicardipine, and 2 pg/mL for nifedipine. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
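
    The categorization step (projecting retention-time and recovery profiles with PCA, then grouping compounds) can be sketched generically. Everything below is an assumption for illustration: the feature-matrix shape, the cluster count, and the use of k-means as the grouping rule, which the abstract does not specify.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Placeholder matrix: one row per compound; columns hold retention times
# under each mobile-phase condition and recoveries for each pretreatment.
rng = np.random.default_rng(1)
features = rng.random((27, 8))   # 27 known compounds x 8 measured variables

z = StandardScaler().fit_transform(features)
scores = PCA(n_components=2).fit_transform(z)

# Group compounds in principal-component space; each cluster suggests
# a pretreatment category to assign to new, chemically similar compounds.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(labels)
```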

  17. The Relationship between Multiple Substance Use, Perceived Academic Achievements, and Selected Socio-Demographic Factors in a Polish Adolescent Sample

    Science.gov (United States)

    Mazur, Joanna; Tabak, Izabela; Dzielska, Anna; Wąż, Krzysztof; Oblacińska, Anna

    2016-01-01

    Predictors of high-risk patterns of substance use are often analysed in relation to demographic and school-related factors. The interaction between these factors and the additional impact of family wealth are still new areas of research. The aim of this study was to find determinants of the most common patterns of psychoactive substance use in mid-adolescence, compared to non-users. A sample of 1202 Polish students (46.1% boys, mean age of 15.6 years) was surveyed in 2013/2014. Four patterns of psychoactive substance use were defined using cluster analysis: non-users—71.9%, mainly tobacco and alcohol users—13.7%, high alcohol and cannabis users—7.2%, poly-users—7.2%. The final model contained the main effects of gender and age, and one three-way (perceived academic achievement × gender × family affluence) interaction. Girls with poor perception of school performance (as compared to girls with better achievements) were at significantly higher risk of being poly-users, in both less and more affluent families (adjusted odds ratio (OR) = 5.55 and OR = 3.60, respectively). The impact of family affluence was revealed only in interaction with other factors. Patterns of substance use in mid-adolescence are strongly related to perceived academic achievements, and these interact with selected socio-demographic factors. PMID:28009806
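
    The reported model structure (main effects of gender and age plus one three-way interaction, with effects read off as adjusted odds ratios) maps directly onto a logistic-regression formula. The sketch below uses simulated placeholder data with invented variable names and codings; only the model form follows the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated placeholder survey data; names and codings are invented.
rng = np.random.default_rng(3)
n = 1202
df = pd.DataFrame({
    "poly_user": rng.integers(0, 2, n),        # outcome: poly-user vs. non-user
    "girl": rng.integers(0, 2, n),
    "age": rng.normal(15.6, 0.5, n),
    "low_achievement": rng.integers(0, 2, n),  # poor perceived school performance
    "high_affluence": rng.integers(0, 2, n),
})

# Main effects of gender and age plus a single three-way interaction,
# matching the model form reported in the abstract.
model = smf.logit(
    "poly_user ~ girl + age + girl:low_achievement:high_affluence", data=df
).fit(disp=0)
print(np.exp(model.params))  # coefficients exponentiated to adjusted odds ratios
```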

  18. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.
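
    The paper's estimator is a penalized bivariate spline on a triangulation with periodic smoothness constraints, which is beyond a short sketch. A simpler stand-in that likewise respects the circular nature of (phi, psi) angle pairs is a product von Mises kernel density estimate, shown below; the function name, bandwidth, and example data are all illustrative.

```python
import numpy as np

def torus_kde(phi, psi, grid, kappa=50.0):
    """Density estimate on the torus via a product of von Mises kernels.

    phi, psi: (n,) sampled backbone dihedral angles in radians
    grid:     (m, 2) evaluation points (phi, psi)
    kappa:    kernel concentration (larger = narrower kernel)
    """
    d_phi = grid[:, [0]] - phi[None, :]              # (m, n) angle differences
    d_psi = grid[:, [1]] - psi[None, :]
    log_k = kappa * (np.cos(d_phi) + np.cos(d_psi))  # periodic in both angles
    norm = (2.0 * np.pi * np.i0(kappa)) ** 2         # squared von Mises normalizer
    return np.exp(log_k).sum(axis=1) / (len(phi) * norm)

# Example: one synthetic cluster of angle pairs on a coarse evaluation grid.
rng = np.random.default_rng(4)
phi = rng.vonmises(-1.0, 8.0, 500)
psi = rng.vonmises(2.3, 8.0, 500)
g = np.linspace(-np.pi, np.pi, 60)
grid = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
print(torus_kde(phi, psi, grid).max())
```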

  19. The Relationship between Multiple Substance Use, Perceived Academic Achievements, and Selected Socio-Demographic Factors in a Polish Adolescent Sample

    Directory of Open Access Journals (Sweden)

    Joanna Mazur

    2016-12-01

    Full Text Available Predictors of high-risk patterns of substance use are often analysed in relation to demographic and school-related factors. The interaction between these factors and the additional impact of family wealth are still new areas of research. The aim of this study was to find determinants of the most common patterns of psychoactive substance use in mid-adolescence, compared to non-users. A sample of 1202 Polish students (46.1% boys, mean age of 15.6 years) was surveyed in 2013/2014. Four patterns of psychoactive substance use were defined using cluster analysis: non-users—71.9%, mainly tobacco and alcohol users—13.7%, high alcohol and cannabis users—7.2%, poly-users—7.2%. The final model contained the main effects of gender and age, and one three-way (perceived academic achievement × gender × family affluence) interaction. Girls with poor perception of school performance (as compared to girls with better achievements) were at significantly higher risk of being poly-users, in both less and more affluent families (adjusted odds ratio (OR) = 5.55 and OR = 3.60, respectively). The impact of family affluence was revealed only in interaction with other factors. Patterns of substance use in mid-adolescence are strongly related to perceived academic achievements, and these interact with selected socio-demographic factors.

  20. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi; Zhou, Lan; Najibi, Seyed Morteza; Gao, Xin; Huang, Jianhua Z.

    2015-01-01

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.