WorldWideScience

Sample records for automated sampling assessment

  1. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters..., glucocorticoid dynamics, as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal... corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  2. AUTOMATING GROUNDWATER SAMPLING AT HANFORD

    Energy Technology Data Exchange (ETDEWEB)

    CONNELL CW; HILDEBRAND RD; CONLEY SF; CUNNINGHAM DE

    2009-01-16

    Until this past October, Fluor Hanford managed Hanford's integrated groundwater program for the U.S. Department of Energy (DOE). With the new contract awards at the Site, however, the CH2M HILL Plateau Remediation Company (CHPRC) has assumed responsibility for the groundwater-monitoring programs at the 586-square-mile reservation in southeastern Washington State. These programs are regulated by the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). The purpose of monitoring is to track existing groundwater contamination from past practices, as well as other potential contamination that might originate from RCRA treatment, storage, and disposal (TSD) facilities. An integral part of the groundwater-monitoring program involves taking samples of the groundwater and measuring the water levels in wells scattered across the site. More than 1,200 wells are sampled each year. Historically, field personnel or 'samplers' have been issued pre-printed forms that have information about the well(s) for a particular sampling evolution. This information is taken from the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS)--official electronic databases. The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and the collected information was posted onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. This is a pilot project for automating this tedious process by providing an electronic tool for automating water-level measurements and groundwater field-sampling activities. The automation will eliminate the manual forms and associated data entry, improve the

  3. Technology modernization assessment flexible automation

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

The objectives of this report are: (1) to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; (2) to give examples showing how assessment guidelines may be applied to a current project; and (3) to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits of and barriers to automation and concludes that, while significant benefits do exist, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on the results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  4. Automated sample preparation for CE-SDS.

    Science.gov (United States)

    Le, M Eleanor; Vizel, Alona; Hutterer, Katariina M

    2013-05-01

Traditionally, CE with SDS (CE-SDS) places many restrictions on sample composition. Requirements include low salt content, known initial sample concentration, and a narrow window of final sample concentration. As these restrictions require buffer exchange for many sample types, sample preparation is often tedious and yields poor sample recoveries. To improve capacity and streamline sample preparation, an automated robotic platform was developed using the PhyNexus Micro-Extractor Automated Instrument (MEA) for both the reduced and nonreduced CE-SDS assays. This automated sample preparation normalizes sample concentration, removes salts and other contaminants, and adds the required CE-SDS reagents, essentially eliminating manual steps during sample preparation. Fc-fusion proteins and monoclonal antibodies were used in this work to demonstrate the benefits of this approach compared to the manual method. With optimized conditions, this application has demonstrated decreased analyst hands-on time and reduced total assay time. Sample recovery greater than 90% can be achieved, regardless of the initial composition and concentration of the analyte.
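The concentration-normalization step that the abstract describes rests on simple dilution arithmetic, which can be sketched as follows (a minimal illustration; the function name, units, and error handling are assumptions, not part of the published method):

```python
def diluent_volume_ul(sample_vol_ul, initial_conc, target_conc):
    """Volume of diluent (uL) to add so that sample_vol_ul of analyte at
    initial_conc (mg/mL) is normalized down to target_conc (mg/mL).
    Conservation of mass: C0 * V0 = Ct * (V0 + Vd)."""
    if target_conc > initial_conc:
        raise ValueError("cannot reach a higher concentration by dilution")
    return sample_vol_ul * (initial_conc / target_conc - 1.0)

# e.g. 10 uL at 5.0 mg/mL normalized to 1.0 mg/mL needs 40 uL of diluent
```

A robotic platform would compute such a volume per well from the measured initial concentration, which is one way the restriction of a known, narrow final-concentration window can be satisfied automatically.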

  5. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation the overall flow matches the low-resolution simulation while the fine details of the high resolution are preserved. The samples we obtain have both spatial and temporal continuity, allowing smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  6. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system... that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system..., (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  7. Automating Spreadsheet Discovery & Risk Assessment

    CERN Document Server

    Perry, Eric

    2008-01-01

There have been many articles published, and mishaps reported, about the risks of uncontrolled spreadsheets in today's business environment, including non-compliance, operational risk, errors, and fraud, all leading to significant loss events. Spreadsheets fall into the realm of end-user-developed applications and often lack the proper safeguards and controls an IT organization would enforce for enterprise applications. There is also an overall lack of software programming discipline in how spreadsheets are developed. However, before an organization can apply proper controls and discipline to critical spreadsheets, an accurate and living inventory of spreadsheets across the enterprise must be created, and all critical spreadsheets must be identified. As such, this paper proposes an automated approach to the initial stages of the spreadsheet management lifecycle - discovery, inventory and risk assessment. Without the use of technology, these phases are often treated as a one-off project. By leveraging techn...

  8. Automated Assessment in a Programming Tools Course

    Science.gov (United States)

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  9. Automated Training Sample Extraction for Global Land Cover Mapping

    Directory of Open Access Journals (Sweden)

    Julien Radoux

    2014-05-01

Full Text Available Land cover is one of the essential climate variables of the ESA Climate Change Initiative (CCI). In this context, the Land Cover CCI (LC CCI) project aims at building global land cover maps suitable for climate modeling based on Earth observation by satellite sensors. The challenge is to generate a set of successive maps that are both accurate and consistent over time. To do so, operational methods for the automated classification of optical images are investigated. The proposed approach consists of a locally trained classification using an automated selection of training samples from existing, but outdated, land cover information. Combinations of local extraction (based on spatial criteria) and self-cleaning of training samples (based on spectral criteria) are quantitatively assessed. Two large study areas, one in Eurasia and the other in South America, are considered. The proposed morphological cleaning of the training samples leads to higher accuracies than statistical outlier removal in the spectral domain. An optimal neighborhood has been identified for the local sample extraction. The results are coherent for the two test areas, showing an improvement of the overall accuracy compared with the original reference datasets and a significant reduction of macroscopic errors. More importantly, the proposed method partly controls the reliability of existing land cover maps as sources of training samples for supervised classification.
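The morphological cleaning of training samples mentioned above can be sketched as a binary opening (erosion then dilation) over the set of candidate training pixels, which drops isolated, likely mislabeled pixels while keeping spatially coherent regions. This is an illustrative reconstruction on a 4-connected neighborhood, not the paper's exact operator:

```python
def erode(cells):
    """4-connected erosion on a set of (row, col) pixels: a pixel survives
    only if all four cross-neighbors are also candidates."""
    return {(r, c) for (r, c) in cells
            if {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)} <= cells}

def dilate(cells):
    """4-connected dilation: grow each surviving pixel into its neighbors."""
    return cells | {(r + dr, c + dc) for (r, c) in cells
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))}

def morphological_clean(cells):
    """Opening: removes isolated candidate pixels, keeps coherent regions."""
    return dilate(erode(cells))
```

Applied to a candidate mask derived from an outdated land cover map, such an opening retains only pixels embedded in homogeneous patches of the old class label, which is one plausible spatial criterion for selecting reliable training samples.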

  10. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

Full Text Available This study investigates automated data accuracy assessment as described in the data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry over a 10-year publication period, retrieved from two bibliometric data sources, Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard), and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects truer accuracy scores. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method for bibliographic data accuracy, as well as on defining the impact of data accuracy on the citation matching process.
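The finding that both sources score higher per field than per record follows from how the two measures aggregate: a record only counts as accurate when every one of its fields matches the gold standard, so the record-level score can never exceed the field-level score. A minimal sketch (field names and the exact-match rule are assumptions for illustration):

```python
def accuracy_scores(records, gold):
    """Compare bibliographic records to gold-standard records field by field.
    Returns (per-field accuracy, per-record accuracy)."""
    fields = list(gold[0].keys())
    # a field hit is one field of one record matching the gold standard
    field_hits = sum(rec[f] == ref[f]
                     for rec, ref in zip(records, gold) for f in fields)
    # a record hit requires *every* field of the record to match
    record_hits = sum(all(rec[f] == ref[f] for f in fields)
                      for rec, ref in zip(records, gold))
    return field_hits / (len(gold) * len(fields)), record_hits / len(gold)
```

With two records and two fields, a single wrong year leaves field accuracy at 3/4 but halves record accuracy, mirroring the per-field vs. per-record gap the study reports.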

  11. Automating defence generation for risk assessment

    NARCIS (Netherlands)

    Gadyatskaya, Olga

    2016-01-01

    Efficient risk assessment requires automation of its most tedious tasks: identification of vulnerabilities, attacks that can exploit these vulnerabilities, and countermeasures that can mitigate the attacks. E.g., the attack tree generation by policy invalidation approach looks at systematic automati

  12. Automated Autonomy Assessment System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA has expressed the need to assess crew autonomy relative to performance and evaluate an optimal level of autonomy that maximizes individual and team performance....

  13. Automated PolyU Palmprint sample Registration and Coarse Classification

    CERN Document Server

    M., Dhananjay D; Muralikrishna, I V

    2011-01-01

Biometric-based authentication for secured access to resources has gained importance due to its reliable, invariant and discriminating features. Palmprint is one such biometric entity. Prior to classification and identification, registering a sample palmprint is an important activity. In this paper we propose a computationally effective method for automated registration of samples from the PolyU palmprint database. In our approach we preprocess the sample and trace the border to find the point nearest to the center of the sample. The angle between the vector representing the nearest point and the vector passing through the center is used for automated palm sample registration. The angle of inclination between the start and end points of the heart line and life line is used for basic classification of palmprint samples into a left class and a right class.
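The registration angle described above can be sketched as a signed angle between two vectors anchored at the palm center (an illustrative reconstruction; the abstract does not define the reference vector precisely, so the names and the degree convention here are assumptions):

```python
import math

def registration_angle(center, nearest_border_point, reference_point):
    """Signed angle (degrees) between the vector from the palm center to the
    nearest border point and the vector from the center through a reference
    point. Points are (x, y) tuples."""
    v1 = (nearest_border_point[0] - center[0], nearest_border_point[1] - center[1])
    v2 = (reference_point[0] - center[0], reference_point[1] - center[1])
    ang = math.degrees(math.atan2(v1[1], v1[0]) - math.atan2(v2[1], v2[0]))
    return (ang + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
```

Rotating the sample by the negative of this angle would bring every palmprint into a common orientation before coarse classification, which is the role registration plays in the pipeline.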

  14. Automated PolyU Palmprint sample Registration and Coarse Classification

    Directory of Open Access Journals (Sweden)

    Dhananjay D M

    2011-11-01

Full Text Available Biometric-based authentication for secured access to resources has gained importance due to its reliable, invariant and discriminating features. Palmprint is one such biometric entity. Prior to classification and identification, registering a sample palmprint is an important activity. In this paper we propose a computationally effective method for automated registration of samples from the PolyU palmprint database. In our approach we preprocess the sample and trace the border to find the point nearest to the center of the sample. The angle between the vector representing the nearest point and the vector passing through the center is used for automated palm sample registration. The angle of inclination between the start and end points of the heart line and life line is used for basic classification of palmprint samples into a left class and a right class.

  15. Automated microdroplet platform for sample manipulation and polymerase chain reaction.

    Science.gov (United States)

    Chabert, Max; Dorfman, Kevin D; de Cremoux, Patricia; Roeraade, Johan; Viovy, Jean-Louis

    2006-11-15

We present a fully automated system performing continuous sampling, reagent mixing, and polymerase chain reaction (PCR) in microdroplets transported in immiscible oil. Sample preparation and analysis are totally automated, using an original injection method from a modified 96-well plate layered with three superimposed liquid layers, and in-capillary laser-induced fluorescence endpoint detection. The process is continuous, allowing sample droplets to be carried uninterruptedly into the reaction zone while new drops are aspirated from the sample plate. Reproducible amplification, negligible cross-contamination, and detection of low sample concentrations were demonstrated on numerous consecutive sample drops. The system, which opens the route to substantial reagent and labor savings in high-throughput applications, was validated on the clinically relevant quantification of progesterone receptor gene expression in human breast cancer cell lines.

  16. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analysis (described in more detail in "Automated Desalting Apparatus" (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.
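The three conductivity conditions described above suggest a simple threshold-driven sequencer: each protocol step waits until the outlet probe reads the conductivity state that marks the step's completion, then advances. The sketch below is an assumption-laden illustration (the thresholds, stage names, and units are invented for the example, not taken from the NASA brief):

```python
LOW_THRESHOLD = 50.0     # uS/cm: flushed back to a low-conductivity state
HIGH_THRESHOLD = 1000.0  # uS/cm: acid or base wash has reached the probe

STAGES = ["neutralize", "acidify", "basify"]
EXPECTED = {"neutralize": "low", "acidify": "high", "basify": "high"}

def classify(conductivity):
    """Map a raw outlet-probe reading onto a coarse conductivity state."""
    if conductivity <= LOW_THRESHOLD:
        return "low"
    if conductivity >= HIGH_THRESHOLD:
        return "high"
    return "transition"

def next_stage(current_stage, conductivity):
    """Advance only once the probe reads the state expected for this step;
    otherwise keep washing at the current step."""
    if classify(conductivity) != EXPECTED[current_stage]:
        return current_stage
    i = STAGES.index(current_stage)
    return STAGES[i + 1] if i + 1 < len(STAGES) else "done"
```

Because each stage advances on a measured state rather than a fixed wash time, the sequencer stops washing as soon as the condition is met, which is how such monitoring minimizes consumables.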

  17. An automated 55 GHz cryogenic Josephson sampling oscilloscope

    DEFF Research Database (Denmark)

    Bodin, P.; Jacobsen, M. L.; Kyhle, Anders;

    1993-01-01

A computer-automated superconductive 55 GHz sampling oscilloscope based on 4 kA/cm² Nb/Nb2O5/Pb edge Josephson junctions is presented. The Josephson sampler chip was flip-chip bonded to a carrier chip with a coplanar transmission line by use of a novel flip-chip bonding machine. A 5.6 ps step...

  18. An Automated Home Made Low Cost Vibrating Sample Magnetometer

    CERN Document Server

    Kundu, S

    2011-01-01

The design and operation of a homemade low-cost vibrating sample magnetometer is described here. The sensitivity of this instrument is better than 10^-2 emu, and it is found to be very efficient for measuring the magnetization of most ferromagnetic and other magnetic materials as a function of temperature down to 77 K and magnetic field up to 800 Oe. Both M(H) and M(T) data acquisition are fully automated, employing a computer and LabVIEW software.

  19. An Automated Home Made Low Cost Vibrating Sample Magnetometer

    Science.gov (United States)

    Kundu, S.; Nath, T. K.

    2011-07-01

The design and operation of a homemade low-cost vibrating sample magnetometer is described here. The sensitivity of this instrument is better than 10^-2 emu, and it is found to be very efficient for measuring the magnetization of most ferromagnetic and other magnetic materials as a function of temperature down to 77 K and magnetic field up to 800 Oe. Both M(H) and M(T) data acquisition are fully automated, employing a computer and LabVIEW software.

  20. AUTOMATING GROUNDWATER SAMPLING AT HANFORD THE NEXT STEP

    Energy Technology Data Exchange (ETDEWEB)

    CONNELL CW; CONLEY SF; HILDEBRAND RD; CUNNINGHAM DE; R_D_Doug_Hildebrand@rl.gov; DeVon_E_Cunningham@rl.gov

    2010-01-21

Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very "people intensive." Approximately 1,500 wells are sampled each year by field personnel or "samplers." These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from two official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the metadata associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process in which the data are transferred to the database and the electronic form is filed in managed records, thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  1. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    Energy Technology Data Exchange (ETDEWEB)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States); Prasanna, P.G.S. [Armed Forces Radiobiology Research Institute, Uniformed Services University of Health Sciences, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States)], E-mail: prasanna@afrri.usuhs.mil

    2007-07-15

The chromosome-aberration-based dicentric assay is expected to be used after mass-casualty, life-threatening radiation exposures to assess the radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, a high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain of custody by tracking samples during processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed-station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and...
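The chain-of-custody idea, scanning a barcode at each critical processing step so that a sample's full path through the laboratory can be audited, can be sketched with a minimal in-memory log (purely illustrative; the AFRRI LIMS schema is not described in the abstract, and all names here are invented):

```python
from datetime import datetime, timezone

class SampleTracker:
    """Toy chain-of-custody log: every barcode scan at a processing step is
    recorded with operator and UTC timestamp, eliminating transcription."""

    def __init__(self):
        self.events = []  # (barcode, step, operator, utc timestamp)

    def scan(self, barcode, step, operator):
        """Record one barcode scan at a critical step."""
        self.events.append((barcode, step, operator,
                            datetime.now(timezone.utc)))

    def chain_of_custody(self, barcode):
        """Ordered list of processing steps recorded for one sample."""
        return [step for (b, step, _op, _t) in self.events if b == barcode]
```

Because each scan is an append-only event tied to a barcode, the log both removes manual data entry and makes any gap in a sample's custody chain immediately visible.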

  2. Automated acoustic matrix deposition for MALDI sample preparation.

    Science.gov (United States)

    Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M

    2006-02-01

Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, issues with clogging of these orifices are avoided. Automated matrix deposition provides better control of the conditions affecting protein extraction and matrix crystallization, with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 μm in diameter were obtained, and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility were found to be better than those obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented.

  3. The Development of the Missouri Automated Reinforcer Assessment (MARA).

    Science.gov (United States)

    Vatterott, Madeleine

Knowledge of an individual's preferences is essential to creating an effective reward or reinforcer program for individuals who need either to reduce maladaptive behaviors or to increase adaptive behaviors. The goal of the Missouri Automated Reinforcer Assessment (MARA) project is to develop an efficient yet thorough automated reinforcer…

  4. Automated Scanning Electron Microscopy Analysis of Sampled Aerosol

    DEFF Research Database (Denmark)

    Bluhme, Anders Brostrøm; Kling, Kirsten; Mølhave, Kristian

...development of an automated software-based analysis of aerosols using Scanning Electron Microscopy (SEM) and Scanning Transmission Electron Microscopy (STEM) coupled with Energy-Dispersive X-ray Spectroscopy (EDS). The automated analysis will be capable of providing both detailed physical and chemical single...

  5. Impact of office automation: an empirical assessment

    OpenAIRE

    1988-01-01

Approved for public release; distribution is unlimited. This study examined the productivity of the Standard Automated Contracting System (SACONS) in a before/after quasi-experimental design that measured outputs (workload, quality of service), inputs (size of staff, staff grade structure, usage of overtime) and by-product social effects (morale, teamwork, professionalism) using archival data. While workload increased slightly, the quality measure (procurement action lead time) improved ...

  6. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.;

    2009-01-01

Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new... data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition...

  7. Validity Arguments for Diagnostic Assessment Using Automated Writing Evaluation

    Science.gov (United States)

    Chapelle, Carol A.; Cotos, Elena; Lee, Jooyoung

    2015-01-01

Two examples demonstrate an argument-based approach to validation of diagnostic assessment using automated writing evaluation (AWE). "Criterion"®, developed by Educational Testing Service, analyzes students' papers grammatically, providing sentence-level error feedback. An interpretive argument was developed for its use as part of…

  8. Automated assessment of upper extremity movement impairment due to stroke.

    Directory of Open Access Journals (Sweden)

    Erienne V Olesh

    Full Text Available Current diagnosis and treatment of movement impairment post-stroke is based on the subjective assessment of select movements by a trained clinical specialist. However, modern low-cost motion capture technology allows for the development of automated quantitative assessment of motor impairment. Such outcome measures are crucial for advancing post-stroke treatment methods. We sought to develop an automated method of measuring the quality of movement in clinically-relevant terms from low-cost motion capture. Unconstrained movements of upper extremity were performed by people with chronic hemiparesis and recorded by standard and low-cost motion capture systems. Quantitative scores derived from motion capture were compared to qualitative clinical scores produced by trained human raters. A strong linear relationship was found between qualitative scores and quantitative scores derived from both standard and low-cost motion capture. Performance of the automated scoring algorithm was matched by averaged qualitative scores of three human raters. We conclude that low-cost motion capture combined with an automated scoring algorithm is a feasible method to assess objectively upper-arm impairment post stroke. The application of this technology may not only reduce the cost of assessment of post-stroke movement impairment, but also promote the acceptance of objective impairment measures into routine medical practice.

  9. Automated assessment of upper extremity movement impairment due to stroke.

    Science.gov (United States)

    Olesh, Erienne V; Yakovenko, Sergiy; Gritsenko, Valeriya

    2014-01-01

    Current diagnosis and treatment of movement impairment post-stroke is based on the subjective assessment of select movements by a trained clinical specialist. However, modern low-cost motion capture technology allows for the development of automated quantitative assessment of motor impairment. Such outcome measures are crucial for advancing post-stroke treatment methods. We sought to develop an automated method of measuring the quality of movement in clinically-relevant terms from low-cost motion capture. Unconstrained movements of upper extremity were performed by people with chronic hemiparesis and recorded by standard and low-cost motion capture systems. Quantitative scores derived from motion capture were compared to qualitative clinical scores produced by trained human raters. A strong linear relationship was found between qualitative scores and quantitative scores derived from both standard and low-cost motion capture. Performance of the automated scoring algorithm was matched by averaged qualitative scores of three human raters. We conclude that low-cost motion capture combined with an automated scoring algorithm is a feasible method to assess objectively upper-arm impairment post stroke. The application of this technology may not only reduce the cost of assessment of post-stroke movement impairment, but also promote the acceptance of objective impairment measures into routine medical practice.
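The strong linear relationship reported above, between quantitative motion-capture scores and the averaged qualitative scores of three human raters, is the kind of result a Pearson correlation against the rater mean would quantify. A dependency-free sketch (function names are illustrative, not the authors' code):

```python
def rater_mean(ratings_per_movement):
    """Average the qualitative scores of several raters for each movement."""
    return [sum(r) / len(r) for r in ratings_per_movement]

def pearson_r(x, y):
    """Pearson correlation between automated scores x and reference scores y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

An r near 1 between the automated scores and the rater mean would support the paper's conclusion that the algorithm's performance matches averaged human judgment.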

  10. Fast detection of Noroviruses using a real-time PCR assay and automated sample preparation

    Directory of Open Access Journals (Sweden)

    Schmid Michael

    2004-06-01

    Full Text Available Abstract Background Noroviruses (NoV) have become one of the most commonly reported causative agents of large outbreaks of non-bacterial acute gastroenteritis worldwide, as well as of sporadic gastroenteritis in the community. Reverse transcriptase polymerase chain reaction (RT-PCR) assays are now established in NoV diagnosis, but improvements that simplify and standardize sample preparation, amplification, and detection are still needed. The combination of automated sample preparation and real-time PCR offers such refinements. Methods We designed a new real-time RT-PCR assay on the LightCycler (LC) with SYBR Green detection and melting curve analysis (Tm) to detect NoV RNA in patient stool samples. The performance of the real-time PCR assay was compared with that of a commercially available enzyme immunoassay (ELISA) for antigen detection, run in parallel on a panel of 52 stool samples. Additionally, in a collaborative study with the Baden-Wuerttemberg State Health Office, Stuttgart (Germany), the real-time PCR results were blindly assessed against a previously well-established nested PCR (nPCR) as the reference method, since PCR-based techniques are now considered the "gold standard" for NoV detection in stool specimens. Results Analysis of 52 clinical stool samples by real-time PCR yielded results consistent with the reference nPCR results, while marked differences between the two PCR-based methods and the antigen ELISA were observed. Our results indicate that PCR-based procedures are more sensitive and specific than antigen ELISA for detecting NoV in stool specimens. Conclusions The combination of automated sample preparation and real-time PCR provided reliable diagnostic results in less time than conventional RT-PCR assays. These benefits make it a valuable tool for routine laboratory practice, especially in terms of rapid and appropriate outbreak-control measures in health-care facilities and other settings.

  11. Automated washing of FTA Card punches and PCR setup for reference samples using a LIMS-controlled Sias Xantus automated liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Olsen, Addie Nina; Frøslev, Tobias G.;

    2009-01-01

    We have implemented and validated automated methods for washing FTA Card punches containing buccal samples and subsequent PCR setup using a Sias Xantus automated liquid handler. The automated methods were controlled by worklists generated by our LabWare Laboratory Information Management System (L...

  12. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others]

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  13. Evaluation of the measurement uncertainty in automated long-term sampling of PCDD/PCDFs.

    Science.gov (United States)

    Vicaretti, M; D'Emilia, G; Mosca, S; Guerriero, E; Rotatori, M

    2013-12-01

    Since the publication of the first version of European standard EN-1948 in 1996, long-term sampling equipment has been improved to a high standard for the sampling and analysis of polychlorodibenzo-p-dioxin (PCDD)/polychlorodibenzofuran (PCDF) emissions from industrial sources. Current automated PCDD/PCDF sampling systems can extend the measurement time from 6-8 h to 15-30 days, yielding data that better represent the plant's real pollutant emissions over the long term. EN-1948:2006 is still the European technical reference standard for the determination of PCDD/PCDF from stationary source emissions. In this paper, a methodology to estimate the measurement uncertainty of long-term automated sampling is presented. The methodology has been tested on a set of high-concentration sampling data from a specific campaign; it is proposed with the intent that it be applied to further similar studies and generalized. A comparison between short-term sampling data from manual and automated parallel measurements was also considered, in order to verify the feasibility and usefulness of automated systems and to establish correlations between the results of the two methods, so that the manual method can be used to calibrate the automatic long-term one. The uncertainty components of the manual method are analyzed, following the requirements of EN-1948-3:2006, allowing a preliminary evaluation of the corresponding uncertainty components of the automated system. Then, a comparison between experimental data from parallel sampling campaigns carried out over short- and long-term sampling periods is realized. Long-term sampling is more reliable for monitoring PCDD/PCDF emissions than occasional short-term sampling. Automated sampling systems can assure very useful emission data in both short and long sampling periods. Despite this, due to the different application of the long-term sampling systems, the automated results could not be
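The abstract does not reproduce the uncertainty budget itself, but the standard GUM-style combination of independent relative standard uncertainties can be sketched as follows. The component values and their grouping below are purely illustrative assumptions, not figures from the study:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    standard uncertainties (the GUM approach for uncorrelated inputs)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical relative standard uncertainties for a long-term
# PCDD/PCDF sampling chain: sampled gas volume, extraction/cleanup
# recovery, and instrumental analysis (illustrative values only).
u_components = [0.05, 0.08, 0.10]
u_c = combined_standard_uncertainty(u_components)
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 % level)
print(round(u_c, 4), round(U, 4))
```

With such a budget, the largest component (here the instrumental one) dominates the combined value, which is why comparing the manual and automated chains component by component is informative.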

  14. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  15. Automated assessment of medical training evaluation text.

    Science.gov (United States)

    Zhang, Rui; Pakhomov, Serguei; Gladding, Sophia; Aylward, Michael; Borman-Shoap, Emily; Melton, Genevieve B

    2012-01-01

    Medical post-graduate residency training and medical student training increasingly utilize electronic systems to evaluate trainee performance against defined training competencies, using quantitative and qualitative data, the latter of which typically consists of text comments. Medical education is concomitantly becoming a growing area of clinical research. While electronic systems have proliferated in number, little work has been done to help manage and analyze the qualitative data from these evaluations. We explored the use of text-mining techniques to assist medical education researchers in sentiment analysis and topic analysis of residency evaluations, with a sample of 812 evaluation statements. While comments were predominantly positive, sentiment analysis was able to discriminate statements with 93% accuracy. As in other domains, Latent Dirichlet Analysis and Information Gain revealed groups of core subjects and appear to be useful for identifying topics in these data.
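Information Gain, one of the two techniques named above, scores how much knowing whether a term appears in a comment reduces uncertainty about its label. A minimal sketch; the labels and terms below are invented for illustration and are not the study's data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, term_present):
    """Information gain of a term for discriminating evaluation labels.

    labels: sentiment label per comment (e.g. 'pos'/'neg').
    term_present: parallel booleans, True if the comment contains the term.
    """
    total = entropy(labels)
    with_term = [l for l, p in zip(labels, term_present) if p]
    without = [l for l, p in zip(labels, term_present) if not p]
    n = len(labels)
    conditional = (len(with_term) / n) * entropy(with_term) + \
                  (len(without) / n) * entropy(without)
    return total - conditional

# Hypothetical comments: "struggles" appears mostly in negative ones,
# while a noise term is spread evenly across both classes.
labels = ["neg", "neg", "neg", "pos", "pos", "pos", "pos", "pos"]
struggles = [True, True, True, False, False, False, False, True]
noise_term = [True, False, True, False, True, False, True, False]
print(information_gain(labels, struggles), information_gain(labels, noise_term))
```

Terms with high information gain are the ones worth surfacing to education researchers; evenly distributed terms score near zero.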

  16. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    Science.gov (United States)

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about sources of supply, trafficking routes, distribution patterns, and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount seized throughout the country was very small; finding links between samples was therefore more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan, which uses a gas chromatograph with flame ionization detector (GC-FID), a DB-5 column, and four internal standards, and which was designed to maximize the recovered impurities while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed, and the data processing steps are complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation introduced during sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
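The similarity-assessment step described above reduces to a Pearson correlation between aligned impurity-peak profiles. A minimal sketch, with invented peak-area vectors standing in for real, retention-time-corrected chromatograms:

```python
import numpy as np

def profile_similarity(peaks_a, peaks_b):
    """Pearson correlation coefficient between two impurity-peak profiles.

    peaks_a, peaks_b: aligned vectors of peak responses (same impurities,
    same order), e.g. after retention-time correction and normalization
    to internal standards.
    """
    a = np.asarray(peaks_a, dtype=float)
    b = np.asarray(peaks_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical peak-area vectors for three seizures;
# the first two are meant to share a common origin.
case1_a = [120.0, 45.2, 8.9, 310.5, 17.3]
case1_b = [118.4, 44.1, 9.3, 305.0, 16.8]
case2   = [15.0, 210.7, 80.2, 12.4, 95.6]

r_same = profile_similarity(case1_a, case1_b)
r_diff = profile_similarity(case1_a, case2)
print(r_same, r_diff)  # the same-origin pair correlates near 1
```

A threshold such as the study's r > 0.99 then flags candidate same-origin pairs for visual confirmation of the chromatograms.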

  17. Biological Environmental Sampling Technologies Assessment

    Science.gov (United States)

    2015-12-01

    Assessment of technologies for biological environmental sampling, including the ANP Technologies Nano Intelligent Detection System (NIDS), the BBI Detection BWA Integrated Multiplex Assay and Sampling System (IMASS), and the Aklus Shield system, which can collect samples from all types of surfaces, absorb unknown liquids, and sample debris, soil, or vegetation.

  18. Rapid and automated determination of plutonium and neptunium in environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, J.

    2011-03-15

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The methodology development in this work comprises five subjects: (1) development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with ICP-MS detection (Paper II); (2) methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to ICP-MS (Paper III); (3) development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with ICP-MS (Paper V); (5) exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the methods developed in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium, as demanded in situations including environmental risk monitoring and assessment, emergency preparedness, and surveillance of contaminated areas. (Author)

  19. Automated Research Impact Assessment: A New Bibliometrics Approach.

    Science.gov (United States)

    Drew, Christina H; Pettibone, Kristianna G; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-03-01

    As federal programs are held more accountable for their research investments, The National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in "important" research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS funded research.

  20. SASSI: Subsystems for Automated Subsurface Sampling Instruments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future robotic planetary exploration missions will benefit greatly from the ability to capture rock and/or regolith core samples that deliver the stratigraphy of the...

  1. SASSI: Subsystems for Automated Subsurface Sampling Instruments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Autonomous surface sampling systems are necessary, near term, to construct a historical view of planetary significant events; as well as allow for the identification...

  2. Experiences of Using Automated Assessment in Computer Science Courses

    Directory of Open Access Journals (Sweden)

    John English

    2015-10-01

    Full Text Available In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students type in their own answers). Students were allowed to correct errors based on feedback provided by the system and resubmit their answers. A total of 141 students were surveyed to assess their opinions of this approach, and we analysed their responses. Analysis of the questionnaire showed a low correlation between questions, indicating the statistical independence of the individual questions. As a whole, student feedback on using Checkpoint was very positive, emphasizing the benefits of multiple attempts, impartial marking, and a quick turnaround time for submissions. Many students said that Checkpoint gave them confidence in learning and motivation to practise. Students also said that the detailed feedback that Checkpoint generated when their programs failed helped them understand their mistakes and how to correct them.

  3. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    Science.gov (United States)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from the massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedentedly high resolution impervious cover data set is not only significant to urbanization studies but also needed for global carbon, hydrology, and energy balance research. A supervised classification method, regression tree, is applied in this project, and a set of accurate training samples is the key to any supervised classification. We developed the global training samples from fine resolution (about 1 m) satellite data (Quickbird and WorldView-2) and then aggregated the fine resolution impervious cover map to 30 m resolution. To improve classification accuracy, the training samples should be screened before being used to train the regression tree, but it is impossible to manually screen 30 m resolution training samples collected globally. In Europe alone, for example, there are 174 training sites, ranging in size from 4.5 km by 4.5 km to 8.1 km by 3.6 km, and the number of training samples exceeds six million. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: the site level and the scene level. At the site level, all the training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling in each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. The screening process then escalates to the scene level, where a similar procedure, but with a looser threshold, is applied to allow for possible variance due to site differences.
We do not perform the screening process across scenes because the scenes might vary due to
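The two site-level screening criteria, univariate and multivariate outlier detection within each impervious-fraction group, can be sketched as follows. The z-score and Mahalanobis-distance thresholds and the synthetic data are illustrative assumptions, not the project's actual settings:

```python
import numpy as np

def screen_group(features, z_thresh=3.0, md_thresh=3.0):
    """Flag outlier training samples within one impervious-fraction group.

    features: (n_samples, n_bands) array of spectral values.
    Returns a boolean mask of samples to KEEP: a sample survives if every
    band's z-score and its Mahalanobis distance stay below the thresholds.
    """
    X = np.asarray(features, dtype=float)
    mu = X.mean(axis=0)
    # Univariate screen: per-band z-scores against the group statistics.
    z = np.abs((X - mu) / X.std(axis=0))
    uni_ok = (z < z_thresh).all(axis=1)
    # Multivariate screen: Mahalanobis distance to the group mean.
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    d = X - mu
    md = np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))
    multi_ok = md < md_thresh
    return uni_ok & multi_ok

# 50 well-behaved samples in one 10% group, plus one gross outlier.
rng = np.random.default_rng(1)
good = rng.normal(0.3, 0.02, size=(50, 4))
bad = np.array([[0.9, 0.9, 0.9, 0.9]])
keep = screen_group(np.vstack([good, bad]))
print(keep.sum(), keep[-1])
```

Running the same screen per group, then again per scene with looser thresholds, mirrors the two-level structure described in the abstract.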

  4. The Impact of Sampling Approach on Population Invariance in Automated Scoring of Essays. Research Report. ETS RR-13-18

    Science.gov (United States)

    Zhang, Mo

    2013-01-01

    Many testing programs use automated scoring to grade essays. One issue in automated essay scoring that has not been examined adequately is population invariance and its causes. The primary purpose of this study was to investigate the impact of sampling in model calibration on population invariance of automated scores. This study analyzed scores…

  5. Development of an automated sample preparation module for environmental monitoring of biowarfare agents.

    Science.gov (United States)

    Hindson, Benjamin J; Brown, Steve B; Marshall, Graham D; McBride, Mary T; Makarewicz, Anthony J; Gutierrez, Dora M; Wolcott, Duane K; Metz, Thomas R; Madabhushi, Ramakrishna S; Dzenitis, John M; Colston, Billy W

    2004-07-01

    An automated sample preparation module, based upon sequential injection analysis (SIA), has been developed for use within an autonomous pathogen detection system. The SIA system interfaced aerosol sampling with multiplexed microsphere immunoassay-flow cytometric detection. Metering and sequestering of microspheres using SIA was found to be reproducible and reliable, over 24-h periods of autonomous operation. Four inbuilt immunoassay controls showed excellent immunoassay and system stability over five days of unattended continuous operation. Titration curves for two biological warfare agents, Bacillus anthracis and Yersinia pestis, obtained using the automated SIA procedure were shown to be similar to those generated using a manual microtiter plate procedure.

  6. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput of high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research, and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, easily accommodating different resins, samples, and reagent types. Once programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column, and elute fractions. Unattended, the automated low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carryover, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  7. Integrating Electrochemical Detection with Centrifugal Microfluidics for Real-Time and Fully Automated Sample Testing

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga; Kwasny, Dorota; Amato, Letizia

    2015-01-01

    experiments, even when the microfluidic disc is spinning at high velocities. Automated sample handling is achieved by designing a microfluidic system to release analyte sequentially, utilizing on-disc passive valving. In addition, the microfluidic system is designed to trap and keep the liquid sample...... electrochemical experiment, including all intermediate sample handling steps, is demonstrated by amperometric detection of on-disc mixing of analytes (PBS and ferricyanide)....

  8. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    Science.gov (United States)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A.G.; Sellergren, Börje; Reubsaet, Léon

    2017-01-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting. PMID:28303910

  9. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    Science.gov (United States)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon

    2017-03-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting.

  10. Security Measures in Automated Assessment System for Programming Courses

    Directory of Open Access Journals (Sweden)

    Jana Šťastná

    2015-12-01

    Full Text Available A desirable characteristic of programming code assessment is to provide the learner with the most appropriate information regarding the code's functionality, as well as a chance to improve. This can hardly be achieved when the number of learners is high (500 or more). In this paper we address the problem of risky code testing and the availability of the assessment platform Arena, dealing with potential security risks when providing automated assessment for a large set of source code. Looking at students' programs as if they were potentially malicious inspired us to investigate separated execution environments, used by security experts for secure software analysis. The results also show that availability issues of our assessment platform can be conveniently resolved with task queues. Special attention is paid to Docker, a virtual container ensuring that no risky code can affect the assessment system's security. The assessment platform Arena makes it possible to regularly, effectively, and securely assess students' source code in various programming courses. In addition, it is a motivating factor and helps students engage in the educational process.

  11. Quantitative Vulnerability Assessment of Cyber Security for Distribution Automation Systems

    Directory of Open Access Journals (Sweden)

    Xiaming Ye

    2015-06-01

    Full Text Available The distribution automation system (DAS) is vulnerable to cyber-attacks due to the widespread use of terminal devices and standard communication protocols. Given the cost of defense, it is impossible to ensure the security of every device in the DAS. Against this background, a novel quantitative vulnerability assessment model of cyber security for the DAS is developed in this paper. In the assessment model, the potential physical consequences of cyber-attacks are analyzed at two levels: the terminal device level and the control center server level. Then, the attack process is modeled based on game theory, and the relationships among different vulnerabilities are analyzed by introducing a vulnerability adjacency matrix. Finally, the application of the proposed methodology is illustrated through a case study based on bus 2 of the Roy Billinton Test System (RBTS). The results demonstrate the reasonableness and effectiveness of the proposed methodology.
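A vulnerability adjacency matrix of the kind mentioned above lends itself to a simple reachability analysis: which device-level flaws can ultimately expose the control center. The three-node attack chain below is a toy illustration of that idea, not the paper's game-theoretic model or the RBTS case study:

```python
import numpy as np

def reachable(adj):
    """Transitive closure of a vulnerability adjacency matrix.

    adj[i, j] != 0 means exploiting vulnerability i directly enables
    exploiting vulnerability j; the returned R[i, j] is True if i can
    eventually enable j through any chain of vulnerabilities.
    """
    n = adj.shape[0]
    r = adj.astype(bool).copy()
    # Floyd-Warshall-style boolean closure: r[i, j] |= r[i, k] and r[k, j].
    for k in range(n):
        r |= np.outer(r[:, k], r[k, :])
    return r

# Hypothetical chain: terminal device -> communication link -> control center server.
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [0, 0, 0]])
r = reachable(adj)
print(r[0, 2])  # True: a device-level flaw can reach the control center
```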

  12. Quantification of Human Movement for Assessment in Automated Exercise Coaching

    CERN Document Server

    Hagler, Stuart; Bajczy, Ruzena; Pavel, Misha

    2016-01-01

    Quantification of human movement is a challenge in many areas, ranging from physical therapy to robotics. We quantify human movement for the purpose of providing automated exercise coaching in the home. We developed a model-based assessment and inference process that combines biomechanical constraints with movement assessment based on the Microsoft Kinect camera. To illustrate the approach, we quantify the performance of a simple squatting exercise using two model-based metrics related to strength and endurance, and provide an estimate of the strength and energy expenditure of each exercise session. We examine data for five subjects and show that for some subjects the metrics indicate a trend consistent with improved exercise performance.

  13. Automated bone age assessment of older children using the radius

    Science.gov (United States)

    Tsao, Sinchai; Gertych, Arkadiusz; Zhang, Aifeng; Liu, Brent J.; Huang, Han K.

    2008-03-01

    The Digital Hand Atlas in Assessment of Skeletal Development is a large-scale Computer Aided Diagnosis (CAD) project for automating the process of grading the skeletal development of children from 0 to 18 years of age. It includes a complete collection of 1,400 normal hand X-rays of children between the ages of 0 and 18 years. Bone age assessment is used as an index of skeletal development for the detection of growth pathologies that can be related to endocrine disorders, malnutrition, and other disease types. Previous work at the Image Processing and Informatics Lab (IPILab) allowed the bone age CAD algorithm to accurately assess the bone age of children from 1 to 16 (male) or 14 (female) years of age using the phalanges as well as the carpal bones. At the older ages (16 (male) or 14 (female) to 19 years of age), the phalanges and carpal bones are fully developed and no longer provide well-defined features for accurate bone age assessment. Therefore, integration of the radius as a region of interest (ROI) is greatly needed and will significantly improve the ability to accurately assess the bone age of older children. Preliminary studies show that an integrated bone age CAD that utilizes the phalanges, carpal bones, and radius forms a robust method for automatic bone age assessment throughout the entire age range (1 to 19 years of age).

  14. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.;

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat......, neckskin and environmental samples) were collected over a period of 4 months at a turkey slaughterhouse and meat-cutting plant in Denmark. Faecal and environmental samples were tested by the conventional culture method and by the two EIAs, whereas meat and neckskin samples were tested by the two EIAs only...

  15. Using sample entropy for automated sign language recognition on sEMG and accelerometer data.

    Science.gov (United States)

    Kosmidou, Vasiliki E; Hadjileontiadis, Leontios I

    2010-03-01

    Communication using sign language (SL) provides an alternative means of information transmission among the deaf. Automated recognition of the gestures involved in SL, however, could further extend this communication channel to the world of hearers. In this study, data from a five-channel surface electromyogram and a three-dimensional accelerometer on the signers' dominant hand were subjected to a feature extraction process. The latter consisted of sample entropy (SampEn)-based analysis, while time-frequency feature (TFF) analysis was also performed as a baseline method, for the automated recognition of isolated signs from a 60-word lexicon of Greek SL (GSL). Experimental results have shown mean classification accuracies of 66% and 92% using TFF and SampEn, respectively. These results demonstrate the superiority of SampEn over conventional methods such as TFF in providing high recognition hit ratios, combined with a reduction of the feature vector dimension, toward fast and reliable automated GSL gesture recognition.
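    Sample entropy quantifies the irregularity of a signal as the negative log ratio of the counts of matching templates of length m+1 and m. A minimal, unoptimized sketch of the common formulation (the paper's exact parameterization is not specified here):

    ```python
    import math

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r) of a 1-D signal; r is given as a fraction of the
        signal's standard deviation. O(n^2) sketch of the usual formulation."""
        n = len(x)
        mean = sum(x) / n
        sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
        tol = r * sd

        def match_count(dim):
            # count template pairs within Chebyshev distance tol (no self-matches)
            templates = [x[i:i + dim] for i in range(n - dim + 1)]
            count = 0
            for i in range(len(templates)):
                for j in range(i + 1, len(templates)):
                    if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                        count += 1
            return count

        b = match_count(m)      # matches of length m
        a = match_count(m + 1)  # matches of length m + 1
        return -math.log(a / b) if a > 0 and b > 0 else float("inf")
    ```

    A perfectly periodic signal yields a SampEn near zero, while an irregular signal yields a larger value, which is what makes SampEn usable as a discriminative feature.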

  16. Automated quality assessment in three-dimensional breast ultrasound images.

    Science.gov (United States)

    Schwaab, Julia; Diez, Yago; Oliver, Arnau; Martí, Robert; van Zelst, Jan; Gubern-Mérida, Albert; Mourri, Ahmed Bensouda; Gregori, Johannes; Günther, Matthias

    2016-04-01

    Automated three-dimensional breast ultrasound (ABUS) is a valuable adjunct to x-ray mammography for breast cancer screening of women with dense breasts. High image quality is essential for proper diagnostics and computer-aided detection. We propose an automated image quality assessment system for ABUS images that detects artifacts at the time of acquisition. To this end, we study three aspects that can corrupt ABUS images: the nipple position relative to the rest of the breast, the shadow caused by the nipple, and the shape of the breast contour on the image. Image processing and machine learning algorithms are combined to detect these artifacts based on 368 clinical ABUS images that have been rated manually by two experienced clinicians. At a specificity of 0.99, 55% of the images that were rated as low quality are detected by the proposed algorithms. The areas under the ROC curves of the single classifiers are 0.99 for the nipple position, 0.84 for the nipple shadow, and 0.89 for the breast contour shape. The proposed algorithms work fast and reliably, which makes them adequate for online evaluation of image quality during acquisition. The presented concept may be extended to further image modalities and quality aspects.
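    Areas under ROC curves of the kind reported above can be computed directly from classifier scores with the rank-based (Mann-Whitney) formulation. A small sketch on hypothetical artifact scores (label 1 = low-quality image):

    ```python
    # Rank-based AUC: the probability that a randomly chosen positive
    # (low-quality) case scores higher than a randomly chosen negative one.
    # Scores and labels below are hypothetical.

    def roc_auc(scores, labels):
        pos = [s for s, l in zip(scores, labels) if l == 1]
        neg = [s for s, l in zip(scores, labels) if l == 0]
        wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
                   for p in pos for q in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical artifact scores for eight images
    auc = roc_auc([0.9, 0.8, 0.7, 0.4, 0.5, 0.3, 0.2, 0.1],
                  [1, 1, 1, 1, 0, 0, 0, 0])
    ```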

  17. LAVA: a conceptual framework for automated risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Brown, D.C.; Erkkila, T.H.; FitzGerald, P.D.; Lim, J.J.; Massagli, L.; Phillips, J.R.; Tisinger, R.M.

    1986-01-01

    At the Los Alamos National Laboratory we are developing the framework for generating knowledge-based systems that perform automated risk analyses on an organization's assets. An organization's assets can be subdivided into tangible and intangible assets. Tangible assets include facilities, materiel, personnel, and time, while intangible assets include such factors as reputation, employee morale, and technical knowledge. The potential loss exposure of an asset depends upon the threats (both static and dynamic), the vulnerabilities in the mechanisms protecting the assets from the threats, and the consequences of the threats successfully exploiting the protective systems' vulnerabilities. The methodology is based upon decision analysis, fuzzy set theory, natural-language processing, and event-tree structures. The Los Alamos Vulnerability and Risk Assessment (LAVA) methodology has been applied to computer security. LAVA is modeled using an interactive questionnaire in natural language and is fully automated on a personal computer. The program generates summary reports for management personnel and detailed reports for operations staff. LAVA has been in use by the Nuclear Regulatory Commission and the National Bureau of Standards for nearly two years and is presently under evaluation by other governmental agencies. 7 refs.

  18. LAVA: A conceptual framework for automated risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Brown, D.C.; Erkkila, T.H.; FitzGerald, P.D.; Lim, J.J.; Massagli, L.; Phillips, J.R.; Tisinger, R.M.

    1986-01-01

    At the Los Alamos National Laboratory the authors are developing the framework for generating knowledge-based systems that perform automated risk analyses on an organization's assets. An organization's assets can be subdivided into tangible and intangible assets. Tangible assets include facilities, materiel, personnel, and time, while intangible assets include such factors as reputation, employee morale, and technical knowledge. The potential loss exposure of an asset depends upon the threats (both static and dynamic), the vulnerabilities in the mechanisms protecting the assets from the threats, and the consequences of the threats successfully exploiting the protective systems' vulnerabilities. The methodology is based upon decision analysis, fuzzy set theory, natural-language processing, and event-tree structures. The Los Alamos Vulnerability and Risk Assessment (LAVA) methodology has been applied to computer security. The program generates summary reports for management personnel and detailed reports for operations staff.

  19. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33

    Science.gov (United States)

    Round, A. R.; Franke, D.; Moritz, S.; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D. I.; Roessle, M.

    2008-01-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client–server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  20. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    Science.gov (United States)

    2012-03-01

    Workshop presented by Paul Black, PhD, and Ralph Perona, DABT (Neptune and Company), March 2012. Asbestos-containing materials discussed include coatings, vinyl/asbestos floor tile, automatic transmission components, clutch facings, disc brake pads, drum brake linings, and brake blocks.

  1. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies

    Science.gov (United States)

    Abdi, Asad; Idris, Norisma; Alguliyev, Rasim M.; Aliguliyev, Ramiz M.

    2016-01-01

    Background: Summarization is a process of selecting important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be an important tool for improving comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess students' summaries, tasks which are very time-consuming. A computer-assisted assessment can therefore help teachers conduct this task more effectively. Design/Results: This paper proposes an algorithm based on the combination of semantic relations between words and their syntactic composition to identify the summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at both the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm in an automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing. PMID:26735139
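    Precision, Recall and F-measure over identified strategies reduce to set overlaps between the algorithm's output and a reference annotation. A sketch with hypothetical strategy labels (the actual strategy taxonomy is the paper's, not shown here):

    ```python
    # Precision/Recall/F-measure of identified summarizing strategies
    # against a reference annotation. Strategy labels are hypothetical.

    def precision_recall_f1(identified, reference):
        """identified, reference: sets of strategy labels."""
        tp = len(identified & reference)
        precision = tp / len(identified) if identified else 0.0
        recall = tp / len(reference) if reference else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        return precision, recall, f1

    identified = {"deletion", "copy-paste", "paraphrase"}
    reference = {"deletion", "paraphrase", "generalization", "topic-sentence"}
    p, r, f1 = precision_recall_f1(identified, reference)
    ```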

  2. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies.

    Directory of Open Access Journals (Sweden)

    Asad Abdi

    Full Text Available Summarization is a process of selecting important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be an important tool for improving comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess students' summaries, tasks which are very time-consuming. A computer-assisted assessment can therefore help teachers conduct this task more effectively. This paper proposes an algorithm based on the combination of semantic relations between words and their syntactic composition to identify the summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at both the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm in an automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing.

  3. Functional profiling of live melanoma samples using a novel automated platform.

    Directory of Open Access Journals (Sweden)

    Adam Schayowitz

    Full Text Available AIMS: This proof-of-concept study was designed to determine if functional, pharmacodynamic profiles relevant to targeted therapy could be derived from live human melanoma samples using a novel automated platform. METHODS: A series of 13 melanoma cell lines was briefly exposed to a BRAF inhibitor (PLX-4720 on a platform employing automated fluidics for sample processing. Levels of the phosphoprotein p-ERK in the mitogen-activated protein kinase (MAPK pathway from treated and untreated sample aliquots were determined using a bead-based immunoassay. Comparison of these levels provided a determination of the pharmacodynamic effect of the drug on the MAPK pathway. A similar ex vivo analysis was performed on fine needle aspiration (FNA biopsy samples from four murine xenograft models of metastatic melanoma, as well as 12 FNA samples from patients with metastatic melanoma. RESULTS: Melanoma cell lines with known sensitivity to BRAF inhibitors displayed marked suppression of the MAPK pathway in this system, while most BRAF inhibitor-resistant cell lines showed intact MAPK pathway activity despite exposure to a BRAF inhibitor (PLX-4720. FNA samples from melanoma xenografts showed comparable ex vivo MAPK activity as their respective cell lines in this system. FNA samples from patients with metastatic melanoma successfully yielded three categories of functional profiles including: MAPK pathway suppression; MAPK pathway reactivation; MAPK pathway stimulation. These profiles correlated with the anticipated MAPK activity, based on the known BRAF mutation status, as well as observed clinical responses to BRAF inhibitor therapy. CONCLUSION: Pharmacodynamic information regarding the ex vivo effect of BRAF inhibitors on the MAPK pathway in live human melanoma samples can be reproducibly determined using a novel automated platform. Such information may be useful in preclinical and clinical drug development, as well as predicting response to targeted therapy in

  4. Assessment of a five-color flow cytometric assay for verifying automated white blood cell differentials

    Institute of Scientific and Technical Information of China (English)

    HUANG Chun-mei; YU Lian-hui; PU Cheng-wei; WANG Xin; WANG Geng; SHEN Li-song; WANG Jian-zhong

    2013-01-01

    Background: White blood cell (WBC) counts and differentials performed using an automated cell counter typically require manual microscopic review. However, this last step is time-consuming and requires experienced personnel. We evaluated the clinical efficiency of using flow cytometry (FCM) employing a six-antibody/five-color reagent for verifying automated WBC differentials. Methods: A total of 56 apparently healthy samples were assessed using a five-color flow cytometer to verify the normal reference ranges of WBC differentials. WBC differentials of 622 samples were also determined using both a cell counter and FCM. These results were then confirmed using manual microscopic methods. Results: The probabilities of the parameters of WBC differentials exceeding the corresponding normal reference ranges were no more than 7.5%. The resulting WBC differentials were well correlated between FCM and the cell counter (r > 0.88, P < 0.001), except in the case of basophils. Neutrophils, lymphocytes, and eosinophils were well correlated between FCM and standard microscopic cytology assessment (r > 0.80, P < 0.001). The sensitivities of FCM for identification of immature granulocytes and blast cells (72.03% and 22.22%, respectively) were higher than those of the cell counter method (44.92% and 11.11%, respectively). The specificities of FCM were all above 85%, substantially better than those of the cell counter method. Conclusion: These five-color FCM assays can be applied to accurately verify abnormal results of automated WBC differentials.
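    The method-agreement figures quoted above (r > 0.88, etc.) are Pearson correlation coefficients between paired differentials. A minimal sketch on hypothetical neutrophil percentages (not the study's data):

    ```python
    # Pearson's r between two methods' paired measurements.
    # The neutrophil percentages below are hypothetical.

    def pearson_r(x, y):
        """Pearson correlation between two equal-length numeric sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # Hypothetical neutrophil % from the cell counter vs. FCM
    counter = [55.0, 62.1, 48.3, 70.4, 58.8]
    fcm = [54.2, 63.0, 49.1, 69.5, 59.9]
    r = pearson_r(counter, fcm)
    ```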

  5. An instrument for automated purification of nucleic acids from contaminated forensic samples.

    Science.gov (United States)

    Broemeling, David J; Pel, Joel; Gunn, Dylan C; Mai, Laura; Thompson, Jason D; Poon, Hiron; Marziali, Andre

    2008-02-01

    Forensic crime scene sample analysis, by its nature, often deals with samples in which there are low amounts of nucleic acids, on substrates that often lead to inhibition of subsequent enzymatic reactions such as PCR amplification for STR profiling. Common substrates include denim from blue jeans, which yields indigo dye as a PCR inhibitor, and soil, which yields humic substances as inhibitors. These inhibitors frequently co-extract with nucleic acids in standard column or bead-based preps, leading to frequent failure of STR profiling. We present a novel instrument for DNA purification of forensic samples that is capable of highly effective concentration of nucleic acids from soil particulates, fabric, and other complex samples including solid components. The novel concentration process, known as SCODA, is inherently selective for long charged polymers such as DNA, and therefore is able to effectively reject known contaminants. We present an automated sample preparation instrument based on this process, and preliminary results based on mock forensic samples.

  6. Automated mango fruit assessment using fuzzy logic approach

    Science.gov (United States)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit crop after pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders, a method that is inconsistent, inefficient and labor-intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making scheme based on a minimum entropy formulation is developed to analyse the data and classify the mango fruit. The proposed method is capable of automatically differentiating three grades of mango fruit with 77.78% overall accuracy compared to sorting by human graders. This method was found to be helpful for application in the current agricultural industry.
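    The core of a fuzzy grading scheme is a set of membership functions over the measured quantity, with the grade taken as the class of maximum membership. A minimal sketch with triangular memberships; the grade boundaries and units below are hypothetical, not the paper's:

    ```python
    # Fuzzy size grading with triangular membership functions.
    # Grade boundaries (hypothetical mass ranges in grams) are illustrative.

    def triangular(x, a, b, c):
        """Triangular membership: rises from a to peak b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    GRADES = {
        "C": (0.0, 200.0, 300.0),    # small
        "B": (250.0, 350.0, 450.0),  # medium
        "A": (400.0, 500.0, 700.0),  # large
    }

    def grade(mass):
        memberships = {g: triangular(mass, *p) for g, p in GRADES.items()}
        return max(memberships, key=memberships.get)
    ```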

  7. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  8. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.

  9. Automation of Workplace Lifting Hazard Assessment for Musculoskeletal Injury Prevention

    Science.gov (United States)

    2014-01-01

    posture and temporal elements of tasks such as task frequency in an automated fashion, although these findings should be confirmed in a larger study. Further work is needed to incorporate force assessments and address workplace feasibility challenges. We anticipate that this approach could ultimately be used to perform large-scale musculoskeletal exposure assessment not only for research but also to provide real-time feedback to workers and employers during work method improvement activities and employee training. PMID:24987523

  10. Assessment of organic matter resistance to biodegradation in volcanic ash soils assisted by automated interpretation of infrared spectra from humic acid and whole soil samples by using partial least squares

    Science.gov (United States)

    Hernández, Zulimar; Pérez Trujillo, Juan Pedro; Hernández-Hernández, Sergio Alexander; Almendros, Gonzalo; Sanz, Jesús

    2014-05-01

    From a practical viewpoint, the most interesting possibilities of applying infrared (IR) spectroscopy to soil studies lie in processing IR spectra of whole-soil (WS) samples [1] in order to forecast functional descriptors at high organizational levels of the soil system, such as soil C resilience. Currently, there is a discussion on whether the resistance of soil organic matter (SOM) to biodegradation depends on its molecular composition or on environmental interactions between SOM and mineral components, as in the physical encapsulation of particulate SOM or in organo-mineral derivatives, e.g., those formed with amorphous oxides [2]. A set of about 200 dependent variables from WS and isolated, ash-free humic acids (HA) [3] was obtained in 30 volcanic ash soils from Tenerife Island (Spain). Soil biogeochemical properties such as SOM, allophane (Alo + 1/2 Feo), total mineralization coefficient (TMC) and aggregate stability were determined in WS. In addition, structural information on SOM was obtained from the isolated HA fractions by visible spectroscopy and analytical pyrolysis (Py-GC/MS). Aiming to explore the potential of partial least squares (PLS) regression in forecasting soil dependent variables exclusively from the information in the WS and HA IR spectral profiles, the data were processed using the ParLeS [4] and Unscrambler programs. Data pre-treatments must be carefully chosen: the most significant PLS models from IR spectra of HA were obtained after second-derivative pre-treatment, which suppressed the effects of the intrinsically broadband spectral profiles typical of heterogeneous macromolecular material such as HA. Conversely, when using IR spectra of WS, the best forecasting models were obtained using linear baseline correction and maximum normalization pre-treatment.
With WS spectra, the most successful prediction models were obtained for SOM, magnetite, allophane, aggregate stability, clay and total aromatic compounds, whereas the PLS
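    The regression step described above can be sketched with a minimal PLS1 (NIPALS-deflation) implementation: a soil property is regressed on spectral profiles through a few latent components. The synthetic rank-3 "spectra" below are illustrative only; the study used the ParLeS and Unscrambler programs, not this code:

    ```python
    import numpy as np

    def pls1_fit(X, y, k):
        """Fit a k-component PLS1 model (NIPALS deflation) of y on X.
        Returns a predict(Xnew) function."""
        xmean, ymean = X.mean(axis=0), y.mean()
        Xk, yk = X - xmean, y - ymean
        W, P, Q = [], [], []
        for _ in range(k):
            w = Xk.T @ yk
            w /= np.linalg.norm(w)          # weight vector
            t = Xk @ w                      # scores
            tt = float(t @ t)
            p = (Xk.T @ t) / tt             # X loadings
            q = float(yk @ t) / tt          # y loading
            Xk = Xk - np.outer(t, p)        # deflate X
            yk = yk - q * t                 # deflate y
            W.append(w); P.append(p); Q.append(q)
        W, P, Q = np.array(W), np.array(P), np.array(Q)
        B = W.T @ np.linalg.solve(P @ W.T, Q)   # regression vector
        return lambda Xn: (Xn - xmean) @ B + ymean

    # Synthetic rank-3 "spectra": 40 samples x 60 wavenumbers (illustrative)
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(40, 3))
    loadings = rng.normal(size=(3, 60))
    X = latent @ loadings
    y = latent @ np.array([1.0, 2.0, 3.0])  # property driven by latent factors

    predict = pls1_fit(X, y, 3)
    rmse = float(np.sqrt(np.mean((predict(X) - y) ** 2)))
    ```

    On noiseless rank-3 data, three components recover the property exactly; on real spectra the component count is chosen by cross-validation.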

  11. Sample preparation and in situ hybridization techniques for automated molecular cytogenetic analysis of white blood cells

    Energy Technology Data Exchange (ETDEWEB)

    Rijke, F.M. van de; Vrolijk, H.; Sloos, W. [Leiden Univ. (Netherlands)] [and others]

    1996-06-01

    With the advent of in situ hybridization techniques for the analysis of chromosome copy number or structure in interphase cells, the diagnostic and prognostic potential of cytogenetics has been augmented considerably. In theory, the strategies for detection of cytogenetically aberrant cells by in situ hybridization are simple and straightforward. In practice, however, they are fallible, because false classification of hybridization spot number or patterns occurs. When a decision has to be made on molecular cytogenetic normalcy or abnormalcy of a cell sample, the problem of false classification becomes particularly prominent if the fraction of aberrant cells is relatively small. In such mosaic situations, often > 200 cells have to be evaluated to reach a statistically sound figure. The manual enumeration of in situ hybridization spots in many cells in many patient samples is tedious. Assistance in the evaluation process by automation of microscope functions and image analysis techniques is, therefore, strongly indicated. Next to research and development of microscope hardware, camera technology, and image analysis, the optimization of the specimen for (semi)automated microscopic analysis is essential, since factors such as cell density, thickness, and overlap have dramatic influences on the speed and complexity of the analysis process. Here we describe experiments that have led to a protocol for blood cell specimens that results in microscope preparations well suited for automated molecular cytogenetic analysis. 13 refs., 4 figs., 1 tab.

  12. Reliability Assessment of the Defense Automated Neurobehavioral Assessment (DANA) in Extreme Environments

    Science.gov (United States)

    2015-05-01


  13. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. 
The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  14. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low-cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured on two calibrated commercial magnetometers, model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our lab-built VSM design proved successful and reliable. - Highlights: • A low-cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our lab-built VSM design proved successful and reliable.
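    Calibration against a reference magnetometer amounts to deriving a scale factor that converts the lab-built instrument's signal into magnetic moment. A minimal single-point sketch; the voltages and moments below are hypothetical, not the paper's values:

    ```python
    # Single-point calibration of a home-built VSM against a reference
    # magnetometer. All numbers are hypothetical.

    def calibration_factor(reference_moment_emu, measured_signal_volts):
        """Scale factor converting lock-in signal (V) to magnetic moment (emu)."""
        return reference_moment_emu / measured_signal_volts

    def to_moment(signal_volts, factor):
        return signal_volts * factor

    # A ferrite reference sample measured as 2.5 emu on a calibrated instrument
    # produces a 0.125 V signal on the lab-built VSM:
    k = calibration_factor(2.5, 0.125)                    # emu per volt
    loop = [to_moment(v, k) for v in (-0.125, 0.0, 0.125)]
    ```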

  15. Application of existing technology to meet increasing demands for automated sample handling.

    Science.gov (United States)

    Chow, A T; Kegelman, J E; Kohli, C; McCabe, D D; Moore, J F

    1990-09-01

    As the clinical laboratory advances toward total automation, the marketplace is now demanding more-efficient sample-handling systems. These demands have arisen over a relatively short period of time, partly because of heightened concern over laboratory safety and the resulting manpower shortages. Adding sample-handling capabilities to existing instrumentation is often a challenge because of interfering mechanical or system constraints. This challenge has been overcome in the DuPont Sample Management System (SMS), a second-generation general chemistry analyzer that incorporates the latest barcode and computer-interfacing technology. The development of the SMS system relies heavily on recent advances in technology, e.g., software modeling and computer-aided design. The SMS system includes a barcode scanner based on "charge-coupled device" technology, a random-access sample wheel, and new software that oversees the various functions.

  16. Device and method for automated separation of a sample of whole blood into aliquots

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  17. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  18. Automated aerosol Raman spectrometer for semi-continuous sampling of atmospheric aerosol

    Science.gov (United States)

    Doughty, David C.; Hill, Steven C.

    2017-02-01

Raman spectroscopy (RS) is useful in characterizing atmospheric aerosol. It is not commonly used in studying ambient particles, partly because automated instrumentation for aerosol RS has not been available. Battelle (Columbus, Ohio, USA) has developed the Resource Effective Bioidentification System (REBS) for automated detection of airborne bioagents based on RS. We use a version of the REBS that measures Raman spectra of one set of particles while the next set is collected from air, then moves the newly collected particles to the analysis region and repeats. Here we investigate the use of the REBS as the core of a general-purpose automated Aerosol Raman Spectrometer (ARS) for atmospheric applications. This REBS-based ARS can be operated as a line-scanning Raman imaging spectrometer. Spectra measured by this ARS for single particles made of polystyrene, black carbon, and several other materials are clearly distinguishable. Raman spectra from a 15-min ambient sample (approximately 35-50 particles, 158 spectra) were analyzed using a hierarchical clustering method; the cluster spectra are consistent with soot, inorganic aerosol, and other organic compounds. The ARS ran unattended, collecting atmospheric aerosol and measuring spectra for a 7-h period at 15-min intervals. A total of 32,718 spectra were measured; 5892 exceeded a threshold and were clustered. The number of particles exhibiting the D and G bands of amorphous carbon, plotted against time at 15-min intervals, increases during the morning commute and then decreases. These data illustrate the potential of the ARS to measure thousands of time-resolved aerosol Raman spectra in the ambient atmosphere over the course of several hours. The capability of this ARS for automated measurement of Raman spectra should lead to more extensive RS-based studies of atmospheric aerosols.

  19. Analysis of inflammatory response in human plasma samples by an automated multicapillary electrophoresis system.

    Science.gov (United States)

    Larsson, Anders; Hansson, Lars-Olof

    2004-01-01

A new automated multicapillary zone electrophoresis instrument with a new high-resolution (HR) buffer (Capillarys with HR buffer) for the analysis of human plasma proteins was evaluated. Albumin, alpha(1)-antitrypsin, alpha(1)-acid glycoprotein, haptoglobin, fibrinogen, immunoglobulin (Ig)A, IgG and IgM were determined nephelometrically in 200 patient plasma samples. The same samples were then analyzed on the Capillarys system (Sebia, Paris, France), with the albumin concentration from the nephelometric determination used for quantification of the individual peaks in the capillary electrophoresis (CE) electropherogram. There was a strong linear correlation between the nephelometric and electrophoretic determinations of alpha(1)-antitrypsin (R(2) = 0.906), alpha(1)-acid glycoprotein (R(2) = 0.894) and haptoglobin (R(2) = 0.913). There was also good correlation between the two determinations of gamma-globulins (R(2) = 0.883), while the correlation was weaker for fibrinogen (R(2) = 0.377). The Capillarys instrument is a reliable system for plasma protein analysis, combining the advantages of full automation, good analytical performance and high throughput. The HR buffer, in combination with albumin quantification, allows the simultaneous quantification of inflammatory markers in plasma samples without the need for nephelometric determination of these proteins.
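The R(2) values quoted above can be reproduced from paired measurements with a few lines of code. A minimal sketch of the coefficient of determination for a least-squares fit; the paired values below are invented for illustration, not data from the study:

```python
def r_squared(x, y):
    """Coefficient of determination for a least-squares line y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

# Illustrative paired results (hypothetical, not from the study):
neph = [1.2, 1.9, 0.8, 2.4, 1.5]   # nephelometric result, g/L
ce = [1.1, 2.0, 0.9, 2.3, 1.6]     # capillary electrophoresis result, g/L
r2 = r_squared(neph, ce)
print(round(r2, 3))
```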

  20. Surveillance cultures of samples obtained from biopsy channels and automated endoscope reprocessors after high-level disinfection of gastrointestinal endoscopes

    Directory of Open Access Journals (Sweden)

    Chiu King-Wah

    2012-09-01

Background: The instrument channels of gastrointestinal (GI) endoscopes may be heavily contaminated with bacteria even after high-level disinfection (HLD). The British Society of Gastroenterology guidelines emphasize the benefits of manually brushing endoscope channels and of using automated endoscope reprocessors (AERs) for disinfecting endoscopes. In this study, we aimed to assess the effectiveness of decontamination using reprocessors after HLD by comparing cultures of samples obtained from the biopsy channels (BCs) of GI endoscopes and from the internal surfaces of AERs. Methods: We conducted a 5-year prospective study. Every month, random consecutive sampling was carried out after a complete reprocessing cycle; 420 rinse samples and 420 swab samples were collected from BCs and the internal surfaces of AERs, respectively. Of the 420 rinse samples collected from BCs, 300 were obtained from gastroscopes and 120 from colonoscopes. Samples were collected by flushing the BCs with sterile distilled water and by swabbing the residual water from the AERs after reprocessing. These samples were cultured to detect the presence of aerobic and anaerobic bacteria and mycobacteria. Results: The number of culture-positive samples obtained from BCs (13.6%, 57/420) was significantly higher than that obtained from AERs (1.7%, 7/420). In addition, the numbers of culture-positive samples obtained from the BCs of gastroscopes (10.7%, 32/300) and colonoscopes (20.8%, 25/120) were significantly higher than those obtained from the AERs used to reprocess gastroscopes (2.0%, 6/300) and colonoscopes (0.8%, 1/120). Conclusions: Culturing rinse samples obtained from BCs provides a better indication of the effectiveness of decontamination of GI endoscopes after HLD than culturing swab samples obtained from the inner surfaces of AERs, as the swab samples indicate only whether the AERs themselves are free from microbial contamination.
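The significance claims above compare two culture-positive proportions with equal denominators. A quick sketch of a two-proportion z-test on the reported counts (the choice of test is ours for illustration; the paper does not state which test was used):

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """Two-sided z-test for a difference between two proportions."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                    # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal tail
    return z, p_value

# Culture-positive counts reported in the study: BCs 57/420 vs AERs 7/420.
z, p = two_proportion_z(57, 420, 7, 420)
print(f"z = {z:.2f}, p = {p:.2e}")
```

With these counts the difference is significant at any conventional level, consistent with the abstract's claim.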

  1. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    DEFF Research Database (Denmark)

    Qiao, Jixin

This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development...... and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) Investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) Exploration of macro-porous anion exchange chromatography......

  2. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    Science.gov (United States)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P 0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  3. Automation of high-frequency sampling of environmental waters for reactive species

    Science.gov (United States)

    Kim, H.; Bishop, J. K.; Wood, T.; Fung, I.; Fong, M.

    2011-12-01

Trace metals, particularly iron and manganese, play a critical role in some ecosystems as limiting factors for primary productivity; in geochemistry, especially redox chemistry, as important electron donors and acceptors; and in aquatic environments as carriers for contaminant transport. The dynamics of trace metals are closely related to hydrologic events such as rainfall. Storm flow triggers dramatic changes in both dissolved and particulate trace-metal concentrations and affects other important environmental parameters linked to trace-metal behavior, such as dissolved organic carbon (DOC). To improve our understanding of the behavior of trace metals and the underlying processes, water chemistry information must be collected over an adequately long period of time and at a higher frequency than conventional manual sampling (e.g., weekly or biweekly) allows. In this study, we developed an automated sampling system to document the dynamics of trace metals, focusing on Fe and Mn, and of DOC for a multi-year, high-frequency geochemistry time series in a small catchment called Rivendell, located at the Angelo Coast Range Reserve, California. We are sampling groundwater and streamwater with the automated system at daily frequency, and conditions at the site vary substantially from season to season. The pH of groundwater and streamwater ranges over 5-7 and 7.8-8.3, respectively. DOC is usually sub-ppm, but during rain events it increases by an order of magnitude. The automated sampling system focuses on two aspects: (1) a modified sampler design to improve sample integrity for trace metals and DOC, and (2) a remote control system to update sampling volume and timing according to hydrological conditions. To maintain sample integrity, the method employs gravity filtration using large-volume (140 mL) syringes and syringe filters connected to a set of polypropylene bottles and a borosilicate bottle via Teflon tubing. Without filtration, in a few days, the
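The remote control of sampling volume and timing described above amounts to a scheduling rule driven by hydrologic state. A minimal sketch; thresholds, volumes, and intervals below are invented, not the Rivendell configuration:

```python
# Hypothetical controller rule: shorten the sampling interval and enlarge
# the filtered volume when a storm raises stream stage above baseline.
def next_schedule(stage_m, baseline_stage_m=0.5):
    storm = stage_m > 1.5 * baseline_stage_m
    return {
        "interval_h": 6 if storm else 24,    # daily normally, 4x/day in storms
        "volume_mL": 280 if storm else 140,  # two 140-mL syringes in storms
    }

print(next_schedule(0.4))   # baseflow conditions
print(next_schedule(1.2))   # storm flow conditions
```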

  4. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. It was developed and tested to provide a cost-effective monitoring system that meets the protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone-modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study are presented to document the process for future use.
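Purge protocols of the kind mentioned above typically require field parameters to stabilize before a sample is taken: successive readings must agree within set tolerances. A minimal sketch of such a stabilization check; the tolerances and readings below are illustrative, not the values Robowell used:

```python
def stabilized(readings, key, tol, pct=False, n=3):
    """True when the last n successive readings of one field parameter
    agree within tol (absolute units, or percent of the mean if pct)."""
    last = [r[key] for r in readings[-n:]]
    if len(last) < n:
        return False
    spread = max(last) - min(last)
    limit = tol / 100.0 * (sum(last) / n) if pct else tol
    return spread <= limit

# A purge log of successive field readings (values invented):
purge_log = [
    {"pH": 6.9, "spec_cond": 410, "temp": 12.5},
    {"pH": 6.8, "spec_cond": 402, "temp": 12.4},
    {"pH": 6.8, "spec_cond": 400, "temp": 12.4},
    {"pH": 6.8, "spec_cond": 399, "temp": 12.4},
]
ready = (stabilized(purge_log, "pH", 0.1)                    # +/- 0.1 pH unit
         and stabilized(purge_log, "spec_cond", 5.0, pct=True)  # +/- 5 %
         and stabilized(purge_log, "temp", 0.2))             # +/- 0.2 deg C
print("begin sampling" if ready else "keep purging")
```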

  5. An automated method for 'clumped-isotope' measurements on small carbonate samples.

    Science.gov (United States)

    Schmid, Thomas W; Bernasconi, Stefano M

    2010-07-30

    Clumped-isotope geochemistry deals with the state of ordering of rare isotopes in molecules, in particular with their tendency to form bonds with other rare isotopes rather than with the most abundant ones. Among its possible applications, carbonate clumped-isotope thermometry is the one that has gained most attention because of the wide potential of applications in many disciplines of earth sciences. Clumped-isotope thermometry allows reconstructing the temperature of formation of carbonate minerals without knowing the isotopic composition of the water from which they were formed. This feature enables new approaches in paleothermometry. The currently published method is, however, limited by sample weight requirements of 10-15 mg and because measurements are performed manually. In this paper we present a new method using an automated sample preparation device coupled to an isotope ratio mass spectrometer. The method is based on the repeated analysis (n = 6-8) of 200 microg aliquots of sample material and completely automated measurements. In addition, we propose to use precisely calibrated carbonates spanning a wide range in Delta(47) instead of heated gases to correct for isotope effects caused by the source of the mass spectrometer, following the principle of equal treatment of the samples and standards. We present data for international standards (NBS 19 and LSVEC) and different carbonates formed at temperatures exceeding 600 degrees C to show that precisions in the range of 10 to 15 ppm (1 SE) can be reached for repeated analyses of a single sample. Finally, we discuss and validate the correction procedure based on high-temperature carbonates instead of heated gases.
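The reported precision of 10-15 ppm (1 SE) comes from repeated analysis of 6-8 aliquots of each sample. A minimal sketch of that replicate statistic; the Delta(47) values below are invented for illustration:

```python
import statistics

def replicate_se(values):
    """Mean and standard error (1 SE) of repeated analyses of one sample."""
    mean = statistics.fmean(values)
    se = statistics.stdev(values) / len(values) ** 0.5
    return mean, se

# Hypothetical Delta47 replicates (per mil) for one carbonate sample, n = 7:
d47 = [0.712, 0.698, 0.705, 0.709, 0.701, 0.707, 0.703]
mean, se = replicate_se(d47)
print(f"Delta47 = {mean:.4f} +/- {se:.4f} (1 SE, n = {len(d47)})")
```

With replicate scatter of a few per meg, the standard error shrinks by a factor of sqrt(n), which is how repeated 200-microgram aliquots reach the quoted precision.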

  6. Image cytometer method for automated assessment of human spermatozoa concentration

    DEFF Research Database (Denmark)

    Egeberg, D L; Kjaerulff, S; Hansen, C

    2013-01-01

    to investigator bias. Here we show that image cytometry can be used to accurately measure the sperm concentration of human semen samples with great ease and reproducibility. The impact of several factors (pipetting, mixing, round cell content, sperm concentration), which can influence the read-out as well......In the basic clinical work-up of infertile couples, a semen analysis is mandatory and the sperm concentration is one of the most essential variables to be determined. Sperm concentration is usually assessed by manual counting using a haemocytometer and is hence labour intensive and may be subjected...... and easy measurement of human sperm concentration....

  7. Automated high-volume aerosol sampling station for environmental radiation monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S

    1998-07-01

An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass-fibre filter (held in a cassette); the airflow through the filter is 800 m{sup 3}/h at maximum. During sampling, the filter is continuously monitored with NaI scintillation detectors. After sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 x 10{sup -6} Bq/m{sup 3}. The station is equipped with various sensors to reveal unauthorized access; these sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring, and the concept is well suited to nuclear material safeguards, too. 10 refs.
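A back-of-the-envelope check of the quoted sensitivity: one day of sampling at the station's maximum flow, combined with an assumed HPGe detection limit (the 20 mBq figure below is hypothetical, not from the paper), lands in the stated 10^-6 Bq/m^3 range:

```python
# Scale check with assumed numbers for the counting step.
flow_m3_per_h = 800          # maximum airflow through the filter
sampling_h = 24              # 1 d sampling per the duty cycle
volume = flow_m3_per_h * sampling_h     # m^3 of air filtered
detection_limit_bq = 0.02    # hypothetical HPGe detection limit, Bq
mdc = detection_limit_bq / volume       # minimum detectable concentration
print(f"volume = {volume} m^3, MDC ~ {mdc:.1e} Bq/m^3")
```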

  8. ASSESSING SMALL SAMPLE WAR-GAMING DATASETS

    Directory of Open Access Journals (Sweden)

    W. J. HURLEY

    2013-10-01

One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However, this approach does not allow an assessment of the force-multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments it is expensive, in both time and dollars, to generate a large number of sample observations. This puts a premium on the statistical methodology used to examine these small datasets. In this paper we compare the power of three tests to assess population differences: the Wald-Wolfowitz test, the Mann-Whitney U test, and re-sampling, employing a series of Monte Carlo simulation experiments. Not unexpectedly, we find that the Mann-Whitney test performs better than the Wald-Wolfowitz test. Re-sampling is judged to perform slightly better than the Mann-Whitney test.
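The comparison described above can be sketched with a pure-stdlib Monte Carlo: a permutation (re-sampling) test on the rank-sum statistic, run over many simulated small-sample experiments. Sample sizes, the location shift, and iteration counts below are illustrative, not those of the paper:

```python
import random

def rank_sum(x, y):
    """Rank sum of sample x within the pooled data (midranks for ties)."""
    pooled = sorted(x + y)
    ranks, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + j + 1) / 2   # average rank of the tied run
        i = j
    return sum(ranks[v] for v in x)

def perm_p(x, y, n_perm=300):
    """Two-sided permutation p-value for the rank-sum statistic."""
    observed = rank_sum(x, y)
    center = len(x) * (len(x) + len(y) + 1) / 2   # null expectation
    pooled, hits = x + y, 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        s = rank_sum(pooled[:len(x)], pooled[len(x):])
        if abs(s - center) >= abs(observed - center):
            hits += 1
    return hits / n_perm

def power(shift, n=8, sims=100, alpha=0.05):
    """Fraction of simulated small-sample experiments rejecting H0."""
    hits = 0
    for _ in range(sims):
        x = [random.gauss(0.0, 1.0) for _ in range(n)]
        y = [random.gauss(shift, 1.0) for _ in range(n)]
        if perm_p(x, y) < alpha:
            hits += 1
    return hits / sims

random.seed(7)
p0, p15 = power(0.0), power(1.5)
print(f"power at shift 0.0: {p0:.2f}; at shift 1.5: {p15:.2f}")
```

Under the null (shift 0) the rejection rate should sit near alpha; with a 1.5-SD shift and n = 8 per group it rises well above it, which is the kind of power curve the paper compares across tests.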

  9. Automated Generation and Assessment of Autonomous Systems Test Cases

    Science.gov (United States)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

This slide presentation reviews issues concerning verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the Dawn mission's test program as an example. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies before, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test-coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties of the coverage: for example, generating cases for all possible fault monitors and across all state-change boundaries. The scope of coverage is, of course, determined by the test environment's capabilities; a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, while generating predictions with tools requires executable models of the design and environment that themselves require a complete test program. Evaluating the results of large numbers of mission-scenario tests therefore poses special challenges. A good approach to address this problem is to automatically score the results
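The coverage idea described, one generated case per fault monitor crossed with each state-change boundary and injection timing, can be sketched as a simple cross product (all names below are invented for illustration):

```python
from itertools import product

# Hypothetical mission model; monitor and boundary names are invented.
fault_monitors = ["thruster_stuck", "star_tracker_loss", "battery_undervolt"]
state_changes = ["launch->cruise", "cruise->approach", "approach->orbit"]
timings = ["before", "during", "after"]   # injection relative to the boundary

# Exhaustive coverage: one case per (monitor, boundary, timing) triple.
test_cases = [
    {"fault": f, "boundary": b, "inject": t}
    for f, b, t in product(fault_monitors, state_changes, timings)
]
print(len(test_cases), "generated cases")   # 3 x 3 x 3 = 27
```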

  10. Automated negotiation in environmental resource management: Review and assessment.

    Science.gov (United States)

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management; further studies are needed, however, to consolidate the potential of this modeling approach.

  11. Automated Scoring in Context: Rapid Assessment for Placed Students

    Science.gov (United States)

    Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal

    2013-01-01

    This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…

  12. Harmonization of automated hemolysis index assessment and use: Is it possible?

    Science.gov (United States)

    Dolci, Alberto; Panteghini, Mauro

    2014-05-15

The major source of errors producing unreliable laboratory test results is the pre-analytical phase, with hemolysis accounting for approximately half of them and being the leading cause of unsuitable blood specimens. Hemolysis may produce interference in many laboratory tests through a variety of biological and analytical mechanisms. Consequently, laboratories need to systematically detect and reliably quantify hemolysis in every collected sample by means of objective and consistent technical tools that assess sample integrity. This is currently done by automated estimation of the hemolysis index (HI), available on almost all clinical chemistry platforms, making hemolysis detection reliable and reported patient test results more accurate. Despite these advantages, a degree of variability still affects the HI estimate, and more effort should be placed on harmonization of this index. Harmonization of HI results from different analytical systems should be the immediate goal, but the scope of harmonization should go beyond analytical steps to include other aspects, such as HI decision thresholds, criteria for result interpretation and application in clinical practice, as well as report formats. Relevant issues to overcome remain the objective definition of a maximum allowable bias for hemolysis interference based on the clinical application of the measurements, and the management of unsuitable samples. For the latter, a harmonized approach is required: numerical results of unsuitable samples with a significantly increased HI should not be reported, and the test result should be replaced by a specific comment highlighting hemolysis of the sample.

  13. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry, as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection-geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions, about 10% of laser-ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller in area than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components of Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant
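A quick arithmetic check of the quoted spot-size comparison: the stated 120 µm x 160 µm ablation area against a 1-mm-diameter LESA spot gives a ratio of about 41, the same order as the "approximately 50 times smaller" figure:

```python
import math

ablation_area = 0.120 * 0.160            # mm^2: 120 um x 160 um ablation spot
lesa_area = math.pi * (1.0 / 2.0) ** 2   # mm^2: 1 mm diameter LESA spot
ratio = lesa_area / ablation_area
print(f"LESA spot is ~{ratio:.0f}x larger in area")
```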

  14. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  15. Artificial Neural Network for Total Laboratory Automation to Improve the Management of Sample Dilution.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. For urgent samples, however, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show that a simple artificial neural network can predict whether predilution of a sample is necessary, using the information available through the laboratory information system. In particular, a multilayer perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Such an artificial neural network can therefore be easily implemented into a total automation framework to considerably reduce the turnaround time of critical orders delayed by the operations required to retrieve, dilute, and retest the sample.
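The inference step of such a network is just a forward pass over features pulled from the LIS record. A minimal sketch with a one-hidden-layer perceptron; the feature encoding and weights below are invented for illustration, not those learned from the 16,106-record data set:

```python
import math

def mlp_predict(features, w_hidden, w_out):
    """Forward pass of a one-hidden-layer perceptron with sigmoid units."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    hidden = [sigmoid(sum(w * x for w, x in zip(row, features)))
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# Assumed feature encoding from the LIS record:
# [bias, cardiac event/surgery flag, scaled previous troponin result]
features = [1.0, 1.0, 0.8]
w_hidden = [[-1.0, 2.5, 3.0], [0.5, -1.0, 1.5]]   # hypothetical weights
w_out = [2.0, -1.0]
p_dilute = mlp_predict(features, w_hidden, w_out)
print("predilute" if p_dilute > 0.5 else "run neat", round(p_dilute, 2))
```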

  16. A modifiable microarray-based universal sensor: providing sample-to-results automation.

    Science.gov (United States)

    Yasmin, Rubina; Zhu, Hui; Chen, Zongyuan; Montagna, Richard A

    2016-10-01

A microfluidic system consisting of generic single-use cartridges which interface with a workstation allows the automatic performance of all necessary sample preparation, PCR analysis and interpretation of multiplex PCR assays. The cartridges contain a DNA array with 20 different 16-mer DNA "universal" probes immobilized at defined locations. PCR amplicons can be detected via hybridization of user-defined "reporter" probes that are complementary at their 3' termini to one or more of the universal probes and complementary to the target amplicons at their 5' termini. The system was able to detect single-plex and multiplex PCR amplicons from various infectious agents as well as wild-type and mutant alleles of single nucleotide polymorphisms. The system's ease of use was further demonstrated by converting a published PCR assay for the detection of Mycoplasma genitalium to a fully automated format. Excellent correlation between traditional manual methods and the automated analysis performed by the workstation suggests that the system can provide a means to easily design and implement a variety of customized PCR-based assays. The system will be useful to researchers or clinical investigators seeking to develop their own user-defined assays. As the U.S. FDA continues to pursue regulatory oversight of laboratory-developed tests (LDTs), the system would also allow labs to continue to develop compliant assays.
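The universal-probe design above rests on one rule: a reporter's 3' terminus is the reverse complement of an immobilized universal probe, while its 5' half matches the target amplicon. A sketch of that design check with invented 16-mer sequences:

```python
COMP = {"A": "T", "T": "A", "G": "C", "C": "G"}

def revcomp(seq):
    """Reverse complement of a DNA sequence (5'->3' in, 5'->3' out)."""
    return "".join(COMP[b] for b in reversed(seq))

# Hypothetical 16-mer universal probe and a reporter addressing it:
universal = "ATGCCGTAAGCTTACG"         # immobilized on the array (5'->3')
reporter_3prime = revcomp(universal)   # reporter's array-address segment
reporter_5prime = "TTGACGGATCAAGTCC"   # target-specific half (invented)
reporter = reporter_5prime + reporter_3prime

# The reporter hybridizes to the universal probe via its 3' half:
print("reporter 3' end addresses universal probe:",
      revcomp(reporter[-16:]) == universal)
```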

  17. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of the daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours, and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction method was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and showed that samples remain suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  18. Automated Device for Asynchronous Extraction of RNA, DNA, or Protein Biomarkers from Surrogate Patient Samples.

    Science.gov (United States)

    Bitting, Anna L; Bordelon, Hali; Baglia, Mark L; Davis, Keersten M; Creecy, Amy E; Short, Philip A; Albert, Laura E; Karhade, Aditya V; Wright, David W; Haselton, Frederick R; Adams, Nicholas M

    2016-12-01

    Many biomarker-based diagnostic methods are inhibited by nontarget molecules in patient samples, necessitating biomarker extraction before detection. We have developed a simple device that purifies RNA, DNA, or protein biomarkers from complex biological samples without robotics or fluid pumping. The device design is based on functionalized magnetic beads, which capture biomarkers and remove background biomolecules by magnetically transferring the beads through processing solutions arrayed within small-diameter tubing. The process was automated by wrapping the tubing around a disc-like cassette and rotating it past a magnet using a programmable motor. This device recovered biomarkers at ~80% of the yield of the previously published operator-dependent extraction method. The device was validated by extracting biomarkers from a panel of surrogate patient samples containing clinically relevant concentrations of (1) influenza A RNA in nasal swabs, (2) Escherichia coli DNA in urine, (3) Mycobacterium tuberculosis DNA in sputum, and (4) Plasmodium falciparum protein and DNA in blood. The device successfully extracted each biomarker type from samples representing low levels of clinically relevant infectivity (i.e., 7.3 copies/µL of influenza A RNA, 405 copies/µL of E. coli DNA, 0.22 copies/µL of TB DNA, 167 copies/µL of malaria parasite DNA, and 2.7 pM of malaria parasite protein).

  19. Assessment of the relative error in sessile drop method automation task

    OpenAIRE

    Levitskaya T.О.

    2015-01-01

    Assessment of the relative error in the sessile drop method automation. Further development of the sessile drop method depends on new techniques and specially developed algorithms that enable automatic computer calculation of surface properties. The improvement of the method's mathematical apparatus, the transformation of the drop-contour equation to a form suitable for computation, the automation of the drop-surface calculation, and the analysis of the relative errors in the calculation...

  20. Automated Three-Dimensional Microbial Sensing and Recognition Using Digital Holography and Statistical Sampling

    Directory of Open Access Journals (Sweden)

    Inkyu Moon

    2010-09-01

    Full Text Available We overview an approach to providing automated three-dimensional (3D) sensing and recognition of biological micro/nanoorganisms integrating Gabor digital holographic microscopy and statistical sampling methods. For 3D data acquisition of biological specimens, a coherent beam propagates through the specimen and its transversely and longitudinally magnified diffraction pattern observed by the microscope objective is optically recorded with an image sensor array interfaced with a computer. 3D visualization of the biological specimen from the magnified diffraction pattern is accomplished by using the computational Fresnel propagation algorithm. For 3D recognition of the biological specimen, a watershed image segmentation algorithm is applied to automatically remove the unnecessary background parts in the reconstructed holographic image. Statistical estimation and inference algorithms are applied to the automatically segmented holographic image. Preliminary experimental results illustrate how the holographic image reconstructed from the Gabor digital hologram of a biological specimen contains important information for microbial recognition.
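The computational Fresnel propagation step mentioned above can be sketched with the standard transfer-function (Fourier-domain) formulation; the wavelength, pixel pitch, and propagation distance below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def fresnel_reconstruct(field, wavelength_m, pixel_m, z_m):
    """Propagate a recorded hologram (real or complex 2-D array) to distance z
    using the paraxial (Fresnel) transfer function evaluated in the Fourier
    domain: H = exp(i k z) * exp(-i pi lambda z (fx^2 + fy^2))."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_m)   # spatial frequencies (cycles/m)
    fy = np.fft.fftfreq(ny, d=pixel_m)
    FX, FY = np.meshgrid(fx, fy)
    k = 2.0 * np.pi / wavelength_m
    transfer = np.exp(1j * k * z_m) * np.exp(
        -1j * np.pi * wavelength_m * z_m * (FX ** 2 + FY ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Usage sketch: reconstruct the specimen plane at an assumed 8 mm
# field = fresnel_reconstruct(hologram, 532e-9, 3.45e-6, 8e-3)
# amplitude = np.abs(field)
```

Scanning a range of z values and selecting the sharpest amplitude image is a common way to locate the specimen plane; the propagation is unitary, so propagating by z and then by -z recovers the input.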

  1. An automated method for fibrin clot permeability assessment.

    Science.gov (United States)

    Ząbczyk, Michał; Piłat, Adam; Awsiuk, Magdalena; Undas, Anetta

    2015-01-01

    The fibrin clot permeability coefficient (Ks) is a useful measure of porosity of the fibrin network, which is determined by a number of genetic and environmental factors. Currently available methods to evaluate Ks are time-consuming, require constant supervision and provide only one parameter. We present an automated method in which drops are weighed individually, buffer is dosed by the pump and well-defined clot washing is controlled by the software. A linear association between drop mass and dripping time allows the measurement time to be halved. In 40 healthy individuals, Ks, the number of drops required to reach the plateau (DTP), the time to achieve the plateau (TTP) and the DTP/TTP ratio (DTR) were calculated. The automated parameters were associated with Ks (r = 0.70, P < 0.0001 for the manual method and r = 0.76, P < 0.0001 for the automated method), fibrinogen (r = -0.58, P < 0.0001) and C-reactive protein (CRP) (r = -0.47, P < 0.01), with further associations of r = 0.69 and r = -0.55. The automated method might be a suitable tool for research and clinical use and may offer additional parameters describing fibrin clot structure.
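Ks is conventionally obtained from Darcy's law applied to buffer percolation through the clot. Assuming the standard formulation (the abstract does not spell out the study's exact implementation), the calculation is:

```python
def permeability_ks(flow_volume_cm3, time_s, clot_length_cm,
                    viscosity_poise, clot_area_cm2, pressure_dyn_cm2):
    """Darcy's-law permeability coefficient of a fibrin clot, in cm^2:
        Ks = Q * L * eta / (t * A * dP)
    Q: buffer volume percolating through the clot (cm^3) in time t (s),
    L: clot length (cm), A: clot cross-sectional area (cm^2),
    eta: buffer viscosity (poise), dP: pressure drop (dyn/cm^2)."""
    return (flow_volume_cm3 * clot_length_cm * viscosity_poise) / (
        time_s * clot_area_cm2 * pressure_dyn_cm2)
```

In the automated setup, Q/t follows directly from the individually weighed drops (mass over time), which is what makes the drop-mass/dripping-time relationship useful for shortening the run.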

  2. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  3. In vivo hippocampal measurement and memory: a comparison of manual tracing and automated segmentation in a large community-based sample.

    Directory of Open Access Journals (Sweden)

    Nicolas Cherbuin

    Full Text Available While manual tracing is the method of choice in measuring hippocampal volume, its time intensive nature and proneness to human error make automated methods attractive, especially when applied to large samples. Few studies have systematically compared the performance of the two techniques. In this study, we measured hippocampal volumes in a large (N = 403) population-based sample of individuals aged 44-48 years using manual tracing by a trained researcher and automated procedure using the Freesurfer (http://surfer.nmr.mgh.harvard.edu) imaging suite. Results showed that absolute hippocampal volumes assessed with these methods were significantly different, with automated measures using the Freesurfer software suite being significantly larger, by 23% for the left and 29% for the right hippocampus. The correlation between the two methods varied from 0.61 to 0.80, with lower correlations for hippocampi with visible abnormalities. Inspection of 2D and 3D models suggested that this difference was largely due to greater inclusion of boundary voxels by the automated method and variations in subiculum/entorhinal segmentation. The correlation between left and right hippocampal volumes was very similar by the two methods. The relationship of hippocampal volumes to selected sociodemographic and cognitive variables was not affected by the measurement method, with each measure showing an association with memory performance and suggesting that both were equally valid for this purpose. This study supports the use of automated measures, based on Freesurfer in this instance, as being sufficiently reliable and valid particularly in the context of larger sample sizes when the research question does not rely on 'true' hippocampal volumes.

  4. In vivo hippocampal measurement and memory: a comparison of manual tracing and automated segmentation in a large community-based sample.

    Science.gov (United States)

    Cherbuin, Nicolas; Anstey, Kaarin J; Réglade-Meslin, Chantal; Sachdev, Perminder S

    2009-01-01

    While manual tracing is the method of choice in measuring hippocampal volume, its time intensive nature and proneness to human error make automated methods attractive, especially when applied to large samples. Few studies have systematically compared the performance of the two techniques. In this study, we measured hippocampal volumes in a large (N = 403) population-based sample of individuals aged 44-48 years using manual tracing by a trained researcher and automated procedure using Freesurfer (http://surfer.nmr.mgh.harvard.edu) imaging suite. Results showed that absolute hippocampal volumes assessed with these methods were significantly different, with automated measures using the Freesurfer software suite being significantly larger, by 23% for the left and 29% for the right hippocampus. The correlation between the two methods varied from 0.61 to 0.80, with lower correlations for hippocampi with visible abnormalities. Inspection of 2D and 3D models suggested that this difference was largely due to greater inclusion of boundary voxels by the automated method and variations in subiculum/entorhinal segmentation. The correlation between left and right hippocampal volumes was very similar by the two methods. The relationship of hippocampal volumes to selected sociodemographic and cognitive variables was not affected by the measurement method, with each measure showing an association with memory performance and suggesting that both were equally valid for this purpose. This study supports the use of automated measures, based on Freesurfer in this instance, as being sufficiently reliable and valid particularly in the context of larger sample sizes when the research question does not rely on 'true' hippocampal volumes.
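The two summary statistics used in this comparison, the mean percentage by which automated volumes exceed manual ones and the between-method correlation, can be computed as follows (the volumes in the test are invented):

```python
import numpy as np

def compare_segmentations(manual_vols, auto_vols):
    """Summary statistics for a manual-vs-automated volume comparison:
    mean percent difference (positive means automated is larger) and
    the Pearson correlation between the paired measurements."""
    manual = np.asarray(manual_vols, dtype=float)
    auto = np.asarray(auto_vols, dtype=float)
    pct_larger = 100.0 * (auto.mean() - manual.mean()) / manual.mean()
    r = np.corrcoef(manual, auto)[0, 1]
    return pct_larger, r
```

A systematic bias (here, boundary-voxel over-inclusion) inflates the percent difference while leaving the correlation high, which is exactly the pattern the study reports.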

  5. Automation of sample preparation for mass cytometry barcoding in support of clinical research: protocol optimization.

    Science.gov (United States)

    Nassar, Ala F; Wisnewski, Adam V; Raddassi, Khadir

    2017-03-01

    Analysis of multiplexed assays is highly important for clinical diagnostics and other analytical applications. Mass cytometry enables multi-dimensional, single-cell analysis of cell type and state. In mass cytometry, the rare earth metals used as reporters on antibodies allow determination of marker expression in individual cells. Barcode-based bioassays for CyTOF are able to encode and decode for different experimental conditions or samples within the same experiment, facilitating progress in producing straightforward and consistent results. Herein, an integrated protocol for automated sample preparation for barcoding used in conjunction with mass cytometry for clinical bioanalysis samples is described; we offer results of our work with barcoding protocol optimization. In addition, we present some points to be considered in order to minimize the variability of quantitative mass cytometry measurements. For example, we discuss the importance of having multiple populations during titration of the antibodies and the effect of storage and shipping of labelled samples on the stability of staining for purposes of CyTOF analysis. Data quality is not affected when labelled samples are stored either frozen or at 4 °C and used within 10 days; we observed that cell loss is greater if cells are washed with deionized water prior to shipment or are shipped at lower concentrations. Once the labelled samples for CyTOF are suspended in deionized water, the analysis should be performed expeditiously, preferably within the first hour. Damage can be minimized if the cells are resuspended in phosphate-buffered saline (PBS) rather than deionized water while waiting for data acquisition.

  6. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: The automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  7. Human genomic DNA analysis using a semi-automated sample preparation, amplification, and electrophoresis separation platform.

    Science.gov (United States)

    Raisi, Fariba; Blizard, Benjamin A; Raissi Shabari, Akbar; Ching, Jesus; Kintz, Gregory J; Mitchell, Jim; Lemoff, Asuncion; Taylor, Mike T; Weir, Fred; Western, Linda; Wong, Wendy; Joshi, Rekha; Howland, Pamela; Chauhan, Avinash; Nguyen, Peter; Petersen, Kurt E

    2004-03-01

    The growing importance of analyzing the human genome to detect hereditary and infectious diseases associated with specific DNA sequences has motivated us to develop automated devices to integrate sample preparation, real-time PCR, and microchannel electrophoresis (MCE). In this report, we present results from an optimized compact system capable of processing a raw sample of blood, extracting the DNA, and performing a multiplexed PCR reaction. Finally, an innovative electrophoretic separation was performed on the post-PCR products using a unique MCE system. The sample preparation system extracted and lysed white blood cells (WBC) from whole blood, producing DNA of sufficient quantity and quality for a polymerase chain reaction (PCR). Separation of multiple amplicons was achieved in a microfabricated channel 30 µm × 100 µm in cross-section and 85 mm in length filled with a replaceable methyl cellulose matrix operated under denaturing conditions at 50 °C. By incorporating fluorescent-labeled primers in the PCR, the amplicons were identified by a two-color (multiplexed) fluorescence detection system. Two base-pair resolution of single-stranded DNA (PCR products) was achieved. We believe that this integrated system provides a unique solution for DNA analysis.

  8. Designing an Automated Assessment of Public Speaking Skills Using Multimodal Cues

    Science.gov (United States)

    Chen, Lei; Feng, Gary; Leong, Chee Wee; Joe, Jilliam; Kitchen, Christopher; Lee, Chong Min

    2016-01-01

    Traditional assessments of public speaking skills rely on human scoring. We report an initial study on the development of an automated scoring model for public speaking performances using multimodal technologies. Task design, rubric development, and human rating were conducted according to standards in educational assessment. An initial corpus of…

  9. Measurement of airborne carbonyls using an automated sampling and analysis system.

    Science.gov (United States)

    Aiello, Mauro; McLaren, Robert

    2009-12-01

    Based upon the well-established method of derivatization with 2,4-dinitrophenylhydrazine, an instrument was developed for ambient measurement of carbonyls with significantly improved temporal resolution and detection limits through automation, direct injection, and continuous use of a single microsilica DNPH cartridge. Kinetic experiments indicate that the derivatization reaction on the cartridge is fast enough for continuous measurements with 50 min air sampling. Reaction efficiencies measured on the cartridge were 100% for the carbonyls tested, including formaldehyde, acetaldehyde, propanal, acetone, and benzaldehyde. Transmission of the carbonyls through an ozone scrubber (KI) was in the range of 97-101%. Blank levels and detection limits were lower than those obtainable with conventional DNPH methods by an order of magnitude or greater. Mixing ratio detection limits of carbonyls in ambient air were 38-73 ppt for a 50 min air sample (2.5 L). The instrument made continuous measurements of carbonyls on a 2 h cycle over a period of 10 days during a field study in southwestern Ontario. Median mixing ratios were 0.58 ppb formaldehyde; 0.29 ppb acetaldehyde; 1.14 ppb acetone; and 0.45 ppb glyoxal. Glyoxal shows a significant correlation with ozone and zero intercept, consistent with a secondary source and minor direct source to the atmosphere. The method should easily be extendable to the detection of other low molecular weight carbonyls that have been previously reported using the DNPH technique.
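The detection limits quoted above are conventionally derived from blank variability; a sketch of the common 3-sigma definition (the abstract does not state the paper's exact convention, so treat this as the generic formulation):

```python
import statistics

def detection_limit(blank_signals, sensitivity):
    """IUPAC-style limit of detection: 3 * SD(blank) / calibration slope.
    blank_signals: repeated measurements of a blank cartridge (signal units);
    sensitivity: signal change per unit mixing ratio (e.g., per ppt)."""
    return 3.0 * statistics.stdev(blank_signals) / sensitivity
```

Lowering blank levels (the single-cartridge, direct-injection design) shrinks SD(blank) and hence the detection limit, which is the mechanism behind the order-of-magnitude improvement claimed.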

  10. Automated nanoliter solution deposition for total reflection X-ray fluorescence analysis of semiconductor samples

    Energy Technology Data Exchange (ETDEWEB)

    Sparks, Chris M. [Process Characterization Laboratory, ATDF, Austin, TX 78741 (United States)]. E-mail: chris.sparks@atdf.com; Gondran, Carolyn H. [Process Characterization Laboratory, ATDF, Austin, TX 78741 (United States); Havrilla, George J. [Chemistry Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Hastings, Elizabeth P. [Chemistry Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2006-11-15

    In this study, a BioDot BioJet dispensing system was investigated as a nanoliter sample deposition method for total reflection X-ray fluorescence (TXRF) analysis. The BioDot system was programmed to dispense arrays of 20 nL droplets of sample solution on Si wafers. Each 20 nL droplet was approximately 100 µm in diameter. A 10 × 10 array (100 droplets) was deposited and dried in less than 2 min at room temperature and pressure, demonstrating the efficiency of the automated deposition method. Solutions of various concentrations of Ni, and of Ni in different matrices, were made from stock trace element standards to investigate the effect of the matrix on the TXRF signal. The concentrations were such that the levels of TXRF signal saturation could be examined. Arrays were deposited to demonstrate the capability of drying 100 µL of vapor phase decomposition-like residue in the area of a typical TXRF detector.

  11. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at ~60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at ~70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  12. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Science.gov (United States)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  13. Automated dispersive liquid-liquid microextraction coupled to high performance liquid chromatography - cold vapour atomic fluorescence spectroscopy for the determination of mercury species in natural water samples.

    Science.gov (United States)

    Liu, Yao-Min; Zhang, Feng-Ping; Jiao, Bao-Yu; Rao, Jin-Yu; Leng, Geng

    2017-04-14

    An automated, home-constructed, and low cost dispersive liquid-liquid microextraction (DLLME) device that directly coupled to a high performance liquid chromatography (HPLC) - cold vapour atomic fluorescence spectroscopy (CVAFS) system was designed and developed for the determination of trace concentrations of methylmercury (MeHg+), ethylmercury (EtHg+) and inorganic mercury (Hg2+) in natural waters. With a simple, miniaturized and efficient automated DLLME system, nanogram amounts of these mercury species were extracted from natural water samples and injected into a hyphenated HPLC-CVAFS for quantification. The complete analytical procedure, including chelation, extraction, phase separation, collection and injection of the extracts, as well as HPLC-CVAFS quantification, was automated. Key parameters, such as the type and volume of the chelating, extraction and dispersive solvents, aspiration speed, sample pH, salt effect and matrix effect, were thoroughly investigated. Under the optimum conditions, the linear range was 10-1200 ng/L for EtHg+ and 5-450 ng/L for MeHg+ and Hg2+. Limits of detection were 3.0 ng/L for EtHg+ and 1.5 ng/L for MeHg+ and Hg2+. Reproducibility and recoveries were assessed by spiking three natural water samples with different Hg concentrations, giving recoveries of 88.4-96.1% and relative standard deviations <5.1%.
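The recovery and reproducibility figures above follow standard spike-recovery arithmetic; a minimal sketch (the concentrations used in the test are invented):

```python
import statistics

def spike_recovery_percent(measured_spiked, native_level, added_level):
    """Percent recovery of an analyte spiked into a natural water sample:
    100 * (concentration found in spiked sample - native concentration)
        / concentration added."""
    return 100.0 * (measured_spiked - native_level) / added_level

def rsd_percent(replicates):
    """Relative standard deviation (%), the usual reproducibility metric."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
```

A recovery near 100% with a low RSD indicates the matrix neither suppresses nor enhances the DLLME extraction appreciably.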

  14. An automated maze task for assessing hippocampus-sensitive memory in mice.

    Science.gov (United States)

    Pioli, Elsa Y; Gaskill, Brianna N; Gilmour, Gary; Tricklebank, Mark D; Dix, Sophie L; Bannerman, David; Garner, Joseph P

    2014-03-15

    Memory deficits associated with hippocampal dysfunction are a key feature of a number of neurodegenerative and psychiatric disorders. The discrete-trial rewarded alternation T-maze task is highly sensitive to hippocampal dysfunction. Normal mice have spontaneously high levels of alternation, whereas hippocampal-lesioned mice are dramatically impaired. However, this is a hand-run task, and handling has been shown to have a crucial impact on behavioural responses; it is also labour-intensive and therefore unsuitable for high-throughput studies. To overcome this, a fully automated maze was designed. The maze was attached to the mouse's home cage and the subject earned all of its food by running through the maze. In this study the hippocampal dependence of rewarded alternation in the automated maze was assessed. Bilateral hippocampal-lesioned mice were assessed in the standard, hand-run, discrete-trial rewarded alternation paradigm and in the automated paradigm, according to a cross-over design. A similarly robust lesion effect on alternation performance was found in both mazes, confirming the sensitivity of the automated maze to hippocampal lesions. Moreover, the performance of the animals in the automated maze was not affected by their handling history, whereas performance in the hand-run maze was affected by prior testing history. By having more stable performance and by decreasing human contact the automated maze may offer opportunities to reduce extraneous experimental variation and therefore increase the reproducibility within and/or between laboratories. Furthermore, automation potentially allows for greater experimental throughput and hence suitability for use in assessment of cognitive function in drug discovery.
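The alternation score underlying both versions of the task is simply the fraction of trials on which the mouse chose the arm opposite its previous choice. A sketch, with the left/right choice encoding assumed:

```python
def alternation_percent(choices):
    """Percent of discrete trials on which the chosen arm differs from the
    arm chosen on the previous trial. `choices` is a string of arm labels,
    e.g. 'LRLR' (encoding is an illustrative assumption)."""
    pairs = list(zip(choices, choices[1:]))
    return 100.0 * sum(a != b for a, b in pairs) / len(pairs)
```

Intact mice score well above chance (50%) on this measure, while hippocampal-lesioned mice fall toward it, which is the lesion effect both mazes detect.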

  15. Fully automated algorithm for wound surface area assessment.

    Science.gov (United States)

    Deana, Alessandro Melo; de Jesus, Sérgio Henrique Costa; Sampaio, Brunna Pileggi Azevedo; Oliveira, Marcelo Tavares; Silva, Daniela Fátima Teixeira; França, Cristiane Miranda

    2013-01-01

    Worldwide, clinicians, dentists, nurses, researchers, and other health professionals need to monitor the wound healing progress and to quantify the rate of wound closure. The aim of this study is to demonstrate, step by step, a fully automated numerical method to estimate the size of the wound and the percentage damaged relative to the body surface area (BSA) in images, without the requirement for human intervention. We included the formula for BSA in rats in the algorithm. The methodology was validated in experimental wounds and human ulcers and was compared with the analysis of an experienced pathologist, with good agreement. Therefore, this algorithm is suitable for experimental wounds and burns and human ulcers, as they have a high contrast with adjacent normal skin.
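A minimal stand-in for the wound-measurement step, assuming simple intensity thresholding against a calibrated pixel size (the published algorithm is fully automated and more elaborate; threshold and pixel area below are illustrative):

```python
import numpy as np

def wound_area_cm2(gray_image, threshold, pixel_area_cm2):
    """Estimate wound area by counting pixels darker than a threshold
    (wound tissue assumed to contrast with adjacent normal skin) and
    scaling by the calibrated area of one pixel."""
    wound_mask = np.asarray(gray_image) < threshold
    return wound_mask.sum() * pixel_area_cm2

def percent_of_bsa(wound_cm2, bsa_cm2):
    """Wound size expressed as a percentage of body surface area (BSA)."""
    return 100.0 * wound_cm2 / bsa_cm2
```

The paper folds a rat BSA formula into the pipeline; here BSA is passed in directly so the same helper applies to experimental wounds and human ulcers alike.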

  16. IHC Profiler: An Open Source Plugin for the Quantitative Evaluation and Automated Scoring of Immunohistochemistry Images of Human Tissue Samples

    Science.gov (United States)

    Malhotra, Renu; De, Abhijit

    2014-01-01

    In anatomic pathology, immunohistochemistry (IHC) serves as a diagnostic and prognostic method for identification of disease markers in tissue samples that directly influences classification and grading of the disease, influencing patient management. However, till today over most of the world, pathological analysis of tissue samples remained a time-consuming and subjective procedure, wherein the intensity of antibody staining is manually judged and thus scoring decision is directly influenced by visual bias. This instigated us to design a simple method of automated digital IHC image analysis algorithm for an unbiased, quantitative assessment of antibody staining intensity in tissue sections. As a first step, we adopted the spectral deconvolution method of DAB/hematoxylin color spectra by using optimized optical density vectors of the color deconvolution plugin for proper separation of the DAB color spectra. Then the DAB stained image is displayed in a new window wherein it undergoes pixel-by-pixel analysis, and displays the full profile along with its scoring decision. Based on the mathematical formula conceptualized, the algorithm is thoroughly tested by analyzing scores assigned to thousands (n = 1703) of DAB stained IHC images including sample images taken from the human protein atlas web resource. The IHC Profiler plugin developed is compatible with the open resource digital image analysis software, ImageJ, which creates a pixel-by-pixel analysis profile of a digital IHC image and further assigns a score in a four tier system. A comparison study between manual pathological analysis and IHC Profiler resulted in a match of 88.6% (P<0.0001, CI = 95%). This new tool developed for clinical histopathological sample analysis can be adopted globally for scoring most protein targets where the marker protein expression is of cytoplasmic and/or nuclear type. We foresee that this method will minimize the problem of inter-observer variations across labs and further help in…

  17. IHC Profiler: an open source plugin for the quantitative evaluation and automated scoring of immunohistochemistry images of human tissue samples.

    Directory of Open Access Journals (Sweden)

    Frency Varghese

    Full Text Available In anatomic pathology, immunohistochemistry (IHC) serves as a diagnostic and prognostic method for identification of disease markers in tissue samples that directly influences classification and grading of the disease, influencing patient management. However, till today over most of the world, pathological analysis of tissue samples remained a time-consuming and subjective procedure, wherein the intensity of antibody staining is manually judged and thus scoring decision is directly influenced by visual bias. This instigated us to design a simple method of automated digital IHC image analysis algorithm for an unbiased, quantitative assessment of antibody staining intensity in tissue sections. As a first step, we adopted the spectral deconvolution method of DAB/hematoxylin color spectra by using optimized optical density vectors of the color deconvolution plugin for proper separation of the DAB color spectra. Then the DAB stained image is displayed in a new window wherein it undergoes pixel-by-pixel analysis, and displays the full profile along with its scoring decision. Based on the mathematical formula conceptualized, the algorithm is thoroughly tested by analyzing scores assigned to thousands (n = 1703) of DAB stained IHC images including sample images taken from the human protein atlas web resource. The IHC Profiler plugin developed is compatible with the open resource digital image analysis software, ImageJ, which creates a pixel-by-pixel analysis profile of a digital IHC image and further assigns a score in a four tier system. A comparison study between manual pathological analysis and IHC Profiler resulted in a match of 88.6% (P<0.0001, CI = 95%). This new tool developed for clinical histopathological sample analysis can be adopted globally for scoring most protein targets where the marker protein expression is of cytoplasmic and/or nuclear type. We foresee that this method will minimize the problem of inter-observer variations across labs and…
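The four-tier scoring idea can be sketched as below; the intensity bin boundaries (60/120/180) and the one-third decision thresholds are illustrative assumptions, not the plugin's published values:

```python
import numpy as np

def dab_tier(dab_channel):
    """Assign a four-tier IHC score from a color-deconvolved DAB-channel
    image (8-bit grayscale, darker pixels = stronger staining), by the
    fraction of pixels falling into staining-intensity bins."""
    px = np.asarray(dab_channel).ravel()
    high = (px < 60).mean()                  # strongly stained fraction
    pos = ((px >= 60) & (px < 120)).mean()   # moderately stained
    low = ((px >= 120) & (px < 180)).mean()  # weakly stained
    if high > 1 / 3:
        return "high positive"
    if high + pos > 1 / 3:
        return "positive"
    if high + pos + low > 1 / 3:
        return "low positive"
    return "negative"
```

The essential point is that the score depends only on pixel-fraction statistics of the deconvolved DAB channel, removing the visual bias of manual grading.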

  18. Performance of Three Mode-Meter Block-Processing Algorithms for Automated Dynamic Stability Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel J.; Pierre, John W.; Zhou, Ning; Hauer, John F.; Parashar, Manu

    2008-05-31

    The frequency and damping of electromechanical modes offer considerable insight into the dynamic stability properties of a power system. The performance properties of three block-processing algorithms from the perspective of near real-time automated stability assessment are demonstrated and examined. The algorithms are: the extended modified Yule Walker (YW); extended modified Yule Walker with Spectral analysis (YWS); and the numerical state-space subspace system identification (N4SID) algorithm. The YW and N4SID have been introduced in previous publications, while the YWS is introduced here. Issues addressed include: stability assessment requirements; automated selection of subsets of identified modes; using the algorithms in an automated format; data assumptions and quality; and expected algorithm estimation performance.
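All three algorithms ultimately reduce measured data to mode frequencies and damping via the poles of a fitted discrete-time model. For a plain autoregressive fit (a sketch of the pole-to-mode conversion only, not of any of the three algorithms themselves):

```python
import numpy as np

def modes_from_ar(ar_coeffs, dt):
    """Mode frequency (Hz) and damping ratio from the poles of an
    autoregressive fit y[k] = a1*y[k-1] + ... + ap*y[k-p] to data sampled
    every dt seconds. Each discrete pole z maps to s = ln(z)/dt, giving
    frequency |Im(s)|/(2*pi) and damping ratio -Re(s)/|s|."""
    poles = np.roots(np.concatenate(([1.0], -np.asarray(ar_coeffs, dtype=float))))
    s = np.log(poles.astype(complex)) / dt
    freq_hz = np.abs(s.imag) / (2.0 * np.pi)
    damping_ratio = -s.real / np.abs(s)
    return freq_hz, damping_ratio
```

A mode with positive damping ratio is stable; automated assessment amounts to tracking these two numbers for the dominant inter-area modes in near real time.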

  19. Correction of an input function for errors introduced with automated blood sampling

    Energy Technology Data Exchange (ETDEWEB)

    Schlyer, D.J.; Dewey, S.L. [Brookhaven National Lab., Upton, NY (United States)

    1994-05-01

    Accurate kinetic modeling of PET data requires a precise arterial plasma input function. The use of automated blood sampling machines has greatly improved the accuracy, but errors can be introduced by the dispersion of the radiotracer in the sampling tubing. This dispersion results from three effects. The first is the spreading of the radiotracer in the tube due to mass transfer. The second is due to the mechanical action of the peristaltic pump and can be determined experimentally from the width of a step function. The third is the adsorption of the radiotracer on the walls of the tubing during transport through the tube. This is a more insidious effect, since the amount recovered from the end of the tube can be significantly different from that introduced into the tubing. We have measured the simple mass transport using [{sup 18}F]fluoride in water, which we have shown to be quantitatively recovered with no interaction with the tubing walls. We have also carried out experiments with several radiotracers including [{sup 18}F]Haloperidol, [{sup 11}C]L-deprenyl, [{sup 18}F]N-methylspiroperidol ([{sup 18}F]NMS) and [{sup 11}C]buprenorphine. In all cases there was some retention of the radiotracer by untreated silicone tubing. The amount retained in the tubing ranged from 6% for L-deprenyl to 30% for NMS. The retention of the radiotracer was essentially eliminated after pretreatment with the relevant unlabeled compound. For example, less than 2% of the [{sup 18}F]NMS was retained in tubing treated with unlabeled NMS. Similar results were obtained with baboon plasma, although the amount retained in the untreated tubing was less in all cases. From these results it is possible to apply a mathematical correction to the measured input function to account for mechanical dispersion, and to apply a chemical passivation to the tubing to reduce the dispersion due to adsorption of the radiotracer on the tubing walls.
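A common way to express such a mechanical-dispersion correction is the single-exponential model, in which the measured curve is the true input function convolved with (1/τ)·exp(−t/τ); the inverse is then analytic, C_true(t) = C_meas(t) + τ·dC_meas/dt. The sketch below applies this textbook correction to a synthetic curve; the time constant and curve shape are illustrative assumptions, not values from this work.

```python
import numpy as np

dt, tau = 0.5, 5.0                        # sampling step and dispersion constant (s)
t = np.arange(0, 120, dt)
c_true = (t / 10.0) * np.exp(-t / 10.0)   # toy gamma-variate-like input function

# Forward model: convolve with a normalized exponential dispersion kernel.
kernel = np.exp(-t / tau)
kernel /= kernel.sum()
c_meas = np.convolve(c_true, kernel)[: t.size]

# Inverse of the exponential model: add tau times the numerical derivative.
c_corr = c_meas + tau * np.gradient(c_meas, dt)

err_before = np.sqrt(np.mean((c_meas - c_true) ** 2))
err_after = np.sqrt(np.mean((c_corr - c_true) ** 2))
print(err_before, err_after)
```

The dispersed curve has a delayed, flattened peak; adding τ times the derivative restores both peak height and timing up to discretization error. Note this addresses only the mechanical dispersion term; the adsorption losses described above are handled chemically, by passivating the tubing.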

  20. Automated tissue classification framework for reproducible chronic wound assessment.

    Science.gov (United States)

    Mukherjee, Rashmi; Manohar, Dhiraj Dhane; Das, Dev Kumar; Achar, Arun; Mitra, Analava; Chakraborty, Chandan

    2014-01-01

    The aim of this paper was to develop a computer assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images grabbed by normal digital camera were first transformed into HSI (hue, saturation, and intensity) color space and subsequently the "S" component of HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely, Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that SVM with 3rd order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with highest kappa statistic value (0.793).

  1. Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment

    Directory of Open Access Journals (Sweden)

    Rashmi Mukherjee

    2014-01-01

    Full Text Available The aim of this paper was to develop a computer-assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images grabbed by a normal digital camera were first transformed into HSI (hue, saturation, and intensity) color space, and subsequently the “S” component of the HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that SVM with a 3rd-order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with the highest kappa statistic value (0.793).
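The first two steps described above, converting RGB to HSI and thresholding on the "S" channel, can be sketched as follows. A simple global threshold stands in for the paper's fuzzy-divergence thresholding, and the image and threshold value are synthetic.

```python
import numpy as np

def saturation(rgb):
    """HSI saturation channel: S = 1 - 3*min(R,G,B)/(R+G+B), for rgb in [0, 1]."""
    rgb = np.asarray(rgb, dtype=float)
    s = 1.0 - 3.0 * rgb.min(axis=-1) / (rgb.sum(axis=-1) + 1e-12)
    return np.clip(s, 0.0, 1.0)

# Synthetic 4x4 image: left half saturated red ("wound"), right half grey background.
img = np.zeros((4, 4, 3))
img[:, :2] = [0.8, 0.2, 0.2]
img[:, 2:] = [0.5, 0.5, 0.5]

s = saturation(img)
mask = s > 0.3        # illustrative global threshold, not fuzzy divergence
print(mask.sum())     # number of pixels flagged as wound
```

The grey background has zero saturation while the reddish region scores 0.5, so even a crude threshold separates them; the value of the "S" channel is precisely this contrast, which the paper then refines with fuzzy-divergence thresholding.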

  2. Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™

    Science.gov (United States)

    Balfour, Stephen P.

    2013-01-01

    Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for the way they will score and provide feedback on essays students submit. EdX, MIT and Harvard's non-profit MOOC federation, recently announced that they will use a machine-based Automated Essay Scoring (AES) application to assess written work in…

  3. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Ricardo Andres Pizarro

    2016-12-01

    Full Text Available High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI, yielding irreproducible results through both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm to the quality assessment of structural brain images, using global and region-of-interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.

  4. Assessment of Automated Measurement and Verification (M&V) Methods

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Custodio, Claudine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jump, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
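Baseline-model accuracy in M&V work is conventionally summarized with CV(RMSE) and NMBE, the goodness-of-fit pair used in ASHRAE Guideline 14. A minimal sketch with invented numbers, which may differ from the exact metrics and formulations used in the report:

```python
import numpy as np

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE, relative to the mean actual load."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return rmse / actual.mean()

def nmbe(actual, predicted):
    """Normalized mean bias error: net over/under-prediction of the baseline."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return (actual - predicted).sum() / (actual.size * actual.mean())

# Illustrative monthly energy use (kWh): metered vs. baseline-model prediction.
actual = [100.0, 110.0, 95.0, 105.0]
predicted = [102.0, 108.0, 97.0, 103.0]
print(round(cv_rmse(actual, predicted), 4), round(nmbe(actual, predicted), 4))
```

CV(RMSE) captures scatter while NMBE captures systematic bias; a baseline can have acceptable scatter yet a bias that silently inflates or erases reported savings, which is why both are checked.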

  5. Transitioning the Defense Automated Neurobehavioral Assessment (DANA) to Operational Use

    Science.gov (United States)

    2013-10-01

    since all service members were fit for duty, not undergoing any disability evaluation, and thus assumed to be healthy. The purpose of assessing service members across... System: Windows XP (service pack 2), Windows 7, or Mac OS X • RAM: 3GB • Storage: 200MB free • Processor: x86 (Intel or AMD), 2GHz • Ports: 1

  6. Designing An Automated Assessment of Public Speaking Skills Using Multimodal Cues

    OpenAIRE

    Chen, Lei; Feng, Gary; Leong, Chee Wee; Joe, Jilliam; Kitchen, Christopher; Lee, Chong Min

    2016-01-01

    Traditional assessments of public speaking skills rely on human scoring. We report an initial study on the development of an automated scoring model for public speaking performances using multimodal technologies. Task design, rubric development, and human rating were conducted according to standards in educational assessment. An initial corpus of 17 speakers with 4 speaking tasks was collected using audio, video, and 3D motion capturing devices. A scoring model based on basic features in the ...

  7. Automated high-throughput assessment of prostate biopsy tissue using infrared spectroscopic chemical imaging

    Science.gov (United States)

    Bassan, Paul; Sachdeva, Ashwin; Shanks, Jonathan H.; Brown, Mick D.; Clarke, Noel W.; Gardner, Peter

    2014-03-01

    Fourier transform infrared (FT-IR) chemical imaging has been demonstrated as a promising technique to complement histopathological assessment of biomedical tissue samples. Current histopathology practice involves preparing thin tissue sections and staining them using hematoxylin and eosin (H&E), after which a histopathologist manually assesses the tissue architecture under a visible microscope. Studies have shown that there is disagreement between operators viewing the same tissue, suggesting that a complementary technique for verification could improve the robustness of the evaluation, and improve patient care. FT-IR chemical imaging allows the spatial distribution of chemistry to be rapidly imaged at a high (diffraction-limited) spatial resolution, where each pixel represents an area of 5.5 × 5.5 μm2 and contains a full infrared spectrum providing a chemical fingerprint that studies have shown contains the diagnostic potential to discriminate between different cell-types, and even the benign or malignant state of prostatic epithelial cells. We report a label-free (i.e., no chemical de-waxing or staining) method of imaging large pieces of prostate tissue (typically 1 cm × 2 cm) in tens of minutes (at a rate of 0.704 × 0.704 mm2 every 14.5 s), yielding images containing millions of spectra. Due to refractive index matching between sample and surrounding paraffin, minimal signal processing is required to recover spectra with their natural profile as opposed to harsh baseline correction methods, paving the way for future quantitative analysis of biochemical signatures. The quality of the spectral information is demonstrated by building and testing an automated cell-type classifier based upon spectral features.

  8. ASSESSING SMALL SAMPLE WAR-GAMING DATASETS

    OpenAIRE

    W. J. HURLEY; R. N. FARRELL

    2013-01-01

    One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However, this approach does not allow an assessment of the force multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments, it i...

  9. Judging Amy: Automated legal assessment using OWL 2

    NARCIS (Netherlands)

    van de Ven, S.; Hoekstra, R.; Breuker, J.; Wortel, L.; El-Ali, A.

    2008-01-01

    One of the most salient tasks in law is legal assessment, and concerns the problem of determining whether some case is allowed or disallowed given an appropriate body of legal norms. In this paper we describe a system and Protégé 4 plugin, called OWL Judge, that uses standard OWL 2 DL reasoning for

  10. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    Science.gov (United States)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

    RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture and elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. 
The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads

  11. Automated Performance Monitoring and Assessment for DCS Digital Systems.

    Science.gov (United States)

    1977-10-01

    eye opening monitor performance assessment techniques. (3) Implement and program a CPMAS test processor subsystem. (4) Perform a field test site...an internodal path in the planned DEB network is 11 (Hillingdon to Schoenfeld). For this reason, the path from Node A to Node B in the Transmission...RADIOS SITE RADIOS HOHENSTADT 4 BANN 6 STUTTGART 4 HILLINGDON 3 LANGERKOPF 5 CROUGHTON 4 DONNERSBERG 10 MARTLESHAM HEATH 3 PIRMASENS 4 ADENAU 3

  12. Bayesian Stratified Sampling to Assess Corpus Utility

    CERN Document Server

    Hochberg, J; Thomas, T; Hall, S; Hochberg, Judith; Scovel, Clint; Thomas, Timothy; Hall, Sam

    1998-01-01

    This paper describes a method for asking statistical questions about a large text corpus. We exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" We estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Stratified sampling is used to reduce the sampling uncertainty of the estimate from over 3100 documents to fewer than 1000. The stratification is based on observed characteristics of real documents, while the sampling procedure incorporates a Bayesian version of Neyman allocation. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
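Neyman allocation, the classical core of the Bayesian variant mentioned above, distributes a fixed total sample across strata in proportion to N_h·S_h (stratum size times within-stratum standard deviation). A minimal sketch with invented stratum sizes and standard deviations, not the paper's actual Federal Register strata:

```python
import numpy as np

def neyman_allocation(N, S, n):
    """Allocate a total sample n across strata proportionally to N_h * S_h.

    Rounding means the allocated counts may sum to n +/- 1 or so.
    """
    N, S = np.asarray(N, float), np.asarray(S, float)
    w = N * S
    return np.round(n * w / w.sum()).astype(int)

# Illustrative strata summing to a 45,820-document corpus; S_h could be the
# standard deviation of the 0/1 "real document" indicator within each stratum.
N = [30000, 10000, 5820]
S = [0.1, 0.4, 0.5]
print(neyman_allocation(N, S, 200))
```

Strata that are large or internally variable receive more of the sample, which is how a 200-document evaluation can achieve the uncertainty of a much larger simple random sample.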

  13. Donor disc attachment assessment with intraoperative spectral optical coherence tomography during descemet stripping automated endothelial keratoplasty

    Directory of Open Access Journals (Sweden)

    Edward Wylegala

    2013-01-01

    Full Text Available Optical coherence tomography (OCT) has already been proven to be useful for pre- and post-surgical anterior eye segment assessment, especially in lamellar keratoplasty procedures, but there is no evidence for its intraoperative usefulness. We present a case report of intraoperative donor disc attachment assessment with spectral-domain optical coherence tomography during Descemet stripping automated endothelial keratoplasty (DSAEK) surgery combined with corneal incisions. The effectiveness of the performed corneal stab incisions was visualized directly by OCT scan analysis. OCT-assisted DSAEK allows assessment of the accuracy of the Descemet stripping and donor disc attachment.

  14. Feasibility studies of safety assessment methods for programmable automation systems. Final report of the AVV project

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Maskuniitty, M.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Heikkinen, J.; Korhonen, J.; Tuulari, E. [VTT Electronics, Espoo (Finland)

    1995-10-01

    Feasibility studies of two different groups of methodologies for safety assessment of programmable automation systems have been executed at the Technical Research Centre of Finland (VTT). The studies concerned dynamic testing methods and the fault tree (FT) and failure mode and effects analysis (FMEA) methods. In order to gain real experience in the application of these methods, experimental testing of two realistic pilot systems was executed and a FT/FMEA analysis of a programmable safety function accomplished. The purpose of the studies was not to assess the object systems, but to gain experience in the application of the methods and assess their potentials and development needs. (46 refs., 21 figs.).

  15. Defining food sampling strategy for chemical risk assessment

    OpenAIRE

    Wesolek, Nathalie; Roudot, Alain-Claude

    2012-01-01

    International audience; Collection of accurate and reliable data is a prerequisite for informed risk assessment and risk management. For chemical contaminants in food, contamination assessments enable consumer protection and exposure assessments. And yet, the accuracy of a contamination assessment depends on both chemical analysis and sampling plan performance. A sampling plan is always used when the contamination level of a food lot is evaluated, due to the fact that the whole lot can not be...

  16. Automated Gel Size Selection to Improve the Quality of Next-generation Sequencing Libraries Prepared from Environmental Water Samples.

    Science.gov (United States)

    Uyaguari-Diaz, Miguel I; Slobodan, Jared R; Nesbitt, Matthew J; Croxen, Matthew A; Isaac-Renton, Judith; Prystajecky, Natalie A; Tang, Patrick

    2015-04-17

    Next-generation sequencing of environmental samples can be challenging because of the variable DNA quantity and quality in these samples. High quality DNA libraries are needed for optimal results from next-generation sequencing. Environmental samples such as water may have low quality and quantities of DNA as well as contaminants that co-precipitate with DNA. The mechanical and enzymatic processes involved in extraction and library preparation may further damage the DNA. Gel size selection enables purification and recovery of DNA fragments of a defined size for sequencing applications. Nevertheless, this task is one of the most time-consuming steps in the DNA library preparation workflow. The protocol described here enables complete automation of agarose gel loading, electrophoretic analysis, and recovery of targeted DNA fragments. In this study, we describe a high-throughput approach to prepare high quality DNA libraries from freshwater samples that can also be applied to other environmental samples. We used an indirect approach to concentrate bacterial cells from environmental freshwater samples; DNA was extracted using a commercially available DNA extraction kit, and DNA libraries were prepared using a commercial transposon-based protocol. DNA fragments of 500 to 800 bp were gel size selected using Ranger Technology, an automated electrophoresis workstation. Sequencing of the size-selected DNA libraries demonstrated significant improvements to read length and quality of the sequencing reads.

  17. Automated sample preparation for radiogenic and non-traditional metal isotope analysis by MC-ICP-MS

    Science.gov (United States)

    Field, M. P.; Romaniello, S. J.; Gordon, G. W.; Anbar, A. D.

    2012-12-01

    High throughput analysis is becoming increasingly important for many applications of radiogenic and non-traditional metal isotopes. While MC-ICP-MS instruments offer the potential for very high sample throughput, the requirement for labor-intensive sample preparation and purification procedures remains a substantial bottleneck. Current purification protocols require manually feeding gravity-driven separation columns, a process that is both costly and time consuming. This bottleneck is eliminated with the prepFAST-MC™, an automated, low-pressure ion exchange chromatography system that can process from 1 to 60 samples in unattended operation. The syringe-driven system allows sample loading, multiple acid washes, column conditioning and elution cycles necessary to isolate elements of interest and automatically collect up to 3 discrete eluent fractions at user-defined intervals (time, volume and flow rate). Newly developed protocols for automated purification of uranium illustrate high throughput (>30 per run), multiple samples processed per column (>30), complete (>99%) matrix removal, high recovery (>98%, n=25), and excellent precision (2 sigma = 0.03 per mil, n=10). The prepFAST-MC™ maximizes sample throughput and minimizes costs associated with personnel and consumables, providing an opportunity to greatly expand research horizons in fields where large isotopic data sets are required, including archeology, geochemistry, and climate/environmental science.

  18. Automated Assessment of Right Ventricular Volumes and Function Using Three-Dimensional Transesophageal Echocardiography.

    Science.gov (United States)

    Nillesen, Maartje M; van Dijk, Arie P J; Duijnhouwer, Anthonie L; Thijssen, Johan M; de Korte, Chris L

    2016-02-01

    Assessment of right ventricular (RV) function is known to be of diagnostic value in patients with RV dysfunction. Because of its complex anatomic shape, automated determination of the RV volume is difficult and strong reliance on geometric assumptions is not desired. A method for automated RV assessment was developed using three-dimensional (3-D) echocardiography without relying on a priori knowledge of the cardiac anatomy. A 3-D adaptive filtering technique that optimizes the discrimination between blood and myocardium was applied to facilitate endocardial border detection. Filtered image data were incorporated in a segmentation model to automatically detect the endocardial RV border. End-systolic and end-diastolic RV volumes, as well as ejection fraction, were computed from the automatically segmented endocardial surfaces and compared against reference volumes manually delineated by two expert cardiologists. The results showed good performance in terms of correlation and agreement with the reference volumes.

  19. Automating the aviation command safety assessment survey as an Enterprise Information System (EIS)

    OpenAIRE

    Held, Jonathan S.; Mingo, Fred J.

    1999-01-01

    The Aviation Command Safety Assessment (ACSA) is a questionnaire survey methodology developed to evaluate a Naval Aviation Command's safety climate, culture, and safety program effectiveness. This survey was a manual process first administered in the fall of 1996. The primary goal of this thesis is to design, develop, and test an Internet-based, prototype model for administering this survey using new technologies that allow automated survey submission and analysis. The result of this thesis i...

  20. Automation impact study of Army training management 2: Extension of sampling and collection of installation resource data

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, T.F.; McCallum, M.C.; Hunt, P.S.; Slavich, A.L.; Underwood, J.A.; Toquam, J.L.; Seaver, D.A.

    1989-05-01

    This automation impact study of Army training management (TM) was performed for the Army Development and Employment Agency (ADEA) and the Combined Arms Training Activity (CATA) by the Battelle Human Affairs Research Centers and the Pacific Northwest Laboratory. The primary objective of the study was to provide the Army with information concerning the potential costs and savings associated with automating the TM process. This study expands the sample of units surveyed in Phase I of the automation impact effort (Sanquist et al., 1988), and presents data concerning installation resource management in relation to TM. The structured interview employed in Phase I was adapted to a self-administered survey. The data collected were compatible with those of Phase I, and both were combined for analysis. Three US sites, one reserve division, one National Guard division, and one unit in the active component outside the continental US (OCONUS) (referred to in this report as forward deployed) were surveyed. The total sample size was 459, of which 337 respondents contributed the most detailed data. 20 figs., 62 tabs.

  1. A conceptual model of the automated credibility assessment of the volunteered geographic information

    Science.gov (United States)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of this data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assess these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assessing the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. There are two main components proposed to be assessed in the conceptual model: metadata and data. The metadata component comprises indicators of the hosting (websites) and the sources of data / information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess the components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess the correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.

  2. Revisiting the Hubble sequence in the SDSS DR7 spectroscopic sample: a publicly available bayesian automated classification

    CERN Document Server

    Huertas-Company, Marc; Bernardi, M; Mei, S; Almeida, J Sánchez

    2010-01-01

    We present an automated morphological classification in 4 types (E, S0, Sab, Scd) of ~700,000 galaxies from the SDSS DR7 spectroscopic sample based on support vector machines. The main new property of the classification is that we associate with each galaxy a probability of belonging to each of the four morphological classes instead of assigning a single class. The classification is therefore better adapted to nature, where we expect a continuous transition between different morphological types. The algorithm is trained with a visual classification and then compared to several independent visual classifications, including the Galaxy Zoo first release catalog. We find a very good correlation between the automated classification and classical visual ones. The compiled catalog is intended for use in different applications and can be downloaded at http://gepicom04.obspm.fr/sdss_morphology/Morphology_2010.html and soon from the CasJobs database.

  3. Lab on valve-multisyringe flow injection system (LOV-MSFIA) for fully automated uranium determination in environmental samples.

    Science.gov (United States)

    Avivar, Jessica; Ferrer, Laura; Casas, Montserrat; Cerdà, Víctor

    2011-06-15

    The hyphenation of lab-on-valve (LOV) and multisyringe flow analysis (MSFIA), coupled to a long path length liquid waveguide capillary cell (LWCC), allows the spectrophotometric determination of uranium in different types of environmental sample matrices, without any manual pre-treatment, and achieving high selectivity and sensitivity levels. On-line separation and preconcentration of uranium is carried out by means of UTEVA resin. The potential of the LOV-MSFIA makes possible the full automation of the system by the in-line regeneration of the column. After elution, uranium(VI) is spectrophotometrically detected after reaction with arsenazo-III. The determination of levels of uranium present in environmental samples is required in order to establish an environmental control. Thus, we propose a rapid, cheap and fully automated method to determine uranium(VI) in environmental samples. The limit of detection reached is 1.9 ng of uranium and, depending on the preconcentrated volume, it results in ppt levels (10.3 ng L(-1)). Different water sample matrices (seawater, well water, freshwater, tap water and mineral water) and a phosphogypsum sample (with natural uranium content) were satisfactorily analyzed.

  4. Are Flow Injection-based Approaches Suitable for Automated Handling of Solid Samples?

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Cerdà, Victor

    with the potential hyphenation with modern analytical instrumentation for automated monitoring of the content of targeted species in the on-line generated extracts [3,4]. [1] Z.-L. Zhi, A. Ríos, M. Valcárcel, Crit. Rev. Anal. Chem., 26 (1996) 239. [2] M. Miró, E.H. Hansen, R. Chomchoei, W. Frenzel, TRAC-Trends Anal...

  5. MLP based Reusability Assessment Automation Model for Java based Software Systems

    Directory of Open Access Journals (Sweden)

    Surbhi Maggo

    2014-08-01

    Full Text Available Reuse refers to a common principle of using existing resources repeatedly, one that is pervasively applicable everywhere. In software engineering, reuse refers to the development of software systems using already available artifacts or assets, partially or completely, with or without modifications. Software reuse not only promises significant improvements in productivity and quality but also provides for the development of more reliable, cost-effective, dependable and less buggy software (considering that prior use and testing have removed errors) with reduced time and effort. In this paper we present an efficient and reliable automation model for reusability evaluation of procedure-based object-oriented software, predicting the reusability level of a component as low, medium or high. The presented model follows a reusability metric framework that targets the requisite reusability attributes, including maintainability (using the Maintainability Index), for functional analysis of the components. Further, a multilayer perceptron (MLP) neural network using back-propagation is applied to establish significant relationships among these attributes for reusability prediction. The proposed approach provides support for reusability evaluation at the functional level rather than at the structural level. Automation support for this approach is provided in the form of a tool named JRA2M2 (Java-based Reusability Assessment Automation Model using Multilayer Perceptron), implemented in Java. The performance of JRA2M2 is recorded using parameters like accuracy, classification error, precision and recall. The results generated using JRA2M2 indicate that the proposed automation tool can be effectively used as a reliable and efficient solution for automated evaluation of reusability.
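One of the attributes named above, the Maintainability Index, is commonly computed with the classic Oman-Hagemeister formula from Halstead volume, cyclomatic complexity, and lines of code. The sketch below uses that conventional formula and the usual three-band cut-offs (85/65), which may differ from the exact framework JRA2M2 implements; the input metric values are invented.

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
    """Classic (unnormalized) Maintainability Index."""
    return (171.0
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(loc))

def band(mi):
    """Conventional three-band interpretation of the raw MI score."""
    return "high" if mi >= 85 else "medium" if mi >= 65 else "low"

# Illustrative metrics for a moderately complex component.
mi = maintainability_index(halstead_volume=1000.0,
                           cyclomatic_complexity=10,
                           loc=200)
print(round(mi, 1), band(mi))
```

In a pipeline like the one described, this scalar would be computed per component and fed, alongside the other reusability attributes, into the MLP that predicts the low/medium/high reusability label.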

  6. Automated Image Sampling and Classification Can Be Used to Explore Perceived Naturalness of Urban Spaces.

    Science.gov (United States)

    Hyam, Roger

    2017-01-01

    The psychological restorative effects of exposure to nature are well established and extend to the mere viewing of images of nature. A previous study has shown that the Perceived Naturalness (PN) of images correlates with their restorative value. This study tests whether it is possible to detect the degree of PN of images using an image classifier. It takes images that have been scored by humans for PN (including a subset that have been assessed for restorative value) and passes them through the Google Vision API image classification service. The resulting labels are assigned to broad semantic classes to create a Calculated Semantic Naturalness (CSN) metric for each image. It was found that CSN correlates with PN. CSN was then calculated for a geospatial sampling of Google Street View images across the city of Edinburgh. CSN was found to correlate with PN in this sample also, indicating that the technique may be useful in large-scale studies. Because CSN correlates with PN, which in turn correlates with restorativeness, it is suggested that CSN or a similar measure may be useful for automatically detecting restorative images and locations. In an exploratory aside, CSN was not found to correlate with an indicator of socioeconomic deprivation.
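
    The label-to-semantic-class step can be sketched in a few lines. The mapping and weights below are invented placeholders (the paper does not reproduce its exact tables, and real label sets come from the Google Vision API); the point is only the mechanics of turning classifier labels into a CSN-style score.

    ```python
    # Hypothetical tables: naturalness weight per broad semantic class,
    # and a mapping from raw classifier labels to those classes.
    SEMANTIC_CLASS_WEIGHTS = {
        "vegetation": 1.0, "water": 1.0, "sky": 0.5,
        "building": 0.0, "road": 0.0, "vehicle": 0.0,
    }
    LABEL_TO_CLASS = {
        "tree": "vegetation", "grass": "vegetation", "river": "water",
        "cloud": "sky", "house": "building", "car": "vehicle", "street": "road",
    }

    def calculated_semantic_naturalness(labels):
        """Mean naturalness weight over labels that map to a known class;
        None when no label is recognized."""
        weights = [SEMANTIC_CLASS_WEIGHTS[LABEL_TO_CLASS[label]]
                   for label in labels if label in LABEL_TO_CLASS]
        return sum(weights) / len(weights) if weights else None
    ```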

  7. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)]

    2013-02-15

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparison with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated the right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with the Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and with measured ex-vivo liver volume (converted from weight), using analysis of variance and Pearson's or Spearman's correlation tests. Processing time for the automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for the total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107) of cases, respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual and ex-vivo volumes, except between the automated and manual volumes of the total liver (p = 0.011). There were good correlations between automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver); both correlation coefficients were higher than those for the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  8. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    Science.gov (United States)

    Williams, Alex C.; Hitt, Austin; Voisin, Sophie; Tourassi, Georgia

    2013-03-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.
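
    The volume-asymmetry measurement itself reduces to a simple normalized difference. A minimal sketch, assuming fluctuating asymmetry is computed as the absolute left-right volume difference normalized by the mean volume (the abstract does not spell out the exact formula used):

    ```python
    def fluctuating_asymmetry(left_volume, right_volume):
        """Absolute left-right volume difference normalized by the mean volume."""
        mean_volume = (left_volume + right_volume) / 2.0
        return abs(left_volume - right_volume) / mean_volume
    ```

    For example, breast volumes of 550 and 450 (arbitrary units) give an asymmetry of 0.2, while perfectly symmetric volumes give 0.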

  9. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alex C [ORNL]; Hitt, Austin N [ORNL]; Voisin, Sophie [ORNL]; Tourassi, Georgia [ORNL]

    2013-01-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.

  10. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study

    Directory of Open Access Journals (Sweden)

    Paul Otten

    2015-08-01

    Full Text Available Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data are then processed through a machine learning algorithm to determine a score for a patient's upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides FMA scores similar to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods.

  11. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Science.gov (United States)

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results show improved assessment accuracy over GP and ANN, while the ELM approach retains generalization capability. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the newly presented method has the capacity to learn many hundreds of times faster than traditional learning methods and has sufficient overall performance in many respects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age.
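
    An extreme learning machine is simple enough to sketch directly: the hidden-layer weights are random and fixed, and only the output weights are solved, in closed form, by least squares, which is why ELMs train orders of magnitude faster than back-propagation. The NumPy implementation below is a generic illustration of the algorithm, not the authors' system:

    ```python
    import numpy as np

    class ELM:
        """Single-hidden-layer extreme learning machine: random fixed input
        weights; output weights solved in closed form by least squares."""

        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def _hidden(self, X):
            return np.tanh(X @ self.W + self.b)

        def fit(self, X, y):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            H = self._hidden(X)                # random nonlinear features
            self.beta = np.linalg.pinv(H) @ y  # least-squares output weights
            return self

        def predict(self, X):
            return self._hidden(X) @ self.beta
    ```

    Because training is a single pseudo-inverse, there is no iterative optimization at all; regularized variants replace the pseudo-inverse with ridge regression for better conditioning.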

  12. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Directory of Open Access Journals (Sweden)

    Marjan Mansourvar

    Full Text Available Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results show improved assessment accuracy over GP and ANN, while the ELM approach retains generalization capability. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the newly presented method has the capacity to learn many hundreds of times faster than traditional learning methods and has sufficient overall performance in many respects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age.

  13. Interdisciplinary development of manual and automated product usability assessments for older adults with dementia: lessons learned.

    Science.gov (United States)

    Boger, Jennifer; Taati, Babak; Mihailidis, Alex

    2016-10-01

    The changes in cognitive abilities that accompany dementia can make it difficult to use everyday products that are required to complete activities of daily living. Products that are inherently more usable for people with dementia could facilitate independent activity completion, thus reducing the need for caregiver assistance. The objectives of this research were to: (1) gain an understanding of how water tap design impacted tap usability and (2) create an automated computerized tool that could assess tap usability. Twenty-seven older adults, who ranged from cognitively intact to having advanced dementia, completed 1309 trials on five tap designs. Data were manually analyzed to investigate tap usability and were also used to develop an automated usability analysis tool. Researchers collaborated to modify existing techniques and to create novel ones to accomplish both goals. This paper presents lessons learned through the course of this research, which could be applicable to the development of other usability studies, automated vision-based assessments and assistive technologies for cognitively impaired older adults. Collaborative interdisciplinary teamwork, which included participants who were older adults with dementia, was key to enabling the innovative advances that achieved the project's research goals. Implications for Rehabilitation Products that are implicitly familiar and usable by older adults could foster independent activity completion, potentially reducing reliance on a caregiver. The computer-based automated tool can significantly reduce the time and effort required to perform product usability analysis, making this type of analysis more feasible. Interdisciplinary collaboration can result in a more holistic understanding of assistive technology research challenges and enable innovative solutions.

  14. Evaluation of a software package for automated quality assessment of contrast detail images--comparison with subjective visual assessment.

    Science.gov (United States)

    Pascoal, A; Lawinski, C P; Honey, I; Blake, P

    2005-12-07

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  15. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pascoal, A [Medical Engineering and Physics, King's College London, Faraday Building Denmark Hill, London SE5 8RX (United Kingdom)]; Lawinski, C P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building Denmark Hill, London SE5 8RX (United Kingdom)]; Honey, I [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building Denmark Hill, London SE5 8RX (United Kingdom)]; Blake, P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building Denmark Hill, London SE5 8RX (United Kingdom)]

    2005-12-07

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  16. Evaluation of a software package for automated quality assessment of contrast detail images—comparison with subjective visual assessment

    Science.gov (United States)

    Pascoal, A.; Lawinski, C. P.; Honey, I.; Blake, P.

    2005-12-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  17. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Science.gov (United States)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. Samples were recorded manually in a logbook and given an ID number; then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on `Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.

  18. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia)]; Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia)]; Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)]

    2015-04-29

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. Samples were recorded manually in a logbook and given an ID number; then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.
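
    The batch-wise sample-code generation described in these two records can be illustrated in a few lines of Python. The actual tool is LabVIEW-based, and the code format below is an invented placeholder, not the agency's real scheme:

    ```python
    from datetime import date

    def register_batch(batch_no, n_samples, prefix="NAA"):
        """Generate sequential, unique sample codes for one irradiation batch.
        Code layout (prefix-year-batch-sample) is a hypothetical convention."""
        year = date.today().year
        return [f"{prefix}-{year}-B{batch_no:03d}-S{i:03d}"
                for i in range(1, n_samples + 1)]
    ```

    The generated codes would then be printed on registration forms and carried through to the sample analysis program, replacing the manual logbook entries.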

  19. COST-WORTH ASSESSMENT OF AUTOMATED RADIAL DISTRIBUTION SYSTEM BASED ON RELIABILITY

    Directory of Open Access Journals (Sweden)

    E. Vidya Sagar

    2010-11-01

    Full Text Available Power reliability and quality are gaining greater importance than ever in today's power and industrial markets. They have become essential to the successful delivery of quality products, operations and services in any industry. The reliability of a power distribution network can be greatly enhanced by the automation of its feeder system and other associated parts. Remotely controlled and automated restoration services avoid the need to execute manual switching schedules and can bring about remarkable improvements in system reliability and reductions in interruption costs. Reliability cost-worth analysis is an excellent tool for the evaluation of interruption costs. It is very significant in power system planning, operation, maintenance and expansion, for it takes customer concerns into account in the analysis. This paper deals in detail with the reliability-based cost-worth analysis of an automated radial distribution network. The direct method of assessing the worth of reliability is to calculate the user costs associated with interruptions in the power supply. By applying failure mode effect analysis and cost-worth analysis using the direct costing method, the reliability of a radial distribution network can be evaluated. The customer interruption cost indices of a radial distribution network were calculated using the analytical method applied to an Indian utility network system with 2000 KVA and sixteen nodes, and the related results of the study are duly discussed in this paper.
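
    The direct costing method referred to above amounts to summing, over the load points of the feeder, the product of failure rate, average load, and the customer damage cost for the outage duration. A minimal sketch with illustrative numbers (the paper's actual network data are not reproduced here):

    ```python
    def expected_interruption_cost(load_points):
        """ECOST = sum over load points of (failure rate λ [occ/yr]) x
        (average load L [kW]) x (customer damage cost c(r) [$/kW] for
        the outage duration r at that load point)."""
        return sum(lam * load_kw * damage_cost
                   for lam, load_kw, damage_cost in load_points)
    ```

    For example, two load points with (λ = 0.5/yr, 100 kW, $2/kW) and (λ = 1.0/yr, 50 kW, $4/kW) yield an expected interruption cost of $300 per year; feeder automation reduces λ and outage duration r, and hence this cost.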

  20. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    Science.gov (United States)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or to the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  1. Non-destructive automated sampling of mycotoxins in bulk food and feed - A new tool for required harmonization.

    Science.gov (United States)

    Spanjer, M; Stroka, J; Patel, S; Buechler, S; Pittet, A; Barel, S

    2001-06-01

    Mycotoxin contamination is highly non-uniformly distributed, as is well recognized by the EC, which has not only set legal limits for a series of commodities but also scheduled a sampling plan that takes this heterogeneity into account. In practice, however, it turns out to be very difficult to carry out this sampling plan in a harmonised way. Applying the sampling plan to a container filled with pallets of bags (i.e. with nuts or coffee beans) varies from very laborious to almost impossible. The presented non-destructive automated method for sampling bulk food could help to overcome these practical problems and to enforce EC directives. It is derived from a tested and approved technology for the detection of illicit substances in security applications. It can collect and identify ultra-trace contaminants, i.e. a chemical fingerprint of a substance in a bulk of goods such as a cargo pallet load (~1000 kg) of boxes and commodities. The technology, patented for explosives detection, uses physical and chemical processes for excitation and remote, rapid, enhanced release of contaminant residues, vapours and particulates from the inner/outer surfaces of the inspected bulk, and collects them on selective probes. The process is automated, takes only 10 minutes and is non-destructive, and the bulk itself remains unharmed. The system design is based on the applicable international regulations for shipped cargo handling and transportation by road, sea and air. After this process the pallet can be loaded on a truck, ship or plane, and analysis can be carried out before the cargo leaves the place of shipping. The potential of this technology for mycotoxin detection has been demonstrated by preliminary feasibility experiments: aflatoxins were detected in pistachios and ochratoxin A in bulk green coffee beans. Both commodities were naturally contaminated, previously found and confirmed by the common methods used in routine inspections.
Once the contaminants are extracted from a

  2. A study of automated self-assessment in a primary care student health centre setting.

    Science.gov (United States)

    Poote, Aimee E; French, David P; Dale, Jeremy; Powell, John

    2014-04-01

    We evaluated the advice given by a prototype self-assessment triage system in a university student health centre. Students attending the health centre with a new problem used the automated self-assessment system prior to a face-to-face consultation with the general practitioner (GP). The system's rating of urgency was available to the GP, and following the consultation, the GP recorded their own rating of the urgency of the patient's presentation. Full data were available for 154 of the 207 consultations. Perfect agreement, where both the GP and the self-assessment system selected the same category of advice, occurred in 39% of consultations. The association between the GP assessment and the self-assessment rankings of urgency was low but significant (rho = 0.19, P = 0.016). The self-assessment system tended to be risk averse compared to the GP assessments, with advice for a more urgent level of care seeking being recommended in 86 consultations (56%) and less urgent advice in only 8 (5%). This difference in assessment of urgency was statistically significant. Disagreement between self-assessed and GP-assessed urgency was not associated with symptom site or socio-demographic characteristics of the user. Although the self-assessment system was more risk averse than the GPs, which resulted in a high proportion of patients being triaged as needing emergency or immediate care, it successfully identified a proportion of patients who were felt by the GP to have a self-limiting condition that did not need a consultation. In its prototype form, the self-assessment system was not a replacement for clinician assessment and further refinement is necessary.

  3. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.
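
    The 90-110% recovery acceptance check reported for the certified reference material can be expressed in two small helper functions; the acceptance window matches the range quoted in the abstract, while the function names are illustrative:

    ```python
    def recovery_percent(measured, certified):
        """Recovery of an analyte relative to the certified/reference value."""
        return 100.0 * measured / certified

    def within_acceptance(measured, certified, low=90.0, high=110.0):
        """True if recovery falls inside the 90-110% window."""
        return low <= recovery_percent(measured, certified) <= high
    ```

    A measured Pb concentration of 0.95 against a certified 1.0 (same units) gives 95% recovery and passes; 0.80 gives 80% and fails.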

  4. Waste package performance assessment code with automated sensitivity-calculation capability

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Horwedel, J.E.

    1986-09-01

    WAPPA-C is a waste package performance assessment code that predicts the temporal and spatial extent of the loss of containment capability of a given waste package design. This code was enhanced by the addition of the capability to calculate the sensitivity of model results to any parameter. The GRESS automated procedure was used to add this capability in only two man-months of effort. The verification analysis of the enhanced code, WAPPAG, showed that the sensitivities calculated using GRESS were accurate to within the precision of perturbation results against which the sensitivities were compared. Sensitivities of all summary table values to eight diverse data values were verified.
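As a rough illustration of what such a sensitivity capability computes: GRESS instruments the Fortran source itself to propagate derivatives, and the abstract notes its results were verified against perturbation (finite-difference) calculations. A central finite difference of a toy model, with an invented containment-loss function and parameter values, looks like this:

```python
def sensitivity(model, params, i, rel_step=1e-6):
    """Central-difference sensitivity of model(params) to params[i];
    the classical perturbation check against automated derivatives."""
    p = list(params)
    h = abs(p[i]) * rel_step or rel_step
    p[i] += h
    up = model(p)
    p[i] -= 2 * h
    down = model(p)
    return (up - down) / (2 * h)

# Toy stand-in for a containment-loss model: corrosion depth = rate * sqrt(time).
def corrosion_depth(p):
    rate, time = p
    return rate * time ** 0.5

s = sensitivity(corrosion_depth, [2.0, 100.0], 0)  # analytically sqrt(100) = 10
```

The advantage of an automated-differentiation approach like GRESS is that it yields such derivatives for every parameter in a single run, without choosing step sizes.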

  5. Qualification of an automated device to objectively assess the effect of hair care products on hair shine.

    Science.gov (United States)

    Hagens, Ralf; Wiersbinski, Tim; Becker, Michael E; Weisshaar, Jürgen; Schreiner, Volker; Wenck, Horst

    2011-01-01

    The authors developed and qualified an automated routine screening tool to quantify hair shine. This tool is able to separately record individual properties of hair shine such as specular reflection and multiple reflection, as well as additional features such as sparkle, parallelism of hair fibers, and hair color, which strongly affect the subjective ranking by individual readers. A side-by-side comparison of different hair care and styling products with regard to hair shine using the automated screening tool in parallel with standard panel assessment showed that the automated system provides an almost identical ranking and the same statistical significances as the panel assessment. Provided stringent stratification of hair fibers for color and parallelism, the automated tool competes favorably with panel assessments of hair shine. In this case, data generated with the opsira Shine-Box are clearly superior over data generated by panel assessment in terms of reliability and repeatability, workload and time consumption, and sensitivity and specificity to detect differences after shampoo, conditioner, and leave-in treatment. The automated tool is therefore well suited to replace standard panel assessments in claim support, at least as a screening tool. A further advantage of the automated system over panel assessments is the fact that absolute numeric values are generated for a given hair care product, whereas panel assessments can only give rankings of a series of hair care products included in the same study. Thus, the absolute numeric data generated with the automated system allow comparison of hair care products between studies or at different time points after treatment.

  6. Deep learning for automated skeletal bone age assessment in X-ray images.

    Science.gov (United States)

    Spampinato, C; Palazzo, S; Giordano, D; Aldinucci, M; Leonardi, R

    2017-02-01

    Skeletal bone age assessment is a common clinical practice to investigate endocrinological, genetic and growth disorders in children. It is generally performed by radiological examination of the left hand by using either the Greulich and Pyle (G&P) method or the Tanner-Whitehouse (TW) one. However, both clinical procedures show several limitations, from the examination effort of radiologists to (most importantly) significant intra- and inter-operator variability. To address these problems, several automated approaches (especially relying on the TW method) have been proposed; nevertheless, none of them has been proved able to generalize to different races, age ranges and genders. In this paper, we propose and test several deep learning approaches to assess skeletal bone age automatically; the results showed an average discrepancy between manual and automatic evaluation of about 0.8 years, which is state-of-the-art performance. Furthermore, this is the first automated skeletal bone age assessment work tested on a public dataset and for all age ranges, races and genders, for which the source code is available, thus representing an exhaustive baseline for future research in the field. Besides the specific application scenario, this paper aims at providing answers to more general questions about deep learning on medical images: from the comparison between deep-learned features and manually-crafted ones, to the usage of deep-learning methods trained on general imagery for medical problems, to how to train a CNN with few images.

  7. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
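The "basic" two-stage scheme above can be sketched as a simulation; the full sample size, pass/fail threshold, and early-stopping margin used here are illustrative placeholders, not the Welfare Quality values, and the stopping rule is a simplified reading of the scheme:

```python
import random

def classify_farm(true_prev, full_n=60, threshold=0.15, margin=0.05, rng=random):
    """Two-stage 'basic' scheme: score half the full sample, stop early if the
    prevalence estimate is far from the pass/fail threshold, otherwise score
    the remaining half and decide on the combined sample."""
    half = full_n // 2
    lame1 = sum(rng.random() < true_prev for _ in range(half))
    est1 = lame1 / half
    if abs(est1 - threshold) > margin:            # confident after stage one
        return est1 > threshold, half
    lame2 = sum(rng.random() < true_prev for _ in range(half))
    return (lame1 + lame2) / full_n > threshold, full_n

random.seed(1)
runs = [classify_farm(0.30) for _ in range(1000)]
fail_rate = sum(failed for failed, _ in runs) / len(runs)
avg_n = sum(n for _, n in runs) / len(runs)
```

For a farm whose true prevalence is well above the threshold, most simulated assessments stop after the first stage, which is the source of the reduction in average sample size reported above.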

  8. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    Science.gov (United States)

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  9. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which, through a USGS and NOAA partnership, contained ample data for statistical analysis.
This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the

  10. Assessment for Operator Confidence in Automated Space Situational Awareness and Satellite Control Systems

    Science.gov (United States)

    Gorman, J.; Voshell, M.; Sliva, A.

    2016-09-01

    The United States is highly dependent on space resources to support military, government, commercial, and research activities. Satellites operate at great distances, observation capacity is limited, and operator actions and observations can be significantly delayed. Safe operations require support systems that provide situational understanding, enhance decision making, and facilitate collaboration between human operators and system automation both in-the-loop and on-the-loop. Joint cognitive systems engineering (JCSE) provides a rich set of methods for analyzing and informing the design of complex systems that include both human decision-makers and autonomous elements as coordinating teammates. While JCSE-based systems can enhance a system analyst's understanding of both existing and new system processes, JCSE activities typically occur outside of traditional systems engineering (SE) methods, providing sparse guidance about how systems should be implemented. In contrast, the Joint Directors of Laboratories (JDL) information fusion model and extensions, such as the Dual Node Network (DNN) technical architecture, provide the means to divide and conquer such engineering and implementation complexity, but are loosely coupled to specialized organizational contexts and needs. We previously described how Dual Node Decision Wheels (DNDW) extend the DNN to integrate JCSE analysis and design with the practicalities of system engineering and implementation using the DNN. Insights from Rasmussen's JCSE Decision Ladders align system implementation with organizational structures and processes. In the current work, we present a novel approach to assessing system performance based on patterns occurring in operational decisions that are documented by JCSE processes as traces in a decision ladder. 
In this way, system assessment is closely tied not just to system design, but the design of the joint cognitive system that includes human operators, decision-makers, information systems, and

  11. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    Science.gov (United States)

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation process were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good accordance. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample.

  12. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    Directory of Open Access Journals (Sweden)

    Constantinos Georgiou

    2010-07-01

    Full Text Available This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The registered response is the % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting from injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration-related levels of toxicity. The biosensor’s response to carrier solutions of different pHs was tested. Vibrio fischeri’s bioluminescence is promoted in the pH 5–10 range. Experiments indicate that the whole-cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions.
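The detector response described above reduces to a simple calculation against the deionised-water control; the relative-light-unit values used here are hypothetical:

```python
def percent_inhibition(sample_rlu, control_rlu):
    """Bioluminescence inhibition of the Vibrio fischeri biosensor,
    relative to the deionised-water control reading."""
    return (1.0 - sample_rlu / control_rlu) * 100.0

# Hypothetical relative light units: suspension in water vs. in a Hg2+ carrier.
control = 12000.0
hg_sample = 3000.0
inhibition = percent_inhibition(hg_sample, control)  # 75.0% inhibition
```

Higher metal concentrations depress `sample_rlu` and so raise the reported % inhibition, which is the concentration-related response noted in the abstract.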

  13. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    Full Text Available The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.

  14. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range

    NARCIS (Netherlands)

    Duijn, E. van; Sandman, H.; Grossouw, D.; Mocking, J.A.J.; Coulier, L.; Vaes, W.H.J.

    2014-01-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. H

  15. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
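The spike-addition logic can be sketched as follows; the count rates and spike activity are invented for illustration, with only the 0.495 mL sample volume taken from the abstract. The recovered efficiency doubles as the self-diagnostic parameter mentioned above:

```python
def spike_addition(sample_cps, spiked_cps, spike_bq, volume_ml):
    """Matrix-matched calibration by spike addition: the known 99Tc spike
    yields the overall counting efficiency, which converts the unspiked
    count rate into an activity concentration."""
    efficiency = (spiked_cps - sample_cps) / spike_bq   # cps per Bq
    activity_bq = sample_cps / efficiency               # activity in the aliquot
    return efficiency, activity_bq / volume_ml          # Bq/mL

eff, conc = spike_addition(sample_cps=4.0, spiked_cps=12.0,
                           spike_bq=20.0, volume_ml=0.495)
```

A drop in `eff` between hourly spike additions would flag degraded column separation or detector performance, without interrupting routine operation.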

  16. Revisiting the Hubble sequence in the SDSS DR7 spectroscopic sample: a publicly available Bayesian automated classification

    Science.gov (United States)

    Huertas-Company, M.; Aguerri, J. A. L.; Bernardi, M.; Mei, S.; Sánchez Almeida, J.

    2011-01-01

    We present an automated morphological classification in 4 types (E, S0, Sab, Scd) of ~700 000 galaxies from the SDSS DR7 spectroscopic sample based on support vector machines. The main new property of the classification is that we associate a probability to each galaxy of being in the four morphological classes instead of assigning a single class. The classification is therefore better adapted to nature where we expect a continuous transition between different morphological types. The algorithm is trained with a visual classification and then compared to several independent visual classifications including the Galaxy Zoo first-release catalog. We find a very good correlation between the automated classification and classical visual ones. The compiled catalog is intended for use in different applications and is therefore freely available through a dedicated webpage* and soon from the CasJobs database. Full catalog is only available in electronic form at CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/525/A157 or via http://gepicom04.obspm.fr/sdss_morphology/Morphology_2010.html

  17. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    Directory of Open Access Journals (Sweden)

    K. Schneider-Zapp

    2014-02-01

    Full Text Available In order to advance understanding of the role of seawater surfactants in the air–sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with MilliQ water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air–sea gas exchange process.
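The Schmidt-number normalisation that makes the three tracer estimates directly comparable follows the standard kw ∝ Sc^(-1/2) scaling for a wavy, unbroken water surface; the Schmidt numbers and measured velocities below are hypothetical, chosen only to show the three estimates coinciding after normalisation:

```python
def normalize_kw(kw, sc_measured, sc_ref=600.0):
    """Scale a measured gas transfer velocity to a reference Schmidt number
    (Sc = 600, CO2 at 20 degC in fresh water), assuming kw scales as Sc**-0.5."""
    return kw * (sc_ref / sc_measured) ** -0.5

# Hypothetical tank results (cm/h) for the three tracers at one baffle setting.
k600_sf6 = normalize_kw(12.0, 1000.0)
k600_ch4 = normalize_kw(14.3, 700.0)
k600_n2o = normalize_kw(13.9, 750.0)
# After normalisation the three independent estimates should nearly coincide.
```

Close agreement of the normalised values, as the abstract reports, indicates that differences between the raw tracer velocities are explained by diffusivity alone.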

  18. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    Directory of Open Access Journals (Sweden)

    K. Schneider-Zapp

    2014-07-01

    Full Text Available In order to advance understanding of the role of seawater surfactants in the air–sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air–water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with Milli-Q water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air–sea gas exchange process.

  19. High-throughput automated microfluidic sample preparation for accurate microbial genomics

    Science.gov (United States)

    Kim, Soohong; De Jonghe, Joachim; Kulesa, Anthony B.; Feldman, David; Vatanen, Tommi; Bhattacharyya, Roby P.; Berdy, Brittany; Gomez, James; Nolan, Jill; Epstein, Slava; Blainey, Paul C.

    2017-01-01

    Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are so effective that sample preparation is now the key limiting factor. Here, we introduce a microfluidic sample preparation platform that integrates the key steps in cells to sequence library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing the sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also leveraged the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully-integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications. PMID:28128213

  20. The use of automated assessments in internet-based CBT: The computer will be with you shortly

    Directory of Open Access Journals (Sweden)

    Elizabeth C. Mason

    2014-10-01

    Full Text Available There is evidence from randomized controlled trials that internet-based cognitive behavioral therapy (iCBT) is efficacious in the treatment of anxiety and depression, and recent research demonstrates the effectiveness of iCBT in routine clinical care. The aims of this study were to implement and evaluate a new pathway by which patients could access online treatment by completing an automated assessment, rather than seeing a specialist health professional. We compared iCBT treatment outcomes in patients who received an automated pre-treatment questionnaire assessment with patients who were assessed by a specialist psychiatrist prior to treatment. Participants were treated as part of routine clinical care and were therefore not randomized. The results showed that symptoms of anxiety and depression decreased significantly with iCBT, and that the mode of assessment did not affect outcome. That is, a pre-treatment assessment by a psychiatrist conferred no additional treatment benefits over an automated assessment. These findings suggest that iCBT is effective in routine care and may be implemented with an automated assessment. By providing wider access to evidence-based interventions and reducing waiting times, the use of iCBT within a stepped-care model is a cost-effective way to reduce the burden of disease caused by these common mental disorders.

  1. Development of a full automation solid phase microextraction method for investigating the partition coefficient of organic pollutant in complex sample.

    Science.gov (United States)

    Jiang, Ruifen; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-08-07

    A fully automated solid phase microextraction (SPME) depletion method was developed to study the partition coefficient of organic compounds between a complex matrix and a water sample. The SPME depletion process was conducted by pre-loading the fiber with a specific amount of organic compounds from a proposed standard gas generation vial, and then desorbing the fiber into the targeted samples. Based on the proposed method, the partition coefficients (Kmatrix) of 4 polyaromatic hydrocarbons (PAHs) between humic acid (HA)/hydroxypropyl-β-cyclodextrin (β-HPCD) and aqueous sample were determined. The results showed that the logKmatrix of 4 PAHs with HA and β-HPCD ranged from 3.19 to 4.08, and 2.45 to 3.15, respectively. In addition, the logKmatrix values decreased about 0.12-0.27 log units for different PAHs for every 10°C increase in temperature. The effect of temperature on the partition coefficient followed the van't Hoff plot, and the partition coefficient at any temperature can be predicted based on the plot. Furthermore, the proposed method was applied to real biological fluid analysis. The partition coefficients of 6 PAHs between the complex matrices in the fetal bovine serum and water were determined, and compared to those obtained from the SPME extraction method. The result demonstrated that the proposed method can be applied to determine the sorption coefficients of hydrophobic compounds between complex matrix and water in a variety of samples.
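The van't Hoff prediction mentioned above amounts to a linear fit of log K against 1/T; the two anchor points below are invented, but chosen to be consistent with the reported 0.12-0.27 log-unit decrease per 10°C:

```python
# Hypothetical log K(matrix) values for one PAH-humic acid pair at two
# temperatures, consistent with a ~0.2 log-unit drop per 10 degC.
T1, T2 = 298.15, 308.15            # kelvin
logK1, logK2 = 3.80, 3.60

# van't Hoff behaviour: log K is linear in 1/T, so two points fix the line.
slope = (logK1 - logK2) / (1.0 / T1 - 1.0 / T2)

def logK_at(T):
    """Predict log K at any absolute temperature from the fitted line."""
    return logK1 + slope * (1.0 / T - 1.0 / T1)

logK_318 = logK_at(318.15)         # extrapolated another 10 degC up
```

Because 1/T changes slowly, the predicted drop over the next 10°C interval is slightly smaller than over the previous one, which is what the van't Hoff relation implies.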

  2. Automated total and radioactive strontium separation and preconcentration in samples of environmental interest exploiting a lab-on-valve system.

    Science.gov (United States)

    Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Victor

    2012-07-15

    A novel lab-on-valve system has been developed for strontium determination in environmental samples. The miniaturized lab-on-valve system potentially offers facilities to allow any kind of chemical and physical processes, including fluidic and microcarrier bead control, homogeneous reaction and liquid-solid interaction. A rapid, inexpensive and fully automated method for the separation and preconcentration of total and radioactive strontium, using a solid phase extraction material (Sr-Resin), has been developed. Total strontium concentrations are determined by ICP-OES and (90)Sr activities by a low background proportional counter. The method has been successfully applied to different water samples of environmental interest. The proposed system offers minimization of sample handling, drastic reduction of reagent volume, improvement of the reproducibility and sample throughput and attains a significant decrease of both time and cost per analysis. The LLD reached for total Sr is 1.8 ng and the minimum detectable activity for (90)Sr is 0.008 Bq. The repeatability of the separation procedure is 1.2% (n=10).

  3. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.

  5. Small sample Bayesian analyses in assessment of weapon performance

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analyses in small-sample circumstances should be considered, and the test data should be supplemented by simulations. Several Bayesian approaches are discussed and some of their limitations are identified. After the limitations of the available Bayesian approaches are analyzed, an improvement is put forward, and the improved approach is applied to the assessment of the performance of a new weapon.

  6. Rolling Deck to Repository (R2R): Automated Magnetic and Gravity Quality Assessment and Data Reduction

    Science.gov (United States)

    Morton, J. J.; O'Hara, S.; Ferrini, V.; Arko, R. A.

    2010-12-01

    With its global capability and diverse array of sensors, the academic research fleet is an integral component of ocean exploration. The Rolling Deck to Repository (R2R) Program provides a central shore-side gateway for underway data from the U.S. academic research fleet. In addition to ensuring preservation and documentation of routine underway data, R2R is also developing automated quality assessment (QA) tools for a variety of underway data types. Routine post-cruise QA will enable prompt feedback to shipboard operators and provide the science community with sufficient background information for data analysis. Based on community feedback, R2R will perform data reduction to generate enhanced data products for select data types, including gravity and magnetics. In the development of these tools, R2R seeks input from the scientific community, engaging specialists for each data type and requesting feedback from operators and scientists to deliver the most relevant and useful metadata. Data-acquisition best practices being assembled within the community for some data types will also be important components of R2R QA development. Protocols for gravity and magnetics QA will include the development of guidelines for minimal and optimal metadata for each data type that will enable data reduction and optimize data re-use. Metadata including instrument specifications, navigational offsets, and calibration information will be important inputs for both data reduction and QA. Data reduction will include merging these geophysical data types with high-quality R2R-generated navigation data products, cleaning the data and applying instrument corrections. Automated routines that are being developed will then be used to assess data quality, ultimately producing a Quality Assessment Certificate (QAC) that will provide the science community with quality information in an easily accessible and understandable format. We present progress to date and invite

  7. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Each extraction cycle takes 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source, which incorporates a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
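
As a sketch of the curve-fitting step described above, a first-order release model C(t) = Cmax·(1 − e^(−kt)) can be linearized and fitted by least squares. The data below are synthetic and the function name is an illustrative assumption; the rate constant is chosen to match the k ≈ 0.43 h⁻¹ quoted for ibuprofen:

```python
import math

def fit_release_rate(times, conc, c_max):
    """Estimate the first-order release constant k (h^-1) by linearizing
    C(t) = c_max * (1 - exp(-k t))  ->  ln(1 - C/c_max) = -k t."""
    xs, ys = [], []
    for t, c in zip(times, conc):
        frac = 1.0 - c / c_max
        if frac > 0 and t > 0:
            xs.append(t)
            ys.append(math.log(frac))
    # least-squares slope through the origin: k = -sum(x*y) / sum(x*x)
    return -sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Synthetic dissolution profile with k = 0.43 h^-1 (the value quoted in the abstract)
k_true, c_max = 0.43, 2.0
times = [0.5 * i for i in range(1, 21)]                 # 0.5 h ... 10 h
conc = [c_max * (1 - math.exp(-k_true * t)) for t in times]
k_est = fit_release_rate(times, conc, c_max)
print(round(k_est, 2))  # → 0.43
```

On noiseless data the linearization recovers k exactly; with real extraction-MS measurements a nonlinear fit with error weighting would be more appropriate.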

  8. Assessment of the quality of sample labelling for clinical research

    Directory of Open Access Journals (Sweden)

    Pablo Pérez-Huertas

    2016-03-01

    Full Text Available Objective: To assess the quality of the labels for clinical trial samples against current regulations, and to analyze its potential correlation with the specific characteristics of each sample. Method: A cross-sectional multicenter study in which the clinical trial samples from two tertiary hospitals were analyzed. The eleven items from Directive 2003/94/EC, as well as the name of the clinical trial and the dose on the label cover, were considered variables for labelling quality. The influence of the characteristics of each sample on labelling quality was also analyzed. Outcome: The study included 503 samples from 220 clinical trials. The mean quality of labelling, understood as the proportion of items from Appendix 13, was 91.9%. Of these samples, 6.6% did not include the name of the sample on the outer face of the label, while in 9.7% the dose was missing. Clinical trial-type samples presented higher quality (p < 0.049), blinding reduced quality (p = 0.017), and identification by kit number or by patient increased it (p < 0.01). The sponsor was the variable which introduced the highest variability into the analysis. Conclusions: The mean quality of labelling is adequate in the majority of clinical trial samples. The lack of essential information in some samples, such as the clinical trial code and the period of validity, is alarming and might be a potential source of dispensing or administration errors.

  9. Assessing the accuracy of an inter-institutional automated patient-specific health problem list

    Directory of Open Access Journals (Sweden)

    Taylor Laurel

    2010-02-01

    Full Text Available Abstract Background Health problem lists are a key component of electronic health records and are instrumental in the development of decision-support systems that encourage best practices and optimal patient safety. Most health problem lists require initial clinical information to be entered manually, and few integrate information across care providers and institutions. This study assesses the accuracy of a novel approach to create an inter-institutional automated health problem list in a computerized medical record (MOXXI) that integrates three sources of information for an individual patient: diagnostic codes from medical services claims from all treating physicians, therapeutic indications from electronic prescriptions, and single-indication drugs. Methods Data for this study were obtained from 121 general practitioners and all medical services provided for 22,248 of their patients. At the opening of a patient's file, all health problems detected through medical service utilization or single-indication drug use were flagged to the physician in the MOXXI system. Each newly arising health problem was presented as 'potential', and physicians were prompted to specify whether the health problem was valid (Y) or not (N), or whether they preferred to reassess its validity at a later time. Results A total of 263,527 health problems, representing 891 unique problems, were identified for the group of 22,248 patients. Medical services claims contributed the majority of problems identified (77%), followed by therapeutic indications from electronic prescriptions (14%) and single-indication drugs (9%). Physicians actively chose to assess 41.7% (n = 106,950) of health problems. Overall, 73% of the problems assessed were considered valid; 42% originated from medical service diagnostic codes, 11% from single-indication drugs, and 47% from prescription indications. Twelve percent of problems identified through other treating physicians were considered valid compared to 28

  10. [Automated serial diagnosis of donor blood samples. Ergonomic and economic organization structure].

    Science.gov (United States)

    Stoll, T; Fischer-Fröhlich, C L; Mayer, G; Hanfland, P

    1990-01-01

    A comprehensive computer-aided administration system for blood donors is presented. Encoded information on barcode labels allows automatic yet selective pipetting of samples by pipetting robots. Analysis results are automatically transferred to a host computer in order to update a donor database.

  11. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples.

    Directory of Open Access Journals (Sweden)

    Priya Gogoi

    Full Text Available Current analysis of circulating tumor cells (CTCs) is hindered by the sub-optimal sensitivity and specificity of devices or assays, as well as by a lack of capability to characterize CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip which is processed by an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparent high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry and DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer we compared the technology with an FDA-approved CTC device, CellSearch, and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization.
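
The sensitivity and specificity figures quoted above follow from a standard confusion-table calculation; the counts in this sketch are hypothetical and chosen only to reproduce the quoted 94%/100%:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 47 of 50 CTC-positive samples detected, no false positives
sens, spec = sensitivity_specificity(tp=47, fn=3, tn=50, fp=0)
print(round(sens, 2), round(spec, 2))  # → 0.94 1.0
```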

  12. Automated on-line preconcentration of palladium on different sorbents and its determination in environmental samples.

    Science.gov (United States)

    Sánchez Rojas, Fuensanta; Bosch Ojeda, Catalina; Cano Pavón, José Manuel

    2007-01-01

    The determination of noble metals in environmental samples is of increasing importance. Palladium is often employed as a catalyst in the chemical industry and is also used with platinum and rhodium in motor car catalytic converters, which might cause environmental pollution problems. Two different sorbents for palladium preconcentration in different samples were investigated: silica gel functionalized with 1,5-bis(di-2-pyridyl)methylene thiocarbohydrazide (DPTH-gel) and 1,5-bis(2-pyridyl)-3-sulphophenyl methylene thiocarbonohydrazide (PSTH) immobilised on an anion-exchange resin (Dowex 1x8-200). The sorbents were tested in a micro-column, placed in the auto-sampler arm, at a flow rate of 2.8 mL min⁻¹. Elution was performed with 4 M HCl and 4 M HNO3, respectively. Satisfactory results were obtained for both sorbents.

  13. Steady-State Vacuum Ultraviolet Exposure Facility With Automated Lamp Calibration and Sample Positioning Fabricated

    Science.gov (United States)

    Sechkar, Edward A.; Steuber, Thomas J.; Banks, Bruce A.; Dever, Joyce A.

    2000-01-01

    The Next Generation Space Telescope (NGST) will be placed in an orbit that will subject it to constant solar radiation during its planned 10-year mission. A sunshield will be necessary to passively cool the telescope, protecting it from the Sun's energy and assuring proper operating temperatures for the telescope's instruments. This sunshield will be composed of metalized polymer multilayer insulation with an outer polymer membrane (12 to 25 μm in thickness) that will be metalized on the back to assure maximum reflectance of sunlight. The sunshield must maintain mechanical integrity and optical properties for the full 10 years. This durability requirement is most challenging for the outermost, constantly solar-facing polymer membrane of the sunshield. One of the potential threats to the membrane material's durability is from vacuum ultraviolet (VUV) radiation at wavelengths below 200 nm. Such radiation can be absorbed in the bulk of these thin polymer membrane materials and degrade the polymer's optical and mechanical properties. So that a suitable membrane material can be selected that demonstrates durability to solar VUV radiation, ground-based testing of candidate materials must be conducted to simulate the total 10-year VUV exposure expected during the Next Generation Space Telescope mission. The Steady State Vacuum Ultraviolet exposure facility was designed and fabricated at the NASA Glenn Research Center at Lewis Field to provide unattended 24-hr exposure of candidate materials to VUV radiation of 3 to 5 times the Sun's intensity in the wavelength range of 115 to 200 nm. The facility's chamber, which maintains a pressure of approximately 5 × 10⁻⁶ torr, is divided into three individual exposure cells, each with a separate VUV source and sample-positioning mechanism. The three test cells are separated by a water-cooled copper shield plate assembly to minimize thermal effects from adjacent test cells.
Part of the interior sample positioning mechanism of one

  14. Control Performance Management in Industrial Automation Assessment, Diagnosis and Improvement of Control Loop Performance

    CERN Document Server

    Jelali, Mohieddine

    2013-01-01

    Control Performance Management in Industrial Automation provides a coherent and self-contained treatment of a group of methods and applications of burgeoning importance for the detection and solution of problems with control loops that are vital in maintaining product quality, operational safety, and efficiency of material and energy consumption in the process industries. The monograph deals with all aspects of control performance management (CPM), from controller assessment (minimum-variance-control-based and advanced methods), to detection and diagnosis of control loop problems (process non-linearities, oscillations, actuator faults), to the improvement of control performance (maintenance, re-design of loop components, automatic controller re-tuning). It provides a contribution towards the development and application of completely self-contained and automatic methodologies in the field. Moreover, within this work, many CPM tools have been developed that go far beyond available CPM packages. Control Perform...

  15. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    Science.gov (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree and of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus widely used by scientists studying rodent mammary gland morphology.

  16. Reproducibility in the automated quantitative assessment of HER2/neu for breast cancer

    Directory of Open Access Journals (Sweden)

    Tyler Keay

    2013-01-01

    Full Text Available Background: With the emerging role of digital imaging in pathology and the application of automated image-based algorithms to a number of quantitative tasks, there is a need to examine factors that may affect the reproducibility of results. These factors include the imaging properties of whole slide imaging (WSI) systems and their effect on the performance of quantitative tools. This manuscript examines inter-scanner and inter-algorithm variability in the assessment of the commonly used HER2/neu tissue-based biomarker for breast cancer, with emphasis on the effect of algorithm training. Materials and Methods: A total of 241 regions of interest from 64 breast cancer tissue glass slides were scanned using three different WSI scanners and were analyzed using two different automated image analysis algorithms, one with preset parameters and another incorporating a procedure for objective parameter optimization. Ground truth from a panel of seven pathologists was available from a previous study. Agreement analysis was used to compare the resulting HER2/neu scores. Results: The results of our study showed that inter-scanner agreement in the assessment of HER2/neu for breast cancer in selected fields of view, when analyzed with either of the two algorithms examined in this study, was equal to or better than the inter-observer agreement previously reported on the same set of data. Results also showed that discrepancies observed between algorithm results on data from different scanners were significantly reduced when the alternative algorithm that incorporated an objective re-training procedure was used, compared to the commercial algorithm with preset parameters. Conclusion: Our study supports the use of objective procedures for algorithm training to account for differences in image properties between WSI systems.
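
The abstract does not specify which agreement statistic was used; a common choice for categorical HER2/neu scores (0, 1+, 2+, 3+) is Cohen's kappa, sketched here with made-up scores from two hypothetical scanner/algorithm pipelines:

```python
def cohens_kappa(scores_a, scores_b):
    """Cohen's kappa: chance-corrected agreement between two raters'
    categorical scores. 1 = perfect agreement, 0 = chance level."""
    n = len(scores_a)
    cats = sorted(set(scores_a) | set(scores_b))
    p_obs = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    # expected agreement from each rater's marginal category frequencies
    p_exp = sum((scores_a.count(c) / n) * (scores_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical HER2 scores for 10 fields of view from two pipelines
a = [0, 1, 2, 3, 2, 1, 0, 3, 2, 2]
b = [0, 1, 2, 3, 3, 1, 0, 3, 2, 1]
print(round(cohens_kappa(a, b), 3))  # → 0.737
```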

  17. Automated processing of whole blood samples into microliter aliquots of plasma

    OpenAIRE

    1988-01-01

    A rotor that accepts and automatically processes a bulk aliquot of a single blood sample into multiple aliquots of plasma has been designed and built. The rotor consists of a central processing unit, which includes a disk containing eight precision-bore capillaries. By varying the internal diameters of the capillaries, aliquot volumes ranging from 1 to 10 μl can be prepared. In practice, an unmeasured volume of blood is placed in a centre well and, as the rotor begins to spin, is moved radially i...

  18. Quantitative Assessment of Molecular Dynamics Sampling for Flexible Systems.

    Science.gov (United States)

    Nemec, Mike; Hoffmann, Daniel

    2017-02-14

    Molecular dynamics (MD) simulation is a natural method for the study of flexible molecules but at the same time is limited by the large size of the conformational space of these molecules. We ask by how much the MD sampling quality for flexible molecules can be improved by two means: the use of diverse sets of trajectories starting from different initial conformations to detect deviations between samples and sampling with enhanced methods such as accelerated MD (aMD) or scaled MD (sMD) that distort the energy landscape in controlled ways. To this end, we test the effects of these approaches on MD simulations of two flexible biomolecules in aqueous solution, Met-Enkephalin (5 amino acids) and HIV-1 gp120 V3 (a cycle of 35 amino acids). We assess the convergence of the sampling quantitatively with known, extensive measures of cluster number Nc and cluster distribution entropy Sc and with two new quantities, conformational overlap Oconf and density overlap Odens, both conveniently ranging from 0 to 1. These new overlap measures quantify self-consistency of sampling in multitrajectory MD experiments, a necessary condition for converged sampling. A comprehensive assessment of sampling quality of MD experiments identifies the combination of diverse trajectory sets and aMD as the most efficient approach among those tested. However, analysis of Odens between conventional and aMD trajectories also reveals that we have not completely corrected aMD sampling for the distorted energy landscape. Moreover, for V3, the courses of Nc and Odens indicate that much higher resources than those generally invested today will probably be needed to achieve convergence. The comparative analysis also shows that conventional MD simulations with insufficient sampling can be easily misinterpreted as being converged.
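
The paper defines its own Odens measure; as an illustrative analogue only, a density overlap in [0, 1] can be computed by binning two samples of a 1-D observable and summing the per-bin minima of the normalized histograms. The helper below is an assumption for illustration, not the authors' implementation:

```python
from collections import Counter

def density_overlap(sample_a, sample_b, bins=10, lo=0.0, hi=1.0):
    """Histogram-overlap score: 1 = identical densities, 0 = disjoint."""
    def hist(sample):
        counts = Counter(min(int((x - lo) / (hi - lo) * bins), bins - 1)
                         for x in sample)
        return {b: c / len(sample) for b, c in counts.items()}
    pa, pb = hist(sample_a), hist(sample_b)
    return sum(min(pa.get(b, 0.0), pb.get(b, 0.0)) for b in set(pa) | set(pb))

same = [i / 100 for i in range(100)]                   # two identical samples
o_same = density_overlap(same, same)
o_disjoint = density_overlap([0.1] * 50, [0.9] * 50)   # non-overlapping samples
print(round(o_same, 6), o_disjoint)  # → 1.0 0.0
```

Applied to two MD trajectory sets, samples drawn near the same conformations score close to 1, signalling self-consistent sampling; low overlap flags unconverged simulations.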

  19. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    Science.gov (United States)

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes for organic-rich systems such as peatlands because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of a S::can Spectro::lyser™ for measuring DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument we have been able to accurately measure DOC up to 66 mg L(-1), which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used at the same time in the model. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands.
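
A multi-wavelength calibration like the one described can be sketched as an ordinary least-squares fit of DOC concentration on absorbance at several wavelengths; the coefficients and data below are synthetic assumptions, not the instrument's calibration:

```python
import numpy as np

# Hypothetical calibration set: absorbance at 3 wavelengths vs. lab-measured DOC (mg/L)
rng = np.random.default_rng(0)
true_coefs = np.array([30.0, 15.0, 5.0])           # made-up per-wavelength weights
absorbance = rng.uniform(0.0, 1.5, size=(40, 3))   # 40 calibration samples
doc = absorbance @ true_coefs + rng.normal(0, 0.5, 40)  # add measurement noise

# Fit DOC ≈ X·b by ordinary least squares (multi-wavelength model, as in the study)
coefs, *_ = np.linalg.lstsq(absorbance, doc, rcond=None)
pred = absorbance @ coefs
r2 = 1 - np.sum((doc - pred) ** 2) / np.sum((doc - doc.mean()) ** 2)
print(r2 > 0.95)  # → True
```

Using several wavelengths at once, as the authors found, generally improves accuracy over a single-wavelength regression.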

  20. Evaluation of automated sample preparation, retention time locked gas chromatography-mass spectrometry and data analysis methods for the metabolomic study of Arabidopsis species.

    Science.gov (United States)

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat

    2011-05-27

    In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomic studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types × 2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches to data analysis were compared: XCMS followed by principal component analysis (approach 1), and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2). Several features that were up- or down-regulated in the different types were detected.
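
Principal component analysis, the final step in both data-analysis approaches above, can be sketched via the SVD on a toy feature table; the data are synthetic, with the second group's features shifted to mimic up-regulation:

```python
import numpy as np

rng = np.random.default_rng(42)
group1 = rng.normal(0.0, 0.1, size=(5, 20))   # 5 samples of type A, 20 metabolite peaks
group2 = rng.normal(1.0, 0.1, size=(5, 20))   # 5 samples of type B (up-regulated peaks)
X = np.vstack([group1, group2])

Xc = X - X.mean(axis=0)                        # mean-center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                                 # sample coordinates on the PCs

pc1 = scores[:, 0]
separated = pc1[:5].mean() * pc1[5:].mean() < 0   # groups on opposite sides of PC1
print(separated)  # → True
```

In a real study the score plot of the first two components is inspected to see whether sample types cluster apart, and the loadings in `Vt` indicate which peaks drive the separation.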

  1. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    Science.gov (United States)

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

    Automated high-performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between people who are carriers and people who are not, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed.
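
Precision and accuracy of an HbA2 method are commonly summarized as a coefficient of variation and a bias against an assigned value; the replicate results below are hypothetical, and the full ICSH evaluation protocol is more extensive than this sketch:

```python
import statistics

def precision_and_bias(replicates, target):
    """Within-run precision (CV%) and bias (%) of replicate HbA2 results
    against an assigned target value."""
    mean = statistics.mean(replicates)
    cv = 100 * statistics.stdev(replicates) / mean
    bias = 100 * (mean - target) / target
    return round(cv, 2), round(bias, 2)

# Hypothetical replicate HbA2 results (%) for a sample with assigned value 3.5%
runs = [3.48, 3.52, 3.50, 3.49, 3.51, 3.50]
cv, bias = precision_and_bias(runs, 3.5)
print(cv, abs(bias))  # → 0.4 0.0
```

A tight CV matters here because the carrier/non-carrier decision hinges on differences of only a few tenths of a percent HbA2.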

  2. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format, including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  3. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    Science.gov (United States)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe, trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points where no individual trees could be identified. Using voxels, the data volume could be reduced by about 97% in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but is still below the acquisition rate, estimated at 50 km/h.
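
The voxel-based data reduction step can be sketched as follows: points are binned into a 3-D grid and each occupied voxel is replaced by the centroid of its points. The function and parameters are illustrative assumptions, not the authors' implementation:

```python
import random

def voxel_downsample(points, voxel_size):
    """Keep one representative point (the centroid) per occupied voxel."""
    cells = {}
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        cells.setdefault(key, []).append((x, y, z))
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) for pts in cells.values()]

# Dense synthetic cloud: 1000 points in a 1 m cube; 0.5 m voxels -> at most 8 survivors
random.seed(1)
cloud = [(random.random(), random.random(), random.random()) for _ in range(1000)]
reduced = voxel_downsample(cloud, 0.5)
print(len(reduced))  # → 8
```

Here 1000 points collapse to 8 (a 99.2% reduction); on real mobile-mapping clouds the voxel size is chosen so enough geometry survives to segment individual trees.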

  4. Examples of Optical Assessment of Surface Cleanliness of Genesis Samples

    Science.gov (United States)

    Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.; Gonzalez, C. P.

    2013-01-01

    Optical microscope assessment of Genesis solar wind collector surfaces is a coordinated part of the effort to obtain an assessed clean subset of flown wafer material for the scientific community. Microscopic survey is typically done at 50X magnification on selected approximately 1 square millimeter areas of the fragment surface. This survey is performed each time a principal investigator (PI) returns a sample to JSC for documentation as part of the established cleaning plan. The cleaning plan encompasses sample handling and analysis by Genesis science team members, and optical survey is done at each step in the process. Sample surface cleaning is performed at JSC (ultrapure water [1] and UV ozone cleaning [2]) and experimentally by other science team members (acid etch [3], acetate replica peels [4], CO2 snow [5], etc.). The effectiveness of each cleaning method can potentially be assessed with optical observation utilizing Image Pro Plus software [6]. Differences in particle counts can be studied and discussed within analysis groups. Approximately 25 samples have been identified as part of the cleaning matrix effort to date.

  5. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  6. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure...... control, UV absorbance measurements and automated data analysis. As little as 15 µl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis...... cycle can be performed in less than 3 min. Bovine serum albumin was used as a model protein to characterize the mixing efficiency and sample consumption of the system. The N2 fragment of an adaptor protein (p120-RasGAP) was used to demonstrate how the device can be used to survey the structural space...

  7. Adjustable virtual pore-size filter for automated sample preparation using acoustic radiation force

    Energy Technology Data Exchange (ETDEWEB)

    Jung, B; Fisher, K; Ness, K; Rose, K; Mariella, R

    2008-05-22

    We present a rapid and robust size-based separation method for high throughput microfluidic devices using acoustic radiation force. We developed a finite element modeling tool to predict the two-dimensional acoustic radiation force field perpendicular to the flow direction in microfluidic devices. Here we compare the results from this model with experimental parametric studies including variations of the PZT driving frequencies and voltages as well as various particle sizes, compressibilities and densities. These experimental parametric studies also provide insight into the development of an adjustable 'virtual' pore-size filter as well as optimal operating conditions for various microparticle sizes. We demonstrated the separation of Saccharomyces cerevisiae and MS2 bacteriophage using acoustic focusing. The acoustic radiation force did not affect the MS2 viruses, and their concentration profile remained unchanged. With optimized design of our microfluidic flow system we were able to achieve yields of > 90% for the MS2 with > 80% of the S. cerevisiae being removed in this continuous-flow sample preparation device.
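    The reason the MS2 viruses are untouched while the much larger yeast cells are focused follows from the cubic radius dependence of the primary radiation force. A rough sketch using the standard Gor'kov standing-wave expression (all material values below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def acoustic_contrast(rho_p, rho_f, kappa_p, kappa_f):
    """Acoustic contrast factor for a small compressible sphere in a fluid."""
    rho_t, kap_t = rho_p / rho_f, kappa_p / kappa_f
    return (5 * rho_t - 2) / (2 * rho_t + 1) - kap_t

def radiation_force(a, phi, E_ac, f, c_f, z):
    """Primary radiation force on a sphere of radius a in a 1D standing wave."""
    k = 2 * np.pi * f / c_f                      # wavenumber
    return 4 * np.pi * phi * k * a**3 * E_ac * np.sin(2 * k * z)

# Illustrative values: dense, stiff particles in water driven at 2 MHz
phi = acoustic_contrast(rho_p=1100.0, rho_f=1000.0,
                        kappa_p=3.0e-10, kappa_f=4.6e-10)
F_yeast = radiation_force(a=2.5e-6, phi=phi, E_ac=10.0, f=2e6, c_f=1480.0, z=1e-4)
F_ms2 = radiation_force(a=13.5e-9, phi=phi, E_ac=10.0, f=2e6, c_f=1480.0, z=1e-4)
print(f"force ratio yeast/MS2 ≈ {F_yeast / F_ms2:.1e}")  # a**3 scaling dominates
```

    Because the force scales with a³, a ~5 µm cell experiences a force millions of times larger than a ~27 nm virion, so only the cells migrate to the pressure node.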

  8. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples, i.e. samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  9. Liquid chromatography coupled with multi-channel electrochemical detection for the determination of daidzin in rat blood sampled by an automated blood sampling system.

    Science.gov (United States)

    Tian, Feifei; Zhu, Yongxin; Long, Hong; Cregor, Meloney; Xie, Fuming; Kissinger, Candice B; Kissinger, Peter T

    2002-05-25

    Daidzin, a soy-derived biologically active natural product, has been reported to inhibit mitochondrial aldehyde dehydrogenase and suppress ethanol intake. This paper describes a method for the determination of daidzin in rat blood. After administration of daidzin, blood samples were periodically collected from awake, freely moving animals by a Culex automated blood sampler. Daidzin was extracted from 50 microl of diluted blood (blood and saline at a ratio of 1:1) with ethyl acetate. Chromatographic separation was achieved within 12 min using a microbore C(18) (100 x 1.0 mm) 3 microm column with a mobile phase containing 20 mM sodium acetate, 0.25 mM EDTA, pH 4.3, 4% methanol and 11% acetonitrile at a flow-rate of 90 microl/min. Detection was attained using a four-channel electrochemical detector with glassy carbon electrodes at oxidation potentials of +1100, 950, 850 and 750 mV vs. Ag/AgCl. The limit of detection for daidzin in rat plasma was 5 ng/ml at a signal-to-noise ratio of 3:1. The extraction recovery of daidzin from rat plasma was over 74%. Linearity was obtained over the range of 25-1000 ng/ml. The intra- and inter-assay precisions were in the ranges of 2.7-6.6 and 1.9-3.7%, respectively. This method is suitable for routine in vivo monitoring of daidzin in rat plasma.
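    The linearity and detection-limit figures quoted above come from a linear calibration fit of detector response against concentration. A short sketch with hypothetical response values (the numbers and units below are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical calibration data: concentration (ng/ml) vs detector response (nA)
conc = np.array([25.0, 50.0, 100.0, 250.0, 500.0, 1000.0])
resp = np.array([0.51, 1.02, 2.05, 5.10, 10.20, 20.30])

slope, intercept = np.polyfit(conc, resp, 1)       # linear calibration fit
r = np.corrcoef(conc, resp)[0, 1]                  # linearity check

noise = 0.03                                       # assumed baseline noise (nA)
lod = (3 * noise - intercept) / slope              # concentration at S/N = 3
print(f"slope = {slope:.4f} nA per ng/ml, r = {r:.4f}, LOD ≈ {lod:.1f} ng/ml")
```

    The limit of detection is the concentration whose predicted signal equals three times the baseline noise, which is the S/N = 3:1 convention used in the abstract.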

  10. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H. [and others]

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  11. Assessment of paclitaxel induced sensory polyneuropathy with "Catwalk" automated gait analysis in mice.

    Directory of Open Access Journals (Sweden)

    Petra Huehnchen

    Full Text Available Neuropathic pain as a symptom of sensory nerve damage is a frequent side effect of chemotherapy. The most common behavioral observation in animal models of chemotherapy induced polyneuropathy is the development of mechanical allodynia, which is quantified with von Frey filaments. The data from one study, however, cannot be easily compared with other studies owing to influences of environmental factors, inter-rater variability and differences in test paradigms. To overcome these limitations, automated quantitative gait analysis was proposed as an alternative, but its usefulness for assessing animals suffering from polyneuropathy has remained unclear. In the present study, we used a novel mouse model of paclitaxel induced polyneuropathy to compare results from electrophysiology and the von Frey method to gait alterations measured with the Catwalk test. To mimic recently improved clinical treatment strategies of gynecological malignancies, we established a mouse model of dose-dense paclitaxel therapy on the common C57Bl/6 background. In this model paclitaxel treated animals developed mechanical allodynia as well as reduced caudal sensory nerve action potential amplitudes indicative of a sensory polyneuropathy. Gait analysis with the Catwalk method detected distinct alterations of gait parameters in animals suffering from sensory neuropathy, revealing a minimized contact of the hind paws with the floor. Treatment of mechanical allodynia with gabapentin improved altered dynamic gait parameters. This study establishes a novel mouse model for investigating the side effects of dose-dense paclitaxel therapy and underlines the usefulness of automated gait analysis as an additional easy-to-use objective test for evaluating painful sensory polyneuropathy.

  12. Equilibrium sampling for a thermodynamic assessment of contaminated sediments

    DEFF Research Database (Denmark)

    Mayer, Philipp; Nørgaard Schmidt, Stine; Mäenpää, Kimmo

    Hydrophobic organic contaminants (HOCs) reaching the aquatic environment are largely stored in sediments. The risk of contaminated sediments is challenging to assess since traditional exhaustive extraction methods yield total HOC concentrations, whereas freely dissolved concentrations (Cfree)...... valid equilibrium sampling (method incorporated QA/QC). The measured equilibrium concentrations in silicone (CSil) can then be divided by silicone/water partition ratios to yield Cfree. CSil can also be compared to CSil from silicone equilibrated with biota in order to determine the equilibrium status...... We will focus on the latest developments in equilibrium sampling concepts and methods. Further, we will explain how these approaches can provide a new basis for a thermodynamic assessment of polluted sediments.

  13. Automated column liquid chromatographic determination of amoxicillin and cefadroxil in bovine serum and muscle tissue using on-line dialysis for sample preparation

    NARCIS (Netherlands)

    Snippe, N; van de Merbel, N C; Ruiter, F P; Steijger, O M; Lingeman, H; Brinkman, U A

    1994-01-01

    A fully automated method is described for the determination of amoxicillin and cefadroxil in bovine serum and muscle tissue. The method is based on the on-line combination of dialysis and solid-phase extraction for sample preparation, and column liquid chromatography with ultraviolet detection. In o

  14. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    Science.gov (United States)

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities, were trained and tested in an automated match to samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  15. Assessing the Bias in Communication Networks Sampled from Twitter

    CERN Document Server

    González-Bailón, Sandra; Rivero, Alejandro; Borge-Holthoefer, Javier; Moreno, Yamir

    2012-01-01

    We collect and analyse messages exchanged in Twitter using two of the platform's publicly available APIs (the search and stream specifications). We assess the differences between the two samples, and compare the networks of communication reconstructed from them. The empirical context is given by political protests taking place in May 2012: we track online communication around these protests for the period of one month, and reconstruct the network of mentions and re-tweets according to the two samples. We find that the search API over-represents the more central users and does not offer an accurate picture of peripheral activity; we also find that the bias is greater for the network of mentions. We discuss the implications of this bias for the study of diffusion dynamics and collective action in the digital era, and advocate the need for more uniform sampling procedures in the study of online communication.

  16. Automated Cognitive Health Assessment From Smart Home-Based Behavior Data.

    Science.gov (United States)

    Dawadi, Prafulla Nath; Cook, Diane Joyce; Schmitter-Edgecombe, Maureen

    2016-07-01

    Smart home technologies offer potential benefits for assisting clinicians by automating health monitoring and well-being assessment. In this paper, we examine the actual benefits of smart home-based analysis by monitoring daily behavior in the home and predicting clinical scores of the residents. To accomplish this goal, we propose a clinical assessment using activity behavior (CAAB) approach to model a smart home resident's daily behavior and predict the corresponding clinical scores. CAAB uses statistical features that describe characteristics of a resident's daily activity performance to train machine learning algorithms that predict the clinical scores. We evaluate the performance of CAAB utilizing smart home sensor data collected from 18 smart homes over two years. We obtain a statistically significant correlation (r = 0.72) between CAAB-predicted and clinician-provided cognitive scores and a statistically significant correlation (r = 0.45) between CAAB-predicted and clinician-provided mobility scores. These prediction results suggest that it is feasible to predict clinical scores using smart home sensor data and learning-based data analysis.
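    CAAB's core step, fitting a model from activity-performance features to clinical scores and checking the Pearson correlation between predicted and clinician-provided values, can be sketched with synthetic data (the features, weights and noise level are invented; plain least squares stands in for the unspecified learning algorithm):

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation between two score vectors."""
    return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

# Hypothetical data: 18 residents, 5 daily-activity performance features
rng = np.random.default_rng(0)
X = rng.normal(size=(18, 5))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 1.5])
y = X @ true_w + rng.normal(scale=1.0, size=18)    # "clinician-provided" scores

# Least-squares fit standing in for CAAB's machine-learning step
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
r = pearson_r(X @ coef, y)
print(f"r = {r:.2f}")
```

    In practice the paper's evaluation would use held-out data rather than the training fit shown here; the sketch only illustrates the feature-to-score regression and the correlation metric.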

  17. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Full Text Available Quantifying the environmental impacts and simulating the energy consumption of a building’s components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative that would lead to a more energy efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategies and systems can be applied. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation involves developing plug-ins for a BIM tool capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and to identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of the proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project will be used to illustrate the workability of the proposed methodology.

  18. Assessing Community Health Risks: Proactive Vs Reactive Sampling

    Directory of Open Access Journals (Sweden)

    Sarah Taylor

    2009-01-01

    Full Text Available Problem statement: A considerable number of native birds died in the West Australian coastal town of Esperance and surroundings during late 2006 and early 2007, which raised community concerns about environmental contamination. Forensic investigations of dead birds suggested that lead may have been the causative agent. At the time, lead and nickel, as well as iron ore and other materials, were being exported through the Port of Esperance (port). Government agencies undertook a targeted environmental sampling programme to identify the exposure sources and the extent of contamination. Results of ambient air monitoring, blood lead level investigations and analysis of metals in rainwater tanks suggested widespread contamination of the Esperance town site with lead and nickel. The Department of Environment and Conservation (DEC) retained Golder Associates Pty Ltd (Golder) to undertake a human health and ecological risk assessment (risk assessment) using the information collected through the investigation of lead and nickel contamination in Esperance. The quantity and quality of exposure data are important contributors to the uncertainty associated with the outcomes of a risk assessment. Conclusion: As the data were collected essentially as part of the emergency response to the events in Esperance, there was some uncertainty about the suitability and completeness of the data for risk assessment. The urgent nature of the emergency response meant that sampling was opportunistic and not necessarily sufficient or suitable for risk assessment from a methodical and scientific perspective. This study demonstrated the need for collecting ‘meaningful and reliable’ data for assessing risks from environmental contamination.

  19. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    Science.gov (United States)

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples.
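    The "center of gravity" peak picking that this method builds on reports each above-threshold spectral band as its intensity-weighted centroid rather than a single maximum point. A minimal sketch on a synthetic spectrum (band positions, widths and threshold are illustrative, not cocaine's actual bands):

```python
import numpy as np

def center_of_gravity_peaks(x, y, threshold):
    """Report each above-threshold band as its intensity-weighted centroid."""
    above = np.concatenate(([False], y > threshold, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    starts, stops = edges[::2], edges[1::2]        # paired rise/fall indices
    return [float(np.sum(x[s:e] * y[s:e]) / np.sum(y[s:e]))
            for s, e in zip(starts, stops)]

# Synthetic spectrum: two Gaussian bands on a wavenumber axis (illustrative)
wn = np.linspace(1800.0, 1200.0, 601)
spec = np.exp(-((wn - 1705) / 8) ** 2) + 0.8 * np.exp(-((wn - 1265) / 10) ** 2)
peaks = center_of_gravity_peaks(wn, spec, threshold=0.3)
print([round(p, 1) for p in peaks])
```

    An automated rule like the one described, reporting a hit only when a set number of expected bands are found, would then compare these centroids against a target list with a tolerance window.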

  20. Assessment of the 296-S-21 Stack Sampling Probe Location

    Energy Technology Data Exchange (ETDEWEB)

    Glissmeyer, John A.

    2006-09-08

    Tests were performed to assess the suitability of the location of the air sampling probe on the 296-S-21 stack according to the criteria of ANSI N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities. Pacific Northwest National Laboratory conducted most tests on a 3.67:1 scale model of the stack. CH2MHill also performed some limited confirmatory tests on the actual stack. The tests assessed the capability of the air-monitoring probe to extract a sample representative of the effluent stream. The tests were conducted for the practical combinations of operating fans and addressed: (1) Angular Flow--The purpose is to determine whether the velocity vector is aligned with the sampling nozzle. The average yaw angle relative to the nozzle axis should not be more than 20°. The measured values ranged from 5 to 11 degrees on the scale model and 10 to 12 degrees on the actual stack. (2) Uniform Air Velocity--The gas momentum across the stack cross section where the sample is extracted should be well mixed or uniform. The uniformity is expressed as the variability of the measurements about the mean, the coefficient of variance (COV). The lower the COV value, the more uniform the velocity. The acceptance criterion is that the COV of the air velocity must be ≤20% across the center two-thirds of the area of the stack. At the location simulating the sampling probe, the measured values ranged from 4 to 11%, which are within the criterion. To confirm the validity of the scale model results, air velocity uniformity measurements were made both on the actual stack and on the scale model at the test ports 1.5 stack diameters upstream of the sampling probe. The results ranged from 6 to 8% COV on the actual stack and 10 to 13% COV on the scale model. The average difference for the eight runs was 4.8% COV, which is within the validation criterion. The fact that the scale model results were slightly higher than the
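    The uniformity criterion above is expressed as a coefficient of variance (COV): the standard deviation of the velocity traverse as a percentage of its mean. Computing it is straightforward (the velocity readings below are hypothetical, not the test data):

```python
import numpy as np

def cov_percent(values):
    """Coefficient of variance: sample std as a percentage of the mean."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical velocity traverse (m/s) across the centre two-thirds of the stack
velocities = [12.1, 12.4, 11.8, 12.6, 12.0, 12.3, 11.9, 12.5]
print(f"COV = {cov_percent(velocities):.1f} %  (acceptance criterion: <= 20 %)")
```

    The same statistic serves for both the velocity-uniformity and tracer-gas-uniformity checks in stack qualification work; only the measured quantity changes.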

  1. Fully automated assessment of inflammatory cell counts and cytokine expression in bronchial tissue.

    NARCIS (Netherlands)

    Sont, J.K.; Boer, W.I.; Schadewijk, W.A. van; Grunberg, K.; Krieken, J.H.J.M. van; Hiemstra, P.S.; Sterk, P.J.

    2003-01-01

    Automated image analysis of bronchial tissue offers the opportunity to quantify stained area and staining intensity in a standardized way to obtain robust estimates of inflammatory cell counts and cytokine expression from multiple large areas of histopathologic sections. We compared fully automated

  2. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    Science.gov (United States)

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper discovers that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  3. Equilibrium sampling for a thermodynamic assessment of contaminated sediments

    DEFF Research Database (Denmark)

    Hydrophobic organic contaminants (HOCs) reaching the aquatic environment are largely stored in sediments. The risk of contaminated sediments is challenging to assess since traditional exhaustive extraction methods yield total HOC concentrations, whereas freely dissolved concentrations (Cfree...... of polluted sediments. Glass jars with µm-thin silicone coatings on the inner walls can be used for ex situ equilibration while a device housing several silicone-coated fibers can be used for in situ equilibration. In both cases, parallel sampling with varying silicone thicknesses can be applied to confirm...... valid equilibrium sampling (method incorporated QA/QC). The measured equilibrium concentrations in silicone (Csil) can then be divided by silicone/water partition ratios to yield Cfree. CSil can also be compared to CSil from silicone equilibrated with biota in order to determine the equilibrium status...

  4. Holistic approach for automated background EEG assessment in asphyxiated full-term infants

    Science.gov (United States)

    Matic, Vladimir; Cherian, Perumpillichira J.; Koolen, Ninah; Naulaers, Gunnar; Swarte, Renate M.; Govaert, Paul; Van Huffel, Sabine; De Vos, Maarten

    2014-12-01

    Objective. To develop an automated algorithm to quantify background EEG abnormalities in full-term neonates with hypoxic ischemic encephalopathy. Approach. The algorithm classifies 1 h of continuous neonatal EEG (cEEG) into a mild, moderate or severe background abnormality grade. These classes are well established in the literature and a clinical neurophysiologist labeled 272 1 h cEEG epochs selected from 34 neonates. The algorithm is based on adaptive EEG segmentation and mapping of the segments into the so-called segments’ feature space. Three features are suggested and further processing is obtained using a discretized three-dimensional distribution of the segments’ features represented as a 3-way data tensor. Further classification has been achieved using recently developed tensor decomposition/classification methods that reduce the size of the model and extract a significant and discriminative set of features. Main results. Effective parameterization of cEEG data has been achieved resulting in high classification accuracy (89%) to grade background EEG abnormalities. Significance. For the first time, the algorithm for the background EEG assessment has been validated on an extensive dataset which contained major artifacts and epileptic seizures. The demonstrated high robustness, while processing real-case EEGs, suggests that the algorithm can be used as an assistive tool to monitor the severity of hypoxic insults in newborns.
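    The discretized three-dimensional distribution of segment features, i.e. the 3-way data tensor the classifier operates on, amounts to a 3D histogram of segments. A sketch of that construction step only (random stand-in features; the bin count is an assumption, and the tensor decomposition itself is omitted):

```python
import numpy as np

# Hypothetical: 500 EEG segments, each mapped to 3 features (e.g. amplitude,
# duration and a spectral measure), all normalised to [0, 1)
rng = np.random.default_rng(1)
features = rng.random((500, 3))

# Discretise each feature axis into 8 bins -> 8x8x8 tensor of segment counts
bins = 8
idx = np.minimum((features * bins).astype(int), bins - 1)
tensor = np.zeros((bins, bins, bins))
np.add.at(tensor, tuple(idx.T), 1)      # one count per segment

print(tensor.shape, int(tensor.sum()))  # (8, 8, 8) 500
```

    Each 1 h epoch yields one such tensor, and the tensor decomposition/classification step then extracts a compact, discriminative feature set from the stack of epoch tensors.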

  5. Accuracy of MRI volume measurements of breast lesions: comparison between automated, semiautomated and manual assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rominger, Marga B.; Fournell, Daphne; Nadar, Beenarose Thanka; Figiel, Jens H.; Keil, Boris; Heverhagen, Johannes T. [Philipps University, Department of Radiology, Marburg (Germany); Behrens, Sarah N.M. [MeVis GmbH, Bremen (Germany)

    2009-05-15

    The aim of this study was to investigate the efficacy of a dedicated software tool for automated and semiautomated volume measurement in contrast-enhanced (CE) magnetic resonance mammography (MRM). Ninety-six breast lesions with histopathological workup (27 benign, 69 malignant) were re-evaluated by different volume measurement techniques. Volumes of all lesions were extracted automatically (AVM) and semiautomatically (SAVM) from CE 3D MRM and compared with manual 3D contour segmentation (manual volume measurement, MVM, reference measurement technique) and volume estimates based on maximum diameter measurement (MDM). Compared with MVM as reference method MDM, AVM and SAVM underestimated lesion volumes by 63.8%, 30.9% and 21.5%, respectively, with significantly different accuracy for benign (102.4%, 18.4% and 11.4%) and malignant (54.9%, 33.0% and 23.1%) lesions (p<0.05). Inter- and intraobserver reproducibility was best for AVM (mean difference ± 2SD, 1.0 ± 9.7% and 1.8 ± 12.1%) followed by SAVM (4.3 ± 25.7% and 4.3 ± 7.9%), MVM (2.3 ± 38.2% and 8.6 ± 31.8%) and MDM (33.9 ± 128.4% and 9.3 ± 55.9%). SAVM is more accurate for volume assessment of breast lesions than MDM and AVM. Volume measurement is less accurate for malignant than benign lesions. (orig.)

  6. Beyond crosswalks: reliability of exposure assessment following automated coding of free-text job descriptions for occupational epidemiology.

    Science.gov (United States)

    Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L

    2014-05-01

    Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias of exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate the performance of a computer algorithm to translate the narrative description of occupational codes into standard classification of jobs (2010 Standard Occupational Classification) in an epidemiological context. The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that are in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probability of exceeding the OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). 
In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on presence of ambiguity in assigned job classification (κ
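The agreement statistics used above (κ for binary exposure assignment, rank correlation for ordered predictions) are straightforward to compute. A minimal sketch of Cohen's kappa in Python, using entirely hypothetical exposure assignments rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    # Expected agreement if the two coders assigned labels independently
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical binary exposure assignments (1 = exposed) for ten workers,
# one list per coding method (manual vs automated)
manual    = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
automated = [1, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print(round(cohens_kappa(manual, automated), 2))  # → 0.58
```

Values above roughly 0.6 are conventionally read as "substantial" agreement, which is the scale on which the κ range reported in the record is interpreted.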

  7. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  8. Influence of commonly used primer systems on automated ribosomal intergenic spacer analysis of bacterial communities in environmental samples.

    Directory of Open Access Journals (Sweden)

    Witoon Purahong

Full Text Available Due to the high diversity of bacteria in many ecosystems, their slow generation times, specific but mostly unknown nutrient requirements and syntrophic interactions, isolation based approaches in microbial ecology mostly fail to describe microbial community structure. Thus, cultivation independent techniques, which rely on directly extracted nucleic acids from the environment, are a well-used alternative. For example, bacterial automated ribosomal intergenic spacer analysis (B-ARISA is one of the widely used methods for fingerprinting bacterial communities after PCR-based amplification of selected regions of the operon coding for rRNA genes using community DNA. However, B-ARISA alone does not provide any taxonomic information and the results may be severely biased in relation to the primer set selection. Furthermore, amplified DNA stemming from mitochondrial or chloroplast templates might strongly bias the obtained fingerprints. In this study, we determined the applicability of three different B-ARISA primer sets to the study of bacterial communities. The results from in silico analysis harnessing publicly available sequence databases showed that all three primer sets tested are specific to bacteria but only two primer sets assure high bacterial taxa coverage (1406f/23Sr and ITSF/ITSReub. Considering the study of bacteria in a plant interface, the primer set ITSF/ITSReub was found to amplify (in silico) sequences of some important crop species such as Sorghum bicolor and Zea mays. Bacterial genera and plant species potentially amplified by different primer sets are given. These data were confirmed when DNA extracted from soil and plant samples were analyzed. The presented information could be useful when interpreting existing B-ARISA results and planning B-ARISA experiments, especially when plant DNA can be expected.

  9. Assessment of the application of an automated electronic milk analyzer for the enumeration of total bacteria in raw goat milk.

    Science.gov (United States)

    Ramsahoi, L; Gao, A; Fabri, M; Odumeru, J A

    2011-07-01

    Automated electronic milk analyzers for rapid enumeration of total bacteria counts (TBC) are widely used for raw milk testing by many analytical laboratories worldwide. In Ontario, Canada, Bactoscan flow cytometry (BsnFC; Foss Electric, Hillerød, Denmark) is the official anchor method for TBC in raw cow milk. Penalties are levied at the BsnFC equivalent level of 50,000 cfu/mL, the standard plate count (SPC) regulatory limit. This study was conducted to assess the BsnFC for TBC in raw goat milk, to determine the mathematical relationship between the SPC and BsnFC methods, and to identify probable reasons for the difference in the SPC:BsnFC equivalents for goat and cow milks. Test procedures were conducted according to International Dairy Federation Bulletin guidelines. Approximately 115 farm bulk tank milk samples per month were tested for inhibitor residues, SPC, BsnFC, psychrotrophic bacteria count, composition (fat, protein, lactose, lactose and other solids, and freezing point), and somatic cell count from March 2009 to February 2010. Data analysis of the results for the samples tested indicated that the BsnFC method would be a good alternative to the SPC method, providing accurate and more precise results with a faster turnaround time. Although a linear regression model showed good correlation and prediction, tests for linearity indicated that the relationship was linear only beyond log 4.1 SPC. The logistic growth curve best modeled the relationship between the SPC and BsnFC for the entire sample population. The BsnFC equivalent to the SPC 50,000 cfu/mL regulatory limit was estimated to be 321,000 individual bacteria count (ibc)/mL. This estimate differs considerably from the BsnFC equivalent for cow milk (121,000 ibc/mL). Because of the low frequency of bulk tank milk pickups at goat farms, 78.5% of the samples had their oldest milking in the tank to be 6.5 to 9.0 d old when tested, compared with the cow milk samples, which had their oldest milking at 4 d
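The logistic growth curve mentioned above maps log SPC onto log BsnFC counts across the whole sample population. A minimal sketch of such a four-parameter logistic model in Python; the curve parameters below are made up for illustration and do not reproduce the study's fitted coefficients:

```python
import math

def logistic(x, upper, lower, x0, k):
    """Four-parameter logistic curve: predicted log10 BsnFC from log10 SPC.

    upper/lower: asymptotes; x0: inflection point; k: steepness.
    """
    return lower + (upper - lower) / (1 + math.exp(-k * (x - x0)))

def bsnfc_equivalent(spc_cfu_per_ml, params):
    """Predicted BsnFC count (ibc/mL) for a given SPC (cfu/mL)."""
    log_bsnfc = logistic(math.log10(spc_cfu_per_ml), *params)
    return 10 ** log_bsnfc

# Hypothetical parameters (upper, lower, x0, k) for illustration only
params = (6.8, 3.0, 5.2, 1.4)
print(f"{bsnfc_equivalent(50_000, params):,.0f} ibc/mL")
```

In practice the parameters would be fitted to paired SPC/BsnFC measurements; the regulatory BsnFC equivalent is then read off the fitted curve at the SPC limit, as the study does at 50,000 cfu/mL.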

  10. An automated version of the BAT Syntactic Comprehension task for assessing auditory L2 proficiency in healthy adults.

    Science.gov (United States)

    Achim, André; Marquis, Alexandra

    2011-06-01

Studies of bilingualism sometimes require healthy subjects to be assessed for proficiency at auditory sentence processing in their second language (L2). The Syntactic Comprehension task of the Bilingual Aphasia Test could satisfy this need. For ease and uniformity of application, we automated its English (Paradis, M., Libben, G., & Hummel, K. (1987). The Bilingual Aphasia Test. English version. Hillsdale, NJ: Lawrence Erlbaum Associates) and French (Paradis, M., & Goldblum, M. C. (1987). The Bilingual Aphasia Test, French version. Hillsdale, NJ: Lawrence Erlbaum Associates) versions. Although the Bilingual Aphasia Test is meant to assess neurological disorders affecting language, we hypothesised that ceiling performance in L2 would be rare and L2 errors should be consistent with lack of processing automaticity. Initial data from 13 French-English and 4 English-French bilinguals confirm these expectations. Thus, the automated Syntactic Comprehension task (available online for PC and Mac platforms) is indeed suited to test bilingual English and French proficiency levels in healthy adults.

  11. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism

    OpenAIRE

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-01-01

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels,...

  12. Assessment of Pain Response in Capsaicin-Induced Dynamic Mechanical Allodynia Using a Novel and Fully Automated Brushing Device

    Directory of Open Access Journals (Sweden)

    Kristian G du Jardin

    2013-01-01

    Full Text Available BACKGROUND: Dynamic mechanical allodynia is traditionally induced by manual brushing of the skin. Brushing force and speed have been shown to influence the intensity of brush-evoked pain. There are still limited data available with respect to the optimal stroke number, length, force, angle and speed. Therefore, an automated brushing device (ABD was developed, for which brushing angle and speed could be controlled to enable quantitative assessment of dynamic mechanical allodynia.

  13. Assessing Office Automation Effect on Performance Using Balanced Scorecard approach Case Study: Esfahan Education Organizations and Schools

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Moshref Javadi

    2013-09-01

Full Text Available The survival of an organization depends on its dynamic interaction with its internal and external environment. Given the development of technology and its effect on organizational performance, organizations need to implement these technologies in order to be successful. This research explores the relationship between the implementation of office automation and performance using structural equation modeling (SEM). The study is an applied survey with a descriptive method. The statistical population consisted of managers of offices and schools of the Ministry of Education in Esfahan and Lenjan city; 130 individuals were selected randomly as the sample. Content and construct validity were used to evaluate the validity of the questionnaire, and the relations between the variables of this research were confirmed using SEM. Based on the results, the estimated standardized effect of office automation on performance was 0.83. The test of the main hypothesis therefore confirms that office automation, as implemented in the studied organizations, can improve organizational performance.

  14. Performance assessment of automated tissue characterization for prostate H and E stained histopathology

    Science.gov (United States)

    DiFranco, Matthew D.; Reynolds, Hayley M.; Mitchell, Catherine; Williams, Scott; Allan, Prue; Haworth, Annette

    2015-03-01

Reliable automated prostate tumor detection and characterization in whole-mount histology images is sought in many applications, including post-resection tumor staging and as ground-truth data for multi-parametric MRI interpretation. In this study, an ensemble-based supervised classification algorithm for high-resolution histology images was trained on tile-based image features including histogram and gray-level co-occurrence statistics. The algorithm was assessed using different combinations of H and E prostate slides from two separate medical centers and at two different magnifications (400x and 200x), with the aim of applying tumor classification models to new data. Slides from both datasets were annotated by expert pathologists in order to identify homogeneous cancerous and non-cancerous tissue regions of interest, which were then categorized as (1) low-grade tumor (LG-PCa), including Gleason 3 and high-grade prostatic intraepithelial neoplasia (HG-PIN), (2) high-grade tumor (HG-PCa), including various Gleason 4 and 5 patterns, or (3) non-cancerous, including benign stroma and benign prostatic hyperplasia (BPH). Classification models for both LG-PCa and HG-PCa were separately trained using a support vector machine (SVM) approach, and per-tile tumor prediction maps were generated from the resulting ensembles. Results showed high sensitivity for predicting HG-PCa with an AUC up to 0.822 using training data from both medical centers, while LG-PCa showed a lower sensitivity of 0.763 with the same training data. Visual inspection of cancer probability heatmaps from 9 patients showed that 17/19 tumors were detected, and HG-PCa generally produced fewer false positives than LG-PCa.
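The tile-based gray-level co-occurrence statistics used as classifier features above can be sketched compactly. The example below computes a normalized, symmetric GLCM for one pixel offset and two common derived features (contrast and homogeneity) on a toy quantized tile; the tile values and level count are illustrative, not the paper's settings:

```python
def glcm(tile, levels, dx=1, dy=0):
    """Normalized, symmetric gray-level co-occurrence matrix for one offset."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(tile), len(tile[0])
    pairs = 0
    for y in range(rows):
        for x in range(cols):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols:
                i, j = tile[y][x], tile[ny][nx]
                m[i][j] += 1  # count the pair in both directions (symmetric GLCM)
                m[j][i] += 1
                pairs += 2
    return [[v / pairs for v in row] for row in m]

def contrast(g):
    """High when co-occurring gray levels differ strongly."""
    return sum(g[i][j] * (i - j) ** 2 for i in range(len(g)) for j in range(len(g)))

def homogeneity(g):
    """High when co-occurring gray levels are similar (1.0 for a flat tile)."""
    return sum(g[i][j] / (1 + abs(i - j)) for i in range(len(g)) for j in range(len(g)))

# Toy 4-level "tile"; real pipelines quantize H&E tiles to a small number
# of gray levels before computing per-tile GLCM features
tile = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]
g = glcm(tile, levels=4)
print(contrast(g), homogeneity(g))
```

Feature vectors of this kind, computed per tile and per offset, are what the SVM ensembles in the record are trained on.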

  15. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    Energy Technology Data Exchange (ETDEWEB)

    Kertesz, Vilmos [ORNL; Van Berkel, Gary J [ORNL

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.

  16. Carotid Catheterization and Automated Blood Sampling Induce Systemic IL-6 Secretion and Local Tissue Damage and Inflammation in the Heart, Kidneys, Liver and Salivary Glands in NMRI Mice

    DEFF Research Database (Denmark)

    Teilmann, Anne Charlotte; Rozell, Björn; Kalliokoski, Otto

    2016-01-01

Automated blood sampling through a vascular catheter is a frequently utilized technique in laboratory mice. The potential immunological and physiological implications associated with this technique have, however, not been investigated in detail. The present study compared plasma levels of the cytokines IL-1β, IL-2, IL-6, IL-10, IL-17A, GM-CSF, IFN-γ and TNF-α in male NMRI mice that had been subjected to carotid artery catheterization and subsequent automated blood sampling with age-matched control mice. Body weight and histopathological changes in the surgical area, including the salivary glands, the heart, brain, spleen, liver, kidneys and lungs were compared. Catheterized mice had higher levels of IL-6 than did control mice, but other cytokine levels did not differ between the groups. No significant difference in body weight was found. The histology revealed inflammatory and regenerative (healing

  17. Sampling for Soil Carbon Stock Assessment in Rocky Agricultural Soils

    Science.gov (United States)

    Beem-Miller, Jeffrey P.; Kong, Angela Y. Y.; Ogle, Stephen; Wolfe, David

    2016-01-01

Coring methods commonly employed in soil organic C (SOC) stock assessment may not accurately capture soil rock fragment (RF) content or soil bulk density (ρb) in rocky agricultural soils, potentially biasing SOC stock estimates. Quantitative pits are considered less biased than coring methods but are invasive and often cost-prohibitive. We compared fixed-depth and mass-based estimates of SOC stocks (0.3-m depth) for hammer, hydraulic push, and rotary coring methods relative to quantitative pits at four agricultural sites ranging in RF content from <0.01 to 0.24 m³ m⁻³. Sampling costs were also compared. Coring methods significantly underestimated RF content at all rocky sites, but significant differences (p < 0.05) in SOC stocks between pits and corers were only found with the hammer method using the fixed-depth approach at the <0.01 m³ m⁻³ RF site (pit, 5.80 kg C m⁻²; hammer, 4.74 kg C m⁻²) and at the 0.14 m³ m⁻³ RF site (pit, 8.81 kg C m⁻²; hammer, 6.71 kg C m⁻²). The hammer corer also underestimated ρb at all sites, as did the hydraulic push corer at the 0.21 m³ m⁻³ RF site. No significant differences in mass-based SOC stock estimates were observed between pits and corers. Our results indicate that (i) calculating SOC stocks on a mass basis can overcome biases in RF and ρb estimates introduced by sampling equipment and (ii) a quantitative pit is the optimal sampling method for establishing reference soil masses, followed by rotary and then hydraulic push corers.
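A fixed-depth SOC stock is essentially the product of fine-earth bulk density, sampling depth, rock-free volume fraction, and C concentration, which makes the rock-fragment bias easy to illustrate. The values below are invented, not the study's measurements; note that the study's corers also underestimated bulk density, which biases stocks in the opposite direction to the missed-RF effect shown here:

```python
def soc_stock_fixed_depth(bulk_density, depth_m, rock_frac_vol, c_conc):
    """SOC stock (kg C m^-2) to a fixed depth.

    bulk_density:  fine-earth bulk density, kg m^-3
    depth_m:       sampling depth, m
    rock_frac_vol: rock fragment content, m^3 m^-3 (assumed C-free)
    c_conc:        organic C concentration of the fine earth, kg C per kg soil
    """
    return bulk_density * depth_m * (1 - rock_frac_vol) * c_conc

# Illustrative comparison: a corer that fails to capture rock fragments
# treats the whole volume as fine earth and inflates the fixed-depth stock
pit  = soc_stock_fixed_depth(1300, 0.3, 0.14, 0.02)  # RF correctly measured
core = soc_stock_fixed_depth(1300, 0.3, 0.00, 0.02)  # RF content missed
print(round(pit, 2), round(core, 2))
```

Mass-based stocks avoid this by comparing equal soil masses rather than equal depths, which is why the record reports no pit-versus-corer differences on a mass basis.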

  18. Assessment of human exposure to airborne fungi in agricultural confinements: personal inhalable sampling versus stationary sampling.

    Science.gov (United States)

    Adhikari, Atin; Reponen, Tiina; Lee, Shu-An; Grinshpun, Sergey A

    2004-01-01

Accurate exposure assessment to airborne fungi in agricultural environments is essential for estimating the associated occupational health hazards of workers. The objective of this pilot study was to compare personal and stationary sampling for assessing farmers' exposure to airborne fungi in 3 different agricultural confinements located in Ohio, USA (hog farm, dairy farm, and grain farm), using Button Personal Inhalable Samplers. Personal exposures were measured with samplers worn by 3 subjects (each carrying 2 samplers) during 3 types of activities, including animal feeding in the hog farm, cleaning and animal handling in the dairy farm, and soybean unloading and handling in the grain farm. Simultaneously, the stationary measurements were performed using 5 static Button Samplers and 1 revolving Button Sampler. The study showed that the total concentration of airborne fungi ranged from 1.4 × 10⁴ to 1.2 × 10⁵ spores m⁻³ in the 3 confinements. Grain unloading and handling activity generated the highest concentrations of airborne fungi compared to the other 2 activities. Prevalent airborne fungi belonged to Cladosporium, Aspergillus/Penicillium, Ascospores, smut spores, Epicoccum, Alternaria, and Basidiospores. Lower coefficients of variation were observed for the fungal concentrations measured by personal samplers (7-12%) compared to the concentrations measured by stationary samplers (27-37%). No statistically significant difference was observed between the stationary and personal measurement data for the total concentrations of airborne fungi (p > 0.05). Revolving stationary and static stationary Button Samplers demonstrated similar performance characteristics for the collection of airborne fungi. This reflects the low sensitivity of the sampler's efficiency to the wind speed and direction. The results indicate that personal exposure of agricultural workers in confinements may be adequately assessed by placing several Button Samplers simultaneously operating in a

  19. Risk Assessment on the Transition Program for Air Traffic Control Automation System Upgrade

    Directory of Open Access Journals (Sweden)

    Li Dong Bin

    2016-01-01

Full Text Available We analyzed the safety risks of the transition program for an Air Traffic Control (ATC) automation system upgrade using the event tree analysis method. We decomposed the process of each of the three transition phases and built the corresponding event trees, then determined the probability of success of each factor and calculated the overall probability of success of the ATC automation system upgrade transition. In the conclusion, we characterize the safety risk of the transition program according to these results.
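For a serial transition of this kind, the event-tree arithmetic reduces to products of branch success probabilities: a phase succeeds only if all its factors succeed, and the transition succeeds only if all phases do. A sketch with hypothetical factor probabilities (the paper's actual figures are not reproduced here):

```python
def phase_success(factor_probs):
    """A transition phase succeeds only if every contributing factor
    succeeds (independent success/failure branches in the event tree)."""
    p = 1.0
    for prob in factor_probs:
        p *= prob
    return p

def transition_success(phases):
    """Overall success requires all transition phases to succeed in sequence."""
    return phase_success([phase_success(f) for f in phases])

# Hypothetical per-factor success probabilities for the three phases
phases = [
    [0.999, 0.995, 0.990],  # phase 1
    [0.998, 0.992],         # phase 2
    [0.999, 0.997, 0.985],  # phase 3
]
print(f"P(successful transition) = {transition_success(phases):.4f}")
```

The complement, 1 minus this product, is the residual risk that the conclusion of the paper characterizes.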

  20. Automated extraction of 11-nor-delta9-tetrahydrocannabinol carboxylic acid from urine samples using the ASPEC XL solid-phase extraction system.

    Science.gov (United States)

    Langen, M C; de Bijl, G A; Egberts, A C

    2000-09-01

    The analysis of 11-nor-delta9-tetrahydrocannabinol-carboxylic acid (THCCOOH, the major metabolite of cannabis) in urine with gas chromatography and mass spectrometry (GC-MS) and solid-phase extraction (SPE) sample preparation is well documented. Automated SPE sample preparation of THCCOOH in urine, although potentially advantageous, is to our knowledge poorly investigated. The objective of the present study was to develop and validate an automated SPE sample-preparation step using ASPEC XL suited for GC-MS confirmation analysis of THCCOOH in urine drug control. The recoveries showed that it was not possible to transfer the protocol for the manual SPE procedure with the vacuum manifold to the ASPEC XL without loss of recovery. Making the sample more lipophilic by adding 1 mL 2-propanol after hydrolysis to the urine sample in order to overcome the problem of surface adsorption of THCCOOH led to an extraction efficiency (77%) comparable to that reached with the vacuum manifold (84%). The reproducibility of the automated SPE procedure was better (coefficient of variation 5%) than that of the manual procedure (coefficient of variation 12%). The limit of detection was 1 ng/mL, and the limit of quantitation was 4 ng/mL. Precision at the 12.5-ng/mL level was as follows: mean, 12.4 and coefficient of variation, 3.0%. Potential carryover was evaluated, but a carryover effect could not be detected. It was concluded that the proposed method is suited for GC-MS confirmation urinalysis of THCCOOH for prisons and detoxification centers.

  1. Assessing tiger population dynamics using photographic capture-recapture sampling

    Science.gov (United States)

    Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E.

    2006-01-01

Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of γ″ = γ′ = 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size Nt varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as λ = 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, Bt, varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km² to 21.73 ± 1.7 tigers/100 km² during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain
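The geometric mean rate of annual population change reported above is computed directly from the series of yearly abundance estimates. A sketch with a hypothetical abundance series (not the study's estimates):

```python
def geometric_mean_lambda(abundances):
    """Geometric mean annual rate of change from yearly abundance
    estimates N_1..N_T: lambda = (N_T / N_1) ** (1 / (T - 1))."""
    t = len(abundances)
    return (abundances[-1] / abundances[0]) ** (1 / (t - 1))

# Hypothetical nine-year series of estimated population sizes
n_hat = [20, 22, 19, 21, 23, 22, 24, 25, 26]
lam = geometric_mean_lambda(n_hat)
print(f"lambda = {lam:.3f}")  # values > 1 indicate growth
```

A λ of 1.03, as in the record, corresponds to roughly 3% average annual growth despite year-to-year fluctuations in the series.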

  2. Assessment of Automated Analyses of Cell Migration on Flat and Nanostructured Surfaces

    DEFF Research Database (Denmark)

    Gradinaru, Cristian; Lopacinska, Joanna M.; Huth, Johannes;

    2012-01-01

    Motility studies of cells often rely on computer software that analyzes time-lapse recorded movies and establishes cell trajectories fully automatically. This raises the question of reproducibility of results, since different programs could yield significantly different results of such automated...

  3. Sampling cows to assess lying time for on-farm animal welfare assessment.

    Science.gov (United States)

    Vasseur, E; Rushen, J; Haley, D B; de Passillé, A M

    2012-09-01

The time that dairy cows spend lying down is an important measure of their welfare, and data loggers can be used to automatically monitor lying time on commercial farms. To determine how the number of days of sampling, parity, stage of lactation, and production level affect lying time, electronic data loggers were used to record lying time for 10 d consecutively, at 3 stages of lactation [early: when cows were at 10-40 d in milk (DIM), mid: 100-140 DIM, late: 200-240 DIM] of 96 Holstein cows in tiestalls (TS) and 127 in freestalls (FS). We calculated daily duration of lying, bout frequency, and mean bout duration. We observed complex interactions between parity and stage of lactation, which differed somewhat between tiestalls and freestalls. First-parity cows had higher bout frequency and shorter lying bouts than older cows but bout frequency decreased and mean bout duration increased as DIM increased. We found that individual cows were not consistent in time spent lying between early and mid lactation (Pearson coefficient, TS: r = 0.1, FS: r = 0.2), whereas cows seemed to be more consistent in time spent lying between mid and late lactation (TS: r = 0.7, FS: r = 0.3). For both TS and FS cows, daily milk production was significantly, but slightly negatively, correlated with lying time across the lactation (range, r: -0.2 to -0.4), whereas parity was slightly to moderately positively correlated with mean bout duration across the lactation (r: +0.2 to +0.6) and negatively with bout frequency (r: -0.2 to -0.5). To estimate how the duration of the time sample affected the estimates of lying time, subsets of data consisting of 1, 2, 3, 4, 5, 6, 7, 8, and 9 d per cow were created, and the relationship between the overall mean (based on 10 d) and the mean of each subset was tested by regression. For both TS and FS, lying time based on 4 d of sampling provided good estimates of the average 10-d estimate (90% accuracy). Automated monitoring of lying time has
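The adequacy of a k-day sample can be checked by comparing its mean against the full 10-d mean, which is the quantity the regression analysis above evaluates across cows. A simplified per-cow sketch using invented daily lying times:

```python
def subset_accuracy(daily_minutes, k):
    """Mean of the first k days, expressed as percent agreement with the
    mean over the full recording period (100 = identical means)."""
    full = sum(daily_minutes) / len(daily_minutes)
    sub = sum(daily_minutes[:k]) / k
    return 100 * (1 - abs(sub - full) / full)

# Hypothetical 10-day lying times (min/d) for a single cow
days = [680, 655, 710, 690, 640, 700, 665, 720, 650, 695]
for k in (1, 2, 4, 7):
    print(k, round(subset_accuracy(days, k), 1))
```

The study's conclusion that 4 d of sampling reaches about 90% accuracy comes from running this kind of comparison (via regression) over the whole herd, not a single animal as here.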

  4. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    Science.gov (United States)

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis
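AutoVAR's model selection boils down to scoring every candidate model with an information criterion and keeping the minimum. A sketch of AIC/BIC-based selection over hypothetical candidate fits; the residual sums of squares and parameter counts below are invented for illustration:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a least-squares fit:
    n observations, k estimated parameters, residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

def bic(rss, n, k):
    """Bayesian information criterion; penalizes parameters more than AIC."""
    return n * math.log(rss / n) + k * math.log(n)

def best_model(candidates, n, criterion=aic):
    """candidates: {name: (rss, n_params)}; the lowest criterion value wins."""
    return min(candidates, key=lambda m: criterion(candidates[m][0], n, candidates[m][1]))

# Hypothetical fits: a richer VAR lag order lowers RSS but adds parameters
candidates = {"VAR(1)": (42.0, 5), "VAR(2)": (35.0, 9), "VAR(3)": (34.5, 13)}
print(best_model(candidates, n=90, criterion=aic),  # AIC favors VAR(2)
      best_model(candidates, n=90, criterion=bic))  # stricter BIC keeps VAR(1)
```

Exhaustively scoring a combinatorial model space this way is tedious by hand but trivial to automate, which is the core point of the record.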

  5. Performance of automated software in the assessment of segmental left ventricular function in cardiac CT: Comparison with cardiac magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

Wang, Rui [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Capital Medical University, Department of Radiology, Beijing Anzhen Hospital, Beijing (China); Meinel, Felix G. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Canstein, Christian [Siemens Medical Solutions USA, Malvern, PA (United States); Spearman, James V. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); De Cecco, Carlo N. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome "Sapienza", Departments of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2015-12-15

    To evaluate the accuracy, reliability and time saving potential of a novel cardiac CT (CCT)-based, automated software for the assessment of segmental left ventricular function compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated. Segmental LV wall motion was automatically calculated and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in both systolic and diastolic phases on polar map, CCT, and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5 %, 64.8 ± 1.4 % and 67.1 ± 1.4 %, respectively. Agreement for the assessment of LV wall motion between CCT, CMR and polar maps was good. Bland-Altman plots and ICC indicated good agreement between CCT, CMR and automated polar maps of the diastolic and systolic segmental wall thickness and thickening. The processing time using polar map was significantly decreased compared with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides similar measurements to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. (orig.)
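Segmental wall thickening, the quantity compared across the three methods above, is the systolic increase in wall thickness relative to its end-diastolic value. A one-function sketch with illustrative numbers (per-segment values are averaged in practice, so the study's mean thickening is not simply derived from its mean thicknesses):

```python
def wall_thickening_pct(diastolic_mm, systolic_mm):
    """Segmental LV wall thickening: systolic increase relative to
    end-diastolic wall thickness, in percent."""
    return 100 * (systolic_mm - diastolic_mm) / diastolic_mm

# One hypothetical segment, roughly in the range reported by the record
print(round(wall_thickening_pct(9.2, 14.9), 1))  # → 62.0
```

Per-segment thickening values of this kind are what the automated software renders as a colour-coded polar map.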

  6. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  7. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    Directory of Open Access Journals (Sweden)

    Demir Sumeyra U

    2012-12-01

    Full Text Available Abstract. Background: Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated tool that can extract microvasculature information and quantitatively monitor changes in tissue perfusion would be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods: The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. The algorithm has two main parts: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation step, the microvascular network is extracted using multiple-level thresholding and pixel verification techniques. Threshold levels are selected using histogram information from a set of training video recordings. Pixel-by-pixel differences are calculated across frames to identify active blood vessels and capillaries with flow. Results: Sublingual microcirculatory videos were recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings were analyzed visually, and the functional capillary density (FCD) values calculated by the algorithm were compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD
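The core of the segmentation step described above (intensity thresholding plus frame-to-frame differencing to keep only vessels with flow) can be sketched in a few lines of NumPy. This is a simplified illustration with a single threshold level, not the authors' implementation; the function names and default thresholds are ours:

```python
import numpy as np

def active_vessel_mask(frames, intensity_thresh=100.0, motion_thresh=10.0):
    """Flag pixels that are both dark (vessel-like) and temporally varying (flowing).

    frames: list of equal-sized 2-D grayscale arrays from a stabilized video.
    """
    stack = np.stack(frames).astype(float)            # shape (T, H, W)
    vessel = stack.mean(axis=0) < intensity_thresh    # static appearance test
    motion = np.abs(np.diff(stack, axis=0)).mean(axis=0) > motion_thresh
    return vessel & motion                            # perfused-vessel pixels

def fcd_proxy(mask):
    """Fraction of the field of view occupied by perfused vessels (an FCD surrogate)."""
    return float(mask.mean())
```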

  8. Respondent-Driven Sampling: An Assessment of Current Methodology*

    OpenAIRE

    Gile, Krista J.; Handcock, Mark S.

    2010-01-01

    Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The primary goal of RDS is typically to estimate population averages in the hard-to-reach population. The current estimates make strong assumptions in order to treat the dat...
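For context, the estimator most commonly assessed in this literature is the inverse-degree weighted (Volz-Heckathorn, RDS-II) mean, which down-weights high-degree respondents because they are more likely to be recruited. A minimal sketch (ours, not the authors' code):

```python
def rds_ii_estimate(values, degrees):
    """Volz-Heckathorn (RDS-II) estimator of a population mean.

    values: outcome y_i for each respondent (e.g. a 0/1 trait indicator).
    degrees: each respondent's self-reported network degree d_i (> 0).
    Weights each observation by 1/d_i to offset degree-biased inclusion.
    """
    num = sum(y / d for y, d in zip(values, degrees))
    den = sum(1.0 / d for d in degrees)
    return num / den
```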

  9. Preliminary biogeochemical assessment of EPICA LGM and Holocene ice samples

    Science.gov (United States)

    Bulat, S.; Alekhina, I.; Marie, D.; Wagenbach, D.; Raynaud, D.; Petit, J. R.

    2009-04-01

    weak signals could be generated, which are now being cloned. The signals were hard to reproduce because of the rather low sample volumes. More ice volume is needed to make the biosignal stronger and reproducible. In the meantime we are adjusting the PCR and, in addition, testing a DNA repair-enzyme cocktail in case of DNA damage. As a preliminary conclusion we would like to highlight the following: both Holocene and LGM ice samples (EDC99 and EDML) are very clean in terms of ultra-low biomass and ultra-low DOC content. The most basal ice of the EDC and EDML ice cores could help in assessing microbial biomass and diversity, if present, under the glacier at the ice-bedrock boundary. * The present-day consortium includes S. Bulat, I. Alekhina, P. Normand, D. Prieur, J-R. Petit and D. Raynaud (France) and E. Willerslev and J.P. Steffensen (Denmark)

  10. Energy Impact of Different Penetrations of Connected and Automated Vehicles: A Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rios-Torres, Jackeline [ORNL; Malikopoulos, Andreas [ORNL

    2016-01-01

    Previous research reported in the literature has shown the benefits of traffic coordination in alleviating congestion and reducing fuel consumption and emissions. However, many challenges remain to be addressed before a massive deployment of fully automated vehicles. This paper investigates the energy impacts of different penetration rates of connected and automated vehicles (CAVs) and their interaction with human-driven vehicles. We develop a simulation framework for mixed traffic (CAVs interacting with human-driven vehicles) on merging roadways and analyze the impact of different CAV penetration rates on energy consumption. The Gipps car-following model is used along with heuristic controls to represent driver decisions in a merging-roadways traffic scenario. The simulation results indicate that for low penetration rates the fuel consumption benefits are significant but total travel time increases; the travel time benefits become noticeable at higher CAV penetration rates.
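The Gipps model mentioned above updates each follower's speed as the minimum of a free-flow acceleration term and a braking-safety term. A compact Python sketch following Gipps' standard 1981 formulation; the parameter values are illustrative defaults, not those used in the paper:

```python
import math

def gipps_next_speed(v, v_lead, gap,
                     V=30.0,      # desired speed (m/s)
                     a=1.7,       # maximum acceleration (m/s^2)
                     b=-3.0,      # maximum braking (negative, m/s^2)
                     b_hat=-3.0,  # assumed leader braking (negative)
                     tau=0.7,     # driver reaction time (s)
                     s=6.5):      # effective leader length plus margin (m)
    """Follower speed one reaction time tau ahead, per Gipps (1981)."""
    # Free-flow term: accelerate toward the desired speed V.
    v_acc = v + 2.5 * a * tau * (1.0 - v / V) * math.sqrt(0.025 + v / V)
    # Safe-braking term: fastest speed that still allows stopping behind the leader.
    under = b * b * tau * tau - b * (2.0 * (gap - s) - v * tau - v_lead * v_lead / b_hat)
    v_safe = b * tau + math.sqrt(max(under, 0.0))
    return max(0.0, min(v_acc, v_safe))
```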

  11. AN ACCURACY ASSESSMENT OF AUTOMATED PHOTOGRAMMETRIC TECHNIQUES FOR 3D MODELING OF COMPLEX INTERIORS

    OpenAIRE

    Georgantas, A.; M. Brédif; Pierrot-Desseilligny, M.

    2012-01-01

    This paper presents a comparison of automatic photogrammetric techniques to terrestrial laser scanning for 3D modelling of complex interior spaces. We try to evaluate the automated photogrammetric techniques not only in terms of their geometric quality compared to laser scanning but also in terms of cost in money, acquisition and computational time. For this purpose we chose a modern building’s stairway as test site. APERO/MICMAC (©IGN), which is an open-source photogrammetric softwar...

  12. Automated DNA-based plant identification for large-scale biodiversity assessment.

    Science.gov (United States)

    Papadopoulou, Anna; Chesters, Douglas; Coronado, Indiana; De la Cadena, Gissela; Cardoso, Anabela; Reyes, Jazmina C; Maes, Jean-Michel; Rueda, Ricardo M; Gómez-Zurita, Jesús

    2015-01-01

    Rapid degradation of tropical forests urges us to improve our efficiency in large-scale biodiversity assessment. DNA barcoding can assist greatly in this task, but commonly used phenetic approaches for DNA-based identifications rely on the existence of comprehensive reference databases, which are infeasible for hyperdiverse tropical ecosystems. Alternatively, phylogenetic methods are more robust to sparse taxon sampling but time-consuming, while multiple alignment of species-diagnostic, typically length-variable, markers can be problematic across divergent taxa. We advocate the combination of phylogenetic and phenetic methods for taxonomic assignment of DNA-barcode sequences against incomplete reference databases such as GenBank, and we developed a pipeline to implement this approach on large-scale plant diversity projects. The pipeline workflow includes several steps: database construction and curation, query sequence clustering, sequence retrieval, distance calculation, multiple alignment and phylogenetic inference. We describe the strategies used to establish these steps and the optimization of parameters to fit the selected psbA-trnH marker. We tested the pipeline using infertile plant samples and herbivore diet sequences from the highly threatened Nicaraguan seasonally dry forest and exploiting a valuable purpose-built resource: a partial local reference database of plant psbA-trnH. The selected methodology proved efficient and reliable for high-throughput taxonomic assignment, and our results corroborate the advantage of applying 'strict' tree-based criteria to avoid false positives. The pipeline tools are distributed as the scripts suite 'BAGpipe' (pipeline for Biodiversity Assessment using GenBank data), which can be readily adjusted to the purposes of other projects and applied to sequence-based identification for any marker or taxon.
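The distance-calculation step of such a pipeline reduces, at its simplest, to computing p-distances between a query barcode and aligned reference sequences and keeping the closest hit. A toy sketch (ours, not BAGpipe; in practice tree-based criteria are layered on top, as the abstract notes, and the species names below are hypothetical):

```python
def p_distance(seq_a, seq_b):
    """Proportion of mismatched sites between two aligned sequences, ignoring gap positions."""
    pairs = [(x, y) for x, y in zip(seq_a, seq_b) if x != '-' and y != '-']
    return sum(x != y for x, y in pairs) / len(pairs)

def closest_reference(query, references):
    """Name of the reference sequence with the smallest p-distance to the query."""
    return min(references, key=lambda name: p_distance(query, references[name]))
```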

  13. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities.

  14. Assessment of Automated Analyses of Cell Migration on Flat and Nanostructured Surfaces

    Directory of Open Access Journals (Sweden)

    Hans A Kestler

    2012-07-01

    Full Text Available Motility studies of cells often rely on computer software that analyzes time-lapse recorded movies and establishes cell trajectories fully automatically. This raises the question of reproducibility of results, since different programs could yield significantly different results of such automated analysis. The fact that the segmentation routines of such programs are often challenged by nanostructured surfaces makes the question more pertinent. Here we illustrate how it is possible to track cells on bright field microscopy images with image analysis routines implemented in an open-source cell tracking program, PACT (Program for Automated Cell Tracking). We compare the automated motility analysis of three cell tracking programs, PACT, Autozell, and TLA, using the same movies as input for all three programs. We find that different programs track overlapping, but different subsets of cells due to different segmentation methods. Unfortunately, population averages based on such different cell populations differ significantly in some cases. Thus, results obtained with one software package are not necessarily reproducible by other software.


  16. Detection of Giardia lamblia, Cryptosporidium spp. and Entamoeba histolytica in clinical stool samples by using multiplex real-time PCR after automated DNA isolation.

    Science.gov (United States)

    Van Lint, P; Rossen, J W; Vermeiren, S; Ver Elst, K; Weekx, S; Van Schaeren, J; Jeurissen, A

    2013-01-01

    Diagnosis of intestinal parasites in stool samples is generally still carried out by microscopy; however, this technique is known to suffer from low sensitivity and is unable to discriminate between certain protozoa. To overcome these limitations, a real-time multiplex PCR was evaluated as an alternative approach for diagnosing Giardia lamblia, Cryptosporidium spp. and Entamoeba histolytica in stool samples. Therefore, a total of 631 faecal samples were analysed both by microscopy and by real-time PCR following automated DNA extraction. Results showed that real-time PCR exhibited a sensitivity and specificity of 100% each, whereas traditional microscopy exhibited a sensitivity and specificity of 37.5% and 99.8%, respectively. As real-time PCR provides simple, sensitive and specific detection of these three important pathogenic protozoan parasites, this technique, rather than microscopy, has become our diagnostic method of choice for the detection of enteric protozoan parasites for the majority of patients.
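The two headline figures are standard confusion-matrix ratios. A minimal sketch; the counts in the usage below are hypothetical, chosen only to reproduce the reported microscopy percentages:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```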

  17. Automated ambulatory assessment of cognitive performance, environmental conditions, and motor activity during military operations

    Science.gov (United States)

    Lieberman, Harris R.; Kramer, F. Matthew; Montain, Scott J.; Niro, Philip; Young, Andrew J.

    2005-05-01

    Until recently scientists had limited opportunities to study human cognitive performance in non-laboratory, fully ambulatory situations. Recently, advances in technology have made it possible to extend behavioral assessment to the field environment. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device, now widely employed, can acquire minute-by-minute information on an individual's level of motor activity. Actigraphs can, with reasonable accuracy, distinguish sleep from waking, the most critical and basic aspect of human behavior. However, rapid technologic advances have provided the opportunity to collect much more information from fully ambulatory humans. Our laboratory has developed a series of wrist-worn devices, not much larger than a watch, which can assess simple and choice reaction time, vigilance and memory. In addition, the devices can concurrently assess motor activity with much greater temporal resolution than the standard actigraph. Furthermore, they continuously monitor multiple environmental variables including temperature, humidity, sound and light. We have employed these monitors during training and simulated military operations to collect information that would typically be unavailable under such circumstances. In this paper we describe various versions of the vigilance monitor and how each successive version extended the capabilities of the device. Samples of data from several studies are presented, including studies conducted in harsh field environments during simulated infantry assaults, a Marine Corps Officer training course and mechanized infantry (Stryker) operations. The monitors have been useful for documenting environmental conditions experienced by wearers, studying patterns of sleep and activity, and examining the effects of nutritional manipulations on warfighter performance.

  18. Automated DNA Sequencing System

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can adopt automation cost-effectively as the laboratory grows.

  19. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism.

    Science.gov (United States)

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-10-12

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.

  20. Automated Writing Evaluation for Formative Assessment of Second Language Writing: Investigating the Accuracy and Usefulness of Feedback as Part of Argument-Based Validation

    Science.gov (United States)

    Ranalli, Jim; Link, Stephanie; Chukharev-Hudilainen, Evgeny

    2017-01-01

    An increasing number of studies on the use of tools for automated writing evaluation (AWE) in writing classrooms suggest growing interest in their potential for formative assessment. As with all assessments, these applications should be validated in terms of their intended interpretations and uses. A recent argument-based validation framework…

  1. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    Science.gov (United States)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates huge image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, representing images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D confocal image. The CAD scheme was applied to each confocal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In the four scanned specimen slides, CAD generated 1676 confocal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cyto-geneticists in detecting cervical cancers.
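The inter-observer agreement statistic used here, Cohen's kappa, corrects raw agreement for agreement expected by chance. A minimal sketch (ours, not the study's code):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels of the same items."""
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    # Chance agreement: product of each rater's marginal label frequencies.
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (observed - expected) / (1.0 - expected)
```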

  2. Post-operative corticosterone levels in plasma and feces of mice subjected to permanent catheterization and automated blood sampling.

    Science.gov (United States)

    Sundbom, Renée; Jacobsen, Kirsten R; Kalliokoski, Otto; Hau, Jann; Abelson, Klas S P

    2011-01-01

    This study investigated the effects of surgical placement of permanent arterial catheters on plasma corticosterone levels, fecal corticosterone excretion and body weight in male BALB/c/Sca mice. In addition, the effects of voluntarily ingested buprenorphine in doses of 0.5 and 1.0 mg/kg body weight on these parameters were studied. A catheter was placed in the carotid artery during isoflurane anesthesia. Immediately after surgery, the mice were connected to an AccuSampler® μ and blood samples for plasma corticosterone quantification were collected automatically during the first 24 h postoperatively. All fecal boli produced 24 h before and 24 h after surgery were collected for fecal corticosterone excretion measures and the pre- and post-operative body weights were registered. Plasma corticosterone levels were in the range of 150-300 ng/ml after the surgical procedure and the body weight was significantly lower 24 h after surgery compared to its pre-operative value. Contrary to what was expected, the total fecal corticosterone excretion was significantly reduced 24 h after surgery, as was the defecation. Buprenorphine treatment significantly lowered the plasma corticosterone levels, but had no effect on fecal corticosterone excretion or body weight change. It was concluded that surgical placement of an arterial catheter induces a significant stress response, as judged by its effect on plasma corticosterone and body weight. Voluntary ingestion of buprenorphine improved postoperative recovery by lowering plasma corticosterone concentrations. Neither fecal corticosterone excretion nor body weight change seems suitable for postoperative stress assessment in mice in the present experimental setup.

  3. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits AmpFlSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI...

  4. A molecular method to assess Phytophthora diversity in environmental samples.

    Science.gov (United States)

    Scibetta, Silvia; Schena, Leonardo; Chimento, Antonio; Cacciola, Santa O; Cooke, David E L

    2012-03-01

    Current molecular detection methods for the genus Phytophthora are specific to a few key species rather than the whole genus and this is a recognized weakness of protocols for ecological studies and international plant health legislation. In the present study a molecular approach was developed to detect Phytophthora species in soil and water samples using novel sets of genus-specific primers designed against the internal transcribed spacer (ITS) regions. Two different rDNA primer sets were tested: one assay amplified a long product including the ITS1, 5.8S and ITS2 regions (LP) and the other a shorter product including the ITS1 only (SP). Both assays specifically amplified products from Phytophthora species without cross-reaction with the related Pythium s. lato, however the SP assay proved the more sensitive and reliable. The method was validated using woodland soil and stream water from Invergowrie, Scotland. On-site use of a knapsack sprayer and in-line water filters proved more rapid and effective than centrifugation at sampling Phytophthora propagules. A total of 15 different Phytophthora phylotypes were identified which clustered within the reported ITS-clades 1, 2, 3, 6, 7 and 8. The range and type of the sequences detected varied from sample to sample and up to three and five different Phytophthora phylotypes were detected within a single sample of soil or water, respectively. The most frequently detected sequences were related to members of ITS-clade 6 (i.e. P. gonapodyides-like). The new method proved very effective at discriminating multiple species in a given sample and can also detect as yet unknown species. The reported primers and methods will prove valuable for ecological studies, biosecurity and commercial plant, soil or water (e.g. irrigation water) testing as well as the wider metagenomic sampling of this fascinating component of microbial pathogen diversity.

  5. Automated cytochrome c oxidase bioassay developed for ionic liquids' toxicity assessment.

    Science.gov (United States)

    Costa, Susana P F; Martins, Bárbara S F; Pinto, Paula C A G; Saraiva, M Lúcia M F S

    2016-05-15

    A fully automated cytochrome c oxidase assay resorting to sequential injection analysis (SIA) was developed for the first time and implemented to evaluate potentially toxic compounds. The bioassay was validated by evaluation of 15 ionic liquids (ILs) with distinct cationic head groups, alkyl side chains and anions. The assay was based on the reduction of cytochrome c oxidase activity in the presence of the tested compounds and quantification of the inhibitor concentration required to cause 50% inhibition of enzyme activity (EC50). The obtained results demonstrated that enzyme activity was considerably inhibited by the BF4 anion and by ILs incorporating non-aromatic pyrrolidinium and tetrabutylphosphonium cation cores. Emim [Ac] and chol [Ac], on the contrary, presented the highest EC50 values among the ILs tested. The developed automated SIA methodology is a simple and robust high-throughput screening bioassay and exhibited good repeatability in all the tested conditions (rsd<3.7%, n=10). Therefore, it is expected that due to its simplicity and low cost, the developed approach can be used as an alternative to traditional screening assays for the evaluation of IL toxicity and the identification of possible toxicophore structures. Additionally, the results presented in this study provide further information about IL toxicity.
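An EC50 of the kind reported here can be read off an inhibition curve by log-linear interpolation between the two concentrations that bracket 50% residual activity. A simple sketch (ours, independent of the SIA instrumentation; curve-fitting approaches such as a Hill equation fit are the more rigorous alternative):

```python
import math

def ec50_interpolated(concentrations, activities):
    """Concentration giving 50% of control enzyme activity.

    concentrations: ascending tested concentrations (> 0).
    activities: residual activity at each concentration, as % of uninhibited control.
    Interpolates linearly in log10(concentration) between the bracketing points.
    """
    points = list(zip(concentrations, activities))
    for (c1, a1), (c2, a2) in zip(points, points[1:]):
        if a1 >= 50.0 >= a2:
            frac = (a1 - 50.0) / (a1 - a2)
            return 10.0 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% activity is not bracketed by the data")
```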

  6. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research was to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on the reduction of benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. A full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) is developed by Almen Laboratories and was used to achieve these results.

  7. Protein Quality Assessment on Saliva Samples for Biobanking Purposes.

    Science.gov (United States)

    Rosa, Nuno; Marques, Jéssica; Esteves, Eduardo; Fernandes, Mónica; Mendes, Vera M; Afonso, Ângela; Dias, Sérgio; Pereira, Joaquim Polido; Manadas, Bruno; Correia, Maria José; Barros, Marlene

    2016-08-01

    Biobank saliva sample quality depends on specific criteria applied to collection, processing, and storage. In spite of the growing interest in saliva as a diagnostic fluid, few biobanks currently store large collections of such samples. The development of a standard operating procedure (SOP) for saliva collection and quality control is fundamental for the establishment of a new saliva biobank, which stores samples to be made available to the saliva research community. Different collection methods were tested regarding total volume of protein obtained, protein content, and protein profiles, and the results were used to choose the best method for protein studies. Furthermore, the impact of the circadian variability and inter- and intraindividual differences, as well as the saliva sample stability at room temperature, were also evaluated. Considering our results, a sublingual cotton roll method for saliva collection proved to produce saliva with the best characteristics and should be applied in the morning, whenever possible. In addition, there is more variability in salivary proteins between individuals than in the same individual for a 5-month period. According to the electrophoretic protein profile, protein stability is guaranteed for 24 hours at room temperature and the protein degradation profile and protein identification were characterized. All this information was used to establish an SOP for saliva collection, processing, and storage in a biobank. We conclude that it is possible to collect saliva using an easy and inexpensive protocol, resulting in saliva samples for protein analysis with sufficient quality for biobanking purposes.

  8. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  9. Metabolomic Quality Assessment of EDTA Plasma and Serum Samples.

    Science.gov (United States)

    Malm, Linus; Tybring, Gunnel; Moritz, Thomas; Landin, Britta; Galli, Joakim

    2016-10-01

    Handling and processing of blood can significantly alter the molecular composition and consistency of biobank samples and can have a major impact on the identification of biomarkers. It is thus crucial to identify tools to determine the quality of samples to be used in biomarker discovery studies. In this study, a non-targeted gas chromatography/time-of-flight mass spectrometry (GC-TOFMS) metabolomic strategy was used with the aim of identifying quality markers for serum and plasma biobank collections lacking proper documentation of preanalytical handling. The effect of postcentrifugation delay was examined in serum stored in tubes with gel separation plugs and ethylenediaminetetraacetic acid (EDTA) plasma in tubes with or without gel separation plugs. The change in metabolic pattern was negligible in all sample types processed within 3 hours after centrifugation regardless of whether the samples were kept at 4°C or 22°C. After 8 and 24 hours postcentrifugation delay before aliquoting, there was a pronounced increase in the number of affected metabolites, as well as in the magnitude of the observed changes. No protective effect on the metabolites was observed in gel-separated EDTA plasma samples. In a separate series of experiments, lactate and glucose levels were determined in plasma to estimate the effect of precentrifugation delay. This separate experiment indicates that the lactate to glucose ratio may serve as a marker to identify samples with delayed time to centrifugation. Although our data from the untargeted GC-TOFMS analysis did not identify any specific markers, we conclude that plasma and serum metabolic profiles remain quite stable when plasma and serum are centrifuged and separated from the blood cells within 3 hours.

  10. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from the topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
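The limit-surface search described above can be sketched with a toy stand-in: a cheap disc-membership test plays the role of the expensive simulation, and each new "run" is placed where the current samples suggest the failure/success boundary lies. Everything here (the simulator, the nearest-neighbour surrogate, the candidate-pool scoring) is an illustrative assumption, not the paper's actual machinery.

```python
import math
import random

def simulate(x):
    """Stand-in for an expensive simulation run: 'failure' (1) when the
    point lies outside a disc of radius 0.6 (a made-up limit surface)."""
    return 1 if math.hypot(x[0], x[1]) > 0.6 else 0

def adaptive_sample(n_adapt=60, seed=7):
    # Initial space-filling design: a coarse 5x5 grid over [0, 1]^2.
    pts = [(i / 4, j / 4) for i in range(5) for j in range(5)]
    samples = [(p, simulate(p)) for p in pts]
    rng = random.Random(seed)
    for _ in range(n_adapt):
        # Score a candidate pool and run the simulation only at the point
        # the surrogate places closest to the boundary, i.e. where the
        # distances to the nearest failure and nearest success are equal.
        cands = [(rng.random(), rng.random()) for _ in range(200)]

        def boundary_gap(c):
            d_fail = min(math.dist(c, p) for p, y in samples if y == 1)
            d_ok = min(math.dist(c, p) for p, y in samples if y == 0)
            return abs(d_fail - d_ok)

        nxt = min(cands, key=boundary_gap)
        samples.append((nxt, simulate(nxt)))
    return samples

samples = adaptive_sample()
adapted = samples[25:]  # the adaptively chosen points
near = sum(abs(math.hypot(*p) - 0.6) < 0.1 for p, _ in adapted)
print(f"{near}/{len(adapted)} adaptive picks within 0.1 of the limit surface")
```

Most adaptive picks cluster around the true boundary r = 0.6, whereas uniform sampling would spend most runs far from it; that concentration of effort on the limit surface is the point of the scheme.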

  11. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    Energy Technology Data Exchange (ETDEWEB)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Andruch, Vasil, E-mail: vasil.andruch@upjs.sk [Department of Analytical Chemistry, University of P.J. Šafárik, SK-04154 Košice (Slovakia); Moskvin, Leonid [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Bulatov, Andrey, E-mail: bulatov_andrey@mail.ru [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation)

    2016-01-01

    A first attempt to automate effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L{sup −1} Na{sub 2}CO{sub 3}) and the proton donor solution (1 mol L{sup −1} CH{sub 3}COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent throughout the aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disrupting the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min{sup −1} for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L{sup −1} of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L{sup −1}. - Highlights: • First attempt to automate effervescence assisted dispersive liquid–liquid microextraction. • Automation based on a stepwise injection analysis manifold in a flow batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.
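The Beer's-law calibration and 3σ blank criterion mentioned above follow standard arithmetic; the sketch below uses invented calibration points and blank readings (not the paper's data) to show how the slope and the LOD are obtained.

```python
import statistics

# Hypothetical calibration: absorbance at 345 nm vs. antipyrine concentration.
conc = [1.5, 10, 25, 50, 100]                      # µmol L-1
absorb = [0.012, 0.080, 0.200, 0.400, 0.800]       # absorbance units (linear)

# Least-squares slope of the Beer's-law calibration line.
mx, my = statistics.mean(conc), statistics.mean(absorb)
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, absorb))
         / sum((x - mx) ** 2 for x in conc))

# 3-sigma LOD from the scatter of repeated blank measurements.
blank = [0.0011, 0.0009, 0.0013, 0.0010, 0.0012]   # blank absorbances
lod = 3 * statistics.stdev(blank) / slope
print(f"slope = {slope:.4f} AU per µmol/L, LOD = {lod:.3f} µmol/L")
```

With these illustrative numbers the LOD comes out far below the bottom of the calibration range, as it should; the paper's reported value of 0.5 µmol L⁻¹ reflects its own blank noise and slope.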

  12. Genesis Solar Wind Collector Cleaning Assessment: 60366 Sample Case Study

    Science.gov (United States)

    Goreva, Y. S.; Gonzalez, C. P.; Kuhlman, K. R.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, M. C.; Burkett, P. J.

    2014-01-01

    In order to recognize, localize, characterize and remove particle and thin film surface contamination, a small subset of Genesis mission collector fragments are being subjected to extensive study via various techniques [1-5]. Here we present preliminary results for sample 60336, a Czochralski silicon (Si-CZ) based wafer from the bulk array (B/C).

  13. Automated system for generation of soil moisture products for agricultural drought assessment

    Science.gov (United States)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are being used globally for forecasting / early warning of drought and for monitoring drought for its prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources like space based data, ground data and collateral data in short intervals of time, where there may be limitations in terms of processing power, availability of domain expertise, and specialized models & tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model, widely popular for its sensitivity to soil conditions and rainfall parameters, is in vogue for arriving at soil moisture products. This model has been encoded into a "Fish-Bone" architecture using COM technologies and Open Source libraries for the best possible automation, to fulfill the need for a standard procedure for preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for generation of soil moisture products by enabling users to concentrate on further enhancements and implementation of these parameters in related areas of research, without re-discovering the established models. Emphasis of the architecture is mainly on available open source libraries for GIS and raster IO operations for different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for everyday generation of products at specified intervals. The operational software has inbuilt capabilities to automatically
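A minimal sketch of a soil water balance bucket model of the kind the text refers to, under assumed behaviour (a fixed bucket capacity, actual evapotranspiration scaling with bucket fullness); the operational system's crop- and soil-specific terms and raster IO are omitted.

```python
def bucket_model(rain, pet, capacity=150.0, s0=75.0):
    """Single-layer soil water balance ('bucket') model, daily time step.
    rain, pet: daily rainfall and potential evapotranspiration (mm).
    Returns the daily soil moisture storage series (mm). Illustrative only."""
    s, out = s0, []
    for r, e in zip(rain, pet):
        aet = e * (s / capacity)          # actual ET scales with fullness
        s = s + r - aet                   # water balance for the day
        s = min(max(s, 0.0), capacity)    # excess above capacity runs off
        out.append(s)
    return out

moisture = bucket_model(rain=[0, 20, 5, 0, 0], pet=[4, 4, 4, 4, 4])
print([round(s, 1) for s in moisture])
```

Rainfall fills the bucket, evapotranspiration drains it in proportion to how full it is, and anything above capacity is shed as runoff; daily products then come from running this update over gridded rainfall and ET inputs.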

  14. A Robust and Automated Hyperspectral Damage Assessment System Under Varying Illumination Conditions and Viewing Geometry Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Some target signatures of interest in drought monitoring, flooding assessment, fire damage assessment, coastal changes, urban changes, etc. may need to be tracked...

  15. Mass asymmetry and tricyclic wobble motion assessment using automated launch video analysis

    Institute of Scientific and Technical Information of China (English)

    Ryan DECKER; Joseph DONINI; William GARDNER; Jobin JOHN; Walter KOENIG

    2016-01-01

    This paper describes an approach to identify epicyclic and tricyclic motion during projectile flight caused by mass asymmetries in spin-stabilized projectiles. Flight video was captured following projectile launch of several M110A2E1 155 mm artillery projectiles. These videos were then analyzed using the automated flight video analysis method to attain their initial position and orientation histories. Examination of the pitch and yaw histories clearly indicates that in addition to epicyclic motion’s nutation and precession oscillations, an even faster wobble amplitude is present during each spin revolution, even though some of the amplitudes of the oscillation are smaller than 0.02 degree. The results are compared to a sequence of shots where little appreciable mass asymmetries were present, and only nutation and precession frequencies are predominantly apparent in the motion history results. Magnitudes of the wobble motion are estimated and compared to product of inertia measurements of the asymmetric projectiles.

  16. An Accuracy Assessment of Automated Photogrammetric Techniques for 3d Modeling of Complex Interiors

    Science.gov (United States)

    Georgantas, A.; Brédif, M.; Pierrot-Desseilligny, M.

    2012-07-01

    This paper presents a comparison of automatic photogrammetric techniques to terrestrial laser scanning for 3D modelling of complex interior spaces. We evaluate the automated photogrammetric techniques not only in terms of their geometric quality compared to laser scanning but also in terms of monetary cost and acquisition and computational time. For this purpose we chose a modern building's stairway as a test site. APERO/MICMAC (©IGN), an open-source photogrammetric software suite, was used for the production of the 3D photogrammetric point cloud, which was compared to the one acquired by a Leica Scanstation 2 laser scanner. After performing various qualitative and quantitative controls, we present the advantages and disadvantages of each 3D modelling method applied in a complex interior of a modern building.

  17. Development of Genesis Solar Wind Sample Cleanliness Assessment: Initial Report on Sample 60341 Optical Imagery and Elemental Mapping

    Science.gov (United States)

    Gonzalez, C. P.; Goreva, Y. S.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, P. J.; Burkett, P. J.

    2014-01-01

    Since 2005 the Genesis science team has experimented with techniques for removing contaminant particles and films from the collection surface of the Genesis fragments. A subset of 40 samples has been designated as "cleaning matrix" samples. These are small samples to which various cleaning approaches are applied; cleanliness is then assessed optically, or by TRXRF, SEM, ToF-SIMS, XPS, ellipsometry or other means [1-9]. Most of these samples remain available for allocation, with cleanliness assessment data. This assessment allows evaluation of various cleaning techniques and handling or analytical effects. Cleaning techniques investigated by the Genesis community include acid/base etching, acetate replica peels, ion beam, and CO2 snow jet cleaning [10-16]. JSC provides surface cleaning using UV ozone exposure and ultra-pure water (UPW) [17-20]. The UPW rinse is commonly used to clean samples of handling debris between processing by different researchers. Optical microscopic images of the sample taken before and after UPW cleaning show what has been added or removed during the cleaning process.

  18. Assessing total and volatile solids in municipal solid waste samples.

    Science.gov (United States)

    Peces, M; Astals, S; Mata-Alvarez, J

    2014-01-01

    Municipal solid waste is broadly generated in everyday activities and its treatment is a global challenge. Total solids (TS) and volatile solids (VS) are typical control parameters measured in biological treatments. In this study, the TS and VS were determined using the standard methods, as well as introducing some variants: (i) the drying temperature for the TS assays was 105°C, 70°C or 50°C and (ii) the VS were determined using different heating ramps from room temperature to 550°C. TS could be determined at either 105°C or 70°C, but oven residence time was tripled at 70°C, increasing from 48 to 144 h. The VS could be determined by smouldering the sample (where the sample is burnt without a flame), which avoids the release of fumes and odours in the laboratory. However, smouldering can generate undesired pyrolysis products as a consequence of carbonization, which leads to VS being underestimated. Carbonization can be avoided by using slow heating ramps to prevent oxygen limitation. Furthermore, crushing the sample cores decreased the time to reach constant weight and decreased the potential to underestimate VS.
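The TS and VS assays above reduce to simple gravimetric ratios; a sketch with hypothetical sample weights (the temperatures in the comments follow the standard procedure, not this study's variants):

```python
def total_and_volatile_solids(wet_g, dry_g, ash_g):
    """Gravimetric TS and VS, both as fractions of wet sample mass.
    wet_g: mass as received; dry_g: mass after oven drying (e.g. 105 °C);
    ash_g: residue after ignition at 550 °C."""
    ts = dry_g / wet_g            # total solids
    vs = (dry_g - ash_g) / wet_g  # volatile solids (mass lost on ignition)
    return ts, vs

ts, vs = total_and_volatile_solids(wet_g=10.0, dry_g=4.0, ash_g=1.0)
print(ts, vs)  # 0.4 0.3
```

So this hypothetical sample is 40% total solids by wet weight, of which the volatile fraction lost on ignition accounts for 30 percentage points.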

  19. Alveolar breath sampling and analysis to assess trihalomethane exposures during competitive swimming training.

    OpenAIRE

    Lindstrom, A B; Pleil, J D; Berkoff, D C

    1997-01-01

    Alveolar breath sampling was used to assess trihalomethane (THM) exposures encountered by collegiate swimmers during a typical 2-hr training period in an indoor natatorium. The breath samples were collected at regular intervals before, during, and for 3 hr after a moderately intense training workout. Integrated and grab whole-air samples were collected during the training period to help determine inhalation exposures, and pool water samples were collected to help assess dermal exposures. Resu...

  20. A lab-on-a-chip system integrating tissue sample preparation and multiplex RT-qPCR for gene expression analysis in point-of-care hepatotoxicity assessment.

    Science.gov (United States)

    Lim, Geok Soon; Chang, Joseph S; Lei, Zhang; Wu, Ruige; Wang, Zhiping; Cui, Kemi; Wong, Stephen

    2015-10-21

    A truly practical lab-on-a-chip (LOC) system for point-of-care testing (POCT) hepatotoxicity assessment necessitates the embodiment of full-automation, ease-of-use and "sample-in-answer-out" diagnostic capabilities. To date, the reported microfluidic devices for POCT hepatotoxicity assessment remain rudimentary as they largely embody only semi-quantitative or single sample/gene detection capabilities. In this paper, we describe, for the first time, an integrated LOC system that is somewhat close to a practical POCT hepatotoxicity assessment device - it embodies both tissue sample preparation and multiplex real-time RT-PCR. It features semi-automation, is relatively easy to use, and has "sample-in-answer-out" capabilities for multiplex gene expression analysis. Our tissue sample preparation module incorporating both a microhomogenizer and surface-treated paramagnetic microbeads yielded high purity mRNA extracts, considerably better than manual means of extraction. A primer preloading surface treatment procedure and the single-loading inlet on our multiplex real-time RT-PCR module simplify off-chip handling procedures for ease-of-use. To demonstrate the efficacy of our LOC system for POCT hepatotoxicity assessment, we perform a preclinical animal study with the administration of cyclophosphamide, followed by gene expression analysis of two critical protein biomarkers for liver function tests, aspartate transaminase (AST) and alanine transaminase (ALT). Our experimental results depict normalized fold changes of 1.62 and 1.31 for AST and ALT, respectively, illustrating up-regulations in their expression levels and hence validating their selection as critical genes of interest. In short, we illustrate the feasibility of multiplex gene expression analysis in an integrated LOC system as a viable POCT means for hepatotoxicity assessment.
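The normalized fold changes reported above are consistent with standard relative-quantification arithmetic. Assuming the common Livak 2^-ΔΔCt method and hypothetical Ct values (the paper does not give its raw Ct data or its normalization formula), the calculation looks like:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Livak 2^-ddCt relative quantification: the target gene's Ct is
    normalized to a reference gene within each sample, then the treated
    sample is compared to the control."""
    d_treated = ct_target_treated - ct_ref_treated
    d_control = ct_target_control - ct_ref_control
    return 2 ** -(d_treated - d_control)

# Hypothetical Ct values chosen so the result lands near the reported
# AST fold change (~1.6); they are not measurements from the paper.
fc = fold_change(24.3, 18.0, 25.0, 18.0)
print(round(fc, 2))  # 1.62
```

A fold change above 1 indicates up-regulation relative to control, matching the direction reported for both AST and ALT.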

  1. Automation system risk assessment; Gestao de riscos de sistemas de automacao

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Felipe; Furuta, Margarete [PricewaterhouseCoopers, Sao Paulo, SP (Brazil)

    2008-07-01

    In spite of what has been learnt from the history of several industrial accidents and the protective measures taken by many organizations, extremely serious accidents still happen in the automation environment. On the one hand, growing competition demands increased productivity, which is often only achievable through more complex processes that push facilities to operate at their limits; on the other hand, the control, automation and security related to these more complex processes are also more difficult to manage. The ongoing investigation of past accidents has resulted in the prevention of specific dangerous events at industrial facilities, but it has also brought to light the importance of actions related to the Risk Management Process. Without doubt, the consequences resulting from the materialization of an event can reach disastrous and unrecoverable levels, given the comprehensiveness of the potential risks. Studies carried out by international entities show that inadequate risk management is the factor that contributes most to the occurrence of accidents. The initial phase of risk management consists of analyzing the risks inherent to the process (e.g. determining the probability of each potential failure), studying the consequences if these failures occur, defining the acceptable risk according to the risk appetite established by the organization, and identifying the response to the risk, which can range across a spectrum of reducing, transferring, avoiding or accepting the risk. This work aims to explore the aspects of implementing Risk Management in the Oil and Gas segment. The study also seeks to make explicit, based on the systematic registry of the measured items, how it is possible to evaluate the financial exposure of the risk to which a project

  2. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Full Text Available Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
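The item-to-form division that defines matrix sampling can be sketched as a plain round-robin split; operational designs also balance content coverage and difficulty across forms, so this is only a structural illustration with made-up item labels.

```python
def matrix_sample(items, n_forms):
    """Divide an item pool into test forms so each student takes one
    (short) form while the forms together cover the whole pool."""
    forms = [[] for _ in range(n_forms)]
    for i, item in enumerate(items):
        forms[i % n_forms].append(item)  # round-robin assignment
    return forms

pool = [f"item{i:02d}" for i in range(12)]
forms = matrix_sample(pool, n_forms=3)
print([len(f) for f in forms])  # [4, 4, 4]
```

Each student answers 4 items instead of 12 (less testing time per student), while all 12 items are still administered across the cohort (curriculum coverage), which is exactly the trade-off the paper's cost categories unpack.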

  3. Computerized self-assessment of automated lesion segmentation in breast ultrasound: implication for CADx applied to findings in the axilla

    Science.gov (United States)

    Drukker, K.; Giger, M. L.

    2008-03-01

    We developed a self-assessment method in which the CADx system provides a confidence level for its lesion segmentations. The self-assessment was performed by a fuzzy-inference system based on 4 computer-extracted features of the computer-segmented lesions in a leave-one-case-out evaluation protocol. In instances where the initial segmentation received a low assessment rating, lesions were re-segmented using the same segmentation method but based on a user-defined region-of-interest. A total of 542 cases with 1133 lesions were collected in this study, and we focused here on the 97 normal lymph nodes in this dataset, since these pose challenges for automated segmentation due to their inhomogeneous appearance. The percentage of all lesions with satisfactory segmentation (i.e., normalized overlap with the radiologist-delineated lesion >=0.3) was 85%. For normal lymph nodes, however, this percentage was only 36%. Of the lymph nodes, 53 received a low confidence rating. The confidence levels demonstrated potential to 1) help radiologists decide whether to use or disregard CADx output, and 2) provide a guide for improvement of lesion segmentation.

  4. [Assessment of AFP in amniotic fluid: comparison of three automated techniques].

    Science.gov (United States)

    Leguy, Marie-Clémence; Tavares, Silvina Dos Reis; Tsatsaris, Vassili; Lewin, Fanny; Clauser, Eric; Guibourdenche, Jean

    2011-01-01

    Ultrasound scanning is useful to detect neural tube defects (NTD) but scarcely distinguishes between closed NTD and open NTD, which have very different prognoses. An amniotic fluid puncture is thus mandatory to search for an increase in alpha-fetoprotein (AFP) levels and for the presence of acetylcholinesterase, which identifies open NTD. However, AFP levels fluctuate both with gestational age and with the assay used. Our aim was to establish normative values for AFP in amniotic fluid in the second half of pregnancy using three different immunoassays and to improve their clinical relevance. Amniotic fluid punctures were performed on 527 patients from 9 weeks of gestation (WG) to 37 WG, either for maternal age, trisomy 21 screening, or increased nuchal translucency (control group, n = 527), or for suspicion of neural tube or abdominal defect (n = 5). AFP was measured using the immunoassay developed for serum AFP on the Access 2 system, the Immulite 2000 and the Advia Centaur. Results were expressed in ng/ml, multiples of the median (MoM) and percentiles. AFP decreases by 1.5-fold between 9 and 19 WG. When NTD was suspected, an increase in amniotic AFP was observed (from 2.5 MoM to 9.3 MoM), confirming an open NTD. In conclusion, the assay developed on these three automated analyzers is suitable for the measurement of AFP in amniotic fluid.

  5. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    Directory of Open Access Journals (Sweden)

    Colin J Torney

    Full Text Available Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future.

  6. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    Science.gov (United States)

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-07

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  7. Environmental Assessment of Natural Radioactivity in Soil Samples

    Directory of Open Access Journals (Sweden)

    Ryuta Hazama

    2009-07-01

    Full Text Available The environmental impacts and hazards due to the unstoppable hot mud flow of the East Java ‘LUSI’ mud volcano have been increasing since its unexpected eruption on May 29, 2006. Analysis should be undertaken not only to examine its impact on human health and the environment, but also to explore the potential benefits of the mud flow: one may be able to tap the mud flow as a material source for brick and cement. Recently there has been great concern about the health risks associated with exposure to natural radioactivity present in soil and building materials all over the world. In this context, measurements of natural radioactive isotopes such as the 238U and 232Th series, and 40K, in mud samples were carried out using an HPGe (High-Purity Germanium) detector to determine the re-usability of the mud. The 226Ra, 232Th and 40K activity concentrations were found to be 13±1, 15±1 and 111±3 Bq/kg (1 Bq = 1 s-1), respectively, and the corresponding activity index was found to be 0.16±0.02. These values were compared with previous data, and our measured accuracy was improved by a factor of up to nine. Radium equivalent activity, external and internal hazard indices, and the annual effective dose equivalent were also evaluated, and all were found to be within acceptable limits.
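The radium equivalent activity and external hazard index mentioned above are conventionally computed as weighted sums of the three activity concentrations (Raeq = A_Ra + 1.43 A_Th + 0.077 A_K and Hex = A_Ra/370 + A_Th/259 + A_K/4810). Applying these standard formulas to the reported concentrations gives values well inside the usual limits; note the abstract's "activity index" of 0.16 is a different index, so it need not match Hex.

```python
def radium_equivalent(a_ra, a_th, a_k):
    """Raeq (Bq/kg); the conventional limit for building materials is 370."""
    return a_ra + 1.43 * a_th + 0.077 * a_k

def external_hazard_index(a_ra, a_th, a_k):
    """Hex; values below 1 are conventionally considered acceptable."""
    return a_ra / 370 + a_th / 259 + a_k / 4810

# Activity concentrations reported in the abstract (Bq/kg).
raeq = radium_equivalent(13, 15, 111)
hex_ = external_hazard_index(13, 15, 111)
print(round(raeq, 1), round(hex_, 2))  # 43.0 0.12
```

Both values sit far below their limits (370 Bq/kg and 1, respectively), consistent with the abstract's conclusion that the mud is within acceptable bounds for re-use.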

  8. Automated large scale parameter extraction of road-side trees sampled by a laser mobile mapping system

    NARCIS (Netherlands)

    Lindenbergh, R.C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-01-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside

  9. Evaluation of two automated enzyme-immunoassays for detection of thermophilic campylobacters in faecal samples from cattle and swine

    DEFF Research Database (Denmark)

    Hoorfar, Jeffrey; Nielsen, E.M.; Stryhn, H.

    1999-01-01

    We evaluated the performance of two enzyme-immunoassays (EIA) for the detection of naturally occurring, thermophilic Campylobacter spp. found in faecal samples from cattle (n = 21 and n = 26) and swine (n = 43) relative to the standard culture method, and also assuming that none of the tests was ...

  10. An automated on-line multidimensional HPLC system for protein and peptide mapping with integrated sample preparation

    NARCIS (Netherlands)

    Wagner, K.; Miliotis, T.; Marko-Varga, G; Bischoff, Rainer; Unger, K.K.

    2002-01-01

    A comprehensive on-line two-dimensional HPLC (2D-HPLC) system with integrated sample preparation was developed for the analysis of proteins and peptides with a molecular weight below 20 kDa. The system setup provided fast separations and high resolving power, and is considered to be a complementary technique…

  11. Warehouse automation

    OpenAIRE

    Pogačnik, Jure

    2017-01-01

    An automated high-bay warehouse is commonly used for storing a large number of materials with a high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices such as conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are done automatically. This allows the automated warehouse to op...

  12. Assessing the Alcohol-BMI Relationship in a US National Sample of College Students

    Science.gov (United States)

    Barry, Adam E.; Piazza-Gardner, Anna K.; Holton, M. Kim

    2015-01-01

    Objective: This study sought to assess the body mass index (BMI)-alcohol relationship among a US national sample of college students. Design: Secondary data analysis using the Fall 2011 National College Health Assessment (NCHA). Setting: A total of 44 US higher education institutions. Methods: Participants included a national sample of college…

  13. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    Science.gov (United States)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and a mixture of the effervescence agent (0.5 mol L-1 Na2CO3) and the proton donor solution (1 mol L-1 CH3COOH). Formation of carbon dioxide microbubbles generated in situ leads to dispersion of the extraction solvent throughout the aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disruption of the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min-1 for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5-100 µmol L-1 of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L-1.
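The calibration and detection-limit logic described above can be sketched as follows. The absorbance and blank readings are hypothetical; only the 345 nm Beer's-law range and the 3σ LOD definition come from the abstract:

```python
# Hedged sketch: a linear Beer's-law fit of absorbance vs. antipyrine
# concentration, with the LOD taken as 3*sigma(blank)/slope. All data
# points below are illustrative, not the authors' measurements.
import statistics

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

conc = [1.5, 10, 25, 50, 100]                      # µmol/L, reported range
absorbance = [0.009, 0.060, 0.150, 0.300, 0.600]   # hypothetical A(345 nm)
slope, intercept = linear_fit(conc, absorbance)

blank_reads = [0.001, 0.002, 0.000, 0.001, 0.002]  # hypothetical blank tests
lod = 3 * statistics.stdev(blank_reads) / slope    # µmol/L, 3-sigma criterion
```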

  14. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  15. Real-time automated spectral assessment of the BOLD response for neurofeedback at 3 and 7T.

    Science.gov (United States)

    Koush, Yury; Elliott, Mark A; Scharnowski, Frank; Mathiak, Klaus

    2013-09-15

    Echo-planar imaging is the dominant functional MRI data acquisition scheme for evaluating the BOLD signal. To date, it remains the only approach providing neurofeedback from spatially localized brain activity. Real-time functional single-voxel proton spectroscopy (fSVPS) may be an alternative for spatially specific BOLD neurofeedback at 7T because it allows for a precise estimation of the local T2* signal, EPI-specific artifacts may be avoided, and the signal contrast may increase. In order to explore and optimize this alternative neurofeedback approach, we tested fully automated real-time fSVPS spectral estimation procedures to approximate T2* BOLD signal changes from the unsuppressed water peak, i.e. lorentzian non-linear complex spectral fit (LNLCSF) in frequency and frequency-time domain. The proposed approaches do not require additional spectroscopic localizers in contrast to conventional T2* approximation based on linear regression of the free induction decay (FID). For methods comparison, we evaluated quality measures for signals from the motor and the visual cortex as well as a real-time feedback condition at high (3T) and at ultra-high (7T) magnetic field strengths. Using these methods, we achieved reliable and fast water peak spectral parameter estimations. At 7T, we observed an absolute increase of spectra line narrowing due to the BOLD effect, but quality measures did not improve due to artifactual line broadening. Overall, the automated fSVPS approach can be used to assess dynamic spectral changes in real-time, and to provide localized T2* neurofeedback at 3 and 7T.

  16. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    Energy Technology Data Exchange (ETDEWEB)

    Rosato, Antonio [University of Florence, Department of Chemistry and Magnetic Resonance Center (Italy); Vranken, Wim [Vrije Universiteit Brussel, Structural Biology Brussels (Belgium); Fogh, Rasmus H.; Ragan, Timothy J. [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom); Tejero, Roberto [Universidad de Valencia, Departamento de Química Física (Spain); Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H. [University of Georgia, Complex Carbohydrate Research Center and Northeast Structural Genomics Consortium (United States); Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H. [University of Toronto, Department of Medical Biophysics, Cancer Genomics and Proteomics, Ontario Cancer Institute, Northeast Structural Genomics Consortium (Canada); Kennedy, Michael [Miami University, Department of Chemistry and Biochemistry, Northeast Structural Genomics Consortium (United States); Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, Northeast Structural Genomics Consortium, Rutgers (United States); Vuister, Geerten W., E-mail: gv29@le.ac.uk [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom)

    2015-08-15

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  17. An assessment of two automated snow water equivalent instruments during the WMO Solid Precipitation Intercomparison Experiment

    Science.gov (United States)

    Smith, Craig D.; Kontu, Anna; Laffin, Richard; Pomeroy, John W.

    2017-01-01

    During the World Meteorological Organization (WMO) Solid Precipitation Intercomparison Experiment (SPICE), automated measurements of snow water equivalent (SWE) were made at the Sodankylä (Finland), Weissfluhjoch (Switzerland) and Caribou Creek (Canada) SPICE sites during the northern hemispheric winters of 2013/14 and 2014/15. Supplementary intercomparison measurements were made at Fortress Mountain (Kananaskis, Canada) during the 2013/14 winter. The objectives of this analysis are to compare automated SWE measurements with a reference, comment on their performance and, where possible, to make recommendations on how to best use the instruments and interpret their measurements. Sodankylä, Caribou Creek and Fortress Mountain hosted a Campbell Scientific CS725 passive gamma radiation SWE sensor. Sodankylä and Weissfluhjoch hosted a Sommer Messtechnik SSG1000 snow scale. The CS725 operating principle is based on measuring the attenuation of soil emitted gamma radiation by the snowpack and relating the attenuation to SWE. The SSG1000 measures the mass of the overlying snowpack directly by using a weighing platform and load cell. Manual SWE measurements were obtained at the intercomparison sites on a bi-weekly basis over the accumulation-ablation periods using bulk density samplers. These manual measurements are considered to be the reference for the intercomparison. Results from Sodankylä and Caribou Creek showed that the CS725 generally overestimates SWE as compared to manual measurements by roughly 30-35 % with correlations (r2) as high as 0.99 for Sodankylä and 0.90 for Caribou Creek. The RMSE varied from 30 to 43 mm water equivalent (mm w.e.) and from 18 to 25 mm w.e. at Sodankylä and Caribou Creek, which had respective SWE maximums of approximately 200 and 120 mm w.e. The correlation at Fortress Mountain was 0.94 (RMSE of 48 mm w.e. with a maximum SWE of approximately 650 mm w.e.) with no systematic overestimation. The SSG1000 snow scale, having a different
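A minimal sketch of the intercomparison statistics quoted above (RMSE, r², and percent overestimation). The paired SWE values are synthetic, chosen only to mimic the roughly 30-35 % CS725 bias relative to manual measurements:

```python
# Hedged sketch of the SWE intercomparison metrics: RMSE, squared correlation
# and mean relative bias between automated and manual (reference) SWE.
import math

def rmse(a, b):
    """Root-mean-square error between paired series, in mm w.e."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def r_squared(a, b):
    """Squared Pearson correlation of paired series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov ** 2 / (va * vb)

manual = [20, 50, 90, 140, 200]    # mm w.e., bulk density sampler reference
sensor = [27, 66, 119, 186, 266]   # mm w.e., synthetic ~33 % overestimate

bias_pct = 100 * (sum(sensor) / sum(manual) - 1)   # systematic overestimation
```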

  18. A Large-Sample Test of a Semi-Automated Clavicle Search Engine to Assist Skeletal Identification by Radiograph Comparison.

    Science.gov (United States)

    D'Alonzo, Susan S; Guyomarc'h, Pierre; Byrd, John E; Stephan, Carl N

    2017-01-01

    In 2014, a morphometric capability to search chest radiograph databases by quantified clavicle shape was published to assist skeletal identification. Here, we extend the validation tests conducted by increasing the search universe 18-fold, from 409 to 7361 individuals to determine whether there is any associated decrease in performance under these more challenging circumstances. The number of trials and analysts were also increased, respectively, from 17 to 30 skeletons, and two to four examiners. Elliptical Fourier analysis was conducted on clavicles from each skeleton by each analyst (shadowgrams trimmed from scratch in every instance) and compared to the search universe. Correctly matching individuals were found in shortlists of 10% of the sample 70% of the time. This rate is similar to, although slightly lower than, rates previously found for much smaller samples (80%). Accuracy and reliability are thereby maintained, even when the comparison system is challenged by much larger search universes.

  19. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Featuring new developments in the field combined with all aspects of obtaining, interpreting, and using sample data, Sampling provides an up-to-date treat…

  20. An Automated Grass-Based Procedure to Assess the Geometrical Accuracy of the Openstreetmap Paris Road Network

    Science.gov (United States)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.

    2016-06-01

    OpenStreetMap (OSM) is the largest spatial database in the world. One of the most frequently occurring geospatial elements within this database is the road network, whose quality is crucial for applications such as routing and navigation. Several methods have been proposed for the assessment of OSM road network quality; however, they are often tightly coupled to the characteristics of the authoritative dataset involved in the comparison. This makes it hard to replicate and extend these methods. This study relies on an automated procedure which was recently developed for comparing OSM with any road network dataset. It is based on three Python modules for the open source GRASS GIS software and provides measures of OSM road network spatial accuracy and completeness. Provided that users are familiar with the authoritative dataset used, they can adjust the values of the parameters involved thanks to the flexibility of the procedure. The method is applied to assess the quality of the Paris OSM road network dataset through a comparison against the French official dataset provided by the French National Institute of Geographic and Forest Information (IGN). The results show that the Paris OSM road network has both high completeness and high spatial accuracy. It has a greater length than the IGN road network, and is found to be suitable for applications requiring spatial accuracies up to 5-6 m. The results also confirm the flexibility of the procedure for supporting users in carrying out their own comparisons between OSM and reference road datasets.

  1. Evaluation of the RapidHIT™ 200, an automated human identification system for STR analysis of single source samples.

    Science.gov (United States)

    Holland, Mitchell; Wendt, Frank

    2015-01-01

    The RapidHIT™ 200 Human Identification System was evaluated to determine its suitability for STR analysis of single source buccal swabs. Overall, the RapidHIT™ 200 performed as well as our traditional capillary electrophoresis based method in producing useable profile information on a first-pass basis. General observations included 100% concordance with known profile information, consistent instrument performance after two weeks of buccal swab storage, and an absence of contamination in negative controls. When data analysis was performed by the instrument software, 95.3% of the 85 samples in the reproducibility study gave full profiles. Including the 81 full profiles, a total of 2682 alleles were correctly called by the instrument software, or 98.6% of 2720 possible alleles tested. Profile information was generated from as little as 10,000 nucleated cells, with swab collection technique being a major contributing factor to profile quality. The average peak-height-ratio for heterozygote profiles (81%) was comparable to conventional STR analysis, and while a high analytical threshold was required when offline profile analysis was performed (800 RFU), it was proportionally consistent with traditional methods. Stochastic sampling effects were evaluated, and a manageable approach to address limits of detection for homozygote profiles is provided. These results support consideration of the RapidHIT™ 200 as an acceptable alternative to conventional, laboratory based STR analysis for the testing of single source buccal samples, with review of profile information as a requirement until an expert software system is incorporated, and when proper developmental and internal validation studies have been completed.
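Two of the figures of merit above reduce to simple ratios. A hedged sketch, where only the allele counts (2682 of 2720) and the 81 % average peak-height ratio come from the abstract; the individual peak heights are invented for illustration:

```python
# Hedged sketch of first-pass STR profile metrics: allele concordance and
# heterozygote peak-height ratio (PHR). Peak heights below are hypothetical.

def concordance(called, expected):
    """Fraction of expected alleles correctly called by the instrument."""
    return called / expected

def peak_height_ratio(h1, h2):
    """Smaller peak height over larger, for one heterozygous locus (in RFU)."""
    lo, hi = sorted((h1, h2))
    return lo / hi

rate = concordance(2682, 2720)        # 0.986, the 98.6 % reported
phr = peak_height_ratio(1620, 2000)   # 0.81, like the 81 % average reported
```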

  2. Assessing automated image analysis of sand grain shape to identify sedimentary facies, Gran Dolina archaeological site (Burgos, Spain)

    Science.gov (United States)

    Campaña, I.; Benito-Calvo, A.; Pérez-González, A.; Bermúdez de Castro, J. M.; Carbonell, E.

    2016-12-01

    Gran Dolina is a cave (Sierra de Atapuerca, Spain) infilled by a 25 m thick sedimentary record, divided into 12 lithostratigraphic units that have been separated into 19 sedimentary facies containing Early and Middle Pleistocene hominin remains. In this paper, an automated image analysis method has been used to study the shape of the sedimentary particles. Since particle shape is interpreted as the result of sedimentary transport and sediment source, this study can provide valuable data about the sedimentological mechanisms of sequence formation. The shape of the sand fraction in 73 samples from the Gran Dolina site and Sierra de Atapuerca was analyzed using the Malvern Morphologi G3, an advanced particle characterization tool. In this first complete test, we applied this method to the published sequence of Gran Dolina, defined previously through field observations and geochemical and textural analysis. The results indicate that this image analysis method allows differentiation of the sedimentary facies, providing objective tools to identify weathered layers and measure the textural maturity of the sediments. Channel facies have the highest values of circularity and convexity, showing the highest textural maturity of particles. On the other hand, terra rossa and debris flow samples show similar values, with the lowest particle maturity.

  3. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    Energy Technology Data Exchange (ETDEWEB)

    Walworth, Matthew J [ORNL; ElNaggar, Mariam S [ORNL; Stankovich, Joseph J [ORNL; Witkowski II, Charles E. [Protein Discovery, Inc.; Norris, Jeremy L [ORNL; Van Berkel, Gary J [ORNL

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a Nanomate 100 coupled to an ABI/Sciex 4000QTRAPTM hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limit of detection and quantitation as well as analysis reproducibility figures of merit were measured. Calibration data was obtained for propranolol using a deuterated internal standard which demonstrated linearity and reproducibility. A 10x increase in signal and cleanup of micromolar Angiotensin II from a concentrated salt solution was demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  4. Calibration of a liquid scintillation counter to assess tritium levels in various samples

    CERN Document Server

    Al-Haddad, M N; Abu-Jarad, F A

    1999-01-01

    An LKB-Wallac 1217 Liquid Scintillation Counter (LSC) was calibrated with a newly adopted cocktail. The LSC was then used to measure tritium levels in various samples to assess the compliance of tritium levels with the recommended international levels. The counter was calibrated to measure both biological and operational samples for personnel and for an accelerator facility at KFUPM. The biological samples include the bioassay (urine), saliva, and nasal tests. The operational samples of the light ion linear accelerator include target cooling water, organic oil, fomblin oil, and smear samples. Sets of standards, which simulate various samples, were fabricated using traceable certified tritium standards. The efficiency of the counter was obtained for each sample. The typical range of the efficiencies varied from 33% for smear samples down to 1.5% for organic oil samples. A quenching curve for each sample is presented. The minimum detectable activity for each sample was established. Typical tritium levels in bio...
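A hedged sketch of the counting arithmetic behind such a calibration: efficiency from a traceable standard, and a minimum detectable activity via the standard Currie formula. The count rates and times below are illustrative; only the 33 % smear-sample efficiency figure echoes the abstract:

```python
# Hedged sketch of LSC calibration arithmetic. Not the paper's data: the
# standard activity, background rate and counting time are assumptions.
import math

def efficiency(gross_cpm, bkg_cpm, standard_dpm):
    """Counting efficiency = net count rate / known disintegration rate."""
    return (gross_cpm - bkg_cpm) / standard_dpm

def mda_dpm(bkg_cpm, count_min, eff):
    """Currie MDA (dpm) = (2.71 + 4.65*sqrt(B)) / (eff * t), B = bkg counts."""
    b_counts = bkg_cpm * count_min
    return (2.71 + 4.65 * math.sqrt(b_counts)) / (eff * count_min)

eff = efficiency(3320, 20, 10000)   # 0.33, like the smear-sample efficiency
mda = mda_dpm(20, 60, eff)          # dpm per sample for a 60 min count
```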

  5. MECH: Algorithms and Tools for Automated Assessment of Potential Attack Locations

    Science.gov (United States)

    2015-10-06

    …or was not surveyed when the Russian and U.S. maps were originally created. Population estimates were scraped … tactical risk assessments of a study area. MECH-WPS (MECH Web Portal Server): the back-end processing engine which computes studies in real time.

  6. Unbiased Group-Level Statistical Assessment of Independent Component Maps by Means of Automated Retrospective Matching

    NARCIS (Netherlands)

    Langers, Dave R. M.

    2010-01-01

    This report presents and validates a method for the group-level statistical assessment of independent component analysis (ICA) outcomes. The method is based on a matching of individual component maps to corresponding aggregate maps that are obtained from concatenated data. Group-level statistics are

  7. Automated texture scoring for assessing breast cancer masking risk in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Petersen, Peter Kersten; Lillholm, Martin

    2015-01-01

    PURPOSE: The goal of this work is to develop a method to identify women at high risk for having breast cancer that is easily missed in regular mammography screening. Such a method will provide a rationale for selecting women for adjunctive screening. It goes beyond current risk assessment models...

  8. Test-retest reliability analysis of the Cambridge Neuropsychological Automated Tests for the assessment of dementia in older people living in retirement homes.

    Science.gov (United States)

    Gonçalves, Marta Matos; Pinho, Maria Salomé; Simões, Mário R

    2016-01-01

    The validity of the Cambridge Neuropsychological Automated Tests has been widely studied, but their reliability has not. This study aimed to estimate the test-retest reliability of these tests in a sample of 34 older adults, aged 69 to 90 years old, without neuropsychiatric diagnoses and living in retirement homes in the district of Lisbon, Portugal. The battery was administered twice, with a 4-week interval between sessions. The Paired Associates Learning (PAL), Spatial Working Memory (SWM), Rapid Visual Information Processing, and Reaction Time tests revealed measures with high-to-adequate test-retest correlations (.71-.89), although several PAL and SWM measures showed susceptibility to practice effects. Two estimated standardized regression-based methods were found to be more efficient at correcting for practice effects than a method of fixed correction. We also found weak test-retest correlations (.56-.68) for several measures. These results suggest that some, but not all, measures are suitable for cognitive assessment and monitoring in this population.

  9. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    Science.gov (United States)

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  10. Evaluation of a Portable Automated Serum Chemistry Analyzer for Field Assessment of Harlequin Ducks, Histrionicus histrionicus

    OpenAIRE

    Michael K. Stoskopf; Daniel M. Mulcahy; Daniel Esler

    2010-01-01

    A portable analytical chemistry analyzer was used to make field assessments of wild harlequin ducks (Histrionicus histrionicus) in association with telemetry studies of winter survival in Prince William Sound, Alaska. We compared serum chemistry results obtained on-site with results from a traditional laboratory. Particular attention was paid to serum glucose and potassium concentrations as potential indicators of high-risk surgical candidates based on evaluation of the field data. The median dif...

  11. Vertical Sampling in Recharge Areas Versus Lateral Sampling in Discharge Areas: Assessing the Agricultural Nitrogen Legacy in Groundwater

    Science.gov (United States)

    Gilmore, T. E.; Genereux, D. P.; Solomon, D. K.; Mitasova, H.; Burnette, M.

    2014-12-01

    Agricultural nitrogen (N) is a legacy contaminant often found in shallow groundwater systems. This legacy has commonly been observed using well nests (vertical sampling) in recharge areas, but may also be observed by sampling at points in/beneath a streambed using pushable probes along transects across a channel (lateral sampling). We compared results from two different streambed point sampling approaches and from wells in the recharge area to assess whether the different approaches give fundamentally different pictures of (1) the magnitude of N contamination, (2) historic trends in N contamination, and (3) the extent to which denitrification attenuates nitrate transport through the surficial aquifer. Two different arrangements of streambed points (SP) were used to sample groundwater discharging into a coastal plain stream in North Carolina. In July 2012, a 58 m reach was sampled using closely-spaced lateral transects of SP, revealing high average [NO3-] (808 μM, n=39). In March 2013, transects of SP were widely distributed through a 2.7 km reach that contained the 58 m reach and suggested overall lower [NO3-] (210 μM, n=30), possibly due to variation in land use along the longer study reach. Mean [NO3-] from vertical sampling (2 well nests with 3 wells each) was 296 μM. Groundwater apparent ages from SP in the 58 m and 2.7 km reaches suggested lower recharge [NO3-] (observed [NO3-] plus modeled excess N2) in 0-10 year-old water (1250 μM and 525 μM, respectively), compared to higher recharge [NO3-] from 10-30 years ago (about 1600 μM and 900 μM, respectively). In the wells, [NO3-] was highest (835 μM) in groundwater with apparent age of 12-15 years and declined as apparent age increased, a trend that was consistent with SP in the 2.7 km reach. The 58 m reach suggested elevated recharge [NO3-] (>1100 μM) over a 50-year period. Excess N2 from wells suggested that about 62% of nitrate had been removed via denitrification since recharge, versus 51% and 78
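The denitrification percentages above follow from a simple mass balance: recharge nitrate is reconstructed as observed nitrate plus excess N2 (as N), and the removed fraction is the excess N2 share of that total. A hedged sketch with illustrative concentrations:

```python
# Hedged sketch of the denitrification mass balance implied above. The
# input concentrations are invented; only the ~62 % removal figure and the
# recharge-NO3 reconstruction (observed NO3- + excess N2) echo the abstract.

def fraction_denitrified(no3_observed, excess_n2):
    """Fraction of recharge nitrate removed; both inputs in µmol/L as N."""
    return excess_n2 / (no3_observed + excess_n2)

# Illustrative well: recharge NO3- of 835 µM with ~62 % removed since recharge
frac = fraction_denitrified(no3_observed=317, excess_n2=518)
```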

  12. Accounting Automation

    OpenAIRE

    Laynebaril1

    2017-01-01

    Accounting Automation. “Please respond to the following: Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response. ...

  13. Automated modal tracking and fatigue assessment of a wind turbine based on continuous dynamic monitoring

    Directory of Open Access Journals (Sweden)

    Oliveira Gustavo

    2015-01-01

    Full Text Available The paper describes the implementation of a dynamic monitoring system at a 2.0 MW onshore wind turbine. The system is composed of two components aimed at structural integrity and fatigue assessment. The first component enables the continuous tracking of modal characteristics of the wind turbine (natural frequency values, modal damping ratios and mode shapes) in order to detect abnormal deviations of these properties, which may be caused by the occurrence of structural damage. The second component allows the estimation of the remaining fatigue lifetime of the structure based on the analysis of the measured cycles of structural vibration.

  14. Efficiency comparisons of fish sampling gears for a lentic ecosystem health assessments in Korea

    Directory of Open Access Journals (Sweden)

    Jeong-Ho Han

    2016-12-01

    Full Text Available The key objective of this study was to analyze the sampling efficiency of various fish sampling gears for a lentic ecosystem health assessment. A fish survey for the lentic ecosystem health assessment model was conducted twice at each of 30 reservoirs during 2008–2012. During the study, 81 fish species comprising 53,792 individuals were sampled from the 30 reservoirs. A comparison of sampling gears showed that casting nets were the best sampling gear with high species richness (69 species), whereas minnow traps were the worst gear with low richness (16 species). Fish sampling efficiency, based on the number of individuals caught per unit effort, was best for fyke nets (28,028 individuals) and worst for minnow traps (352 individuals). When we compared trammel nets and kick nets versus fyke nets and casting nets, the former were useful in terms of the number of fish individuals but not in terms of the number of fish species.
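The gear ranking above rests on catch per unit effort (CPUE). A hedged sketch: the catch totals come from the abstract, while the equal-effort assumption and the effort units are ours, for illustration only:

```python
# Hedged sketch of a CPUE comparison across sampling gears. Catch totals are
# from the abstract; the effort value (60 gear-deployments) is an assumption.

def cpue(total_catch, effort_units):
    """Catch per unit effort: individuals caught per deployment."""
    return total_catch / effort_units

fyke = cpue(28028, effort_units=60)    # best gear by individuals caught
minnow = cpue(352, effort_units=60)    # worst gear by individuals caught
ratio = fyke / minnow                  # fyke nets caught ~80x more fish
```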

  15. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time designed and implemented software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  16. Assessing breast cancer masking risk in full field digital mammography with automated texture analysis

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lillholm, Martin; Diao, Pengfei

    2015-01-01

    Purpose: The goal of this work is to develop a method to assess the risk of breast cancer masking, based on image characteristics beyond breast density. Method: From the Dutch breast cancer screening program we collected 285 screen-detected cancers, and 109 cancers that were screen negative … to determine cancer detection status in a five-fold cross validation. To assess the interaction of the texture scores with breast density, Volpara Density Grade was determined for each image. Results: We grouped women into low (VDG 1/2) versus high (VDG 3/4) dense, and low (Quartile 1/2) versus high (Q 3/4) texture risk score. We computed odds ratios for breast cancer masking risk (i.e. interval versus screen-detected cancer) for each of the subgroups. The odds ratio was 1.63 (1.04-2.53 95% CI) in the high dense group (as compared to the low dense group), whereas for the high texture score group (as compared …

  17. Assessing breast cancer masking risk with automated texture analysis in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lillholm, Martin; Diao, Pengfei

    2015-01-01

    status in a five-fold cross validation. To assess the interaction of the texture scores with breast density, Volpara Density Grade (VDG) was determined for each image using Volpara, Matakina Technology, New Zealand. RESULTS We grouped women into low (VDG 1/2) versus high (VDG 3/4) dense, and low...... for the high texture score group (as compared to the low texture score group) this OR was 2.19 (1.37-3.49). Women who were classified as low dense but had a high texture score had a higher masking risk (OR 1.66 (0.53-5.20)) than women with dense breasts but a low texture score. CONCLUSION Mammographic texture...... is associated with breast cancer masking risk. We were able to identify a subgroup of women who are at an increased risk of having a cancer that is not detected due to textural masking, even though their breasts are non-dense. CLINICAL RELEVANCE/APPLICATION Automatic texture analysis enables assessing the risk...

  18. Development and Assessment of an Automated High-Resolution InSAR Volcano-Monitoring System

    Science.gov (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-08-01

    Monitoring volcanoes and volcanic areas using synthetic aperture radar (SAR) data is a well-established method of risk assessment. However, acquisition planning, ordering, and downloading are a time- and work-intensive, yet inevitable, process. It has to be done not just once before the actual processing; for continuous monitoring it poses a continuous and expensive effort. Therefore, an automatic acquisition and processing system has been developed at DLR, which allows pseudo-continuous processing of data sequences over the test site and is also applicable to any optional test-site extension, including an increase of data volume. This system reduces the load of manual work necessary to perform interferometric stacking and quickly yields first information on evolving geophysical processes at, but not limited to, the Italian supersites.

  19. Automated texture scoring for assessing breast cancer masking risk in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Petersen, Kersten; Lilholm, Martin;

    validation. To assess the independency of the texture scores of breast density, density was determined for each image using Volpara. RESULTS: The odds ratios for interval cancer were 1.59 (95%CI: 0.76-3.32), 2.07 (1.02-4.20), and 3.14 (1.60-6.17) for quartile 2, 3 and 4 respectively, relative to quartile 1....... Correlation between the texture scores and breast density was 0.59 (0.52-0.64). Breast density adjusted odds ratios, as determined with logistic regression, were 1.49 (0.71-3.13), 1.58 (0.75-3.33), and 1.97 (0.91-4.27). CONCLUSIONS: The CSAE texture score is independently associated with the risk of having...

  20. Strong Prognostic Value of Tumor-infiltrating Neutrophils and Lymphocytes Assessed by Automated Digital Image Analysis in Early Stage Cervical Cancer

    DEFF Research Database (Denmark)

    Carus, Andreas; Donskov, Frede; Switten Nielsen, Patricia;

    2014-01-01

    INTRODUCTION Manual observer-assisted stereological (OAS) assessments of tumor-infiltrating neutrophils and lymphocytes are prognostic, accurate, but cumbersome. We assessed the applicability of automated digital image analysis (DIA). METHODS Visiomorph software was used to obtain DIA densities...... to lymphocyte (TA–NL) index accurately predicted the risk of relapse, ranging from 8% to 52% (P = 0.001). CONCLUSIONS DIA is a potential assessment technique. The TA–NL index obtained by DIA is a strong prognostic variable with possible routine clinical application....

  1. Respondent-driven sampling to assess outcomes of sexual violence: a methodological assessment.

    Science.gov (United States)

    Greiner, Ashley L; Albutt, Katherine; Rouhani, Shada A; Scott, Jennifer; Dombrowski, Kirk; VanRooyen, Michael J; Bartels, Susan A

    2014-09-01

    Sexual violence is pervasive in eastern Democratic Republic of Congo (DRC). Survivors of sexual violence encounter numerous challenges, and women with a sexual violence-related pregnancy (SVRP) face even more complex sequelae. Because of the stigma associated with SVRP, there is no conventional sampling frame and, therefore, a paucity of research on SVRP outcomes. Respondent-driven sampling (RDS), used to study this "hidden" population, uses a peer recruitment sampling system that maintains strict participant privacy and controls and tracks recruitment. If RDS assumptions are met and the sample attains equilibrium, sample weights to correct for biases associated with traditional chain referral sampling can be calculated. Questionnaires were administered to female participants who were raising a child from a SVRP and/or who terminated a SVRP. A total of 852 participants were recruited from October 9, 2012, to November 7, 2012. There was rapid recruitment, and there were long referral chains. The majority of the variables reached equilibrium; thus, trends established in the sample population reflected the target population's trends. To our knowledge, this is the first study to use RDS to study outcomes of sexual violence. RDS was successfully applied to this population and context and should be considered as a sampling methodology in future sexual violence research.
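The "equilibrium" condition this record checks comes from modeling peer recruitment as a Markov chain over respondent groups: the sample reaches equilibrium when group proportions approach the chain's stationary distribution, regardless of how the seeds were chosen. A sketch with a two-group transition matrix (the matrix is illustrative, not study data):

```python
import numpy as np

# RDS equilibrium sketch: recruitment as a Markov chain over two groups.
# Rows are the recruiter's group; columns give the next recruit's group.
T = np.array([[0.7, 0.3],    # P(next recruit in A or B | recruiter in A)
              [0.4, 0.6]])   # P(next recruit in A or B | recruiter in B)

dist = np.array([1.0, 0.0])  # biased start: all seeds drawn from group A
for _ in range(50):          # iterate recruitment waves
    dist = dist @ T

# Closed-form stationary distribution of a two-state chain
stationary = np.array([T[1, 0], T[0, 1]]) / (T[0, 1] + T[1, 0])
```

After enough waves the wave-by-wave composition converges to the stationary distribution, which is why long referral chains (as reported here) support the equilibrium assumption.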

  2. An automated model for rooftop PV systems assessment in ArcGIS using LIDAR

    Directory of Open Access Journals (Sweden)

    Mesude Bayrakci Boz

    2015-08-01

    Full Text Available As photovoltaic (PV) systems have become less expensive, building rooftops have come to be attractive for local power production. Identifying rooftops suitable for solar energy systems over large geographic areas is needed for cities to obtain more accurate assessments of production potential and likely patterns of development. This paper presents a new method for extracting roof segments and locating suitable areas for PV systems using Light Detection and Ranging (LIDAR) data and building footprints. Rooftop segments are created using seven slope (tilt) classes, five aspect (azimuth) classes, and six different building types. Moreover, direct beam shading caused by nearby objects and the surrounding terrain is taken into account on a monthly basis. Finally, the method is implemented as an ArcGIS model in ModelBuilder and a tool is created. In order to show its validity, the method is applied to the city of Philadelphia, PA, USA, with the criteria of slope, aspect, shading, and area used to locate suitable areas for PV system installation. The results show that 33.7% of the building footprint area and 48.6% of the rooftop segments identified are suitable for PV systems. Overall, this study provides a replicable model using commercial software that is capable of extracting individual roof segments with more detailed criteria across an urban area.
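The slope and aspect classes this record builds on are derived from the LIDAR surface model. A minimal sketch of that derivation on a gridded surface (the grid orientation, bin edges, and numpy-based workflow are assumptions; the record uses ArcGIS tooling):

```python
import numpy as np

# Sketch: slope (tilt) and aspect (azimuth) from a gridded surface model.
def slope_aspect(dsm, cell_size=1.0):
    """Slope in degrees and downslope compass aspect.
    Assumes row index increases northward, column index increases eastward."""
    dz_dn, dz_de = np.gradient(dsm, cell_size)   # northward, eastward gradients
    slope = np.degrees(np.arctan(np.hypot(dz_de, dz_dn)))
    aspect = np.degrees(np.arctan2(-dz_de, -dz_dn)) % 360.0  # 0 = north, clockwise
    return slope, aspect

# A plane rising 1 m per metre toward the east: 45 degree tilt, west-facing (270)
dsm = np.tile(np.arange(5, dtype=float), (5, 1))
slope, aspect = slope_aspect(dsm)

# Bin slope into seven classes, as the record's method does (edges are assumptions)
slope_class = np.digitize(slope, [5, 15, 25, 35, 45, 55])
```

Binning the continuous slope and aspect rasters into discrete classes is what lets the method merge adjacent cells into roof segments.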

  3. Measurement of acceleration while walking as an automated method for gait assessment in dairy cattle

    DEFF Research Database (Denmark)

    Chapinal, N.; de Passillé, A.M.; Pastell, M.;

    2011-01-01

    -dimensional accelerometers, 1 attached to each leg and 1 to the back, and acceleration data were collected while cows walked in a straight line on concrete (experiment 1) or on both concrete and rubber (experiment 2). Cows were video-recorded while walking to assess overall gait, asymmetry of the steps...... on a 5-point scale] and lower scores for asymmetry of the steps (18.0 vs. 23.1; SED = 2.2, measured on a continuous 100-unit scale) when they walked on rubber compared with concrete, and their walking speed increased (1.28 vs. 1.22 m/s; SED = 0.02). The acceleration of the front (1.67 vs. 1.72 g; SED = 0.......02) and rear (1.62 vs. 1.67 g; SED = 0.02) legs and the variance of acceleration of the rear legs (0.88 vs. 0.94 g; SED = 0.03) were lower when cows walked on rubber compared with concrete. Despite the improvements in gait score that occurred when cows walked on rubber, the asymmetry of variance...

  4. Automated Soil Physical Parameter Assessment Using Smartphone and Digital Camera Imagery

    Directory of Open Access Journals (Sweden)

    Matt Aitkenhead

    2016-12-01

    Full Text Available Here we present work on using different types of soil profile imagery (topsoil profiles captured with a smartphone camera and full-profile images captured with a conventional digital camera) to estimate the structure, texture and drainage of the soil. The method is adapted from earlier work on developing smartphone apps for estimating topsoil organic matter content in Scotland and uses an existing visual soil structure assessment approach. Colour and image texture information was extracted from the imagery. This information was linked, using geolocation information derived from the smartphone GPS system or from field notes, with existing collections of topography, land cover, soil and climate data for Scotland. A neural network model was developed that was capable of estimating soil structure (on a five-point scale), soil texture (sand, silt, clay), bulk density, pH and drainage category using this information. The model is sufficiently accurate to provide estimates of these parameters from soils in the field. We discuss potential improvements to the approach and plans to integrate the model into a set of smartphone apps for estimating health and fertility indicators for Scottish soils.
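A sketch of the kind of colour and image-texture descriptors such a model could consume. The feature names and the gradient-energy texture proxy are assumptions; the record's actual descriptor set is richer:

```python
import numpy as np

# Sketch: simple per-channel colour statistics plus a crude roughness proxy,
# the sort of inputs a soil-property model might take from profile imagery.
def image_features(rgb):
    """rgb: H x W x 3 float array with values in [0, 1]."""
    feats = {}
    for i, band in enumerate("rgb"):
        feats[f"mean_{band}"] = float(rgb[..., i].mean())
        feats[f"std_{band}"] = float(rgb[..., i].std())
    gray = rgb.mean(axis=2)
    gy, gx = np.gradient(gray)
    feats["texture_energy"] = float(np.mean(gx**2 + gy**2))  # roughness proxy
    return feats

rng = np.random.default_rng(0)
feats = image_features(rng.random((32, 32, 3)))  # synthetic stand-in image
```

In the record's pipeline these image features would be concatenated with the geolocated topography, land cover, soil and climate covariates before training.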

  5. Evaluation of a Portable Automated Serum Chemistry Analyzer for Field Assessment of Harlequin Ducks, Histrionicus histrionicus.

    Science.gov (United States)

    Stoskopf, Michael K; Mulcahy, Daniel M; Esler, Daniel

    2010-01-01

    A portable analytical chemistry analyzer was used to make field assessments of wild harlequin ducks (Histrionicus histrionicus) in association with telemetry studies of winter survival in Prince William Sound, Alaska. We compared serum chemistry results obtained on-site with results from a traditional laboratory. Particular attention was paid to serum glucose and potassium concentrations as potential indicators of high-risk surgical candidates based on evaluation of the field data. The median differential for glucose values (N = 82) between methods was 0.6 mmol/L (quartiles 0.3 and 0.9 mmol/L) with the median value higher when assayed on site. Analysis of potassium on site returned a median of 2.7 mmol/L (N = 88; quartiles 2.4 and 3.0 mmol/L). Serum potassium values were too low for quantitation by the traditional laboratory. Changes in several serum chemistry values following a three-day storm during the study support the value of on site evaluation of serum potassium to identify presurgical patients with increased anesthetic risk.
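The "median differential" with quartiles reported above is a paired-difference summary between the field analyzer and the reference laboratory. A minimal sketch (the glucose values below are invented, not the study's measurements):

```python
import statistics

# Sketch: median and quartiles of paired (field - lab) differences,
# the summary used to compare the portable analyzer with the laboratory.
def paired_difference_summary(field, lab):
    diffs = sorted(f - l for f, l in zip(field, lab))
    q1, median, q3 = statistics.quantiles(diffs, n=4)  # exclusive method
    return {"q1": q1, "median": median, "q3": q3}

field_glucose = [12.1, 11.8, 13.0, 12.6, 11.5, 12.9]  # mmol/L, hypothetical
lab_glucose   = [11.5, 11.4, 12.2, 12.0, 11.1, 12.1]
summary = paired_difference_summary(field_glucose, lab_glucose)
```

A consistently positive median difference, as in the record, flags a systematic offset between methods rather than random scatter.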

  6. A SEMI - AUTOMATED MORPHOMETRIC ASSESSMENT OF NUCLEI IN PAP SMEARS USING IMAGEJ

    Directory of Open Access Journals (Sweden)

    Vijayashree

    2015-04-01

    Full Text Available BACKGROUND AND PURPOSE: Carcinoma of the cervix is the fourth commonest malignancy in women. Its incidence is progressively falling due to the routine use of Pap smears to detect precancerous lesions. However, routine Pap smear examination is time consuming and, as it is based on descriptive morphological assessment, false positive or negative reports are likely to occur. Using morphometric techniques, several attempts have been made to improve the accuracy of reports. In the present study, we have used the ImageJ morphometric software and some of its plugins to create a macro to analyse a large number of cells at a time. MATERIALS AND METHODS: Using ImageJ and three of its plugins, namely BEEPS, the Kuwahara filter and the Mexican Hat filter, we created a macro to morphometrically analyse normal, reactive and neoplastic Pap smears. We also compared the macro measurements with manual measurements. RESULTS AND CONCLUSIONS: Results obtained with the macro showed strong positive correlation with manual measurement. Although the neoplastic nuclei were on average larger than reactive/normal nuclei, there was considerable overlap. More than the enlargement, anisonucleosis (variability in size) appeared to be a better indicator of neoplasia. The macro that we developed works rapidly and gives results comparable to manual measurements, provided the smears and the photographs are technically acceptable.
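The morphometric idea behind the macro can be sketched outside ImageJ: measure nuclear areas in a binarized image and quantify anisonucleosis as the spread of those areas. A minimal sketch with a tiny synthetic mask standing in for a thresholded Pap-smear photograph (the coefficient-of-variation indicator is an assumption, not the paper's exact metric):

```python
import numpy as np
from scipy import ndimage

# Sketch: label connected components in a binary mask and measure their areas.
mask = np.zeros((12, 12), dtype=int)
mask[1:3, 1:3] = 1      # small synthetic "nucleus", area 4 px
mask[5:9, 5:9] = 1      # large synthetic "nucleus", area 16 px

labels, n_nuclei = ndimage.label(mask)
areas = ndimage.sum(mask, labels, index=list(range(1, n_nuclei + 1)))
cv = float(areas.std() / areas.mean())  # anisonucleosis proxy: spread of areas
```

A high coefficient of variation of nuclear area captures the record's conclusion that size variability, more than enlargement itself, separates neoplastic from reactive smears.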

  7. Usability of a virtual reality environment simulating an automated teller machine for assessing and training persons with acquired brain injury

    Directory of Open Access Journals (Sweden)

    Li Teresa HY

    2010-04-01

    Full Text Available Abstract Objective This study aimed to examine the usability of a newly designed virtual reality (VR) environment simulating the operation of an automated teller machine (ATM) for assessment and training. Design Part I involved evaluation of the sensitivity and specificity of a non-immersive VR program simulating an ATM (VR-ATM). Part II consisted of a clinical trial providing baseline and post-intervention outcome assessments. Setting A rehabilitation hospital and university-based teaching facilities were used as the setting. Participants A total of 24 persons in the community with acquired brain injury (ABI), 14 in Part I and 10 in Part II, made up the participants in the study. Interventions In Part I, participants were randomized to receive instruction in either an "early" or a "late" VR-ATM program and were assessed using both the VR program and a real ATM. In Part II, participants were assigned in matched pairs to either VR training or computer-assisted instruction (CAI) teaching programs for six 1-hour sessions over a three-week period. Outcome Measures Two behavioral checklists based on activity analysis of cash withdrawals and money transfers using a real ATM were used to measure average reaction time, percentage of incorrect responses, level of cues required, and time spent as generated by the VR system; also used was the Neurobehavioral Cognitive Status Examination. Results The sensitivity of the VR-ATM was 100% for cash withdrawals and 83.3% for money transfers, and the specificity was 83% and 75%, respectively. For cash withdrawals, the average reaction time of the VR group was significantly shorter than that of the CAI group (p = 0.021). We found no significant differences in average reaction time or accuracy between groups for money transfers, although we did note positive improvement for the VR-ATM group.
Conclusion We found the VR-ATM to be usable as a valid assessment and training tool for relearning the use of ATMs prior to real
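The sensitivity and specificity figures above follow from comparing the VR assessment against the real-ATM checklist treated as ground truth. A minimal sketch with invented counts, chosen only to land near the money-transfer figures:

```python
# Sketch: sensitivity and specificity from confusion-matrix counts.
# tp/fn/tn/fp values are illustrative, not the study's raw data.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)  # true positives among actual positives
    specificity = tn / (tn + fp)  # true negatives among actual negatives
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=5, fn=1, tn=6, fp=2)
```

With these counts the sketch reproduces 83.3% sensitivity and 75% specificity, the same order as the record's money-transfer task.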

  8. Development and Validation of an Admission Test Designed to Assess Samples of Performance on Academic Tasks

    Science.gov (United States)

    Tanilon, Jenny; Segers, Mien; Vedder, Paul; Tillema, Harm

    2009-01-01

    This study illustrates the development and validation of an admission test, labeled as Performance Samples on Academic Tasks in Educational Sciences (PSAT-Ed), designed to assess samples of performance on academic tasks characteristic of those that would eventually be encountered by examinees in an Educational Sciences program. The test was based…

  9. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    Science.gov (United States)

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…
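One classic measure that language-sample tools such as SALT report is mean length of utterance (MLU). A minimal sketch counting words rather than morphemes (SALT also supports morpheme-level counts; the transcript is invented):

```python
# Sketch: mean length of utterance (MLU) in words from a transcript.
def mean_length_of_utterance(utterances):
    word_counts = [len(u.split()) for u in utterances]
    return sum(word_counts) / len(word_counts)

sample = [                         # hypothetical adolescent language sample
    "we went to the game on friday",
    "it was really loud",
    "my brother came too",
]
mlu = mean_length_of_utterance(sample)
```

Against a reference database of age-matched samples, such a measure helps flag spoken-language production that falls outside typical ranges.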

  10. Assessment of Automated Snow Cover Detection at High Solar Zenith Angles with PROBA-V

    Directory of Open Access Journals (Sweden)

    Florent Hawotte

    2016-08-01

    Full Text Available Changes in the snow cover extent are both a cause and a consequence of climate change. Optical remote sensing with heliosynchronous satellites currently provides snow cover data at high spatial resolution with daily revisiting time. However, high-latitude image acquisition is limited because the reflective sensors of many satellites are switched off at high solar zenith angles (SZA) due to lower signal quality. In this study, the relevance and reliability of high-SZA acquisition are objectively quantified for the purpose of high-latitude snow cover detection, thanks to the PROBA-V (Project for On-Board Autonomy-Vegetation) satellite. A snow cover extent classification based on the Normalized Difference Snow Index (NDSI) and the Normalized Difference Vegetation Index (NDVI) was performed for the northern hemisphere at latitudes between 55°N and 75°N during the 2015–2016 winter season. A stratified probabilistic sampling was used to estimate the classification accuracy. The latter was evaluated over eight SZA intervals to determine the maximum usable angle. The global overall snow classification accuracy with PROBA-V, 82% ± 4%, was significantly larger than that of the MODIS (Moderate-resolution Imaging Spectroradiometer) snow cover extent product (75% ± 4%). User and producer accuracy of snow are above standards and overall accuracy is stable up to 88.5° SZA. These results demonstrate that optical remote sensing data can still be used at large SZA. Considering the relevance of snow cover mapping for ecology and climatology, data acquisition at high solar zenith angles should be continued by PROBA-V.
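The NDSI/NDVI decision rule this record relies on can be sketched as follows. Band values and thresholds are illustrative; the operational PROBA-V classification rules differ in detail:

```python
import numpy as np

# Sketch: snow test combining NDSI (snow is bright in green, dark in SWIR)
# with an NDVI cap to reject vegetated pixels. Thresholds are assumptions.
def classify_snow(green, red, nir, swir, ndsi_min=0.4, ndvi_max=0.3):
    ndsi = (green - swir) / (green + swir)
    ndvi = (nir - red) / (nir + red)
    return (ndsi > ndsi_min) & (ndvi < ndvi_max)

# Two pixels: bright snow (high green, low SWIR) and vegetation
green = np.array([0.80, 0.10])
red   = np.array([0.75, 0.08])
nir   = np.array([0.70, 0.50])
swir  = np.array([0.10, 0.20])
snow = classify_snow(green, red, nir, swir)
```

Because both indices are reflectance ratios, they are comparatively robust to the lower absolute signal levels encountered at high solar zenith angles, which is part of why the classification stays usable up to large SZA.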

  11. RNA extracted from blood samples with a rapid automated procedure is fit for molecular diagnosis or minimal residual disease monitoring in patients with a variety of malignant blood disorders.

    Science.gov (United States)

    Bechlian, Didier; Honstettre, Amélie; Terrier, Michèle; Brest, Christelle; Malenfant, Carine; Mozziconacci, Marie-Joëlle; Chabannon, Christian

    2009-06-01

    Scientific studies in oncology, cancer diagnosis, and monitoring tumor response to therapeutics currently rely on a growing number of clinico-pathological information. These often include molecular analyses. The quality of these analyses depends on both pre-analytical and analytical information and often includes the extraction of DNA and/or RNA from human tissues and cells. The quality and quantity of obtained nucleic acids are of utmost importance. The use of automated techniques presents several advantages over manual techniques, such as reducing technical time and thus cost, and facilitating standardization. The purpose of this study was to validate an automated technique for RNA extraction from cells of patients treated for various malignant blood diseases. A well-established manual technique was compared to an automated technique, in order to extract RNA from blood samples drawn for the molecular diagnosis of a variety of leukemic diseases or monitoring of minimal residual disease. The quality of the RNA was evaluated by real-time quantitative RT-PCR (RQ-PCR) analyses of the Abelson gene transcript. The results show that both techniques produce RNA with comparable quality and quantity, thus suggesting that an automated technique can be substituted for the reference and manual technique used in the daily routine of a molecular pathology laboratory involved in minimal residual disease monitoring. Increased costs of reagents and disposables used for automated techniques can be compensated by a decrease in human resource.

  12. Contaminant analysis automation demonstration proposal

    Energy Technology Data Exchange (ETDEWEB)

    Dodson, M.G.; Schur, A.; Heubach, J.G.

    1993-10-01

    The nation-wide and global need for environmental restoration and waste remediation (ER&WR) presents significant challenges to the analytical chemistry laboratory. The expansion of ER&WR programs forces an increase in the volume of samples processed and the demand for analysis data. To handle this expanding volume, productivity must be increased. However, the need for significantly increased productivity runs up against a contaminant analysis process that is costly in time, labor, equipment, and safety protection. Laboratory automation offers a cost-effective approach to meeting current and future contaminant analytical laboratory needs. The proposed demonstration will present a proof-of-concept automated laboratory conducting varied sample preparations. This automated process also highlights a graphical user interface that provides supervisory control and monitoring of the automated process. The demonstration provides affirming answers to the following questions about laboratory automation: Can preparation of contaminants be successfully automated? Can a full-scale working proof-of-concept automated laboratory be developed that is capable of preparing contaminant and hazardous chemical samples? Can the automated processes be seamlessly integrated and controlled? Can the automated laboratory be customized through readily convertible design? And can automated sample preparation concepts be extended to the other phases of the sample analysis process? To fully reap the benefits of automation, four human factors areas should be studied and the outputs used to increase the efficiency of laboratory automation. These areas include: (1) laboratory configuration, (2) procedures, (3) receptacles and fixtures, and (4) the human-computer interface for the fully automated system and complex laboratory information management systems.

  13. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  14. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    Energy Technology Data Exchange (ETDEWEB)

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    2011-08-01

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  15. Testing of an automated online EA-IRMS method for fast and simultaneous carbon content and stable isotope measurement of aerosol samples

    Science.gov (United States)

    Major, István; Gyökös, Brigitta; Túri, Marianna; Futó, István; Filep, Ágnes; Hoffer, András; Molnár, Mihály

    2016-04-01

    Comprehensive atmospheric studies have demonstrated that carbonaceous aerosol is one of the main components of atmospheric particulate matter over Europe. Various methods, considering optical or thermal properties, have been developed for quantification of the accurate amount of both organic and elemental carbon constituents of atmospheric aerosol. The aim of our work was to develop an alternative, fast and easy method for determination of the total carbon content of individual aerosol samples collected on prebaked quartz filters, whereby the mass and surface concentration become simply computable. We applied the conventional "elemental analyzer (EA) coupled online with an isotope ratio mass spectrometer (IRMS)" technique, which is ubiquitously used in mass spectrometry. Using this technique we are able to measure the carbon stable isotope ratio of the samples simultaneously as well. During the development process, we compared the EA-IRMS technique with an off-line catalytic combustion method worked out previously at the Hertelendi Laboratory of Environmental Studies (HEKAL). We tested the combined online total carbon content and stable isotope ratio measurement both on standard materials and real aerosol samples. Regarding the test results, the novel method assures, on the one hand, a carbon recovery yield of at least 95% over a broad total carbon mass range (between 100 and 3000 ug) and, on the other hand, a good reproducibility of stable isotope measurements with an uncertainty of ±0.2 per mil. Comparing the total carbon results obtained by the EA-IRMS and the off-line catalytic combustion method, we found a very good correlation (R2 = 0.94), which proves the applicability of both preparation methods. Advantages of the novel method are the fast and simplified sample preparation steps and the fully automated, simultaneous carbon stable isotope ratio measurement processes. Furthermore, stable isotope ratio results can effectively be applied in source apportionment
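The two quantities the EA-IRMS setup reports can be sketched directly: carbon recovery yield against the expected carbon mass, and the stable isotope ratio expressed in delta notation relative to the VPDB standard. The numeric inputs below are illustrative, not the study's measurements:

```python
# Sketch: recovery yield and delta-13C in per mil relative to VPDB.
R_VPDB = 0.0111802  # assumed 13C/12C ratio of the VPDB reference standard

def recovery_yield(measured_ug_c, expected_ug_c):
    """Recovered carbon as a percentage of the expected carbon mass."""
    return 100.0 * measured_ug_c / expected_ug_c

def delta13c_permil(r_sample, r_std=R_VPDB):
    """Delta notation: relative deviation of the sample ratio, in per mil."""
    return (r_sample / r_std - 1.0) * 1000.0

yield_pct = recovery_yield(measured_ug_c=960.0, expected_ug_c=1000.0)
d13c = delta13c_permil(0.0108900)  # hypothetical sample ratio
```

A yield above the record's 95% floor and a delta value in the typical range for combustion-derived carbon would pass the method's acceptance checks.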

  16. Fully automated ionic liquid-based headspace single drop microextraction coupled to GC-MS/MS to determine musk fragrances in environmental water samples.

    Science.gov (United States)

    Vallecillos, Laura; Pocurull, Eva; Borrull, Francesc

    2012-09-15

    A fully automated ionic liquid-based headspace single drop microextraction (IL-HS-SDME) procedure has been developed for the first time to preconcentrate trace amounts of ten musk fragrances extensively used in personal care products (six polycyclic musks, three nitro musks and one polycyclic musk degradation product) from wastewater samples prior to analysis by gas chromatography and ion trap tandem mass spectrometry (GC-IT-MS/MS). Due to the low volatility of the ILs, a large internal diameter liner (3.4 mm i.d.) was used to improve the ILs evaporation. Furthermore, a piece of glass wool was introduced into the liner to avoid the entrance of the ILs in the GC column and a guard column was used to prevent analytical column damages. The main factors influencing the IL-HS-SDME were optimized. For all species, the highest enrichments factors were achieved using 1 μL of 1-octyl-3-methylimidazolium hexafluorophosphate ([OMIM][PF(6)]) ionic liquid exposed in the headspace of 10 mL water samples containing 300 g L(-1) of NaCl and stirred at 750 rpm and 60 °C for 45 min. All compounds were determined by direct injection GC-IT-MS/MS with a chromatographic time of 19 min. Method detection limits were found in the low ng mL(-1) range between 0.010 ng mL(-1) and 0.030 ng mL(-1) depending on the target analytes. Also, under optimized conditions, the method gave good levels of intra-day and inter-day repeatabilities in wastewater samples with relative standard deviations varying between 3% and 6% and 5% and 11%, respectively (n=3, 1 ng mL(-1)). The applicability of the method was tested with different wastewater samples from influent and effluent urban wastewater treatment plants (WWTPs) and one potable treatment plant (PTP). 
The analysis of influent urban wastewater revealed the presence of galaxolide and tonalide at concentrations of between 2.10 ng mL(-1) and 0.29 ng mL(-1) and 0.32 ng mL(-1); in the waters from the PTP, only galaxolide was found, at a concentration higher than the MQL.

  17. Soil sampling strategies for site assessments in petroleum-contaminated areas.

    Science.gov (United States)

    Kim, Geonha; Chowdhury, Saikat; Lin, Yen-Min; Lu, Chih-Jen

    2017-04-01

    Environmental site assessments are frequently executed for monitoring and remediation performance evaluation purposes, especially in total petroleum hydrocarbon (TPH)-contaminated areas, such as gas stations. As a key issue, reproducibility of the assessment results must be ensured, especially if attempts are made to compare results between different institutions. Although it is widely known that uncertainties associated with soil sampling are much higher than those with chemical analyses, field guides or protocols to deal with these uncertainties are not stipulated in detail in the relevant regulations, causing serious errors and distortion of the reliability of environmental site assessments. In this research, uncertainties associated with soil sampling and sample reduction for chemical analysis were quantified using laboratory-scale experiments and the theory of sampling. The research results showed that the TPH mass assessed by sampling tends to be overestimated and sampling errors are high, especially for the low range of TPH concentrations. Homogenization of soil was found to be an efficient method to suppress uncertainty, but high-resolution sampling could be an essential way to minimize this.
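The "theory of sampling" invoked above is Pierre Gy's framework, in which the relative variance of the fundamental sampling error grows as sample mass shrinks, one reason low-concentration TPH results scatter so badly. A sketch of Gy's formula; the factor values are illustrative placeholders, not calibrated soil parameters:

```python
# Sketch of Gy's fundamental sampling error (theory of sampling).
# f, g, c, lib: shape, granulometric, constitution and liberation factors
# (values below are placeholders, not calibrated for any real soil).
def gy_relative_variance(d_cm, m_sample_g, m_lot_g, f=0.5, g=0.25, c=1.0e3, lib=1.0):
    """Relative variance of the fundamental sampling error.
    d_cm: top particle size in cm; masses in grams."""
    return f * g * c * lib * d_cm**3 * (1.0 / m_sample_g - 1.0 / m_lot_g)

small = gy_relative_variance(d_cm=0.2, m_sample_g=10.0, m_lot_g=1.0e5)
large = gy_relative_variance(d_cm=0.2, m_sample_g=100.0, m_lot_g=1.0e5)
```

The tenfold larger variance of the 10 g sample relative to the 100 g sample illustrates why homogenization and larger or higher-resolution sampling suppress the uncertainty the record measures.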

  18. Comparison of sampling methods for the assessment of indoor microbial exposure

    DEFF Research Database (Denmark)

    Frankel, M; Timm, Michael; Hansen, E W

    2012-01-01

    Abstract Indoor microbial exposure has been related to allergy and respiratory disorders. However, the lack of standardized sampling methodology is problematic when investigating dose-response relationships between exposure and health effects. In this study, different sampling methods were compared...... with those from GSP. Settled dust from the EDC was most representative of airborne dust and may thus be considered as a surrogate for the assessment of indoor airborne microbial exposure. PRACTICAL IMPLICATIONS: Significant discrepancies between sampling methods regarding indoor microbial exposures have been...... regarding their assessment of microbial exposures, including culturable fungi and bacteria, endotoxin, as well as the total inflammatory potential (TIP) of dust samples from Danish homes. The Gesamtstaubprobenahme (GSP) filter sampler and BioSampler were used for sampling of airborne dust, whereas the dust...

  19. Testing automated liquid-based cytology samples with a manual liquid-based cytology method using residual cell suspensions from 500 ThinPrep cases.

    Science.gov (United States)

    Maksem, John A; Dhanwada, Vijaya; Trueblood, Joy E; Weidmann, James; Kane, Bruce; Bolick, David R; Bedrossian, Carlos W M; Kurtycz, Daniel F I; Stewart, Jim

    2006-06-01

    We report a technical improvement upon a previously disclosed manual liquid-based cytology (MLBC) method; and, we use the improved method to prepare slides from residual ThinPrep specimens in order to see how often ThinPrep diagnoses correspond to diagnoses derived from exhaustive examination of their parent sample suspensions. Residual cell suspensions from 500 ThinPrep cases comprising (1) 20 low-grade squamous intraepithelial lesions (LSILs); (2) 200 high risk (HR) negatives and 20 ASC-US; and (3) 260 screening cytology specimens were studied. Institutional review committee guidelines allowed us to know diagnoses by groups of specimens, but did not allow us to know individual patient diagnoses, so we could not perform case-by-case matched outcome-comparisons. Cells were concentrated by conventional centrifugation and sedimented into a polymer gel that was then vortex-mixed and converted into a viscous cell-rich suspension. The cell suspension was smeared between two clean glass slides, which were air-dried and stained with the Papanicolaou stain. Two study-sets were created, comprising one slide from each case. Each of the two study sets was examined by two cytopathologists, and discordant diagnoses were adjudicated. Because of the ambiguity involved in the "atypical" (ASC-US, ASC-H, AGC) diagnosis categories, only outcomes at the level of LSIL or greater were recorded. All MLBC SILs were digitally imaged and abnormal slides plus digital images were sent to the laboratory that provided the residual automated liquid-based cytology (ALBC) suspensions. The final diagnoses were confirmed by the laboratory that provided the residual ALBC specimens. MLBC slides of the 20 LSIL cases afforded 2 high-grade squamous intraepithelial lesions (HSILs) and 18 LSILs. Those of the 200 HR-Negatives showed 3 HSILs and 30 LSILs; and those of the 20 HR-ASC-US showed 3 HSILs and 9 LSILs. MLBC slides of the 260 screening cytology specimens showed 1 Carcinoma, 3 HSILs and 20 LSILs

  20. An automated system for access to derived climate indices in support of ecological impacts assessments and resource management

    Science.gov (United States)

    Walker, J.; Morisette, J. T.; Talbert, C.; Blodgett, D. L.; Kunicki, T.

    2012-12-01

    A U.S. Geological Survey team is working with several providers to establish standard data services for the climate projection data they host. To meet the needs of the climate adaptation science and landscape management communities, the team is establishing a set of climate index calculation algorithms that will consume data from various providers and produce directly useful data derivatives. Climate projections from various scenarios, modeling centers, and downscaling methods are increasing in number and size. Global change impact modeling and assessment generally requires inputs in the form of climate indices or values derived from raw climate projections. This requirement puts a large burden on a community not familiar with climate data formats, semantics, and processing techniques, and requires storage capacity and computing resources out of the reach of most. In order to fully understand the implications of our best available climate projections, assessments must take into account an ensemble of climate projections and potentially a range of parameters for the calculation of climate indices. These data access and processing requirements differ little from project to project, or even among projected climate data sets, pointing to the need for a reusable tool to generate climate indices. The U.S. Geological Survey has developed a pilot application and supporting web service framework that automates the generation of climate indices. The web service framework consists of standards-based data servers and a data integration broker. The resulting system allows data producers to publish and maintain ownership of their data, and data consumers to access climate derivatives via a simple-to-use "data product ordering" workflow. Data access and processing are completed on enterprise "cloud" computing resources, and only the relatively small derived climate indices are delivered to the scientist or land manager. These services will assist the scientific and land
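    A derived climate index of the kind such a service computes can be illustrated with a minimal sketch: a growing degree-day index applied across a small ensemble of projections. The index choice, base temperature, and data values below are illustrative assumptions, not the USGS implementation.

```python
import statistics

def growing_degree_days(daily_mean_temps_c, base_c=10.0):
    """Sum of daily mean temperature excess above a base temperature."""
    return sum(max(0.0, t - base_c) for t in daily_mean_temps_c)

def ensemble_index(projections, base_c=10.0):
    """Apply the index to each ensemble member and summarize the spread."""
    values = [growing_degree_days(p, base_c) for p in projections]
    return {"mean": statistics.mean(values), "min": min(values), "max": max(values)}

# Three hypothetical downscaled projections of daily mean temperature (deg C)
projections = [
    [12.0, 15.5, 9.0, 18.2, 20.1],
    [11.0, 14.0, 10.5, 17.0, 19.5],
    [13.5, 16.0, 8.0, 19.0, 21.0],
]
print(ensemble_index(projections))
```

    Reporting the ensemble spread rather than a single member's value mirrors the abstract's point that assessments must take an ensemble of projections into account.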

  1. An integrative pharmacological approach to radio telemetry and blood sampling in pharmaceutical drug discovery and safety assessment

    Directory of Open Access Journals (Sweden)

    Kamendi Harriet W

    2011-01-01

    Full Text Available Abstract Background A successful integration of the automated blood sampling (ABS) and telemetry (ABST) system is described. The new ABST system facilitates concomitant collection of physiological variables with blood and urine samples for determination of drug concentrations and other biochemical measures in the same rat without handling artifact. Method Integration was achieved by designing a 13 inch circular receiving antenna that operates as a plug-in replacement for the existing pair of DSI's orthogonal antennas and is compatible with the rotating cage and open floor design of the BASi Culex® ABS system. The circular receiving antenna's electrical configuration consists of a pair of electrically orthogonal half-toroids that reinforce reception of a dipole transmitter operating within the coil's interior while reducing both external noise pickup and interference from other adjacent dipole transmitters. Results For validation, measured baclofen concentration (ABST vs. satellite, μM: 69.6 ± 23.8 vs. 76.6 ± 19.5, p = NS) and mean arterial pressure (ABST vs. traditional DSI telemetry, mm Hg: 150 ± 5 vs. 147 ± 4, p = NS) variables were quantitatively and qualitatively similar between rats housed in the ABST system and traditional home cage approaches. Conclusion The ABST system offers unique advantages over traditional between-group study paradigms that include improved data quality and significantly reduced animal use. The superior within-group model facilitates assessment of multiple physiological and biochemical responses to test compounds in the same animal. The ABST also provides opportunities to evaluate temporal relations between parameters and to investigate anomalous outlier events, because drug concentrations and physiological and biochemical measures for each animal are available for comparison.

  2. Assessing terpene content variability of whitebark pine in order to estimate representative sample size

    Directory of Open Access Journals (Sweden)

    Stefanović Milena

    2013-01-01

    Full Text Available In studies of population variability, particular attention has to be paid to the selection of a representative sample. The aim of this study was to assess the size of a new representative sample on the basis of the variability of chemical content of the initial sample, using a whitebark pine population as an example. Statistical analysis covered the content of 19 characteristics (terpene hydrocarbons and their derivatives) of the initial sample of 10 elements (trees). It was determined that the new sample should contain 20 trees so that the mean value calculated from it represents the basic set with a probability higher than 95%. Determination of the lower limit of the representative sample size that guarantees a satisfactory reliability of generalization proved to be very important for achieving cost efficiency of the research. [Project of the Ministry of Science of the Republic of Serbia, No. OI-173011, No. TR-37002 and No. III-43007]
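    The representative-sample-size reasoning above can be sketched with a normal-approximation formula relating the coefficient of variation of a characteristic to the number of trees needed for a given relative error of the mean. The CV, relative error, and confidence level below are hypothetical placeholders, not the study's fitted values, and the paper's exact procedure may differ.

```python
from math import ceil
from statistics import NormalDist

def required_sample_size(cv, rel_error=0.10, confidence=0.95):
    """Smallest n whose sample mean falls within rel_error of the true mean
    with the given confidence (normal approximation; a t-based refinement
    would iterate on n)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return ceil((z * cv / rel_error) ** 2)

# e.g. a terpene characteristic with a hypothetical 30% coefficient of variation
print(required_sample_size(cv=0.30))
```

    Characteristics with larger CVs drive the required sample size up quadratically, which is why the most variable terpene determines the lower limit of the representative sample.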

  3. A content validated questionnaire for assessment of self reported venous blood sampling practices

    Directory of Open Access Journals (Sweden)

    Bölenius Karin

    2012-01-01

    Full Text Available Abstract Background Venous blood sampling is a common procedure in health care. It is strictly regulated by national and international guidelines. Deviations from guidelines due to human mistakes can cause patient harm. Validated questionnaires for health care personnel can be used to assess preventable "near misses", i.e. potential errors and nonconformities during venous blood sampling practices that could transform into adverse events. However, no validated questionnaire that assesses nonconformities in venous blood sampling has previously been presented. The aim was to test a recently developed questionnaire on self reported venous blood sampling practices for validity and reliability. Findings We developed a questionnaire to assess deviations from best practices during venous blood sampling. The questionnaire contained questions about patient identification, test request management, test tube labeling, test tube handling, information search procedures and frequencies of error reporting. For content validity, the questionnaire was confirmed by experts on questionnaires and venous blood sampling. For reliability, test-retest statistics were used on the questionnaire answered twice. The final venous blood sampling questionnaire included 19 questions, 9 of which had a total of 34 underlying items. It was found to have content validity. The test-retest analysis demonstrated that the items were generally stable. In total, 82% of the items fulfilled the reliability acceptance criteria. Conclusions The questionnaire could be used for assessment of "near miss" practices that could jeopardize patient safety, and offers several benefits over assessing only rare adverse events. The higher frequency of "near miss" practices allows quantitative analysis of the effect of corrective interventions and benchmarking of preanalytical quality, not only at the laboratory/hospital level but also at the level of the health care unit/hospital ward.

  4. Gluten-containing grains skew gluten assessment in oats due to sample grind non-homogeneity.

    Science.gov (United States)

    Fritz, Ronald D; Chen, Yumin; Contreras, Veronica

    2017-02-01

    Oats are easily contaminated with gluten-rich kernels of wheat, rye and barley. These contaminants act like gluten 'pills', shown here to skew gluten analysis results. Using the R-Biopharm R5 ELISA, we quantified gluten in gluten-free oatmeal servings from an in-market survey. For samples reading 5-20 ppm on a first test, replicate analyses provided results ranging up to 160 ppm. This suggests sample grinding may inadequately disperse gluten to allow a single accurate gluten assessment. To ascertain this, and to characterize the distribution of 0.25 g gluten test results for kernel-contaminated oats, we analyzed twelve 50 g samples of pure oats, each spiked with a wheat kernel; the 0.25 g test results followed log-normal-like distributions. With this, we estimate probabilities of mis-assessment for a 'single measure/sample' relative to the gluten content.
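    The single-test mis-assessment probability the authors estimate can be sketched under a log-normal model of per-test results. The log-scale parameters below are invented for illustration (a median of 40 ppm with a heavy right tail); the paper's fitted distributions are not reproduced here.

```python
from math import log
from statistics import NormalDist

def p_single_test_below(threshold_ppm, mu, sigma):
    """Probability that one 0.25 g test reads below the threshold when
    per-test gluten results follow a log-normal with parameters (mu, sigma)."""
    return NormalDist(mu, sigma).cdf(log(threshold_ppm))

# Hypothetical fit for a serving whose true gluten content exceeds 20 ppm
mu, sigma = log(40.0), 1.2
print(p_single_test_below(20.0, mu, sigma))
```

    With these illustrative parameters, roughly a quarter of single 0.25 g tests would read below the 20 ppm gluten-free threshold even though the serving as a whole exceeds it, which is the mis-assessment risk the abstract describes.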

  5. Feasibility of hair sampling to assess levels of organophosphate metabolites in rural areas of Sri Lanka

    Science.gov (United States)

    Knipe, D.W.; Jayasumana, C.; Siribaddana, S.; Priyadarshana, C.; Pearson, M.; Gunnell, D.; Metcalfe, C.; Tzatzarakis, M.N.; Tsatsakis, A.M.

    2016-01-01

    Measuring chronic pesticide exposure is important in order to investigate the associated health effects. Traditional biological samples (blood/urine) are difficult to collect, store and transport in large epidemiological studies in settings such as rural Asia. We assessed the acceptability of collecting hair samples from a rural Sri Lankan population and found that this method of data collection was feasible. We also assessed the levels of non-specific metabolites (DAPs) of organophosphate pesticides in the hair samples. The median concentration (pg/mg) of each DAP was: diethyl phosphate: 83.3 (IQI 56.0, 209.4); diethyl thiophosphate: 34.7 (IQI 13.8, 147.9); diethyl dithiophosphate: 34.5 (IQI 23.4, 55.2); and dimethyl phosphate: 3 (IQI 3, 109.7). Total diethyl phosphates were recovered in >80% of samples and were positively correlated with self-reported pesticide exposure. PMID:26894816

  6. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    Energy Technology Data Exchange (ETDEWEB)

    Stroud, Mary Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Prime, Michael Bruce [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Veirs, Douglas Kirk [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Berg, John M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Clausen, Bjorn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Worl, Laura Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); DeWald, Adrian T. [Hill Engineering, LLC, Rancho Cordova, CA (United States)

    2015-12-08

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. This report discusses residual stresses in the 3013 outer, the SRS/Hanford and RFETS/LLNL inner containers, and teardrop samples used in studies to assess the potential for SCC in 3013 containers. Residual tensile stresses in the heat affected zones of the closure welds are of particular concern.

  7. Soyuz 24 Return Samples: Assessment of Air Quality Aboard the International Space Station

    Science.gov (United States)

    James, John T.

    2011-01-01

    Fifteen mini-grab sample containers (m-GSCs) were returned aboard Soyuz 24. This is the first time all samples were acquired with the mini-grab samplers. The toxicological assessment of the 15 m-GSCs from the ISS is shown. The recoveries of the 3 internal standards, 13C-acetone, fluorobenzene, and chlorobenzene, from the GSCs averaged 75, 97 and 79%, respectively. Formaldehyde badges were not returned on Soyuz 24.

  8. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with new materials, media, and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines, supporting collection, storage, administration, processing, preservation, communication, and related functions.

  9. Hydrochemical assessment of Semarang area using multivariate statistics: A sample based dataset

    OpenAIRE

    Irawan, Dasapta Erwin; Putranto, Thomas Triadi

    2016-01-01

    The following paper describes in brief the data set related to our project "Hydrochemical assessment of Semarang Groundwater Quality". All 58 samples were taken in 1992, 1993, 2003, 2006, and 2007 using well point data from several reports from the Ministry of Energy and Mineral Resources and independent consultants. We provide 20 parameters for each sample (sample id, coord X, coord Y, well depth, water level, water elevation, TDS, pH, EC, K, Ca, Na, Mg, Cl, SO4, HCO3, ye...

  10. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  11. Analyses of odontometric sexual dimorphism and sex assessment accuracy on a large sample.

    Science.gov (United States)

    Angadi, Punnya V; Hemani, S; Prabhu, Sudeendra; Acharya, Ashith B

    2013-08-01

    Correct sex assessment of skeletonized human remains allows investigators to undertake a more focused search of missing persons' files to establish identity. Univariate and multivariate odontometric sex assessment has been explored in recent years on small samples, without application to a test sample. Consequently, inconsistent results have been produced in terms of accuracy of sex allocation. This paper derived data from a large sample of males and females and applied logistic regression formulae to a test sample. Using a digital caliper, buccolingual and mesiodistal dimensions of all permanent teeth (except third molars) were measured on 600 dental casts (306 females, 294 males) of young adults (18-32 years), and the data were subjected to univariate (independent samples' t-test) and multivariate statistics (stepwise logistic regression analysis, or LRA). The analyses revealed that canines were the most sexually dimorphic teeth, followed by molars. All tooth variables were larger in males, with 51/56 (91.1%) being statistically larger (p < 0.05). Sex allocation accuracy on the test sample was moderate, in contradiction to a previous report of virtually 100% sex allocation for a small heterogeneous sample. These results reflect the importance of using a large sample to quantify sexual dimorphism in tooth dimensions and of applying the derived formulae to a test dataset to ascertain accuracy which, at best, is moderate in nature.
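    The idea behind the paper's LRA, a logistic model mapping a tooth dimension to a sex-allocation probability, can be sketched on synthetic data. All values below are simulated (a slight male-female shift in canine width, fit by plain gradient descent); they are not the study's measurements, and the published stepwise multivariate procedure is richer than this one-variable toy.

```python
import random
from math import exp

random.seed(1)

# Synthetic canine buccolingual widths (mm); males drawn slightly larger,
# loosely mirroring the dimorphism reported above (illustrative only)
males   = [random.gauss(8.1, 0.45) for _ in range(300)]
females = [random.gauss(7.6, 0.45) for _ in range(300)]
X = males + females
y = [1] * 300 + [0] * 300                  # 1 = male, 0 = female

mean_x = sum(X) / len(X)
Xc = [x - mean_x for x in X]               # centring speeds up convergence

# One-variable logistic model P(male) = 1 / (1 + exp(-(a + b * x)))
a = b = 0.0
for _ in range(500):
    ga = gb = 0.0
    for xi, yi in zip(Xc, y):
        p = 1.0 / (1.0 + exp(-(a + b * xi)))
        ga += p - yi
        gb += (p - yi) * xi
    a -= 0.5 * ga / len(Xc)
    b -= 0.5 * gb / len(Xc)

correct = sum((1.0 / (1.0 + exp(-(a + b * xi))) > 0.5) == (yi == 1)
              for xi, yi in zip(Xc, y))
print(f"correct sex allocation: {correct / len(Xc):.1%}")
```

    With overlapping male and female distributions, allocation accuracy sits well below 100%, which is consistent with the abstract's conclusion that odontometric accuracy is at best moderate.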

  12. Automated inter-rater reliability assessment and electronic data collection in a multi-center breast cancer study

    Directory of Open Access Journals (Sweden)

    Enger Shelley M

    2007-06-01

    Full Text Available Abstract Background The choice between paper data collection methods and electronic data collection (EDC) methods has become a key question for clinical researchers. There remains a need to examine potential benefits, efficiencies, and innovations associated with an EDC system in a multi-center medical record review study. Methods A computer-based automated menu-driven system with 658 data fields was developed for a cohort study of women aged 65 years or older, diagnosed with invasive histologically confirmed primary breast cancer (N = 1859), at 6 Cancer Research Network sites. Medical record review with direct data entry into the EDC system was implemented. An inter-rater and intra-rater reliability (IRR) system was developed using a modified version of the EDC. Results Automation of EDC accelerated the flow of study information and resulted in an efficient data collection process. Data collection time was reduced by approximately four months compared to the project schedule and funded time available for manuscript preparation increased by 12 months. In addition, an innovative modified version of the EDC permitted an automated evaluation of inter-rater and intra-rater reliability across six data collection sites. Conclusion Automated EDC is a powerful tool for research efficiency and innovation, especially when multiple data collection sites are involved.
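    An inter-rater reliability check of the kind such an EDC module automates can be sketched with Cohen's kappa on a single categorical field. The ratings below are hypothetical stand-ins for one chart-review item scored by two abstractors; the study's actual IRR metrics are not specified in the abstract.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical double-abstracted yes/no field from two reviewers
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "no"]
b = ["yes", "no",  "no", "no", "yes", "no", "yes", "no", "yes", "no"]
print(round(cohens_kappa(a, b), 3))
```

    Kappa corrects raw percent agreement for agreement expected by chance, which matters when one answer dominates a field.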

  13. Large-Scale Assessment, Locally-Developed Measures, and Automated Scoring of Essays: Fishing for Red Herrings?

    Science.gov (United States)

    Condon, William

    2013-01-01

    Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater[R] and the "Criterion"[R] Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…

  14. Using Structural Equation Modeling to Assess Functional Connectivity in the Brain: Power and Sample Size Considerations

    Science.gov (United States)

    Sideridis, Georgios; Simos, Panagiotis; Papanicolaou, Andrew; Fletcher, Jack

    2014-01-01

    The present study assessed the impact of sample size on the power and fit of structural equation modeling applied to functional brain connectivity hypotheses. The data consisted of time-constrained minimum norm estimates of regional brain activity during performance of a reading task obtained with magnetoencephalography. Power analysis was first…

  15. Soyuz 25 Return Samples: Assessment of Air Quality Aboard the International Space Station

    Science.gov (United States)

    James, John T.

    2011-01-01

    Six mini-grab sample containers (m-GSCs) were returned aboard Soyuz 25. The toxicological assessment of 6 m-GSCs from the ISS is shown. The recoveries of the 3 internal standards, C-13-acetone, fluorobenzene, and chlorobenzene, from the GSCs averaged 76, 108 and 88%, respectively. Formaldehyde badges were not returned aboard Soyuz 25.

  16. Mood disorders in everyday life : A systematic review of experience sampling and ecological momentary assessment studies

    NARCIS (Netherlands)

    Aan het Rot, M.; Hogenelst, Koen; Schoevers, R.A.

    2012-01-01

    In the past two decades, the study of mood disorder patients using experience sampling methods (ESM) and ecological momentary assessment (EMA) has yielded important findings. In patients with major depressive disorder (MDD), the dynamics of their everyday mood have been associated with various aspects

  17. Assessment of Emotional Intelligence in a Sample of Prospective Secondary Education Teachers

    Science.gov (United States)

    Gutiérrez-Moret, Margarita; Ibáñez-Martinez, Raquel; Aguilar-Moya, Remedios; Vidal-Infer, Antonio

    2016-01-01

    In the past few years, skills related to emotional intelligence (EI) have acquired special relevance in the educational domain. This study assesses EI in a sample of 155 students of 5 different specialities of a Master's degree in Teacher Training for Secondary Education. Data collection was conducted through the administration of the Trait Meta…

  18. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  19. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population.

    Science.gov (United States)

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel

    2016-05-11

    Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.

  20. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    Directory of Open Access Journals (Sweden)

    Guillermo Rey Gozalo

    2016-05-01

    Full Text Available Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.
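    Why one stratification predicts better than another can be sketched with simulated street-level data: each street's noise is predicted by the mean of its stratum, and a stratification whose strata track the true drivers of sound levels yields a lower prediction error. The categories, decibel values, and the assumption that the categorisation method tracks the true drivers are all invented for illustration and are not the Talca measurements.

```python
import random
from math import sqrt

random.seed(0)

# Hypothetical streets: the true level follows a latent activity class that
# the categorisation method captures and the legislative typology does not
streets = []
for _ in range(200):
    activity = random.choice([55.0, 62.0, 68.0, 73.0])   # dBA class centres
    level = random.gauss(activity, 2.0)                  # measured level
    legal_type = random.choice(["A", "B", "C"])          # weakly informative
    streets.append((activity, legal_type, level))

def rmse_of_stratification(streets, key):
    """Predict each street by its stratum mean; report the in-sample RMSE."""
    strata = {}
    for s in streets:
        strata.setdefault(key(s), []).append(s[2])
    means = {k: sum(v) / len(v) for k, v in strata.items()}
    errs = [(s[2] - means[key(s)]) ** 2 for s in streets]
    return sqrt(sum(errs) / len(errs))

print("categorisation:", rmse_of_stratification(streets, key=lambda s: s[0]))
print("legislative   :", rmse_of_stratification(streets, key=lambda s: s[1]))
```

    The informative stratification leaves only the within-category scatter as error, while the uninformative one inherits nearly the full between-category variance, mirroring the abstract's comparison.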

  1. Threshold-dependent sample sizes for selenium assessment with stream fish tissue

    Science.gov (United States)

    Hitt, Nathaniel P.; Smith, David R.

    2015-01-01

    Natural resource managers are developing assessments of selenium (Se) contamination in freshwater ecosystems based on fish tissue concentrations. We evaluated the effects of sample size (i.e., number of fish per site) on the probability of correctly detecting mean whole-body Se values above a range of potential management thresholds. We modeled Se concentrations as gamma distributions with shape and scale parameters fitting an empirical mean-to-variance relationship in data from southwestern West Virginia, USA (63 collections, 382 individuals). We used parametric bootstrapping techniques to calculate statistical power as the probability of detecting true mean concentrations up to 3 mg Se/kg above management thresholds ranging from 4 to 8 mg Se/kg. Sample sizes required to achieve 80% power varied as a function of management thresholds and Type I error tolerance (α). Higher thresholds required more samples than lower thresholds because populations were more heterogeneous at higher mean Se levels. For instance, to assess a management threshold of 4 mg Se/kg, a sample of eight fish could detect an increase of approximately 1 mg Se/kg with 80% power (given α = 0.05), but this sample size would be unable to detect such an increase from a management threshold of 8 mg Se/kg with more than a coin-flip probability. Increasing α decreased sample size requirements to detect above-threshold mean Se concentrations with 80% power. For instance, at an α-level of 0.05, an 8-fish sample could detect an increase of approximately 2 units above a threshold of 8 mg Se/kg with 80% power, but when α was relaxed to 0.2, this sample size was more sensitive to increasing mean Se concentrations, allowing detection of an increase of approximately 1.2 units with equivalent power. Combining individuals into 2- and 4-fish composite samples for laboratory analysis did not decrease power because the reduced number of laboratory samples was compensated for by increased
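    The parametric bootstrap power calculation described above can be sketched in miniature: simulate n-fish samples from a gamma population whose mean sits at the management threshold to find a critical sample mean, then simulate under an elevated mean to estimate power. The mean-to-variance relation, α, and all numeric values below are illustrative stand-ins, not the fitted West Virginia parameters.

```python
import random

random.seed(42)

def gamma_params(mean, var):
    """Shape/scale (k, theta) of a gamma with the given mean and variance."""
    theta = var / mean
    return mean / theta, theta

def power(n, threshold, delta, var_fn, alpha=0.05, reps=2000):
    """Parametric-bootstrap power to detect a true mean of threshold + delta.

    The critical value is the upper-alpha quantile of the sample mean
    simulated under a null population whose mean equals the threshold."""
    def sample_mean(mu):
        k, theta = gamma_params(mu, var_fn(mu))
        return sum(random.gammavariate(k, theta) for _ in range(n)) / n

    null = sorted(sample_mean(threshold) for _ in range(reps))
    crit = null[int((1 - alpha) * reps)]
    return sum(sample_mean(threshold + delta) > crit for _ in range(reps)) / reps

# Hypothetical mean-to-variance relation: heterogeneity grows with the mean
var_fn = lambda mu: 0.5 * mu ** 1.5
print(power(n=8, threshold=4.0, delta=1.0, var_fn=var_fn))
```

    Because the variance function grows with the mean, the same n and delta yield less power at higher thresholds, which is the paper's central finding.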

  2. Automated assessment of Pavlovian conditioned freezing and shock reactivity in mice using the VideoFreeze system

    Directory of Open Access Journals (Sweden)

    Stephan G Anagnostaras

    2010-09-01

    Full Text Available The Pavlovian conditioned freezing paradigm has become a prominent mouse and rat model of learning and memory, as well as of pathological fear. Due to its efficiency, reproducibility, and well-defined neurobiology, the paradigm has become widely adopted in large-scale genetic and pharmacological screens. However, one major shortcoming of the use of freezing behavior has been that it has required the use of tedious hand scoring, or a variety of proprietary automated methods that are often poorly validated or difficult to obtain and implement. Here we report an extensive validation of the Video Freeze system in mice, a turn-key all-inclusive system for fear conditioning in small animals. Using digital video and near-infrared lighting, the system achieved outstanding performance in scoring both freezing and movement. Given the large-scale adoption of the conditioned freezing paradigm, we encourage similar validation of other automated systems for scoring freezing, or other behaviors.

  3. Automated simulation of areal bone mineral density assessment in the distal radius from high-resolution peripheral quantitative computed tomography

    OpenAIRE

    Burghardt, A. J.; Kazakia, G. J.; Link, T.M.; Majumdar, S

    2009-01-01

    Summary An automated image processing method is presented for simulating areal bone mineral density measures using high-resolution peripheral quantitative computed tomography (HR-pQCT) in the ultra-distal radius. The accuracy of the method is validated against clinical dual X-ray absorptiometry (DXA). This technique represents a useful reference to gauge the utility of novel 3D quantification methods applied to HR-pQCT in multi-center clinical studies and potentially negates the need for sepa...

  4. A hybrid DNA extraction method for the qualitative and quantitative assessment of bacterial communities from poultry production samples.

    Science.gov (United States)

    Rothrock, Michael J; Hiett, Kelli L; Gamble, John; Caudill, Andrew C; Cicconi-Hogan, Kellie M; Caporaso, J Gregory

    2014-12-10

    The efficacy of DNA extraction protocols can be highly dependent upon both the type of sample being investigated and the types of downstream analyses performed. Considering that the use of new bacterial community analysis techniques (e.g., microbiomics, metagenomics) is becoming more prevalent in the agricultural and environmental sciences and many environmental samples within these disciplines can be physicochemically and microbiologically unique (e.g., fecal and litter/bedding samples from the poultry production spectrum), appropriate and effective DNA extraction methods need to be carefully chosen. Therefore, a novel semi-automated hybrid DNA extraction method was developed specifically for use with environmental poultry production samples. This method is a combination of the two major types of DNA extraction: mechanical and enzymatic. A two-step intense mechanical homogenization step (using bead-beating specifically formulated for environmental samples) was added to the beginning of the "gold standard" enzymatic DNA extraction method for fecal samples to enhance the removal of bacteria and DNA from the sample matrix and improve the recovery of Gram-positive bacterial community members. Once the enzymatic extraction portion of the hybrid method was initiated, the remaining purification process was automated using a robotic workstation to increase sample throughput and decrease sample processing error. In comparison to the strict mechanical and enzymatic DNA extraction methods, this novel hybrid method provided the best overall combined performance when considering quantitative (using 16S rRNA qPCR) and qualitative (using microbiomics) estimates of the total bacterial communities when processing poultry feces and litter samples.

  5. Meconium samples used to assess infant exposure to the components of ETS during pregnancy

    Directory of Open Access Journals (Sweden)

    Sylwia Narkowicz

    2015-12-01

    Full Text Available Objectives: The aim of the study was to use meconium samples to assess fetal exposure to compounds present in environmental tobacco smoke (ETS). Material and Methods: In order to assess fetal exposure to toxic tobacco smoke compounds, samples of meconium from the offspring of women with different levels of tobacco smoke exposure, and samples of saliva from the mothers, were analyzed. Thiocyanate ion, as a biomarker of tobacco smoke exposure, and other ions that are indices of such exposure were determined by means of ion chromatography. Results: The results of ion chromatography analysis of the meconium and maternal saliva samples for the presence of cations and anions (including thiocyanate ion) indicate that the concentration level of specific ions depends on the intensity of environmental tobacco smoke exposure of pregnant women. Conclusions: Based on the results, it can be concluded that meconium samples can be used to determine substances from tobacco smoke. The results confirm the effect of smoking during pregnancy on the presence and content of substances from tobacco smoke.

  6. Validating the Diagnostic Infant and Preschool Assessment Using a Danish Trauma Sample

    DEFF Research Database (Denmark)

    Schandorph Løkkegaard, Sille; Elklit, Ask

    Background: There is a lack of validated assessment tools for identifying young children with posttraumatic stress disorder (PTSD). One of the few existing tools for children aged 1-6 years is the Diagnostic Infant and Preschool Assessment (DIPA; Scheeringa & Haslett, 2010). Purpose: To validate...... a Danish version of the DIPA using a sample of young children exposed to (potentially) traumatic events. Method: Interviews of caregivers of 100 children exposed to traumas using the DIPA and the Strengths and Difficulties Questionnaire (SDQ). One third of the children had witnessed their father stalking...

  7. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  8. Assessing genetic polymorphisms using DNA extracted from cells present in saliva samples

    Directory of Open Access Journals (Sweden)

    Nemoda Zsofia

    2011-12-01

    Full Text Available Abstract Background Technical advances following the Human Genome Project revealed that high-quality and -quantity DNA may be obtained from whole saliva samples. However, the usability of previously collected samples and the effects of environmental conditions on the samples during collection have not been assessed in detail. In five studies we document the effects of sample volume, handling and storage conditions, type of collection device, and oral sampling location on the quantity, quality, and genetic assessment of DNA extracted from cells present in saliva. Methods Saliva samples were collected from ten adults in each study. Saliva volumes from 0.10-1.0 ml, different saliva collection devices, sampling locations in the mouth, room temperature storage, and multiple freeze-thaw cycles were tested. One representative single nucleotide polymorphism (SNP) in the catechol-O-methyltransferase gene (COMT rs4680) and one representative variable number of tandem repeats (VNTR) in the serotonin transporter gene (5-HTTLPR: serotonin transporter-linked polymorphic region) were selected for genetic analyses. Results The smallest tested whole saliva volume of 0.10 ml yielded, on average, 1.43 ± 0.77 μg DNA and gave accurate genotype calls in both genetic analyses. The usage of collection devices reduced the amount of DNA extracted from the saliva filtrates compared to the whole saliva sample, as 54-92% of the DNA was retained on the device. An "adhered cell" extraction enabled recovery of this DNA and provided DNA of good quality and quantity. The DNA from both the saliva filtrates and the adhered cell recovery provided accurate genotype calls. The effects of storage at room temperature (up to 5 days), repeated freeze-thaw cycles (up to 6 cycles), and oral sampling location on DNA extraction and on genetic analysis from saliva were negligible. Conclusions Whole saliva samples with volumes of at least 0.10 ml were sufficient to extract good quality and quantity DNA. Using

  9. Methods for collecting benthic invertebrate samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection and on methods and equipment for qualitative multihabitat sampling and semi-quantitative single-habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.

  10. Development of a methodology for automated assessment of the quality of digitized images in mammography

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Priscila do Carmo

    2010-07-01

    The process of evaluating the quality of radiographic images in general, and mammography images in particular, can be much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (DIP) techniques, using an existing image processing environment (ImageJ). Application of DIP techniques made it possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. This comparison demonstrated that the automated methodology is a promising alternative for reducing or eliminating the subjectivity of the visual assessment methodology currently in use. (author)

  11. HydroCrowd: Citizen-empowered snapshot sampling to assess the spatial distribution of stream solutes

    Science.gov (United States)

    Kraft, Philipp; Breuer, Lutz; Bach, Martin; Aubert, Alice H.; Frede, Hans-Georg

    2016-04-01

    Large parts of the groundwater bodies in Central Europe show elevated nitrate concentrations. While groundwater samples characterize water quality over a longer period, surface water resources, in particular streams, may be subject to fast concentration fluctuations, so that measurements distributed in time cannot be compared. Sampling should therefore be done within a short time frame (snapshot sampling). To describe the nitrogen status of streams in Germany, we organized a crowdsourcing experiment in the form of a snapshot sampling on a single day. We selected a national holiday in fall 2013 (Oct 3rd) to ensure that a) volunteers would have time to take a sample, b) stream water was unlikely to be influenced by recent agricultural fertilizer application, and c) low-flow conditions were likely. We distributed 570 cleaned sample flasks to volunteers and got 280 filled flasks back, with coordinates and other metadata about the sampled stream. The volunteers were asked to visit any stream outside of settlements and fill the flask with water from that stream. The samples were analyzed in our lab for concentrations of nitrate, ammonium and dissolved organic nitrogen (DON); results are presented as a map on the website http://www.uni-giessen.de/hydrocrowd. The measured results were related to catchment features such as population density, soil properties, and land use derived from national geodata sources. The statistical analyses revealed a significant correlation between nitrate and the fraction of arable land (0.46), as well as soil humus content (0.37), but a weak correlation with population density (0.12). DON correlations were weak but significant with humus content (0.14) and arable land (0.13). The mean contribution of DON to total dissolved nitrogen was 22%. Crowdsourcing turned out to be a useful method for assessing the spatial distribution of stream solutes, as a considerable number of samples was collected with comparatively little effort on a single day.
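    The correlation figures quoted above are plain Pearson coefficients between per-catchment solute concentrations and catchment attributes. A minimal sketch of that computation, using made-up illustrative numbers rather than the HydroCrowd data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length arrays."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xd = x - x.mean()
    yd = y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

# Hypothetical values: fraction of arable land per catchment vs. nitrate (mg/L)
arable = [0.1, 0.3, 0.5, 0.7, 0.9]
nitrate = [2.0, 8.0, 14.0, 21.0, 30.0]
r = pearson_r(arable, nitrate)
```

    In practice `scipy.stats.pearsonr` would additionally return a p-value for the significance statement made in the abstract.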

  12. Central Colorado Assessment Project (CCAP)-Geochemical data for rock, sediment, soil, and concentrate sample media

    Science.gov (United States)

    Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.

    2010-01-01

    This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. Geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have also been included in this database. In addition to data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. Also included are USGS data that were previously excluded from the NGDB because they predate the earliest USGS geochemical databases, or were once excluded for programmatic reasons

  13. Assessing the value of DNA barcodes for molecular phylogenetics: effect of increased taxon sampling in lepidoptera.

    Directory of Open Access Journals (Sweden)

    John James Wilson

    Full Text Available BACKGROUND: A common perception is that DNA barcode datamatrices have limited phylogenetic signal due to the small number of characters available per taxon. However, another school of thought suggests that the massively increased taxon sampling afforded through the use of DNA barcodes may considerably increase the phylogenetic signal present in a datamatrix. Here I test this hypothesis using a large dataset of macrolepidopteran DNA barcodes. METHODOLOGY/PRINCIPAL FINDINGS: Taxon sampling was systematically increased in datamatrices containing macrolepidopteran DNA barcodes. Sixteen family groups were designated as concordance groups, and two quantitative measures, the taxon consistency index and the taxon retention index, were used to assess any changes in phylogenetic signal as a result of the increase in taxon sampling. DNA barcodes alone, even with maximal taxon sampling (500 species per family), were not sufficient to reconstruct monophyly of families, and increased taxon sampling generally increased the number of clades formed per family. However, the scores indicated a similar level of taxon retention (species from a family clustering together in the cladograms) as the number of species included in the datamatrix was increased, suggesting substantial phylogenetic signal below the 'family' branch. CONCLUSIONS/SIGNIFICANCE: The development of supermatrix, supertree or constrained tree approaches could enable the exploitation of the massive taxon sampling afforded through DNA barcodes for phylogenetics, connecting the twigs resolved by barcodes to the deep branches resolved through phylogenomics.

  14. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    Full Text Available The article focuses on the possibility of automating taxiing, the part of a flight which, under adverse weather conditions, greatly reduces the operational usability of an airport, and the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and transferring it automatically to the controls. Currently available technologies useful for navigation, such as computer vision, Light Detection and Ranging, and Global Navigation Satellite Systems, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems, along with their installation into the airplane's systems, so that automated taxiing can be used.

  15. Automation of plasma protein binding assay using rapid equilibrium dialysis device and Tecan workstation.

    Science.gov (United States)

    Ye, Zhengqi; Zetterberg, Craig; Gao, Hong

    2017-03-14

    Binding of drug molecules to plasma proteins is an important parameter in assessing drug ADME properties. Plasma protein binding (PPB) assays are routinely performed during drug discovery and development. A fully automated PPB assay was developed using a rapid equilibrium dialysis (RED) device and a Tecan workstation coupled to an automated incubator. The PPB assay was carried out in unsealed RED plates, which allowed the assay to be fully automated. The plasma pH was maintained at 7.4 during the 6-h dialysis under a 2% CO2 condition. The samples were extracted with acetonitrile and analyzed by liquid chromatography tandem mass spectrometry. The percent-bound results for 10 commercial drugs were very similar between the automated and manual assays, and were comparable to literature values. The automated assay increases laboratory productivity and is applicable to high-throughput screening of drug protein binding in drug discovery.
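    The percent-bound value such an assay reports is derived from the equilibrium concentrations measured on the plasma and buffer sides of the dialysis membrane. A minimal sketch of the standard RED arithmetic (the concentrations are hypothetical, not from the paper):

```python
def percent_bound(plasma_conc, buffer_conc):
    """Percent of drug bound to plasma protein after equilibrium dialysis.

    plasma_conc: total drug concentration in the plasma chamber at equilibrium
    buffer_conc: drug concentration in the buffer chamber (free drug only)
    """
    if plasma_conc <= 0:
        raise ValueError("plasma concentration must be positive")
    return 100.0 * (plasma_conc - buffer_conc) / plasma_conc

# Illustrative numbers: 500 nM total in plasma, 40 nM free in buffer
pb = percent_bound(500.0, 40.0)   # 92.0 percent bound
```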

  16. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff.... Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory...... with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  17. Contamination assessment in microbiological sampling of the Eyreville core, Chesapeake Bay impact structure

    Science.gov (United States)

    Gronstal, A.L.; Voytek, M.A.; Kirshtein, J.D.; Von der Heyde; Lowit, M.D.; Cockell, C.S.

    2009-01-01

    Knowledge of the deep subsurface biosphere is limited due to difficulties in recovering materials. Deep drilling projects provide access to the subsurface; however, contamination introduced during drilling poses a major obstacle in obtaining clean samples. To monitor contamination during the 2005 International Continental Scientific Drilling Program (ICDP)-U.S. Geological Survey (USGS) deep drilling of the Chesapeake Bay impact structure, four methods were utilized. Fluorescent microspheres were used to mimic the ability of contaminant cells to enter samples through fractures in the core material during retrieval. Drilling mud was infused with a chemical tracer (Halon 1211) in order to monitor penetration of mud into cores. Pore water from samples was examined using excitation-emission matrix (EEM) fluorescence spectroscopy to characterize dissolved organic carbon (DOC) present at various depths. DOC signatures at depth were compared to signatures from drilling mud in order to identify potential contamination. Finally, microbial contaminants present in drilling mud were identified through 16S ribosomal deoxyribonucleic acid (rDNA) clone libraries and compared to species cultured from core samples. Together, these methods allowed us to categorize the recovered core samples according to the likelihood of contamination. Twenty-two of the 47 subcores that were retrieved were free of contamination by all the methods used and were subsequently used for microbiological culture and culture-independent analysis. Our approach provides a comprehensive assessment of both particulate and dissolved contaminants that could be applied to any environment with low biomass. © 2009 The Geological Society of America.

  18. Sampling methods for assessing syrphid biodiversity (Diptera: Syrphidae) in tropical forests.

    Science.gov (United States)

    Marcos-García, M A; García-López, A; Zumbado, M A; Rotheray, G E

    2012-12-01

    When assessing the species richness of a taxonomic group in a specific area, the choice of sampling method is critical. In this study, the effectiveness of three methods for sampling syrphids (Diptera: Syrphidae) in tropical forests is compared: Malaise trapping, collecting adults with an entomological net, and collecting and rearing immatures. Surveys were made from 2008 to 2011 in six tropical forest sites in Costa Rica. The results revealed significant differences in the composition and richness of syrphid faunas obtained by each method. Collecting immatures was the most successful method based on numbers of species and individuals, whereas Malaise trapping was the least effective. This pattern of sampling effectiveness was independent of syrphid trophic or functional group and of annual season. An advantage of collecting immatures over collecting adults is the quality and quantity of associated biological data obtained by the former method. However, complementarity between the results of collecting adults and collecting immatures showed that a combined sampling regime obtained the most complete inventory. Differences between these results and those of similar studies in more open Mediterranean habitats suggest that for an effective inventory, it is important to consider the effects of environmental characteristics on the catchability of syrphids as much as the costs and benefits of different sampling techniques.

  19. Alveolar breath sampling and analysis to assess trihalomethane exposures during competitive swimming training.

    Science.gov (United States)

    Lindstrom, A B; Pleil, J D; Berkoff, D C

    1997-06-01

    Alveolar breath sampling was used to assess trihalomethane (THM) exposures encountered by collegiate swimmers during a typical 2-hr training period in an indoor natatorium. The breath samples were collected at regular intervals before, during, and for 3 hr after a moderately intense training workout. Integrated and grab whole-air samples were collected during the training period to help determine inhalation exposures, and pool water samples were collected to help assess dermal exposures. Breath samples collected during the workout demonstrated a rapid uptake of two THMs (chloroform and bromodichloromethane), with chloroform concentrations exceeding the natatorium air levels within 8 min after the exposure began. Chloroform levels continued to rise steeply until they were more than two times the indoor levels, providing evidence that the dermal route of exposure was relatively rapid and ultimately more important than the inhalation route in this training scenario. Chloroform elimination after the exposure period was fitted to a three-compartment model that allowed estimation of compartmental half-lives, the resulting minimum bloodborne dose, and an approximation of the duration of elevated body burdens. We estimated the dermal exposure route to account for 80% of the blood chloroform concentration, and the transdermal diffusion efficiency from the water to the blood to be in excess of 2%. Bromodichloromethane elimination was fitted to a two-compartment model, which provided evidence of a small, but measurable, body burden of this THM resulting from vigorous swim training. These results suggest that trihalomethane exposures for competitive swimmers under prolonged, high-effort training are common and possibly higher than previously thought, and that the dermal exposure route is dominant. The exposures and potential risks associated with this common recreational activity should be more thoroughly investigated.
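    Compartmental elimination of this kind is typically fitted as a sum of exponentials, one term per compartment, with the half-life of each compartment given by ln(2)/k. A sketch using synthetic bi-exponential data and SciPy's `curve_fit` (assumed available); the rate constants are invented for illustration, not the study's estimates:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_compartment(t, a1, k1, a2, k2):
    """Bi-exponential elimination: a fast and a slow compartment."""
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

# Synthetic post-exposure breath concentrations (arbitrary units) over 180 min
t = np.linspace(0, 180, 19)
true = two_compartment(t, 20.0, 0.15, 5.0, 0.01)
rng = np.random.default_rng(0)
obs = true + rng.normal(0, 0.1, t.size)

# Initial guesses matter for multi-exponential fits; order fast-first
popt, _ = curve_fit(two_compartment, t, obs, p0=[15.0, 0.1, 3.0, 0.005])
half_lives = np.log(2) / popt[[1, 3]]   # minutes, per compartment
```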

  20. Observing System Simulation Experiments for the assessment of temperature sampling strategies in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    F. Raicich

    Full Text Available For the first time in the Mediterranean Sea, various temperature sampling strategies are studied and compared to each other by means of the Observing System Simulation Experiment technique. Their usefulness in the framework of the Mediterranean Forecasting System (MFS) is assessed by quantifying their impact in a Mediterranean General Circulation Model in numerical twin experiments via univariate data assimilation of temperature profiles in summer and winter conditions. Data assimilation is performed by means of the optimal interpolation algorithm implemented in the SOFA (System for Ocean Forecasting and Analysis) code. The sampling strategies studied here include various combinations of eXpendable BathyThermograph (XBT) profiles collected along Volunteer Observing Ship (VOS) tracks, Airborne XBTs (AXBTs), and sea surface temperatures. The actual sampling strategy adopted in the MFS Pilot Project during the Targeted Operational Period (TOP, winter-spring 2000) is also studied.

    The data impact is quantified by the error reduction relative to the free run. The most effective sampling strategies determine 25–40% error reduction, depending on the season, the geographic area and the depth range. A qualitative relationship can be recognized in terms of the spread of information from the data positions, between basin circulation features and spatial patterns of the error reduction fields, as a function of different spatial and seasonal characteristics of the dynamics. The largest error reductions are observed when samplings are characterized by extensive spatial coverages, as in the cases of AXBTs and the combination of XBTs and surface temperatures. The sampling strategy adopted during the TOP is characterized by little impact, as a consequence of a sampling frequency that is too low.

    Key words. Oceanography: general (marginal and semi-enclosed seas; numerical modelling)
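    The 25-40% figures above are reductions of the model error relative to the free run in the twin experiment. A minimal sketch of that metric, with illustrative temperature profiles rather than MFS output:

```python
import numpy as np

def error_reduction(truth, free_run, assimilated):
    """Percent reduction of RMS error of the assimilation run relative to
    the free (no-assimilation) run, measured against the twin-experiment
    'truth'."""
    truth = np.asarray(truth, float)
    rmse_free = np.sqrt(np.mean((np.asarray(free_run, float) - truth) ** 2))
    rmse_assim = np.sqrt(np.mean((np.asarray(assimilated, float) - truth) ** 2))
    return 100.0 * (1.0 - rmse_assim / rmse_free)

# Illustrative temperature profile (degC) at one location, four depth levels
truth = [15.0, 14.0, 13.0, 12.5]
free  = [16.0, 15.0, 14.0, 13.5]   # 1.0 degC error everywhere
assim = [15.6, 14.6, 13.6, 13.1]   # error reduced to 0.6 degC
er = error_reduction(truth, free, assim)   # 40% error reduction
```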

  1. Bayesian Reliability Modeling and Assessment Solution for NC Machine Tools under Small-sample Data

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaojun; KAN Yingnan; CHEN Fei; XU Binbin; CHEN Chuanhai; YANG Chuangui

    2015-01-01

    Although Markov chain Monte Carlo (MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control (NC) machine tools is proposed on the basis of Bayesian theory. An expert-judgment process of fusing multi-source prior information is developed to obtain the Weibull parameters' prior distributions and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to the two-parameter Weibull distribution to derive formulas for the parameters' posterior distributions and to overcome the computational difficulty of high-dimensional integration. The method is then applied to real data from a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures (MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
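    The grid approximation named above discretizes the two Weibull parameters, evaluates the posterior on the grid, and replaces high-dimensional integration with simple sums. A hedged sketch with a flat prior and made-up failure data; the paper fuses informative expert priors, which are omitted here, and its actual data and grid are not reproduced:

```python
import numpy as np
from math import gamma

# Hypothetical times between failures (hours) for an NC machine tool
data = np.array([320.0, 510.0, 150.0, 780.0, 430.0, 260.0])

# Grid over Weibull shape (beta) and scale (eta)
betas = np.linspace(0.5, 3.0, 120)
etas = np.linspace(100.0, 1500.0, 200)
B, E = np.meshgrid(betas, etas, indexing="ij")

# Log-likelihood of the two-parameter Weibull evaluated on the whole grid
t = data[None, None, :]
loglik = np.sum(
    np.log(B)[..., None] - B[..., None] * np.log(E)[..., None]
    + (B[..., None] - 1) * np.log(t) - (t / E[..., None]) ** B[..., None],
    axis=-1,
)

# Flat prior here; an informative expert prior would be added in log space
logpost = loglik - loglik.max()
post = np.exp(logpost)
post /= post.sum()

# Posterior means of the parameters and the implied MTBF = eta * Gamma(1 + 1/beta)
beta_hat = float((B * post).sum())
eta_hat = float((E * post).sum())
mtbf = eta_hat * gamma(1.0 + 1.0 / beta_hat)
```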

  2. Exploring Stochastic Sampling in Nuclear Data Uncertainties Assessment for Reactor Physics Applications and Validation Studies

    Directory of Open Access Journals (Sweden)

    Alexander Vasiliev

    2016-12-01

    Full Text Available The quantification of uncertainties in various calculation results, caused by the uncertainties associated with the input nuclear data, is a common task in nuclear reactor physics applications. Modern computational resources and improved knowledge of nuclear data nowadays allow these capabilities for practical investigations to be significantly advanced. Stochastic sampling is a method that has recently gained momentum for its use and exploration in the domain of reactor design and safety analysis. An application of a stochastic-sampling-based tool to nuclear reactor dosimetry studies is considered in the given paper with certain exemplary test evaluations. Stochastic sampling not only allows the input nuclear data uncertainties to be propagated through the calculations, but also an associated correlation analysis to be performed for any parameters of interest at no additional computational cost. Thus, an example of the assessment of Pearson correlation coefficients for several models used in practical validation studies is shown here. As a next step, the analysis of the obtained information is proposed for discussion, with a focus on assessing similarities between systems. The benefits of the employed method and tools with respect to practical reactor dosimetry studies are consequently outlined.
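    As a hedged sketch of the approach, the following propagates two hypothetical relative nuclear-data uncertainties through a toy response model and computes, at no extra cost, the Pearson correlation between two responses that share the same inputs. All values and the response model are illustrative, not a nuclear data evaluation:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000   # number of stochastic samples

# Hypothetical uncertain nuclear data: two normalized cross-sections with
# 5% and 8% relative standard deviations (illustrative assumptions)
sigma_a = rng.normal(1.0, 0.05, n)    # absorption-like cross-section
sigma_f = rng.normal(1.0, 0.08, n)    # fission-like cross-section

# Toy responses of two "systems" driven by the same input data
response_1 = sigma_f / sigma_a
response_2 = 0.9 * sigma_f / sigma_a + rng.normal(0, 0.01, n)

rel_unc = response_1.std() / response_1.mean()    # propagated uncertainty
corr = np.corrcoef(response_1, response_2)[0, 1]  # system-similarity measure
```

    A correlation close to 1 flags the two systems as nearly equivalent with respect to the sampled nuclear data, which is the kind of similarity assessment the abstract proposes.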

  3. Non-invasive automated assessment of the ratio of pulmonary to systemic flow in patients with atrial septal defects by the colour Doppler velocity profile integration method

    OpenAIRE

    Ueda, Y.; Hozumi, T; Yoshida, K.; Watanabe, H; Akasaka, T; Takagi, T; Yamamuro, A; Homma, S; Yoshikawa, J

    2002-01-01

    Background: The recent introduction of the automated cardiac flow measurement (ACM) method, using spatiotemporal integration of the Doppler velocity profile, provides a quick and accurate automated calculation of cardiac output.

  4. Brief assessment of cognition in schizophrenia: normative data in an English-speaking ethnic Chinese sample.

    Science.gov (United States)

    Eng, Goi Khia; Lam, Max; Bong, Yioe Ling; Subramaniam, Mythily; Bautista, Dianne; Rapisarda, Attilio; Kraus, Michael; Lee, Jimmy; Collinson, Simon Lowes; Chong, Siow Ann; Keefe, Richard S E

    2013-12-01

    There is a dearth of non-Western normative data for neuropsychological batteries designed to measure cognitive deficits in schizophrenia. Here, we provide normative data for English-speaking ethnic Chinese on the widely used Brief Assessment of Cognition in Schizophrenia, acquired from 595 healthy community participants between the ages of 14 and 55. Means and standard deviations of subtests and composite scores were stratified by age group and sex. We also explored linear regression approaches to generate continuous norms adjusted for age, sex, and education. Notable differences in subtest performances were found against a Western comparison sample. Normative data established in the current sample are essential for clinical and research purposes, as they serve as a reference source of cognition for ethnic Chinese.
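    Regression-based continuous norming of the kind mentioned above predicts an expected score from demographics in the healthy sample and expresses a new examinee's raw score as a standardized residual. A sketch on simulated data; the coefficients, score model, and demographic coding are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 595  # matches the normative sample size; the data here are simulated

# Simulated demographics and a cognitive score that declines with age
age = rng.uniform(14, 55, n)
sex = rng.integers(0, 2, n).astype(float)   # 0/1 coding, arbitrary here
edu = rng.uniform(6, 20, n)                 # years of education
score = 60 - 0.3 * age + 1.5 * sex + 0.8 * edu + rng.normal(0, 5, n)

# Ordinary least squares fit of score on the demographic predictors
X = np.column_stack([np.ones(n), age, sex, edu])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
resid_sd = np.std(score - X @ beta, ddof=X.shape[1])

def norm_z(raw, a, s, e):
    """Demographically adjusted z-score for a new examinee."""
    expected = beta @ np.array([1.0, a, s, e])
    return (raw - expected) / resid_sd

z = norm_z(45.0, 30.0, 1.0, 12.0)   # below expectation for these demographics
```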

  5. Quality Assessment of Attribute Data in GIS Based on Simple Random Sampling

    Institute of Scientific and Technical Information of China (English)

    LIU Chun; SHI Wenzhong; LIU Dajie

    2003-01-01

    On the basis of the principles of simple random sampling, a statistical model of the rate of disfigurement (RD) is put forward and described in detail. According to the definition of simple random sampling for attribute data in GIS, the mean and variance of the RD are deduced as the characteristic values of the statistical model, in order to show that the RD is a feasible accuracy measure for attribute data in GIS. Moreover, on the basis of the mean and variance of the RD, a quality assessment method for the attribute data of vector maps during data collection is discussed. An RD spread graph is also drawn to see whether the quality of the attribute data is under control. The RD model can synthetically judge the quality of attribute data, which distinguishes it from other measurement coefficients that only address classification accuracy.
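    The abstract does not reproduce the paper's exact RD formulas; as a hedged illustration, the textbook simple-random-sampling estimator of an error proportion and its variance (with finite-population correction) looks like this:

```python
import math

def rd_estimate(sample_errors, n, N=None):
    """Estimated error proportion and its variance under simple random
    sampling, as a stand-in for the paper's rate-of-disfigurement (RD).

    sample_errors: number of erroneous attribute records found in the sample
    n: sample size
    N: population size; if given, a finite-population correction is applied
    """
    p = sample_errors / n                # estimated error rate (the mean)
    var = p * (1 - p) / (n - 1)          # estimated variance of p
    if N is not None:
        var *= (1 - n / N)               # finite-population correction
    return p, var

# Illustrative audit: 12 bad records in a sample of 200 from 5000 records
p, var = rd_estimate(12, 200, N=5000)
se = math.sqrt(var)                      # standard error for control limits
```

    Control limits of the kind plotted on an RD spread graph would typically be set at p ± 2 or 3 standard errors.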

  6. Urban air quality assessment using monitoring data of fractionized aerosol samples, chemometrics and meteorological conditions.

    Science.gov (United States)

    Yotova, Galina I; Tsitouridou, Roxani; Tsakovski, Stefan L; Simeonov, Vasil D

    2016-01-01

    The present article deals with the assessment of urban air quality using monitoring data for 10 different aerosol fractions (0.015-16 μm) collected at a typical urban site in the City of Thessaloniki, Greece. The data set was subjected to multivariate statistical analysis (cluster analysis and principal components analysis) and, additionally, to HYSPLIT back-trajectory modeling in order to better assess the impact of weather conditions on the identified pollution sources. A specific element of the study is the effort to clarify the role of outliers in the data set. The appearance of outliers is strongly related to the atmospheric conditions on the particular sampling days, which led to enhanced concentrations of pollutants (secondary emissions, sea sprays, road and soil dust, combustion processes), especially for ultrafine and coarse particles. It is also shown that three major sources affect the urban air quality of the location studied: sea sprays, mineral dust and anthropogenic influences (agricultural activity, combustion processes, and industrial sources). The level of impact is related to a certain extent to the aerosol fraction size. The assessment of the meteorological conditions leads to the definition of four downwind patterns affecting the air quality (Pelagic; Western and Central Europe; Eastern and Northeastern Europe; and Africa and Southern Europe). Thus, the present study offers a complete urban air assessment taking into account weather conditions, pollution sources and aerosol fractionation.
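    The multivariate step described above (standardize the fraction concentrations, then apply PCA) can be sketched as follows on simulated data, with two latent sources standing in for, e.g., sea spray and mineral dust; none of the numbers come from the Thessaloniki data set:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated concentrations: 60 sampling days x 10 aerosol size fractions,
# driven by two latent "sources" plus measurement noise
source = rng.normal(size=(60, 2))
loadings = rng.normal(size=(2, 10))
X = source @ loadings + rng.normal(scale=0.3, size=(60, 10))

# Standardize columns (zero mean, unit variance), then PCA via SVD
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()   # variance explained per component
scores = Z @ Vt.T                     # per-day scores on the components
```

    With two latent sources, the first two components should carry most of the variance; inspecting the loadings in `Vt` is how source profiles such as "sea spray" or "mineral dust" would be interpreted.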

  7. Assessment of Heavy Metals in Water Samples of Certain Locations Situated Around Tumkur, Karnataka, India

    Directory of Open Access Journals (Sweden)

    C. Vijaya Bhaskar

    2010-01-01

    Full Text Available Surface water and groundwater samples of certain locations, namely Kallambella, Bugudanahalli, Maidala, Honnudike, Kunigal, Kadaba and Hebbur, situated around Tumkur, were assessed in the month of September 2008 for pH, EC and the heavy metals Cd, Cu, Fe, Hg, Mn, Zn and Ni. The pH values of surface waters were in the alkaline range of 7.8-8.2 and are well within safe limits for crop production. The pH of groundwater was in the range of 7.6-8.4. The conductivity was in the range of 0.20-0.68 mS/cm and 0.34-2.44 mS/cm for surface waters and groundwaters, respectively. The high EC value of Kallambella groundwater accounts for its salinity. All surface waters except the Honnudike and Hebbur samples contain low concentrations of these metals and are ideal for irrigation. Though the samples from Honnudike, Kadaba and Hebbur have high iron concentrations, only the Honnudike and Hebbur samples exceeded the limit of 5 mg/L required for irrigation. In groundwaters the concentrations of all these heavy metals except copper are also well within permissible limits and suitable for drinking. Cu, Fe, Ni and Zn were detected in all the samples, in the ranges of 0.094-0.131, 0.958-12.537, 0.020-0.036 and 0.082-1.139 mg/L respectively in surface waters, and 0.132-0.142, 0.125-1.014, 0.028-0.036 and 0.003-0.037 mg/L in groundwaters. Cadmium, mercury and manganese were absent in all the samples.

  8. Soyuz 22 Return Samples: Assessment of Air Quality Aboard the International Space Station

    Science.gov (United States)

    James, John T.

    2010-01-01

    Three mini-grab sample containers (m-GSCs) were returned aboard Soyuz 22 because of concerns that new air pollutants were present in the air and were getting into the water recovery system. The Total Organic Carbon Analyzer had been giving increasing readings of total organic carbon (TOC) in the potable water, and it was postulated that an increased load into the system was responsible. The toxicological assessment of the 3 m-GSCs from the ISS is shown in Table 1. The recoveries of the 3 standards from the GSCs averaged 103%, 95% and 76%, respectively. Recoveries from formaldehyde control badges were 90% and 91%.

  9. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools that have taken advantage of an array of technologies to automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  10. Semi-automated image analysis for the assessment of megafaunal densities at the Arctic deep-sea observatory HAUSGARTEN.

    Directory of Open Access Journals (Sweden)

    Timm Schoening

    Full Text Available Megafauna play an important role in benthic ecosystem function and are sensitive indicators of environmental change. Non-invasive monitoring of benthic communities can be accomplished by seafloor imaging. However, manual quantification of megafauna in images is labor-intensive and therefore, this organism size class is often neglected in ecosystem studies. Automated image analysis has been proposed as a possible approach to such analysis, but the heterogeneity of megafaunal communities poses a non-trivial challenge for such automated techniques. Here, the potential of a generalized object detection architecture, referred to as iSIS (intelligent Screening of underwater Image Sequences), for the quantification of a heterogeneous group of megafauna taxa is investigated. The iSIS system is tuned for a particular image sequence (i.e. a transect) using a small subset of the images, in which megafauna taxa positions were previously marked by an expert. To investigate the potential of iSIS and compare its results with those obtained from human experts, a group of eight different taxa from one camera transect of seafloor images taken at the Arctic deep-sea observatory HAUSGARTEN is used. The results show that inter- and intra-observer agreements of human experts exhibit considerable variation between the species, with a similar degree of variation apparent in the automatically derived results obtained by iSIS. Whilst some taxa (e.g. Bathycrinus stalks, Kolga hyalina, small white sea anemone) were well detected by iSIS (i.e. overall Sensitivity: 87%, overall Positive Predictive Value: 67%), some taxa such as the small sea cucumber Elpidia heckeri remain challenging, for both human observers and iSIS.

  11. Semi-automated image analysis for the assessment of megafaunal densities at the Arctic deep-sea observatory HAUSGARTEN.

    Science.gov (United States)

    Schoening, Timm; Bergmann, Melanie; Ontrup, Jörg; Taylor, James; Dannheim, Jennifer; Gutt, Julian; Purser, Autun; Nattkemper, Tim W

    2012-01-01

    Megafauna play an important role in benthic ecosystem function and are sensitive indicators of environmental change. Non-invasive monitoring of benthic communities can be accomplished by seafloor imaging. However, manual quantification of megafauna in images is labor-intensive and therefore, this organism size class is often neglected in ecosystem studies. Automated image analysis has been proposed as a possible approach to such analysis, but the heterogeneity of megafaunal communities poses a non-trivial challenge for such automated techniques. Here, the potential of a generalized object detection architecture, referred to as iSIS (intelligent Screening of underwater Image Sequences), for the quantification of a heterogeneous group of megafauna taxa is investigated. The iSIS system is tuned for a particular image sequence (i.e. a transect) using a small subset of the images, in which megafauna taxa positions were previously marked by an expert. To investigate the potential of iSIS and compare its results with those obtained from human experts, a group of eight different taxa from one camera transect of seafloor images taken at the Arctic deep-sea observatory HAUSGARTEN is used. The results show that inter- and intra-observer agreements of human experts exhibit considerable variation between the species, with a similar degree of variation apparent in the automatically derived results obtained by iSIS. Whilst some taxa (e.g. Bathycrinus stalks, Kolga hyalina, small white sea anemone) were well detected by iSIS (i.e. overall Sensitivity: 87%, overall Positive Predictive Value: 67%), some taxa such as the small sea cucumber Elpidia heckeri remain challenging, for both human observers and iSIS.
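    The detection metrics quoted above (Sensitivity, Positive Predictive Value) are the standard ratios computed from true-positive, false-positive and false-negative detection counts. A minimal sketch; the counts below are illustrative and not taken from the HAUSGARTEN study:

    ```python
    def sensitivity(tp: int, fn: int) -> float:
        """Fraction of true object instances the detector found (recall)."""
        return tp / (tp + fn)

    def positive_predictive_value(tp: int, fp: int) -> float:
        """Fraction of detections that correspond to real objects (precision)."""
        return tp / (tp + fp)

    # Illustrative counts for one taxon (not from the study):
    tp, fp, fn = 87, 43, 13
    print(f"Sensitivity: {sensitivity(tp, fn):.0%}")        # Sensitivity: 87%
    print(f"PPV: {positive_predictive_value(tp, fp):.0%}")  # PPV: 67%
    ```

    With these counts the two ratios happen to reproduce the overall figures quoted in the abstract, which makes the relationship between detection counts and the reported percentages easy to see.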

  12. Determination of Organochlorine Pesticides in Water Samples by Fully Automated Quantitative Concentrator-Gas Chromatography

    Institute of Scientific and Technical Information of China (English)

    曹旭静

    2016-01-01

    Organochlorine pesticides in surface water were extracted with n-hexane, and the extract was concentrated to 1 mL in a fully automated quantitative concentrator at a water-bath temperature of 35 °C and a vacuum of 300 mbar; a single sample requires only 25 min. After this liquid-liquid extraction and automated concentration, the organochlorine pesticides were determined by gas chromatography. The method detection limits for the organochlorine pesticides were in the range of 0.001-0.008 μg/L, and the average recoveries were 78.6%-104%. The method offers good accuracy and precision, saves time and labor, has a high degree of automation, and is suitable for monitoring samples in batches.
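    The recoveries quoted above are spike recoveries: the concentration measured after the whole extraction-concentration-GC procedure, expressed as a percentage of the known amount spiked into the sample. A minimal sketch with hypothetical numbers:

    ```python
    def recovery_percent(measured_ug_l: float, spiked_ug_l: float) -> float:
        """Spike recovery: measured concentration as a percentage of the amount spiked."""
        return 100.0 * measured_ug_l / spiked_ug_l

    # Hypothetical spike of 0.500 ug/L recovered at 0.420 ug/L:
    print(round(recovery_percent(0.420, 0.500), 1))  # 84.0
    ```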

  13. Genotoxicity assessment of water sampled from R-11 reservoir by means of allium test

    Energy Technology Data Exchange (ETDEWEB)

    Bukatich, E.; Pryakhin, E. [Urals Research Center for Radiation Medicine (Russian Federation); Geraskin, S. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)

    2014-07-01

    slides of root tip meristem were dyed with aceto-orcein. Approximately 150 ana-telophases were scored for each root, and 20-40 roots were analyzed for each water sample; in total, 3000-6000 ana-telophases were analyzed per water sample. Chromosome aberrations in ana-telophases (chromatid and chromosomal bridges and fragments) and mitotic abnormalities (multipolar mitoses and laggards) were scored. Data analysis was performed using R. The aberration frequency in water samples from the natural control reservoir (0.46 ± 0.12%) slightly, but not significantly, exceeded the frequencies in distilled (0.15 ± 0.08%) and bottled (0.33 ± 0.08%) water. The average frequency of aberrant cells in the root meristem of onions germinated in water samples from the R-11 reservoir (1.36 ± 0.24%) was about 3 times higher than in controls. Mitotic activity in the root meristem was slightly inhibited in bulbs germinated in the R-11 sample, but this effect was statistically insignificant. The types of aberrations did not differ among the water samples; only their frequencies did. Thus, the genotoxicity assessment of water sampled from the R-11 reservoir by means of the allium test shows the presence of a genotoxic factor in water from the reservoir. Document available in abstract form only. (authors)
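    Frequencies reported in the form "1.36 ± 0.24%" are binomial proportions (aberrant cells out of cells scored) with their standard errors. A short sketch of that calculation; the counts are hypothetical, chosen only to show the arithmetic:

    ```python
    from math import sqrt

    def aberration_frequency(aberrant: int, scored: int) -> tuple[float, float]:
        """Aberration frequency and its binomial standard error, both in percent."""
        p = aberrant / scored
        se = sqrt(p * (1 - p) / scored)
        return 100 * p, 100 * se

    # Hypothetical: 41 aberrant ana-telophases out of 3000 scored
    freq, se = aberration_frequency(41, 3000)
    print(f"{freq:.2f} ± {se:.2f}%")
    ```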

  14. A preliminary study to assess the construct validity of a cultural intelligence measure on a South African sample

    OpenAIRE

    2014-01-01

    Orientation: Cultural intelligence is an essential social competence for effective individual interaction in a cross-cultural context. The cultural intelligence scale (CQS) is used extensively for assessing cultural intelligence; nevertheless, its reliability and validity on a South African sample are yet to be ascertained. Research purpose: The purpose of the current study was to assess the construct validity of the CQS on a South African sample. The results of the psychometric assessment of...

  15. Goblet cells of the normal human bulbar conjunctiva and their assessment by impression cytology sampling.

    Science.gov (United States)

    Doughty, Michael J

    2012-07-01

    Goblet cells of the conjunctiva are the main source of mucus for the ocular surface. The objectives of this review are to consider the goblet cells as assessed by various histological, cytological and electron microscopy methods, and to assess the consistency of published reports (over more than 25 years) of goblet cell density (GCD) from impression cytology specimens from nominally healthy human subjects. Reported GCD values have been notably variable, with a range from 24 to 2226 cells/mm² for average values. Data analysis suggests that a high density of goblet cells should be expected for the healthy human conjunctiva, with a tendency toward higher values in samples taken from normally covered locations (inferior and superior bulbar conjunctiva) of the open eye (at 973 ± 789 cells/mm²) than in samples taken from exposed (interpalpebral) locations (at 427 ± 376 cells/mm²). No obvious change in GCD was found with respect to age, perhaps because the variability of the data did not allow detection of any age-related decline in GCD. Analyses of published data from 33 other sources indicated a trend for GCD to be lower than normal across a spectrum of ocular surface diseases.

  16. Impact of diastolic dysfunction severity on global left ventricular volumetric filling - assessment by automated segmentation of routine cine cardiovascular magnetic resonance

    Directory of Open Access Journals (Sweden)

    Mendoza Dorinna D

    2010-07-01

    Full Text Available Abstract Objectives To examine relationships between severity of echocardiography (echo)-evidenced diastolic dysfunction (DD) and volumetric filling by automated processing of routine cine cardiovascular magnetic resonance (CMR). Background Cine-CMR provides high-resolution assessment of left ventricular (LV) chamber volumes. Automated segmentation (LV-METRIC) yields LV filling curves by segmenting all short-axis images across all temporal phases. This study used cine-CMR to assess filling changes that occur with progressive DD. Methods 115 post-MI patients underwent CMR and echo within 1 day. LV-METRIC yielded multiple diastolic indices: E:A ratio, peak filling rate (PFR), time to peak filling rate (TPFR), and diastolic volume recovery (DVR80), the proportion of diastole required to recover 80% of stroke volume. Echo was the reference for DD. Results LV-METRIC successfully generated LV filling curves in all patients. CMR indices were reproducible (≤1% inter-reader differences) and required minimal processing time (175 ± 34 images/exam, 2:09 ± 0:51 minutes). CMR E:A ratio decreased with grade 1 and increased with grades 2-3 DD. Diastolic filling intervals, measured by DVR80 or TPFR, prolonged with grade 1 and shortened with grade 3 DD, paralleling echo deceleration time. DVR80 identified 71% of patients with echo-evidenced grade 1 but no patients with grade 3 DD, and stroke-volume-adjusted PFR identified 67% with grade 3 but none with grade 1 DD (matched specificity = 83%). The combination of DVR80 and PFR identified 53% of patients with grade 2 DD. Prolonged DVR80 was associated with grade 1 (OR 2.79, CI 1.65-4.05, p = 0.001), with a similar trend for grade 2 (OR 1.35, CI 0.98-1.74, p = 0.06), whereas high PFR was associated with grade 3 (OR 1.14, CI 1.02-1.25, p = 0.02) DD. Conclusions Automated cine-CMR segmentation can discern LV filling changes that occur with increasing severity of echo-evidenced DD. Impaired relaxation is associated with prolonged
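    The two indices named above can be computed directly from a sampled LV volume-time curve: DVR80 is the fraction of diastole at which the volume has recovered 80% of the stroke volume, and PFR is the steepest volume increase per unit time. A minimal sketch with a synthetic curve; the function and data are illustrative, not the LV-METRIC implementation:

    ```python
    def filling_indices(volumes, dt_s):
        """Diastolic indices from an LV volume-time curve (end-systole to end-diastole).
        Returns (DVR80, PFR): the fraction of diastole needed to recover 80% of the
        stroke volume, and the peak filling rate in volume units per second."""
        stroke = volumes[-1] - volumes[0]
        target = volumes[0] + 0.8 * stroke
        # Index of the first frame at or above 80% recovery, as a fraction of diastole:
        dvr80 = next(i for i, v in enumerate(volumes) if v >= target) / (len(volumes) - 1)
        # Largest frame-to-frame volume increase per second:
        pfr = max((b - a) / dt_s for a, b in zip(volumes, volumes[1:]))
        return dvr80, pfr

    # Synthetic diastolic curve (mL), 10 frames taken 50 ms apart:
    vols = [60, 75, 88, 97, 103, 107, 110, 112, 113, 114]
    dvr80, pfr = filling_indices(vols, dt_s=0.05)
    ```

    For this synthetic curve the 80% target (103.2 mL) is first reached at frame 5 of 9, so DVR80 ≈ 0.56, and the fastest filling step (15 mL in 50 ms) gives a PFR of 300 mL/s.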

  17. GY SAMPLING THEORY IN ENVIRONMENTAL STUDIES 1: ASSESSING SOIL SPLITTING PROTOCOLS

    Science.gov (United States)

    Five soil sample splitting methods (riffle splitting, paper cone riffle splitting, fractional shoveling, coning and quartering, and grab sampling) were evaluated with synthetic samples to verify Pierre Gy sampling theory expectations. Individually prepared samples consisting of l...

  18. A new strategic sampling for offshore wind assessment using radar satellite images

    Energy Technology Data Exchange (ETDEWEB)

    Beaucage, P.; Lafrance, G.; Bernier, M.; Lafrance, J. [Institut National de la Recherche Scientifique, Varennes, PQ (Canada); Choisnard, J. [Hydro-Quebec, Varennes, PQ (Canada)

    2007-07-01

    Synthetic Aperture Radar (SAR) satellite images have been used for offshore wind assessment. Several offshore wind farms are in operation or under construction in northern Europe. The European target for 2030 is 300 GW, of which half is intended for onshore and half for offshore development. Offshore projects off the east coast of the United States, in the Gulf of Mexico and off the west coast of Canada are in the planning stage. Information obtained from SAR can be used to supplement current mapping methods for offshore wind energy resources. SAR is a useful tool for localizing wind patterns over water surfaces. Other sources of offshore wind observations include meteorological stations such as buoys and masts; remote sensing instruments onboard satellites, such as scatterometers (QuikSCAT, ASCAT) or passive microwave radiometers; and numerical weather prediction models. The synergy between scatterometers and SAR was discussed. The SAR system has been used for microscale-resolution wind mapping of the Gaspe Peninsula. Strategic sampling zones were chosen in proximity to the QuikSCAT grid. It was concluded that 270 and 570 SAR images are needed to calculate the average wind speed (U) and the mean power output of a 3 MW wind turbine (P) over the Gaspe Peninsula region, respectively. It was further concluded that microscale regional wind mapping can be produced at a lower cost with strategic sampling than with random sampling. refs., tabs., figs.

  19. Determination of furan levels in commercial samples of baby food from Brazil and preliminary risk assessment.

    Science.gov (United States)

    Pavesi Arisseto, A; Vicente, E; De Figueiredo Toledo, M C

    2010-08-01

    Commercial baby food samples available on the Brazilian market (n = 31) were analysed for furan content using a gas chromatography-mass spectrometry method preceded by solid-phase microextraction. A limit of detection of 0.7 microg kg(-1), a limit of quantitation of 2.4 microg kg(-1), mean recoveries varying from 80% to 107%, and coefficients of variation ranging from 5.6% to 9.4% for repeatability and from 7.4% to 12.4% for within-laboratory reproducibility were obtained during an in-house validation. The levels of furan found in the samples were from not detected to 95.5 microg kg(-1). Samples containing vegetables and meat showed higher furan levels as compared with those containing only fruits. An exposure assessment showed furan intakes up to 2.4 microg kg(-1) body weight day(-1) (99th percentile) for babies fed exclusively with commercial baby foods. Margins of exposure obtained from intakes estimated in this work indicated a potential public health concern.
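    The "margin of exposure" used in such preliminary risk assessments is simply the ratio of a toxicological reference point (e.g. a BMDL from animal studies) to the estimated human intake, in the same units. A hedged sketch; the reference point below is illustrative and not taken from the paper:

    ```python
    def margin_of_exposure(reference_point: float, intake: float) -> float:
        """MOE = toxicological reference point / estimated intake
        (same units for both, e.g. ug per kg body weight per day)."""
        return reference_point / intake

    # Illustrative reference point (NOT from the paper) of 960 ug/kg bw/day,
    # against the 99th-percentile intake of 2.4 ug/kg bw/day reported above:
    moe = margin_of_exposure(960.0, 2.4)
    ```

    Small MOE values flag a potential public health concern; the threshold considered acceptable depends on the endpoint and the assessment framework.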

  20. Automated extraction and assessment of functional features of areal measured microstructures using a segmentation-based evaluation method

    Science.gov (United States)

    Hartmann, Wito; Loderer, Andreas

    2014-10-01

    In addition to the surface parameters currently available according to ISO 4287:2010 and ISO 25178-2:2012, which are defined particularly for stochastic surfaces, a universal evaluation procedure is provided for geometrical, well-defined, microstructured surfaces. Since several million features (such as diameters, depths, etc) can be present on microstructured surfaces, segmentation techniques are used to automate the feature-based dimensional evaluation. By applying an additional extended 3D evaluation after the segmentation and classification procedure, the accuracy of the evaluation is improved compared to the direct evaluation of segments, and additional functional parameters can be derived. Advantages of the extended segmentation-based evaluation method include not only the ability to evaluate the manufacturing process statistically (e.g. by capability indices, according to ISO 21747:2007 and ISO 3534-2:2013) and to derive statistically reliable values for the correction of microstructuring processes, but also the direct re-use of the evaluated parameters (including their statistical distributions) in simulations for the calculation of probabilities with respect to the functionality of the microstructured surface. The practical suitability of this method is demonstrated using examples of microstructures for the improvement of sliding and of ink transfer in printing machines.

  1. Assessing the Role of Automation in Managing of Iranian E-banking and its Impact on Social Benefit

    Directory of Open Access Journals (Sweden)

    Hamidreza Salmani MOJAVERI

    2011-06-01

    Full Text Available In the course of commercial development, banks have turned their attention to creating structural changes in their receiving and payment systems and to facilitating service processes for customers. In fact, one can claim that a reason for the general tendency toward electronic business is bank managers' attention to the importance and necessity of this phenomenon, which has led to their serious commitment to providing a banking structure based on electronic methods. What makes E-banking services different from conventional methods is the quantitative and qualitative expansion of customer service: E-banking prepares the ground for customers to receive wider and more diverse services. Furthermore, time and location no longer limit the delivery of services, and customers can manage their financial activities at any time and from anywhere without visiting a bank branch. The aim of this paper is to illustrate the status of banking automation and its social and organizational consequences in the Iranian E-banking system, and to provide appropriate recommendations.

  2. Heating automation

    OpenAIRE

    Tomažič, Tomaž

    2013-01-01

    This degree paper presents the use and operation of peripheral devices with a microcontroller for heating automation. The main goal is to build a quality control system for heating three floors of a house and, with it, increase the efficiency of the heating devices and lower heating expenses. The system must control a heat pump, a furnace, a boiler pump, two floor-heating pumps and two radiator pumps. For this work we chose the STM32F4-Discovery development kit with five temperature sensors, an LCD disp...

  3. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based automated process control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  4. Marketing automation

    OpenAIRE

    Raluca Dania TODOR

    2017-01-01

    The automation of the marketing process seems nowadays to be the only way to cope with the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  5. Applicability Comparison of Methods for Acid Generation Assessment of Rock Samples

    Science.gov (United States)

    Oh, Chamteut; Ji, Sangwoo; Yim, Giljae; Cheong, Youngwook

    2014-05-01

    Minerals containing various forms of sulfur can generate AMD (Acid Mine Drainage) or ARD (Acid Rock Drainage) when exposed to air and/or water, with serious effects on ecosystems and even on humans. To minimize the hazards of acid drainage, it is necessary to assess in advance the acid generation potential of rocks and to estimate the amount of acid that may be generated. Because of its relatively simple and effective procedure, the combination of ABA (Acid Base Accounting) and NAG (Net Acid Generation) test results has commonly been used to determine acid drainage conditions. The simplicity and effectiveness of this method, however, derive from sweeping assumptions of simplified chemical reactions, which often leads to samples being classified as UC (Uncertain); such samples then require additional experimental or field data to be reclassified properly. This paper therefore attempts to find the reasons that cause samples to be classified as UC and to suggest a new series of experiments by which such samples can be reclassified appropriately. Previous studies on evaluating acid generation potential and neutralization capacity were reviewed, and three individual experiments were selected in light of their applicability and their compatibility in minimizing unnecessary interference among experiments. The proposed experiments comprise sulfur speciation, ABCC (Acid Buffering Characteristic Curve) and modified NAG, which are improved versions of the existing Total S, ANC (Acid Neutralizing Capacity) and NAG tests, respectively. To assure the applicability of the experiments, 36 samples were collected from 19 sites with diverse geologies, field properties and weathering conditions. The samples were subjected to the existing experiments and, as a result, 14 samples that either were classified as UC or could serve as a comparison group were selected. Afterwards, the selected samples were used to conduct the suggested
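    The ABA/NAG screening described above reduces, in its simplest static form, to computing a net acid producing potential (NAPP) from total sulfur and acid neutralizing capacity, and cross-checking it against the NAG test pH. A simplified sketch of that convention; the factor 30.6 assumes all sulfur occurs as pyrite, and 4.5 is the commonly used NAG pH cut-off:

    ```python
    def napp(total_s_percent: float, anc: float) -> float:
        """Net acid producing potential in kg H2SO4/tonne: MPA - ANC, where
        MPA = 30.6 * total S% (assumes all sulfur is present as pyrite)."""
        return 30.6 * total_s_percent - anc

    def classify(napp_value: float, nag_ph: float) -> str:
        """Simplified ABA/NAG cross-classification."""
        if napp_value > 0 and nag_ph < 4.5:
            return "PAF"  # potentially acid forming
        if napp_value <= 0 and nag_ph >= 4.5:
            return "NAF"  # non-acid forming
        return "UC"       # uncertain: the two tests disagree

    print(classify(napp(1.2, 20.0), nag_ph=3.8))  # PAF
    print(classify(napp(0.3, 50.0), nag_ph=7.0))  # NAF
    print(classify(napp(1.2, 20.0), nag_ph=7.0))  # UC
    ```

    The third case shows exactly how UC samples arise: the two tests disagree, which is the situation the paper's additional experiments (sulfur speciation, ABCC, modified NAG) are meant to resolve.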

  6. A simulation approach to assessing sampling strategies for insect pests: an example with the balsam gall midge.

    Directory of Open Access Journals (Sweden)

    R Drew Carleton

    Full Text Available Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with "pre-sampling" data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex) attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n ∼ 100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand) was the most efficient, with sample means converging on true mean density for sample sizes of n ∼ 25-40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods.
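    The pre-sampling simulation idea can be sketched in a few lines: draw clustered (negative binomial) counts via a gamma-Poisson mixture, then watch the running sample mean converge as sample size grows. This is a minimal illustration of the approach, not the authors' software, and the parameters are illustrative rather than the midge data:

    ```python
    import math
    import random

    def poisson(lam: float, rng: random.Random) -> int:
        """Knuth's simple Poisson sampler (adequate for modest lam)."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    def neg_binomial(mean: float, k: float, rng: random.Random) -> int:
        """Negative binomial count via a gamma-Poisson mixture; k is the
        clumping parameter (small k = highly aggregated counts)."""
        lam = rng.gammavariate(k, mean / k)
        return poisson(lam, rng)

    # Simulate counts on 40 sampled trees and track the running mean density:
    rng = random.Random(42)
    counts = [neg_binomial(mean=5.0, k=0.8, rng=rng) for _ in range(40)]
    running_means = [sum(counts[:n]) / n for n in range(1, len(counts) + 1)]
    ```

    Repeating this many times shows how quickly (or slowly) the running mean settles near the true density for a given clumping parameter, which is the basis for choosing a preset sample size.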

  7. Sample-based assessment of the microbial etiology of bovine necrotic vulvovaginitis.

    Science.gov (United States)

    Blum, S; Mazuz, M; Brenner, J; Friedgut, O; Stram, Y; Koren, O; Goshen, T; Elad, D

    2007-07-15

    A semiquantitative evaluation of potential bacterial pathogens was correlated with the severity of lesions during an outbreak of bovine necrotic vulvovaginitis (BNVV) in an Israeli dairy herd. Bacteriologic examination of 287 vaginal swabs from 104 post-calving heifers showed a highly significant correlation between Porphyromonas levii colony-forming-unit numbers and the clinical scores of the lesions when assessed by an ordinal regression model. No such correlation was found for the other bacteria included in the study. Nineteen samples taken for virological examination were negative for bovine herpesviruses 1, 2, 4 and 5. Thus, the results of this study substantiate the essential role of P. levii in the etiology of BNVV and indicate that BHV4 is not required as a predisposing factor for the syndrome.

  8. Soyuz 23 Return Samples: Assessment of Air Quality Aboard the International Space Station

    Science.gov (United States)

    James, John T.

    2011-01-01

    Six mini-grab sample containers (m-GSCs) were returned aboard Soyuz 23 because of concerns that new air pollutants had been present in the air and these were getting into the water recovery system. The Total Organic Carbon Analyzer had been giving increasing readings of total organic carbon (TOC) in the potable water, and it was postulated that an increased load into the system was responsible. The TOC began to decline in late October, 2010. The toxicological assessment of 6 m-GSCs from the ISS is shown in Table 1. The recoveries of 13C-acetone, fluorobenzene, and chlorobenzene from the GSCs averaged 73, 82, and 59%, respectively. We are working to understand the sub-optimal recovery of chlorobenzene.

  9. Assessing decentering: validation, psychometric properties, and clinical usefulness of the Experiences Questionnaire in a Spanish sample.

    Science.gov (United States)

    Soler, Joaquim; Franquesa, Alba; Feliu-Soler, Albert; Cebolla, Ausias; García-Campayo, Javier; Tejedor, Rosa; Demarzo, Marcelo; Baños, Rosa; Pascual, Juan Carlos; Portella, Maria J

    2014-11-01

    Decentering is defined as the ability to observe one's thoughts and feelings in a detached manner. The Experiences Questionnaire (EQ) is a self-report instrument that originally assessed decentering and rumination. The purpose of this study was to evaluate the psychometric properties of the Spanish version of the EQ-Decentering subscale and to explore its clinical usefulness. The 11-item EQ-Decentering subscale was translated into Spanish and its psychometric properties were examined in a sample of 921 adult individuals, 231 with psychiatric disorders and 690 without. The subsample of nonpsychiatric participants was also split according to previous meditative experience (meditative participants, n=341; nonmeditative participants, n=349). Additionally, differences among these three subgroups were explored to determine the clinical validity of the scale. Finally, the EQ-Decentering was administered twice to a group of patients with borderline personality disorder, before and after a 10-week mindfulness intervention. Confirmatory factor analysis indicated acceptable model fit (sbχ² = 243.8836), with convergent validity r > .46 and divergent validity r < -.35. The scale detected changes in decentering after the mindfulness intervention (t=-4.692, p<.00001). Differences among groups were significant (F=134.8, p<.000001), with psychiatric participants showing the lowest scores compared to nonpsychiatric meditative and nonmeditative participants. The Spanish version of the EQ-Decentering is a valid and reliable instrument for assessing decentering in both clinical and nonclinical samples. In addition, the findings show that the EQ-Decentering seems an adequate outcome instrument for detecting changes after mindfulness-based interventions.

  10. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been studied extensively in recent years due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? And what is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  11. Micrometeoroid and Orbital Debris Threat Assessment: Mars Sample Return Earth Entry Vehicle

    Science.gov (United States)

    Christiansen, Eric L.; Hyde, James L.; Bjorkman, Michael D.; Hoffman, Kevin D.; Lear, Dana M.; Prior, Thomas G.

    2011-01-01

    This report provides results of a Micrometeoroid and Orbital Debris (MMOD) risk assessment of the Mars Sample Return Earth Entry Vehicle (MSR EEV). The assessment was performed using standard risk assessment methodology illustrated in Figure 1-1. Central to the process is the Bumper risk assessment code (Figure 1-2), which calculates the critical penetration risk based on geometry, shielding configurations and flight parameters. The assessment process begins by building a finite element model (FEM) of the spacecraft, which defines the size and shape of the spacecraft as well as the locations of the various shielding configurations. This model is built using the NX I-deas software package from Siemens PLM Software. The FEM is constructed using triangular and quadrilateral elements that define the outer shell of the spacecraft. Bumper-II uses the model file to determine the geometry of the spacecraft for the analysis. The next step of the process is to identify the ballistic limit characteristics for the various shield types. These ballistic limits define the critical size particle that will penetrate a shield at a given impact angle and impact velocity. When the finite element model is built, each individual element is assigned a property identifier (PID) to act as an index for its shielding properties. Using the ballistic limit equations (BLEs) built into the Bumper-II code, the shield characteristics are defined for each and every PID in the model. The final stage of the analysis is to determine the probability of no penetration (PNP) on the spacecraft. This is done using the micrometeoroid and orbital debris environment definitions that are built into the Bumper-II code. These engineering models take into account orbit inclination, altitude, attitude and analysis date in order to predict an impacting particle flux on the spacecraft. Using the geometry and shielding characteristics previously defined for the spacecraft and combining that information with the
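    The probability of no penetration computed by such risk codes follows from treating critical impacts as a Poisson process: PNP = exp(-N), where N is the expected number of critical impacts (critical flux × exposed area × exposure time). A minimal sketch of that final step with illustrative values, not the actual MSR EEV numbers or the Bumper-II environment models:

    ```python
    import math

    def prob_no_penetration(critical_flux_m2_yr: float, area_m2: float, years: float) -> float:
        """PNP = exp(-N) for a Poisson-distributed count of critical MMOD impacts,
        where N = critical flux x exposed area x exposure time."""
        n_expected = critical_flux_m2_yr * area_m2 * years
        return math.exp(-n_expected)

    # Illustrative values (not from the MSR EEV assessment):
    pnp = prob_no_penetration(critical_flux_m2_yr=1e-4, area_m2=3.0, years=1.5)
    print(f"PNP = {pnp:.5f}")  # PNP = 0.99955
    ```

    In a real Bumper-style analysis the flux is not a single number: it is integrated over particle size, velocity and direction against each shield's ballistic limit, element by element, before summing the expected impacts.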

  12. Philadelphia Brief Assessment of Cognition in healthy and clinical Brazilian sample

    Directory of Open Access Journals (Sweden)

    Danilo Assis Pereira

    2012-03-01

    The Philadelphia Brief Assessment of Cognition (PBAC) is a neuropsychological screening instrument that assesses five cognitive domains: working memory, visuospatial functioning, language, episodic memory and comportment. The aim was to verify whether the PBAC can properly be used in a Brazilian sample. Participants were: (a) 200 healthy volunteers, 100 young [21.6 (2.5) years old] and 100 older adults [70.1 (7.3) years old], with >12 years of education; and (b) 30 Alzheimer's disease (AD) patients [73.7 (5.7) years old] with 4-11 years of education. The PBAC scores, (a) 95.8 (2.6) and 90.0 (4.4) and (b) 65.0 (10.8), were correlated with the Mini-Mental State Examination (MMSE) scores of the young [29.1 (0.9)], older adult [28.3 (1.4)] and AD [18.4 (3.0)] groups. A positive correlation between MMSE and PBAC (r=0.9, p<0.001) was found. Negative correlations were observed between PBAC domains [memory (-0.63), visuospatial abilities (-0.44) and working memory (-0.3) tasks]. MANOVA showed better male performance in visuospatial functioning (F=8.5, p=0.004). The Brazilian version of the PBAC proved to be a promising screening instrument for clinical purposes.

  13. Violence risk assessment and women: predictive accuracy of the HCR-20 in a civil psychiatric sample.

    Science.gov (United States)

    Garcia-Mansilla, Alexandra; Rosenfeld, Barry; Cruise, Keith R

    2011-01-01

    Research to date has not adequately demonstrated whether the HCR-20 Violence Risk Assessment Scheme (HCR-20; Webster, Douglas, Eaves, & Hart, 1997), a structured violence risk assessment measure with a robust literature supporting its validity in male samples, is a valid indicator of violence risk in women. This study utilized data from the MacArthur Study of Mental Disorder and Violence to retrospectively score an abbreviated version of HCR-20 in 827 civil psychiatric patients. HCR-20 scores and predictive accuracy of community violence were compared for men and women. Results suggested that the HCR-20 is slightly, but not significantly, better for evaluating future risk for violence in men than in women, although the magnitude of the gender differences was small and was largely limited to historical factors. The results do not indicate that the HCR-20 needs to be tailored for use in women or that it should not be used in women, but they do highlight that the HCR-20 should be used cautiously and with full awareness of its potential limitations in women.

  14. Normative data for the Montreal Cognitive Assessment in an Italian population sample.

    Science.gov (United States)

    Santangelo, Gabriella; Siciliano, Mattia; Pedone, Roberto; Vitale, Carmine; Falco, Fabrizia; Bisogno, Rossella; Siano, Pietro; Barone, Paolo; Grossi, Dario; Santangelo, Franco; Trojano, Luigi

    2015-04-01

    The Montreal Cognitive Assessment (MoCA) is a rapid screening battery that also includes subtests to assess frontal functions such as set-shifting, abstraction and cognitive flexibility. The MoCA appears useful for identifying non-amnestic mild cognitive impairment (MCI) and subcortical dementia; it has high sensitivity and specificity in distinguishing MCI from mild Alzheimer's disease. Previous studies revealed that certain items of the MoCA may be culturally biased and highlighted the need for population-based norms. The aim of the present study was to collect normative values in a sample of Italian healthy subjects. Four hundred and fifteen Italian healthy subjects (252 women and 163 men) of different ages (range 21-95 years) and educational levels (from primary to university) underwent the MoCA and the Mini-Mental State Examination (MMSE). Multiple linear regression analysis revealed that age and education significantly influenced performance on the MoCA; no significant effect of gender was found. From the derived linear equation, a correction grid for MoCA raw scores was built. The inferential cut-off score, estimated using a non-parametric technique, is 15.5, and equivalent scores were computed. Correlation analysis showed a significant but weak correlation between MoCA adjusted scores and MMSE adjusted scores (r = 0.43, p < 0.001). The present study provides normative data for the MoCA in an Italian population, useful for both clinical and research purposes.
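The correction grid described above comes from a linear regression of raw scores on age and education; each subject's score is then adjusted by the demographic offset predicted for them. A sketch of that procedure on synthetic data (the score model and coefficients are invented; only the adjustment logic follows the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic normative sample: age, years of education, raw MoCA-like score
age = rng.uniform(21, 95, 415)
edu = rng.uniform(5, 18, 415)
raw = np.clip(30 - 0.05 * (age - 50) + 0.3 * (edu - 10)
              + rng.normal(0, 1, 415), 0, 30)

# Fit score ~ b0 + b1*age + b2*education, as in the normative study
X = np.column_stack([np.ones_like(age), age, edu])
beta, *_ = np.linalg.lstsq(X, raw, rcond=None)

def adjusted_score(raw_score, subject_age, subject_edu):
    """Remove the demographic effect so scores are comparable across subjects."""
    expected = beta[0] + beta[1] * subject_age + beta[2] * subject_edu
    grand_mean = beta[0] + beta[1] * age.mean() + beta[2] * edu.mean()
    return raw_score - (expected - grand_mean)
```

A subject at the sample's mean age and education receives no adjustment, while older or less educated subjects receive a positive correction before comparison against the cut-off.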

  15. Osteoporosis Self-Assessment Tool Performance in a Large Sample of Postmenopausal Women of Mendoza, Argentina

    Directory of Open Access Journals (Sweden)

    Fernando D. Saraví

    2013-01-01

    The Osteoporosis Self-assessment Tool (OST) is a clinical instrument designed to select patients at risk of osteoporosis who would benefit from a bone mineral density measurement. The OST takes into account only the age and weight of the subject. It was developed for Asian women and later validated for European and North American white women. The performance of the OST was assessed in a sample of 4343 women from Greater Mendoza, a large metropolitan area of Argentina. Dual X-ray absorptiometry (DXA) scans of the lumbar spine and hip were obtained. Patients were classified as either osteoporotic (n=1830) or nonosteoporotic (n=2513) according to their lowest T-score at any site. Osteoporotic patients had lower OST scores (P<0.0001). A receiver operating characteristic (ROC) curve showed an area under the curve of 71% (P<0.0001), with a sensitivity of 83.7% and a specificity of 44% for a cut-off value of 2. The positive predictive value was 52% and the negative predictive value was 79%. The odds ratio for the diagnosis of osteoporosis was 4.06 (95% CI: 3.51-4.71; P<0.0001). It is concluded that the OST is useful for selecting postmenopausal women for DXA testing in the studied population.
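The abstract notes that the OST uses only age and weight. In the general OST literature the index is computed as the integer truncation of 0.2 × (weight in kg − age in years); that formula is an assumption here, not stated in the abstract, while the cut-off of 2 is the one reported above:

```python
def ost_score(weight_kg, age_years):
    """OST index: 0.2 * (weight - age), truncated to an integer.
    Formula taken from the general OST literature (assumption)."""
    return int(0.2 * (weight_kg - age_years))

def refer_for_dxa(weight_kg, age_years, cutoff=2):
    """Flag patients below the cut-off (2 in the Mendoza study) for DXA."""
    return ost_score(weight_kg, age_years) < cutoff

print(ost_score(60, 70))     # -> -2
print(refer_for_dxa(60, 70)) # -> True
```

Lower scores (lighter, older patients) indicate higher risk, which is why the referral test is a less-than comparison.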

  16. Colorimetric assessment of BCR-ABL1 transcripts in clinical samples via gold nanoprobes.

    Science.gov (United States)

    Vinhas, Raquel; Correia, Cláudia; Ribeiro, Patricia; Lourenço, Alexandra; Botelho de Sousa, Aida; Fernandes, Alexandra R; Baptista, Pedro V

    2016-07-01

    Gold nanoparticles functionalized with thiolated oligonucleotides (Au-nanoprobes) have been used in a range of applications for the detection of bioanalytes of interest, from ions to proteins and DNA targets. These detection strategies are based on the unique optical properties of gold nanoparticles, in particular, the intense color that is subject to modulation by modification of the medium dielectric. Au-nanoprobes have been applied for the detection and characterization of specific DNA sequences of interest, namely pathogens and disease biomarkers. Nevertheless, despite its relevance, only a few reports exist on the detection of RNA targets. Among these strategies, the colorimetric detection of DNA has been proven to work for several different targets in controlled samples, but demonstration in real clinical bioanalysis has been elusive. Here, we used a colorimetric method based on Au-nanoprobes for the direct detection of the e14a2 BCR-ABL fusion transcript in myeloid leukemia patient samples without the need for reverse transcription. Au-nanoprobes directly assessed total RNA from 38 clinical samples, and results were validated against reverse transcription-nested polymerase chain reaction (RT-nested PCR) and reverse transcription-quantitative polymerase chain reaction (RT-qPCR). The colorimetric Au-nanoprobe assay is a simple yet reliable strategy to scrutinize myeloid leukemia patients at diagnosis and evaluate progression, with obvious advantages in terms of time and cost, particularly in low- to medium-income countries where molecular screening is not routinely feasible. Graphical abstract Gold nanoprobe for colorimetric detection of BCR-ABL1 fusion transcripts originating from the Philadelphia chromosome.

  17. Assessment of DDT and DDE levels in soil, dust, and blood samples from Chihuahua, Mexico.

    Science.gov (United States)

    Martínez, Fernando Díaz-Barriga; Trejo-Acevedo, Antonio; Betanzos, Angel F; Espinosa-Reyes, Guillermo; Alegría-Torres, Jorge Alejandro; Maldonado, Iván Nelinho Pérez

    2012-02-01

    The aim of this study was to assess levels of DDT and DDE in two environmental matrices (soil and dust) and to investigate the blood levels of these insecticides in exposed children living in a north Mexican state (Chihuahua) where DDT was sprayed several years ago during (1) health campaigns for the control of malaria and (2) agricultural activities. DDT and DDE were analyzed by gas chromatography/mass spectrometry. In general, lower levels were found in household outdoor samples. The levels in outdoor samples ranged from 0.001 to 0.788 mg/kg for DDT and from 0.001 to 0.642 mg/kg for DDE. The levels in indoor samples ranged from 0.001 to 15.47 mg/kg for DDT and from 0.001 to 1.063 mg/kg for DDE. Similar results to those found in indoor soil were found in dust, in which the levels ranged from 0.001 to 95.87 mg/kg for DDT and from 0.001 to 0.797 mg/kg for DDE. Moreover, blood levels showed that all of the communities studied had been exposed to DDT and/or DDE, indicating a general past or present exposure to DDT. It is important to note that the quotient DDT/DDE in all matrices was always >1. Whether the people living in our study area are at risk is an issue that deserves further analysis. However, applying precautionary principles, it is important to initiate a risk-reduction program to decrease exposure to DDT and its metabolites in people living in this area.

  18. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wengert, G.J.; Helbich, T.H.; Woitek, R.; Kapetas, P.; Clauser, P.; Baltzer, P.A. [Medical University of Vienna/ Vienna General Hospital, Department of Biomedical Imaging and Image-guided Therapy, Division of Molecular and Gender Imaging, Vienna (Austria); Vogl, W.D. [Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Computational Imaging Research Lab, Wien (Austria); Weber, M. [Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Division of General and Pediatric Radiology, Wien (Austria); Meyer-Baese, A. [State University of Florida, Department of Scientific Computing in Medicine, Tallahassee, FL (United States); Pinker, Katja [Medical University of Vienna/ Vienna General Hospital, Department of Biomedical Imaging and Image-guided Therapy, Division of Molecular and Gender Imaging, Vienna (Austria); State University of Florida, Department of Scientific Computing in Medicine, Tallahassee, FL (United States); Memorial Sloan-Kettering Cancer Center, Department of Radiology, Molecular Imaging and Therapy Services, New York City, NY (United States)

    2016-11-15

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. (orig.)
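The inter-/intra-observer agreement above is quantified with Cohen's kappa (k), which measures agreement between two raters beyond what chance alone would produce. A self-contained sketch of the statistic (the BI-RADS-style label sequences are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical density categories (a-d) assigned by two readers
r1 = list("aabbccddabcd")
r2 = list("aabbccddabdc")
print(round(cohens_kappa(r1, r2), 3))  # -> 0.778
```

Values around 0.4-0.6 are conventionally read as "moderate" and 0.6-0.8 as "substantial" agreement, the bands used in the abstract.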

  19. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    21 CFR 864.5680 - Automated heparin analyzer. (a) Identification. An automated heparin analyzer is a device used to determine the heparin level in a blood sample by mixing the sample with protamine (a...

  20. Ecological Rationality in Teachers' Conceptions of Assessment across Samples from Cyprus and New Zealand

    Science.gov (United States)

    Brown, Gavin Thomas Lumsden; Michaelides, Michalis P.

    2011-01-01

    Teacher conceptions of assessment are influential mediators of how assessment policy initiatives are implemented in schools. Four hierarchical, intercorrelated factors (i.e., assessment for improvement, school accountability, student accountability, and assessment as irrelevant) of how teachers conceive of assessment have been reported.…

  1. Improvement in the detection of enteric protozoa from clinical stool samples using the automated urine sediment analyzer sediMAX(®) 2 compared to sediMAX(®) 1.

    Science.gov (United States)

    Intra, J; Sala, M R; Falbo, R; Cappellini, F; Brambilla, P

    2017-01-01

    Detection of intestinal parasites in fecal samples is routinely performed by direct wet mount examination. This method requires skilled personnel and is time consuming. The aim of this work is to demonstrate the usefulness of the newer automated urinary sediment analyser sediMAX 2 for fast detection of intestinal protozoa in stool samples. A total of 700 consecutively preserved samples, consisting of 70 positives and 630 negatives, were analyzed. SediMAX 2 takes digital images of each sediment sample, and analysis was conducted on a dilution of the stool specimens, allowing determination of typical morphology. Compared to manual microscopy, sediMAX 2 showed a sensitivity and specificity of 100 % in the detection of intestinal parasites, as also recently demonstrated for sediMAX 1. However, all clinically important human protozoa were detected using only 15 images per specimen, compared to the 30 images required in sediMAX 1 analysis. Moreover, by manually changing the focus, it is possible to distinguish the morphologically identical Entamoeba complex members, including the pathogenic E. histolytica and the non-pathogenic E. dispar, E. moshkovskii and E. bangladeshi, from the non-pathogenic Entamoeba coli, based on the number of nuclei present in the cells. This study presents sediMAX 2 as an automated aid to traditional microscopy.

  2. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Ruiz, Tomas [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain)]. E-mail: tpr@um.es; Martinez-Lozano, Carmen [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain); Garcia, Maria Dolores [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain)

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is on-line generated by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 {mu}g mL{sup -1} of propoxur, with a detection limit of 5 ng mL{sup -1}. The repeatability was 0.82% expressed as relative standard deviation (n = 10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL{sup -1} levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L{sup -1} using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 {mu}g kg{sup -1}.

  3. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection.

    Science.gov (United States)

    Pérez-Ruiz, Tomás; Martínez-Lozano, Carmen; García, María Dolores

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is on-line generated by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 microg mL(-1) of propoxur, with a detection limit of 5 ng mL(-1). The repeatability was 0.82% expressed as relative standard deviation (n=10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL(-1) levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L(-1) using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 microg kg(-1).

  4. A comparison of three macroinvertebrate sampling devices for use in conducting rapid-assessment procedures of Delmarva Peninsula wetlands

    Science.gov (United States)

    Lowe, Terrence (Peter); Tebbs, Kerry; Sparling, Donald W.

    2016-01-01

    Three types of macroinvertebrate collecting devices, Gerking box traps, D-shaped sweep nets, and activity traps, have commonly been used to sample macroinvertebrates when conducting rapid biological assessments of North American wetlands. We compared collections of macroinvertebrates identified to the family level made with these devices in 6 constructed and 2 natural wetlands on the Delmarva Peninsula of Maryland. We also assessed their potential efficacy in comparisons among wetlands using several proportional and richness attributes. Differences in median diversity among samples from the 3 devices were significant; the sweep-net samples had the greatest diversity and the activity-trap samples had the least diversity. Differences in median abundance were not significant between the Gerking box-trap samples and sweep-net samples, but median abundance among activity-trap samples was significantly lower than among samples of the other 2 devices. Within samples, the proportions of median diversity composed of major class and order groupings were similar among the 3 devices. However, the proportions of median abundance composed of the major class and order groupings within activity-trap samples were not similar to those of the other 2 devices. There was a slight but significant increase in the total number of families captured when we combined activity-trap samples with Gerking box-trap samples or with sweep-net samples, and the per-sample median number of families of the combined activity-trap and sweep-net samples was significantly higher than that of the combined activity-trap and Gerking box-trap samples. We detected significant differences among wetlands for 4 macroinvertebrate attributes with the Gerking box-trap data, 6 attributes with the sweep-net data, and 5 attributes with the activity-trap data. A small but significant increase in the number of attributes showing differences among wetlands occurred when we combined activity-trap samples with those of the other two devices.

  5. The accuracy of platelet counting in thrombocytopenic blood samples distributed by the UK National External Quality Assessment Scheme for General Haematology.

    Science.gov (United States)

    De la Salle, Barbara J; McTaggart, Paul N; Briggs, Carol; Harrison, Paul; Doré, Caroline J; Longair, Ian; Machin, Samuel J; Hyde, Keith

    2012-01-01

    A knowledge of the limitations of automated platelet counting is essential for the effective care of thrombocytopenic patients and management of platelet stocks for transfusion. For this study, 29 external quality assessment specimen pools with platelet counts between 5 and 64 × 10(9)/L were distributed to more than 1,100 users of 23 different hematology analyzer models. The same specimen pools were analyzed by the international reference method (IRM) for platelet counting at 3 reference centers. The IRM values were on average lower than the all-methods median values returned by the automated analyzers. The majority (~67%) of the automated analyzer results overestimated the platelet count compared with the IRM, with significant differences in 16.5% of cases. Performance differed between analyzer models. The observed differences may depend in part on the nature of the survey material and analyzer technology, but the findings have implications for the interpretation of platelet counts at levels of clinical decision making.

  6. STS 119 Return Samples: Assessment of Air Quality aboard the Shuttle (STS-119) and International Space Station (15A)

    Science.gov (United States)

    James, John T.

    2009-01-01

    The toxicological assessments of 2 grab sample canisters (GSCs) from the Shuttle are reported. Analytical methods have not changed from earlier reports. The recoveries of the 3 surrogates (C-13-acetone, fluorobenzene, and chlorobenzene) from the 2 GSCs averaged 106, 106, and 101 %,respectively. Based on the end-of-mission sample, the Shuttle atmosphere was acceptable for human respiration.

  7. An automated image analysis framework for segmentation and division plane detection of single live Staphylococcus aureus cells which can operate at millisecond sampling time scales using bespoke Slimfield microscopy

    CERN Document Server

    Wollman, Adam J M; Foster, Simon; Leake, Mark C

    2016-01-01

    Staphylococcus aureus is an important pathogen, giving rise to antimicrobial resistance in cell strains such as Methicillin Resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing analytical tools of image analysis to detect cellular and subcellular morphological features relevant to cell division from millisecond time scale sampled images of live pathogens at a detection precision of single molecules. We demonstrate this approach using a fluorescent reporter GFP fused to the protein EzrA that localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials which target the cell division machinery, but may also have more general application in detecting morphological...

  8. Visual and automated assessment of matrix metalloproteinase-14 tissue expression for the evaluation of ovarian cancer prognosis.

    Science.gov (United States)

    Trudel, Dominique; Desmeules, Patrice; Turcotte, Stéphane; Plante, Marie; Grégoire, Jean; Renaud, Marie-Claude; Orain, Michèle; Bairati, Isabelle; Têtu, Bernard

    2014-10-01

    The purpose of this study was to evaluate whether membrane type 1 matrix metalloproteinase (MMP-14, or MT1-MMP) tissue expression, as assessed visually on digital slides and by digital image analysis, could predict outcomes in women with ovarian carcinoma. Tissue microarrays from a cohort of 211 women with ovarian carcinoma who underwent debulking surgery between 1993 and 2006 at the CHU de Québec (Canada) were immunostained for MMP-14. The percentage of MMP-14 staining was assessed visually and with the Calopix software. Progression was evaluated using the CA-125 and/or the RECIST criteria according to the GCIG criteria. Dates of death were obtained by record linkage with the Québec mortality files. Adjusted hazard ratios of death and progression with their 95% confidence intervals were estimated using the Cox model. Comparisons between the two modalities of MMP-14 assessment were done using box plots and the Kruskal-Wallis test. The highest levels of MMP-14 immunostaining were associated with nonserous histology, early FIGO stage, and low preoperative CA-125 levels. High MMP-14 expression (>40% of MMP-14-positive cells) was inversely associated with progression using visual assessment (hazard ratio=0.39; 95% confidence interval: 0.18-0.82). A similar association was observed with the highest quartile of MMP-14-positive area assessed by digital image analysis (hazard ratio=0.48; 95% confidence interval: 0.28-0.82). After adjustment for standard prognostic factors, these associations were no longer significant in the ovarian carcinoma cohort. However, in women with serous carcinoma, the highest quartile of MMP-14-positive area was associated with progression (adjusted hazard ratio=0.48; 95% confidence interval: 0.24-0.99). There was no association with overall survival. The digital image analysis of MMP-14-positive area matched the visual assessment using three categories (>40% vs 21-40% vs <20%). Higher levels of MMP-14 immunostaining were associated with standard prognostic factors.

  9. Local Adaptation in European Firs Assessed through Extensive Sampling across Altitudinal Gradients in Southern Europe

    Science.gov (United States)

    Postolache, Dragos; Lascoux, Martin; Drouzas, Andreas D.; Källman, Thomas; Leonarduzzi, Cristina; Liepelt, Sascha; Piotti, Andrea; Popescu, Flaviu; Roschanski, Anna M.; Zhelev, Peter; Fady, Bruno; Vendramin, Giovanni Giuseppe

    2016-01-01

    Background Local adaptation is a key driver of phenotypic and genetic divergence at loci responsible for adaptive traits variations in forest tree populations. Its experimental assessment requires rigorous sampling strategies such as those involving population pairs replicated across broad spatial scales. Methods A hierarchical Bayesian model of selection (HBM) that explicitly considers both the replication of the environmental contrast and the hierarchical genetic structure among replicated study sites is introduced. Its power was assessed through simulations and compared to classical ‘within-site’ approaches (FDIST, BAYESCAN) and a simplified, within-site, version of the model introduced here (SBM). Results HBM demonstrates that hierarchical approaches are very powerful to detect replicated patterns of adaptive divergence with low false-discovery (FDR) and false-non-discovery (FNR) rates compared to the analysis of different sites separately through within-site approaches. The hypothesis of local adaptation to altitude was further addressed by analyzing replicated Abies alba population pairs (low and high elevations) across the species’ southern distribution range, where the effects of climatic selection are expected to be the strongest. For comparison, a single population pair from the closely related species A. cephalonica was also analyzed. The hierarchical model did not detect any pattern of adaptive divergence to altitude replicated in the different study sites. Instead, idiosyncratic patterns of local adaptation among sites were detected by within-site approaches. Conclusion Hierarchical approaches may miss idiosyncratic patterns of adaptation among sites, and we strongly recommend the use of both hierarchical (multi-site) and classical (within-site) approaches when addressing the question of adaptation across broad spatial scales. PMID:27392065

  10. Assessment of outdoor radiofrequency electromagnetic field exposure through hotspot localization using kriging-based sequential sampling.

    Science.gov (United States)

    Aerts, Sam; Deschrijver, Dirk; Verloock, Leen; Dhaene, Tom; Martens, Luc; Joseph, Wout

    2013-10-01

    In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF) in an extensive urban region (or, hotspots), and that would allow local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots, while at the same time developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method makes use of an iterative sampling strategy that selects new measurement locations at spots which are deemed to contain the most valuable information-inside hotspots or in search of them-based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium with a size of approximately 1 km2. In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2dB (factor 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96.
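The iterative strategy described above, fit a kriging surrogate and then measure next wherever the model's prediction uncertainty is largest, can be sketched in a few lines. This is a purely uncertainty-driven variant (the paper balances hotspot exploitation with exploration), and the 1-D exposure field, kernel length scale and point counts below are all hypothetical:

```python
import numpy as np

def rbf(a, b, length=0.1):
    """Squared-exponential covariance between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

def krige(x_obs, y_obs, x_grid, noise=1e-6):
    """Simple-kriging posterior mean and variance on a prediction grid."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    k_star = rbf(x_grid, x_obs)
    mean = k_star @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.einsum('ij,ij->i', k_star, np.linalg.solve(K, k_star.T).T)
    return mean, var

def field(x):  # hypothetical exposure field with a "hotspot" near x = 0.7
    return 1.0 + 2.0 * np.exp(-((x - 0.7) / 0.05)**2)

x_grid = np.linspace(0, 1, 201)
x_obs = np.array([0.1, 0.5, 0.9])   # initial measurement locations
y_obs = field(x_obs)
for _ in range(10):                  # measure where the model is least certain
    mean, var = krige(x_obs, y_obs, x_grid)
    x_next = x_grid[np.argmax(var)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, field(x_next))
```

Each iteration shrinks the maximum posterior variance, so the surrogate converges on the field with far fewer measurements than a uniform grid of the same accuracy.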

  11. Detection of Giardia lamblia, Cryptosporidium spp. and Entamoeba histolytica in clinical stool samples by using multiplex real-time PCR after automated DNA isolation

    NARCIS (Netherlands)

    Van Lint, P; Rossen, J W; Vermeiren, S; Ver Elst, K; Weekx, S; Van Schaeren, J; Jeurissen, A

    2013-01-01

    Diagnosis of intestinal parasites in stool samples is generally still carried out by microscopy; however, this technique is known to suffer from low sensitivity and is unable to discriminate between certain protozoa. In order to overcome these limitations, a real-time multiplex PCR was evaluated as an alternative.

  12. Photographic capture-recapture sampling for assessing populations of the Indian gliding lizard Draco dussumieri.

    Science.gov (United States)

    Sreekar, Rachakonda; Purushotham, Chetana B; Saini, Katya; Rao, Shyam N; Pelletier, Simon; Chaplod, Saniya

    2013-01-01

    The usage of invasive tagging methods to assess lizard populations has often been criticised, due to the potential negative effects of marking, which possibly cause increased mortality or altered behaviour. The development of safe, less invasive techniques is essential for improved ecological study and conservation of lizard populations. In this study, we describe a photographic capture-recapture (CR) technique for estimating Draco dussumieri (Agamidae) populations. We used photographs of the ventral surface of the patagium to identify individuals. To establish that the naturally occurring blotches remained constant through time, we compared capture and recapture photographs of 45 pen-marked individuals after a 30 day interval. No changes in blotches were observed and individual lizards could be identified with 100% accuracy. The population density of D. dussumieri in a two hectare areca-nut plantation was estimated using the CR technique with ten sampling occasions over a ten day period. The resulting recapture histories for 24 individuals were analysed using population models in the program CAPTURE. All models indicated that nearly all individuals were captured. The estimated probability for capturing D. dussumieri on at least one occasion was 0.92 and the estimated population density was 13±1.65 lizards/ha. Our results demonstrate the potential for applying CR to population studies in gliding lizards (Draco spp.) and other species with distinctive markings.
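
    As a simpler point of reference than the closed-population models in program CAPTURE, a two-occasion photographic capture-recapture survey can be analysed with the Chapman-corrected Lincoln-Petersen estimator; the counts below are hypothetical, not the study's data.

```python
def lincoln_petersen(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimate of population size.
    n1: individuals photographed (marked) on occasion 1,
    n2: individuals photographed on occasion 2,
    m2: of those, individuals already known from occasion 1."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return n_hat, var ** 0.5  # estimate and its standard error

# Hypothetical two-occasion survey of a 2 ha plantation plot.
n_hat, se = lincoln_petersen(n1=20, n2=18, m2=15)
density = n_hat / 2.0  # lizards per hectare
```
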

  13. Radiometric assessment of natural radioactivity levels of agricultural soil samples collected in Dakahlia, Egypt.

    Science.gov (United States)

    Issa, Shams A M

    2013-01-01

    Determination of the natural radioactivity has been carried out, by using a gamma-ray spectrometry [NaI (Tl) 3″ × 3″] system, in surface soil samples collected from various locations in Dakahlia governorate, Egypt. These locations form the agriculturally important regions of Egypt. The study area has many industries such as chemical, paper, organic fertilisers and construction materials, and the soils of the study region are used as a construction material. Therefore, it becomes necessary to study the natural radioactivity levels in soil to assess the dose for the population in order to know the health risks. The activity concentrations of (226)Ra, (232)Th and (40)K in the soil ranged from 5.7 ± 0.3 to 140 ± 7, from 9.0 ± 0.4 to 139 ± 7 and from 22 ± 1 to 319 ± 16 Bq kg(-1), respectively. The absorbed dose rate, annual effective dose rate, radium equivalent (Req), excess lifetime cancer risk, hazard indices (Hex and Hin) and annual gonadal dose equivalent, which resulted from the natural radionuclides in the soil were calculated.
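
    The derived quantities listed (absorbed dose rate, annual effective dose, radium equivalent, hazard index) follow from the activity concentrations via standard UNSCEAR-style conversion coefficients; the sketch below uses those published coefficients but hypothetical input values, and is illustrative rather than the paper's exact calculation.

```python
def soil_dose_indices(c_ra, c_th, c_k):
    """Radiological indices from soil activity concentrations (Bq/kg)
    of 226Ra, 232Th and 40K, using UNSCEAR 2000 coefficients."""
    d = 0.462 * c_ra + 0.604 * c_th + 0.0417 * c_k  # absorbed dose rate in air, nGy/h
    aed = d * 8760 * 0.2 * 0.7 * 1e-6               # outdoor annual effective dose, mSv/y
    ra_eq = c_ra + 1.43 * c_th + 0.077 * c_k        # radium equivalent activity, Bq/kg
    h_ex = c_ra / 370 + c_th / 259 + c_k / 4810     # external hazard index (should be < 1)
    return {"D": d, "AED": aed, "Raeq": ra_eq, "Hex": h_ex}

# Hypothetical mid-range sample from the reported concentration intervals.
idx = soil_dose_indices(c_ra=30.0, c_th=30.0, c_k=300.0)
```
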

  14. ASSESSMENT OF PRESENCE AND LEACHING TOXIC METALS IN SAMPLES OF THREE TRADEMARKS OF ALGINATES

    Directory of Open Access Journals (Sweden)

    Fabíole Jordana Los

    2013-07-01

    Full Text Available A material widely used in dental offices for making dental impressions is alginate. The objective of this work was to study three different brands of alginate by X-ray fluorescence and X-ray diffraction, and by extractions using the Tessier method followed by determination of the metal concentrations in the extracts by atomic absorption spectrometry. Through X-ray fluorescence, it was observed that the most representative elements in the samples are Si, Ca, K, S, Al, Mg, Fe and P, and that the sum of these elements reaches values above 97%. By X-ray diffraction, it was observed that the strongest peak was indicative of a higher concentration of SiO2 (cristobalite), and the compounds found in all specimens were CaSO4.2H2O and SiO2 (cristobalite). The assessment of metal mobilization (leaching) performed by the Tessier method, and its comparison with CONAMA 430/11, has shown potential risks from improper disposal of this material in the environment, due to concentrations of lead above the PMV in the fractions bound to carbonates and organic matter.

  15. Photographic capture-recapture sampling for assessing populations of the Indian gliding lizard Draco dussumieri.

    Directory of Open Access Journals (Sweden)

    Rachakonda Sreekar

    Full Text Available The usage of invasive tagging methods to assess lizard populations has often been criticised, due to the potential negative effects of marking, which possibly cause increased mortality or altered behaviour. The development of safe, less invasive techniques is essential for improved ecological study and conservation of lizard populations. In this study, we describe a photographic capture-recapture (CR) technique for estimating Draco dussumieri (Agamidae) populations. We used photographs of the ventral surface of the patagium to identify individuals. To establish that the naturally occurring blotches remained constant through time, we compared capture and recapture photographs of 45 pen-marked individuals after a 30 day interval. No changes in blotches were observed and individual lizards could be identified with 100% accuracy. The population density of D. dussumieri in a two hectare areca-nut plantation was estimated using the CR technique with ten sampling occasions over a ten day period. The resulting recapture histories for 24 individuals were analysed using population models in the program CAPTURE. All models indicated that nearly all individuals were captured. The estimated probability for capturing D. dussumieri on at least one occasion was 0.92 and the estimated population density was 13±1.65 lizards/ha. Our results demonstrate the potential for applying CR to population studies in gliding lizards (Draco spp.) and other species with distinctive markings.

  16. PREVALENCE AND ANTIMICROBIAL RESISTANCE ASSESSMENT OF SUBCLINICAL MASTITIS IN MILK SAMPLES FROM SELECTED DAIRY FARMS

    Directory of Open Access Journals (Sweden)

    Murugaiyah Marimuthu

    2014-01-01

    Full Text Available This study was conducted in order to determine the prevalence and bacteriological assessment of subclinical mastitis and antimicrobial resistance of bacterial isolates from dairy cows in different farms around Selangor, Malaysia. A total of 120 milk samples from 3 different farms were randomly collected and tested for subclinical mastitis using the California Mastitis Test (CMT), as well as for bacterial culture for isolation, identification and antimicrobial resistance. The most prevalent bacteria were Staphylococcus sp. (55%), followed by Bacillus sp. (21%) and Corynebacterium sp. (7%); Yersinia sp. and Neisseria sp. both showed 5% prevalence; other species with prevalence below 5% were Acinetobacter sp., Actinobacillus sp., Vibrio sp., Pseudomonas sp., E. coli, Klebsiella sp. and Chromobacter sp. Selected Staphylococcus sp. showed a mean antimicrobial resistance of 73.3% to Ampicillin, 26.7% each to Penicillin, Methicillin and Compound Sulphonamide, 20% to Oxacillin, Amoxycillin and Cefuroxime, 13.3% to Polymyxin B, Erythromycin, Ceftriaxone and Azithromycin, and 6.7% each to Streptomycin, Clindamycin, Lincomycin and Tetracycline. This study indicates the need for urgent and effective control measures to tackle the increasing prevalence of subclinical mastitis and antimicrobial resistance in the study area.

  17. Assessment of respiratory effect of air pollution: study design on general population samples.

    Science.gov (United States)

    Baldacci, S; Carrozzi, L; Viegi, G; Giuntini, C

    1997-01-01

    The aim of this paper is to describe an epidemiological model to investigate the relationship between respiratory diseases and environmental air pollution. In the Po Delta prospective study, subjects were investigated before and after a large thermoelectric power plant began operating, in 1980 to 1982 and in 1988 to 1991, respectively. The Pisa prospective study was performed in 1986 to 1988 and in 1991 to 1993, before and after the construction of a new expressway that encircles the city from the North to the Southeast. In each survey, subjects completed the interviewer-administered standardized CNR questionnaire on respiratory symptoms/diseases and risk factors, and performed lung function tests. In the second survey of each study, skin prick tests, total serum IgE determination, a methacholine challenge test and biomarkers (such as sister chromatid exchanges, micronuclei, chromosomal abnormalities, and DNA and hemoglobin adducts) were also performed. Concentrations of total suspended particulate and SO2 in both surveys were higher in urban than in rural areas, as were symptom/disease prevalences and bronchial reactivity. Subgroups of subjects from the two samples were enrolled in a specific study on the acute respiratory effects of indoor pollution; the daily presence of symptoms, measurements of peak expiratory flow (PEF), daily activity patterns, and assessment of the indoor air quality (particulates) were recorded, with particular attention to asthmatics. In conclusion, these studies represent a basis for further analyses to better define the relationship between respiratory health and indoor/outdoor pollutant levels.

  18. An assessment of the overlap between morale and work engagement in a nonoperational military sample.

    Science.gov (United States)

    Ivey, Gary W; Blanc, J-R Sébastien; Mantler, Janet

    2015-07-01

    The degree of overlap between two positive motivational constructs, morale and work engagement, was assessed in a random sample of Canadian Armed Forces personnel stationed across Canada (N = 1,224). Based on self-determination theory and past research, job-specific self-efficacy, trust in teammates, and job significance were expected to be associated with morale and work engagement. Structural equation modeling analyses revealed that morale and work engagement were highly positively correlated, but had different patterns of association with predictor and outcome variables. Although trust in teammates and job significance predicted both morale and work engagement, job-specific self-efficacy predicted morale but not work engagement. Willingness to deploy on operations, turnover intentions, and psychological distress were predicted by both morale and work engagement, but morale was a better predictor of psychological distress and work engagement was a stronger predictor of turnover intentions. Together, the results suggest that, despite their overlap, morale and work engagement, as defined and measured herein, are not interchangeable.

  19. OCT as a convenient tool to assess the quality and application of organotypic retinal samples

    Science.gov (United States)

    Gater, Rachel; Khoshnaw, Nicholas; Nguyen, Dan; El Haj, Alicia J.; Yang, Ying

    2016-03-01

    Eye diseases such as macular degeneration and glaucoma have profound consequences on the quality of human life. Without treatment, these diseases can lead to loss of sight. To develop better treatments for retinal diseases, including cell therapies and drug intervention, establishment of an efficient and reproducible 3D native retinal tissue system that can be maintained over a prolonged culture duration will be valuable. The retina is a complex tissue, consisting of ten layers, each with a different density and cellular composition. Uniquely, as a light-transmitting tissue, the retina refracts light differently in each layer, forming a good basis for using optical coherence tomography (OCT) to assess the layered structure of the retina and its change during culture and treatments. In this study, we develop a new methodology to generate retinal organotypic tissues and compare two substrates, filter paper and collagen hydrogel, on which to culture the organotypic tissue. Eyes from freshly slaughtered pigs were obtained for use in this study. The layered morphology of intact organotypic retinal tissue cultured on the two different substrates was examined by spectral-domain OCT. The viability of the tissues was examined with a live/dead fluorescence dye kit to cross-validate the OCT images. For the first time, it is demonstrated that the use of a collagen hydrogel supports the viability of retinal organotypic tissue, enabling prolonged culture of up to 2 weeks. OCT is a convenient tool for appraising the quality and application of organotypic retinal samples and is important in the development of current organotypic models.

  20. Mapping species distributions with MAXENT using a geographically biased sample of presence data: a performance assessment of methods for correcting sampling bias.

    Science.gov (United States)

    Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean

    2014-01-01

    MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensus guideline to account for it. Here, we compared the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. Nevertheless, systematic sampling of records seems to be the most efficient correction for sampling bias and should be advised in most cases.
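
    The best-performing correction, systematic sampling of records, can be approximated by spatial filtering: keep at most one occurrence per grid cell so that oversampled areas no longer dominate the training data. A minimal sketch with hypothetical coordinates (this is a generic pre-processing step, not MAXENT's own API):

```python
import random

def spatially_filter(records, cell_size):
    """Keep one randomly chosen occurrence record per grid cell,
    flattening geographic sampling bias before model training."""
    cells = {}
    for lon, lat in records:
        key = (int(lon // cell_size), int(lat // cell_size))
        cells.setdefault(key, []).append((lon, lat))
    return [random.choice(points) for points in cells.values()]

# Two clustered records fall in the same 1-degree cell; one survives.
records = [(0.1, 0.1), (0.2, 0.2), (5.1, 5.1)]
kept = spatially_filter(records, cell_size=1.0)
```
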

  1. Using the Sampling Margin of Error to Assess the Interpretative Validity of Student Evaluations of Teaching

    Science.gov (United States)

    James, David E.; Schraw, Gregory; Kuch, Fred

    2015-01-01

    We present an equation, derived from standard statistical theory, that can be used to estimate sampling margin of error for student evaluations of teaching (SETs). We use the equation to examine the effect of sample size, response rates and sample variability on the estimated sampling margin of error, and present results in four tables that allow…
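
    A standard version of such an equation is the proportion-based margin of error with a finite-population correction, which shrinks as the response rate approaches the full class size; this textbook form is assumed here, not necessarily the authors' exact expression.

```python
import math

def sampling_moe(n, N, p=0.5, z=1.96):
    """Margin of error for n responses sampled from a class of N students.
    p = 0.5 is the conservative worst case; z = 1.96 gives a 95% level."""
    se = math.sqrt(p * (1 - p) / n)     # standard error of a proportion
    fpc = math.sqrt((N - n) / (N - 1))  # finite-population correction
    return z * se * fpc

# e.g. 20 respondents out of a class of 35 (hypothetical numbers)
moe = sampling_moe(n=20, N=35)
```

    With a 100% response rate the correction drives the margin of error to zero, which is why response rate matters so much for interpreting SET scores.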

  2. Assessing Generative Braille Responding Following Training in a Matching-to-Sample Format

    Science.gov (United States)

    Putnam, Brittany C.; Tiger, Jeffrey H.

    2016-01-01

    We evaluated the effects of teaching sighted college students to select printed text letters given a braille sample stimulus in a matching-to-sample (MTS) format on the emergence of untrained (a) construction of print characters given braille samples, (b) construction of braille characters given print samples, (c) transcription of print characters…

  3. Automation of a high-speed imaging setup for differential viscosity measurements

    Science.gov (United States)

    Hurth, C.; Duane, B.; Whitfield, D.; Smith, S.; Nordquist, A.; Zenhausern, F.

    2013-12-01

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of an electronic printed circuit board (PCB) incorporating a microcontroller, and their synchronization with a commercial high-speed camera operating at 10 000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with the calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an "unknown" solution of hydroxyethyl cellulose.
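
    As background for the falling-sphere principle the setup relies on, the low-Reynolds-number Stokes relation converts a sphere's terminal velocity into a dynamic viscosity; the figures below are illustrative assumptions, not the instrument's calibration.

```python
def stokes_viscosity(radius_m, rho_sphere, rho_fluid, v_terminal):
    """Dynamic viscosity (Pa*s) from the terminal velocity (m/s) of a
    falling sphere, via Stokes' law (valid at low Reynolds number)."""
    g = 9.81  # gravitational acceleration, m/s^2
    return 2.0 * radius_m**2 * (rho_sphere - rho_fluid) * g / (9.0 * v_terminal)

# A 2 mm glass sphere (radius 1 mm) falling through a glycerol-like fluid.
mu = stokes_viscosity(radius_m=1e-3, rho_sphere=2500.0, rho_fluid=1260.0,
                      v_terminal=0.0019)
```

    The high-speed camera's role in such a setup is to measure v_terminal precisely; everything else in the estimate is geometry and material constants.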

  4. Automation of a high-speed imaging setup for differential viscosity measurements

    Energy Technology Data Exchange (ETDEWEB)

    Hurth, C.; Duane, B.; Whitfield, D.; Smith, S.; Nordquist, A.; Zenhausern, F. [Center for Applied Nanobioscience and Medicine, The University of Arizona College of Medicine, 425 N 5th Street, Phoenix, Arizona 85004 (United States)

    2013-12-28

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of an electronic printed circuit board (PCB) incorporating a microcontroller, and their synchronization with a commercial high-speed camera operating at 10 000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with the calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an “unknown” solution of hydroxyethyl cellulose.

  5. US Environmental Protection Agency Method 314.1, an automated sample preconcentration/matrix elimination suppressed conductivity method for the analysis of trace levels (0.50 microg/L) of perchlorate in drinking water.

    Science.gov (United States)

    Wagner, Herbert P; Pepich, B V; Pohl, C; Later, D; Joyce, R; Srinivasan, K; Thomas, D; Woodruff, A; Deborba, B; Munch, D J

    2006-06-16

    Since 1997 there has been increasing interest in the development of analytical methods for the analysis of perchlorate. The US Environmental Protection Agency (EPA) Method 314.0, which was used during the first Unregulated Contaminant Monitoring Regulation (UCMR) cycle, supports a method reporting limit (MRL) of 4.0 microg/L. The non-selective nature of conductivity detection, combined with very high ionic strength matrices, can create conditions that make the determination of perchlorate difficult. The objective of this work was to develop an automated, suppressed conductivity method with improved sensitivity for use in the second UCMR cycle. The new method, EPA Method 314.1, uses a 35 mm x 4 mm cryptand concentrator column in the sample loop position to concentrate perchlorate from a 2 mL sample volume, which is subsequently rinsed with 10 mM NaOH to remove interfering anions. The cryptand concentrator column is combined with a primary AS16 analytical column and a confirmation AS20 analytical column. Unique characteristics of the cryptand column allow perchlorate to be desorbed from the cryptand trap and refocused on the head of the guard column for subsequent separation and analysis. EPA Method 314.1 has a perchlorate lowest concentration minimum reporting level (LCMRL) of 0.13 microg/L in both drinking water and laboratory synthetic sample matrices (LSSM) containing up to 1,000 microg/L each of chloride, bicarbonate and sulfate.

  6. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  7. Dynamic three-dimensional echocardiography combined with semi-automated border detection offers advantages for assessment of resynchronization therapy

    Directory of Open Access Journals (Sweden)

    Voormolen Marco M

    2003-10-01

    Full Text Available Abstract Simultaneous electrical stimulation of both ventricles in patients with interventricular conduction disturbance and advanced heart failure improves hemodynamics and results in increased exercise tolerance and quality of life. We have developed a novel technique for the assessment and optimization of resynchronization therapy. Our approach is based on transthoracic dynamic three-dimensional (3D) echocardiography and allows determination of the most delayed contraction site of the left ventricle (LV), together with global LV function data. Our initial results suggest that fast reconstruction of the LV is feasible for the selection of the optimal pacing site and allows identification of LV segments with dyssynchrony.

  8. Automated extraction of DNA from reference samples from various types of biological materials on the Qiagen BioRobot EZ1 Workstation

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Jørgensen, Mads; Hansen, Anders Johannes

    2009-01-01

    We have validated and implemented a protocol for DNA extraction from various types of biological materials using a Qiagen BioRobot EZ1 Workstation. The sample materials included whole blood, blood from deceased persons, buccal cells on Omni swabs and FTA Cards, blood on FTA Cards and cotton swabs..., and muscle biopsies. The DNA extraction was validated according to EN/ISO 17025 for the STR kits AmpFlSTR® Identifiler® and AmpFlSTR® Yfiler® (Applied Biosystems). Of 298 samples extracted, 11 (4%) did not yield acceptable results. In conclusion, we have demonstrated that extraction of DNA from various types... of biological material can be performed quickly and without the use of hazardous chemicals, and that the DNA may be successfully STR typed according to the requirements of forensic genetic investigations accredited according to EN/ISO 17025...

  9. Full second order chromatographic/spectrometric data matrices for automated sample identification and component analysis by non-data-reducing image analysis

    DEFF Research Database (Denmark)

    Nielsen, Niles-Peter Vest; Smedsgaard, Jørn; Frisvad, Jens Christian

    1999-01-01

    A data analysis method is proposed for identification and for confirmation of classification schemes, based on single- or multiple-wavelength chromatographic profiles. The proposed method works directly on the chromatographic data without data reduction procedures such as peak area or retention... index calculation. Chromatographic matrices from analysis of previously identified samples are used for generating a reference chromatogram for each class, and unidentified samples are compared with all reference chromatograms by calculating a resemblance measure for each reference. Once the method... yielded over 90% agreement with accepted classifications. The method is highly accurate and may be used on all sorts of chromatographic profiles. Characteristic component analysis yielded results in good agreement with existing knowledge of characteristic components, but also succeeded in identifying new...

  10. Automated In-Injector Derivatization Combined with High-Performance Liquid Chromatography-Fluorescence Detection for the Determination of Semicarbazide in Fish and Bread Samples.

    Science.gov (United States)

    Wang, Yinan; Chan, Wan

    2016-04-06

    Semicarbazide (1) is a widespread genotoxic food contaminant originating as a metabolic byproduct of the antibiotic nitrofurazone used in fish farming or as a thermal degradation product of the common flour additive azodicarbonamide. The goal of this study is to develop a simple and sensitive high-performance liquid chromatography coupled with fluorescence detection (HPLC-FLD) method for the detection of compound 1 in food products. In comparison to existing methods for the determination of compound 1, the reported method combining online precolumn derivatization and HPLC-FLD is less labor-intensive, produces higher sample throughput, and does not require the use of expensive analytical instruments. After validation of accuracy and precision, this method was applied to determine the amount of compound 1 in fish and bread samples. Comparative studies using an established liquid chromatography coupled with tandem mass spectrometry method did not yield systematically different results, indicating that the developed HPLC-FLD method is accurate and suitable for the determination of compound 1 in fish and bread samples.

  11. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  12. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and the demand for customized products and services on one side, and the need for constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to evolve substantially.

  13. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  14. Automation in biological crystallization

    Science.gov (United States)

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  15. Automated detection of breast tumor in MRI and comparison of kinetic features for assessing tumor response to chemotherapy

    Science.gov (United States)

    Aghaei, Faranak; Tan, Maxine; Zheng, Bin

    2015-03-01

    Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) is used increasingly in diagnosis of breast cancer and assessment of treatment efficacy in current clinical practice. The purpose of this preliminary study is to develop and test a new quantitative kinetic image feature analysis method and biomarker to predict response of breast cancer patients to neoadjuvant chemotherapy using breast MR images acquired before the chemotherapy. For this purpose, we developed a computer-aided detection scheme to automatically segment breast areas and tumors depicted on the sequentially scanned breast MR images. From a contrast-enhancement map generated by subtraction of two image sets scanned pre- and post-injection of contrast agent, our scheme computed 38 morphological and kinetic image features from both tumor and background parenchymal regions. We applied a number of statistical data analysis methods to identify effective image features in predicting response of the patients to the chemotherapy. Based on the performance assessment of individual features and their correlations, we applied a fusion method to generate a final image biomarker. A breast MR image dataset involving 68 patients was used in this study. Among them, 25 had complete response and 43 had partial response to the chemotherapy based on the RECIST guideline. Using this image feature fusion based biomarker, the area under a receiver operating characteristic curve is AUC = 0.850±0.047. This study demonstrated that a biomarker developed from the fusion of kinetic image features computed from breast MR images acquired pre-chemotherapy has potentially higher discriminatory power in predicting response of the patients to the chemotherapy.
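
    The reported AUC can be computed directly from biomarker scores and response labels through the Mann-Whitney formulation of the area under the ROC curve; the scores below are hypothetical, not the study's data.

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (responder, non-responder) pairs ranked correctly."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical fused-biomarker scores; 1 = complete response, 0 = partial.
value = auc([0.9, 0.8, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0])
```
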

  16. Assessment of outdoor radiofrequency electromagnetic field exposure through hotspot localization using kriging-based sequential sampling

    Energy Technology Data Exchange (ETDEWEB)

    Aerts, Sam, E-mail: sam.aerts@intec.ugent.be; Deschrijver, Dirk; Verloock, Leen; Dhaene, Tom; Martens, Luc; Joseph, Wout

    2013-10-15

In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF), or hotspots, in an extensive urban region, and that would allow local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots while at the same time developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method makes use of an iterative sampling strategy that selects new measurement locations at spots deemed to contain the most valuable information (inside hotspots or in search of them), based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium, with a size of approximately 1 km². In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2 dB (a factor of 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96.
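
The core loop (fit a kriging, i.e. Gaussian-process, surrogate to the measurements taken so far, then measure next where the model's prediction uncertainty is largest) can be sketched in one dimension. Everything below is an illustrative assumption: the RBF kernel and length scale, the synthetic exposure field with a hotspot near x = 0.7, and the pure variance-maximization criterion, whereas the paper's actual strategy also steers sampling toward suspected hotspots rather than toward uncertainty alone.

```python
import math

def rbf(x1, x2, ls=0.15):
    """Squared-exponential (RBF) covariance between two 1-D locations."""
    return math.exp(-((x1 - x2) ** 2) / (2 * ls * ls))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xstar, noise=1e-4):
    """Kriging posterior mean and variance at xstar given observations (xs, ys)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    kstar = [rbf(x, xstar) for x in xs]
    alpha = solve(K, ys)
    mean = sum(kstar[i] * alpha[i] for i in range(n))
    v = solve(K, kstar)
    var = rbf(xstar, xstar) - sum(kstar[i] * v[i] for i in range(n))
    return mean, max(var, 0.0)  # clamp tiny negative values from round-off

def field(x):
    """Hypothetical true exposure field with a narrow hotspot near x = 0.7."""
    return 1.0 + 2.0 * math.exp(-((x - 0.7) ** 2) / 0.005)

xs = [0.0, 0.5, 1.0]            # initial coarse measurements
ys = [field(x) for x in xs]
grid = [i / 100 for i in range(101)]
for _ in range(10):             # sequential step: measure where variance is largest
    xnext = max(grid, key=lambda g: gp_posterior(xs, ys, g)[1])
    xs.append(xnext)
    ys.append(field(xnext))

worst = max(gp_posterior(xs, ys, g)[1] for g in grid)
print(len(xs), round(worst, 3))
```

Each iteration refits the surrogate with the new measurement, so the predicted uncertainty over the whole region shrinks as sampling proceeds, which is the mechanism the paper exploits to localize hotspots without prior source information.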

  17. Automated dynamic hollow fiber liquid-liquid-liquid microextraction combined with capillary electrophoresis for speciation of mercury in biological and environmental samples.

    Science.gov (United States)

    Li, Pingjing; He, Man; Chen, Beibei; Hu, Bin

    2015-10-01

A simple home-made automatic dynamic hollow fiber based liquid-liquid-liquid microextraction (AD-HF-LLLME) device was designed and constructed for the simultaneous extraction of organomercury and inorganic mercury species with the assistance of a programmable flow injection analyzer. With 18-crown-6 as the complexing reagent, mercury species including methyl-, ethyl-, phenyl- and inorganic mercury were extracted into the organic phase (chlorobenzene), and then back-extracted into the acceptor phase of 0.1% (m/v) 3-mercapto-1-propanesulfonic acid (MPS) aqueous solution. Compared with the automatic static (AS)-HF-LLLME system, extraction equilibrium of the target mercury species was reached in a shorter time and with higher extraction efficiency in the AD-HF-LLLME system. On this basis, a new method of AD-HF-LLLME coupled with large volume sample stacking (LVSS)-capillary electrophoresis (CE)/UV detection was developed for the simultaneous analysis of methyl-, phenyl- and inorganic mercury species in biological samples and environmental water. Under the optimized conditions, AD-HF-LLLME provided high enrichment factors (EFs) of 149-253-fold within a relatively short extraction equilibrium time (25 min) and good precision, with RSDs between 3.8 and 8.1%. By combining AD-HF-LLLME with LVSS-CE/UV, EFs were magnified up to 2195-fold and the limits of detection (at S/N = 3) for the target mercury species were improved to the sub-ppb level.

  18. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  19. The impact of particle size selective sampling methods on occupational assessment of airborne beryllium particulates.

    Science.gov (United States)

    Sleeth, Darrah K

    2013-05-01

    In 2010, the American Conference of Governmental Industrial Hygienists (ACGIH) formally changed its Threshold Limit Value (TLV) for beryllium from a 'total' particulate sample to an inhalable particulate sample. This change may have important implications for workplace air sampling of beryllium. A history of particle size-selective sampling methods, with a special focus on beryllium, will be provided. The current state of the science on inhalable sampling will also be presented, including a look to the future at what new methods or technology may be on the horizon. This includes new sampling criteria focused on particle deposition in the lung, proposed changes to the existing inhalable convention, as well as how the issues facing beryllium sampling may help drive other changes in sampling technology.

  20. Assessment of polychlorinated biphenyls and organochlorine pesticides in water samples from the Yamuna River

    Directory of Open Access Journals (Sweden)

    Bhupander Kumar

    2012-07-01

Polychlorinated biphenyls (PCBs), hexachlorocyclohexane (HCH) and dichlorodiphenyltrichloroethane (DDT) are toxic, persistent and bioaccumulative long-range atmospheric transport pollutants. These are transported worldwide, affecting remote regions far from their original sources, and can transfer into food webs with a wide range of acute and chronic health effects. India ratified the Stockholm Convention with the intention of reducing and eliminating persistent organic pollutants (POPs), and encouraged the support of research on POPs. Despite the ban and restriction on the use of these chemicals in India, their contamination of air, water, sediment, biota and humans has been reported. In this study, surface water samples were collected during January 2012 from the Yamuna River in Delhi, India, and analyzed for PCBs and organochlorine pesticides (OCPs). The concentrations of ΣPCBs and ΣOCPs ranged from 2 to 779 ng L–1 and from less than 0.1 to 618 ng L–1 (mean 99±38 ng L–1 and 221±50 ng L–1), respectively. The PCB homolog profile was dominated by 3-4 chlorinated biphenyls. In calculating the toxicity equivalent of dioxin-like PCBs (dl-PCBs) using World Health Organization toxic equivalency factors, dl-PCBs accounted for 10% of a total of 27 PCBs. The concentration of ΣHCH ranged between less than 0.1 and 285 ng L–1 (mean 151±32 ng L–1), whereas ΣDDT concentrations varied between less than 0.1 and 354 ng L–1 (mean 83±26 ng L–1). The concentrations were lower than the US guideline values; however, levels of lindane exceeded those recommended in guidelines. Further in-depth study is proposed to determine the bioaccumulation of these pollutants through aquatic biota to assess the risk of contaminants to human health.

  1. Assessing the efficacy of hair snares as a method for noninvasive sampling of Neotropical felids

    Directory of Open Access Journals (Sweden)

    Tatiana P. Portella

    2013-02-01

Hair snares have long been used in North and Central America in assessment and monitoring studies of several mammalian species. This method can provide a cheap, suitable, and efficient way to monitor mammals because it combines characteristics that are not present in most alternative techniques. However, despite their usefulness, hair snares are rarely used in other parts of the world. The aim of our study was to evaluate the effectiveness of hair snares and three scent lures (cinnamon, catnip, and vanilla) in the detection of felids in one of the largest remnants of the Brazilian Atlantic Forest. We performed tests with six captive felid species - Panthera onca (Linnaeus, 1758), Leopardus pardalis (Linnaeus, 1758), L. tigrinus (Schreber, 1775), L. wiedii (Schinz, 1821), Puma concolor (Linnaeus, 1771), and P. yagouaroundi (É. Geoffroy Saint-Hilaire, 1803) - to examine their responses to the attractants, and to correlate those with lure efficiency in the field. The field tests were conducted at the Parque Estadual Pico do Marumbi, state of Paraná, Brazil. Hair traps were placed on seven transects. There were equal numbers of traps with each scent lure, for a total of 1,551 trap-days. In captivity, vanilla provided the greatest response, yet no felids were detected in the field with any of the tested lures, although other species were recorded. Based on the sampling of non-target species, and on comparison with similar studies elsewhere, this study points to a possible caveat of this method where rare species or small populations are concerned. Meanwhile, we believe that improved hair snares could provide important results with several species in the location tested and in others.

  2. Reliability assessment and cost-benefit analysis of urban distribution network automation

    Institute of Scientific and Technical Information of China (English)

    赵晓慧; 梁标; 李海波; 鲁宗相

    2015-01-01

The implementation of urban distribution network automation is an important technical means of enhancing power system reliability. This paper discusses the mechanism and characteristics of distribution network automation and its impact on reliability. Based on a correction of the traditional distribution network reliability assessment method, a reliability evaluation model for distribution network automation is established. The economic benefits of distribution automation are then evaluated through cost-benefit analysis. Finally, a practical distribution system is used as an example to verify the combined effect of the assessment method and the economic relationship between costs and benefits. Implementing distribution network automation can effectively improve both reliability and economic performance.

  3. An artifacts removal post-processing for epiphyseal region-of-interest (EROI) localization in automated bone age assessment (BAA)

    Directory of Open Access Journals (Sweden)

    Salleh Sh-Hussain

    2011-09-01

Background: Segmentation is the most crucial part of computer-aided bone age assessment. A well-known type of segmentation performed in such systems is adaptive segmentation. While providing better results than global thresholding, adaptive segmentation produces a lot of unwanted noise that can affect the subsequent epiphysis extraction. Methods: A method with anisotropic diffusion as pre-processing and a novel Bounded Area Elimination (BAE) post-processing algorithm is proposed to improve the ossification site localization technique, with the intent of improving the adaptive segmentation result and the region-of-interest (ROI) localization accuracy. Results: The results are evaluated by quantitative and qualitative analysis using texture feature evaluation. Image homogeneity after anisotropic diffusion improved by an average of 17.59% for each age group. Experiments showed that smoothness improved by an average of 35% after the BAE algorithm, and ROI localization accuracy improved by an average of 8.19%. The MSSIM improved by an average of 10.49% after performing the BAE algorithm on the adaptively segmented hand radiographs. Conclusions: Hand radiographs that have undergone anisotropic diffusion show greatly reduced noise in the segmented image, and the results also indicate that the proposed BAE algorithm is capable of removing the artifacts generated in adaptive segmentation.

  4. Fast automated dual-syringe based dispersive liquid-liquid microextraction coupled with gas chromatography-mass spectrometry for the determination of polycyclic aromatic hydrocarbons in environmental water samples.

    Science.gov (United States)

    Guo, Liang; Tan, Shufang; Li, Xiao; Lee, Hian Kee

    2016-03-18

An automated procedure, combining low-density-solvent based solvent-demulsification dispersive liquid-liquid microextraction (DLLME) with gas chromatography-mass spectrometry analysis, was developed for the determination of polycyclic aromatic hydrocarbons (PAHs) in environmental water samples. Capitalizing on a two-rail commercial autosampler, the procedure enabled fast solvent transfer using a large-volume syringe dedicated to the DLLME process and convenient extract collection using a small-volume microsyringe for better GC performance. Extraction parameters including the type and volume of extraction solvent, the type and volume of dispersive solvent and demulsification solvent, extraction and demulsification time, and the speed of solvent injection were investigated and optimized. Under the optimized conditions, the linearity ranged from 0.1 to 50 μg/L, 0.2 to 50 μg/L, or 0.5 to 50 μg/L, depending on the analyte. Limits of detection were determined to be between 0.023 and 0.058 μg/L. The method was applied to determine PAHs in environmental water samples.

  5. Application of calorimetry to the assessment of the performance of ITER Nb3Sn TF conductor samples in SULTAN tests

    Science.gov (United States)

    Savoldi Richard, L.; Zanino, R.

    2008-10-01

    In the frame of the International Thermonuclear Experimental Reactor (ITER), several short full-size Nb3Sn samples of candidate toroidal field (TF) conductors were tested in 2007 at the SULTAN facility, PSI Villigen, Switzerland, in conditions relevant to the ITER TF (background magnetic field of 10.78 T and transport current of 68 kA). The performance of a SULTAN sample is determined by the current sharing temperature TCS. This can be obtained in principle from voltage measurements along the conductor sample, but the procedure is not free of issues and ambiguities. Here a complementary approach, based on the calorimetric assessment of the Joule heating due to current sharing, is critically discussed. Suitable algorithms are defined and the respective error bars are estimated, also based on numerical thermal-hydraulic modeling. The calorimetric approach is then applied to assess the performance of the samples tested in 2007 and compared with the results of the standard (electrical) approach.
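
The calorimetric estimate rests on a steady-state enthalpy balance over the supercritical-helium flow in the conductor: the Joule power deposited by current sharing must reappear as a rise in coolant enthalpy between inlet and outlet sensors. A generic sketch of this balance (the authors' actual algorithms and error estimates are more elaborate):

```latex
% Steady-state enthalpy balance over the He coolant (illustrative form)
\dot{Q}_{\mathrm{Joule}} \;\approx\;
\dot{m}\,\bigl[\,h(T_{\mathrm{out}},p_{\mathrm{out}}) - h(T_{\mathrm{in}},p_{\mathrm{in}})\,\bigr]
\;\approx\; \dot{m}\,c_p\,\bigl(T_{\mathrm{out}} - T_{\mathrm{in}}\bigr)
```

Here $\dot{m}$ is the helium mass flow rate, $h$ the specific enthalpy, and $c_p$ an effective specific heat; the current sharing temperature $T_{CS}$ is then inferred from the operating temperature at which $\dot{Q}_{\mathrm{Joule}}$ departs measurably from zero.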

  6. STS 132 Return Samples: Assessment of Air Quality Aboard the Shuttle (STS-132) and International Space Station (ULF4)

    Science.gov (United States)

James, John T.

    2010-01-01

The toxicological assessments of 2 grab sample canisters (GSCs) from the Shuttle are reported. Analytical methods have not changed from earlier reports. The recoveries of the 3 surrogates (13C-acetone, fluorobenzene, and chlorobenzene) from the 2 Shuttle GSCs averaged 93%, 85%, and 88%, respectively. Based on the end-of-mission sample, the Shuttle atmosphere was acceptable for human respiration. The toxicological assessment of 7 GSCs from the ISS is also shown. The recoveries of the 3 standards (as listed above) from the GSCs averaged 78%, 96%, and 90%, respectively. Recovery from formaldehyde control badges ranged from 90 to 112%.

  7. Assessing Health-Related Quality-of-Life in Prenatal Diagnosis Comparing Chorionic Villi Sampling and Amniocentesis: A Technical Report

    OpenAIRE

    David Feeny; Marie Townsend; William Furlong; Darrell Tomkins; Gail Robinson; George Torrance; Patrick Mohide; Qinan Wang

    2000-01-01

    Objectives. To assess the health-related quality-of-life (HRQL) effects of chorionic villi sampling (CVS) and genetic amniocentesis (GA) prenatal diagnosis, including factors related to both the processes and the outcomes. Study Design. The HRQL of one hundred twenty six women participating in a randomized controlled clinical trial of CVS versus GA in Toronto and Hamilton, Ontario was assessed in four interviews at weeks 8, 13, 18, and 22 of pregnancy. Statistical analyses included analysis o...

  8. STS 131 Return Samples: Assessment of Air Quality Aboard the Shuttle (STS-131) and International Space Station (19A)

    Science.gov (United States)

    James, John T.

    2010-01-01

    The toxicological assessments of 1 grab sample canister (GSC) from the Shuttle are reported in Table 1. Analytical methods have not changed from earlier reports. The recoveries of the 3 surrogates (C-13-acetone, fluorobenzene, and chlorobenzene) from the Shuttle GSC were 100%, 93%, and 101%, respectively. Based on the historical experience using end-of-mission samples, the Shuttle atmosphere was acceptable for human respiration.

  9. STS 130 Return Samples: Assessment of Air Quality Aboard the Shuttle (STS-130) and International Space Station (20A)

    Science.gov (United States)

    James, John T.

    2010-01-01

The toxicological assessments of 3 grab sample canisters (GSCs) from the Shuttle are reported in Table 1. Analytical methods have not changed from earlier reports. The recoveries of the 3 surrogates (13C-acetone, fluorobenzene, and chlorobenzene) from the 3 Shuttle GSCs averaged 96%, 90%, and 85%, respectively. Based on the end-of-mission sample, the Shuttle atmosphere was acceptable for human respiration.

  10. Assessment of Chlamydia trachomatis, Neisseria gonorrhoeae, and Mycobacterium tuberculosis infections in women undergoing laparoscopy: the role of peritoneal fluid sampling

    OpenAIRE

    Miroslav Dragic; Patrizia Posteraro; Carla Marani; Maria Emanuela Natale; Alessia Vecchioni; Maurizio Sanguinetti; Chiara de Waure; Brunella Posteraro

    2016-01-01

    Background. Aim of this study was to assess the role of peritoneal fluid sampling for detection of bacterial infections due to Chlamydia trachomatis (CT), Neisseria gonorrhoeae (NG), and Mycobacterium tuberculosis (MT) in women undergoing laparoscopic investigation. The potential link between microbiological positive result(s) and types of gynecological pathology was also evaluated. Materials and Methods. A large sample of women (n=1377) with their peritoneal fluids taken laparoscopically was...

  11. Implementation of a semi-automated strategy for the annotation of metabolomic fingerprints generated by liquid chromatography-high resolution mass spectrometry from biological samples.

    Science.gov (United States)

    Courant, Frédérique; Royer, Anne-Lise; Chéreau, Sylvain; Morvan, Marie-Line; Monteau, Fabrice; Antignac, Jean-Philippe; Le Bizec, Bruno

    2012-11-07

Metabolomics aims at detecting and semi-quantifying small molecular weight metabolites in biological samples in order to characterise the metabolic changes resulting from one or more given factors and/or to develop models based on diagnostic biomarker candidates. Nevertheless, whatever the objective of a metabolomic study, one critical step is the structural identification of the mass spectrometric features revealed by statistical analysis, and this remains a real challenge. Indeed, it requires an understanding of the studied biological system, the correct use of various analytical information (retention time, experimentally measured molecular weight, isotopic golden rules, MS/MS fragment pattern interpretation…), and the querying of online databases. In gas chromatography-electron ionisation (EI) mass spectrometry, EI leads to a very reproducible fragmentation, allowing the establishment of universal EI mass spectra databases (for example, the NIST database of the National Institute of Standards and Technology) and thus facilitating the identification step. Unfortunately, the situation is different when working with liquid chromatography-mass spectrometry (LC-MS), since atmospheric pressure ionisation exhibits high inter-instrument variability in fragmentation. Therefore, the constitution of LC-MS "in-house" spectral databases appears relevant in this context. The present study describes the procedure developed and applied to add 133 and 130 metabolites, respectively, to databanks dedicated to analyses performed by LC-HRMS in positive and negative electrospray ionisation, and the use of these databanks for quickly annotating untargeted metabolomic fingerprints. This study also describes the optimization of the parameters controlling the automatic processing in order to obtain a fast and reliable annotation of a maximum of organic compounds. This strategy was applied to bovine kidney samples collected from control animals or animals treated with steroid hormones. Thirty
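
The core of such an annotation step is a tolerance lookup: each statistically selected LC-HRMS feature is matched against databank entries by accurate mass (a ppm window) and retention time. A minimal sketch, with hypothetical tolerances and databank entries (the 114.0662 value is the [M+H]+ of creatinine, used purely as an example):

```python
def ppm_error(observed, reference):
    """Mass error between observed and reference m/z, in parts per million."""
    return abs(observed - reference) / reference * 1e6

def annotate(features, databank, mz_tol_ppm=5.0, rt_tol_min=0.2):
    """Match each (m/z, RT) feature against databank entries within tolerances."""
    annotated = []
    for feat in features:
        matches = [name for name, (mz, rt) in databank.items()
                   if ppm_error(feat["mz"], mz) <= mz_tol_ppm
                   and abs(feat["rt"] - rt) <= rt_tol_min]
        annotated.append((feat, matches))
    return annotated

# Hypothetical in-house databank: metabolite -> (reference m/z, RT in minutes)
databank = {"creatinine": (114.0662, 1.20), "hippuric acid": (180.0655, 4.80)}
features = [{"mz": 114.0664, "rt": 1.25},   # should match creatinine
            {"mz": 300.1234, "rt": 7.00}]   # no databank hit
hits = annotate(features, databank)
print(hits)
```

Real in-house databanks additionally store instrument-specific MS/MS spectra, which is precisely why the abstract argues they are needed for LC-MS, where fragmentation is not transferable between instruments.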

  12. A small sample-size automated adiabatic calorimeter from 70 to 580 K: Molar heat capacities of α-Al2O3

    Institute of Scientific and Technical Information of China (English)

    谭志诚; 张际标; 孟霜鹤; 李莉

    1999-01-01

An automatic adiabatic calorimeter for measuring heat capacities in the temperature range 70-580 K, equipped with a small sample cell of 7.4 cm³ internal volume, has been developed. In order to obtain a good adiabatic condition at high temperature, the calorimeter was surrounded in sequence by two adiabatic shields, three radiation shields and an auxiliary temperature-controlled sheath. The main body of the cell, made of copper, and the lid, made of brass, are silver-soldered, and the cell is sealed with a copper screw cap. A sealing gasket made of Pb-Sn alloy is placed between the cap and the lid to ensure a high-vacuum seal over the whole experimental temperature range. All leads are insulated and fixed with W30-11 varnish, so that good electrical insulation is obtained at high temperature. All experimental data, including those for energy and temperature, are collected and processed automatically by a personal computer using a predetermined program. To verify the

  13. Development of an automated sampling-analysis system for simultaneous measurement of reactive oxygen species (ROS) in gas and particle phases: GAC-ROS

    Science.gov (United States)

    Huang, Wei; Zhang, Yuanxun; Zhang, Yang; Zeng, Limin; Dong, Huabin; Huo, Peng; Fang, Dongqing; Schauer, James J.

    2016-06-01

A novel online system, GAC-ROS, for simultaneous measurement of reactive oxygen species (ROS) in both gas and particle phases was developed based on the 2′,7′-dichlorofluorescin (DCFH) assay to provide fast sampling and analysis of atmospheric ROS. The GAC-ROS, composed of a Gas and Aerosol Collector (GAC), a series of reaction and transport systems, and a fluorescence detector, was tested for instrumental performance in the laboratory. Results showed good performance, with a favorable R2 value for the calibration curve (above 0.998), high ROS penetration efficiencies (above 99.5%), and low detection limits (gas-phase ROS: 0.16 nmol H2O2 m-3; particle-phase ROS: 0.12 nmol H2O2 m-3). Laboratory comparison between online and offline methods for particle-bound ROS showed significant loss of ROS due to the relatively long off-line treatment time. Field observations in Beijing found that ROS concentrations in winter were significantly higher than those observed in spring. Only a few weak positive correlations were found between ROS and some air pollutants, which reflects the complexity of ROS generation and transformation in the atmosphere. This study was the first to simultaneously obtain concentrations of gas- and particle-phase ROS using an online method. Consequently, it provides a powerful tool to characterize the oxidizing capacity of the atmosphere and the sources of that capacity.

  14. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    Directory of Open Access Journals (Sweden)

    Richard J. Venedam

    2005-02-01

The capabilities of a “universal platform” for the deployment of analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform was previously used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium (VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed for the purpose of collecting data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the previous sampling and analysis methods employed at the Site.

  15. Decision making based on single and double acceptance sampling plans for assessing quality of lots

    Directory of Open Access Journals (Sweden)

    Ksenija Dumičic

    2012-01-01

Background: Acceptance sampling is a statistical tool of quality control. Sampling plans and operating characteristic (OC) curves are very useful for conducting acceptance sampling and provide the quality manager with tools to evaluate the quality of a production run or shipment. Different sampling plans have been developed, but single and double acceptance sampling plans are the ones most commonly used in practice. Objectives: The goal of the paper is to test whether applying a single versus a double sampling plan can lead to statistically significantly different conclusions about the quality level of an observed lot. Methods/Approach: Statistical tests of the difference in proportions are used to test whether there is a statistically significant difference in lot fraction defectives between a single and a double sampling plan at the same levels of probability of acceptance. Results: The analysis shows that in some cases there is a statistically significant difference. The quality manager should therefore be careful when choosing, instead of the first, a second sampling plan with different parameters, because in that way a statistically significantly different conclusion about the quality level of the observed lot could be reached. Conclusions: The paper shows that some intentional manipulation through the choice of different sampling plans is possible.
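
The probability-of-acceptance values underlying an OC curve follow directly from the binomial distribution: a single plan accepts when the number of defectives d in a sample of n is at most c, while a double plan draws a second sample whenever the first is inconclusive. A sketch with illustrative plan parameters (not taken from the paper):

```python
from math import comb

def binom_pmf(n, d, p):
    """P(exactly d defectives in a sample of n, lot fraction defective p)."""
    return comb(n, d) * p**d * (1 - p)**(n - d)

def pa_single(n, c, p):
    """Single plan: accept the lot if the sample of n has at most c defectives."""
    return sum(binom_pmf(n, d, p) for d in range(c + 1))

def pa_double(n1, c1, r1, n2, c2, p):
    """Double plan: accept if d1 <= c1, reject if d1 >= r1; otherwise draw a
    second sample of n2 and accept if the combined total d1 + d2 <= c2."""
    pa = pa_single(n1, c1, p)
    for d1 in range(c1 + 1, r1):
        pa += binom_pmf(n1, d1, p) * pa_single(n2, c2 - d1, p)
    return pa

# OC-curve points for a hypothetical single plan (n=50, c=1) versus a
# hypothetical double plan (n1=n2=32, c1=0, r1=2, c2=1)
for p in (0.01, 0.02, 0.05):
    print(p, round(pa_single(50, 1, p), 4), round(pa_double(32, 0, 2, 32, 1, p), 4))
```

Comparing the two columns at a fixed lot fraction defective is exactly the kind of probability-of-acceptance comparison the paper formalizes with tests of difference in proportions.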

  16. Validation and comparison of two sampling methods to assess dermal exposure to drilling fluids and crude oil.

    Science.gov (United States)

    Galea, Karen S; McGonagle, Carolyn; Sleeuwenhoek, Anne; Todd, David; Jiménez, Araceli Sánchez

    2014-06-01

    Dermal exposure to drilling fluids and crude oil is an exposure route of concern. However, there have been no published studies describing sampling methods or reporting dermal exposure measurements. We describe a study that aimed to evaluate a wipe sampling method to assess dermal exposure to an oil-based drilling fluid and crude oil, as well as to investigate the feasibility of using an interception cotton glove sampler for exposure on the hands/wrists. A direct comparison of the wipe and interception methods was also completed using pigs' trotters as a surrogate for human skin and a direct surface contact exposure scenario. Overall, acceptable recovery and sampling efficiencies were reported for both methods, and both methods had satisfactory storage stability at 1 and 7 days, although there appeared to be some loss over 14 days. The methods' comparison study revealed significantly higher removal of both fluids from the metal surface with the glove samples compared with the wipe samples (on average 2.5 times higher). Both evaluated sampling methods were found to be suitable for assessing dermal exposure to oil-based drilling fluids and crude oil; however, the comparison study clearly illustrates that glove samplers may overestimate the amount of fluid transferred to the skin. Further comparison of the two dermal sampling methods using additional exposure situations such as immersion or deposition, as well as a field evaluation, is warranted to confirm their appropriateness and suitability in the working environment.

  17. Assessment of the Sampling Performance of Multiple-Copy Dynamics versus a Unique Trajectory.

    Science.gov (United States)

    Perez, Juan J; Tomas, M Santos; Rubio-Martinez, Jaime

    2016-10-24

The goal of the present study was to ascertain the differential performance of a long molecular dynamics trajectory versus several shorter ones starting from different points in the phase space and covering the same sampling time. For this purpose, we selected the 16-mer peptide Bak16BH3 as a model for study and carried out several samplings in explicit solvent. These samplings included an 8 μs trajectory (sampling S1); two 4 μs trajectories (sampling S2); four 2 μs trajectories (sampling S3); eight 1 μs trajectories (sampling S4); sixteen 0.5 μs trajectories (sampling S5); and eighty 0.1 μs trajectories (sampling S6). Moreover, the 8 μs trajectory was further extended to 16 μs to provide reference values for the diverse properties measured. The diverse samplings were compared qualitatively and quantitatively. Among the former, we carried out a comparison of the conformational profiles of the peptide using cluster analysis. Moreover, we also gained insight into the interchange among these structures along the sampling process. Among the latter, we computed the number of new conformational patterns sampled with time using strings defined from the conformations attained by each of the residues in the peptide. We also compared the locations and depths of the obtained minima on the free energy surface using principal component analysis. Finally, we also compared the helical profiles per residue at the end of the sampling process. The results suggest that a few short molecular dynamics trajectories may provide better sampling than one unique trajectory. Moreover, this procedure can also be advantageous to avoid getting trapped in a local minimum. However, caution should be exercised, since short trajectories need to be long enough to overcome local barriers surrounding the starting point, and the required sampling time depends on the number of degrees of freedom of the system under study. An effective way to gain insight into the minimum MD trajectory length is to monitor the
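
The "new conformational patterns versus time" metric reduces to counting cumulative distinct strings as frames accrue; comparing one long trajectory against several short ones pooled together then amounts to comparing the resulting accumulation curves. A minimal sketch with toy per-residue state strings (a real string would encode, e.g., the backbone conformational state of each of the 16 residues):

```python
def pattern_accumulation(frames):
    """Cumulative count of distinct conformational strings after each frame."""
    seen, curve = set(), []
    for s in frames:
        seen.add(s)
        curve.append(len(seen))
    return curve

def combined_curve(trajectories):
    """Accumulation over several short trajectories pooled in sampling order."""
    pooled = [s for traj in trajectories for s in traj]
    return pattern_accumulation(pooled)

# Toy data: one long run versus three short runs of the same total length
one_long = ["HHC", "HHC", "HCC", "CCC", "CCC", "HCC"]
several_short = [["HHC", "HHC"], ["HCE", "HCC"], ["CCC", "ECE"]]
print(pattern_accumulation(one_long)[-1])   # distinct patterns, long run
print(combined_curve(several_short)[-1])    # distinct patterns, pooled short runs
```

In this toy example the pooled short runs visit more distinct patterns in the same number of frames, which mirrors the study's finding that several short trajectories starting from different points can sample more broadly than one long trajectory.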

  18. A comparison of two sampling designs for fish assemblage assessment in a large river

    Science.gov (United States)

    Kiraly, Ian A.; Coghlan Jr., Stephen M.; Zydlewski, Joseph; Hayes, Daniel

    2014-01-01

    We compared the efficiency of stratified random and fixed-station sampling designs to characterize fish assemblages in anticipation of dam removal on the Penobscot River, the largest river in Maine. We used boat electrofishing methods in both sampling designs. Multiple 500-m transects were selected randomly and electrofished in each of nine strata within the stratified random sampling design. Within the fixed-station design, up to 11 transects (1,000 m) were electrofished, all of which had been sampled previously. In total, 88 km of shoreline were electrofished during summer and fall in 2010 and 2011, and 45,874 individuals of 34 fish species were captured. Species-accumulation and dissimilarity curve analyses indicated that all sampling effort, other than fall 2011 under the fixed-station design, provided repeatable estimates of total species richness and proportional abundances. Overall, our sampling designs were similar in precision and efficiency for sampling fish assemblages. The fixed-station design was negatively biased for estimating the abundance of species such as Common Shiner Luxilus cornutus and Fallfish Semotilus corporalis and was positively biased for estimating biomass for species such as White Sucker Catostomus commersonii and Atlantic Salmon Salmo salar. However, we found no significant differences between the designs for proportional catch and biomass per unit effort, except in fall 2011. The difference observed in fall 2011 was due to limitations on the number and location of fixed sites that could be sampled, rather than an inherent bias within the design. Given the results from sampling in the Penobscot River, application of the stratified random design is preferable to the fixed-station design due to less potential for bias caused by varying sampling effort, such as what occurred in the fall 2011 fixed-station sample or due to purposeful site selection.
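
The species-accumulation analysis mentioned above can be sketched by repeatedly permuting the order in which sampled transects are added and averaging the cumulative species counts; the transect contents below are invented for illustration, not taken from the Penobscot data:

```python
import random

def species_accumulation(transects, permutations=200, seed=42):
    """Mean cumulative species richness as transects are added in random order."""
    rng = random.Random(seed)
    n = len(transects)
    totals = [0.0] * n
    for _ in range(permutations):
        order = list(transects)
        rng.shuffle(order)
        seen = set()
        for i, transect in enumerate(order):
            seen.update(transect)
            totals[i] += len(seen)
    return [t / permutations for t in totals]

# Hypothetical species sets per electrofished transect
transects = [
    {"white_sucker", "fallfish", "common_shiner"},
    {"white_sucker", "smallmouth_bass"},
    {"fallfish", "atlantic_salmon", "white_sucker"},
    {"smallmouth_bass", "redbreast_sunfish"},
]
curve = species_accumulation(transects)
print(curve)  # nondecreasing; the final value equals total observed richness
```

A curve that flattens before the last transect suggests the sampling effort is adequate to characterize total richness, which is how such curves are used to judge whether two designs give repeatable richness estimates.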

  19. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Cabalin, L.M.; Gonzalez, A. [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain); Ruiz, J. [Department of Applied Physics I, University of Malaga, E-29071 Malaga (Spain); Laserna, J.J., E-mail: laserna@uma.e [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain)

    2010-08-15

Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum of 2 m s{sup -1}. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions demonstrated that the analytical precision and accuracy of LIBS depend on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited poor precision (a high relative standard deviation).
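Precision figures of the kind reported here are typically expressed as a relative standard deviation (RSD) of replicate line intensities. A minimal sketch, with made-up replicate values contrasting a flat, clean sample against an irregular, dirty one (none of these numbers come from the paper):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation in percent: sample standard
    deviation divided by the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate LIBS emission-line intensities (arbitrary units).
flat_clean      = [1020, 1005, 998, 1012, 990]
irregular_dirty = [980, 1250, 640, 1100, 820]
```

Computing `rsd_percent` for each list makes the qualitative claim concrete: the flat, clean replicates cluster tightly (low RSD), while the irregular, dirty ones scatter widely (high RSD).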

  20. Assessment of borderline personality features in population samples: Is the Personality Assessment Inventory-Borderline Features scale measurement invariant across sex and age?

    NARCIS (Netherlands)

    Moor, de M.H.M.; Distel, M.A.; Trull, T.J.; Boomsma, D.I.

    2009-01-01

    Borderline personality disorder (BPD) is more often diagnosed in women than in men, and symptoms tend to decline with age. Using a large community sample, the authors investigated whether sex and age differences in four main features of BPD, measured with the Personality Assessment Inventory-Borderl

  1. Assessment of Borderline Personality Features in Population Samples: Is the Personality Assessment Inventory-Borderline Features Scale Measurement Invariant across Sex and Age?

    Science.gov (United States)

    De Moor, Marleen H. M.; Distel, Marijn A.; Trull, Timothy J.; Boomsma, Dorret I.

    2009-01-01

    Borderline personality disorder (BPD) is more often diagnosed in women than in men, and symptoms tend to decline with age. Using a large community sample, the authors investigated whether sex and age differences in four main features of BPD, measured with the "Personality Assessment Inventory-Borderline Features" scale (PAI-BOR; Morey,…

  2. Assessment of Airway Microbiota and Inflammation in Cystic Fibrosis Using Multiple Sampling Methods

    OpenAIRE

    Edith T Zemanick; Brandie D Wagner; Robertson, Charles E.; Stevens, Mark J.; Szefler, Stanley J; Accurso, Frank J.; Sagel, Scott D; Harris, J. Kirk

    2015-01-01

Rationale: Oropharyngeal (OP) swabs and induced sputum (IS) are used for airway bacteria surveillance in nonexpectorating children with cystic fibrosis (CF). Molecular analyses of these airway samples detect complex microbial communities. However, the optimal noninvasive sampling approach for microbiota analyses and the clinical relevance of microbiota, particularly its relationship to airway inflammation, are not well characterized.

  3. A Comparison of Momentary Time Sampling and Partial-Interval Recording for Assessment of Effects of Social Skills Training

    Science.gov (United States)

    Radley, Keith C.; O'Handley, Roderick D.; Labrot, Zachary C.

    2015-01-01

Assessment in social skills training often utilizes procedures such as partial-interval recording (PIR) and momentary time sampling (MTS) to estimate changes in the duration of social engagement due to intervention. Although previous research suggests PIR to be less accurate than MTS in estimating levels of behavior, treatment analysis decisions…

  4. STS 120 Return Samples: Assessment of Air Quality Aboard the Shuttle (STS-120) and International Space Station (10A)

    Science.gov (United States)

    James, John T.

    2008-01-01

    The toxicological assessments of 2 grab sample canisters (GSCs) from the Shuttle are reported. Formaldehyde badges were not used. Analytical methods have not changed from earlier reports. The recoveries of the 3 surrogates (C-13-acetone, fluorobenzene, and chlorobenzene) from the 2 GSCs averaged 111, 82, and 78%, respectively. The Shuttle atmosphere was acceptable for human respiration.

  5. Frequent sampling by clear venipuncture in unstable angina is a reliable method to assess haemostatic system activity

    NARCIS (Netherlands)

    Biasucci, L.M.; Liuzzo, G.; Caligiuri, G.; Monaco, C.; Quaranta, G.; Sperti, G.; Greef, W. van de; Maseri, A.; Kluft, C.

    1994-01-01

Sudden limitations in coronary flow account for the majority of cases of unstable angina (UA). Measurement of thrombin-antithrombin (TAT) complexes in peripheral blood represents a reliable marker of an ongoing thrombotic process. The aim of the study was to assess the reliability of frequent blood sampling (Phase A) and to correlate TAT fluctuation t

  6. Detection of Cervical Cancer and High Grade Neoplastic Lesions by a Combination of Liquid‐Based Sampling Preparation and DNA Measurements Using Automated Image Cytometry

    Directory of Open Access Journals (Sweden)

    Xiao Rong Sun

    2005-01-01

Objective: To establish whether measurements of DNA ploidy could be used to assist cytopathologists and cytotechnologists in population-based cervical cancer screening programs in countries where manually reading the slides is impossible due to the lack of sufficient skilled cytotechnologists. The goal of such a program is to identify only clinically significant lesions, i.e. those where a clinical intervention to remove the lesion is required immediately. Study Design: A total of 9905 women were enrolled in the study. Cervical samples were taken with a cervix brush that was then placed into a fixative solution. The cells were separated from mucus by mechanical and chemical treatment and then deposited onto microscope slides by a cytocentrifuge. Two slides were prepared from each case; one slide was stained by Papanicolaou stain for manual cytology examination, while the other slide was stained by a DNA-specific stain. The latter slide was used to determine the relative amount of DNA in the cell nuclei. Results: A total of 876 women were followed by colposcopy examination, where biopsies were taken from the visible lesions or from suspicious areas; histopathology diagnosed 459 as normal or benign cases, 325 as CIN1, 36 as CIN2, 25 as CIN3/CIS, and 31 as invasive cancer. Of these 876 cases, manual cytology classified 655 as normal or ASCUS, 197 as LSIL, 16 as HSIL, and 8 as cancer. DNA measurements found 704 cases having no cells with DNA greater than 5c, 98 cases with 1 or 2 cells having a DNA amount greater than 5c, and 74 cases with 3 or more cells having a DNA amount greater than 5c. If manual cytology were used to refer all cases of HSIL and cancer to colposcopy and biopsy, 23 lesions that had to be removed would have been discovered (2 CIN2, 11 CIN3/CIS, and 10 cancers), for a sensitivity of 25.0±5.2% at a specificity of 99.9±0.1%. If DNA-assisted cytology were to be used instead, and all cases having 3 or more cells with
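The triage rule described in this record - counting nuclei whose DNA content exceeds 5c and flagging slides with three or more such cells - can be sketched as a simple classifier. The cutoffs are taken from the abstract; the function name and return labels are illustrative, not from the study:

```python
def dna_ploidy_triage(dna_per_nucleus, ploidy_cutoff=5.0, cell_cutoff=3):
    """Classify a slide by how many nuclei exceed the DNA ploidy cutoff
    (in c units): 'negative' (no such cells), 'review' (1-2 cells),
    'refer' (cell_cutoff or more cells)."""
    aneuploid = sum(1 for c in dna_per_nucleus if c > ploidy_cutoff)
    if aneuploid >= cell_cutoff:
        return "refer"
    return "review" if aneuploid >= 1 else "negative"
```

Under this rule the 74 cases with 3 or more >5c cells in the study would fall into the "refer" bin, the 98 cases with 1-2 such cells into "review", and the remaining 704 into "negative".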

  7. Murine Automated Urine Sampler (MAUS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal outlines planned development for a low-power, low-mass automated urine sample collection and preservation system for small mammals, capable of...

  8. Mutanalyst, an online tool for assessing the mutational spectrum of epPCR libraries with poor sampling

    DEFF Research Database (Denmark)

    Ferla, Matteo

    2016-01-01

Background: Assessing library diversity is an important control step in a directed evolution experiment. To do this, a limited number of colonies from a test library are sequenced and tested. In the case of an error-prone PCR library, the spectrum of the identified mutations - the proportions... of mutations of a specific nucleobase to another - is calculated, enabling the user to make more informed predictions about library diversity and coverage. However, the calculation of the mutational spectrum is severely affected by the limited sample size. Results: Here an online program, called Mutanalyst... of mutations per sequence; it does so by fitting to a Poisson distribution, which is more robust than calculating the average in light of the small sample size. Conclusion: As a result of the added measures taking the small sample size into account, the user can better assess whether the library is satisfactory...
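Fitting the observed distribution of mutations per sequenced clone to a Poisson, rather than simply averaging, can be sketched as a grid-search least-squares fit of the empirical frequencies to the Poisson pmf. This is an illustration of the general idea, not Mutanalyst's actual implementation:

```python
import math
from collections import Counter

def fit_poisson_lambda(counts, lam_max=10.0, step=0.01):
    """Least-squares fit of observed mutation-count frequencies to a
    Poisson pmf, scanning lambda on a grid."""
    n = len(counts)
    freq = Counter(counts)
    best_lam, best_err = step, float("inf")
    lam = step
    while lam <= lam_max:
        # Squared error between empirical and Poisson probabilities.
        err = sum((freq.get(k, 0) / n
                   - math.exp(-lam) * lam**k / math.factorial(k)) ** 2
                  for k in range(max(counts) + 1))
        if err < best_err:
            best_lam, best_err = lam, err
        lam += step
    return best_lam

# Mutations found in each of ten hypothetical sequenced clones.
mutation_counts = [0, 1, 1, 2, 0, 1, 3, 1, 2, 0]
lam_hat = fit_poisson_lambda(mutation_counts)
```

With only a handful of clones, the fitted rate `lam_hat` should land near the sample mean but is less sensitive to a single outlier clone than the raw average.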

  9. Assessment of heavy metals in Averrhoa bilimbi and A. carambola fruit samples at two developmental stages.

    Science.gov (United States)

    Soumya, S L; Nair, Bindu R

    2016-05-01

Though the fruits of Averrhoa bilimbi and A. carambola are economically and medicinally important, they remain underutilized. The present study reports heavy metal quantitation in the fruit samples of A. bilimbi and A. carambola (Oxalidaceae), collected at two stages of maturity. Heavy metals are known to interfere with the functioning of vital cellular components. Although toxic, some elements are considered essential for human health in trace quantities. Heavy metals such as Cr, Mn, Co, Cu, Zn, As, Se, Pb, and Cd were analyzed by atomic absorption spectroscopy (AAS). The samples under investigation included A. bilimbi unripe (BU) and ripe (BR), A. carambola sour unripe (CSU) and ripe (CSR), and A. carambola sweet unripe (CTU) and ripe (CTR). Heavy metal analysis showed that relatively higher levels of heavy metals were present in BR samples than in the rest of the samples. The highest amounts of As and Se were recorded in BU samples, while Mn content was highest in CSU samples and Co in CSR. The least amounts of Cr, Zn, Se, Cd, and Pb were noted in CTU, while Mn, Cu, and As were lowest in CTR. Thus, the sweet types of A. carambola (CTU, CTR) had comparatively lower heavy metal content. There appears to be no reason for concern, since the different fruit samples of Averrhoa studied here showed only trace quantities of the various heavy metals.

  10. Effect of preservation method on the assessment of bacterial community structure in soil and water samples.

    Science.gov (United States)

    Tatangelo, Valeria; Franzetti, Andrea; Gandolfi, Isabella; Bestetti, Giuseppina; Ambrosini, Roberto

    2014-07-01

The methods used in sample preservation may affect the description of the microbial community structure by DNA-based techniques. This study aims at evaluating the effect of different storage conditions, including freezing, adding two liquid-based preservatives or simply storing samples with no preservative, on the structure of the microbial communities in aliquots of organic-rich soil and water samples, as revealed by terminal restriction fragment length polymorphism (T-RFLP) analysis. The results showed that the number of terminal restriction fragments (TRFs) detected in soil aliquots stored with LifeGuard(™) solution was significantly lower than that of samples analyzed immediately after sampling. Moreover, cluster and PCA analyses showed that soil aliquots stored using LifeGuard(™) clustered separately from those stored with the other methods. Conversely, soil and water aliquots stored with DMSO-EDTA-salt solution showed neither a significant reduction in the number of TRFs nor any change in the structure of the microbial community. Finally, the number of TRFs and the structure of microbial communities from soil aliquots stored with no preservative did not differ from those of aliquots analyzed immediately after sampling. Preservation methods should therefore be carefully evaluated before collecting samples that have to be stored for a long time before DNA extraction.

  11. Análise de fármacos em material biológico: acoplamento microextração em fase sólida "no tubo" e cromatografia líquida de alta eficiência Analysis of drugs in biological samples: automated "in-tube" solid-phase microextraction and high performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Maria Eugênia C. Queiroz

    2005-10-01

A new solid-phase microextraction (SPME) system, known as in-tube SPME, was recently developed using an open tubular fused-silica capillary column, instead of an SPME fiber, as the SPME device. On-line in-tube SPME is usually used in combination with high performance liquid chromatography. Drugs in biological samples are directly extracted and concentrated in the stationary phase of capillary columns by repeated draw/eject cycles of sample solution, and then directly transferred to the liquid chromatographic column. In-tube SPME is suitable for automation. Automated sample handling procedures not only shorten the total analysis time, but also usually provide better accuracy and precision relative to manual techniques. In-tube SPME has been demonstrated to be a very effective and highly sensitive technique to determine drugs in biological samples for various purposes such as therapeutic drug monitoring, clinical toxicology, bioavailability and pharmacokinetics.

  12. Automated extraction of lysergic acid diethylamide (LSD) and N-demethyl-LSD from blood, serum, plasma, and urine samples using the Zymark RapidTrace with LC/MS/MS confirmation.

    Science.gov (United States)

    de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X

    1998-05-01

A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) tandem mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different-day) relative standard deviations (RSDs) were 2.2% and 4.4%, respectively.
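LOD and LOQ figures like those reported here are commonly estimated from a linear calibration curve as 3.3 and 10 times the residual standard deviation divided by the slope (the ICH-style approach). A minimal sketch with invented calibration data; none of the numbers below come from the paper, and the paper does not state which estimation method was used:

```python
def lod_loq(concs, responses):
    """Estimate LOD and LOQ from a linear calibration curve:
    LOD = 3.3*s/slope, LOQ = 10*s/slope, where s is the residual
    standard deviation of the ordinary least-squares fit."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(responses) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, responses))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(concs, responses)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return 3.3 * s / slope, 10 * s / slope

# Hypothetical calibration: concentrations in ng/mL vs. peak-area ratios.
lod, loq = lod_loq([0.05, 0.1, 0.5, 1.0, 2.0], [11, 20, 101, 198, 405])
```

By construction LOQ/LOD = 10/3.3 ≈ 3, which is consistent with the roughly twofold gap between the 0.025 ng/mL LOD and 0.05 ng/mL LOQ quoted above only as an order-of-magnitude sanity check.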