WorldWideScience

Sample records for automated sampling assessment

  1. Automated sampling assessment for molecular simulations using the effective sample size

    CERN Document Server

    Zhang, Xin; Zuckerman, Daniel M

    2010-01-01

    To quantify the progress in development of algorithms and force fields used in molecular simulations, a method for the assessment of sampling quality is needed. We propose a general method to assess sampling quality through the estimation of the number of independent samples obtained from molecular simulations. This method is applicable to both dynamic and nondynamic methods and utilizes the variance in the populations of physical states to determine the effective sample size (ESS). We test the correctness and robustness of our procedure in a variety of systems: a two-state toy model, all-atom butane, coarse-grained calmodulin, all-atom dileucine and Met-enkephalin. We also introduce an automated procedure to obtain approximate physical states from dynamic trajectories; this procedure allows for sample-size estimation for systems for which physical states are not known in advance.
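
    As a rough illustration of the population-variance idea described above, the following Python sketch estimates an ESS from a discretized trajectory by comparing block-to-block fluctuations of state populations with the variance expected from independent samples. The function name, block count, and the conservative min-over-states summary are choices made here for illustration, not the authors' exact estimator.

      import numpy as np

      def effective_sample_size(state_traj, n_blocks=10):
          # Split the trajectory into contiguous blocks and measure how the
          # state populations fluctuate from block to block.
          states = np.unique(state_traj)
          blocks = np.array_split(np.asarray(state_traj), n_blocks)
          pops = np.array([[np.mean(b == s) for s in states] for b in blocks])
          p_mean = pops.mean(axis=0)           # overall population of each state
          var_obs = pops.var(axis=0, ddof=1)   # observed block-to-block variance
          # For N independent samples per block, Var(p_hat) = p * (1 - p) / N,
          # so N_eff = p * (1 - p) / Var_obs (one estimate per state).
          with np.errstate(divide="ignore", invalid="ignore"):
              n_eff_per_block = p_mean * (1.0 - p_mean) / var_obs
          n_eff_per_block = n_eff_per_block[np.isfinite(n_eff_per_block)]
          return n_blocks * float(np.min(n_eff_per_block))  # conservative total ESS

      # A strongly correlated two-state trajectory: 10,000 frames, but each
      # state visit persists for 50 frames, so the ESS is far below 10,000.
      rng = np.random.default_rng(0)
      traj = np.repeat(rng.integers(0, 2, size=200), 50)
      print(effective_sample_size(traj))  # on the order of 200, not 10,000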

  2. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B;

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  3. Precise and automated microfluidic sample preparation.

    Energy Technology Data Exchange (ETDEWEB)

    Crocker, Robert W.; Patel, Kamlesh D.; Mosier, Bruce P.; Harnett, Cindy K.

    2004-07-01

    Autonomous bio-chemical agent detectors require sample preparation involving multiplex fluid control. We have developed a portable microfluidic pump array for metering sub-microliter volumes at flow rates of 1-100 µL/min. Each pump is composed of an electrokinetic (EK) pump and high-voltage power supply with 15-Hz feedback from flow sensors. The combination of high pump fluid impedance and active control results in precise fluid metering with nanoliter accuracy. Automated sample preparation will be demonstrated by labeling proteins with fluorescamine and subsequent injection into a capillary gel electrophoresis (CGE) chip.

  4. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, in many applications the overall flow that an animator observes and intends to preserve in the low-resolution simulation is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  5. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system, (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints of...

  6. Automating Spreadsheet Discovery & Risk Assessment

    CERN Document Server

    Perry, Eric

    2008-01-01

    Many articles have been published about the risks of uncontrolled spreadsheets and the resulting mishaps in today's business environment, including non-compliance, operational risk, errors, and fraud, all leading to significant loss events. Spreadsheets fall into the realm of end-user developed applications and often lack the proper safeguards and controls that an IT organization would enforce for enterprise applications. There is also an overall lack of software programming discipline enforced in how spreadsheets are developed. However, before an organization can apply proper controls and discipline to critical spreadsheets, an accurate and living inventory of spreadsheets across the enterprise must be created, and all critical spreadsheets must be identified. As such, this paper proposes an automated approach to the initial stages of the spreadsheet management lifecycle - discovery, inventory and risk assessment. Without the use of technology, these phases are often treated as a one-off project. By leveraging techn...

  7. Automated system for fractionation of blood samples

    Energy Technology Data Exchange (ETDEWEB)

    Lee, N. E.; Genung, R. K.; Johnson, W. F.; Mrochek, J. E.; Scott, C. D.

    1978-01-01

    A prototype system for preparing multiple fractions of blood components (plasma, washed red cells, and hemolysates) using automated techniques has been developed. The procedure is based on centrifugal separation and differential pressure-induced transfer in a rotor that has been designed to process numerous samples simultaneously. Red cells are sedimented against the outer walls of the sample chamber, and plasma is siphoned, by imposition of either a slight positive or negative pressure, into individual reservoirs in a collection ring. Washing of cells is performed in situ; samples of washed cells, either packed or in saline solution, can be recovered. Cellular hemolysates are prepared and automatically transferred to individual, commercially available collection vials ready for storage in liquid nitrogen or immediate analysis. The system has potential application in any biomedical area which requires high sample throughput and in which one or more of the blood fractions will be used. A separate unit has been designed and developed for the semiautomated cleaning of the blood processing vessel.

  8. Automated Assessment, Face to Face

    OpenAIRE

    Rizik M. H. Al-Sayyed; Amjad Hudaib; Muhannad AL-Shboul; Yousef Majdalawi; Mohammed Bataineh

    2010-01-01

    This research paper evaluates the usability of automated exams and compares them with the paper-and-pencil traditional ones. It presents the results of a detailed study conducted at The University of Jordan (UoJ) that comprised students from 15 faculties. A set of 613 students were asked about their opinions concerning automated exams; and their opinions were deeply analyzed. The results indicate that most students reported that they are satisfied with using automated exams but they have sugg...

  9. Automated Data Quality Assessment of Marine Sensors

    OpenAIRE

    Smith, Daniel V; Leon Reznik; Paulo A. Souza; Timms, Greg P.

    2011-01-01

    The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) means that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected is fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classific...

  10. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    This study investigates automated data accuracy assessment as described in the data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry for a 10-year publication period, retrieved from the two bibliometric data sources, Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard), and an automatic assessment method is compared to a manual one. The results show that the manual assessment method yields truer accuracy scores. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method of bibliographic data accuracy as well as defining the impact of data accuracy on the citation matching process.
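
    The per-field versus per-record distinction the study reports is easy to make concrete. Below is a minimal Python sketch with invented field names, where a naive exact-match rule stands in for the study's more elaborate, source-specific scoring rules.

      def field_accuracy(records, gold):
          # Compare each record to its gold-standard counterpart, scoring
          # accuracy per field and per whole record (all fields correct).
          fields = list(gold[0].keys())
          per_field = {f: 0 for f in fields}
          per_record = 0
          for rec, ref in zip(records, gold):
              all_ok = True
              for f in fields:
                  # Naive normalization; real rules would tolerate e.g.
                  # abbreviated journal titles or author initials.
                  ok = str(rec.get(f, "")).strip().lower() == str(ref[f]).strip().lower()
                  per_field[f] += ok
                  all_ok &= ok
              per_record += all_ok
          n = len(gold)
          return {f: c / n for f, c in per_field.items()}, per_record / n

      recs = [{"title": "A Study", "year": "2005"}, {"title": "B Study", "year": "2006"}]
      gold = [{"title": "A study", "year": "2005"}, {"title": "B Study", "year": "2007"}]
      # Field scores exceed the whole-record score, mirroring the finding above.
      print(field_accuracy(recs, gold))  # ({'title': 1.0, 'year': 0.5}, 0.5)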

  11. Six Key Topics for Automated Assessment Utilisation and Acceptance

    Directory of Open Access Journals (Sweden)

    Torsten REINERS

    2011-04-01

    Automated assessment technologies have been used in education for decades (e.g., computerised multiple choice tests). In contrast, Automated Essay Grading (AEG) technologies have existed for decades; are 'good in theory' (e.g., as accurate as humans, temporally and financially efficient, and able to enhance formative feedback); and yet are ostensibly used comparatively infrequently in Australian universities. To empirically examine these experiential observations we conducted a national survey to explore the use of automated assessment in Australian universities and examine why adoption of AEG is limited. Quantitative and qualitative data were collected in an online survey from a sample of 265 staff and students from 5 Australian universities. The type of assessment used by the greatest proportion of respondents was essays/reports (82.6%), however very few respondents had used AEG (3.8%). Recommendations are made regarding methods to promote technology utilisation, including the use of innovative dissemination channels such as 3D Virtual Worlds.

  12. Constructing Aligned Assessments Using Automated Test Construction

    Science.gov (United States)

    Porter, Andrew; Polikoff, Morgan S.; Barghaus, Katherine M.; Yang, Rui

    2013-01-01

    We describe an innovative automated test construction algorithm for building aligned achievement tests. By incorporating the algorithm into the test construction process, along with other test construction procedures for building reliable and unbiased assessments, the result is much more valid tests than result from current test construction…

  13. Automated Autonomy Assessment System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA has expressed the need to assess crew autonomy relative to performance and evaluate an optimal level of autonomy that maximizes individual and team...

  14. Automated Bone Age Assessment: Motivation, Taxonomies, and Challenges

    Directory of Open Access Journals (Sweden)

    Marjan Mansourvar

    2013-01-01

    Bone age assessment (BAA) of unknown people is one of the most important topics in clinical procedures for evaluating the biological maturity of children. BAA is usually performed by comparing an X-ray of the left hand and wrist with an atlas of known sample bones. Recently, BAA has gained remarkable ground in academia and medicine. Manual methods of BAA are time-consuming and prone to observer variability. This is a motivation for developing automated methods of BAA. However, while there is considerable research on automated assessment, much of it is still in the experimental stage. This survey provides a taxonomy of automated BAA approaches and discusses the challenges. Finally, we present suggestions for future research.

  15. Automated Phone Assessments and Hospital Readmissions.

    Science.gov (United States)

    Olsen, Russell; Courtemanche, Ted; Hodach, Richard

    2016-04-01

    This analysis examined the efficacy of an automated postdischarge phone assessment for reducing hospital readmissions. All patients discharged between April 1, 2013, and January 31, 2014, from a single Level 1 trauma hospital of a large regional health system utilizing an automated postdischarge phone assessment service were contacted via automated call between 24 and 72 hours post discharge. Patients answered 5 questions assessing perceived well-being, understanding of discharge instructions and medication regimen, satisfaction, and scheduled follow-up appointments. Responses could automatically prompt health personnel to speak directly with the patient. Data analysis examined rates of hospital readmission (any admission occurring within 30 days of a previous admission) for 3 broad categories of respondents: Answering Machine, Live Answer, and Unsuccessful. There were 6867 discharges included in the analysis. Of the Live Answer patients, 3035 answered all assessment questions; 153 (5.0%) of these had a subsequent readmission. Of the 738 Unsuccessful patients, 62 (8.4%) had a subsequent readmission. Unsuccessful patients were almost 2 times more likely to have a readmission than those who answered all 5 assessment questions. Of the latter group, readmission rates were highest for those who perceived a worsening of their condition (7.4%), and lowest for those reporting no follow-up appointment scheduled (3.8%). (Population Health Management 2016;19:120-124). PMID:26057571

  16. Automated PolyU Palmprint sample Registration and Coarse Classification

    CERN Document Server

    M., Dhananjay D; Muralikrishna, I V

    2011-01-01

    Biometric based authentication for secured access to resources has gained importance due to their reliable, invariant and discriminating features. Palmprint is one such biometric entity. Prior to classification and identification, registering a sample palmprint is an important activity. In this paper we propose a computationally effective method for automated registration of samples from the PolyU palmprint database. In our approach we preprocess the sample and trace the border to find the point nearest to the center of the sample. The angle between the vector representing the nearest point and a vector passing through the center is used for automated palm sample registration. The angle of inclination between the start and end points of the heart line and life line is used for basic classification of palmprint samples into left and right classes.
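
    The registration step described here reduces to a small geometric computation. A hedged Python sketch is below; the reference axis, toy coordinates, and function name are assumptions made for illustration, since the paper's preprocessing and border tracing are not reproduced.

      import numpy as np

      def registration_angle(border_points, center):
          # Vector from the center of the sample to the nearest traced border
          # point; its angle against a fixed reference axis through the center
          # gives the rotation used to register the sample.
          border = np.asarray(border_points, dtype=float)
          center = np.asarray(center, dtype=float)
          dists = np.linalg.norm(border - center, axis=1)
          v = border[np.argmin(dists)] - center
          ref = np.array([1.0, 0.0])  # assumed horizontal axis through the center
          cos_ang = np.dot(v, ref) / (np.linalg.norm(v) * np.linalg.norm(ref))
          return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

      # Toy border trace of three points around an assumed center
      border = [(10.0, 5.0), (12.0, 20.0), (30.0, 2.0)]
      print(registration_angle(border, center=(15.0, 10.0)))  # 135 degrees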

  17. Sample Tracking in an Automated Cytogenetic Biodosimetry Laboratory for Radiation Mass Casualties

    OpenAIRE

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W F; Prasanna, P.G.S.

    2007-01-01

    Chromosome aberration-based dicentric assay is expected to be used after mass casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed ...

  18. An automated 55 GHz cryogenic Josephson sampling oscilloscope

    DEFF Research Database (Denmark)

    Bodin, P.; Jacobsen, M. L.; Kyhle, Anders; Hansen, Jørn Bindslev; Davidson, A.; Brady, M.; Olsen, L.; Qualmann, Werner

    1993-01-01

    A computer-automated superconductive 55 GHz sampling oscilloscope based on 4 kA/cm2, Nb/Nb2O5/Pb edge Josephson junctions is presented. The Josephson sampler chip was flip-chip bonded to a carrier chip with a coplanar transmission line by use of a novel flip-chip bonding machine. A 5.6 ps step...

  19. An Automated Home Made Low Cost Vibrating Sample Magnetometer

    CERN Document Server

    Kundu, S

    2011-01-01

    The design and operation of a homemade, low-cost vibrating sample magnetometer is described here. The sensitivity of this instrument is better than 10^-2 emu, and it is found to be very efficient for measuring the magnetization of most ferromagnetic and other magnetic materials as a function of temperature down to 77 K and magnetic field up to 800 Oe. Both M(H) and M(T) data acquisition are fully automated, employing a computer and LabVIEW software

  20. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    Science.gov (United States)

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis, to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with internationally harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. PMID:26520383

  21. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282
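
    The population-identification comparison described above boils down to scoring an automated partition of events against manual gates. A minimal Python sketch of one common overlap score, an F-measure matched per manual population, is shown below; the FlowCAP evaluation uses additional measures and careful handling of ungated events, so treat this as illustrative only.

      import numpy as np

      def f_measure(manual, automated):
          # For each manually gated population, find the best-matching
          # automated cluster by F-measure, then average over populations
          # weighted by population size.
          manual, automated = np.asarray(manual), np.asarray(automated)
          scores, weights = [], []
          for m in np.unique(manual):
              in_m = manual == m
              best = 0.0
              for a in np.unique(automated):
                  in_a = automated == a
                  tp = np.sum(in_m & in_a)
                  if tp == 0:
                      continue
                  prec = tp / np.sum(in_a)
                  rec = tp / np.sum(in_m)
                  best = max(best, 2 * prec * rec / (prec + rec))
              scores.append(best)
              weights.append(np.sum(in_m))
          return np.average(scores, weights=weights)

      # Five events, two populations; automated labels are permuted IDs
      print(f_measure([0, 0, 1, 1, 1], [1, 1, 0, 0, 1]))  # 0.8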

  22. Automating Groundwater Sampling At Hanford, The Next Step

    International Nuclear Information System (INIS)

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very 'people intensive.' Approximately 1500 wells are sampled each year by field personnel or 'samplers.' These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from 2 official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a work flow process where the data is transferred to the database and the electronic form is filed in managed records - thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  23. AUTOMATING GROUNDWATER SAMPLING AT HANFORD THE NEXT STEP

    Energy Technology Data Exchange (ETDEWEB)

    CONNELL CW; CONLEY SF; HILDEBRAND RD; CUNNINGHAM DE; R_D_Doug_Hildebrand@rl.gov; DeVon_E_Cunningham@rl.gov

    2010-01-21

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very "people intensive." Approximately 1500 wells are sampled each year by field personnel or "samplers." These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from 2 official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a work flow process where the data is transferred to the database and the electronic form is filed in managed records - thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  24. Automated assessment of mobility in bedridden patients.

    Science.gov (United States)

    Bennett, Stephanie; Goubran, Rafik; Rockwood, Kenneth; Knoefel, Frank

    2013-01-01

    Immobility in older patients is a costly problem for both patients and healthcare workers. The Hierarchical Assessment of Balance and Mobility (HABAM) is a clinical tool able to assess immobile patients and predict morbidity, yet could become more reliable and informative through automation. This paper proposes an algorithm to automatically determine which of three enacted HABAM scores (associated with bedridden patients) had been performed by volunteers. A laptop was used to gather pressure data from three mats placed on a standard hospital bed frame while five volunteers performed three enactments each. A system of algorithms was created, consisting of three subsystems. The first subsystem used mattress data to calculate individual sensor sums and eliminate the weight of the mattress. The second subsystem established a baseline pressure reading for each volunteer and used percentage change to identify and distinguish between two enactments. The third subsystem used calculated weight distribution ratios to determine if the data represented the remaining enactment. The system was tested for accuracy by inputting the volunteer data and recording the assessment output (a score per data set). The system identified 13 of 15 sets of volunteer data as expected. Examination of these results indicated that the two sets of data were not misidentified; rather, the volunteers had made mistakes in performance. These results suggest that this system of algorithms is effective in distinguishing between the three HABAM score enactments examined here, and emphasizes the potential for pervasive computing to improve traditional healthcare. PMID:24110676
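
    To make the three-subsystem design concrete, here is a toy Python sketch along the lines described above: subtract the mattress weight, compare total pressure against a per-patient baseline, and use the weight-distribution ratio for the remaining enactment. The thresholds, mat geometry, and labels are invented for illustration; the paper's actual classifications follow the HABAM enactments.

      import numpy as np

      def classify_enactment(frame, mattress, baseline_sum,
                             change_threshold=0.15, upper_ratio=0.6):
          # Subsystem 1: remove the constant weight of the mattress.
          net = frame - mattress
          total = net.sum()
          # Subsystem 2: percentage change against the patient's baseline.
          change = abs(total - baseline_sum) / baseline_sum
          # Subsystem 3: fraction of weight on the upper half of the bed.
          upper = net[: net.shape[0] // 2].sum()
          ratio = upper / total if total else 0.0
          if change > change_threshold:
              return "enactment A (large load change)"
          if ratio > upper_ratio:
              return "enactment B (weight shifted toward head of bed)"
          return "enactment C (lying still)"

      rng = np.random.default_rng(1)
      mattress = np.full((16, 8), 2.0)                   # empty-bed reading
      frame = mattress + rng.uniform(0.5, 1.0, (16, 8))  # patient lying still
      baseline = (frame - mattress).sum()
      print(classify_enactment(frame, mattress, baseline))  # enactment C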

  25. Components for automated microfluidics sample preparation and analysis

    Science.gov (United States)

    Archer, M.; Erickson, J. S.; Hilliard, L. R.; Howell, P. B., Jr.; Stenger, D. A.; Ligler, F. S.; Lin, B.

    2008-02-01

    The increasing demand for portable devices to detect and identify pathogens represents an interdisciplinary effort between engineering, materials science, and molecular biology. Automation of both sample preparation and analysis is critical for performing multiplexed analyses on real world samples. This paper selects two possible components for such automated portable analyzers: modified silicon structures for use in the isolation of nucleic acids and a sheath flow system suitable for automated microflow cytometry. Any detection platform that relies on the genetic content (RNA and DNA) present in complex matrices requires careful extraction and isolation of the nucleic acids in order to ensure their integrity throughout the process. This sample pre-treatment step is commonly performed using commercially available solid phases along with various molecular biology techniques that require multiple manual steps and dedicated laboratory space. Regardless of the detection scheme, a major challenge in the integration of total analysis systems is the development of platforms compatible with current isolation techniques that will ensure the same quality of nucleic acids. Silicon is an ideal candidate for solid phase separations since it can be tailored structurally and chemically to mimic the conditions used in the laboratory. For analytical purposes, we have developed passive structures that can be used to fully ensheath one flow stream with another. As opposed to traditional flow focusing methods, our sheath flow profile is truly two dimensional, making it an ideal candidate for integration into a microfluidic flow cytometer. Such a microflow cytometer could be used to measure targets captured on either antibody- or DNA-coated beads.

  26. An automated 55 GHz cryogenic Josephson sampling oscilloscope

    International Nuclear Information System (INIS)

    A computer-automated superconductive 55 GHz sampling oscilloscope based on 4 kA/cm2, Nb/Nb2O5/Pb edge Josephson junctions is presented. The Josephson sampler chip was flip-chip bonded to a carrier chip with a coplanar transmission line by use of a novel flip-chip bonding machine. A 5.6 ps step pulse was successfully coupled in to the transmission line and 18.5 GHz multiple reflections plus a parasitic oscillation at 43 GHz were observed

  27. Digital microfluidic hub for automated nucleic acid sample preparation.

    Energy Technology Data Exchange (ETDEWEB)

    He, Jim; Bartsch, Michael S.; Patel, Kamlesh D.; Kittlaus, Eric A.; Remillared, Erin M.; Pezzola, Genevieve L.; Renzi, Ronald F.; Kim, Hanyoup

    2010-07-01

    We have designed, fabricated, and characterized a digital microfluidic (DMF) platform to function as a central hub for interfacing multiple lab-on-a-chip sample processing modules towards automating the preparation of clinically-derived DNA samples for ultrahigh throughput sequencing (UHTS). The platform enables plug-and-play installation of a two-plate DMF device with consistent spacing, offers flexible connectivity for transferring samples between modules, and uses an intuitive programmable interface to control droplet/electrode actuations. Additionally, the hub platform uses transparent indium-tin oxide (ITO) electrodes to allow complete top and bottom optical access to the droplets on the DMF array, providing additional flexibility for various detection schemes.

  28. Validation of Automated Scoring of Science Assessments

    Science.gov (United States)

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  29. Automated Intelligibility Assessment of Pathological Speech Using Phonological Features

    Directory of Open Access Journals (Sweden)

    Catherine Middag

    2009-01-01

    It is commonly acknowledged that word or phoneme intelligibility is an important criterion in the assessment of the communication efficiency of a pathological speaker. People have therefore put a lot of effort into the design of perceptual intelligibility rating tests. These tests usually have the drawback that they employ unnatural speech material (e.g., nonsense words) and that they cannot fully exclude errors due to listener bias. Therefore, there is a growing interest in the application of objective automatic speech recognition technology to automate the intelligibility assessment. Current research is headed towards the design of automated methods which can be shown to produce ratings that correspond well with those emerging from a well-designed and well-performed perceptual test. In this paper, a novel methodology that is built on previous work (Middag et al., 2008) is presented. It utilizes phonological features, automatic speech alignment based on acoustic models that were trained on normal speech, context-dependent speaker feature extraction, and intelligibility prediction based on a small model that can be trained on pathological speech samples. The experimental evaluation of the new system reveals that the root mean squared error of the discrepancies between perceived and computed intelligibilities can be as low as 8 on a scale of 0 to 100.

  30. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis. PMID:25384595

  31. Automated remedial assessment methodology software system

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies.

  32. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies

  33. Semicontinuous automated measurement of organic carbon in atmospheric aerosol samples.

    Science.gov (United States)

    Lu, Chao; Rashinkar, Shilpa M; Dasgupta, Purnendu K

    2010-02-15

    A fully automated measurement system for ambient aerosol organic carbon, capable of unattended operation over extended periods, is described. Particles are collected in a cyclone with water as the collection medium. The collected sample is periodically aspirated by a syringe pump into a holding loop and then delivered to a wet oxidation reactor (WOR). Acid is added, and the WOR is purged to measure dissolved CO2 or inorganic carbonates (IC) as evolved CO2. The IC background can often be small and sufficiently constant to be corrected for, without separate measurement, by a blank subtraction. The organic material is now oxidized stepwise or in one step to CO2. The one-step oxidation involves UV-persulfate treatment in the presence of ozone. This treatment converts organic carbon (OC) to CO2, but elemental carbon is not oxidized. The CO2 is continuously purged from solution and collected by two sequential miniature diffusion scrubbers (DSs), a short DS preceding a longer one. Each DS consists of a LiOH-filled porous hydrophobic membrane tube with terminal stainless steel tubes that function as conductance-sensing electrodes. As CO2 is collected by the LiOH-filled DSs, hydroxide is converted into carbonate and the resulting decrease in conductivity is monitored. The simultaneous use of the dual short and long DS units bearing different concentrations of LiOH permits both good sensitivity and a large dynamic range. The limit of detection (LOD, S/N = 3) is approximately 140 ng of C. With a typical sampling period of 30 min at a sampling rate of 30 L/min, this corresponds to an LOD of 160 ng/m3. The approach also provides information on the ease of oxidation of the carbonaceous aerosol and hence the nature of the carbon contained therein. Ambient aerosol organic carbon data are presented. PMID:20092351
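
    The quoted airborne LOD follows directly from the collected-carbon LOD and the sampled air volume; the short check below reproduces it.

      # Check of the airborne LOD quoted above: 140 ng of carbon collected
      # from 30 min of sampling at 30 L/min, i.e. 0.9 cubic meters of air.
      mass_ng = 140.0
      volume_m3 = 30.0 * 30.0 / 1000.0   # L/min * min -> m^3
      print(mass_ng / volume_m3)         # ~155.6, rounded to 160 ng/m3 in the text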

  34. Automated Validation of Trusted Digital Repository Assessment Criteria

    OpenAIRE

    Moore, Reagan W.; Smith, Mackenzie

    2007-01-01

    The RLG/NARA trusted digital repository (TDR) certification checklist defines a set of preservation assessment criteria. The criteria can be mapped into management policies that define how a digital preservation environment is operated. We explore how these management policies can be automated through their characterization as rules that control preservation services. By integrating a rule-based data management system with the DSpace digital library, we expect to demonstrate automated audits ...

  35. Flexible Automation: Two Approaches to the Assessment of Employment Impacts

    OpenAIRE

    Goldberg, W.H.

    1983-01-01

    This paper on flexible automation, written by Dr. Goldberg, was presented to an IIASA workshop held in June 1982 in Berlin (GDR) and organized by the Management and Technology Area in cooperation with the Academy of Sciences of the GDR. The paper is particularly interesting in several respects. Firstly, it addresses a problem that corresponds to the Institute's interests. The main issue addressed is the assessment of the impact of flexible automation on employment. Two methodological ...

  36. Ability-Training-Oriented Automated Assessment in Introductory Programming Course

    Science.gov (United States)

    Wang, Tiantian; Su, Xiaohong; Ma, Peijun; Wang, Yuying; Wang, Kuanquan

    2011-01-01

    Learning to program is a difficult process for novice programmers. AutoLEP, an automated learning and assessment system, was developed by us to help novice programmers acquire programming skills. AutoLEP is ability-training-oriented. It adopts a novel assessment mechanism, which combines static analysis with dynamic testing to analyze student…

  37. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.;

    2009-01-01

    Experiments with biomimetic membranes generate large amounts of data, calling for data processing software to analyze and organize the data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost-efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually.

  38. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to deliver the data contained on reporting forms also on diskette, for uploading into databases used for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, which is designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples, data entry, checking and transmission, data assessment and qualification for use, and report generation that will tie the analytical data quality back to the performance criteria defined prior to sample collection. Industry-standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and protect LANL's investment for the future

  39. Requirements for Automated Assessment of Spreadsheet Maintainability

    CERN Document Server

    Correia, José Pedro

    2011-01-01

    The use of spreadsheets is widespread. Be it in business, finance, engineering or other areas, spreadsheets are created for their flexibility and ease to quickly model a problem. Very often they evolve from simple prototypes to implementations of crucial business logic. Spreadsheets that play a crucial role in an organization will naturally have a long lifespan and will be maintained and evolved by several people. Therefore, it is important not only to look at their reliability, i.e., how well the intended functionality is implemented, but also at their maintainability, i.e., how easy it is to diagnose a spreadsheet for deficiencies and modify it without degrading its quality. In this position paper we argue for the need to create a model to estimate the maintainability of a spreadsheet based on (automated) measurement. We propose to do so by applying a structured methodology that has already shown its value in the estimation of maintainability of software products. We also argue for the creation of a curated...

  40. Fast detection of Noroviruses using a real-time PCR assay and automated sample preparation

    Directory of Open Access Journals (Sweden)

    Schmid Michael

    2004-06-01

    Abstract Background Noroviruses (NoV) have become one of the most commonly reported causative agents of large outbreaks of non-bacterial acute gastroenteritis worldwide as well as sporadic gastroenteritis in the community. Currently, reverse transcriptase polymerase chain reaction (RT-PCR) assays have been implemented in NoV diagnosis, but improvements that simplify and standardize sample preparation, amplification, and detection will be further needed. The combination of automated sample preparation and real-time PCR offers such refinements. Methods We have designed a new real-time RT-PCR assay on the LightCycler (LC) with SYBR Green detection and melting curve analysis (Tm) to detect NoV RNA in patient stool samples. The performance of the real-time PCR assay was compared with that obtained in parallel with a commercially available enzyme immunoassay (ELISA) for antigen detection by testing a panel of 52 stool samples. Additionally, in a collaborative study with the Baden-Wuerttemberg State Health Office, Stuttgart (Germany), the real-time PCR results were blindly assessed using a previously well-established nested PCR (nPCR) as the reference method, since PCR-based techniques are now considered the "gold standard" for NoV detection in stool specimens. Results Analysis of 52 clinical stool samples by real-time PCR yielded results that were consistent with reference nPCR results, while marked differences between the two PCR-based methods and antigen ELISA were observed. Our results indicate that PCR-based procedures are more sensitive and specific than antigen ELISA for detecting NoV in stool specimens. Conclusions The combination of automated sample preparation and real-time PCR provided reliable diagnostic results in less time than conventional RT-PCR assays. These benefits make it a valuable tool for routine laboratory practice, especially in terms of rapid and appropriate outbreak-control measures in health-care facilities and other settings.

  41. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others]

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  42. Operator-based metric for nuclear operations automation assessment

    International Nuclear Information System (INIS)

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation

  43. Automated volumetric breast density estimation: A comparison with visual assessment

    International Nuclear Information System (INIS)

    Aim: To compare automated volumetric breast density (VBD) measurement with visual assessment according to the Breast Imaging Reporting and Data System (BI-RADS), and to determine the factors influencing the agreement between them. Materials and methods: One hundred and ninety-three consecutive screening mammograms reported as negative were included in the study. Three radiologists assigned qualitative BI-RADS density categories to the mammograms. An automated volumetric breast-density method was used to measure VBD (% breast density) and density grade (VDG). Each case was classified into an agreement or disagreement group according to the comparison between visual assessment and VDG. The correlation between visual assessment and VDG was obtained. Various physical factors were compared between the two groups. Results: Agreement between visual assessment by the radiologists and VDG was good (ICC value = 0.757). VBD showed a highly significant positive correlation with visual assessment (Spearman's ρ = 0.754, p < 0.001). VBD and the X-ray tube target differed significantly between the agreement and disagreement groups (p = 0.02 and 0.04, respectively). Conclusion: Automated VBD is a reliable, objective method to measure breast density. The agreement between VDG and visual assessment by radiologists might be influenced by physical factors
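
    For readers who want to reproduce this kind of ordinal agreement analysis, the snippet below computes a Spearman correlation on hypothetical density grades; the numbers are invented and the study's data are not reproduced here.

      import numpy as np
      from scipy import stats

      # Hypothetical density grades for ten mammograms: visual BI-RADS
      # category vs. automated volumetric density grade (VDG), both 1-4.
      visual = np.array([1, 2, 2, 3, 4, 3, 2, 1, 4, 3])
      vdg    = np.array([1, 2, 3, 3, 4, 3, 2, 2, 4, 2])
      rho, p = stats.spearmanr(visual, vdg)
      print(f"Spearman rho = {rho:.2f} (p = {p:.4f})")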

  44. Automated washing of FTA Card punches and PCR setup for reference samples using a LIMS-controlled Sias Xantus automated liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Olsen, Addie Nina; Frøslev, Tobias G.;

    2009-01-01

    We have implemented and validated automated methods for washing FTA Card punches containing buccal samples and subsequent PCR setup using a Sias Xantus automated liquid handler. The automated methods were controlled by worklists generated by our LabWare Laboratory Information Management System...

  45. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  46. Automated Training Sample Extraction for Global Land Cover Mapping

    OpenAIRE

    Julien Radoux; Céline Lamarche; Eric Van Bogaert; Sophie Bontemps; Carsten Brockmann; Pierre Defourny

    2014-01-01

    Land cover is one of the essential climate variables of the ESA Climate Change Initiative (CCI). In this context, the Land Cover CCI (LC CCI) project aims at building global land cover maps suitable for climate modeling based on Earth observation by satellite sensors. The challenge is to generate a set of successive maps that are both accurate and consistent over time. To do so, operational methods for the automated classification of optical images are investigated. The pr...

  47. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  48. Automated sample preparation and analysis using a sequential-injection-capillary electrophoresis (SI-CE) interface.

    Science.gov (United States)

    Kulka, Stephan; Quintás, Guillermo; Lendl, Bernhard

    2006-06-01

    A fully automated sequential-injection-capillary electrophoresis (SI-CE) system was developed using commercially available components such as the syringe pump, the selection and injection valves and the high-voltage power supply. The interface connecting the SI with the CE unit consisted of two T-pieces, where the capillary was inserted in one T-piece and a Pt electrode in the other (grounded) T-piece. By pressurising the whole system using a syringe pump, hydrodynamic injection was feasible. For characterisation, the system was applied to a mixture of adenosine and adenosine monophosphate at different concentrations. The calibration curve obtained gave a detection limit of 0.5 µg g^-1 (correlation coefficient of 0.997). The reproducibility of the injection was also assessed, resulting in an RSD value (5 injections) of 5.4%. The total time of analysis, from injection, conditioning and separation to cleaning the capillary again, was 15 minutes. In another application, employing the full power of the automated SIA-CE system, myoglobin was mixed directly, using the flow system, with different concentrations of sodium dodecyl sulfate (SDS), a known denaturing agent. The different conformations obtained in this way were analysed with the CE system, and a distinct shift in migration time and a decrease of the native peak of myoglobin (Mb) could be observed. The protein samples prepared were also analysed with off-line infrared spectroscopy (IR), confirming these results. PMID:16732362
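
    The detection limit quoted above comes from a linear calibration curve. A generic Python sketch of that computation, using the common convention of three standard errors of the regression over the slope, is shown below with made-up calibration data; the authors' exact procedure may differ.

      import numpy as np

      def detection_limit(conc, signal, k=3.0):
          # Fit the calibration line and express the detection limit as
          # k standard errors of the regression divided by the slope.
          conc, signal = np.asarray(conc, float), np.asarray(signal, float)
          slope, intercept = np.polyfit(conc, signal, 1)
          residuals = signal - (slope * conc + intercept)
          s_y = residuals.std(ddof=2)  # n - 2 degrees of freedom
          return k * s_y / slope

      conc = [1.0, 2.0, 5.0, 10.0, 20.0]   # e.g., microgram-per-gram standards
      resp = [2.1, 4.0, 9.8, 20.5, 39.9]   # instrument response
      print(detection_limit(conc, resp))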

  49. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    Science.gov (United States)

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used for the clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in the chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, it would be more important to find the links between samples than to pursue the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method used a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. These modules collected the results into an Excel file and then corrected the retention time shift and response deviation generated from the sample preparation and instrument analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
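
    A stripped-down Python version of that pipeline's core steps, retention-time correction against shared internal standards followed by a Pearson similarity score, might look like the sketch below. The function name, the two-standard anchoring, and the resampling grid are simplifications; the VBA workflow described above handles four internal standards and Excel-based bookkeeping.

      import numpy as np

      def impurity_similarity(rt_a, area_a, rt_b, area_b, is_a, is_b):
          # Map each chromatogram onto a common 0-1 retention index scale
          # anchored at the shared internal standards, then compare the
          # normalized impurity profiles with a Pearson correlation.
          rt_a, area_a = np.asarray(rt_a, float), np.asarray(area_a, float)
          rt_b, area_b = np.asarray(rt_b, float), np.asarray(area_b, float)
          anchors = np.linspace(0.0, 1.0, len(is_a))
          idx_a = np.interp(rt_a, is_a, anchors)   # corrected retention index
          idx_b = np.interp(rt_b, is_b, anchors)
          grid = np.linspace(0.0, 1.0, 200)
          # Normalize peak areas to remove injection-amount differences.
          prof_a = np.interp(grid, idx_a, area_a / area_a.sum())
          prof_b = np.interp(grid, idx_b, area_b / area_b.sum())
          return np.corrcoef(prof_a, prof_b)[0, 1]

      # Two samples with a small retention-time drift but similar impurities
      rt1, a1 = [2.0, 5.0, 9.0], [10.0, 40.0, 50.0]
      rt2, a2 = [2.1, 5.2, 9.3], [12.0, 38.0, 50.0]
      print(impurity_similarity(rt1, a1, rt2, a2,
                                is_a=[1.0, 10.0], is_b=[1.05, 10.4]))  # close to 1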

  10. Rapid and automated determination of plutonium and neptunium in environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, J.

    2011-03-15

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The methodology development in this work comprises five subjects, as follows: (1) development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with ICP-MS detection (Paper II); (2) methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to ICP-MS (Paper III); (3) development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with ICP-MS (Paper V); (5) exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the methods developed in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium, as demanded in situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  11. Rapid and automated determination of plutonium and neptunium in environmental samples

    International Nuclear Information System (INIS)

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The methodology development in this work comprises five subjects, as follows: (1) development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with ICP-MS detection (Paper II); (2) methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to ICP-MS (Paper III); (3) development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with ICP-MS (Paper V); (5) exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the methods developed in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium, as demanded in situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  12. Conceptual design for comprehensive automation in radiochemical analysis of bioassay samples

    International Nuclear Information System (INIS)

    The Bioassay Laboratory of the Health Physics Division is entrusted with the task of carrying out bioassay monitoring of occupational workers from various plants/divisions of BARC for radionuclides such as Pu, U, Th, 90Sr and 3H. On average, about 1400-1500 analyses are performed on 700-800 urine samples collected annually from radiation workers. The workload has increased by 1.5 to 2.0 times in the recent past and is expected to increase further due to the expanding nuclear programmes of the Department. It was therefore planned to automate the various stages of bioassay sample handling, processing and analysis under the XI plan programme. Automation work in the Bioassay Laboratory is planned in three stages: automation of the initial processing of (i) urine samples and (ii) fecal samples, and (iii) automation of the radiochemical analysis of bioassay samples. In the initial phase, automation of the radiochemical analysis of bioassay samples has been taken up.

  13. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. Prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and the application of Kr-85 measurements to novel Safeguards procedures. (author)

  14. Automated Research Impact Assessment: A New Bibliometrics Approach

    Science.gov (United States)

    Drew, Christina H.; Pettibone, Kristianna G.; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-01-01

    As federal programs are held more accountable for their research investments, The National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in “important” research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS funded research. PMID:26989272

  15. An automated atmospheric sampling system operating on 747 airliners

    Science.gov (United States)

    Perkins, P. J.; Gustafsson, U. R. C.

    1976-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape recorded in flight for later computer processing on the ground.

  16. An automated atmospheric sampling system operating on 747 airliners

    Science.gov (United States)

    Perkins, P.; Gustafsson, U. R. C.

    1975-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of selected particulate and gaseous constituents of the atmosphere has been installed on a number of commercial airliners and is collecting data on commercial air routes covering the world. Measurements of constituents related to aircraft engine emissions and other pollutants are made in the upper troposphere and lower stratosphere (6 to 12 km) in support of the Global Air Sampling Program (GASP). Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This system includes specialized instrumentation for measuring carbon monoxide, ozone, water vapor, and particulates, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituents and related flight data are tape recorded in flight for later computer processing on the ground.

  17. Automated biowaste sampling system, solids subsystem operating model, part 2

    Science.gov (United States)

    Fogal, G. L.; Mangialardi, J. K.; Stauffer, R. E.

    1973-01-01

    The detail design and fabrication of the Solids Subsystem were implemented. The system's capacity for the collection, storage or sampling of feces and vomitus from six subjects was tested and verified.

  18. SASSI: Subsystems for Automated Subsurface Sampling Instruments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Autonomous surface sampling systems are necessary, near term, to construct a historical view of planetary significant events; as well as allow for the...

  19. SASSI: Subsystems for Automated Subsurface Sampling Instruments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future robotic planetary exploration missions will benefit greatly from the ability to capture rock and/or regolith core samples that deliver the stratigraphy of...

  20. Integrating Electrochemical Detection with Centrifugal Microfluidics for Real-Time and Fully Automated Sample Testing

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga; Kwasny, Dorota; Amato, Letizia; Brøgger, Anna Line; Bosco, Filippo; Andersen, Karsten Brandt; Svendsen, Winnie Edith; Boisen, Anja

    2015-01-01

    experiments, even when the microfluidic disc is spinning at high velocities. Automated sample handling is achieved by designing a microfluidic system to release analyte sequentially, utilizing on-disc passive valving. In addition, the microfluidic system is designed to trap and keep the liquid sample...

  1. Experiences of Using Automated Assessment in Computer Science Courses

    Directory of Open Access Journals (Sweden)

    John English

    2015-10-01

    Full Text Available In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students type in their own answers). Students were allowed to correct errors based on feedback provided by the system and resubmit their answers. A total of 141 students were surveyed to assess their opinions of this approach, and we analysed their responses. Analysis of the questionnaire showed a low correlation between questions, indicating the statistical independence of the individual questions. As a whole, student feedback on using Checkpoint was very positive, emphasizing the benefits of multiple attempts, impartial marking, and a quick turnaround time for submissions. Many students said that Checkpoint gave them confidence in learning and motivation to practise. Students also said that the detailed feedback that Checkpoint generated when their programs failed helped them understand their mistakes and how to correct them.

  2. Automated biowaste sampling system urine subsystem operating model, part 1

    Science.gov (United States)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.

  3. New automated technique for assessing emphysema on histological sections.

    OpenAIRE

    Gillooly, M.; Lamb, D; Farrow, A S

    1991-01-01

    The assessment of emphysema in human lungs has traditionally been based on observations made on whole lung slices. These methods are inappropriate for the study of early emphysema, because as much as 75% of the alveolar wall surface area may have been lost by the time airspaces are visible to the naked eye. A new, automated image analysis system, the Fast Interval Processor (FIP), was used to measure airspace wall surface area per unit volume of lung tissue (AWUV). AWUV was measured on histol...

  4. Assessing Working Memory in Spanish-Speaking Children: Automated Working Memory Assessment Battery Adaptation

    Science.gov (United States)

    Injoque-Ricle, Irene; Calero, Alejandra D.; Alloway, Tracy P.; Burin, Debora I.

    2011-01-01

    The Automated Working Memory Assessment battery was designed to assess verbal and visuospatial passive and active working memory processing in children and adolescents. The aim of this paper is to present the adaptation and validation of the AWMA battery to Argentinean Spanish-speaking children aged 6 to 11 years. Verbal subtests were adapted and…

  5. Automated Genotyping of Biobank Samples by Multiplex Amplification of Insertion/Deletion Polymorphisms

    OpenAIRE

    Lucy Mathot; Elin Falk-Sörqvist; Lotte Moens; Marie Allen; Tobias Sjöblom; Mats Nilsson

    2012-01-01

    The genomic revolution in oncology will entail mutational analyses of vast numbers of patient-matched tumor and normal tissue samples. This has meant an increased risk of patient sample mix up due to manual handling. Therefore, scalable genotyping and sample identification procedures are essential to pathology biobanks. We have developed an efficient alternative to traditional genotyping methods suited for automated analysis. By targeting 53 prevalent deletions and insertions found in human p...

  6. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system removes this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, easily accommodating different resins, samples, and reagent types. Once programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.
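
    As a rough illustration of what such a programmed protocol looks like, the sketch below encodes a single-column purification as an ordered list of user-defined steps; the reagents, volumes and flow rates are invented placeholders, not the published prepFAST-MC protocols.

        from dataclasses import dataclass

        @dataclass
        class Step:
            name: str
            reagent: str
            volume_ml: float    # user-defined volume
            flow_ml_min: float  # user-defined flow rate

        # Hypothetical single-column purification protocol
        protocol = [
            Step("load",      "sample in 3 M HNO3", 1.0, 0.5),
            Step("wash",      "3 M HNO3",           4.0, 1.0),
            Step("condition", "0.5 M HNO3",         2.0, 1.0),
            Step("elute",     "0.05 M HNO3",        1.5, 0.5),
        ]

        for s in protocol:
            print(f"{s.name:>9}: {s.volume_ml} mL {s.reagent} at {s.flow_ml_min} mL/min")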

  7. Security Measures in Automated Assessment System for Programming Courses

    Directory of Open Access Journals (Sweden)

    Jana Šťastná

    2015-12-01

    Full Text Available A desirable characteristic of programming code assessment is to provide the learner with the most appropriate information regarding the code functionality, as well as a chance to improve. This can hardly be achieved when the number of learners is high (500 or more). In this paper we address the problem of risky code testing and the availability of an assessment platform, Arena, dealing with potential security risks when providing automated assessment for a large set of source code. Looking at students' programs as if they were potentially malicious inspired us to investigate separated execution environments, used by security experts for secure software analysis. The results also show that availability issues of our assessment platform can be conveniently resolved with task queues. Special attention is paid to Docker, a virtual container ensuring that no risky code can affect the security of the assessment system. The assessment platform Arena enables students' source code in various programming courses to be assessed regularly, effectively and securely. In addition, it is a motivating factor and helps students engage in the educational process.
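
    A minimal sketch of the container-isolation idea, assuming Docker and a hypothetical /code/main.py entry point (Arena's actual implementation details are not described in the abstract):

        import subprocess

        def run_submission(src_dir: str, timeout_s: int = 10) -> str:
            """Execute an untrusted student program inside a throwaway
            Docker container: no network, capped memory/CPU, read-only
            bind mount of the submission. Image name is illustrative."""
            cmd = [
                "docker", "run", "--rm",
                "--network", "none",          # no outbound connections
                "--memory", "256m",           # cap RAM
                "--cpus", "0.5",              # cap CPU
                "--pids-limit", "64",         # stop fork bombs
                "-v", f"{src_dir}:/code:ro",  # submission mounted read-only
                "python:3.12-slim",
                "python", "/code/main.py",
            ]
            try:
                out = subprocess.run(cmd, capture_output=True, text=True,
                                     timeout=timeout_s)
                return out.stdout
            except subprocess.TimeoutExpired:
                return "TIMEOUT"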

  8. Development of an automated fracture. Assessment system for nuclear structures

    International Nuclear Information System (INIS)

    A program system for fracture assessment of nuclear power plant structures has been developed. The system consists of an easy-to-use program for engineering analysis and an automated finite element (FE) program system for more accurate analysis with solid three-dimensional (3D) models. The VTTSIF program (SIF = stress intensity factor) for engineering fracture assessment applies either the weight function method or the superposition method in calculating the stress intensity factor, and the fatigue crack growth analysis is based on the Paris equation. The structural geometry cases of the VTTSIF program are organized in an extendable subroutine database. The generation of a 3D FE model of a cracked structure is automated by the ACR program (automatic finite element model generation for part-through cracks). The FE analyses are performed with generally accepted commercial programs, and the virtual crack extension (VCE) method is used for fracture parameter evaluation by the VTTVIRT postprocessor program (a program for J-integral evaluation using the virtual crack extension method). Several test cases have demonstrated that the accuracy of the present system is satisfactory for practical applications. (author)
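
    For readers unfamiliar with the Paris equation, the sketch below integrates da/dN = C*(dK)^m numerically to estimate fatigue life; the geometry factor and material constants are illustrative assumptions, not VTTSIF data.

        import math

        def paris_life(a0, ac, C, m, dsigma, Y=1.12, steps=10000):
            """Integrate the Paris equation da/dN = C*(dK)^m with
            dK = Y*dsigma*sqrt(pi*a) to estimate the number of cycles
            to grow a crack from a0 to ac (metres, MPa)."""
            n_cycles = 0.0
            da = (ac - a0) / steps
            a = a0
            for _ in range(steps):
                dK = Y * dsigma * math.sqrt(math.pi * a)  # MPa*sqrt(m)
                n_cycles += da / (C * dK ** m)            # dN = da / (C dK^m)
                a += da
            return n_cycles

        # Illustrative steel constants (assumed, not from the paper)
        print(paris_life(a0=1e-3, ac=1e-2, C=1e-11, m=3.0, dsigma=100.0))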

  9. Quantification of Human Movement for Assessment in Automated Exercise Coaching

    CERN Document Server

    Hagler, Stuart; Bajczy, Ruzena; Pavel, Misha

    2016-01-01

    Quantification of human movement is a challenge in many areas, ranging from physical therapy to robotics. We quantify human movement for the purpose of providing automated exercise coaching in the home. We developed a model-based assessment and inference process that combines biomechanical constraints with movement assessment based on the Microsoft Kinect camera. To illustrate the approach, we quantify the performance of a simple squatting exercise using two model-based metrics that are related to strength and endurance, and provide an estimate of the strength and energy expenditure of each exercise session. We examine data for 5 subjects, and show that for some subjects the metrics indicate a trend consistent with improved exercise performance.

  10. Quantitative Vulnerability Assessment of Cyber Security for Distribution Automation Systems

    Directory of Open Access Journals (Sweden)

    Xiaming Ye

    2015-06-01

    Full Text Available The distribution automation system (DAS is vulnerable to cyber-attacks due to the widespread use of terminal devices and standard communication protocols. On account of the cost of defense, it is impossible to ensure the security of every device in the DAS. Given this background, a novel quantitative vulnerability assessment model of cyber security for DAS is developed in this paper. In the assessment model, the potential physical consequences of cyber-attacks are analyzed from two levels: terminal device level and control center server level. Then, the attack process is modeled based on game theory and the relationships among different vulnerabilities are analyzed by introducing a vulnerability adjacency matrix. Finally, the application process of the proposed methodology is illustrated through a case study based on bus 2 of the Roy Billinton Test System (RBTS. The results demonstrate the reasonability and effectiveness of the proposed methodology.
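
    The vulnerability adjacency matrix concept can be illustrated with a toy three-vulnerability example: reachability over multi-step attack paths follows from the matrix's transitive closure (illustrative only; the paper's game-theoretic attack model is not reproduced here).

        import numpy as np

        # Hypothetical vulnerability adjacency matrix for a small DAS:
        # A[i, j] = 1 if exploiting vulnerability i lets the attacker
        # proceed to vulnerability j (terminal-device -> server paths).
        A = np.array([
            [0, 1, 0],   # v0: terminal firmware flaw -> v1
            [0, 0, 1],   # v1: protocol spoofing      -> v2
            [0, 0, 0],   # v2: control-center server compromise
        ])

        # Transitive closure: which vulnerabilities are reachable from
        # which, over attack paths of any length.
        n = A.shape[0]
        closure = (np.linalg.matrix_power(np.eye(n, dtype=int) + A, n) > 0).astype(int)
        print(closure)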

  11. Automated Video Quality Assessment for Deep-Sea Video

    Science.gov (United States)

    Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.

    2015-12-01

    Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source, and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): the rate of absorption of light in water varies by wavelength, and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating these quality issues.
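
    A first-pass quality screen for the challenges listed above can be as simple as per-frame luminance and contrast statistics; the sketch below (thresholds are illustrative, not ONC's) flags dark, unevenly lit or low-contrast frames before they reach full analysis.

        import numpy as np

        def quality_flags(frame, dark=0.15, uneven=0.2, low_contrast=0.05):
            """Cheap per-frame screening for underwater video.
            `frame`: grayscale image as a float array in [0, 1].
            Thresholds would need tuning per camera and site."""
            mean = frame.mean()
            # Luminance uniformity: uneven, directional lighting shows
            # up as large quadrant-to-quadrant differences in brightness.
            h, w = frame.shape
            quads = [frame[:h//2, :w//2], frame[:h//2, w//2:],
                     frame[h//2:, :w//2], frame[h//2:, w//2:]]
            spread = max(q.mean() for q in quads) - min(q.mean() for q in quads)
            contrast = frame.std()
            return {
                "too_dark": mean < dark,
                "uneven_lighting": spread > uneven,
                "low_contrast": contrast < low_contrast,
            }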

  12. Application of bar codes to the automation of analytical sample data collection

    International Nuclear Information System (INIS)

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling, accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC; data collection, done on a central VAX 11/730 (Digital Equipment Corp.), where bar code readers are used to log in samples to be analyzed on liquid scintillation counters and the VAX 11/730 processes the data and generates reports; and data storage, on the VAX 11/730, backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented.

  13. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    OpenAIRE

    Qiao, Jixin

    2011-01-01

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development in this work consists of 5 subjects stated as follows: 1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combina...

  14. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    Science.gov (United States)

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection, when hundreds of samples must be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol, from endoglycosidase digestion through fluorophore labeling and cleanup, with high-throughput sample processing in 96-well plate format using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid separation to match the throughput of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. PMID:26429557

  15. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. Sample handling has passed a threshold where traditional techniques no longer provide the ability to see the sample, owing to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of these newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace, microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition; a demonstration applying to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. Finally, a new automated, integrated method for handling samples for ultra-trace analysis has been developed: an on-line, near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities.

  16. LAVA: A conceptual framework for automated risk assessment

    International Nuclear Information System (INIS)

    At the Los Alamos National Laboratory the authors are developing the framework for generating knowledge-based systems that perform automated risk analyses on an organization's assets. An organization's assets can be subdivided into tangible and intangible assets. Tangible assets include facilities, material, personnel, and time, while intangible assets include such factors as reputation, employee morale, and technical knowledge. The potential loss exposure of an asset depends upon the threats (both static and dynamic), the vulnerabilities in the mechanisms protecting the asset from the threats, and the consequences of the threats successfully exploiting those vulnerabilities. The methodology is based upon decision analysis, fuzzy set theory, natural language processing, and event tree structures. The Los Alamos Vulnerability and Risk Assessment (LAVA) methodology has been applied to computer security. The program generates both summary reports for management personnel and detailed reports for operations staff.
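
    Stripped of the fuzzy-set and natural-language machinery, the underlying scoring idea can be sketched as a multiplicative exposure model; this is a simplification for illustration, not the LAVA implementation.

        # Exposure of an asset grows with threat level, protection-system
        # vulnerability and consequence severity, each rated on [0, 1].
        def loss_exposure(threat: float, vulnerability: float,
                          consequence: float) -> float:
            """Multiplicative exposure score on [0, 1]."""
            return threat * vulnerability * consequence

        # Hypothetical ratings for two assets, ranked by exposure
        assets = {
            "facility":   loss_exposure(0.7, 0.3, 0.9),
            "reputation": loss_exposure(0.4, 0.6, 0.5),
        }
        print(sorted(assets.items(), key=lambda kv: -kv[1]))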

  17. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33.

    Science.gov (United States)

    Round, A R; Franke, D; Moritz, S; Huchler, R; Fritsche, M; Malthan, D; Klaering, R; Svergun, D I; Roessle, M

    2008-10-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client-server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  18. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    The procedure for initial handling of fecal samples at the Bioassay Laboratory, Trombay, is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receipt, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container with a pair of tongs, placed inside a crucible and ashed in a muffle furnace at 450℃. After complete ashing, the crucible containing the white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for the initial handling of fecal samples, intended to automate the procedure described above. The system, once developed, will eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  19. An automated system for assessing cognitive function in any environment

    Science.gov (United States)

    Wesnes, Keith A.

    2005-05-01

    The Cognitive Drug Research (CDR) computerized assessment system has been in use in worldwide clinical trials for over 20 years. It is a computer-based system which assesses core aspects of human cognitive function, including attention, information processing, working memory and long-term memory. It has been extensively validated and can be performed by a wide range of clinical populations, including patients with various types of dementia. It is currently in worldwide use in clinical trials to evaluate new medicines, as well as in a variety of programs involving the effects of age, stressors, illnesses and trauma upon human cognitive function. Besides being highly sensitive to drugs which impair or improve function, its utility has been maintained over the last two decades by constantly increasing the number of platforms upon which it can operate. Besides notebook versions, the system can be used on a wrist-worn device or PDA, via the telephone and over the internet. It is the most widely used automated cognitive function assessment system in worldwide clinical research. It has dozens of parallel forms and requires little training to use or administer. The basic development of the system will be identified, and the huge databases (normative, patient population, drug effects) which have been built up from hundreds of clinical trials will be described. The system is available for use in virtually any environment or type of trial.

  20. A fully automated plasma protein precipitation sample preparation method for LC-MS/MS bioanalysis.

    Science.gov (United States)

    Ma, Ji; Shi, Jianxia; Le, Hoa; Cho, Robert; Huang, Judy Chi-jou; Miao, Shichang; Wong, Bradley K

    2008-02-01

    This report describes the development and validation of a robust robotic system that fully integrates all peripheral devices needed for the automated preparation of plasma samples by protein precipitation. The liquid handling system consisted of a Tecan Freedom EVO 200 liquid handling platform equipped with an 8-channel liquid handling arm, two robotic plate-handling arms, and two plate shakers. Important additional components integrated into the platform were a robotic temperature-controlled centrifuge, a plate sealer, and a plate seal piercing station. These enabled unattended operation starting from a stock solution of the test compound, a set of test plasma samples and associated reagents. The stock solution of the test compound was used to prepare plasma calibration and quality control samples. Once calibration and quality control samples were prepared, precipitation of plasma proteins was achieved by addition of three volumes of acetonitrile. Integration of the peripheral devices allowed automated sequential completion of the centrifugation, plate sealing, piercing and supernatant transfer steps. The method produced a sealed, injection-ready 96-well plate of plasma extracts. Accuracy and precision of the automated system were satisfactory for the intended use: intra-day and inter-day precision were excellent (C.V. < 5%), while the intra-day and inter-day accuracies were acceptable (relative error < 8%). The flexibility of the platform was sufficient to accommodate pharmacokinetic studies with different numbers of animals and time points. To the best of our knowledge, this represents the first complete automation of the protein precipitation method for plasma sample analysis. PMID:18226589
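
    The volume arithmetic behind the precipitation step (three volumes of acetonitrile per volume of plasma) translates directly into a worklist; the plate layout and plasma volume below are illustrative assumptions, not the published method's parameters.

        def precipitation_worklist(n_samples, plasma_ul=50.0, ratio=3.0):
            """Volumes for plasma protein precipitation in a 96-well
            plate: each well receives the plasma aliquot plus `ratio`
            volumes of acetonitrile (3:1 as described above)."""
            rows, cols = "ABCDEFGH", range(1, 13)
            wells = [f"{r}{c}" for r in rows for c in cols]
            return [
                {"well": wells[i],
                 "plasma_uL": plasma_ul,
                 "acetonitrile_uL": plasma_ul * ratio}
                for i in range(n_samples)
            ]

        for line in precipitation_worklist(3):
            print(line)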

  1. Microassay for interferon, using [3H]uridine, microculture plates, and a multiple automated sample harvester.

    OpenAIRE

    Richmond, J Y; Polatnick, J; Knudsen, R C

    1980-01-01

    A microassay for interferon is described which uses target cells grown in microculture wells, [3H]uridine to measure vesicular stomatitis virus replication in target cells, and a multiple automated sample harvester to collect the radioactively labeled viral ribonucleic acid onto glass fiber filter disks. The disks were placed in minivials, and radioactivity was counted in a liquid scintillation spectrophotometer. Interferon activity was calculated as the reciprocal of the highest titer which ...

  2. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    International Nuclear Information System (INIS)

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine whether the added cost of underwater sampling, for the sole purpose of worker dose reduction, is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP sludge underwater to achieve worker dose reductions, and initial plans were to utilize the underwater sampling apparatus for settler sludge as well. Since there are no longer plans to sample KOP sludge, the decision on underwater sampling of settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation compares and contrasts the present above-water sampling method with the underwater method planned by the Sludge Treatment Project (STP) and determines whether settler samples can be taken using the existing sampling cart (with potentially minor modifications) while keeping doses to workers As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing and personnel retraining.

  3. Assessment of a five-color flow cytometric assay for verifying automated white blood cell differentials

    Institute of Scientific and Technical Information of China (English)

    HUANG Chun-mei; YU Lian-hui; PU Cheng-wei; WANG Xin; WANG Geng; SHEN Li-song; WANG Jian-zhong

    2013-01-01

    Background White blood cell (WBC) counts and differentials performed using an automated cell counter typically require manual microscopic review. However, this last step is time consuming and requires experienced personnel. We evaluated the clinical efficiency of using flow cytometry (FCM) employing a six-antibody/five-color reagent for verifying automated WBC differentials. Methods A total of 56 apparently healthy samples were assessed using a five-color flow cytometer to verify the normal reference ranges of WBC differentials. WBC differentials of 622 samples were also determined using both a cell counter and FCM. These results were then confirmed using manual microscopic methods. Results The probabilities for all of the parameters of WBC differentials exceeded the corresponding normal reference ranges by no more than 7.5%. The resulting WBC differentials were well correlated between FCM and the cell counter (r > 0.88, P < 0.001), except in the case of basophils. Neutrophils, lymphocytes, and eosinophils were well correlated between FCM and standard microscopic cytology assessment (r > 0.80, P < 0.001). The sensitivities of FCM for identification of immature granulocytes and blast cells (72.03% and 22.22%, respectively) were higher than those of the cell counter method (44.92% and 11.11%, respectively). The specificities of FCM were all above 85%, substantially better than those of the cell counter method. Conclusion These five-color FCM assays could be applied to accurately verify abnormal results of automated assessment of WBC differentials.
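
    The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions, sketched here with made-up counts (not the study's data):

        def sensitivity_specificity(tp, fn, tn, fp):
            """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
            return tp / (tp + fn), tn / (tn + fp)

        # Illustrative counts for immature-granulocyte detection
        sens, spec = sensitivity_specificity(tp=85, fn=33, tn=480, fp=24)
        print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")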

  4. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    Science.gov (United States)

    Habershon, Scott

    2016-04-12

    In a recent article [ J. Chem. Phys. 2015 , 143 , 094106 ], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles. PMID:26938837
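
    The direct kinetic modeling step amounts to integrating the rate equations of the sampled reaction network across initial concentrations; the toy catalytic cycle below (illustrative rate constants, not the DFT/TST values from the paper) shows the pattern.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy network in the spirit described above: substrate A binds
        # catalyst C reversibly (A + C <-> AC), then AC -> B + C.
        k1, km1, k2 = 5.0, 1.0, 2.0   # illustrative rate constants

        def rhs(t, y):
            A, C, AC, B = y
            v1 = k1 * A * C - km1 * AC   # reversible binding
            v2 = k2 * AC                 # product formation, catalyst release
            return [-v1, -v1 + v2, v1 - v2, v2]

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.1, 0.0, 0.0])
        # Vary the initial [A] and re-integrate to probe the rate law
        print(sol.y[:, -1])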

  5. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  6. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments.

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A

    2016-08-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  7. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method. PMID:25033319
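
    As a consistency check on the quoted limits, the 14C decay law N = A/λ (with λ = ln 2 / T½ and T½ ≈ 5730 a) converts 0.65 mBq/mL into an amount of substance; the short calculation below reproduces the order of magnitude of the quoted 0.67 amol for a ~2 μL aliquot of singly labelled compound.

        import math

        T_HALF_S = 5730 * 365.25 * 24 * 3600   # 14C half-life in seconds
        lam = math.log(2) / T_HALF_S           # decay constant (1/s)

        A = 0.65e-3                            # Bq per mL (quoted LLOQ)
        atoms_per_ml = A / lam                 # N = A / lambda
        amol_per_ml = atoms_per_ml / 6.022e23 * 1e18
        print(amol_per_ml)                     # ~2.8e2 amol of 14C per mL

        # A 2 uL plasma aliquot then carries ~0.56 amol of 14C, the same
        # order as the quoted 0.67 amol for labelled acetaminophen.
        print(amol_per_ml * 2e-3)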

  8. Assessment of organic matter resistance to biodegradation in volcanic ash soils assisted by automated interpretation of infrared spectra from humic acid and whole soil samples by using partial least squares

    Science.gov (United States)

    Hernández, Zulimar; Pérez Trujillo, Juan Pedro; Hernández-Hernández, Sergio Alexander; Almendros, Gonzalo; Sanz, Jesús

    2014-05-01

    From a practical viewpoint, the most interesting possibilities of applying infrared (IR) spectroscopy to soil studies lie in processing IR spectra of whole-soil (WS) samples [1] in order to forecast functional descriptors at high organizational levels of the soil system, such as soil C resilience. Currently, there is a discussion on whether the resistance to biodegradation of soil organic matter (SOM) depends on its molecular composition or on environmental interactions between SOM and mineral components, as could be the case with physical encapsulation of particulate SOM or organo-mineral derivatives, e.g., those formed with amorphous oxides [2]. A set of about 200 dependent variables from WS and isolated, ash-free humic acids (HA) [3] was obtained in 30 volcanic ash soils from Tenerife Island (Spain). Soil biogeochemical properties such as SOM, allophane (Alo + 1/2 Feo), total mineralization coefficient (TMC) and aggregate stability were determined in WS. In addition, structural information on SOM was obtained from the isolated HA fractions by visible spectroscopy and analytical pyrolysis (Py-GC/MS). Aiming to explore the potential of partial least squares (PLS) regression in forecasting soil dependent variables exclusively from the information extracted from the WS and HA IR spectral profiles, data were processed using the ParLeS [4] and Unscrambler programs. Data pre-treatments should be chosen carefully: the most significant PLS models from IR spectra of HA were obtained after second-derivative pre-treatment, which prevented effects of the intrinsically broadband spectral profiles typical of macromolecular heterogeneous material such as HA. Conversely, when using IR spectra of WS, the best forecasting models were obtained using linear baseline correction and maximum normalization pre-treatment. With WS spectra, the most successful prediction models were obtained for SOM, magnetite, allophane, aggregate stability, clay and total aromatic compounds.
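
    The modelling recipe (derivative pre-treatment followed by PLS regression) can be sketched with standard Python tooling on synthetic stand-in data; the window and component choices are illustrative, and the original work used ParLeS and Unscrambler rather than this code.

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        # Synthetic stand-in data: rows = samples, columns = IR absorbances
        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 600))   # 30 spectra, 600 wavenumbers
        y = rng.normal(size=30)          # e.g. SOM content

        # Second-derivative pre-treatment (Savitzky-Golay), as used for
        # the humic acid spectra; window/order are illustrative choices.
        X_d2 = savgol_filter(X, window_length=11, polyorder=2, deriv=2, axis=1)

        pls = PLSRegression(n_components=5)
        print(cross_val_score(pls, X_d2, y, cv=5, scoring="r2"))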

  9. Automation of Sample Transfer and Counting on Fast Neutron Activation System

    International Nuclear Information System (INIS)

    Sample transfer and counting, i.e. the movement of the sample to the activation and counting positions, had previously been operated manually by switch and was developed into automatically programmed logic instructions. The development involved constructing the electronics hardware and software for this communication. Transfer times, on the scale of seconds, are measured automatically with an error of 1.6 ms. Counting and activation times are set by the user in seconds and minutes; the execution error on the minutes scale was 8.2 ms. The developed system will make it possible to measure short half-life elements and to run cyclic activation processes. (author)

  10. Automated low energy photon absorption equipment for measuring internal moisture and density distributions of wood samples

    International Nuclear Information System (INIS)

    Automated equipment for measuring the moisture and density distributions of wood samples was developed. Using a narrow beam of gamma rays, the equipment scans the wood samples, which are placed on the moving belt. The moisture measurement is based on the 241Am photon absorption technique (59.5 keV), where the difference of the linear absorption coefficients of the moist and dry wood is measured. The method requires no knowledge of the thickness of the specimen. The density estimation method is based on the measurement of the linear attenuation coefficient of wood. Comprehensive software including image processing was developed for treatment of the numerical values of the measurements. (author)
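
    The measurement rests on the Beer-Lambert law I = I0*exp(-mu*t); measuring the same spot moist and dry makes the incident intensity cancel, as the minimal sketch below shows (count rates are illustrative, and this is a schematic of the absorption relation, not the instrument's algorithm).

        import math

        # Beer-Lambert: I = I0 * exp(-mu * t). For the same spot measured
        # moist and dry, ln(I_dry / I_moist) = (mu_moist - mu_dry) * t,
        # so I0 cancels; with a calibration of the coefficient difference
        # per unit moisture, this yields a path-integrated moisture signal.
        def moisture_signal(I_dry_counts, I_moist_counts):
            """Path-integrated moisture signal (dimensionless)."""
            return math.log(I_dry_counts / I_moist_counts)

        print(moisture_signal(7400, 6300))  # illustrative 59.5 keV count rates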

  11. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.; Pedersen, Karl

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat, neckskin and environmental samples) were collected over a period of 4 months at a turkey slaughterhouse and meat-cutting plant in Denmark. Faecal and environmental samples were tested by the conventional culture method and by the two EIAs, whereas meat and neckskin samples were tested by the two EIAs only. Two enrichment broths were used, Campylobacter Enrichment Broth (CEB) and Preston Broth (PB). Verification of positive test results was carried out by conventional culture on selective solid media. The specificities of all methods were high. The sensitivities of the EIAs were higher than that of the conventional culture method.

  12. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  13. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    DEFF Research Database (Denmark)

    Qiao, Jixin

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The methodology development in this work comprises five subjects, as follows: (1) development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with ICP-MS detection (Paper II); (2) methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to ICP-MS (Paper III); (3) development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples...

  14. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of several ferrite samples measured by two scientifically calibrated magnetometers: model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our lab-built VSM design proved successful and reliable.

  15. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in the detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns, each corresponding to a different known thickness. The thickness of the most similar simulated pattern is then taken as the thickness estimate (see the sketch below). The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two-beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from thickness estimation, the tool can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort and analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction patterns.
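
    At its core, the matching step scores the experimental pattern against a bank of thickness-indexed simulations and keeps the best-scoring thickness. A minimal Python sketch of that idea, using a whole-pattern normalized cross-correlation instead of the authors' disk-by-disk comparison; the arrays and thickness values are invented for illustration:

        import numpy as np

        def best_thickness(experimental, simulated):
            # simulated: dict mapping thickness (nm) -> 2D pattern array,
            # all on the same grid as the experimental pattern.
            def ncc(a, b):
                # Normalized cross-correlation coefficient of two images.
                a = (a - a.mean()) / a.std()
                b = (b - b.mean()) / b.std()
                return float((a * b).mean())
            return max(simulated, key=lambda t: ncc(experimental, simulated[t]))

        rng = np.random.default_rng(0)
        target = rng.random((64, 64))
        bank = {100: rng.random((64, 64)),
                120: target + 0.05 * rng.random((64, 64)),  # near-match
                140: rng.random((64, 64))}
        print(best_thickness(target, bank), "nm")  # -> 120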

  16. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33

    OpenAIRE

    Round, A. R.; Franke, D.; Moritz, S.; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D. I.; Roessle, M.

    2008-01-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype of an automated sample changer for solution SAXS has been developed, in which the solutions are kept in thermostatically controlled well plates, allowing operation with up to 192 samples. The measuring protocol involves controlled loading of protein so…

  17. Design and Practices for Use of Automated Drilling and Sample Handling in MARTE While Minimizing Terrestrial and Cross Contamination

    Science.gov (United States)

    Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  18. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    Science.gov (United States)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

    The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware, and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Río Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling, without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The 2006 top-level goal of the DAME drilling in-situ tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second set of 2006 goals was to operate for three or more hours autonomously, hands-off. And the third 2006 goal was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone further than 2.2 m previously). Five of the six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4-hour sequence with no human presence nearby), and the total depth reached was 3.2 m. Ground-truth drilling used small commercial drilling equipment in parallel in…

  1. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol. PMID:18471209

  2. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    Science.gov (United States)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.05), and the resulting automated score achieved an area under the receiver operating characteristic curve of > 0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  3. Automation of high-frequency sampling of environmental waters for reactive species

    Science.gov (United States)

    Kim, H.; Bishop, J. K.; Wood, T.; Fung, I.; Fong, M.

    2011-12-01

    Trace metals, particularly iron and manganese, play a critical role in several settings: in some ecosystems they are limiting factors for primary productivity; in geochemistry, especially redox chemistry, they are important electron donors and acceptors; and in aquatic environments they act as carriers of contaminant transport. Dynamics of trace metals are closely related to various hydrologic events such as rainfall. Storm flow triggers dramatic changes in both dissolved and particulate trace metal concentrations and affects other important environmental parameters linked to trace metal behavior, such as dissolved organic carbon (DOC). To improve our understanding of the behavior of trace metals and the underlying processes, water chemistry information must be collected for an adequately long period of time at higher frequency than conventional manual sampling (e.g. weekly, biweekly). In this study, we developed an automated sampling system to document the dynamics of trace metals, focusing on Fe and Mn, and DOC for a multiple-year high-frequency geochemistry time series in a small catchment called Rivendell, located at the Angelo Coast Range Reserve, California. We are sampling ground and streamwater with the automated sampling system at daily frequency, and the condition of the site is substantially variable from season to season. The pH ranges of ground and streamwater are 5-7 and 7.8-8.3, respectively. DOC is usually sub-ppm, but during rain events it increases by an order of magnitude. The automated sampling system focuses on two aspects: (1) a modified sampler design to improve sample integrity for trace metals and DOC, and (2) a remote control system to update sampling volume and timing according to hydrological conditions. To maintain sample integrity, the developed method employed gravity filtering using large-volume syringes (140 mL) and syringe filters connected to a set of polypropylene bottles and a borosilicate bottle via Teflon tubing. Without filtration, in a few days, the…

  4. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached to a cassette); the airflow through the filter is 800 m3/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 x 10-6 Bq/m3. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept suits well for nuclear material safeguards, too.

  5. Automated high-throughput in vitro screening of the acetylcholine esterase inhibiting potential of environmental samples, mixtures and single compounds.

    Science.gov (United States)

    Froment, Jean; Thomas, Kevin V; Tollefsen, Knut Erik

    2016-08-01

    A high-throughput and automated assay for detecting the presence of acetylcholine esterase (AChE) inhibiting compounds was developed, validated and applied to screen different types of environmental samples. Automation involved using the assay in 96-well plates and adapting it for use with an automated workstation. Validation was performed by comparing the results of the automated assay with those of a previously validated and standardised assay for two known AChE inhibitors (paraoxon and dichlorvos). The results show that the assay provides similar concentration-response curves (CRCs) when run according to the manual and automated protocols. Automation of the assay resulted in a reduction in assay run time as well as in intra- and inter-assay variation. High-quality CRCs were obtained for both of the model AChE inhibitors (dichlorvos IC50 = 120 µM and paraoxon IC50 = 0.56 µM) when tested alone. The effects of co-exposure to an equipotent binary mixture of the two chemicals were consistent with predictions of additivity and were best described by the concentration addition model for combined toxicity (sketched below). Extracts of different environmental samples (landfill leachate, wastewater treatment plant effluent, and road tunnel construction run-off) were then screened for AChE inhibiting activity using the automated bioassay, with only landfill leachate shown to contain potential AChE inhibitors. Potential uses and limitations of the assay are discussed based on the present results. PMID:27085000
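
    The concentration addition model used for the mixture prediction has a simple closed form: 1/IC50_mix = Σ p_i/IC50_i, where p_i are the mixture fractions. A minimal sketch, assuming an equipotent design in which the fractions are proportional to the single-compound IC50s quoted above (so each compound contributes equal toxic units):

        def ca_ic50(fractions, ic50s):
            # Concentration addition: 1 / IC50_mix = sum_i (p_i / IC50_i)
            return 1.0 / sum(p / e for p, e in zip(fractions, ic50s))

        ic50 = {"dichlorvos": 120.0, "paraoxon": 0.56}  # uM, from the abstract
        total = sum(ic50.values())
        fractions = [v / total for v in ic50.values()]  # equipotent fractions
        print("predicted mixture IC50: %.1f uM"
              % ca_ic50(fractions, list(ic50.values())))  # ~60.3 uM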

  6. Automated Generation and Assessment of Autonomous Systems Test Cases

    Science.gov (United States)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

    This slide presentation reviews issues concerning verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work involved in the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage; for example, generating cases for all possible fault monitors and across all state change boundaries (see the sketch below). Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predicts individually is impractical, while generating predicts with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of large numbers of mission scenario tests poses special challenges. A good approach to this problem is to automatically score the results…
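
    The coverage-driven generation strategy described above amounts to enumerating the cross product of fault monitors, state-change boundaries, and injection timing. A minimal sketch with invented monitor and boundary names (the mission-specific sets are not given in this abstract):

        from itertools import product

        fault_monitors = ["battery_low", "thruster_stuck", "sensor_dropout"]   # hypothetical
        boundaries = ["launch->cruise", "cruise->approach", "approach->orbit"]  # hypothetical
        timing = ["before", "during", "after"]  # relative to the state change

        # One test case per (monitor, boundary, injection time) combination
        # guarantees the coverage properties named in the abstract.
        test_cases = [{"inject": m, "boundary": b, "when": w}
                      for m, b, w in product(fault_monitors, boundaries, timing)]
        print(len(test_cases), "generated cases")  # 27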

  7. ASSESSING SMALL SAMPLE WAR-GAMING DATASETS

    Directory of Open Access Journals (Sweden)

    W. J. HURLEY

    2013-10-01

    One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However, this approach does not allow an assessment of the force-multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments it is expensive, both in terms of time and dollars, to generate a large number of sample observations. This puts a premium on the statistical methodology used to examine these small datasets. In this paper we compare the power of three tests to assess population differences: the Wald-Wolfowitz test, the Mann-Whitney U test, and resampling. We employ a series of Monte Carlo simulation experiments (sketched below). Not unexpectedly, we find that the Mann-Whitney test performs better than the Wald-Wolfowitz test. Resampling is judged to perform slightly better than the Mann-Whitney test.
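
    The Monte Carlo power comparison can be reproduced in outline: simulate many small two-sample experiments with a known shift and count how often each test rejects. A minimal sketch for two of the three tests (Mann-Whitney U and a resampling/permutation test), assuming normal populations, n = 10 per group and alpha = 0.05:

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(7)

        def perm_pvalue(x, y, n_perm=1000):
            # Two-sided permutation (resampling) test on the mean difference.
            observed = abs(x.mean() - y.mean())
            pooled = np.concatenate([x, y])
            hits = 0
            for _ in range(n_perm):
                rng.shuffle(pooled)
                if abs(pooled[:x.size].mean() - pooled[x.size:].mean()) >= observed:
                    hits += 1
            return hits / n_perm

        def power(test, n=10, shift=1.0, alpha=0.05, reps=300):
            # Fraction of simulated experiments in which the shift is detected.
            rejections = 0
            for _ in range(reps):
                x = rng.normal(0.0, 1.0, n)    # baseline force structure
                y = rng.normal(shift, 1.0, n)  # enhanced force structure
                rejections += test(x, y) < alpha
            return rejections / reps

        mw = lambda x, y: mannwhitneyu(x, y, alternative="two-sided").pvalue
        print("Mann-Whitney power:", power(mw))
        print("Resampling power:  ", power(perm_pvalue))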

  8. Harmonization of automated hemolysis index assessment and use: Is it possible?

    Science.gov (United States)

    Dolci, Alberto; Panteghini, Mauro

    2014-05-15

    The major source of errors producing unreliable laboratory test results is the pre-analytical phase, with hemolysis accounting for approximately half of them and being the leading cause of unsuitable blood specimens. Hemolysis may produce interference in many laboratory tests by a variety of biological and analytical mechanisms. Consequently, laboratories need to systematically detect and reliably quantify hemolysis in every collected sample by means of objective and consistent technical tools that assess sample integrity. This is currently done by automated estimation of the hemolysis index (HI), available on almost all clinical chemistry platforms, making hemolysis detection reliable and reported patient test results more accurate. Despite these advantages, a degree of variability still affects the HI estimate, and more effort should be placed on harmonization of this index. The harmonization of HI results from different analytical systems should be the immediate goal, but the scope of harmonization should go beyond analytical steps to include other aspects, such as HI decision thresholds, criteria for result interpretation and application in clinical practice, as well as report formats. With regard to this, relevant issues to overcome remain the objective definition of a maximum allowable bias for hemolysis interference based on the clinical application of the measurements and the management of unsuitable samples. In particular, for the latter a harmonized approach is recommended: numerical results of unsuitable samples with significantly increased HI should not be reported, and the test result should be replaced with a specific comment highlighting hemolysis of the sample (a minimal sketch of such reporting logic follows). PMID:24513329
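
    The recommended handling of unsuitable samples — suppress the number, keep a specific hemolysis comment — is easy to express as reporting logic. A minimal sketch with hypothetical per-analyte HI cut-offs (real thresholds must come from interference data and the laboratory's allowable-bias policy):

        # Hypothetical HI decision thresholds, in index units.
        HI_LIMIT = {"potassium": 50, "LDH": 30, "AST": 100}

        def report(analyte, value, hi):
            if hi > HI_LIMIT.get(analyte, float("inf")):
                return f"{analyte}: result suppressed - hemolyzed specimen (HI={hi})"
            return f"{analyte}: {value}"

        print(report("potassium", 5.9, hi=80))  # suppressed with comment
        print(report("AST", 32, hi=80))         # reported normally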

  9. Automated negotiation in environmental resource management: Review and assessment.

    Science.gov (United States)

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

    Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, to review and compare machine learning techniques in automated negotiation, and to provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management, although further studies are needed to consolidate the potential of this modeling approach. PMID:26241930

  10. Adapting Assessment Procedures for Delivery via an Automated Format.

    Science.gov (United States)

    Kelly, Karen L.; And Others

    The Office of Personnel Management (OPM) decided to explore alternative examining procedures for positions covered by the Administrative Careers with America (ACWA) examination. One requirement for new procedures was that they be automated for use with OPM's recently developed Microcomputer Assisted Rating System (MARS), a highly efficient system…

  11. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent-resistant surfaces.

  12. Microbiological monitoring and automated event sampling at karst springs using LEO-satellites.

    Science.gov (United States)

    Stadler, H; Skritek, P; Sommer, R; Mach, R L; Zerobin, W; Farnleitner, A H

    2008-01-01

    Data communication via Low-Earth-Orbit (LEO) satellites between portable hydrometeorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments, including for E. coli field analysis. All activities during the course of event sampling can be observed on an Internet platform based on a Linux server. Conventionally taken samples compared with the auto-sampling procedure revealed corresponding results and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific inactivation coefficients (0.10-0.14 day(-1)), compensating for losses due to sample storage at spring temperature in the autosampler (a minimal correction sketch follows). Two large summer events in 2005/2006 at an important alpine karst spring (LKAS2) were monitored, including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approximately 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early-warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite based system, it is a helpful tool for early-warning systems in the field of drinking water protection. PMID:18776628
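
    Correcting a stored sample for die-off is a first-order kinetics calculation: the concentration at sampling time is the measured value scaled by exp(k·t). A minimal sketch using the event-specific inactivation coefficients quoted above (the example concentration is invented):

        import math

        def corrected_ecoli(measured, storage_days, k=0.12):
            # k: event-specific inactivation coefficient (1/day); the
            # abstract reports 0.10-0.14/day, 0.12 is assumed here.
            return measured * math.exp(k * storage_days)

        print(round(corrected_ecoli(150.0, storage_days=2.0)))  # ~191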

  13. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  14. Automated sample preparation in a microfluidic culture device for cellular metabolomics.

    Science.gov (United States)

    Filla, Laura A; Sanders, Katherine L; Filla, Robert T; Edwards, James L

    2016-06-21

    Sample pretreatment in conventional cellular metabolomics entails rigorous lysis and extraction steps which increase the duration as well as limit the consistency of these experiments. We report a biomimetic cell culture microfluidic device (MFD) which is coupled with an automated system for rapid, reproducible cell lysis using a combination of electrical and chemical mechanisms. In-channel microelectrodes were created using facile fabrication methods, enabling the application of electric fields up to 1000 V cm(-1). Using this platform, average lysing times were 7.12 s and 3.03 s for chips with no electric fields and electric fields above 200 V cm(-1), respectively. Overall, the electroporation MFDs yielded a ∼10-fold improvement in lysing time over standard chemical approaches. Detection of multiple intracellular nucleotides and energy metabolites in MFD lysates was demonstrated using two different MS platforms. This work will allow for the integrated culture, automated lysis, and metabolic analysis of cells in an MFD which doubles as a biomimetic model of the vasculature. PMID:27118418

  15. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and assessed samples to be suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  16. Automation of Workplace Lifting Hazard Assessment for Musculoskeletal Injury Prevention

    OpenAIRE

    Spector, June T.; Lieblich, Max; Bao, Stephen; McQuade, Kevin; Hughes, Margaret

    2014-01-01

    Objectives Existing methods for practically evaluating musculoskeletal exposures such as posture and repetition in workplace settings have limitations. We aimed to automate the estimation of parameters in the revised United States National Institute for Occupational Safety and Health (NIOSH) lifting equation, a standard manual observational tool used to evaluate back injury risk related to lifting in workplace settings, using depth camera (Microsoft Kinect) and skeleton algorithm technology. ...

  17. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    Science.gov (United States)

    Gorresen, P.M.; Miles, A.C.; Todd, C.M.; Bonaccorso, F.J.; Weller, T.J.

    2008-01-01

    Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled to digital recorders operating at a series of proximate sites on consecutive nights in 2 trial surveys for the Hawaiian hoary bat (Lasiurus cinereus semotus). Our results confirmed that the technique is readily amenable for use in occupancy analysis (a minimal estimation sketch follows). We also conducted a simulation exercise to assess the effects of sampling effort on parameter estimation. The results indicated that the precision and bias of parameter estimation were often more influenced by the number of sites sampled than by the number of visits. Acceptable accuracy often was not attained until at least 15 sites or 15 visits were used to estimate detection probability and occupancy. The method has significant potential for use in monitoring trends in bat activity and in comparative studies of habitat use. © 2008 American Society of Mammalogists.
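
    The detection-history approach rests on the basic single-season occupancy likelihood: a site is occupied with probability psi, and an occupied site yields a detection on each visit with probability p. A minimal maximum-likelihood sketch with an invented detection-history matrix:

        import numpy as np
        from scipy.optimize import minimize

        # Rows = sites, columns = nightly visits (1 = echolocation call detected).
        Y = np.array([[1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 0], [0, 1, 1]])

        def nll(params):
            psi, p = 1.0 / (1.0 + np.exp(-params))  # logit -> probability
            ll = 0.0
            for y in Y:
                d, K = y.sum(), y.size
                if d > 0:   # detected at least once: site surely occupied
                    ll += np.log(psi) + d * np.log(p) + (K - d) * np.log(1 - p)
                else:       # never detected: occupied-but-missed, or empty
                    ll += np.log(psi * (1 - p) ** K + (1 - psi))
            return -ll

        psi_hat, p_hat = 1.0 / (1.0 + np.exp(-minimize(nll, x0=[0.0, 0.0]).x))
        print(f"occupancy ~ {psi_hat:.2f}, detection ~ {p_hat:.2f}")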

  18. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin D.; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  19. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    International Nuclear Information System (INIS)

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper

  20. Quantitative reliability assessment in the safety case of computer-based automation systems

    International Nuclear Information System (INIS)

    An essential issue in the construction of new computer-based automation applications in nuclear power plants, or in the replacement of old analogue ones, is the reliability of computer-based systems, and especially the question of how to assess it. The reliability issue is particularly important when the system under assessment is considered safety-critical, such as the reactor protection system. To build sufficient confidence in the reliability of computer-based systems, appropriate reliability assessment methods should be developed and applied. The assessment methods should provide useful and plausible reliability estimates, while taking the special characteristics of the reliability assessment of computer-based systems into consideration. Bayesian inference has proved to be an efficient methodology in the reliability assessment of computer-based automation applications. A practical implementation of Bayesian inference, Bayesian networks, allows the combination of the different safety arguments concerning the system and its development process into a unified reliability estimate (a minimal conjugate-update sketch follows). Bayesian networks are also a convenient way to communicate about the safety argumentation between the various participants in systems design and implementation, as well as between the participants in the licensing processes of computer-based automation systems. This study is part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). The project aimed to provide support for the authorities and utilities in the licensing problems of computer-based automation systems. A particular objective of the project was to acquire, develop and test new and more cost-effective methods and tools for safety and reliability assessment, and to gather practical experience on their use in order to achieve a more streamlined licensing process for computer-based automation systems.
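
    At its simplest, the Bayesian combination of process evidence with test evidence is a conjugate beta-binomial update of the probability of failure on demand (pfd). A minimal sketch with invented prior and test numbers (a real safety case would combine many arguments in a Bayesian network, as the abstract describes):

        from scipy.stats import beta

        # Prior from development-process evidence: roughly "1 failure in 100
        # demands" worth of pseudo-observations (hypothetical).
        a, b = 1.0, 99.0

        # Statistical testing evidence: n demands observed, k failures.
        n, k = 5000, 0

        posterior = beta(a + k, b + n - k)
        print("posterior mean pfd:", posterior.mean())
        print("95% upper credible bound:", posterior.ppf(0.95))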

  1. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®) Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. PMID:22310206

  2. Uranium monitoring tool for rapid analysis of environmental samples based on automated liquid-liquid microextraction.

    Science.gov (United States)

    Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2015-03-01

    A fully automated in-syringe (IS) magnetic stirring assisted (MSA) liquid-liquid microextraction (LLME) method for uranium(VI) determination was developed, exploiting a long path-length liquid waveguide capillary cell (LWCC) with spectrophotometric detection. On-line extraction of uranium was performed within a glass syringe containing a magnetic stirrer for homogenization of the sample and the successive reagents: Cyanex-272 in dodecane as extractant, EDTA as interference eliminator, hydrochloric acid to back-extract U(VI), and arsenazo-III as chromogenic reagent to accomplish the spectrophotometric detection at 655 nm. Magnetic stirring assistance was performed by a specially designed driving device placed around the syringe body, creating a rotating magnetic field in the syringe and forcing the rotation of the stirring bar located inside it. The detection limit (LOD) of the developed method is 3.2 µg L(-1). Its good interday precision (Relative Standard Deviation, RSD 3.3%) and its high extraction frequency (up to 6 h(-1)) make this method an inexpensive and fast screening tool for monitoring uranium(VI) in environmental samples. It was successfully applied to different environmental matrices: channel sediment certified reference material (BCR-320R), soil and phosphogypsum reference materials, and natural water samples, with recoveries close to 100%. PMID:25618721

  3. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry

    OpenAIRE

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Background: Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Methods: Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspens...

  4. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold (see the sketch below). The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis. PMID:22357562
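
    The tiered dilution logic is simple to prototype: an overall factor is split into at most three serial steps, and each step becomes one worklist row. A minimal sketch with a simplified CSV layout (the actual Watson and Freedom EVO .gwl file formats are not reproduced here; sample IDs and factors are invented):

        import csv, io

        def dilution_steps(factor):
            # Mirror the three tiers: 1-10x in one step, 11-100x in two,
            # 101-1000x in three (intermediate steps fixed at 10x here).
            if not 1 <= factor <= 1000:
                raise ValueError("factor outside the 1-1000x dynamic range")
            steps = []
            while factor > 10:
                steps.append(10)
                factor /= 10
            steps.append(round(factor, 2))
            return steps

        worklist = [("S001", 5), ("S002", 40), ("S003", 640)]  # (sample, factor)
        out = io.StringIO()
        writer = csv.writer(out)
        for sample_id, factor in worklist:
            for step_no, step in enumerate(dilution_steps(factor), start=1):
                writer.writerow([sample_id, step_no, step])  # one pipetting step per row
        print(out.getvalue())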

  5. Analysis of zearalenone in cereal and Swine feed samples using an automated flow-through immunosensor.

    Science.gov (United States)

    Urraca, Javier L; Benito-Peña, Elena; Pérez-Conde, Concepción; Moreno-Bondi, María C; Pestka, James J

    2005-05-01

    The development of a sensitive flow-through immunosensor for the analysis of the mycotoxin zearalenone in cereal samples is described. The sensor was completely automated and was based on a direct competitive immunosorbent assay and fluorescence detection. The mycotoxin competes with a horseradish-peroxidase-labeled derivative for the binding sites of a rabbit polyclonal antibody. Controlled-pore glass covalently bound to protein A was used for the oriented immobilization of the antibody-antigen immunocomplexes. The immunosensor shows an IC(50) value of 0.087 ng mL(-1) (RSD = 2.8%, n = 6) and a dynamic range from 0.019 to 0.422 ng mL(-1). The limit of detection (90% of blank signal) of 0.007 ng mL(-1) (RSD = 3.9%, n = 3) is lower than previously published methods. Corn, wheat, and swine feed samples have been analyzed with the device after extraction of the analyte using accelerated solvent extraction (ASE). The immunosensor has been validated using a corn certified reference material and HPLC with fluorescence detection. PMID:15853369

  6. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single-capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at ~60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at ~70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  7. Development and evaluation of a partially-automated approach to the assessment of undergraduate mathematics

    OpenAIRE

    Rowlett, Peter

    2014-01-01

    This research explored assessment and e-assessment in undergraduate mathematics and proposed a novel, partially-automated approach, in which assessment is set via computer but completed and marked offline. This potentially offers: reduced efficiency of marking but increased validity compared with examination, via deeper and more open-ended questions; increased reliability compared with coursework, by reduction of plagiarism through individualised questions; increased efficiency for setting qu...

  8. Assessing office automation effect on Innovation Case study: Education Organizations and Schools in Esfahan Province, Iran

    Directory of Open Access Journals (Sweden)

    Hajar Safari

    2013-09-01

    Today, organizations act in a dynamic, ambiguous and changing environment, so each organization has to deliver high-quality services and benefit from innovative systems to be successful. This research explores the relationship between the implementation of office automation and innovation using the structural equation modeling (SEM) method. The research is applied in its aim and survey-descriptive in its method. The statistical population comprises managers of education organizations and schools in the Esfahan and Lenjan cities; 130 individuals were selected as the sample by random sampling. Content and construct validity were used to evaluate the validity of the questionnaire, and the relations between the variables of this research have been confirmed based on the results of the SEM method. According to the results, the standardized effect of office automation on innovation is estimated at 0.24.

  9. Two Methods for High-Throughput NGS Template Preparation for Small and Degraded Clinical Samples Without Automation

    OpenAIRE

    Kamberov, E.; Tesmer, T.; Mastronardi, M.; Langmore, John

    2012-01-01

    Clinical samples are difficult to prepare for NGS, because of the small amounts or degraded states of formalin-fixed tissue, plasma, urine, and single-cell DNA. Conventional whole genome amplification methods are too biased for NGS applications, and the existing NGS preparation kits require intermediate purifications and excessive time to prepare hundreds of samples in a day without expensive automation. We have tested two 96-well manual methods to make NGS templates from FFPE tissue, plasma,...

  10. Automated peroperative assessment of stents apposition from OCT pullbacks.

    Science.gov (United States)

    Dubuisson, Florian; Péry, Emilie; Ouchchane, Lemlih; Combaret, Nicolas; Kauffmann, Claude; Souteyrand, Géraud; Motreff, Pascal; Sarry, Laurent

    2015-04-01

    This study's aim was to assess stent apposition by automatically analyzing endovascular optical coherence tomography (OCT) sequences. The lumen is detected using threshold, morphological and gradient operators to run a Dijkstra algorithm. Wrong detections, tagged by the user and caused by bifurcations, the presence of struts, thrombotic lesions or dissections, can be corrected using a morphing algorithm. Struts are also segmented by computing symmetrical and morphological operators. The Euclidean distance between detected struts and the artery wall initializes the stent's complete distance map, and missing data are interpolated with thin-plate spline functions (see the sketch below). Rejection of detected outliers, regularization of parameters by generalized cross-validation, and use of the one-sided cyclic property of the map also optimize accuracy. Several indices computed from the map provide quantitative values of malapposition. The algorithm was run on four in-vivo OCT sequences including different cases of incomplete stent apposition. Comparison with manual expert measurements validates the segmentation's accuracy and shows an almost perfect concordance of automated results. PMID:25700272
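
    The thin-plate spline interpolation of missing strut-to-wall distances can be sketched in one dimension around the vessel circumference; duplicating the data one period to each side is a simple way to respect the cyclic property mentioned above. All numbers are invented, and SciPy's RBFInterpolator stands in for the authors' implementation:

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        # Sparse strut-to-wall distances (mm) at angular positions (rad).
        theta = np.array([0.2, 1.1, 2.0, 3.4, 4.8, 5.9])
        dist = np.array([0.05, 0.12, 0.30, 0.22, 0.08, 0.04])

        # Replicate one period left and right to mimic the cyclic map.
        t = np.concatenate([theta - 2 * np.pi, theta, theta + 2 * np.pi])
        d = np.tile(dist, 3)

        tps = RBFInterpolator(t[:, None], d, kernel="thin_plate_spline")
        grid = np.linspace(0, 2 * np.pi, 360)
        dense = tps(grid[:, None])

        # Toy malapposition index: fraction of circumference beyond 0.2 mm.
        print("malapposed fraction:", float((dense > 0.2).mean()))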

  11. Improving EFL Graduate Students' Proficiency in Writing through an Online Automated Essay Assessing System

    Science.gov (United States)

    Ma, Ke

    2013-01-01

    This study investigates the effects of using an online Automated Essay Assessing (AEA) system on EFL graduate students' writing. Eighty four EFL graduate students divided into the treatment group and the control group participated in this study. The treatment group was asked to use an AEA system to assist their essay writing. Both groups were…

  12. Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™

    Science.gov (United States)

    Balfour, Stephen P.

    2013-01-01

    Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for the way they will score and provide feedback on essays students submit. EdX, MIT and Harvard's non-profit MOOC federation, recently announced that they will use a machine-based Automated Essay Scoring (AES) application to assess written work in…

  13. Editorial for the Special Issue on Automated Design and Assessment of Heuristic Search Methods

    OpenAIRE

    Ochoa, Gabriela; Preuss, Mike; Bartz-Beielstein, Thomas; Schoenauer, Marc

    2012-01-01

    Heuristic search algorithms have been successfully applied to solve many problems in practice. Their design, however, has increased in complexity as the number of parameters and choices for operators and algorithmic components has expanded. There is clearly a need to provide the final user with automated tools to assist in the tuning, design and assessment of heuristic optimisation methods.

  14. Performance verification of the Maxwell 16 Instrument and DNA IQ Reference Sample Kit for automated DNA extraction of known reference samples.

    Science.gov (United States)

    Krnajski, Z; Geering, S; Steadman, S

    2007-12-01

    Advances in automation have been made for a number of processes conducted in the forensic DNA laboratory. However, because most robotic systems are designed for high-throughput laboratories batching large numbers of samples, smaller laboratories are left with a limited number of cost-effective options for employing automation. The Maxwell 16 Instrument and DNA IQ Reference Sample Kit marketed by Promega are designed for rapid, automated purification of DNA extracts from sample sets consisting of sixteen or fewer samples. Because the system is based on DNA capture by paramagnetic particles with maximum binding capacity, it is designed to generate extracts with yield consistency. The studies herein enabled evaluation of STR profile concordance, consistency of yield, and cross-contamination performance for the Maxwell 16 Instrument. Results indicate that the system performs suitably for streamlining the process of extracting known reference samples generally used for forensic DNA analysis and has many advantages in a small or moderate-sized laboratory environment. PMID:25869266

  15. Automated Spacecraft Conjunction Assessment at Mars and the Moon

    Science.gov (United States)

    Berry, David; Guinn, Joseph; Tarzi, Zahi; Demcak, Stuart

    2012-01-01

    Conjunction assessment and collision avoidance are areas of current high interest in space operations. Most current conjunction assessment activity focuses on the Earth orbital environment. Several of the world's space agencies have satellites in orbit at Mars and the Moon, and avoiding collisions there is important too. There are fewer assets and fewer organizations involved than at Earth, but the consequences of a collision are similar to Earth scenarios. This presentation will examine conjunction assessment processes implemented at JPL for spacecraft in orbit at Mars and the Moon.

  16. QUAliFiER: An automated pipeline for quality assessment of gated flow cytometry data

    Directory of Open Access Journals (Sweden)

    Finak Greg

    2012-09-01

    QUAliFiER is a pipeline constructed from two new R packages for importing manually gated flow cytometry data and performing flexible and robust quality assessment checks. The pipeline addresses the increasing demand for tools capable of performing quality checks on large flow data sets generated in typical clinical trials. The QUAliFiER tool objectively, efficiently, and reproducibly identifies outlier samples in an automated manner by monitoring cell population statistics from gated or ungated flow data conditioned on experiment-level metadata.
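
    The outlier-flagging idea behind such a pipeline can be illustrated generically: monitor one cell-population statistic per sample and flag samples that deviate from the cohort by a robust z-score. This Python sketch is not the QUAliFiER implementation (which is an R package); the statistic, values and threshold are hypothetical:

    ```python
    import numpy as np

    def flag_outliers(stat, threshold=3.0):
        """Flag samples whose statistic deviates from the cohort median
        by more than `threshold` robust (MAD-based) z-scores."""
        stat = np.asarray(stat, dtype=float)
        med = np.median(stat)
        mad = 1.4826 * np.median(np.abs(stat - med))  # ~sigma for normal data
        return np.abs(stat - med) / mad > threshold

    # Example: proportion of CD3+ cells per sample (invented values)
    cd3_freq = [0.61, 0.59, 0.63, 0.60, 0.22, 0.62, 0.58]
    print(flag_outliers(cd3_freq))  # only the 0.22 sample is flagged
    ```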

  17. Assessment of Automated Measurement and Verification (M&V) Methods

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Custodio, Claudine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jump, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
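
    In M&V practice, baseline-model accuracy is typically summarized with normalized goodness-of-fit statistics such as CV(RMSE) and NMBE; the report's methodology is more general, but a minimal computation of those two statistics, with hypothetical metered and modeled values, looks like this:

    ```python
    import numpy as np

    def cv_rmse(actual, predicted, n_params=1):
        """Coefficient of variation of the RMSE, in percent."""
        a, p = np.asarray(actual, float), np.asarray(predicted, float)
        rmse = np.sqrt(np.sum((a - p) ** 2) / (a.size - n_params))
        return 100.0 * rmse / a.mean()

    def nmbe(actual, predicted, n_params=1):
        """Normalized mean bias error, in percent."""
        a, p = np.asarray(actual, float), np.asarray(predicted, float)
        return 100.0 * np.sum(a - p) / ((a.size - n_params) * a.mean())

    # Hypothetical daily whole-building energy use (kWh): metered vs. baseline model
    metered = [120, 135, 128, 140, 150, 131]
    modeled = [118, 138, 125, 142, 147, 135]
    print(cv_rmse(metered, modeled), nmbe(metered, modeled))
    ```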

  18. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography-mass spectrometry, using several preparation techniques, including protein precipitation, solid-phase extraction and centrifugation, without any manual intervention. Pipetting of a known aliquot of whole blood was achieved by integrating a balance and performing gravimetric measurements. The system was able to...

  19. Automated high-throughput assessment of prostate biopsy tissue using infrared spectroscopic chemical imaging

    Science.gov (United States)

    Bassan, Paul; Sachdeva, Ashwin; Shanks, Jonathan H.; Brown, Mick D.; Clarke, Noel W.; Gardner, Peter

    2014-03-01

    Fourier transform infrared (FT-IR) chemical imaging has been demonstrated as a promising technique to complement histopathological assessment of biomedical tissue samples. Current histopathology practice involves preparing thin tissue sections and staining them using hematoxylin and eosin (H&E), after which a histopathologist manually assesses the tissue architecture under a visible microscope. Studies have shown that there is disagreement between operators viewing the same tissue, suggesting that a complementary technique for verification could improve the robustness of the evaluation and improve patient care. FT-IR chemical imaging allows the spatial distribution of chemistry to be rapidly imaged at a high (diffraction-limited) spatial resolution, where each pixel represents an area of 5.5 × 5.5 μm² and contains a full infrared spectrum providing a chemical fingerprint which studies have shown contains the diagnostic potential to discriminate between different cell types, and even the benign or malignant state of prostatic epithelial cells. We report a label-free (i.e. no chemical de-waxing or staining) method of imaging large pieces of prostate tissue (typically 1 cm × 2 cm) in tens of minutes (at a rate of 0.704 × 0.704 mm² every 14.5 s), yielding images containing millions of spectra. Due to refractive index matching between sample and surrounding paraffin, minimal signal processing is required to recover spectra with their natural profile as opposed to harsh baseline correction methods, paving the way for future quantitative analysis of biochemical signatures. The quality of the spectral information is demonstrated by building and testing an automated cell-type classifier based upon spectral features.

  20. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    OpenAIRE

    Constantinos Georgiou; Georgakopoulos, Dimitrios G.; Gerasimos Kremmydas; Efstathios Vasiliou; Efstratios Komaitis

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb²⁺, Hg²⁺ and Cu²⁺) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The response registered is % inhibition of biosensor biol...
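
    The registered response reduces to a simple ratio: percent inhibition of bioluminescence in the toxic carrier relative to a clean control. A one-line sketch under that common definition (the light-unit values are invented):

    ```python
    def percent_inhibition(rlu_sample, rlu_control):
        """Percent inhibition of bioluminescence relative to a clean control."""
        return 100.0 * (1.0 - rlu_sample / rlu_control)

    # Hypothetical relative light units: heavy-metal carrier vs. clean carrier
    print(percent_inhibition(rlu_sample=3.2e5, rlu_control=8.0e5))  # 60.0
    ```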

  1. Access to information: assessment of the use of automated interaction technologies in call centers

    Directory of Open Access Journals (Sweden)

    Fernando de Souza Meirelles

    2011-01-01

    With the purpose of lowering costs and rendering the demanded information available to users with no access to the internet, service companies have adopted automated interaction technologies in their call centers, which may or may not meet the expectations of users. Based on different areas of knowledge (man-machine interaction, consumer behavior and use of IT), 13 propositions are raised and a study is carried out in three parts: a focus group, a field study with users and interviews with experts. Eleven automated service characteristics which support the explanation for user satisfaction are listed, a preference model is proposed and evidence for or against each of the 13 propositions is brought in. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future works, the propositions may become verifiable hypotheses through conclusive empirical research.

  2. ASPIRE: An automated sample positioning and irradiation system for radiation biology experiments at Inter University Accelerator Centre, New Delhi

    International Nuclear Information System (INIS)

    An automated irradiation setup for biology samples has been built at the Inter University Accelerator Centre (IUAC), New Delhi, India. It can automatically load and unload 20 biology samples in a single experimental run and takes about 20 min [2% of the cell doubling time] to irradiate all 20 samples. The cell doubling time is the time taken by the cells (kept in the medium) to double in number. The cells in the samples keep growing throughout the experiment. The fluence delivered to the samples is measured with two silicon surface-barrier detectors. Tests show that the uniformity of fluence and dose of heavy ions reaches 2% over a sample area 40 mm in diameter. The accuracy of the mean fluence at the center of the target area is within 1%. The irradiation setup can be used for studies of radiation therapy, radiation dosimetry and molecular biology at the heavy ion accelerator. - Highlights: • An automated positioning and irradiation setup for biology samples at IUAC has been built. • Loading and unloading of 20 biology samples can be carried out automatically. • Biological cells keep growing during the entire experiment. • Fluence and dose of heavy ions are measured by two silicon barrier detectors. • Uniformity of fluence and dose of heavy ions at the sample position reaches 2%

  3. Gaia: automated quality assessment of protein structure models

    OpenAIRE

    Kota, Pradeep; Ding, Feng; Ramachandran, Srinivas; Dokholyan, Nikolay V.

    2011-01-01

    Motivation: Increasing use of structural modeling for understanding structure–function relationships in proteins has led to the need to ensure that the protein models being used are of acceptable quality. Quality of a given protein structure can be assessed by comparing various intrinsic structural properties of the protein to those observed in high-resolution protein structures.

  4. Image cytometer method for automated assessment of human spermatozoa concentration

    DEFF Research Database (Denmark)

    Egeberg, D L; Kjaerulff, S; Hansen, C; Petersen, J H; Glensbjerg, M; Skakkebaek, N E; Jørgensen, N; Almstrup, K

    2013-01-01

    In the basic clinical work-up of infertile couples, a semen analysis is mandatory and the sperm concentration is one of the most essential variables to be determined. Sperm concentration is usually assessed by manual counting using a haemocytometer and is hence labour intensive and may be subject...

  5. Bayesian Stratified Sampling to Assess Corpus Utility

    OpenAIRE

    Hochberg, Judith; Scovel, Clint; Thomas, Timothy; Hall, Sam

    1998-01-01

    This paper describes a method for asking statistical questions about a large text corpus. We exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" We estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Stratified sampling is used to reduce the sampling uncertainty of the estimate from over 3100 documents t...

  6. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    Science.gov (United States)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

    RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surface. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high efficiency capture/elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads

  7. Comparison between manual and automated techniques for assessment of data from dynamic antral scintigraphy

    International Nuclear Information System (INIS)

    This work aimed at determining whether data from dynamic antral scintigraphy (DAS) yielded by a simple, manual technique are as accurate as those generated by a conventional automated technique (fast Fourier transform) for assessing gastric contractility. Seventy-one stretches (4 min) of 'activity versus time' curves obtained by DAS from 10 healthy volunteers and 11 functional dyspepsia patients, after ingesting a liquid meal (320 ml, 437 kcal) labeled with technetium-99m (99mTc)-phytate, were independently analyzed by manual and automated techniques. Data obtained by both techniques for the frequency of antral contractions were similar. Contraction amplitude determined by the manual technique was significantly higher than that estimated by the automated method, in both patients and controls. The contraction frequency 30 min post-meal was significantly lower in patients than in controls, which was correctly shown by both techniques. A manual technique using ordinary resources of the gamma camera workstation, despite yielding higher figures for the amplitude of gastric contractions, is as accurate as the conventional automated technique of DAS analysis. These findings may favor a more intensive use of DAS coupled to gastric emptying studies, which would provide a more comprehensive assessment of gastric motor function in disease. (author)
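
    The conventional automated technique referenced here is a fast Fourier transform of the 'activity versus time' curve; a minimal sketch of estimating the dominant antral contraction frequency that way (the sampling interval and curve below are synthetic):

    ```python
    import numpy as np

    def dominant_frequency(activity, dt):
        """Dominant frequency (cycles/min) of a detrended activity-time
        curve sampled every `dt` seconds."""
        x = np.asarray(activity, dtype=float)
        x = x - x.mean()                       # remove the DC component
        spec = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(x.size, d=dt)  # in Hz
        return 60.0 * freqs[np.argmax(spec[1:]) + 1]  # skip the zero bin

    # Synthetic 4-min DAS stretch, 1 sample/s, ~3 contractions per minute
    t = np.arange(240)
    curve = 100 + 8 * np.sin(2 * np.pi * (3 / 60) * t)
    curve = curve + np.random.default_rng(1).normal(0, 1, t.size)
    print(round(dominant_frequency(curve, dt=1.0), 2))  # ~3.0
    ```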

  8. Bayesian Stratified Sampling to Assess Corpus Utility

    CERN Document Server

    Hochberg, Judith; Scovel, Clint; Thomas, Timothy; Hall, Sam

    1998-01-01

    This paper describes a method for asking statistical questions about a large text corpus. We exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" We estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Stratified sampling is used to reduce the sampling uncertainty of the estimate from over 3100 documents to fewer than 1000. The stratification is based on observed characteristics of real documents, while the sampling procedure incorporates a Bayesian version of Neyman allocation. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
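
    Classical Neyman allocation, on which the Bayesian version here builds, assigns stratum sample sizes in proportion to stratum size times within-stratum standard deviation; a small worked sketch (the stratum sizes and standard deviations are invented, not the paper's):

    ```python
    import numpy as np

    def neyman_allocation(N_h, sigma_h, n_total):
        """Classical Neyman allocation: n_h proportional to N_h * sigma_h.
        Rounding means the result can differ from n_total by a sample or two."""
        N_h, sigma_h = np.asarray(N_h, float), np.asarray(sigma_h, float)
        w = N_h * sigma_h
        return np.rint(n_total * w / w.sum()).astype(int)

    # Hypothetical strata of the 45,820-document corpus
    N_h = [30000, 12000, 3820]     # stratum sizes
    sigma_h = [0.10, 0.45, 0.30]   # estimated within-stratum std. deviations
    print(neyman_allocation(N_h, sigma_h, n_total=200))  # [63 113 24]
    ```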

  9. Performance of optimized McRAPD in identification of 9 yeast species frequently isolated from patient samples: potential for automation

    Directory of Open Access Journals (Sweden)

    Koukalova Dagmar

    2009-11-01

    Background: Rapid, easy, economical and accurate species identification of yeasts isolated from clinical samples remains an important challenge for routine microbiological laboratories, because susceptibility to antifungal agents, the probability of developing resistance and the ability to cause disease vary among species. To overcome the drawbacks of the currently available techniques we have recently proposed an innovative approach to yeast species identification based on RAPD genotyping, termed McRAPD (Melting curve of RAPD). Here we have evaluated its performance on a broader spectrum of clinically relevant yeast species and also examined the potential of automated and semi-automated interpretation of McRAPD data for yeast species identification. Results: A simple fully automated algorithm based on normalized melting data identified 80% of the isolates correctly. When this algorithm was supplemented by semi-automated matching of decisive peaks in first-derivative plots, 87% of the isolates were identified correctly. However, computer-aided visual matching of derivative plots showed the best performance, with on average 98.3% of isolates identified accurately, almost matching the 99.4% performance of traditional RAPD fingerprinting. Conclusion: Since the McRAPD technique omits gel electrophoresis and can be performed in a rapid, economical and convenient way, we believe that it can find its place in the routine identification of medically important yeasts in advanced diagnostic laboratories that are able to adopt this technique. It can also serve as a broad-range high-throughput technique for epidemiological surveillance.
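
    Computer-aided matching of derivative melting curves can be approximated by correlating a normalized query curve against per-species reference curves; the published McRAPD procedure differs in detail, and all curves below are synthetic:

    ```python
    import numpy as np

    def classify_melting_curve(query, references):
        """Assign the species whose reference derivative melting curve has
        the highest Pearson correlation with the query curve."""
        q = (query - query.mean()) / query.std()
        best, best_r = None, -np.inf
        for species, curve in references.items():
            c = (curve - curve.mean()) / curve.std()
            r = float(np.mean(q * c))  # Pearson r of standardized curves
            if r > best_r:
                best, best_r = species, r
        return best, best_r

    # Synthetic -dF/dT profiles on a shared temperature grid
    temps = np.linspace(70, 95, 120)
    refs = {
        "C. albicans": np.exp(-((temps - 84.0) / 1.5) ** 2),
        "C. glabrata": np.exp(-((temps - 88.5) / 1.2) ** 2),
    }
    unknown = np.exp(-((temps - 84.2) / 1.6) ** 2)
    print(classify_melting_curve(unknown, refs))  # ('C. albicans', ...)
    ```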

  10. Donor disc attachment assessment with intraoperative spectral optical coherence tomography during descemet stripping automated endothelial keratoplasty

    Directory of Open Access Journals (Sweden)

    Edward Wylegala

    2013-01-01

    Optical coherence tomography has already been proven to be useful for pre- and post-surgical anterior eye segment assessment, especially in lamellar keratoplasty procedures. There is no evidence for the intraoperative usefulness of optical coherence tomography (OCT). We present a case report of intraoperative donor disc attachment assessment with spectral-domain optical coherence tomography in a case of Descemet stripping automated endothelial keratoplasty (DSAEK) surgery combined with corneal incisions. The effectiveness of the performed corneal stab incisions was visualized directly by OCT scan analysis. OCT-assisted DSAEK allows the assessment of the accuracy of the Descemet stripping and donor disc attachment.

  11. Feasibility studies of safety assessment methods for programmable automation systems. Final report of the AVV project

    International Nuclear Information System (INIS)

    Feasibility studies of two different groups of methodologies for the safety assessment of programmable automation systems have been executed at the Technical Research Centre of Finland (VTT). The studies concerned dynamic testing methods and the fault tree (FT) and failure mode and effects analysis (FMEA) methods. In order to gain real experience in the application of these methods, experimental testing of two realistic pilot systems was executed and a FT/FMEA analysis of a programmable safety function was accomplished. The purpose of the studies was not to assess the object systems, but to gain experience in the application of the methods and to assess their potentials and development needs. (46 refs., 21 figs.)

  12. Defining food sampling strategy for chemical risk assessment

    OpenAIRE

    Wesolek, Nathalie; Roudot, Alain-Claude

    2012-01-01

    Collection of accurate and reliable data is a prerequisite for informed risk assessment and risk management. For chemical contaminants in food, contamination assessments enable consumer protection and exposure assessments. And yet, the accuracy of a contamination assessment depends on both chemical analysis and sampling plan performance. A sampling plan is always used when the contamination level of a food lot is evaluated, due to the fact that the whole lot can not ...

  13. Automated Peripheral Neuropathy Assessment Using Optical Imaging and Foot Anthropometry.

    Science.gov (United States)

    Siddiqui, Hafeez-U R; Spruce, Michelle; Alty, Stephen R; Dudley, Sandra

    2015-08-01

    A large proportion of individuals who live with type-2 diabetes suffer from plantar sensory neuropathy. Regular testing and assessment for the condition is required to avoid ulceration or other damage to patient's feet. Currently accepted practice involves a trained clinician testing a patient's feet manually with a hand-held nylon monofilament probe. The procedure is time consuming, labor intensive, requires special training, is prone to error, and repeatability is difficult. With the vast increase in type-2 diabetes, the number of plantar sensory neuropathy sufferers has already grown to such an extent as to make a traditional manual test problematic. This paper presents the first investigation of a novel approach to automatically identify the pressure points on a given patient's foot for the examination of sensory neuropathy via optical image processing incorporating plantar anthropometry. The method automatically selects suitable test points on the plantar surface that correspond to those repeatedly chosen by a trained podiatrist. The proposed system automatically identifies the specific pressure points at different locations, namely the toe (hallux), metatarsal heads and heel (Calcaneum) areas. The approach is generic and has shown 100% reliability on the available database used. The database consists of Chinese, Asian, African, and Caucasian foot images. PMID:26186748

  14. Assessing Pulmonary Perfusion in Emphysema Automated Quantification of Perfused Blood Volume in Dual-Energy CTPA

    OpenAIRE

    Meinel, Felix G.; Graef, Anita; Thieme, Sven F.; Bamberg, Fabian; Schwarz, Florian; Sommer, Wieland; Helck, Andreas D.; Neurohr, Claus; Reiser, Maximilian F.; Johnson, Thorsten R. C.

    2013-01-01

    Objectives: The objective of this study was to determine whether automated quantification of lung perfused blood volume (PBV) in dual-energy computed tomographic pulmonary angiography (DE-CTPA) can be used to assess the severity and regional distribution of pulmonary hypoperfusion in emphysema. Materials and Methods: We retrospectively analyzed 40 consecutive patients (mean age, 67 [13] years) with pulmonary emphysema, who have no cardiopulmonary comorbidities, and a DE-CTPA negative for pulmo...

  15. Automating the aviation command safety assessment survey as an Enterprise Information System (EIS)

    OpenAIRE

    Held, Jonathan S.; Mingo, Fred J.

    1999-01-01

    The Aviation Command Safety Assessment (ACSA) is a questionnaire survey methodology developed to evaluate a Naval Aviation Command's safety climate, culture, and safety program effectiveness. This survey was a manual process first administered in the fall of 1996. The primary goal of this thesis is to design, develop, and test an Internet-based, prototype model for administering this survey using new technologies that allow automated survey submission and analysis. The result of this thesis i...

  16. Using neural networks to assess flight deck human–automation interaction

    International Nuclear Information System (INIS)

    The increased complexity and interconnectivity of flight deck automation has made the prediction of human–automation interaction (HAI) difficult and has resulted in a number of accidents and incidents. There is a need to develop objective and robust methods by which the changes in HAI brought about by the introduction of new automation into the flight deck can be predicted and assessed prior to implementation and without the use of extensive simulation. This paper presents a method to model a parametrization of flight deck automation known as HART and link it to HAI consequences using a backpropagation neural network approach. The transformation of the HART into a computational model suitable for modeling as a neural network is described. To test and train the network, data were collected from 40 airline pilots for six HAI consequences based on one scenario family consisting of a baseline and four variants. For a binary classification of HAI consequences, the neural network successfully classified 62–78.5% depending on the consequence. The results were verified using a decision tree analysis.
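
    A backpropagation network mapping HART-style automation parameters to a binary HAI consequence can be sketched with scikit-learn; the feature count, labels and sample sizes below are invented stand-ins for the paper's pilot data:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(42)
    X = rng.uniform(0, 1, size=(200, 10))            # 10 HART-style parameters
    y = (X[:, 0] + 0.5 * X[:, 3] > 0.9).astype(int)  # surrogate HAI consequence

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(8,), activation="logistic",
                        max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)  # trained by backpropagation
    print(f"held-out accuracy: {net.score(X_te, y_te):.2f}")
    ```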

  17. Evaluation of two automated enzyme-immunoassays for detection of thermophilic campylobacters in faecal samples from cattle and swine

    DEFF Research Database (Denmark)

    Hoorfar, Jeffrey; Nielsen, E.M.; Stryhn, H.; Andersen, S.

    We evaluated the performance of two enzyme-immunoassays (EIA) for the detection of naturally occurring, thermophilic Campylobacter spp. found in faecal samples from cattle (n = 21 and n = 26) and swine (n = 43), relative to the standard culture method and also assuming that none of the tests was the definitive standard. The primary isolation for both the culture and the EIA methods was carried out by overnight selective enrichment in Preston broth. The results showed good sensitivities for both EIA methods in cattle (95% and 84%) and swine (88% and 69%) samples. However, when testing cattle samples, the EIA-2 method resulted in a rather low specificity (32%). This seemed to be partially due to the isolation of nonthermophilic species. In conclusion, the EIA-1 method may provide a simple and fast tool with good accuracy in cattle and swine samples for automated screening of large numbers of samples.

  18. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of these data in this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. There are two main components proposed to be assessed in the conceptual model: metadata and data. The metadata component comprises indicators for the hosting websites and the sources of the data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess both components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.
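
    The proposed automated text categorization of the metadata component could be prototyped with a standard supervised pipeline; this generic sketch uses invented training snippets and labels, not data from the paper:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy training data: hosting-site descriptions labeled credible (1) or not (0)
    texts = [
        "official national mapping agency open data portal",
        "government geospatial clearinghouse with documented metadata standards",
        "anonymous blog reposting unsourced map mashups",
        "forum thread sharing unverified GPS traces",
    ]
    labels = [1, 1, 0, 0]

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(texts, labels)
    print(clf.predict(["community portal with documented survey metadata"]))
    ```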

  19. Automation impact study of Army training management 2: Extension of sampling and collection of installation resource data

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, T.F.; McCallum, M.C.; Hunt, P.S.; Slavich, A.L.; Underwood, J.A.; Toquam, J.L.; Seaver, D.A.

    1989-05-01

    This automation impact study of Army training management (TM) was performed for the Army Development and Employment Agency (ADEA) and the Combined Arms Training Activity (CATA) by the Battelle Human Affairs Research Centers and the Pacific Northwest Laboratory. The primary objective of the study was to provide the Army with information concerning the potential costs and savings associated with automating the TM process. This study expands the sample of units surveyed in Phase I of the automation impact effort (Sanquist et al., 1988), and presents data concerning installation resource management in relation to TM. The structured interview employed in Phase I was adapted to a self-administered survey. The data collected were compatible with that of Phase I, and both were combined for analysis. Three US sites, one reserve division, one National Guard division, and one unit in the active component outside the continental US (OCONUS) (referred to in this report as forward deployed) were surveyed. The total sample size was 459, of which 337 respondents contributed the most detailed data. 20 figs., 62 tabs.

  20. Visualizing and Assessing Acceptance Sampling Plans: The R Package AcceptanceSampling

    OpenAIRE

    Andreas Kiermeier

    2008-01-01

    Manufacturers and government agencies frequently use acceptance sampling to decide whether a lot from a supplier or exporting country should be accepted or rejected. International standards on acceptance sampling provide sampling plans for specific circumstances. The aim of this package is to provide an easy-to-use interface to visualize single, double or multiple sampling plans. In addition, methods have been provided to enable the user to assess sampling plans against pre-specified levels of...
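
    For a single sampling plan with sample size n and acceptance number c, the operating characteristic (OC) curve such packages visualize is the binomial probability of observing at most c defectives; a Python equivalent of that computation (the plan below is hypothetical):

    ```python
    from scipy.stats import binom

    def oc_curve(n, c, p):
        """P(accept lot) for a single sampling plan at true defect rate p."""
        return binom.cdf(c, n, p)

    # Hypothetical plan: inspect 50 items, accept if at most 2 are defective
    for p in (0.01, 0.02, 0.05, 0.10):
        print(f"p = {p:.2f}  P(accept) = {oc_curve(50, 2, p):.3f}")
    ```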

  1. a Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  2. MLP based Reusability Assessment Automation Model for Java based Software Systems

    Directory of Open Access Journals (Sweden)

    Surbhi Maggo

    2014-08-01

    Reuse refers to the common principle of using existing resources repeatedly, which is pervasively applicable everywhere. In software engineering, reuse refers to the development of software systems using already available artifacts or assets, partially or completely, with or without modifications. Software reuse not only promises significant improvements in productivity and quality but also provides for the development of more reliable, cost-effective, dependable and less buggy software (considering that prior use and testing have removed errors) with reduced time and effort. In this paper we present an efficient and reliable automation model for the reusability evaluation of procedure-based object-oriented software, predicting the reusability level of components as low, medium or high. The presented model follows a reusability metric framework that targets the requisite reusability attributes, including maintainability (using the Maintainability Index), for functional analysis of the components. A multilayer perceptron (a backpropagation-based neural network) is then applied to establish significant relationships among these attributes for reusability prediction. The proposed approach provides support for reusability evaluation at the functional level rather than at the structural level. Automation support for this approach is provided in the form of a tool named JRA2M2 (Java-based Reusability Assessment Automation Model using Multilayer Perceptron (MLP)), implemented in Java. The performance of JRA2M2 is recorded using parameters like accuracy, classification error, precision and recall. The results generated using JRA2M2 indicate that the proposed automation tool can be effectively used as a reliable and efficient solution for the automated evaluation of reusability.
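
    The Maintainability Index the framework relies on is commonly computed from Halstead volume, cyclomatic complexity and lines of code; one widely cited classic variant is sketched below (the coefficients follow that variant, and the input values are hypothetical):

    ```python
    import math

    def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
        """Classic (unnormalized) Maintainability Index variant."""
        return (171.0
                - 5.2 * math.log(halstead_volume)
                - 0.23 * cyclomatic_complexity
                - 16.2 * math.log(loc))

    # Hypothetical per-component averages
    print(round(maintainability_index(1200.0, 9, 180), 1))  # ~47.9
    ```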

  3. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    International Nuclear Information System (INIS)

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol using LiBO2/LiBr melts was developed. The use of an M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg-1 for 5-300 mg of sample.

  4. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake. usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/ pqlx/).
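
    Noise baselines of this kind are built from distributions of many power spectral density estimates; a minimal sketch using Welch's method over hourly segments (synthetic data; the PQLX/McNamara approach bins PSDs into probability density functions, but the percentile-envelope idea is the same):

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 40.0  # samples per second (hypothetical station)
    rng = np.random.default_rng(3)
    hours = [rng.normal(0, 1 + 0.2 * k, int(fs * 3600)) for k in range(24)]

    psds = []
    for x in hours:  # one PSD estimate per hourly segment
        f, pxx = welch(x, fs=fs, nperseg=4096)
        psds.append(10 * np.log10(pxx))  # convert to dB
    psds = np.array(psds)

    # Baseline envelope: 5th/50th/95th percentile PSD at each frequency;
    # out-of-nominal noise sources plot outside this envelope.
    p05, p50, p95 = np.percentile(psds, [5, 50, 95], axis=0)
    print(f.size, p50.size)
    ```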

  5. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    Science.gov (United States)

    Williams, Alex C.; Hitt, Austin; Voisin, Sophie; Tourassi, Georgia

    2013-03-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.
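
    A simple fluctuating-asymmetry index for paired volumes is the absolute left-right difference normalized by the mean; a sketch of the group comparison described above (all volumes invented):

    ```python
    import numpy as np

    def asymmetry_index(left, right):
        """|L - R| / mean(L, R): relative fluctuating asymmetry of paired volumes."""
        left, right = np.asarray(left, float), np.asarray(right, float)
        return np.abs(left - right) / ((left + right) / 2.0)

    # Hypothetical breast volumes (cm^3) from automatically segmented MLO views
    cancer = asymmetry_index([620, 540, 700], [540, 475, 610])
    controls = asymmetry_index([600, 560, 680], [590, 555, 676])
    print(cancer.mean(), controls.mean())  # the cancer group shows larger asymmetry
    ```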

  6. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    International Nuclear Information System (INIS)

    To assess the feasibility of commercially available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparison with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated the right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with the Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and the measured ex-vivo liver volume, which was converted from weight, using analysis of variance and Pearson's or Spearman correlation tests. Processing times for the automated and interactive manual methods were also compared. Excellent-to-good quality of automated segmentation for the total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between the automated volume and the manual volume of the total liver (p = 0.011). There were good correlations between the automated volume and the ex-vivo liver volume (γ = 0.637 for the total liver and γ = 0.767 for the right hemiliver). Both correlation coefficients were higher than those obtained with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec; right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  7. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)

    2013-02-15

    To assess the feasibility of commercially available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparison with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated the right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with the Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and the measured ex-vivo liver volume, which was converted from weight, using analysis of variance and Pearson's or Spearman correlation tests. Processing times for the automated and interactive manual methods were also compared. Excellent-to-good quality of automated segmentation for the total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between the automated volume and the manual volume of the total liver (p = 0.011). There were good correlations between the automated volume and the ex-vivo liver volume (γ = 0.637 for the total liver and γ = 0.767 for the right hemiliver). Both correlation coefficients were higher than those obtained with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec; right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  8. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study

    Directory of Open Access Journals (Sweden)

    Paul Otten

    2015-08-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient's upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods.

  9. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Directory of Open Access Journals (Sweden)

    Marjan Mansourvar

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results signify an improvement in assessment accuracy over GP and ANN, while generalization capability is possible with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new presented method has the capacity to learn many hundreds of times faster than traditional learning methods and it has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age.

  10. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Science.gov (United States)

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results signify an improvement in assessment accuracy over GP and ANN, while generalization capability is possible with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new presented method has the capacity to learn many hundreds of times faster than traditional learning methods and it has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795
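
    An extreme learning machine trains only its output layer: hidden weights are drawn at random and output weights come from a single least-squares solve, which is what makes training "many hundreds of times faster" than backpropagation. A compact regression sketch (the features and ages are random placeholders, not bone-age data):

    ```python
    import numpy as np

    class ELM:
        """Minimal extreme learning machine for regression."""
        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, y):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random input weights
            self.b = self.rng.normal(size=self.n_hidden)                # random biases
            H = np.tanh(X @ self.W + self.b)                            # hidden activations
            self.beta = np.linalg.pinv(H) @ y                           # least-squares output weights
            return self

        def predict(self, X):
            return np.tanh(X @ self.W + self.b) @ self.beta

    # Placeholder data: 100 feature vectors -> skeletal age (years)
    rng = np.random.default_rng(1)
    X, y = rng.normal(size=(100, 12)), rng.uniform(5, 18, size=100)
    print(ELM().fit(X, y).predict(X[:3]))
    ```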

  11. Evaluation of a software package for automated quality assessment of contrast detail images—comparison with subjective visual assessment

    Science.gov (United States)

    Pascoal, A.; Lawinski, C. P.; Honey, I.; Blake, P.

    2005-12-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA_detector, which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  12. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pascoal, A [Medical Engineering and Physics, King's College London, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom); Lawinski, C P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom); Honey, I [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom); Blake, P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)

    2005-12-07

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA_detector, which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  13. Cost analysis of automated long-term sampling in comparison to existing application modes of manual short-term sampling

    Energy Technology Data Exchange (ETDEWEB)

    Reinmann, J. [bm becker messtechnik gmbh, Eschborn (Germany); Huang, A. [TUeV Rheinland Taiwan Ltd., Taipeh (Taiwan); Mehl, K.W.

    2004-09-15

    Because of the unsatisfactory information given by manual sampling, some plants are, at the demand of the local authorities, controlled by more frequent manual sampling. Such frequent manual sampling leads to a considerable increase in the cost of dioxin emission control. As reported in earlier publications, the ROCEPA (Republic of China EPA) set up a project for the continuous monitoring of PCDD/F. One topic of this project, which is surely also of general international interest, was a cost analysis comparing long-term sampling with the different application modes of manual sampling that are applied in practice in Taiwan in different plants. For the project, the long-term sampling system AMESA® was chosen, and the published results are therefore calculated on the basis of the AMESA® system price. Additional calculations show that, also for dioxin inventories in European countries, the costs of using a long-term sampling system would be in an acceptable, cost-efficient range.

  14. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    Science.gov (United States)

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  15. SU-E-I-94: Automated Image Quality Assessment of Radiographic Systems Using An Anthropomorphic Phantom

    International Nuclear Information System (INIS)

    Purpose: In a large, academic medical center, consistent radiographic imaging performance is difficult to routinely monitor and maintain, especially for a fleet consisting of multiple vendors, models, software versions, and numerous imaging protocols. Thus, an automated image quality control methodology has been implemented using routine image quality assessment with a physical, stylized anthropomorphic chest phantom. Methods: The “Duke” Phantom (Digital Phantom 07-646, Supertech, Elkhart, IN) was imaged twice on each of 13 radiographic units from a variety of vendors at 13 primary care clinics. The first acquisition used the clinical PA chest protocol to acquire the post-processed “FOR PRESENTATION” image. The second image was acquired without an antiscatter grid followed by collection of the “FOR PROCESSING” image. Manual CNR measurements were made from the largest and thickest contrast-detail inserts in the lung, heart, and abdominal regions of the phantom in each image. An automated image registration algorithm was used to estimate the CNR of the same insert using similar ROIs. Automated measurements were then compared to the manual measurements. Results: Automatic and manual CNR measurements obtained from “FOR PRESENTATION” images had average percent differences of 0.42%±5.18%, −3.44%±4.85%, and 1.04%±3.15% in the lung, heart, and abdominal regions, respectively; measurements obtained from “FOR PROCESSING” images had average percent differences of -0.63%±6.66%, −0.97%±3.92%, and −0.53%±4.18%, respectively. The maximum absolute difference in CNR was 15.78%, 10.89%, and 8.73% in the respective regions. In addition to CNR assessment of the largest and thickest contrast-detail inserts, the automated method also provided CNR estimates for all 75 contrast-detail inserts in each phantom image. Conclusion: Automated analysis of a radiographic phantom has been shown to be a fast, robust, and objective means for assessing radiographic

  16. SU-E-I-94: Automated Image Quality Assessment of Radiographic Systems Using An Anthropomorphic Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Wells, J; Wilson, J; Zhang, Y; Samei, E; Ravin, Carl E. [Advanced Imaging Laboratories, Duke Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, NC (United States)

    2014-06-01

    Purpose: In a large, academic medical center, consistent radiographic imaging performance is difficult to routinely monitor and maintain, especially for a fleet consisting of multiple vendors, models, software versions, and numerous imaging protocols. Thus, an automated image quality control methodology has been implemented using routine image quality assessment with a physical, stylized anthropomorphic chest phantom. Methods: The “Duke” Phantom (Digital Phantom 07-646, Supertech, Elkhart, IN) was imaged twice on each of 13 radiographic units from a variety of vendors at 13 primary care clinics. The first acquisition used the clinical PA chest protocol to acquire the post-processed “FOR PRESENTATION” image. The second image was acquired without an antiscatter grid followed by collection of the “FOR PROCESSING” image. Manual CNR measurements were made from the largest and thickest contrast-detail inserts in the lung, heart, and abdominal regions of the phantom in each image. An automated image registration algorithm was used to estimate the CNR of the same insert using similar ROIs. Automated measurements were then compared to the manual measurements. Results: Automatic and manual CNR measurements obtained from “FOR PRESENTATION” images had average percent differences of 0.42%±5.18%, −3.44%±4.85%, and 1.04%±3.15% in the lung, heart, and abdominal regions, respectively; measurements obtained from “FOR PROCESSING” images had average percent differences of -0.63%±6.66%, −0.97%±3.92%, and −0.53%±4.18%, respectively. The maximum absolute difference in CNR was 15.78%, 10.89%, and 8.73% in the respective regions. In addition to CNR assessment of the largest and thickest contrast-detail inserts, the automated method also provided CNR estimates for all 75 contrast-detail inserts in each phantom image. Conclusion: Automated analysis of a radiographic phantom has been shown to be a fast, robust, and objective means for assessing radiographic
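
    The CNR figure computed by both the manual and automated methods reduces to ROI statistics; a minimal sketch under one common definition (the patches below are synthetic, not measurements from the Duke phantom):

    ```python
    import numpy as np

    def cnr(signal_roi, background_roi):
        """Contrast-to-noise ratio: |mean difference| over background noise."""
        s = np.asarray(signal_roi, dtype=float)
        b = np.asarray(background_roi, dtype=float)
        return abs(s.mean() - b.mean()) / b.std(ddof=1)

    # Synthetic contrast-detail insert and background patches
    rng = np.random.default_rng(7)
    insert = rng.normal(110, 5, size=(20, 20))
    background = rng.normal(100, 5, size=(20, 20))
    print(round(cnr(insert, background), 2))  # ~2
    ```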

  17. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    International Nuclear Information System (INIS)

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually. Samples were recorded by hand in a logbook and given an ID number, and all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time consuming and inefficient. Sample registration software has been developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that laboratory personnel would otherwise need to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  18. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)

    2015-04-29

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually. Samples were recorded by hand in a logbook and given an ID number, and all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time consuming and inefficient. Sample registration software has been developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that laboratory personnel would otherwise need to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
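
    The record states the software was written in National Instruments LabVIEW 8.6; purely to illustrate the batch-coding logic it describes, here is a hedged Python sketch that assigns sequential codes to the samples, standards, SRMs and blanks of one batch and prints a registration listing (the code format is invented).

        from datetime import date

        def register_batch(batch_no, items):
            """Assign a sequential code to every item (name, kind) in one NAA batch."""
            stamp = date.today().strftime("%Y%m%d")
            records = []
            for i, (name, kind) in enumerate(items, start=1):
                code = f"NAA-{stamp}-B{batch_no:02d}-{i:03d}"   # invented code format
                records.append({"code": code, "name": name, "type": kind})
            return records

        batch = register_batch(1, [("soil A", "sample"),
                                   ("SRM 1633c", "standard"),
                                   ("empty vial", "blank")])
        for r in batch:                      # printable registration listing
            print(f'{r["code"]}  {r["type"]:8s}  {r["name"]}')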

  19. Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.

    Science.gov (United States)

    Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A

    2016-04-01

    Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. PMID:26894596
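
    A minimal sketch of the classification step described above, assuming a 3-dimensional feature vector derived from the segmented vessel map; the features and data below are synthetic stand-ins, not the QUARTZ implementation.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        # Synthetic 3-D features, e.g. vessel area fraction, fragment count, mean width.
        good = rng.normal([0.12, 40.0, 5.0], [0.02, 8.0, 0.6], (400, 3))
        bad = rng.normal([0.04, 90.0, 3.0], [0.02, 20.0, 0.8], (400, 3))
        X = np.vstack([good, bad])
        y = np.array([1] * 400 + [0] * 400)      # 1 = adequate, 0 = inadequate

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")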

  20. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    Science.gov (United States)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  1. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
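
    A hedged sketch of the 'basic' two-stage scheme described above: score half the Welfare Quality sample, stop early if the farm can already be classified confidently against a prevalence threshold, and otherwise score the second half. The threshold and stopping margin are illustrative, not the published values.

        import random

        def two_stage_classification(herd, full_n, threshold=0.20, margin=0.05):
            """Return ('pass'|'fail', animals scored); herd is a list of 0/1 lameness scores."""
            half = full_n // 2
            animals = list(herd)
            random.shuffle(animals)              # draw without replacement across both stages
            p1 = sum(animals[:half]) / half
            if p1 <= threshold - margin:         # clearly below threshold: stop early
                return "pass", half
            if p1 >= threshold + margin:         # clearly above threshold: stop early
                return "fail", half
            p = sum(animals[:2 * half]) / (2 * half)   # second sampling event
            return ("fail" if p >= threshold else "pass"), 2 * half

        random.seed(1)
        herd = [1] * 30 + [0] * 170              # true lameness prevalence 15%
        print(two_stage_classification(herd, full_n=60))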

  2. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Boat sampling technique (BST) is a surface sampling technique, which has been developed for obtaining, in-situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive in nature and the sample is obtained without plastic deformation or without thermal degradation of the parent material. The shape and size of the sample depends upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and the specimens are subjected to various tests, viz. Metallurgical Evaluation, Metallographic Evaluation, Micro-hardness Evaluation, sensitisation test, small punch test etc. to confirm the integrity and assessment of safe operating life of the component. This paper highlights design objective of boat sampling technique, description of sampling module, sampling cutter and its performance evaluation, cutting process, boat samples, operational sequence of sampling module, qualification of sampling module, qualification of sampling technique, qualification of scooped region of the parent material, sample retrieval system, inspection, testing and examination to be carried out on the boat samples and scooped region. (author)

  3. A self-contained polymeric cartridge for automated biological sample preparation

    OpenAIRE

    Xu, Guolin; Lee, Daniel Yoke San; Xie, Hong; Chiew, Deon; Hsieh, Tseng-Ming; Ali, Emril Mohamed; Lun Looi, Xing; Li, Mo-Huang; Ying, Jackie Y.

    2011-01-01

    Sample preparation is one of the most crucial processes for nucleic acids based disease diagnosis. Several steps are required for nucleic acids extraction, impurity washes, and DNA/RNA elution. Careful sample preparation is vital to obtaining a reliable diagnosis, especially with low copies of pathogens and cells. This paper describes a low-cost, disposable lab cartridge for automatic sample preparation, which is capable of handling flexible sample volumes of 10 μl to 1 ml. This plastic ...

  4. Automation and environment of a sample of the modernized installation YuMO

    International Nuclear Information System (INIS)

    New capabilities of the modernized YuMO installation resulting from the automation of individual units are shown. The main devices unique to the modernization are presented, and the advantages of the upgraded spectrometer are demonstrated. The basic approaches to building control systems for the actuating mechanisms of spectrometers, based on their unification and standardization, are formulated. Circuit diagrams are given for the stepper-motor control unit, the stepper-motor switching amplifier and the chopper period-and-phase stabilization system, together with the block diagram of the control system for the actuating mechanisms of the YuMO spectrometer. The main technical parameters of the principal original mechanical devices are given. (author)

  5. Automation and Environment of a Sample of the Modernized Installation YuMO

    CERN Document Server

    Kuklin, A I; Kirilov, A S; Islamov, A H; Petukhova, N V; Utrobin, P K; Kovalev, Yu S; Gordeliy, V I

    2004-01-01

    New capabilities of the modernized YuMO installation resulting from the automation of individual units are shown. The main devices unique to the modernization are presented, and the advantages of the upgraded spectrometer are demonstrated. The basic approaches to building control systems for the actuating mechanisms of spectrometers, based on their unification and standardization, are formulated. Circuit diagrams are given for the stepper-motor control unit, the stepper-motor switching amplifier and the chopper period-and-phase stabilization system, together with the block diagram of the control system for the actuating mechanisms of the YuMO spectrometer. The main technical parameters of the principal original mechanical devices are given.

  6. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
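
    The core raster step behind such surge mapping can be sketched in a few lines of NumPy (the abstract notes that the authors' GIS framework was written in Python); this toy version simply subtracts a digital elevation model from a surge water level and stands in for the full NOAA Probabilistic Surge workflow.

        import numpy as np

        def inundation_depth(surge_level_m, elevation_m):
            """Per-cell flood depth: water above ground; dry cells become NaN."""
            depth = surge_level_m - elevation_m
            return np.where(depth > 0.0, depth, np.nan)

        # Toy digital elevation model (metres above datum) and a uniform 2 m surge.
        dem = np.array([[0.2, 1.5, 3.0],
                        [0.0, 0.8, 2.2]])
        print(inundation_depth(2.0, dem))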

  7. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    The process of evaluating the quality of radiographic images in general, and mammography in particular, can be much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (PDI) techniques, using an existing image processing environment (ImageJ). With the application of PDI techniques it was possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. This comparison demonstrated that the automated methodology is a promising alternative for reducing or eliminating the subjectivity of the visual assessment methodology currently in use. (author)

  8. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    Directory of Open Access Journals (Sweden)

    Constantinos Georgiou

    2010-07-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The response registered is the % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting from injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration-related levels of toxicity. The biosensor’s response to carrier solutions of different pHs was tested. Vibrio fischeri’s bioluminescence is promoted in the pH 5–10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions.

  9. Development of a fully automated Flow Injection analyzer implementing bioluminescent biosensors for water toxicity assessment.

    Science.gov (United States)

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb(2+), Hg(2+) and Cu(2+)) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The response registered is the % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting from injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration-related levels of toxicity. The biosensor's response to carrier solutions of different pHs was tested. Vibrio fischeri's bioluminescence is promoted in the pH 5-10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592
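
    The reported response is a simple normalised ratio against a deionised-water control; a minimal sketch of that computation, with invented luminescence readings, follows.

        def percent_inhibition(control_lum, sample_lum):
            """Inhibition of Vibrio fischeri bioluminescence relative to a water control."""
            return 100.0 * (control_lum - sample_lum) / control_lum

        control = 52000.0                        # hypothetical reading in deionised water
        for metal, lum in [("Hg2+ 0.1 mg/L", 9800.0), ("Pb2+ 0.1 mg/L", 31000.0)]:
            print(f"{metal}: {percent_inhibition(control, lum):.1f}% inhibition")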

  10. Development and evaluation of a virtual microscopy application for automated assessment of Ki-67 expression in breast cancer

    Directory of Open Access Journals (Sweden)

    Turpeenniemi-Hujanen Taina

    2011-01-01

    Background The aim of the study was to develop a virtual microscopy enabled method for assessment of Ki-67 expression and to study the prognostic value of the automated analysis in a comprehensive series of patients with breast cancer. Methods Using a previously reported virtual microscopy platform and an open source image processing tool, ImageJ, a method for assessment of immunohistochemically (IHC) stained area and intensity was created. A tissue microarray (TMA) series of breast cancer specimens from 1931 patients was immunostained for Ki-67, digitized with a whole slide scanner and uploaded to an image web server. The extent of Ki-67 staining in the tumour specimens was assessed both visually and with the image analysis algorithm. The prognostic value of the computer vision assessment of Ki-67 was evaluated by comparison of distant disease-free survival in patients with low, moderate or high expression of the protein. Results 1648 evaluable image files from 1334 patients were analysed in less than two hours. Visual and automated Ki-67 extent of staining assessments showed a percentage agreement of 87% and a weighted kappa value of 0.57. The hazard ratio for distant recurrence for patients with a computer-determined moderate Ki-67 extent of staining was 1.77 (95% CI 1.31-2.37) and for high extent 2.34 (95% CI 1.76-3.10), compared to patients with a low extent. In multivariate survival analyses, automated assessment of Ki-67 extent of staining was retained as a significant prognostic factor. Conclusions Running high-throughput automated IHC algorithms on a virtual microscopy platform is feasible. Comparison of visual and automated assessments of Ki-67 expression shows moderate agreement. In multivariate survival analysis, the automated assessment of Ki-67 extent of staining is a significant and independent predictor of outcome in breast cancer.
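
    The agreement statistics quoted above (percentage agreement and weighted kappa) are standard for comparing two categorical gradings; a sketch with made-up low/moderate/high calls, using scikit-learn:

        from sklearn.metrics import cohen_kappa_score

        visual = ["low", "low", "mod", "high", "mod", "low", "high", "mod"]
        automated = ["low", "mod", "mod", "high", "mod", "low", "mod", "mod"]

        agreement = sum(v == a for v, a in zip(visual, automated)) / len(visual)
        # Linear weights penalise low-vs-high disagreements more than low-vs-moderate ones.
        kappa = cohen_kappa_score(visual, automated, labels=["low", "mod", "high"],
                                  weights="linear")
        print(f"percentage agreement {agreement:.0%}, weighted kappa {kappa:.2f}")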

  11. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    Science.gov (United States)

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation processes were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, a series of samples was taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good agreement. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample. PMID:26759433

  12. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range

    NARCIS (Netherlands)

    Duijn, E. van; Sandman, H.; Grossouw, D.; Mocking, J.A.J.; Coulier, L.; Vaes, W.H.J.

    2014-01-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput.

  13. Rapid habitability assessment of Mars samples by pyrolysis-FTIR

    Science.gov (United States)

    Gordon, Peter R.; Sephton, Mark A.

    2016-02-01

    Pyrolysis Fourier transform infrared spectroscopy (pyrolysis FTIR) is a potential sample selection method for Mars Sample Return missions. FTIR spectroscopy can be performed on solid and liquid samples but also on gases following preliminary thermal extraction, pyrolysis or gasification steps. The detection of hydrocarbon and non-hydrocarbon gases can reveal information on sample mineralogy and past habitability of the environment in which the sample was created. The absorption of IR radiation at specific wavenumbers by organic functional groups can indicate the presence and type of any organic matter present. Here we assess the utility of pyrolysis-FTIR to release water, carbon dioxide, sulfur dioxide and organic matter from Mars relevant materials to enable a rapid habitability assessment of target rocks for sample return. For our assessment a range of minerals were analyzed by attenuated total reflectance FTIR. Subsequently, the mineral samples were subjected to single step pyrolysis and multi step pyrolysis and the products characterised by gas phase FTIR. Data from both single step and multi step pyrolysis-FTIR provide the ability to identify minerals that reflect habitable environments through their water and carbon dioxide responses. Multi step pyrolysis-FTIR can be used to gain more detailed information on the sources of the liberated water and carbon dioxide owing to the characteristic decomposition temperatures of different mineral phases. Habitation can be suggested when pyrolysis-FTIR indicates the presence of organic matter within the sample. Pyrolysis-FTIR, therefore, represents an effective method to assess whether Mars Sample Return target rocks represent habitable conditions and potential records of habitation and can play an important role in sample triage operations.
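
    A toy sketch of the interpretive step described above: matching absorbance at characteristic wavenumbers to evolved gases. The band centres are approximate textbook values and the detection threshold is arbitrary.

        # Approximate gas-phase IR band centres (cm^-1); values are illustrative only.
        BANDS = {"CO2": 2349, "H2O": 1595, "SO2": 1361, "CH4/organics": 3017}

        def detect_gases(spectrum, threshold=0.05, window=15):
            """spectrum: {wavenumber: absorbance}; return gases with a band above threshold."""
            hits = []
            for gas, centre in BANDS.items():
                peak = max((a for w, a in spectrum.items() if abs(w - centre) <= window),
                           default=0.0)
                if peak >= threshold:
                    hits.append(gas)
            return hits

        pyrolysate = {2349: 0.42, 1595: 0.11, 1361: 0.01, 3017: 0.07}   # hypothetical
        print(detect_gases(pyrolysate))     # CO2 + H2O responses hint at habitability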

  14. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.

  15. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    Science.gov (United States)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables as well as RMS error were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  16. Rapid DNA analysis for automated processing and interpretation of low DNA content samples

    OpenAIRE

    Turingan, Rosemary S.; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F.

    2016-01-01

    Background Short tandem repeat (STR) casework samples with low DNA content include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle, or brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or f...

  17. An instrument for automated purification of nucleic acids from contaminated forensic samples

    OpenAIRE

    Broemeling, David J; Pel, Joel; Gunn, Dylan C; Mai, Laura; Thompson, Jason D.; Poon, Hiron; Marziali, Andre

    2008-01-01

    Forensic crime scene sample analysis, by its nature, often deals with samples in which there are low amounts of nucleic acids, on substrates that often lead to inhibition of subsequent enzymatic reactions such as PCR amplification for STR profiling. Common substrates include denim from blue jeans, which yields indigo dye as a PCR inhibitor, and soil, which yields humic substances as inhibitors. These inhibitors frequently co-extract with nucleic acids in standard column or bead-based preps, l...

  18. Automated sample preparation station for studying self-diffusion in porous solids with NMR spectroscopy

    International Nuclear Information System (INIS)

    In studies of gas diffusion in porous solids with nuclear magnetic resonance (NMR) spectroscopy the sample preparation procedure becomes very important. An apparatus is presented here that pretreats the sample ex situ and accurately sets the desired pressure and temperature within the NMR tube prior to its introduction in the spectrometer. The gas manifold that supplies the NMR tube is also connected to a microbalance containing another portion of the same sample, which is kept at the same temperature as the sample in the NMR tube. This arrangement permits the simultaneous measurement of the adsorption loading on the sample, which is required for the interpretation of the NMR diffusion experiments. Furthermore, to ensure a good seal of the NMR tube, a hybrid valve design composed of titanium, a Teflon® seat, and Kalrez® O-rings is utilized. A computer controlled algorithm ensures the accuracy and reproducibility of all the procedures, enabling the NMR diffusion experiments to be performed at well controlled conditions of pressure, temperature, and amount of gas adsorbed on the porous sample.

  19. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    Science.gov (United States)

    Schneider-Zapp, K.; Salter, M. E.; Upstill-Goddard, R. C.

    2014-07-01

    In order to advance understanding of the role of seawater surfactants in the air-sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with Milli-Q water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air-sea gas exchange process.
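
    The Schmidt number normalisation mentioned above conventionally assumes kw ∝ Sc^(-1/2), so k600 = kw·(Sc/600)^(1/2); a sketch with illustrative kw values and Schmidt numbers follows.

        def k600(kw, sc):
            """Normalise a gas transfer velocity to Schmidt number 600 (Sc^-1/2 scaling)."""
            return kw * (sc / 600.0) ** 0.5

        # Hypothetical kw (cm/h) and Schmidt numbers for one temperature and salinity.
        for gas, kw, sc in [("SF6", 12.1, 1030.0), ("CH4", 11.2, 880.0), ("N2O", 11.6, 950.0)]:
            print(f"{gas}: k600 = {k600(kw, sc):.1f} cm/h")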

  20. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    Directory of Open Access Journals (Sweden)

    K. Schneider-Zapp

    2014-02-01

    In order to advance understanding of the role of seawater surfactants in the air–sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with MilliQ water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air–sea gas exchange process.

  1. Development of a full automation solid phase microextraction method for investigating the partition coefficient of organic pollutant in complex sample.

    Science.gov (United States)

    Jiang, Ruifen; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-08-01

    A fully automated solid-phase microextraction (SPME) depletion method was developed to study the partition coefficients of organic compounds between complex matrices and water. The SPME depletion process was conducted by pre-loading the fiber with a specific amount of organic compounds from a proposed standard gas generation vial, and then desorbing the fiber into the targeted samples. Based on the proposed method, the partition coefficients (Kmatrix) of 4 polyaromatic hydrocarbons (PAHs) between humic acid (HA)/hydroxypropyl-β-cyclodextrin (β-HPCD) and aqueous sample were determined. The results showed that the logKmatrix of the 4 PAHs with HA and β-HPCD ranged from 3.19 to 4.08, and 2.45 to 3.15, respectively. In addition, the logKmatrix values decreased by about 0.12-0.27 log units for the different PAHs for every 10°C increase in temperature. The effect of temperature on the partition coefficient followed the van't Hoff relation, and the partition coefficient at any temperature can be predicted from the corresponding plot. Furthermore, the proposed method was applied to real biological fluid analysis. The partition coefficients of 6 PAHs between the complex matrices in fetal bovine serum and water were determined, and compared to those obtained from the SPME extraction method. The results demonstrated that the proposed method can be applied to determine the sorption coefficients of hydrophobic compounds between complex matrices and water in a variety of samples. PMID:26118804
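
    The temperature extrapolation described above rests on the van't Hoff relation, ln K = −ΔH/(RT) + c, which is linear in 1/T; the sketch below fits invented logKmatrix values (chosen to mimic the reported ~0.2 log-unit drop per 10°C) and predicts K at a new temperature.

        import numpy as np

        R = 8.314   # J mol^-1 K^-1

        # Hypothetical logK values for one PAH-HA pair at three temperatures (K).
        T = np.array([288.15, 298.15, 308.15])
        logK = np.array([4.30, 4.08, 3.86])

        # van't Hoff: ln K = -dH/(R*T) + c, i.e. linear in 1/T.
        slope, intercept = np.polyfit(1.0 / T, logK * np.log(10.0), 1)
        dH = -slope * R                          # apparent sorption enthalpy

        def predict_logK(temp_k):
            return (slope / temp_k + intercept) / np.log(10.0)

        print(f"dH ~ {dH / 1000.0:.1f} kJ/mol; logK(293 K) ~ {predict_logK(293.15):.2f}")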

  2. Automated Broad-Range Molecular Detection of Bacteria in Clinical Samples.

    Science.gov (United States)

    Budding, Andries E; Hoogewerf, Martine; Vandenbroucke-Grauls, Christina M J E; Savelkoul, Paul H M

    2016-04-01

    Molecular detection methods, such as quantitative PCR (qPCR), have found their way into clinical microbiology laboratories for the detection of an array of pathogens. Most routinely used methods, however, are directed at specific species. Thus, anything that is not explicitly searched for will be missed. This greatly limits the flexibility and universal application of these techniques. We investigated the application of a rapid universal bacterial molecular identification method, IS-pro, to routine patient samples received in a clinical microbiology laboratory. IS-pro is a eubacterial technique based on the detection and categorization of 16S-23S rRNA gene interspace regions with lengths that are specific for each microbial species. As this is an open technique, clinicians do not need to decide in advance what to look for. We compared routine culture to IS-pro using 66 samples sent in for routine bacterial diagnostic testing. The samples were obtained from patients with infections in normally sterile sites (without a resident microbiota). The results were identical in 20 (30%) samples, IS-pro detected more bacterial species than culture in 31 (47%) samples, and five of the 10 culture-negative samples were positive with IS-pro. The case histories of the five patients from whom these culture-negative/IS-pro-positive samples were obtained suggest that the IS-pro findings are highly clinically relevant. Our findings indicate that an open molecular approach, such as IS-pro, may have a high added value for clinical practice. PMID:26763956
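
    IS-pro keys species off the lengths of 16S-23S rRNA gene interspace fragments; as a toy illustration of that lookup step only (the reference lengths below are entirely invented):

        # Invented interspace fragment lengths (bp) -> species, for illustration only.
        IS_LIBRARY = {498: "Escherichia coli",
                      547: "Staphylococcus aureus",
                      612: "Bacteroides fragilis"}

        def identify(fragment_lengths, tolerance=2):
            """Match measured fragment lengths to the library within a length tolerance."""
            calls = []
            for length in fragment_lengths:
                match = next((sp for ref, sp in IS_LIBRARY.items()
                              if abs(ref - length) <= tolerance), None)
                calls.append((length, match or "unknown"))
            return calls

        print(identify([497, 613, 700]))   # -> E. coli, B. fragilis, unknown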

  3. The T-lock: automated compensation of radio-frequency induced sample heating

    International Nuclear Information System (INIS)

    Modern high-field NMR spectrometers can stabilize the nominal sample temperature at a precision of less than 0.1 K. However, the actual sample temperature may differ from the nominal value by several degrees because the sample heating caused by high-power radio frequency pulses is not readily detected by the temperature sensors. Without correction, transfer of chemical shifts between different experiments causes problems in the data analysis. In principle, the temperature differences can be corrected by manual procedures but this is cumbersome and not fully reliable. Here, we introduce the concept of a 'T-lock', which automatically maintains the sample at the same reference temperature over the course of different NMR experiments. The T-lock works by continuously measuring the resonance frequency of a suitable spin and simultaneously adjusting the temperature control, thus locking the sample temperature at the reference value. For three different nuclei, 13C, 17O and 31P in the compounds alanine, water, and phosphate, respectively, the T-lock accuracy was found to be <0.1 K. The use of dummy scan periods with variable lengths allows a reliable establishment of the thermal equilibrium before the acquisition of an experiment starts
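
    A minimal control-loop sketch of the T-lock idea: the drift of a temperature-sensitive reference resonance is converted into a correction of the regulator setpoint. The frequency coefficient, gain, and toy "plant" below are invented, not spectrometer values.

        def t_lock_step(freq_hz, ref_freq_hz, setpoint_c, coeff_hz_per_k=-1.0, gain=0.8):
            """One iteration: infer the temperature error from the resonance shift
            and return a corrected temperature-regulator setpoint."""
            temp_error = (freq_hz - ref_freq_hz) / coeff_hz_per_k
            return setpoint_c - gain * temp_error

        # Toy plant: RF pulses keep the sample 0.5 K above whatever the regulator holds.
        ref_freq, ref_temp, rf_heating = 4000.0, 25.0, 0.5
        setpoint = 25.0
        for _ in range(10):
            sample_temp = setpoint + rf_heating
            freq = ref_freq - 1.0 * (sample_temp - ref_temp)   # coeff = -1 Hz/K
            setpoint = t_lock_step(freq, ref_freq, setpoint)
        print(f"setpoint {setpoint:.2f} degC keeps the sample near {setpoint + rf_heating:.2f} degC")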

  4. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Directory of Open Access Journals (Sweden)

    Kamfai Chan

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.

  5. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424

  6. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424
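
    Purely as a sketch of the motion-scripting idea (not the authors' coordinates or firmware), the following emits standard G-code that shuttles a tube holder between two water-bath positions for a two-temperature PCR cycle; all positions and dwell times are hypothetical.

        def pcr_gcode(cycles=35, denat_xy=(50, 40), anneal_xy=(150, 40),
                      denat_s=10, anneal_s=30, lift_z=60, dip_z=20):
            """Emit G-code moving a tube comb between a ~95 degC and a ~60 degC bath."""
            lines = ["G28 ; home all axes", "G90 ; absolute positioning"]
            for _ in range(cycles):
                for (x, y), dwell in ((denat_xy, denat_s), (anneal_xy, anneal_s)):
                    lines.append(f"G1 Z{lift_z} F3000")   # lift tubes clear of the baths
                    lines.append(f"G1 X{x} Y{y} F6000")   # travel to the next bath
                    lines.append(f"G1 Z{dip_z} F3000")    # dip tubes into the water
                    lines.append(f"G4 S{dwell} ; hold")   # dwell for this PCR step
            lines.append(f"G1 Z{lift_z}")
            return "\n".join(lines)

        print(pcr_gcode(cycles=1))   # inspect one cycle before sending to the printer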

  7. Automation of the radiation measuring facilities for samples in health physics - MA 9

    International Nuclear Information System (INIS)

    Routine radiation measurements of samples are performed by the HMI health physics department by means of test stations for individual samples and multiple samples (using a sample changer). The basic device of these test stations is a SCALER/TIMER system (BF 22/25, BERTHOLD Corp.). This measuring facility has been extended by a CAMAC instrumentation which incorporates an autonomous CAMAC processor (CAPRO-1, INCAA B.V.) for monitoring and automatic control of the system. The programming language is BASIC. A DECwriter (LA 34) is used for user interaction and for printing the measurement results. This report describes the features of this system and presents some examples of the dialogue with the system and the printout of data. (orig.)

  8. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Science.gov (United States)

    El-Alaily, T. M.; El-Nimr, M. K.; Saafan, S. A.; Kamel, M. M.; Meaz, T. M.; Assar, S. T.

    2015-07-01

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of several ferrite samples measured by two calibrated commercial magnetometers (Lake Shore model 7410 and LDJ Electronics Inc., Troy, MI). Our new lab-built VSM design proved successful and reliable.
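
    A sketch of the calibration step described above: regress the lab-built instrument's raw signal against moments reported by a reference magnetometer for the same samples. All numbers are invented.

        import numpy as np

        # Raw lock-in signal (mV) from the lab-built VSM vs. reference moment (emu)
        # measured for the same ferrite samples on a calibrated instrument.
        raw_mv = np.array([2.1, 4.0, 8.3, 16.5, 33.2])
        ref_emu = np.array([0.050, 0.095, 0.198, 0.402, 0.810])

        gain, offset = np.polyfit(raw_mv, ref_emu, 1)     # linear calibration
        residual = ref_emu - (gain * raw_mv + offset)
        print(f"gain {gain:.4f} emu/mV, offset {offset:+.4f} emu, "
              f"max residual {np.abs(residual).max():.3f} emu")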

  9. Assessing the accuracy of an inter-institutional automated patient-specific health problem list

    Directory of Open Access Journals (Sweden)

    Taylor Laurel

    2010-02-01

    Background Health problem lists are a key component of electronic health records and are instrumental in the development of decision-support systems that encourage best practices and optimal patient safety. Most health problem lists require initial clinical information to be entered manually and few integrate information across care providers and institutions. This study assesses the accuracy of a novel approach to create an inter-institutional automated health problem list in a computerized medical record (MOXXI) that integrates three sources of information for an individual patient: diagnostic codes from medical services claims from all treating physicians, therapeutic indications from electronic prescriptions, and single-indication drugs. Methods Data for this study were obtained from 121 general practitioners and all medical services provided for 22,248 of their patients. At the opening of a patient's file, all health problems detected through medical service utilization or single-indication drug use were flagged to the physician in the MOXXI system. Each newly arising health problem was presented as 'potential' and physicians were prompted to specify whether the health problem was valid (Y) or not (N), or whether they preferred to reassess its validity at a later time. Results A total of 263,527 health problems, representing 891 unique problems, were identified for the group of 22,248 patients. Medical services claims contributed the majority of problems identified (77%), followed by therapeutic indications from electronic prescriptions (14%), and single-indication drugs (9%). Physicians actively chose to assess 41.7% (n = 106,950) of health problems. Overall, 73% of the problems assessed were considered valid; 42% originated from medical service diagnostic codes, 11% from single indication drugs, and 47% from prescription indications. Twelve percent of problems identified through other treating physicians were considered valid compared to 28
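
    A schematic sketch of the three-source merge described above; the labels and drug map are invented, and the real MOXXI logic is certainly richer.

        def potential_problems(claim_diagnoses, rx_indications, single_indication_drugs,
                               drug_to_problem):
            """Merge three evidence sources into one de-duplicated 'potential' problem list."""
            problems = {}
            for label in claim_diagnoses:
                problems.setdefault(label, set()).add("claim diagnosis")
            for label in rx_indications:
                problems.setdefault(label, set()).add("e-prescription indication")
            for drug in single_indication_drugs:
                label = drug_to_problem.get(drug)
                if label:
                    problems.setdefault(label, set()).add("single-indication drug")
            return [{"problem": p, "sources": sorted(s), "status": "potential"}
                    for p, s in sorted(problems.items())]

        # Invented example inputs.
        print(potential_problems(["diabetes mellitus"],
                                 ["diabetes mellitus", "hypertension"],
                                 ["levothyroxine"],
                                 {"levothyroxine": "hypothyroidism"}))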

  10. A self-contained polymeric cartridge for automated biological sample preparation.

    Science.gov (United States)

    Xu, Guolin; Lee, Daniel Yoke San; Xie, Hong; Chiew, Deon; Hsieh, Tseng-Ming; Ali, Emril Mohamed; Lun Looi, Xing; Li, Mo-Huang; Ying, Jackie Y

    2011-09-01

    Sample preparation is one of the most crucial processes for nucleic acids based disease diagnosis. Several steps are required for nucleic acids extraction, impurity washes, and DNA/RNA elution. Careful sample preparation is vital to obtaining a reliable diagnosis, especially with low copies of pathogens and cells. This paper describes a low-cost, disposable lab cartridge for automatic sample preparation, which is capable of handling flexible sample volumes of 10 μl to 1 ml. This plastic cartridge contains all the necessary reagents for pathogen and cell lysis, DNA/RNA extraction, impurity washes, DNA/RNA elution and waste processing in a completely sealed cartridge. The entire sample preparation process is automatically conducted within the cartridge on a desktop unit using a pneumatic fluid manipulation approach. Reagent transportation is achieved with a combination of push and pull forces (with compressed air and vacuum, respectively), which are connected to the pneumatic inlets at the bottom of the cartridge. These pneumatic forces are regulated by a pinch valve manifold and two pneumatic syringe pumps within the desktop unit. The performance of this pneumatic reagent delivery method was examined. We have demonstrated the capability of on-cartridge RNA extraction and cancer-specific gene amplification from 10 copies of MCF-7 breast cancer cells. The on-cartridge DNA recovery efficiency was 54-63%, which was comparable to or better than the conventional manual approach using a silica spin column. The lab cartridge would be suitable for integration with lab-chip real-time polymerase chain reaction devices in providing a portable system for decentralized disease diagnosis. PMID:22662036

  11. Automated flow-through amperometric immunosensor for highly sensitive and on-line detection of okadaic acid in mussel sample.

    Science.gov (United States)

    Dominguez, Rocio B; Hayat, Akhtar; Sassolas, Audrey; Alonso, Gustavo A; Munoz, Roberto; Marty, Jean-Louis

    2012-09-15

    An electrochemical immunosensor for okadaic acid (OA) detection has been developed and used in an indirect competitive immunoassay format under automated flow conditions. The biosensor was fabricated by injecting OA-modified magnetic beads onto a screen-printed carbon electrode (SPCE) in the flow system. The OA present in the sample competed with the immobilized OA to bind with an anti-okadaic acid monoclonal antibody (anti-OA-MAb). A secondary alkaline phosphatase labeled antibody was used to perform electrochemical detection. The current response obtained from the labeled alkaline phosphatase to 1-naphthyl phosphate decreased proportionally to the concentration of free OA in the sample. The calculated limit of detection (LOD) was 0.15 μg/L with a linear range of 0.19-25 μg/L. The good recovery percentages validated the immunosensor application for real mussel samples. The developed system automatically controlled the incubation, washing and current measurement steps, showing its potential use for OA determination in field analysis. PMID:22967546

  12. Automated on-line preconcentration of palladium on different sorbents and its determination in environmental samples.

    Science.gov (United States)

    Sánchez Rojas, Fuensanta; Bosch Ojeda, Catalina; Cano Pavón, José Manuel

    2007-01-01

    The determination of noble metals in environmental samples is of increasing importance. Palladium is often employed as a catalyst in the chemical industry and is also used with platinum and rhodium in motor car catalytic converters, which might cause environmental pollution problems. Two different sorbents for palladium preconcentration in different samples were investigated: silica gel functionalized with 1,5-bis(di-2-pyridyl)methylene thiocarbohydrazide (DPTH-gel) and 1,5-bis(2-pyridyl)-3-sulphophenyl methylene thiocarbonohydrazide (PSTH) immobilised on an anion-exchange resin (Dowex 1x8-200). The sorbents were tested in a micro-column, placed in the auto-sampler arm, at a flow rate of 2.8 mL min(-1). Elution was performed with 4 M HCl and 4 M HNO3, respectively. Satisfactory results were obtained for the two sorbents. PMID:17822233

  13. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    Science.gov (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree and of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus widely used by scientists studying rodent mammary gland morphology. PMID:26910307
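
    A hedged sketch of how the branch end-point and branching densities might be extracted from a binarised ductal-tree image: skeletonise, then classify skeleton pixels by their 8-neighbour count. This is a generic recipe, not the authors' software.

        import numpy as np
        from scipy.ndimage import convolve
        from skimage.morphology import skeletonize

        def tree_densities(mask, pixel_area_mm2):
            """Return (end-point density, branch-point density) per mm^2 of epithelium."""
            skel = skeletonize(mask).astype(np.uint8)
            kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
            neighbours = convolve(skel, kernel, mode="constant") * skel
            endpoints = int(np.sum(neighbours == 1))      # skeleton tips
            branchpoints = int(np.sum(neighbours >= 3))   # bifurcations/crossings
            area_mm2 = mask.sum() * pixel_area_mm2        # epithelial area
            return endpoints / area_mm2, branchpoints / area_mm2

        mask = np.zeros((32, 32), dtype=bool)             # tiny synthetic duct: a cross
        mask[15:18, 4:28] = True
        mask[4:28, 15:18] = True
        print(tree_densities(mask, pixel_area_mm2=0.01))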

  14. Control Performance Management in Industrial Automation Assessment, Diagnosis and Improvement of Control Loop Performance

    CERN Document Server

    Jelali, Mohieddine

    2013-01-01

    Control Performance Management in Industrial Automation provides a coherent and self-contained treatment of a group of methods and applications of burgeoning importance for the detection and solution of problems with control loops that are vital in maintaining product quality, operational safety, and efficiency of material and energy consumption in the process industries. The monograph deals with all aspects of control performance management (CPM), from controller assessment (minimum-variance-control-based and advanced methods), to detection and diagnosis of control loop problems (process non-linearities, oscillations, actuator faults), to the improvement of control performance (maintenance, re-design of loop components, automatic controller re-tuning). It provides a contribution towards the development and application of completely self-contained and automatic methodologies in the field. Moreover, within this work, many CPM tools have been developed that go far beyond available CPM packages. Control Perform...

  15. Clinical evaluation of 64-slice CT assessment of global left ventricular function using automated cardiac phase selection

    International Nuclear Information System (INIS)

    Left ventricular (LV) function provides prognostic information regarding the morbidity and mortality of patients. An automated cardiac phase selection algorithm has the potential to support the assessment of LV function with computed tomography (CT). This algorithm is clinically evaluated for 64-slice cardiac CT. Examinations of twenty consecutive patients were selected. Electrocardiogram-gated contrast-enhanced CT was performed. Reconstructions were performed using an automated and a manual method, followed by determination of global LV function. Significance was tested using two-sided Student's t-tests. Reductions in post-processing time and storage requirements were estimated. A slightly smaller mean end-systolic volume was found with the automated method (52±18 ml vs 54±17 ml, p=0.02, r=0.99). The mean LV ejection fraction was slightly larger with the automated method (65±8% vs 64±8%, p=0.004, r=0.99). The estimated reduction in post-processing time was at most 5 min per patient, with a potential 80% reduction in data storage. Results of the automated phase selection algorithm are similar to those of the manual method. The automated tool reduces post-processing, reconstruction and transfer times. (author)
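
    The reported comparison (paired means, p-values, correlation) can be reproduced in outline with paired two-sided t-tests; the arrays below are placeholders, not the study data.

        import numpy as np
        from scipy.stats import ttest_rel, pearsonr

        esv_auto = np.array([50.0, 48.0, 61.0, 45.0, 55.0])    # end-systolic volume, ml
        esv_manual = np.array([52.0, 50.0, 62.0, 47.0, 58.0])  # same patients, manual method

        t, p = ttest_rel(esv_auto, esv_manual)   # two-sided by default
        r, _ = pearsonr(esv_auto, esv_manual)
        print(f"t = {t:.2f}, p = {p:.3f}, r = {r:.2f}")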

  16. New automated image analysis method for the assessment of Ki-67 labeling index in meningiomas.

    Directory of Open Access Journals (Sweden)

    Wielisław Papierz

    2010-05-01

    Many studies have emphasised the importance of the Ki-67 labeling index (LI) as a proliferation marker in meningiomas. Several authors have confirmed that Ki-67 LI has prognostic significance and correlates with the likelihood of tumour recurrence. These observations have been widely accepted by pathologists, but up to now no standard method for Ki-67 LI assessment has been developed and introduced into diagnostic pathology. In this paper we present a new computerised system for automated Ki-67 LI estimation in meningiomas as an aid for histological grading of meningiomas and a potential standard method of Ki-67 LI assessment. We also discuss the concordance of Ki-67 LI results obtained by the presented computerized system and an expert pathologist, as well as possible pitfalls and mistakes in automated counting of immunopositive or immunonegative cells. For the quantitative evaluation of digital images of meningiomas the designed software uses an algorithm based on a mathematical description of cell morphology. This solution acts together with a Support Vector Machine (SVM) used in classification mode for the recognition of the immunoreactivity of cells. The applied sequential thresholding simulated well the human process of cell recognition. The same digital images of randomly selected tumour areas were analysed in parallel by the computer and blindly by two expert pathologists. Ki-67 labeling indices were estimated and the results compared. The mean relative discrepancy between the Ki-67 LI levels from our system and from the human expert did not exceed 14% in all investigated cases. These preliminary results suggest that the designed software could be a useful tool supporting diagnostic digital pathology. However, more extensive studies are needed to confirm this suggestion.
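
    Once each detected cell has been classified as immunopositive or immunonegative (the SVM's job in the system above), the labeling index and the discrepancy against an expert are simple arithmetic. A sketch with invented counts:

        def ki67_labeling_index(n_positive: int, n_negative: int) -> float:
            """Ki-67 LI: percentage of immunopositive cells among all counted cells."""
            return 100.0 * n_positive / (n_positive + n_negative)

        li_system = ki67_labeling_index(n_positive=42, n_negative=158)   # 21.0 %
        li_expert = 18.5                                                 # assumed expert estimate
        relative_discrepancy = 100.0 * abs(li_system - li_expert) / li_expert
        print(f"LI = {li_system:.1f} %, relative discrepancy = {relative_discrepancy:.1f} %")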

  17. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    Science.gov (United States)

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes for organic-rich systems such as peatlands, because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of a S::can Spectro::lyser™ to measure DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument we were able to accurately measure DOC up to 66 mg L(-1), which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used in the model at the same time. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands. PMID:26580726
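
    The multi-wavelength calibration described above amounts to ordinary least-squares regression of lab-measured DOC on absorbance at several wavelengths. A sketch with placeholder wavelengths and data (the abstract does not list the exact wavelengths used):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # rows = samples; columns = absorbance at e.g. 254, 350 and 400 nm (assumed)
        absorbance = np.array([[0.82, 0.41, 0.20],
                               [1.10, 0.55, 0.27],
                               [0.45, 0.22, 0.11],
                               [1.60, 0.80, 0.40],
                               [0.95, 0.48, 0.24]])
        doc = np.array([22.0, 30.5, 12.1, 44.0, 26.3])   # lab-measured DOC, mg/L

        model = LinearRegression().fit(absorbance, doc)
        print("R^2 =", model.score(absorbance, doc))
        print("predicted DOC:", model.predict(absorbance))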

  18. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  19. Are Flow Injection-based Approaches Suitable for Automated Handling of Solid Samples?

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Cerdà, Victor

    electrolytic or aqueous leaching, on-line dialysis/microdialysis, in-line filtration, and pervaporation-based procedures have been successfully implemented in continuous flow/flow injection systems. In this communication, the new generation of flow analysis, including sequential injection, multicommutated flow, multisyringe flow injection, and micro-Lab-on-valve, are presented as appealing approaches for on-line handling of solid samples. Special emphasis is given to the capability of flow systems to accommodate sequential extraction protocols for partitioning of trace elements and nutrients in environmental solids (e.g., soils, sediments, sludges), and thus ascertaining the potential mobility, bioavailability and eventual impact of anthropogenic elements on biota [2]. In this context, the principles of sequential injection-microcolumn extraction (SI-MCE) for dynamic fractionation are explained in detail along with the

  20. Semi-automated procedure for the determination of 89,90Sr in environmental samples by Cherenkov counting

    International Nuclear Information System (INIS)

    Development of new chromatographic resins in the last two decades (Sr resin, AnaLig-01 and SuperLig 620) has significantly simplified the separation of strontium from various types of samples. These resins, whose principles are based on molecular recognition, are highly selective for strontium binding. In combination with appropriate detection methods they enable automatic determination of radioactive strontium. Sequential injection analysis and equilibration-based sensor column analysis were developed for the determination of long-lived 90Sr (28.8 y) in liquid radioactive waste and water samples. However, 89Sr, which has a short half-life (50.5 d), can also be present in samples, especially those exposed to fresh fallout from a nuclear reactor. Classical analysis of 89Sr requires isolation of 90Y, usually after attainment of secular equilibrium of 90Sr-90Y, and the whole procedure takes at least 16 days. However, by using the Cherenkov counting technique, the determination time may be significantly reduced. Unlike 90Sr, which emits low-energy electrons, its daughter 90Y, as well as 89Sr, generates Cherenkov photons in aqueous media. Consequently, by successive counting within 64 hours, 89Sr and 90Sr (via 90Y) can be determined. Therefore, the main aim of this research is the development of a semi-automated procedure for the determination of 89,90Sr. It includes solid phase extraction (SPE) of strontium from liquid samples and Cherenkov counting of its isotopes. The procedure is based on sample-column equilibration and off-line detection of bound 89,90Sr on the column. Sample is pumped through the column at constant flow rate until the breakthrough or saturation point is reached. The 89,90Sr is determined by counting on the column in a PE vial. It will be shown how strontium can be selectively bound on the Sr resin, AnaLig-01 and SuperLig 620 resins and separated from interfering radionuclides. Also, the influence of column geometry, amount of resin and media in the PE vial around the column on quantity
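
    The two-count arithmetic implied by "successive counting within 64 hours" can be written out explicitly. A hedged sketch (not the authors' exact formulation): with counting efficiencies for 89Sr and 90Y, decay constants lambda_89 = ln 2 / 50.5 d and lambda_Y = ln 2 / 64.1 h (the 90Y half-life), two Cherenkov counts after separation give a linear system in the activities A_89 and A_90:

        % 90Sr itself contributes no Cherenkov signal (its beta energy is below
        % threshold), so the count rate is 89Sr decay plus 90Y ingrowth:
        \begin{align}
          R(t_i) = \varepsilon_{89}\, A_{89}\, e^{-\lambda_{89} t_i}
                 + \varepsilon_{Y}\, A_{90}\,\bigl(1 - e^{-\lambda_{Y} t_i}\bigr),
          \qquad i = 1, 2,
        \end{align}
        % which is solved for A_89 and A_90 by elimination.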

  1. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune;

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the… The automated protocols allowed for extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited forensic genetic short tandem repeat typing can be implemented on a simple automated liquid…

  2. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    Science.gov (United States)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points where no individual trees could be identified. Using voxels, the data volume could be reduced by about 97% in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but is still below the acquisition rate, estimated at 50 km/h.
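
    The ~97% reduction quoted above comes from the voxel downsampling step. A minimal numpy sketch (the 0.1 m voxel size and the fake point cloud are assumptions for illustration):

        import numpy as np

        def voxel_downsample(points: np.ndarray, voxel_size: float = 0.1) -> np.ndarray:
            """Keep one representative point per occupied voxel."""
            voxel_idx = np.floor(points / voxel_size).astype(np.int64)
            _, keep = np.unique(voxel_idx, axis=0, return_index=True)
            return points[np.sort(keep)]

        points = np.random.rand(500_000, 3) * [100.0, 10.0, 15.0]   # fake 100 m road strip
        reduced = voxel_downsample(points, voxel_size=0.1)
        print(f"kept {len(reduced) / len(points):.1%} of the points")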

  3. An Automated Version of the BAT Syntactic Comprehension Task for Assessing Auditory L2 Proficiency in Healthy Adults

    Science.gov (United States)

    Achim, Andre; Marquis, Alexandra

    2011-01-01

    Studies of bilingualism sometimes require healthy subjects to be assessed for proficiency at auditory sentence processing in their second language (L2). The Syntactic Comprehension task of the Bilingual Aphasia Test could satisfy this need. For ease and uniformity of application, we automated its English (Paradis, M., Libben, G., and Hummel, K.…

  4. Aerothermodynamics Feasibility Assessment of a Mars Atmospheric Sample Return Mission

    Science.gov (United States)

    Ferracina, L.; Larranaga, J.; Falkner, P.

    2011-02-01

    ESA's optional Mars Robotic Exploration Preparation (MREP) programme is based on a long-term collaboration with NASA, taking Mars exploration as the global objective and a Mars Sample Return (MSR) mission as the long-term goal to be achieved by the mid-2020s. Considering today's uncertainties, different missions are envisaged and prepared by ESA as possible alternatives to MSR in the timeframe of 2020-2026, in case the required technology readiness is not reached by 2015 or landed mass capabilities are exceeded for any of the MSR mission elements. One of the missions considered by ESA within this framework is the Mars Atmospheric Sample Return mission. This mission has recently been assessed by ESA using its Concurrent Design Facility (CDF), aiming to enter with a probe at low Mars altitudes (≈50 km), collect a sample of airborne atmosphere (gas and dust) and return the sample to Earth. This paper aims to report the preliminary aerothermodynamic assessment of the design of the Martian entry probe conducted within the CDF study. Special attention has been paid to the selection of aerodynamically efficient vehicle concepts compared to blunt bodies, and to the effect of the high-temperature shock on the cavity placed at the stagnation point and used in the atmospheric sampling system.

  5. Adjustable virtual pore-size filter for automated sample preparation using acoustic radiation force

    Energy Technology Data Exchange (ETDEWEB)

    Jung, B; Fisher, K; Ness, K; Rose, K; Mariella, R

    2008-05-22

    We present a rapid and robust size-based separation method for high-throughput microfluidic devices using acoustic radiation force. We developed a finite element modeling tool to predict the two-dimensional acoustic radiation force field perpendicular to the flow direction in microfluidic devices. Here we compare the results from this model with experimental parametric studies including variations of the PZT driving frequencies and voltages as well as various particle sizes, compressibilities and densities. These experimental parametric studies also provide insight into the development of an adjustable 'virtual' pore-size filter as well as optimal operating conditions for various microparticle sizes. We demonstrated the separation of Saccharomyces cerevisiae and MS2 bacteriophage using acoustic focusing. The acoustic radiation force did not affect the MS2 viruses, and their concentration profile remained unchanged. With optimized design of our microfluidic flow system we were able to achieve yields of >90% for the MS2, with >80% of the S. cerevisiae being removed in this continuous-flow sample preparation device.
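
    For orientation, a standard small-particle (Rayleigh-regime) expression for the acoustic radiation force in a 1D standing wave, in the spirit of, but much simpler than, the paper's 2D finite element model:

        % Particle of radius a at position z, acoustic energy density E_ac, wavenumber k;
        % rho~ and kappa~ are the particle/fluid density and compressibility ratios.
        \begin{align}
          F_{\mathrm{rad}} &= 4\pi\, \Phi\, k\, a^{3} E_{\mathrm{ac}} \sin(2kz), \\
          \Phi &= \tfrac{1}{3}\,(1 - \tilde{\kappa}) + \frac{\tilde{\rho} - 1}{2\tilde{\rho} + 1}.
        \end{align}
        % The a^3 scaling is why micron-scale S. cerevisiae cells focus while
        % the ~27 nm MS2 virions are essentially unaffected.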

  6. Equilibrium sampling for a thermodynamic assessment of contaminated sediments

    DEFF Research Database (Denmark)

    of the biota relative to the sediment. Furthermore, concentrations in lipid at thermodynamic equilibrium with sediment (Clip⇌Sed) can be calculated via lipid/silicone partition ratios CSil × KLip:Sil, which has been done in studies with limnic, river and marine sediments. The data can then be … (Cfree) govern diffusive uptake and partitioning. Equilibrium sampling of sediment was introduced 15 years ago to measure Cfree, and it has since developed into a straightforward, precise and sensitive approach for determining Cfree and other exposure parameters that allow for thermodynamic assessment of … toxicity. This overview lecture will focus on the latest developments in equilibrium sampling concepts and methods. Further, we will explain how these approaches can provide a new basis for a thermodynamic assessment of polluted sediments.

  7. Computerized experience-sampling approach for realtime assessment of stress

    Directory of Open Access Journals (Sweden)

    S. Serino

    2013-03-01

    The incredible advancement in the ICT sector has challenged technology developers, designers, and psychologists to reflect on how to develop technologies to promote mental health. The computerized experience-sampling method appears to be a promising assessment approach for investigating the real-time fluctuation of experience in daily life in order to detect stressful events. For this purpose, we developed PsychLog (http://psychlog.com), a free open-source mobile experience-sampling platform that allows psychophysiological data to be collected, aggregated, visualized and collated into reports. Results showed a good classification of relaxing and stressful events, with the two groups defined by psychological analysis and the discrimination verified with physiological measures. Within the paradigm of Positive Technology, our innovative approach offers researchers and clinicians new, effective opportunities for the assessment and treatment of psychological stress in daily situations.

  8. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  9. Passive sampling methods for contaminated sediments: Risk assessment and management

    OpenAIRE

    Greenberg, Marc S; Chapman, Peter M.; Allan, Ian J.; Anderson, Kim A.; Apitz, Sabine E.; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P.; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than the total bulk sediment concentration (Ct...

  10. Automated on-line liquid-liquid extraction system for temporal mass spectrometric analysis of dynamic samples.

    Science.gov (United States)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L

    2015-09-24

    Most real samples cannot be directly infused into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid-liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053-2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h(-1)). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet of Things (IoT) to the chemistry laboratory environment. PMID:26423626
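
    The exponential fitting of the dissolution profiles reduces to a first-order release model. A sketch with invented data points; only the functional form and the reported k for ibuprofen come from the abstract:

        import numpy as np
        from scipy.optimize import curve_fit

        def first_order_release(t, c_inf, k):
            """Cumulative release approaching c_inf with rate constant k (1/h)."""
            return c_inf * (1.0 - np.exp(-k * t))

        t_h = np.array([0.5, 1, 2, 4, 6, 8, 10])                       # hours
        signal = np.array([0.19, 0.35, 0.58, 0.83, 0.93, 0.97, 0.99])  # normalized intensity

        (c_inf, k), _ = curve_fit(first_order_release, t_h, signal, p0=(1.0, 0.4))
        print(f"k = {k:.2f} 1/h")   # the paper reports k = 0.43 +/- 0.01 1/h for ibuprofen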

  11. Depth-stratified soil sampling for assessing nematode communities

    Directory of Open Access Journals (Sweden)

    Giovani de Oliveira Arieira

    2016-04-01

    This study assessed the importance of stratified soil sampling for the detection (and therefore the distribution) of nematode communities and the differentiation of ecosystems, by collecting stratified soil samples at intervals of 10 cm and non-stratified samples from 0 to 30 cm in two soil management systems (no-tillage and conventional tillage) and in a native forest fragment. The nematode frequency and prominence values were obtained after extraction by successive screening operations, sugar flotation clarification and identification of nematodes to the genus level. The nematode communities were compared two-by-two based on Sorensen's community coefficient (CC) and the percentage similarity (PS). Relative abundances of functional guilds were subjected to a principal component analysis (PCA) and classified in dendrograms. Thirty-two edaphic nematode genera were found, and the nematode communities sampled on a non-stratified basis in the soil profile exhibited a high level of similarity because they could not be accurately characterized. Genera with low abundances were not detected. In the stratified samples, we were able to classify and group the nematodes present at different depths, mainly from 0 to 10 cm. Stratified soil sampling allowed a more accurate characterization and greater differentiation of nematode communities, identifying taxa that occurred at lower abundance levels, irrespective of frequency.
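
    The two comparison indices named above have standard definitions (notation assumed here): for communities A and B with a and b genera respectively, j genera in common, and p_i, q_i the percentage abundances of genus i in each community,

        \begin{align}
          \mathrm{CC} &= \frac{2j}{a + b}, &
          \mathrm{PS} &= \sum_{i} \min(p_i,\, q_i).
        \end{align}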

  12. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Quantifying the environmental impacts and simulating the energy consumption of a building's components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative, one that would lead to a more energy-efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project, so that energy and life cycle assessment (LCA) strategies and systems are attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation involves developing plug-ins for a BIM tool capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project is used to illustrate the workability of the proposed methodology.

  13. Assessing the Bias in Communication Networks Sampled from Twitter

    CERN Document Server

    González-Bailón, Sandra; Rivero, Alejandro; Borge-Holthoefer, Javier; Moreno, Yamir

    2012-01-01

    We collect and analyse messages exchanged on Twitter using two of the platform's publicly available APIs (the search and stream specifications). We assess the differences between the two samples and compare the networks of communication reconstructed from them. The empirical context is given by political protests taking place in May 2012: we track online communication around these protests for a period of one month and reconstruct the networks of mentions and re-tweets according to the two samples. We find that the search API over-represents the more central users and does not offer an accurate picture of peripheral activity; we also find that the bias is greater for the network of mentions. We discuss the implications of this bias for the study of diffusion dynamics and collective action in the digital era, and advocate the need for more uniform sampling procedures in the study of online communication.

  14. Assessing Community Health Risks: Proactive Vs Reactive Sampling

    Directory of Open Access Journals (Sweden)

    Sarah Taylor

    2009-01-01

    Problem statement: A considerable number of native birds died in the West Australian coastal town of Esperance and its surroundings during late 2006 and early 2007, which raised community concerns about environmental contamination. Forensic investigations of dead birds suggested that lead may have been the causative agent. At the time, lead and nickel, as well as iron ore and other materials, were being exported through the Port of Esperance (port). Government agencies undertook a targeted environmental sampling programme to identify the exposure sources and the extent of contamination. Results of ambient air monitoring, blood lead level investigations and analysis of metals in rainwater tanks suggested widespread contamination of the Esperance town site with lead and nickel. The Department of Environment and Conservation (DEC) retained Golder Associates Pty Ltd (Golder) to undertake a human health and ecological risk assessment (risk assessment) using the information collected through the investigation of lead and nickel contamination in Esperance. The quantity and quality of exposure data are an important contributor to the uncertainty associated with the outcomes of a risk assessment. Conclusion: As the data were collected essentially as part of the emergency response to the events in Esperance, there was some uncertainty about the suitability and completeness of the data for risk assessment. The urgent nature of the emergency response meant that sampling was opportunistic and not necessarily sufficient or suitable for risk assessment from a methodical and scientific perspective. This study demonstrated the need for collecting 'meaningful and reliable' data for assessing risks from environmental contamination.

  15. Sampling and Analysis for Assessment of Body Burdens

    International Nuclear Information System (INIS)

    A review of sampling criteria and techniques and of sample processing methods for indirect assessment of body burdens is presented. The text is limited to the more recent developments in the field of bioassay and to the nuclides which cannot be readily determined in the body directly. A selected bibliography is included. The planning of a bioassay programme should emphasize the detection of high or unusual exposures and the concentrated study of these cases when detected. This procedure gives the maximum amount of data for the dosimetry of individuals at risk and also adds to our scientific background for an understanding of internal emitters. Only a minimum of effort should be spent on sampling individuals having had negligible exposure. The chemical separation procedures required for bioassay also fall into two categories. The first is the rapid method, possibly of low accuracy, used for detection. The second is the more accurate method required for study of the individual after detection of the exposure. Excretion, whether exponential or a power function, drops off rapidly. It is necessary to locate the exposure in time before any evaluation can be made, even before deciding if the exposure is significant. One approach is frequent sampling and analysis by a quick screening technique. More commonly, samples are collected at longer intervals and an arbitrary level of re-sampling is set to assist in the detection of real exposures. It is probable that too much bioassay effort has gone into measurements on individuals at low risk and not enough on those at higher risk. The development of bioassay procedures for overcoming this problem has begun, and this paper emphasizes this facet of sampling and sample processing. (author)

  16. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    Science.gov (United States)

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional groups of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra, and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially for samples that are likely mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine presence only when a certain number of cocaine-related peaks are identified by the automated method. Unlike library search and chemometric methods, which are dependent on the library database or the training-set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method, in combination with a portable FT-IR spectrometer, provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. PMID:27006022
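
    "Center of gravity" peak picking computes an intensity-weighted mean wavenumber over a window around a band. A Python sketch with a synthetic band; the window limits and the baseline handling are illustrative assumptions, not details from the paper:

        import numpy as np

        def center_of_gravity_peak(wavenumber, absorbance, lo, hi):
            """Intensity-weighted peak position within [lo, hi] cm^-1."""
            window = (wavenumber >= lo) & (wavenumber <= hi)
            w, a = wavenumber[window], absorbance[window]
            a = a - a.min()                  # crude local baseline correction
            return float(np.sum(w * a) / np.sum(a))

        wn = np.linspace(1600, 1800, 400)
        ab = np.exp(-((wn - 1712.0) / 8.0) ** 2)            # synthetic band at 1712 cm^-1
        print(center_of_gravity_peak(wn, ab, 1690, 1735))   # ~1712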

  17. Assessment of the 296-S-21 Stack Sampling Probe Location

    Energy Technology Data Exchange (ETDEWEB)

    Glissmeyer, John A.

    2006-09-08

    Tests were performed to assess the suitability of the location of the air sampling probe on the 296-S-21 stack according to the criteria of ANSI N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities. Pacific Northwest National Laboratory conducted most tests on a 3.67:1 scale model of the stack. CH2MHill also performed some limited confirmatory tests on the actual stack. The tests assessed the capability of the air-monitoring probe to extract a sample representative of the effluent stream. The tests were conducted for the practical combinations of operating fans and addressed: (1) Angular Flow--The purpose is to determine whether the velocity vector is aligned with the sampling nozzle. The average yaw angle relative to the nozzle axis should not be more than 20°. The measured values ranged from 5 to 11 degrees on the scale model and 10 to 12 degrees on the actual stack. (2) Uniform Air Velocity--The gas momentum across the stack cross section where the sample is extracted should be well mixed or uniform. The uniformity is expressed as the variability of the measurements about the mean, the coefficient of variance (COV). The lower the COV value, the more uniform the velocity. The acceptance criterion is that the COV of the air velocity must be ≤20% across the center two-thirds of the area of the stack. At the location simulating the sampling probe, the measured values ranged from 4 to 11%, which is within the criterion. To confirm the validity of the scale model results, air velocity uniformity measurements were made both on the actual stack and on the scale model at the test ports 1.5 stack diameters upstream of the sampling probe. The results ranged from 6 to 8% COV on the actual stack and 10 to 13% COV on the scale model. The average difference for the eight runs was 4.8% COV, which is within the validation criterion. The fact that the scale model results were slightly higher than the
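
    The uniformity criterion is simple to state in code: the coefficient of variance of the velocity readings over the centre two-thirds of the cross-section, compared against the 20% limit (the readings below are placeholders):

        import numpy as np

        velocity = np.array([11.8, 12.4, 12.1, 13.0, 12.6, 11.9, 12.8, 12.2])   # m/s
        cov = 100.0 * velocity.std(ddof=1) / velocity.mean()
        print(f"COV = {cov:.1f}% -> {'pass' if cov <= 20.0 else 'fail'}")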

  18. On-site detection of foot-and-mouth disease virus using a portable, automated sample preparation and PCR system

    International Nuclear Information System (INIS)

    Full text: Foot-and-mouth disease (FMD) is a highly contagious and economically devastating disease of farm livestock. The etiological agent, FMD virus (FMDV), is a single-stranded, positive-sense RNA virus belonging to the genus Aphthovirus within the family Picornaviridae. Rapid and accurate confirmation of the presence of FMDV is needed for effective control and eradication of the disease. An on-site detection test would be highly advantageous, as the time taken to transport suspect clinical material to a central laboratory can often be lengthy, delaying a definitive diagnosis in the event of an outbreak. This study describes the development of a molecular assay for the detection of all seven serotypes of FMDV using a novel technology, Linear-After-The-Exponential (LATE)-PCR, for transfer onto a portable, easy-to-use, fully automated sample preparation and RT-PCR instrument. Primers and a mismatch-tolerant probe were designed from consensus sequences in the FMDV 3D (RNA polymerase) gene to detect the target and its variants at low temperature. An internal control (IC) was included to validate negative results. After demonstrating that the LATE RT-PCR signal at end-point was proportional to the number of target molecules over the range 10 to 1 million copies, the assay was compared with a one-step real-time RT-PCR (rRT-PCR) assay (also targeting the 3D gene) used routinely by reference laboratories. The LATE RT-PCR assay amplified RNA extracted from multiple strains of all FMDV serotypes. Of the 121 FMDV-positive samples tested, 119 were positive by both the rRT-PCR and LATE RT-PCR tests, while 118 had tested positive by virus isolation at the time of receipt. Twenty-eight FMDV-negative samples failed to react in all 3 tests. There were no false-positive signals with RNA from other vesicular disease-causing viruses. Each FMDV-negative sample generated a signal from the IC, ruling out amplification failures. A dilution series of an FMDV reference strain demonstrated

  19. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    Science.gov (United States)

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper finds that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  20. Assessment of anti-Salmonella activity of boot dip samples.

    Science.gov (United States)

    Rabie, André J; McLaren, Ian M; Breslin, Mark F; Sayers, Robin; Davies, Rob H

    2015-01-01

    The introduction of pathogens from the external environment into poultry houses via the boots of farm workers and visitors presents a significant risk. The use of boot dips containing disinfectant to help prevent this from happening is common practice, but the effectiveness of these boot dips as a preventive measure can vary. The aim of this study was to assess the anti-Salmonella activity of boot dips that are being used on poultry farms. Boot dip samples were collected from commercial laying hen farms in the UK and tested within 24 hours of receipt at the laboratory to assess their anti-Salmonella activity. All boot dip samples were tested against a field strain of Salmonella enterica serovar Enteritidis using three test models: pure culture, paper disc surface matrix and yeast suspension model. Of the 112 boot dip samples tested 83.6% were effective against Salmonella in pure culture, 37.3% in paper disc surface matrix and 44.5% in yeast suspension model. Numerous factors may influence the efficacy of the disinfectants. Disinfectants used in the dips may not always be fully active against surface or organic matter contamination; they may be inaccurately measured or diluted to a concentration other than that specified or recommended; dips may not be changed regularly or may have been exposed to rain and other environmental elements. This study showed that boot dips in use on poultry farms are frequently ineffective. PMID:25650744

  1. Automated content and quality assessment of full-motion-video for the generation of meta data

    Science.gov (United States)

    Harguess, Josh

    2015-05-01

    Virtually all of the video data (and full-motion-video (FMV)) currently collected and stored in support of missions has been corrupted to various extents by image acquisition and compression artifacts. Additionally, video collected by wide-area motion imagery (WAMI) surveillance systems, unmanned aerial vehicles (UAVs) and similar sources is often of low quality or otherwise corrupted, so that it is not worth storing or analyzing. In order to make progress on the problem of automatic video analysis, the first problem to solve is deciding whether the content of the video is even worth analyzing to begin with. We present a work in progress that addresses three types of scenes typically found in real-world data stored in support of Department of Defense (DoD) missions: no or very little motion in the scene, large occlusions in the scene, and fast camera motion. Each of these produces video that is generally not usable to an analyst or an automated algorithm for mission support and should therefore be removed or flagged to the user as such. We utilize recent computer vision advances in motion detection and optical flow to automatically assess FMV for the identification and generation of meta-data (or tagging) of video segments which exhibit the unwanted scenarios described above. Results are shown on representative real-world video data.
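
    One way to triage frames along the lines described, using OpenCV's Farneback optical flow (the thresholds are assumed values, not those of the paper):

        import cv2
        import numpy as np

        def flag_segment(prev_gray: np.ndarray, curr_gray: np.ndarray) -> str:
            flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            median_flow = float(np.median(np.linalg.norm(flow, axis=2)))
            if median_flow < 0.05:    # pixels/frame; illustrative threshold
                return "no-motion"
            if median_flow > 20.0:    # illustrative threshold
                return "fast-camera-motion"
            return "usable"

    Occlusion detection would need an additional cue (e.g. large connected regions of near-zero flow inside an otherwise moving scene) and is omitted here.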

  2. Interim assessment of the VAL automated guideway transit system. Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Anagnostopoulos, G.

    1981-11-01

    This report describes an interim assessment of the VAL (Vehicules Automatiques Legers or Light Automated Vehicle) AGT system which is currently under construction in Lille, France, and which is to become fully operational in December 1983. This report contains a technical description and performance data resulting from a demonstration test program performed concurrently in August 1980. VAL is the first driverless AGT urban system application in France. The system operates at grade, elevated, and in tunnels on an exclusive concrete dual-lane guideway that is 12.7 kilometers long. The configuration of the system is a push-pull loop operating between 17 on-line stations. The system is designed to provide scheduled operation at 60-second headways and a normal one-way capacity of 7440 passengers per hour per direction with 55 percent of the passengers seated. Two pneumatic-tired vehicles are coupled into a single vehicle capable of carrying 124 passengers at line speeds of 60 km/hr. During the course of the demonstration test program, VAL demonstrated that it could achieve high levels of dependability and availability and could perform safely under all perceivable conditions.

  3. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li, Xiang; Frush, Donald; Samei, Ehsan (Clinical Imaging Physics Group, Department of Radiology, and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center; Medical Physics Graduate Program and Departments of Physics and Biomedical Engineering, Duke University, Durham, North Carolina (United States))

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period, and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose
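
    The dose arithmetic can be sketched as follows; the paper's exact size correction is not given in the abstract, so the AAPM TG-204-style exponential below (and its constants, quoted from memory for a 32 cm phantom) is an assumption, not the authors' formula:

        import math

        def size_adjusted_effective_dose(dlp_mgy_cm, k_factor, patient_thickness_cm,
                                         a=1.874799, b=0.03871313):
            """ED_adj in mSv; a and b are TG-204-style fit constants (assumed)."""
            f_size = a * math.exp(-b * patient_thickness_cm)
            return k_factor * dlp_mgy_cm * f_size

        # e.g. an abdomen scan, k = 0.015 mSv/(mGy*cm) (assumed), 28 cm patient:
        print(size_adjusted_effective_dose(450.0, 0.015, 28.0))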

  4. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    Directory of Open Access Journals (Sweden)

    Alicja Puścian

    2014-04-01

    Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repetitive behaviors during reversal learning in mice in the automated IntelliCage system. During the reward-motivated place preference reversal learning, designed to assess cognitive abilities of mice, visits to the previously rewarded places were recorded to measure cognitive flexibility. Thereafter, emotional flexibility was assessed by measuring conditioned fear extinction. Additionally, to look for neuronal correlates of cognitive impairments, we measured CA3-CA1 hippocampal long-term potentiation (LTP). To standardize the designed tests we used C57BL/6 and BALB/c mice, representing two genetic backgrounds, for induction of autism by prenatal exposure to sodium valproate. We found impairments of place learning related to perseveration and no LTP impairments in C57BL/6 valproate-treated mice. In contrast, BALB/c valproate-treated mice displayed severe deficits of place learning not associated with perseverative behaviors and accompanied by hippocampal LTP impairments. Alterations of cognitive flexibility observed in C57BL/6 valproate-treated mice were related neither to a restricted exploration pattern nor to emotional flexibility. Altogether, we showed that the designed tests of cognitive performance and perseverative behaviors are efficient and highly replicable. Moreover, the results suggest that genetic background is crucial for the behavioral effects of prenatal valproate treatment.

  5. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  6. Assessment of the application of an automated electronic milk analyzer for the enumeration of total bacteria in raw goat milk.

    Science.gov (United States)

    Ramsahoi, L; Gao, A; Fabri, M; Odumeru, J A

    2011-07-01

    Automated electronic milk analyzers for rapid enumeration of total bacteria counts (TBC) are widely used for raw milk testing by many analytical laboratories worldwide. In Ontario, Canada, Bactoscan flow cytometry (BsnFC; Foss Electric, Hillerød, Denmark) is the official anchor method for TBC in raw cow milk. Penalties are levied at the BsnFC equivalent level of 50,000 cfu/mL, the standard plate count (SPC) regulatory limit. This study was conducted to assess the BsnFC for TBC in raw goat milk, to determine the mathematical relationship between the SPC and BsnFC methods, and to identify probable reasons for the difference in the SPC:BsnFC equivalents for goat and cow milks. Test procedures were conducted according to International Dairy Federation Bulletin guidelines. Approximately 115 farm bulk tank milk samples per month were tested for inhibitor residues, SPC, BsnFC, psychrotrophic bacteria count, composition (fat, protein, lactose, lactose and other solids, and freezing point), and somatic cell count from March 2009 to February 2010. Data analysis indicated that the BsnFC method would be a good alternative to the SPC method, providing accurate and more precise results with a faster turnaround time. Although a linear regression model showed good correlation and prediction, tests for linearity indicated that the relationship was linear only beyond log 4.1 SPC. The logistic growth curve best modeled the relationship between the SPC and BsnFC for the entire sample population. The BsnFC equivalent to the SPC 50,000 cfu/mL regulatory limit was estimated to be 321,000 individual bacteria count (ibc)/mL. This estimate differs considerably from the BsnFC equivalent for cow milk (121,000 ibc/mL). Because of the low frequency of bulk tank milk pickups at goat farms, 78.5% of the samples had their oldest milking in the tank aged 6.5 to 9.0 d when tested, compared with the cow milk samples, which had their oldest milking at 4 d
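
    The logistic relation between the two count scales can be sketched as a curve fit on log-transformed counts; the data and the fitted constants below are invented, and only the functional form and the 50,000 cfu/mL limit come from the abstract:

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(x, upper, rate, midpoint):
            return upper / (1.0 + np.exp(-rate * (x - midpoint)))

        log_bsnfc = np.array([3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5])   # log10 ibc/mL
        log_spc   = np.array([3.1, 3.4, 3.9, 4.4, 4.9, 5.3, 5.5])   # log10 cfu/mL

        (upper, rate, mid), _ = curve_fit(logistic, log_bsnfc, log_spc,
                                          p0=(6.0, 0.8, 3.5))
        # Invert the curve at the SPC regulatory limit (log10(50,000) ~ 4.7):
        equiv = mid - np.log(upper / np.log10(5e4) - 1.0) / rate
        print(f"BsnFC equivalent ~ 10^{equiv:.2f} ibc/mL")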

  7. Influence of commonly used primer systems on automated ribosomal intergenic spacer analysis of bacterial communities in environmental samples.

    Science.gov (United States)

    Purahong, Witoon; Stempfhuber, Barbara; Lentendu, Guillaume; Francioli, Davide; Reitz, Thomas; Buscot, François; Schloter, Michael; Krüger, Dirk

    2015-01-01

    Due to the high diversity of bacteria in many ecosystems, their slow generation times, specific but mostly unknown nutrient requirements and syntrophic interactions, isolation based approaches in microbial ecology mostly fail to describe microbial community structure. Thus, cultivation independent techniques, which rely on directly extracted nucleic acids from the environment, are a well-used alternative. For example, bacterial automated ribosomal intergenic spacer analysis (B-ARISA) is one of the widely used methods for fingerprinting bacterial communities after PCR-based amplification of selected regions of the operon coding for rRNA genes using community DNA. However, B-ARISA alone does not provide any taxonomic information and the results may be severely biased in relation to the primer set selection. Furthermore, amplified DNA stemming from mitochondrial or chloroplast templates might strongly bias the obtained fingerprints. In this study, we determined the applicability of three different B-ARISA primer sets to the study of bacterial communities. The results from in silico analysis harnessing publicly available sequence databases showed that all three primer sets tested are specific to bacteria but only two primers sets assure high bacterial taxa coverage (1406f/23Sr and ITSF/ITSReub). Considering the study of bacteria in a plant interface, the primer set ITSF/ITSReub was found to amplify (in silico) sequences of some important crop species such as Sorghum bicolor and Zea mays. Bacterial genera and plant species potentially amplified by different primer sets are given. These data were confirmed when DNA extracted from soil and plant samples were analyzed. The presented information could be useful when interpreting existing B-ARISA results and planning B-ARISA experiments, especially when plant DNA can be expected. PMID:25749323

  8. Influence of commonly used primer systems on automated ribosomal intergenic spacer analysis of bacterial communities in environmental samples.

    Directory of Open Access Journals (Sweden)

    Witoon Purahong

    Full Text Available Due to the high diversity of bacteria in many ecosystems, their slow generation times, specific but mostly unknown nutrient requirements and syntrophic interactions, isolation based approaches in microbial ecology mostly fail to describe microbial community structure. Thus, cultivation independent techniques, which rely on directly extracted nucleic acids from the environment, are a well-used alternative. For example, bacterial automated ribosomal intergenic spacer analysis (B-ARISA) is one of the widely used methods for fingerprinting bacterial communities after PCR-based amplification of selected regions of the operon coding for rRNA genes using community DNA. However, B-ARISA alone does not provide any taxonomic information and the results may be severely biased in relation to the primer set selection. Furthermore, amplified DNA stemming from mitochondrial or chloroplast templates might strongly bias the obtained fingerprints. In this study, we determined the applicability of three different B-ARISA primer sets to the study of bacterial communities. The results from in silico analysis harnessing publicly available sequence databases showed that all three primer sets tested are specific to bacteria but only two primer sets assure high bacterial taxa coverage (1406f/23Sr and ITSF/ITSReub). Considering the study of bacteria in a plant interface, the primer set ITSF/ITSReub was found to amplify (in silico) sequences of some important crop species such as Sorghum bicolor and Zea mays. Bacterial genera and plant species potentially amplified by different primer sets are given. These data were confirmed when DNA extracted from soil and plant samples were analyzed. The presented information could be useful when interpreting existing B-ARISA results and planning B-ARISA experiments, especially when plant DNA can be expected.

  9. Assessment of Pain Response in Capsaicin-Induced Dynamic Mechanical Allodynia Using a Novel and Fully Automated Brushing Device

    OpenAIRE

    du Jardin, Kristian G; Gregersen, Lise S; Røsland, Turid; Uggerhøj, Kathrine H; Petersen, Lars J.; Arendt-Nielsen, Lars; Gazerani, Parisa

    2013-01-01

    BACKGROUND: Dynamic mechanical allodynia is traditionally induced by manual brushing of the skin. Brushing force and speed have been shown to influence the intensity of brush-evoked pain. There are still limited data available with respect to the optimal stroke number, length, force, angle and speed. Therefore, an automated brushing device (ABD) was developed, for which brushing angle and speed could be controlled to enable quantitative assessment of dynamic mechanical allodynia. OBJECTIVES: T...

  10. Assessing the Effect of Office Automation on Performance Using the Balanced Scorecard Approach: Case Study of Esfahan Education Organizations and Schools

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Moshref Javadi

    2013-09-01

    Full Text Available The survival of any organization depends on its dynamic interaction with its internal and external environment. Given the development of technology and its effect on organizational performance, organizations need to implement these technologies in order to be successful. This research explores the relationship between the implementation of office automation and performance using structural equation modeling (SEM). The study is an applied, descriptive survey. The statistical population consisted of managers of offices and schools of the Ministry of Education in Esfahan and Lenjan city; 130 individuals were selected randomly as the sample. Content and construct validity were used to evaluate the questionnaire, and the relations between the research variables were confirmed by the SEM results. The standardized estimate of the effect of office automation on performance was 83%. The results of the main hypothesis test indicate that office automation in the studied organizations can improve organizational performance.

  11. Critiquing Physician Decision Making Using Data from Automated Medical Records: Assessing the Limitations

    OpenAIRE

    Van Der Lei, Johan; Musen, Mark A.; van der Does, Emiel; Manintveld, Arie J.

    1990-01-01

    This paper describes the evaluation of a critiquing system, HYPERCRITIC, that relies on automated medical records for its data input. The purpose of HYPERCRITIC is to offer comments to general practitioners on their treatment of hypertension. HYPERCRITIC has access to the data stored in a primary-care information system that supports a fully automated medical record. Medical records of 20 patients with hypertension were submitted to both physicians and HYPERCRITIC. The critique generated by t...

  12. Fully Automated Assessment of the Severity of Parkinson's Disease from Speech.

    Science.gov (United States)

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2015-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson's disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks - the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and found that ridge regression performed better than lasso and support vector regression for our task. We refine the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than the diadochokinetic or sustained phonation tasks. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in the clinical domain. PMID:25382935
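
    The modeling stage described above is straightforward to reproduce in outline. The sketch below is a minimal illustration of ridge regression with cross-validation over a high-dimensional acoustic feature matrix; the feature values, severity scores, and hyperparameter grid are placeholders, not the study's data or exact setup.

        # Minimal sketch: ridge regression over openSMILE-style acoustic features
        # to predict a clinical severity score. Feature matrix and scores are
        # random placeholders, not the study data.
        import numpy as np
        from sklearn.linear_model import RidgeCV
        from sklearn.model_selection import cross_val_predict
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(0)
        X = rng.normal(size=(168, 1582))   # 168 subjects x 1582 features
        y = rng.uniform(0, 108, size=168)  # placeholder severity scores

        model = make_pipeline(StandardScaler(),
                              RidgeCV(alphas=np.logspace(-3, 3, 13)))
        y_hat = cross_val_predict(model, X, y, cv=10)
        print("cross-validated MAE:", mean_absolute_error(y, y_hat))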

  13. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    Energy Technology Data Exchange (ETDEWEB)

    Kertesz, Vilmos [ORNL; Van Berkel, Gary J [ORNL

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.
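
    The heatmap step described above amounts to binning per-spot signal intensities by their sampling coordinates. A minimal sketch follows, assuming hypothetical spot positions on a 1 mm grid and made-up peak areas; matplotlib here merely stands in for the in-house software mentioned in the record.

        # Minimal sketch: co-register spot coordinates with per-spot MS
        # intensities on a 1 mm grid. All values are hypothetical.
        import numpy as np
        import matplotlib.pyplot as plt

        x_mm = np.array([0, 1, 2, 0, 1, 2])           # sampled spot positions
        y_mm = np.array([0, 0, 0, 1, 1, 1])
        intensity = np.array([5, 80, 40, 3, 60, 20])  # e.g., peak areas per spot

        grid = np.full((y_mm.max() + 1, x_mm.max() + 1), np.nan)
        grid[y_mm, x_mm] = intensity                  # one value per 1 mm pixel

        plt.imshow(grid, origin="lower",
                   extent=(0, grid.shape[1], 0, grid.shape[0]))
        plt.colorbar(label="peak area (a.u.)")
        plt.xlabel("x (mm)"); plt.ylabel("y (mm)")
        plt.show()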

  14. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    Science.gov (United States)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix of 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler leading to the reduction of manual work, and increased quality and throughput. PMID:21609694

  15. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  16. Experimental Assessment of Mouse Sociability Using an Automated Image Processing Approach.

    Science.gov (United States)

    Varghese, Frency; Burket, Jessica A; Benson, Andrew D; Deutsch, Stephen I; Zemlin, Christian W

    2016-01-01

    Mouse is the preferred model organism for testing drugs designed to increase sociability. We present a method to quantify mouse sociability in which the test mouse is placed in a standardized apparatus and relevant behaviors are assessed in three different sessions (called session I, II, and III). The apparatus has three compartments (see Figure 1), the left and right compartments contain an inverted cup which can house a mouse (called "stimulus mouse"). In session I, the test mouse is placed in the cage and its mobility is characterized by the number of transitions made between compartments. In session II, a stimulus mouse is placed under one of the inverted cups and the sociability of the test mouse is quantified by the amounts of time it spends near the cup containing the enclosed stimulus mouse vs. the empty inverted cup. In session III, the inverted cups are removed and both mice interact freely. The sociability of the test mouse in session III is quantified by the number of social approaches it makes toward the stimulus mouse and by the number of times it avoids a social approach by the stimulus mouse. The automated evaluation of the movie detects the nose of the test mouse, which allows the determination of all described sociability measures in session I and II (in session III, approaches are identified automatically but classified manually). To find the nose, the image of an empty cage is digitally subtracted from each frame of the movie and the resulting image is binarized to identify the mouse pixels. The mouse tail is automatically removed and the two most distant points of the remaining mouse are determined; these are close to nose and base of tail. By analyzing the motion of the mouse and using continuity arguments, the nose is identified. Figure 1. Assessment of Sociability During 3 sessions. Session I (top): Acclimation of test mouse to the cage. Session II (middle): Test mouse moving freely in the cage while the stimulus mouse is enclosed in an
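
    As a rough illustration of the segmentation steps named above (empty-cage subtraction, binarization, and locating the two most distant silhouette points as nose/tail-base candidates), here is a minimal OpenCV sketch. The file names and threshold are hypothetical, and the tail-removal and motion-continuity steps of the original method are omitted.

        # Minimal sketch: subtract an empty-cage image, binarize to get mouse
        # pixels, then find the two most distant contour points (approximately
        # nose and tail base). Threshold value is illustrative.
        import cv2
        import numpy as np

        frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)       # hypothetical
        empty = cv2.imread("empty_cage.png", cv2.IMREAD_GRAYSCALE)  # hypothetical

        diff = cv2.absdiff(frame, empty)
        _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        mouse = max(contours, key=cv2.contourArea).reshape(-1, 2)

        # Brute-force farthest pair over the contour (fine for one small contour).
        d = np.linalg.norm(mouse[:, None, :] - mouse[None, :, :], axis=-1)
        i, j = np.unravel_index(np.argmax(d), d.shape)
        print("candidate nose/tail-base points:", mouse[i], mouse[j])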

  17. Automated determination of nitrate plus nitrite in aqueous samples with flow injection analysis using vanadium (III) chloride as reductant.

    Science.gov (United States)

    Wang, Shu; Lin, Kunning; Chen, Nengwang; Yuan, Dongxing; Ma, Jian

    2016-01-01

    Determination of nitrate in aqueous samples is an important analytical objective for environmental monitoring and assessment. Here we report the first automatic flow injection analysis (FIA) of nitrate (plus nitrite) using VCl3 as reductant instead of the well-known but toxic cadmium column for reducing nitrate to nitrite. The reduced nitrate plus the nitrite originally present in the sample react with the Griess reagent (sulfanilamide and N-1-naphthylethylenediamine dihydrochloride) under acidic conditions. The resulting pink azo dye can be detected at 540 nm. The Griess reagent and VCl3 are used as a single mixed reagent solution to simplify the system. The various parameters of the FIA procedure including reagent composition, temperature, volume of the injection loop, and flow rate were carefully investigated and optimized via univariate experimental design. Under the optimized conditions, the linear range and detection limit of this method are 0-100 µM (R² = 0.9995) and 0.1 µM, respectively. The targeted analytical range can be easily extended to higher concentrations by selecting alternative detection wavelengths or increasing flow rate. The FIA system provides a sample throughput of 20 h⁻¹, which is much higher than that of previously reported manual methods based on the same chemistry. National reference solutions and different kinds of aqueous samples were analyzed with our method as well as the cadmium column reduction method. The results from our method agree well with both the certified value and the results from the cadmium column reduction method (no significant difference with P=0.95). The spiked recovery varies from 89% to 108% for samples with different matrices, showing insignificant matrix interference in this method. PMID:26695325
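
    For a method with a linear range like the one reported above, converting detector response to concentration reduces to a least-squares calibration line. A minimal sketch under hypothetical standards and absorbances (not the paper's data):

        # Minimal sketch: fit absorbance at 540 nm vs. standard concentration,
        # then invert the calibration for an unknown. Values are illustrative.
        import numpy as np

        conc_uM = np.array([0, 10, 25, 50, 75, 100])   # nitrate standards (µM)
        absorb = np.array([0.002, 0.051, 0.126, 0.252, 0.375, 0.501])

        slope, intercept = np.polyfit(conc_uM, absorb, 1)
        r2 = np.corrcoef(conc_uM, absorb)[0, 1] ** 2
        print(f"A = {slope:.4f}*C + {intercept:.4f}, R^2 = {r2:.4f}")

        unknown_abs = 0.180
        print("sample concentration (µM):", (unknown_abs - intercept) / slope)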

  18. Context Sampling Descriptive Assessment: A Pilot Study of a Further Approach to Functional Assessment

    Science.gov (United States)

    Garbutt, Nathalie; Furniss, Frederick

    2007-01-01

    Background: The ability of descriptive assessments to differentiate functions of problem behaviours might be increased by systematically sampling natural contexts characterized by different establishing operations. This study evaluated the stability of such characteristics, and variability in challenging behaviour, for three school contexts.…

  19. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    Science.gov (United States)

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  20. Risk Assessment on the Transition Program for Air Traffic Control Automation System Upgrade

    Directory of Open Access Journals (Sweden)

    Li Dong Bin

    2016-01-01

    Full Text Available In this paper, we analyzed the safety risks of the transition program for an Air Traffic Control (ATC) automation system upgrade using the event tree analysis method. We decomposed the progress of the three transition phases, built the event trees corresponding to the three stages, determined the probability of success of each factor, and calculated the probability of success of the ATC automation system upgrade transition. In conclusion, we characterize the safety risk of the transition program according to these results.
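
    The arithmetic behind such an event-tree assessment can be illustrated compactly. Assuming, for the sketch, that every factor in a phase must succeed and phases are independent (the study's actual trees may branch differently), the overall probability of a successful transition is a product of factor probabilities; all numbers below are invented.

        # Minimal sketch: phase success as the product of per-factor success
        # probabilities, overall success as the product over phases.
        # All probabilities are illustrative, not from the study.
        from math import prod

        phases = {
            "phase 1": [0.99, 0.98, 0.995],   # per-factor success probabilities
            "phase 2": [0.97, 0.99],
            "phase 3": [0.995, 0.99, 0.98, 0.99],
        }

        p_overall = 1.0
        for name, probs in phases.items():
            p_phase = prod(probs)
            p_overall *= p_phase
            print(f"{name}: P(success) = {p_phase:.4f}")
        print(f"overall transition: P(success) = {p_overall:.4f}")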

  1. Comparative Assessment of Automated Nucleic Acid Sample Extraction Equipment for Biothreat Agents

    OpenAIRE

    Kalina, Warren Vincent; Douglas, Christina Elizabeth; Coyne, Susan Rajnik; Minogue, Timothy Devin

    2014-01-01

    Magnetic beads offer superior impurity removal and nucleic acid selection over older extraction methods. The performances of nucleic acid extraction of biothreat agents in blood or buffer by easyMAG, MagNA Pure, EZ1 Advanced XL, and Nordiag Arrow were evaluated. All instruments showed excellent performance in blood; however, the easyMAG had the best precision and versatility.

  2. Development of an automated mass spectrometry system for the quantitative analysis of liver microsomal incubation samples: a tool for rapid screening of new compounds for metabolic stability.

    Science.gov (United States)

    Korfmacher, W A; Palmer, C A; Nardo, C; Dunn-Meynell, K; Grotz, D; Cox, K; Lin, C C; Elicone, C; Liu, C; Duchoslav, E

    1999-01-01

    There is a continuing need for increased throughput in the evaluation of new drug entities in terms of their pharmacokinetic parameters. One useful parameter that can be measured in vitro using liver microsomal preparations is metabolic stability. In this report, we describe an automated system that can be used for unattended quantitative analysis of liver microsomal samples for a series of compounds. This system is based on the Sciex API 150 (single quadrupole) liquid chromatography/mass spectrometry system and utilizes 96-well plate autosampler technology as well as a custom-designed AppleScript which executes the on-line data processing and report generation. It has the capability of analyzing at least 75 compounds per week or 300 compounds per month in an automated fashion. PMID:10353225

  3. Assessing Racial Microaggression Distress in a Diverse Sample.

    Science.gov (United States)

    Torres-Harding, Susan; Turner, Tasha

    2015-12-01

    Racial microaggressions are everyday subtle or ambiguous racially related insults, slights, mistreatment, or invalidations. Racial microaggressions are a type of perceived racism that may negatively impact the health and well-being of people of color in the United States. This study examined the reliability and validity of the Racial Microaggression Scale distress subscales, which measure the perceived stressfulness of six types of microaggression experiences in a racially and ethnically diverse sample. These subscales exhibited acceptable to good internal consistency. The distress subscales also evidenced good convergent validity; the distress subscales were positively correlated with additional measures of stressfulness due to experiencing microaggressions or everyday discrimination. When controlling for the frequency of one's exposure to microaggression incidents, some racial/ethnic group differences were found. Asian Americans reported comparatively lower distress and Latinos reported comparatively higher distress in response to Foreigner, Low-Achieving, Invisibility, and Environmental microaggressions. African Americans reported higher distress than the other groups in response to Environmental microaggressions. Results suggest that the Racial Microaggressions Scale distress subscales may aid health professionals in assessing the distress elicited by different types of microaggressions. In turn, this may facilitate diagnosis and treatment planning in order to provide multiculturally competent care for African American, Latino, and Asian American clients. PMID:25237154

  4. Performance of automated software in the assessment of segmental left ventricular function in cardiac CT: Comparison with cardiac magnetic resonance

    International Nuclear Information System (INIS)

    To evaluate the accuracy, reliability and time saving potential of a novel cardiac CT (CCT)-based, automated software for the assessment of segmental left ventricular function compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated. Segmental LV wall motion was automatically calculated and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in both systolic and diastolic phases on polar map, CCT, and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5 %, 64.8 ± 1.4 % and 67.1 ± 1.4 %, respectively. Agreement for the assessment of LV wall motion between CCT, CMR and polar maps was good. Bland-Altman plots and ICC indicated good agreement between CCT, CMR and automated polar maps of the diastolic and systolic segmental wall thickness and thickening. The processing time using polar map was significantly decreased compared with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides similar measurements to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. (orig.)

  5. Performance of automated software in the assessment of segmental left ventricular function in cardiac CT: Comparison with cardiac magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Rui [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Capital Medical University, Department of Radiology, Beijing Anzhen Hospital, Beijing (China); Meinel, Felix G. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Canstein, Christian [Siemens Medical Solutions USA, Malvern, PA (United States); Spearman, James V. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); De Cecco, Carlo N. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome 'Sapienza', Departments of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2015-12-15

    To evaluate the accuracy, reliability and time saving potential of a novel cardiac CT (CCT)-based, automated software for the assessment of segmental left ventricular function compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated. Segmental LV wall motion was automatically calculated and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in both systolic and diastolic phases on polar map, CCT, and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5 %, 64.8 ± 1.4 % and 67.1 ± 1.4 %, respectively. Agreement for the assessment of LV wall motion between CCT, CMR and polar maps was good. Bland-Altman plots and ICC indicated good agreement between CCT, CMR and automated polar maps of the diastolic and systolic segmental wall thickness and thickening. The processing time using polar map was significantly decreased compared with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides similar measurements to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. (orig.)

  6. Accelerated solvent extraction (ASE) - a fast and automated technique with low solvent consumption for the extraction of solid samples (T12)

    International Nuclear Information System (INIS)

    Full text: Accelerated solvent extraction (ASE) is a modern extraction technique that significantly streamlines sample preparation. A common organic solvent, or water, is used as the extraction solvent at elevated temperature and pressure to increase extraction speed and efficiency. The entire extraction process is fully automated and performed within 15 minutes with a solvent consumption of 18 ml for a 10 g sample. For many matrices and for a variety of solutes, ASE has proven to be equivalent or superior to sonication, Soxhlet, and reflux extraction techniques while requiring less time, solvent and labor. ASE was first applied to the extraction of environmental hazards from solid matrices. Within a very short time, ASE was approved by the U.S. EPA for the extraction of BNAs, PAHs, PCBs, pesticides, herbicides, TPH, and dioxins from solid samples in method 3545. For the extraction of dioxins in particular, the extraction time with ASE is reduced to 20 minutes, compared with 18 h using Soxhlet. In food analysis, ASE is used for the extraction of pesticide and mycotoxin residues from fruits and vegetables, for fat determination, and for the extraction of vitamins. Time-consuming and solvent-intensive methods for the extraction of additives from polymers, as well as for the extraction of marker compounds from herbal supplements, can be performed more efficiently using ASE. For the analysis of chemical weapons, the extraction process and sample clean-up including derivatization can be automated and combined with GC-MS using an online ASE-APEC-GC system. (author)

  7. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    Directory of Open Access Journals (Sweden)

    Demir Sumeyra U

    2012-12-01

    Full Text Available Abstract Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tissue perfusion quantitatively might be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple level thresholding and pixel verification techniques. Threshold levels are selected using histogram information of a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results Sublingual microcirculatory videos are recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings are analyzed visually and the functional capillary density (FCD) values calculated by the algorithm are compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD
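
    The flow-detection idea sketched in the abstract, pixel-by-pixel differences across frames revealing vessels with moving blood cells, can be illustrated with a temporal-variability map. The array file, threshold, and the active-pixel summary below are all placeholders; true FCD is a vessel-length-per-area measure, so this is only a crude proxy.

        # Minimal sketch: pixels lying on vessels with moving blood cells vary
        # over time, so a thresholded temporal variability map highlights
        # active capillaries. The video array and threshold are placeholders.
        import numpy as np

        video = np.load("stabilized_sdf_clip.npy")  # hypothetical (frames, h, w)

        temporal_sd = video.std(axis=0)             # per-pixel variation over time
        active = temporal_sd > 2.0 * temporal_sd.mean()  # illustrative threshold

        # Crude FCD-style summary: fraction of pixels on vessels with flow.
        print("active-pixel fraction:", active.mean())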

  8. Automated Monitoring Systems to Assess Gait Score and Feed Intake of Broilers

    OpenAIRE

    Aydin, Arda

    2016-01-01

    The last decades of the 20th century saw important changes in animal production. Production intensified considerably and farms became highly specialised. Traditionally, livestock management decisions were based on the observation and judgment of the farmer. However, because of the increasing scale of farms and the large number of animals, the farmer has a high technical, organisational and logistical workload and therefore has limited time to monitor his animals himself. Automated monitori...

  9. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    OpenAIRE

    Demir Sumeyra U; Hakimzadeh Roya; Hargraves Rosalyn Hobson; Ward Kevin R; Myer Eric V; Najarian Kayvan

    2012-01-01

    Abstract Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tis...

  10. Assessment of Automated Analyses of Cell Migration on Flat and Nanostructured Surfaces

    Directory of Open Access Journals (Sweden)

    Hans A Kestler

    2012-07-01

    Full Text Available Motility studies of cells often rely on computer software that analyzes time-lapse recorded movies and establishes cell trajectories fully automatically. This raises the question of reproducibility of results, since different programs could yield significantly different results of such automated analysis. The fact that the segmentation routines of such programs are often challenged by nanostructured surfaces makes the question more pertinent. Here we illustrate how it is possible to track cells on bright field microscopy images with image analysis routines implemented in an open-source cell tracking program, PACT (Program for Automated Cell Tracking). We compare the automated motility analysis of three cell tracking programs, PACT, Autozell, and TLA, using the same movies as input for all three programs. We find that different programs track overlapping, but different subsets of cells due to different segmentation methods. Unfortunately, population averages based on such different cell populations differ significantly in some cases. Thus, results obtained with one software package are not necessarily reproducible by other software.

  11. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    Science.gov (United States)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  12. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    OpenAIRE

    Elena Ordoñez; Laura Rueda; M. Paz Cañadas; Carme Fuster; Vincenzo Cirigliano

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 15...

  13. Technology assessment of automated atlas based segmentation in prostate bed contouring

    International Nuclear Information System (INIS)

    Prostate bed (PB) contouring is time consuming and associated with inter-observer variability. We evaluated an automated atlas-based segmentation (AABS) engine in its potential to reduce contouring time and inter-observer variability. An atlas builder (AB) manually contoured the prostate bed, rectum, left femoral head (LFH), right femoral head (RFH), bladder, and penile bulb of 75 post-prostatectomy cases to create an atlas according to the recent RTOG guidelines. 5 other Radiation Oncologists (RO) and the AABS contoured 5 new cases. A STAPLE contour for each of the 5 patients was generated. All contours were anonymized and sent back to the 5 RO to be edited as clinically necessary. All contouring times were recorded. The Dice similarity coefficient (DSC) was used to evaluate the unedited- and edited-AABS and inter-observer variability among the RO. Descriptive statistics, paired t-tests and a Pearson correlation were performed. ANOVA analysis using logit transformations of DSC values was calculated to assess inter-observer variability. The mean time for manual contours and AABS was 17.5 and 14.1 minutes, respectively (p = 0.003). The DSC results (mean, SD) for the comparison of the unedited-AABS versus STAPLE contours for the PB (0.48, 0.17), bladder (0.67, 0.19), LFH (0.92, 0.01), RFH (0.92, 0.01), penile bulb (0.33, 0.25) and rectum (0.59, 0.11). The DSC results (mean, SD) for the comparison of the edited-AABS versus STAPLE contours for the PB (0.67, 0.19), bladder (0.88, 0.13), LFH (0.93, 0.01), RFH (0.92, 0.01), penile bulb (0.54, 0.21) and rectum (0.78, 0.12). The DSC results (mean, SD) for the comparison of the edited-AABS versus the expert panel for the PB (0.47, 0.16), bladder (0.67, 0.18), LFH (0.83, 0.18), RFH (0.83, 0.17), penile bulb (0.31, 0.23) and rectum (0.58, 0.09). The DSC results (mean, SD) for the comparison of the STAPLE contours and the 5 RO are PB (0.78, 0.15), bladder (0.96, 0.02), left femoral head (0.87, 0.19), right femoral head (0
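
    The DSC values quoted throughout this record compare binary contour masks; as a reference, a minimal implementation of the Dice similarity coefficient for two same-shaped masks might look as follows (toy masks, not clinical data):

        # Minimal sketch: DSC = 2|A ∩ B| / (|A| + |B|) for two boolean masks.
        import numpy as np

        def dice(a: np.ndarray, b: np.ndarray) -> float:
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        auto = np.zeros((64, 64), bool); auto[20:40, 20:40] = True  # toy masks
        ref = np.zeros((64, 64), bool); ref[25:45, 22:42] = True
        print(f"DSC = {dice(auto, ref):.2f}")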

  14. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. PMID:26806135

  15. The effect of clustering on lot quality assurance sampling: a probabilistic model to calculate sample sizes for quality assessments

    OpenAIRE

    Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello

    2013-01-01

    Background: Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rw...
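
    A standard first approximation to the clustering penalty discussed above, though not necessarily the authors' exact C-LQAS model, is to inflate the simple-random-sampling size by the design effect. A minimal sketch with illustrative numbers:

        # Minimal sketch: inflate the SRS sample size by the design effect
        # DEFF = 1 + (m - 1) * ICC, a standard approximation (not the paper's
        # exact framework).
        import math

        def cluster_sample_size(n_srs: int, m: int, icc: float) -> int:
            """n_srs: SRS size; m: observations per cluster; icc: intra-cluster
            correlation. Returns the cluster-sample size for equal precision."""
            deff = 1.0 + (m - 1) * icc
            return math.ceil(n_srs * deff)

        # e.g., an LQAS design of n = 19 under SRS, 5 per cluster, ICC = 0.1
        print(cluster_sample_size(19, 5, 0.1))   # -> 27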

  16. Assessing acceptance sampling application in manufacturing electrical and electronic products

    Directory of Open Access Journals (Sweden)

    B.M. Deros

    2008-12-01

    Full Text Available Purpose: This paper discusses the use of the acceptance sampling technique as a practical tool for quality assurance applications to decide whether a lot is to be accepted or rejected. Design/methodology/approach: In Malaysia, the single attribute acceptance sampling plan is widely practiced for quality assurance purposes in manufacturing companies. The literature shows that the majority of past studies on acceptance sampling focused on the development and establishment of new methods for acceptance sampling application. However, none had investigated the relationship between acceptance sampling plan selection and the effectiveness of that selection. Therefore, in this study, the authors analyzed the effectiveness of the acceptance sampling plan application method and its implementation problems in manufacturing electrical and electronic products. The study was conducted using a case study methodology at three manufacturing companies, coded company A, B and C. In this paper, the authors share the case study companies' experience of acceptance sampling plan selection and the difficulties they faced during the course of implementing acceptance sampling in their production lines. Findings: The results from the three case study companies showed that by implementing acceptance sampling they could easily investigate and diagnose their suppliers' product quality immediately upon arrival at the company premises. Practical implications: Continuous improvement and review of the acceptance sampling plan are important to improve product quality and ensure continuous customer satisfaction. Originality/value: All three case study companies agreed that implementing acceptance sampling had improved their products' quality in the marketplace.
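
    The single attribute plan discussed above draws n items and accepts the lot if at most c are defective, so the probability of acceptance at a given incoming defect rate follows the binomial distribution. A minimal sketch of that operating-characteristic calculation, with an invented plan:

        # Minimal sketch: acceptance probability of a single attribute plan
        # (n, c) at incoming defect rate p, via the binomial distribution.
        from math import comb

        def p_accept(n: int, c: int, p: float) -> float:
            """P(accept) = sum_{d=0..c} C(n, d) p^d (1-p)^(n-d)."""
            return sum(comb(n, d) * p**d * (1 - p)**(n - d)
                       for d in range(c + 1))

        n, c = 80, 2   # hypothetical plan: sample 80 units, accept if <= 2 bad
        for p in (0.01, 0.02, 0.05):
            print(f"defect rate {p:.0%}: P(accept) = {p_accept(n, c, p):.3f}")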

  17. Sampling systems for visual field assessment and computerised perimetry.

    OpenAIRE

    Drasdo, N; Peaston, W C

    1980-01-01

    Three successive stages in the representations of the visual image are studied by computations from the best available data. The results are embodied in projections, drawn automatically. These projections are related to the assessment of visual disability and the dimensions of lesions in the retina and visual pathway. Fields can be assessed visually with the aid of graticules or directly during computerised perimetry.

  18. Development of automated measurement apparatus for the packaged sample in U8-type vessel and its application for the measurement of radioactive materials in environment

    International Nuclear Information System (INIS)

    The Fukushima No. 1 Nuclear Power Plant suffered major damage from the 2011 off the Pacific coast of Tohoku Earthquake and the subsequent tsunami on March 11, 2011, and released large amounts of radioactive materials. Measuring the radioactivity of many samples is necessary to investigate the behavior of radioactive materials released from the Nuclear Power Plant and contamination in the environment. For measuring these samples automatically, we developed an automated measurement apparatus. The apparatus is composed of a rotating table for placement of samples, a hand for moving samples, a movable lead shield for covering the detector, and a disposal container for samples after measurement. A high-purity germanium radiation detector of horizontal type is used for gamma-ray spectrometry. The apparatus is able to measure 14 packaged samples in U8-type vessels in succession. The series of operations is controlled by software based on LabVIEW (National Instruments Co.) and a sinking digital output module (NI9477). We report the results of the performance evaluation using environmental samples. (author)

  19. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    Science.gov (United States)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of the scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates huge image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, which represent images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D confocal image. The CAD scheme was applied to each confocal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In four scanned specimen slides, CAD generated 1676 confocal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cytogeneticists in detecting cervical cancers.
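
    The top-hat transform named above for spot detection keeps small bright features while suppressing the larger, smoother cell background. A minimal sketch on a synthetic image follows; the structuring-element size and threshold are illustrative, not the study's parameters:

        # Minimal sketch: a morphological white top-hat keeps small bright
        # features (FISH-like spots) and suppresses the larger cell background.
        import numpy as np
        from scipy import ndimage

        img = np.zeros((128, 128))
        img += 50                                   # flat background
        img[30:90, 30:90] += 40                     # a "cell" region
        img[50, 50] = img[60, 70] = 255             # two FISH-like spots

        tophat = ndimage.white_tophat(img, size=7)  # element larger than a spot
        spots, n = ndimage.label(tophat > 100)      # illustrative threshold
        print("detected spots:", n)                 # -> 2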

  20. An approach for the automated risk assessment of structural differences between spreadsheets (DiffXL)

    CERN Document Server

    Hunt, John

    2009-01-01

    This paper outlines an approach to manage and quantify the risks associated with changes made to spreadsheets. The methodology focuses on structural differences between spreadsheets and suggests a technique by which a risk analysis can be achieved in an automated environment. The paper offers an example that demonstrates how contiguous ranges of data can be mapped into a generic list of formulae, data and metadata. The example then shows that comparison of these generic lists can establish the structural differences between spreadsheets and quantify the level of risk that each change has introduced. Lastly the benefits, drawbacks and limitations of the technique are discussed in a commercial context.

  1. Equilibrium sampling for a thermodynamic assessment of contaminated sediments

    DEFF Research Database (Denmark)

    Mayer, Philipp; Nørgaard Schmidt, Stine; Mäenpää, Kimmo;

    Hydrophobic organic contaminants (HOCs) reaching the aquatic environment are largely stored in sediments. The risk of contaminated sediments is challenging to assess since traditional exhaustive extraction methods yield total HOC concentrations, whereas freely dissolved concentrations (Cfree...

  2. Automated cytochrome c oxidase bioassay developed for ionic liquids' toxicity assessment.

    Science.gov (United States)

    Costa, Susana P F; Martins, Bárbara S F; Pinto, Paula C A G; Saraiva, M Lúcia M F S

    2016-05-15

    A fully automated cytochrome c oxidase assay resorting to sequential injection analysis (SIA) was developed for the first time and implemented to evaluate potentially toxic compounds. The bioassay was validated by evaluation of 15 ionic liquids (ILs) with distinct cationic head groups, alkyl side chains and anions. The assay was based on the reduction of cytochrome c oxidase activity in the presence of the tested compounds and the quantification of the inhibitor concentration required to cause 50% inhibition of enzyme activity (EC50). The obtained results demonstrated that enzyme activity was considerably inhibited by the BF4 anion and by ILs incorporating non-aromatic pyrrolidinium and tetrabutylphosphonium cation cores. Emim [Ac] and chol [Ac], on the contrary, presented the highest EC50 values among the ILs tested. The developed automated SIA methodology is a simple and robust high-throughput screening bioassay and exhibited good repeatability in all the tested conditions (rsd<3.7%, n=10). Therefore, it is expected that due to its simplicity and low cost, the developed approach can be used as an alternative to traditional screening assays for the evaluation of IL toxicity and the identification of possible toxicophore structures. Additionally, the results presented in this study provide further information about IL toxicity. PMID:26894289
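
    EC50 estimation of the kind described above is typically done by fitting a sigmoidal dose-response curve to residual enzyme activity. A minimal sketch assuming a simple Hill model and invented data points (the paper does not specify its fitting procedure):

        # Minimal sketch: fit a Hill-type dose-response curve to residual
        # enzyme activity and read off the EC50. Data points are invented.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(c, ec50, h):
            """Fraction of remaining activity at inhibitor concentration c."""
            return 1.0 / (1.0 + (c / ec50) ** h)

        conc = np.array([0.01, 0.1, 1, 10, 100, 1000])   # mM, hypothetical
        activity = np.array([0.98, 0.93, 0.75, 0.42, 0.15, 0.04])

        (ec50, h), _ = curve_fit(hill, conc, activity, p0=(10, 1))
        print(f"EC50 = {ec50:.2f} mM (Hill slope {h:.2f})")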

  3. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    Science.gov (United States)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
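
    The core of the Gabor approach described above can be sketched compactly: filter the image with a single-frequency Gabor kernel tuned to the sarcomere spacing and map the response magnitude to [0, 1]. The synthetic test pattern and the max-based normalization below are crude stand-ins for the paper's dedicated normalization procedure:

        # Minimal sketch: single-frequency Gabor filtering of a striated image;
        # high, uniform response magnitude indicates regular sarcomere spacing.
        import numpy as np
        from skimage.filters import gabor

        period_px = 8                         # assumed sarcomere period (pixels)
        x = np.arange(256)
        img = 0.5 + 0.5 * np.sin(2 * np.pi * x / period_px)  # ideal striations
        img = np.tile(img, (256, 1))

        real, imag = gabor(img, frequency=1.0 / period_px)
        magnitude = np.hypot(real, imag)
        score = magnitude / magnitude.max()   # 1 ~ perfectly ordered (crude)
        print("mean order score:", score.mean().round(3))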

  4. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs

    Science.gov (United States)

    Gladhill, R.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Nolke, S.; Riddick, J.; Straub, J. A.

    2005-11-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. A production implementation of automated photomask manufacturing rule checking (MRC) is presented and discussed for various photomask lithography and inspection lines. This paper will focus on identifying data which may cause production delays at the mask inspection stage. It will be shown how photomask MRC can be used to discover data related problems prior to inspection, separating jobs which are likely to have problems at inspection from those which are not. Photomask MRC can also be used to identify geometries requiring adjustment of inspection parameters for optimal inspection, and to assist with any special handling or change of routing requirements. With this foreknowledge, steps can be taken to avoid production delays that increase manufacturing costs. Finally, the data flow implemented for MRC can be used as a platform for other photomask data preparation tasks.

  5. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California

    Science.gov (United States)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study was made demonstrating that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data can be carried out using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. Costs per acre for the inventory procedure, at 1.1 cents/acre, compared favorably with the costs of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low altitude photo plots indicated no significant differences in the overall classification accuracies.

  6. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-time radiation monitoring and sample authentication, whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to the headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system were reliable. (orig.)

  7. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    Science.gov (United States)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis which, for example, gives new insight into signalling of cellular water permeability and cell morphology changes due to toxins and infections. In the analysis of dissected tissues, quantitative DHM phase contrast also opens up application fields through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of the sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and common staining agents. As in DHM the reconstruction is performed numerically, multi-focus imaging is achieved from a single digital hologram. Thus, we evaluated the automated refocussing feature of DHM for application on different types of dissected tissues and revealed that highly reproducible holographic autofocussing can be achieved on moderately stained samples. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter that is related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the usage of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.

  8. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been a central focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  9. Exploring trait assessment of samples, persons, and cultures.

    Science.gov (United States)

    McCrae, Robert R

    2013-01-01

    I present a very broad overview of what I have learned about personality trait assessment at different levels and offer some views on future directions for research and clinical practice. I review some basic principles of scale development and argue that internal consistency has been overemphasized; more attention to retest reliability is needed. Because protocol validity is crucial for individual assessment and because validity scales have limited utility, I urge combining assessments from multiple informants, and I present some statistical tools for that purpose. As culture-level traits, I discuss ethos, national character stereotypes, and aggregated personality traits, and summarize evidence for the validity of the latter. Our understanding of trait profiles of cultures is limited, but it can guide future exploration. PMID:23924211

  10. Rapid assessment of soil and groundwater tritium by vegetation sampling

    International Nuclear Information System (INIS)

    A rapid and relatively inexpensive technique for defining the extent of groundwater contamination by tritium has been investigated. The technique uses existing vegetation to sample the groundwater. Water taken up by deep rooted trees is collected by enclosing tree branches in clear plastic bags. Water evaporated from the leaves condenses on the inner surface of the bag. The water is removed from the bag with a syringe. The bags can be sampled many times. Tritium in the water is detected by liquid scintillation counting. The water collected in the bags has no color and counts as well as distilled water reference samples. The technique was used in an area of known tritium contamination and proved to be useful in defining the extent of tritium contamination

  11. Sampling procedures for assessing accuracy of record linkage

    OpenAIRE

    Smith, Paul; Gammon, Shelley; Cummins, Sarah; Chatzoglou, Christos; Heasman, Dick

    2016-01-01

    The use of administrative datasets as a data source in official statistics has become much more common as there is a drive for more outputs to be produced more efficiently. Many outputs rely on linkage between two or more datasets, and this is often undertaken in a number of phases with different methods and rules. In these situations we would like to be able to assess the quality of the linkage, and this involves some re-assessment of both links and non-links. In this paper we discuss sampling...

  12. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    Science.gov (United States)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients > 0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r = 0.84) and anterior (r = 0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F < 0.01, p = 0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the potential to improve analyses of cam-type lesions of the FHN junction for large-scale morphometric and clinical MR studies.

  13. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    International Nuclear Information System (INIS)

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients > 0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r = 0.84) and anterior (r = 0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F < 0.01, p = 0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the potential to improve analyses of cam-type lesions of the FHN junction for large-scale morphometric and clinical MR studies.

  14. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. We explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from a topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
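
    The boundary-focused loop described above can be sketched compactly: fit a cheap surrogate classifier to the outcomes observed so far, then spend the next batch of expensive runs where the predicted failure probability is closest to 0.5, i.e. near the current limit-surface estimate. The sketch below is a minimal illustration of this idea under invented assumptions (a disc-shaped failure region, a k-nearest-neighbour surrogate), not the authors' Morse-Smale-based implementation.

    ```python
    # Minimal adaptive sampling sketch for limit-surface reconstruction.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def simulate(x):
        # Placeholder "expensive simulation": failure inside a disc (assumed).
        return int(x[0] ** 2 + x[1] ** 2 < 0.4)   # 1 = failure, 0 = success

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(20, 2))           # initial space-filling runs
    y = np.array([simulate(x) for x in X])         # seed chosen so both classes occur

    for _ in range(10):                            # adaptive refinement rounds
        surrogate = KNeighborsClassifier(n_neighbors=5).fit(X, y)
        cand = rng.uniform(-1, 1, size=(500, 2))   # cheap candidate pool
        p = surrogate.predict_proba(cand)[:, 1]    # predicted failure probability
        pick = cand[np.argsort(np.abs(p - 0.5))[:5]]  # nearest the limit surface
        X = np.vstack([X, pick])
        y = np.concatenate([y, [simulate(x) for x in pick]])
    ```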

  15. Automated system for generation of soil moisture products for agricultural drought assessment

    Science.gov (United States)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are being used globally to forecast and provide early warning of drought and to monitor its prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources like space-based data, ground data and collateral data at short intervals of time, where there may be limitations in terms of processing power, availability of domain expertise, and specialized models and tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model, widely used to derive soil moisture products, is popular for its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open source libraries for the best possible automation, to fulfill the need for a standard procedure of preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for the generation of soil moisture products, allowing users to concentrate on further enhancements and the application of these products in related areas of research, without re-discovering the established models. The emphasis of the architecture is on available open source libraries for GIS and raster I/O operations on different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for everyday generation of products at specified intervals. The operational software has inbuilt capabilities to automatically
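
    The "bucket" model the system encodes can be illustrated with a single-layer soil water balance: rainfall fills the bucket, evapotranspiration drains it in proportion to wetness, and anything above capacity leaves as runoff. The sketch below is a textbook simplification with invented parameters, not the system's actual implementation.

    ```python
    # Single-layer soil water balance "bucket"; illustrative parameters only.
    def bucket_model(rain, pet, capacity=150.0, s0=75.0):
        """rain, pet: daily rainfall and potential evapotranspiration (mm)."""
        s, series = s0, []
        for p, e in zip(rain, pet):
            aet = e * (s / capacity)                   # actual ET scaled by wetness
            s = max(0.0, min(s + p - aet, capacity))   # overflow beyond capacity -> runoff
            series.append(s / capacity)                # relative soil moisture (0..1)
        return series

    print(bucket_model(rain=[0, 12, 3, 0, 25], pet=[4, 4, 5, 5, 4]))
    ```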

  16. A Robust and Automated Hyperspectral Damage Assessment System Under Varying Illumination Conditions and Viewing Geometry Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Some target signatures of interest in drought monitoring, flooding assessment, fire damage assessment, coastal changes, urban changes, etc. may need to be tracked...

  17. Automated finite element updating using strain data for the lifetime reliability assessment of bridges

    International Nuclear Information System (INIS)

    The importance of improving the understanding of the performance of structures over their lifetime under uncertainty with information obtained from structural health monitoring (SHM) has been widely recognized. However, frameworks that efficiently integrate monitoring data into the life-cycle management of structures are yet to be developed. The objective of this paper is to propose and illustrate an approach for updating the lifetime reliability of aging bridges using monitored strain data obtained from crawl tests. It is proposed to use automated finite element model updating techniques as a tool for updating the resistance parameters of the structure. In this paper, the results from crawl tests are used to update the finite element model and, in turn, update the lifetime reliability. The original and updated lifetime reliabilities are computed using advanced computational tools. The approach is illustrated on an existing bridge.
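
    At its core, such updating is a small inverse problem: choose the resistance parameters that minimize the misfit between model-predicted strains and strains measured during the crawl test. In the sketch below, a one-parameter linear response stands in for a real finite element solver and all numbers are invented.

    ```python
    # Least-squares parameter updating against measured crawl-test strains.
    import numpy as np
    from scipy.optimize import least_squares

    loads = np.array([100.0, 200.0, 300.0])      # crawl-test load cases (illustrative)
    measured = np.array([21.0, 41.5, 63.2])      # micro-strain from SHM sensors

    def predicted_strain(ei, loads):
        # Stand-in for a finite element solve with stiffness parameter ei.
        return 21.0 * loads / ei

    fit = least_squares(lambda ei: predicted_strain(ei[0], loads) - measured, x0=[80.0])
    print("updated stiffness parameter:", fit.x[0])
    ```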

  18. Mass asymmetry and tricyclic wobble motion assessment using automated launch video analysis

    Institute of Scientific and Technical Information of China (English)

    Ryan DECKER; Joseph DONINI; William GARDNER; Jobin JOHN; Walter KOENIG

    2016-01-01

    This paper describes an approach to identify epicyclic and tricyclic motion during projectile flight caused by mass asymmetries in spin-stabilized projectiles. Flight video was captured following projectile launch of several M110A2E1 155 mm artillery projectiles. These videos were then analyzed using the automated flight video analysis method to obtain their initial position and orientation histories. Examination of the pitch and yaw histories clearly indicates that in addition to epicyclic motion's nutation and precession oscillations, an even faster wobble is present during each spin revolution, even though some of the oscillation amplitudes are smaller than 0.02 degree. The results are compared to a sequence of shots where no appreciable mass asymmetry was present, in which only nutation and precession frequencies are predominantly apparent in the motion history results. Magnitudes of the wobble motion are estimated and compared to product of inertia measurements of the asymmetric projectiles.
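
    Separating the epicyclic components amounts to finding the dominant oscillation frequencies in the pitch (or yaw) history extracted from the video. A minimal spectral version of that step is sketched below on a synthetic signal; the frame rate, frequencies and amplitudes are invented for illustration.

    ```python
    # Pick out precession, nutation and wobble frequencies from a pitch history.
    import numpy as np

    fs = 500.0                                      # frames per second (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)
    pitch = (2.00 * np.sin(2 * np.pi * 8 * t)       # precession (illustrative)
             + 0.50 * np.sin(2 * np.pi * 40 * t)    # nutation
             + 0.02 * np.sin(2 * np.pi * 170 * t))  # small per-revolution wobble
    spec = np.abs(np.fft.rfft(pitch))
    freqs = np.fft.rfftfreq(pitch.size, 1.0 / fs)
    print(sorted(freqs[np.argsort(spec)[-3:]]))     # -> [8.0, 40.0, 170.0] Hz
    ```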

  19. Universality of Generalized Bunching and Efficient Assessment of Boson Sampling

    Science.gov (United States)

    Shchesnovich, V. S.

    2016-03-01

    It is found that identical bosons (fermions) show a generalized bunching (antibunching) property in linear networks: the absolute maximum (minimum) of the probability that all N input particles are detected in a subset of K output modes of any nontrivial linear M-mode network is attained only by completely indistinguishable bosons (fermions). For fermions K is arbitrary; for bosons it is either (i) arbitrary for only classically correlated bosons or (ii) satisfies K ≥ N (or K = 1) for arbitrary input states of N particles. The generalized bunching allows us to certify in a polynomial in N number of runs that a physical device realizing boson sampling with an arbitrary network operates in the regime of full quantum coherence compatible only with completely indistinguishable bosons. The protocol needs only polynomial classical computations for the standard boson sampling, whereas an analytic formula is available for the scattershot version.

  20. Standard Format for Chromatographic-polarimetric System small samples assessment

    International Nuclear Information System (INIS)

    The treatment of samples containing optically active substances, to be evaluated as part of the quality control of raw material entering an industrial process and during the modifications exerted on it to obtain the desired final composition, is still an unsolved problem for many industries. That is the case in the sugarcane industry. The problem is sometimes compounded because the samples to be evaluated are no bigger than one milliliter. Reduced gel beds in G-10 and G-50 chromatographic columns, with an inner diameter of 16 mm instead of 25 mm and bed heights adjustable to requirements by means of sliding stoppers to increase analytical power, were evaluated with glucose and sucrose standards in concentrations from 1 to 10 g/dL, using aliquots of 1 mL without undesirable dilutions that could affect either detection or the chromatographic profile. Assays with seaweed extracts gave good results, which are shown. The advantage of determining the concentration of a separated substance from the height of its peak is established, along with the resulting savings in time and reagents. The expanded uncertainty of samples in both systems is compared. Several programs for data acquisition, storage and processing are also presented. (Author)

  1. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    International Nuclear Information System (INIS)

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response-adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. A Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early into the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival. This was significant in patients with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response-adaptive therapy.
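
    The volume readout rests on a standard identity: for a registration with displacement field u, the local volume change is the Jacobian determinant J = det(I + ∇u), and averaging J over the tumor mask gives the relative volume. A small sketch of this step, with a random field standing in for a real CBCT-to-CBCT registration:

    ```python
    # Mean Jacobian determinant of a displacement field over a tumor mask.
    import numpy as np

    def relative_volume(disp, mask, spacing=(1.0, 1.0, 1.0)):
        """disp: (3, z, y, x) displacement field in voxel units; mask: tumor voxels."""
        grads = [np.gradient(disp[i], *spacing) for i in range(3)]
        jac = np.zeros(mask.shape)
        for idx in np.ndindex(mask.shape):
            g = np.array([[grads[i][j][idx] for j in range(3)] for i in range(3)])
            jac[idx] = np.linalg.det(np.eye(3) + g)   # local volume change
        return float(jac[mask].mean())                # relative tumor volume

    mask = np.zeros((8, 8, 8), dtype=bool)
    mask[2:6, 2:6, 2:6] = True                        # toy tumor region
    disp = np.random.default_rng(1).normal(0.0, 0.01, size=(3, 8, 8, 8))
    print(relative_volume(disp, mask))                # ~1.0 -> no regression
    ```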

  2. A lab-on-a-chip system integrating tissue sample preparation and multiplex RT-qPCR for gene expression analysis in point-of-care hepatotoxicity assessment.

    Science.gov (United States)

    Lim, Geok Soon; Chang, Joseph S; Lei, Zhang; Wu, Ruige; Wang, Zhiping; Cui, Kemi; Wong, Stephen

    2015-10-21

    A truly practical lab-on-a-chip (LOC) system for point-of-care testing (POCT) hepatotoxicity assessment necessitates the embodiment of full-automation, ease-of-use and "sample-in-answer-out" diagnostic capabilities. To date, the reported microfluidic devices for POCT hepatotoxicity assessment remain rudimentary as they largely embody only semi-quantitative or single sample/gene detection capabilities. In this paper, we describe, for the first time, an integrated LOC system that is somewhat close to a practical POCT hepatotoxicity assessment device - it embodies both tissue sample preparation and multiplex real-time RT-PCR. It features semi-automation, is relatively easy to use, and has "sample-in-answer-out" capabilities for multiplex gene expression analysis. Our tissue sample preparation module incorporating both a microhomogenizer and surface-treated paramagnetic microbeads yielded high purity mRNA extracts, considerably better than manual means of extraction. A primer preloading surface treatment procedure and the single-loading inlet on our multiplex real-time RT-PCR module simplify off-chip handling procedures for ease-of-use. To demonstrate the efficacy of our LOC system for POCT hepatotoxicity assessment, we perform a preclinical animal study with the administration of cyclophosphamide, followed by gene expression analysis of two critical protein biomarkers for liver function tests, aspartate transaminase (AST) and alanine transaminase (ALT). Our experimental results depict normalized fold changes of 1.62 and 1.31 for AST and ALT, respectively, illustrating up-regulations in their expression levels and hence validating their selection as critical genes of interest. In short, we illustrate the feasibility of multiplex gene expression analysis in an integrated LOC system as a viable POCT means for hepatotoxicity assessment. PMID:26329655
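
    The normalized fold changes quoted are the kind of quantity produced by the widely used 2^-ΔΔCt analysis of RT-qPCR data, in which the target gene is normalized against a reference gene in treated versus control samples. Whether this exact scheme was used is not stated in the abstract; the Ct values below are invented so that the result lands near the reported 1.62.

    ```python
    # Relative quantification of gene expression by the 2^-ddCt method.
    def fold_change(ct_target_trt, ct_ref_trt, ct_target_ctl, ct_ref_ctl):
        ddct = (ct_target_trt - ct_ref_trt) - (ct_target_ctl - ct_ref_ctl)
        return 2.0 ** (-ddct)

    # e.g. AST vs. a housekeeping gene, treated vs. control animals (invented Ct)
    print(fold_change(22.3, 18.0, 23.0, 18.0))  # ~1.62-fold up-regulation
    ```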

  3. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    Science.gov (United States)

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414
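
    The difference between the sampling protocols comes down to where the 50 × 50-μm counting window is placed: at fixed positions along a meridian, or wherever density peaks within a search region. A toy version with synthetic cone coordinates (a real implementation would operate on AO montages with scale correction):

    ```python
    # Fixed-interval vs. peak cone density from a 50 x 50 um sampling window.
    import numpy as np

    def window_density(xy, cx, cy, w=50.0):
        """Cones per mm^2 inside a w x w um window centred at (cx, cy)."""
        inside = (np.abs(xy[:, 0] - cx) < w / 2) & (np.abs(xy[:, 1] - cy) < w / 2)
        return inside.sum() / (w * w * 1e-6)          # um^2 -> mm^2

    rng = np.random.default_rng(2)
    cones = rng.uniform(0, 500, size=(4000, 2))       # detected cone centres (um)

    fixed = window_density(cones, 250, 250)           # fixed-interval style
    peak = max(window_density(cones, x, y)            # peak-density style
               for x in range(100, 401, 10) for y in range(100, 401, 10))
    print(fixed, peak)                                # peak >= fixed by construction
    ```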

  4. Attempt at in-air PIXE analysis of spot samples on a filter-tape mounted in an automated beta-ray absorption mass monitor

    International Nuclear Information System (INIS)

    We attempted in-air PIXE analysis of suspended particulate matter (SPM) using spot samples on a filter-tape mounted in an automated beta-ray absorption mass monitor. Al, Si, S, Fe and Zn, etc., which are of interest for identifying the behavior and characteristics of SPM, were detected on the SPM spot samples on a glass-fiber filter-tape, but the peaks of these elements were nearly identical to those of blank glass-fiber filter-tape. As such, it was difficult to detect elements present in SPM from the X-ray spectra of the spot samples. On the other hand, in the case of a PTFE membrane filter-tape, the S peak was distinct and the Fe peak was also clear, and peaks for elements Al, Mn and Zn, etc., were also confirmed. Consequently, if a method for quantification is established, direct multi-elemental analysis by in-air PIXE of high time-resolution SPM spot samples collected on a PTFE membrane filter-tape mounted in an SPM monitor will be possible. (author)

  5. Simple semi-automated portable capillary electrophoresis instrument with contactless conductivity detection for the determination of β-agonists in pharmaceutical and pig-feed samples.

    Science.gov (United States)

    Nguyen, Thi Anh Huong; Pham, Thi Ngoc Mai; Doan, Thi Tuoi; Ta, Thi Thao; Sáiz, Jorge; Nguyen, Thi Quynh Hoa; Hauser, Peter C; Mai, Thanh Duc

    2014-09-19

    An inexpensive, robust and easy-to-use portable capillary electrophoresis instrument with miniaturized high-voltage capacitively coupled contactless conductivity detection was developed. The system utilizes pneumatic operation to manipulate the solutions for all flushing steps. The different operations, i.e. capillary flushing, interface rinsing, and electrophoretic separation, are easily activated by turning an electronic switch. To allow the analysis of samples of limited volume, and to render the construction less complicated than a computer-controlled counterpart, sample injection is carried out hydrodynamically, directly from the sample vial into the capillary by manual syphoning. The system is a well-performing solution where the financial means for highly expensive commercial instruments are not available and where the in-house construction of a sophisticated automated instrument is not possible due to limited mechanical and electronic workshop facilities and software programming expertise. For demonstration, the system was employed successfully for the determination of several β-agonists, namely salbutamol, metoprolol and ractopamine, down to 0.7 ppm in pharmaceutical and pig-feed sample matrices in Vietnam. PMID:25115456

  6. Assessing rare earth elements in quartz rich geological samples.

    Science.gov (United States)

    Santoro, A; Thoss, V; Guevara, S Ribeiro; Urgast, D; Raab, A; Mastrolitti, S; Feldmann, J

    2016-01-01

    Sodium peroxide (Na2O2) fusion coupled to Inductively Coupled Plasma Tandem Mass Spectrometry (ICP-MS/MS) measurements was used to rapidly screen quartz-rich geological samples for rare earth element (REE) content. The method's accuracy was checked against a geological reference material and Instrumental Neutron Activation Analysis (INAA) measurements. The mass-mode combinations used gave accurate results (the only exception being (157)Gd in He gas mode), with recovery of the geological reference material QLO-1 between 80% and 98% (lower values for Lu, Nd and Sm), in general comparable to INAA measurements. Low limits of detection were achieved for all elements, generally below 10 pg g(-1), as well as measurement repeatability below 15%. Overall, the Na2O2/ICP-MS/MS method proved to be a suitable lab-based method to quickly and accurately screen rock samples originating from quartz-rich geological areas for rare earth element content; it is particularly useful for checking commercial viability. PMID:26595776

  7. Regional groundwater sampling for the assessment of fluid movement

    International Nuclear Information System (INIS)

    On a regional scale groundwater flow is considered to take place within high permeability lithologies which are bounded by low permeability strata often designated as impermeable. These are the sort of environments proposed for feasibility studies for nuclear waste disposal which present problems in terms of choosing relevant groundwater sampling locations and subsequent data interpretation. Hydrogeological studies of a sequence of gently dipping clays and limestones in central England, suggest that cross-formational flows have a major influence on groundwater chemistry and provenance. The hydrogeochemical study was based on the tenet that very slow groundwater movement is most easily observed by measuring time-related parameters, such as element chemistry, inert gas contents and stable isotopes, at widely separated measurement points within a sequence of high permeability lithologies. The occurrence and scale of cross-formational flow, particularly through the clay lithologies is being evaluated by long-term sampling of pressure heads and groundwater chemistry from clay formations in Oxfordshire. Basic chemistry, natural series isotopes and dissolved gas contents measured from fluids in these low flow environments are expected to substantiate the model of inter-lithology water movement in general and vertical water movement across clay lithologies in particular. 10 references, 4 figures, 1 table

  8. A neural-symbolic system for automated assessment in training simulators - A position paper

    NARCIS (Netherlands)

    Penning, H.L.H. de; Kappé, B.; Bosch, K. van den

    2009-01-01

    Performance assessment in training simulators is a complex task. It requires monitoring and interpreting the student’s behaviour in the simulator using knowledge of the training task, the environment and a lot of experience. Assessment in simulators is therefore generally done by human observers. To

  9. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    Directory of Open Access Journals (Sweden)

    Colin J Torney

    Full Text Available Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future.

  10. A new approach to automated assessment of fractionation of endocardial electrograms during atrial fibrillation

    International Nuclear Information System (INIS)

    Complex fractionated atrial electrograms (CFAEs) may represent the electrophysiological substrate for atrial fibrillation (AF). Progress in signal processing algorithms to identify sites of CFAEs is crucial for the development of AF ablation strategies. A novel algorithm for automated description of fractionation of atrial electrograms (A-EGMs) based on the wavelet transform has been proposed. The algorithm was developed and validated using a representative set of 1.5 s A-EGM (n = 113) ranked by three experts into four categories: 1—organized atrial activity; 2—mild; 3—intermediate; 4—high degree of fractionation. A tight relationship between a fractionation index and expert classification of A-EGMs (Spearman correlation ρ = 0.87) was documented with a sensitivity of 82% and specificity of 90% for the identification of highly fractionated A-EGMs. This operator-independent description of A-EGM complexity may be easily incorporated into mapping systems to facilitate CFAE identification and to guide AF substrate ablation
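
    One plausible reading of such a wavelet-based index is the share of electrogram energy in the detail (high-frequency) bands, which is near zero for organized activity and grows with fractionation. The sketch below, using PyWavelets on synthetic 1.5 s signals, is only in the spirit of the abstract and is not the published algorithm.

    ```python
    # Energy share in wavelet detail bands as a crude fractionation score.
    import numpy as np
    import pywt

    def fractionation_index(egm, wavelet="db4", level=4):
        coeffs = pywt.wavedec(egm, wavelet, level=level)
        detail_energy = sum(float(np.sum(c ** 2)) for c in coeffs[1:])
        return detail_energy / float(np.sum(egm ** 2))

    fs = 1000
    t = np.arange(0, 1.5, 1.0 / fs)                 # 1.5 s A-EGM, as in the study
    organized = np.sin(2 * np.pi * 5 * t)           # smooth organized activity
    fractionated = organized + 0.8 * np.random.default_rng(3).normal(size=t.size)
    print(fractionation_index(organized), fractionation_index(fractionated))
    ```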

  11. Locoregional control of non-small cell lung cancer in relation to automated early assessment of tumor regression on cone beam computed tomography

    DEFF Research Database (Denmark)

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders;

    2014-01-01

    PURPOSE: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its poten...

  12. Automated image segmentation and registration of vessel wall MRI for quantitative assessment of carotid artery vessel wall dimensions and plaque composition

    NARCIS (Netherlands)

    Klooster, Ronald van 't

    2014-01-01

    The main goal of this thesis was to develop methods for automated segmentation, registration and classification of the carotid artery vessel wall and plaque components using multi-sequence MR vessel wall images to assess atherosclerosis. First, a general introduction into atherosclerosis and differe

  13. Automated large scale parameter extraction of road-side trees sampled by a laser mobile mapping system

    NARCIS (Netherlands)

    Lindenbergh, R.C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-01-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surrounding including notable roadsid

  14. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    OpenAIRE

    Farzad Jalaei; Ahmad Jrade

    2014-01-01

    Quantifying the environmental impacts and simulating the energy consumption of building’s components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative that would lead to a more energy efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategi...

  15. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    Energy Technology Data Exchange (ETDEWEB)

    Rosato, Antonio [University of Florence, Department of Chemistry and Magnetic Resonance Center (Italy); Vranken, Wim [Vrije Universiteit Brussel, Structural Biology Brussels (Belgium); Fogh, Rasmus H.; Ragan, Timothy J. [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom); Tejero, Roberto [Universidad de Valencia, Departamento de Química Física (Spain); Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H. [University of Georgia, Complex Carbohydrate Research Center and Northeast Structural Genomics Consortium (United States); Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H. [University of Toronto, Department of Medical Biophysics, Cancer Genomics and Proteomics, Ontario Cancer Institute, Northeast Structural Genomics Consortium (Canada); Kennedy, Michael [Miami University, Department of Chemistry and Biochemistry, Northeast Structural Genomics Consortium (United States); Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, Northeast Structural Genomics Consortium, Rutgers (United States); Vuister, Geerten W., E-mail: gv29@le.ac.uk [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom)

    2015-08-15

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  16. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    International Nuclear Information System (INIS)

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged

  17. Automating the single crystal x-ray diffraction experiment

    OpenAIRE

    Light, Mark

    2004-01-01

    Ever-decreasing data collection times and an explosion in demand present us with a situation where an automated single crystal instrument is not only advantageous but essential. With recent developments in software, instrumentation and robotics it has been possible to fully automate structure determination from mounted crystal to completed crystal structure. In Southampton we have developed a system that takes pre-mounted samples, loads them onto the diffractometer, and assesses their diffraction...

  18. Evaluation of the RapidHIT™ 200, an automated human identification system for STR analysis of single source samples.

    Science.gov (United States)

    Holland, Mitchell; Wendt, Frank

    2015-01-01

    The RapidHIT™ 200 Human Identification System was evaluated to determine its suitability for STR analysis of single source buccal swabs. Overall, the RapidHIT™ 200 performed as well as our traditional capillary electrophoresis based method in producing useable profile information on a first-pass basis. General observations included 100% concordance with known profile information, consistent instrument performance after two weeks of buccal swab storage, and an absence of contamination in negative controls. When data analysis was performed by the instrument software, 95.3% of the 85 samples in the reproducibility study gave full profiles. Including the 81 full profiles, a total of 2682 alleles were correctly called by the instrument software, or 98.6% of 2720 possible alleles tested. Profile information was generated from as little as 10,000 nucleated cells, with swab collection technique being a major contributing factor to profile quality. The average peak-height-ratio for heterozygote profiles (81%) was comparable to conventional STR analysis, and while a high analytical threshold was required when offline profile analysis was performed (800 RFU), it was proportionally consistent with traditional methods. Stochastic sampling effects were evaluated, and a manageable approach to address limits of detection for homozygote profiles is provided. These results support consideration of the RapidHIT™ 200 as an acceptable alternative to conventional, laboratory based STR analysis for the testing of single source buccal samples, with review of profile information as a requirement until an expert software system is incorporated, and when proper developmental and internal validation studies have been completed. PMID:25286443

  19. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    Science.gov (United States)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or an incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g. descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
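
    The two candidate detectors each reduce to a few lines when applied to a stream of IMU acceleration samples. In the sketch below, the thresholds, sample rate and synthetic impact pulse are placeholders, not flight values.

    ```python
    # Acceleration-magnitude spike vs. accumulated-velocity-change detection.
    import numpy as np

    def accel_spike(acc, thresh_g=6.0):
        """Trigger at the first sample where |a| exceeds a magnitude threshold."""
        mag = np.linalg.norm(acc, axis=1) / 9.81
        hits = np.flatnonzero(mag > thresh_g)
        return int(hits[0]) if hits.size else None

    def delta_v_spike(acc, fs, window_s=0.05, thresh_mps=2.0):
        """Trigger when velocity change accumulated over a sliding window exceeds a threshold."""
        n = max(1, int(window_s * fs))
        mag = np.linalg.norm(acc, axis=1)
        dv = np.convolve(mag, np.ones(n), mode="full")[: len(acc)] / fs
        hits = np.flatnonzero(dv > thresh_mps)
        return int(hits[0]) if hits.size else None

    fs = 200.0
    t = np.arange(0, 3, 1 / fs)
    acc = np.zeros((t.size, 3))
    acc[:, 2] = 9.81                     # quiescent descent (specific force ~1 g)
    acc[500:510, 2] += 80.0              # synthetic 50 ms water-impact pulse
    print(accel_spike(acc), delta_v_spike(acc, fs))
    ```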

  20. Calibration of a liquid scintillation counter to assess tritium levels in various samples

    CERN Document Server

    Al-Haddad, M N; Abu-Jarad, F A

    1999-01-01

    An LKB-Wallac 1217 Liquid Scintillation Counter (LSC) was calibrated with a newly adopted cocktail. The LSC was then used to measure tritium levels in various samples to assess the compliance of tritium levels with the recommended international levels. The counter was calibrated to measure both biological and operational samples for personnel and for an accelerator facility at KFUPM. The biological samples include the bioassay (urine), saliva, and nasal tests. The operational samples of the light ion linear accelerator include target cooling water, organic oil, fomblin oil, and smear samples. Sets of standards, which simulate various samples, were fabricated using traceable certified tritium standards. The efficiency of the counter was obtained for each sample. The typical range of the efficiencies varied from 33% for smear samples down to 1.5% for organic oil samples. A quenching curve for each sample is presented. The minimum detectable activity for each sample was established. Typical tritium levels in bio...
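
    The per-sample counting efficiency and minimum detectable activity (MDA) follow textbook relations; the sketch below uses the Currie formula for the MDA with invented numbers, and is not necessarily the exact procedure followed in the paper.

    ```python
    # Counting efficiency from a standard, and Currie MDA for a sample type.
    def efficiency(net_cpm_standard, activity_bq):
        """Fraction of decays registered: (counts/s) per Bq."""
        return (net_cpm_standard / 60.0) / activity_bq

    def mda_bq(background_cpm, count_min, eff):
        b = background_cpm * count_min                 # background counts
        ld = 2.71 + 4.65 * b ** 0.5                    # Currie detection limit (counts)
        return ld / (eff * count_min * 60.0)           # counts -> Bq

    eff = efficiency(net_cpm_standard=9000.0, activity_bq=500.0)   # invented standard
    print(eff, mda_bq(background_cpm=20.0, count_min=60, eff=eff))
    ```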

  1. Strong Prognostic Value of Tumor-infiltrating Neutrophils and Lymphocytes Assessed by Automated Digital Image Analysis in Early Stage Cervical Cancer

    DEFF Research Database (Denmark)

    Carus, Andreas; Donskov, Frede; Switten Nielsen, Patricia;

    2014-01-01

    INTRODUCTION Manual observer-assisted stereological (OAS) assessments of tumor-infiltrating neutrophils and lymphocytes are prognostic and accurate, but cumbersome. We assessed the applicability of automated digital image analysis (DIA). METHODS Visiomorph software was used to obtain DIA densities of ... prognostically strongest manual OAS assessments in the peritumoral compartment. In multivariate analysis, CD66b and CD8 densities, assessed by DIA, and regional lymph node metastases were independent predictors of RFS, while CD163 density and FIGO stage were not. The CD66b/CD8 tumor-associated neutrophil to ...

  2. Test-retest reliability analysis of the Cambridge Neuropsychological Automated Tests for the assessment of dementia in older people living in retirement homes.

    Science.gov (United States)

    Gonçalves, Marta Matos; Pinho, Maria Salomé; Simões, Mário R

    2016-01-01

    The validity of the Cambridge Neuropsychological Automated Tests has been widely studied, but their reliability has not. This study aimed to estimate the test-retest reliability of these tests in a sample of 34 older adults, aged 69 to 90 years, without neuropsychiatric diagnoses and living in retirement homes in the district of Lisbon, Portugal. The battery was administered twice, with a 4-week interval between sessions. The Paired Associates Learning (PAL), Spatial Working Memory (SWM), Rapid Visual Information Processing, and Reaction Time tests revealed measures with high-to-adequate test-retest correlations (.71-.89), although several PAL and SWM measures showed susceptibility to practice effects. Two standardized regression-based methods were found to be more efficient at correcting for practice effects than a fixed-correction method. We also found weak test-retest correlations (.56-.68) for several measures. These results suggest that some, but not all, measures are suitable for cognitive assessment and monitoring in this population. PMID:26574661
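
    A standardized regression-based (SRB) correction predicts each retest score from the baseline score in a reference sample and expresses the observed retest as a z-score, so that the average practice effect is absorbed by the regression. A minimal sketch with invented data:

    ```python
    # SRB correction: regress retest on baseline in a reference sample.
    import numpy as np

    base = np.array([10.0, 12, 14, 15, 17, 18, 20, 22])   # reference session 1
    retest = base + 1.5 + np.random.default_rng(4).normal(0, 0.8, base.size)

    slope, intercept = np.polyfit(base, retest, 1)
    see = np.std(retest - (slope * base + intercept), ddof=2)  # residual SD

    def srb_z(score1, score2):
        """z-score of the observed retest vs. the practice-adjusted prediction."""
        return (score2 - (slope * score1 + intercept)) / see

    print(srb_z(16.0, 17.5))   # ~0 -> change explained by the practice effect
    ```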

  3. Automated texture scoring for assessing breast cancer masking risk in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Petersen, Kersten; Lilholm, Martin;

    PURPOSE: The goal of this work is to develop a method to identify women at high risk for having breast cancer that is easily missed in regular mammography screening. Such a method will provide a rationale for selecting women for adjunctive screening. It goes beyond current risk assessment models ... five-fold cross validation. To assess the independence of the texture scores from breast density, density was determined for each image using Volpara. RESULTS: The odds ratios for interval cancer were 1.59 (95%CI: 0.76-3.32), 2.07 (1.02-4.20), and 3.14 (1.60-6.17) for quartiles 2, 3 and 4 respectively ... with the risk of having a breast cancer that is missed in regular mammography screening. As such it offers opportunities to further enhance personalized breast cancer screening.

  4. Assessing breast cancer masking risk with automated texture analysis in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lilholm, Martin; Diao, Pengfei;

    2015-01-01

    PURPOSE The goal of this work is to develop a method to assess the risk of breast cancer masking, based on image characteristics beyond breast density. METHOD AND MATERIALS From the Dutch breast cancer screening program we collected 285 screen-detected cancers, and 109 cancers that were screen ... (Quartile 1/2) versus high (Quartile 3/4) texture risk score. We computed odds ratios (OR) for breast cancer masking risk (i.e. interval versus screen-detected cancer) for each of the subgroups. The OR was 1.63 (1.04-2.53 95%CI) for the high dense group (as compared to the low dense group), whereas for the ... assessing the risk that a breast cancer is masked in regular mammography, independently of breast density. As such it offers opportunities to further enhance personalized breast cancer screening, beyond breast density.

  5. Computer Man Simulation of Incapacitation: An Automated Approach to Wound Ballistics and Associated Medical Care Assessments

    OpenAIRE

    Clare, V.; Ashman, W.; Broome, P; Jameson, J.; Lewis, J.; Merkler, J.; Mickiewicz, A.; Sacco, W.; Sturdivan, L.

    1981-01-01

    Wound ballistics assessments traditionally have been based on correlations between some quantification of "ballistic dose" and an empirical/subjective medical quantification of human functional degradation. Although complicated by the highly inhomogeneous nature of the human body and by the voluminous data-handling requirements, these correlation values were obtained by manual methods. The procedure required a substantial commitment of time and resources, thereby restricting the data base from

  6. Automated breast tissue density assessment using high order regional texture descriptors in mammography

    Science.gov (United States)

    Law, Yan Nei; Lieng, Monica Keiko; Li, Jingmei; Khoo, David Aik-Aun

    2014-03-01

    Breast cancer is the most common cancer and second leading cause of cancer death among women in the US. The relative survival rate is lower among women with a more advanced stage at diagnosis. Early detection through screening is vital. Mammography is the most widely used and only proven screening method for reliably and effectively detecting abnormal breast tissues. In particular, mammographic density is one of the strongest breast cancer risk factors, after age and gender, and can be used to assess the future risk of disease before individuals become symptomatic. A reliable method for automatic density assessment would be beneficial and could assist radiologists in the evaluation of mammograms. To address this problem, we propose a density classification method which uses statistical features from different parts of the breast. Our method is composed of three parts: breast region identification, feature extraction and building ensemble classifiers for density assessment. It explores the potential of the features extracted from second and higher order statistical information for mammographic density classification. We further investigate the registration of bilateral pairs and time-series of mammograms. The experimental results on 322 mammograms demonstrate that (1) a classifier using features from dense regions has higher discriminative power than a classifier using only features from the whole breast region; (2) these high-order features can be effectively combined to boost the classification accuracy; (3) a classifier using these statistical features from dense regions achieves 75% accuracy, which is a significant improvement from 70% accuracy obtained by the existing approaches.
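
    "Second and higher order statistical information" typically refers to gray-level co-occurrence (GLCM) features such as contrast, homogeneity, energy and correlation. The sketch below computes such features with scikit-image on a synthetic patch; it illustrates the feature family, not the authors' exact descriptor set.

    ```python
    # GLCM texture features of a mammogram patch (synthetic stand-in here).
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

    patch = (np.random.default_rng(5).uniform(0, 1, (64, 64)) * 255).astype(np.uint8)
    glcm = graycomatrix(patch, distances=[1, 3], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    features = np.concatenate([graycoprops(glcm, p).ravel() for p in
                               ("contrast", "homogeneity", "energy", "correlation")])
    print(features)   # 16-element vector: one entry per (property, distance, angle)
    ```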

  7. An automated procedure for the assessment of white matter hyperintensities by multispectral (T1, T2, PD) MRI and an evaluation of its between-centre reproducibility based on two large community databases

    Energy Technology Data Exchange (ETDEWEB)

    Maillard, Pauline; Delcroix, Nicolas; Crivello, Fabrice; Gicquel, Sebastien; Joliot, Marc; Tzourio-Mazoyer, Nathalie [GIP Cyceron, Centre d' Imagerie-Neurosciences et Applications aux Pathologies, CI-NAPS, CNRS, CEA, Universite de Caen/Universite Paris Descartes, Boulevard Becquerel, BP 5229, Caen (France); Dufouil, Carole; Alperovitch, Annick; Tzourio, Christophe [Universite Pierre et Marie Curie, INSERM U708, Neuroepidemiologie, Paris (France); Mazoyer, Bernard [GIP Cyceron, Centre d' Imagerie-Neurosciences et Applications aux Pathologies, CI-NAPS, CNRS, CEA, Universite de Caen/Universite Paris Descartes, Boulevard Becquerel, BP 5229, Caen (France); Institut Universitaire de France, Paris (France); CHU du Caen, Unite IRM, Caen (France)

    2008-01-15

    An automated procedure for the detection, quantification, localization and statistical mapping of white matter hyperintensities (WMH) on T2-weighted magnetic resonance (MR) images is presented and validated based on the results of a between-centre reproducibility study. The first step is the identification of white matter (WM) tissue using a multispectral (T1, T2, PD) segmentation. In a second step, WMH are identified within the WM tissue by segmenting T2 images, isolating two different classes of WMH voxels - low- and high-contrast WMH voxels, respectively. The reliability of the whole procedure was assessed by applying it to the analysis of two large MR imaging databases (n = 650 and n = 710, respectively) of healthy elderly subjects matched for demographic characteristics. Average overall WMH load and spatial distribution were found to be similar in the two samples (1.81 and 1.79% of the WM volume, respectively). White matter hyperintensity load was found to be significantly associated with both age and high blood pressure, with similar effects in both samples. With specific reference to the 650-subject cohort, we also found that the WMH load provided by this automated procedure was significantly associated with visual grading of the severity of WMH, as assessed by a trained neurologist. The results show that this method is sensitive, well correlated with semi-quantitative visual rating and highly reproducible. (orig.)

  8. An automated procedure for the assessment of white matter hyperintensities by multispectral (T1, T2, PD) MRI and an evaluation of its between-centre reproducibility based on two large community databases

    International Nuclear Information System (INIS)

    An automated procedure for the detection, quantification, localization and statistical mapping of white matter hyperintensities (WMH) on T2-weighted magnetic resonance (MR) images is presented and validated based on the results of a between-centre reproducibility study. The first step is the identification of white matter (WM) tissue using a multispectral (T1, T2, PD) segmentation. In a second step, WMH are identified within the WM tissue by segmenting T2 images, isolating two different classes of WMH voxels - low- and high-contrast WMH voxels, respectively. The reliability of the whole procedure was assessed by applying it to the analysis of two large MR imaging databases (n = 650 and n = 710, respectively) of healthy elderly subjects matched for demographic characteristics. Average overall WMH load and spatial distribution were found to be similar in the two samples (1.81 and 1.79% of the WM volume, respectively). White matter hyperintensity load was found to be significantly associated with both age and high blood pressure, with similar effects in both samples. With specific reference to the 650-subject cohort, we also found that the WMH load provided by this automated procedure was significantly associated with visual grading of the severity of WMH, as assessed by a trained neurologist. The results show that this method is sensitive, well correlated with semi-quantitative visual rating and highly reproducible. (orig.)

  9. Automated on-line solid phase extraction coupled to HPLC-APCI-MS detection as a versatile tool for the analysis of phenols in water samples

    International Nuclear Information System (INIS)

    The method enables the determination of the entire US EPA phenol range within a single chromatographic run with only one MSD interface and could easily be adapted for the analysis of further phenolic compounds. This represents a significant improvement over methods reported so far for the analysis of phenolic compounds by on-line SPE HPLC-MS. For the on-line SPE of phenols from water samples, the recently introduced Hysphere GP and the Waters Oasis adsorbent materials were found to be most satisfactory. Their application resulted in quantitative recoveries for sample volumes up to 100 ml, excellent elution behavior (enabling fast elution and hence narrower peaks) and relative standard deviations for the overall analysis system below 8 percent for all phenols. Typical enrichment factors for automated on-line SPE were estimated to be about one thousand compared with autosampler injections. Thus, LODs ranging from 40 to 280 ng/l in SCAN mode could be achieved even when only 10 ml of spiked distilled or river water sample were processed, which attests to the excellent screening capabilities of the optimized method. When using the SIM mode, the sensitivity could be increased further by about one order of magnitude. The applicability of the proposed method to environmental analysis was demonstrated by preconcentrating phenols from spiked river water samples or waste water treatment effluents via automated on-line SPE HPLC-MS. Due to the very high concentration of matrix in the case of waste water treatment effluents, the sample volume preconcentrated had to be decreased to only 1 ml. Still, the sensitivity is high enough to monitor phenols at levels relevant for waste water monitoring. As a further example of the general applicability of the HPLC-MS method for the tentative structural elucidation of phenolic compounds, it was also used for the analysis of diesel exhaust condensate samples, in which a number of phenolic compounds could be tentatively identified. (author)

  10. Automated modal tracking and fatigue assessment of a wind turbine based on continuous dynamic monitoring

    Directory of Open Access Journals (Sweden)

    Oliveira Gustavo

    2015-01-01

    Full Text Available The paper describes the implementation of a dynamic monitoring system at a 2.0 MW onshore wind turbine. The system comprises two components aimed at structural integrity and fatigue assessment. The first component enables continuous tracking of the modal characteristics of the wind turbine (natural frequency values, modal damping ratios and mode shapes) in order to detect abnormal deviations of these properties, which may be caused by the occurrence of structural damage. The second component allows the estimation of the remaining fatigue lifetime of the structure based on the analysis of the measured cycles of structural vibration.
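
    As a rough illustration of the first component, natural frequencies can be tracked by locating spectral peaks in successive acceleration records. The sketch below is a generic stand-in for the monitoring system, assuming a single-channel acceleration signal and an arbitrary prominence threshold.

        import numpy as np
        from scipy.signal import welch, find_peaks

        def track_modes(accel, fs, n_modes=3):
            """Estimate the n_modes most prominent natural frequencies (Hz)
            of one acceleration record via a Welch power spectrum."""
            freqs, psd = welch(accel, fs=fs, nperseg=4096)
            peaks, props = find_peaks(psd, prominence=psd.max() * 0.01)
            top = peaks[np.argsort(props["prominences"])[-n_modes:]]
            return np.sort(freqs[top])

        # Continuous monitoring would compare each new estimate against a
        # healthy-state baseline and flag deviations beyond normal variability.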

  11. Automated solvent concentrator

    Science.gov (United States)

    Griffith, J. S.; Stuart, J. L.

    1976-01-01

    Designed for the automated drug identification system (AUDRI), the device increases concentration by a factor of 100. The sample is first filtered, removing particulate contaminants and reducing the water content of the sample. The sample is then extracted from the filtered residue by a specific solvent. The concentrator provides input material to the analysis subsystem.

  12. Assessing breast cancer masking risk in full field digital mammography with automated texture analysis

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lilholm, Martin; Diao, Pengfei;

    2015-01-01

    Purpose: The goal of this work is to develop a method to assess the risk of breast cancer masking, based on image characteristics beyond breast density. Method: From the Dutch breast cancer screening program we collected 285 screen detected cancers, and 109 cancers that were screen negative and … determine cancer detection status in a five-fold cross validation. To assess the interaction of the texture scores with breast density, Volpara Density Grade was determined for each image. Results: We grouped women into low (VDG 1/2) versus high (VDG 3/4) dense, and low (Quartile 1/2) versus high (Q 3/4) texture risk score. We computed odds ratios for breast cancer masking risk (i.e. interval versus screen detected cancer) for each of the subgroups. The odds ratio was 1.63 (1.04-2.53 95%CI) in the high dense group (as compared to the low dense group), whereas for the high texture score group (as compared …

  13. Enabling automated magnetic resonance imaging-based targeting assessment during dipole field navigation

    Science.gov (United States)

    Latulippe, Maxime; Felfoul, Ouajdi; Dupont, Pierre E.; Martel, Sylvain

    2016-02-01

    The magnetic navigation of drugs in the vascular network promises to increase the efficacy and reduce the secondary toxicity of cancer treatments by targeting tumors directly. Recently, dipole field navigation (DFN) was proposed as the first method achieving both high field and high navigation gradient strengths for whole-body interventions in deep tissues. This is achieved by introducing large ferromagnetic cores around the patient inside a magnetic resonance imaging (MRI) scanner. However, doing so distorts the static field inside the scanner, which prevents imaging during the intervention. This limitation constrains DFN to open-loop navigation, exposing the patient to the risk of harmful toxicity in the case of a navigation failure. Here, we are interested in periodically assessing drug targeting efficiency using MRI even in the presence of a core. We demonstrate, using a clinical scanner, that it is in fact possible to acquire, in specific regions around a core, images of sufficient quality to perform this task. We show that the core can be moved inside the scanner to a position minimizing the distortion effect in the region of interest for imaging. Moving the core can be done automatically using the gradient coils of the scanner, which then also enables the core to be repositioned to perform navigation to additional targets. The feasibility and potential of the approach are validated in an in vitro experiment demonstrating navigation and assessment at two targets.

  14. TongueSim: Development of an Automated Method for Rapid Assessment of Fungiform Papillae Density for Taste Research.

    Science.gov (United States)

    Sanyal, Shourjya; O'Brien, Shauna M; Hayes, John E; Feeney, Emma L

    2016-05-01

    Taste buds are found on the tongue in 3 types of structures: the fungiform papillae, the foliate papillae, and the circumvallate papillae. Of these, the fungiform papillae (FP) are present in the greatest numbers on the tongue, and are thought to be correlated to the overall number of taste buds. For this reason, FP density on the tongue is often used to infer taste function, although this has been controversial. Historically, videomicroscopy techniques were used to assess FP. More recently, advances in digital still photography and in software have allowed the development of rapid methods for obtaining high quality images in situ. However, these can be subject to inter-researcher variation in FP identification, and are somewhat limited in the parameters that can be measured. Here, we describe the development of a novel, automated method to count the FP, using the TongueSim suite of software. Advantages include the reduction in time required for image analysis, elimination of researcher bias, and the added potential to measure characteristics such as the degree of roundness of each papilla. We envisage that such software has a wide variety of novel research applications. PMID:26892308
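
    While the TongueSim internals are not described here, the core task - detecting roughly circular papillae and scoring their roundness - can be approximated with scikit-image. The sketch below is an illustrative stand-in, and it assumes papillae appear brighter than the background; the minimum-area parameter is arbitrary.

        import numpy as np
        from skimage import filters, measure

        def count_papillae(gray, min_area=30):
            """gray: 2D grayscale tongue image; returns a count and a list
            of (centroid, roundness) pairs for candidate papillae."""
            binary = gray > filters.threshold_otsu(gray)
            labels = measure.label(binary)
            papillae = []
            for region in measure.regionprops(labels):
                if region.area < min_area:
                    continue
                # roundness = 4*pi*area / perimeter^2 (1.0 for a circle)
                roundness = 4 * np.pi * region.area / (region.perimeter ** 2 + 1e-9)
                papillae.append((region.centroid, roundness))
            return len(papillae), papillae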

  15. Automation of the quantitative determination of elemental content in samples using neutron activation analysis on the IBR-2 reactor at the Frank Laboratory for Neutron Physics, Joint Institute for Nuclear Research

    Science.gov (United States)

    Dmitriev, A. Yu.; Pavlov, S. S.

    2013-01-01

    Software for the automated quantitative determination of element concentrations in samples is described. This software is used in neutron activation analysis (NAA) at the IBR-2 reactor of the Frank Laboratory for Neutron Physics, Joint Institute for Nuclear Research (FLNP JINR).

  16. A method for automated analysis of 10 ml water samples containing acidic, basic, and neutral semivolatile compounds listed in USEPA Method 8270 by solid phase extraction coupled in-line to large volume injection gas chromatography/mass spectrometry

    Science.gov (United States)

    Data are presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single-run analysis of water samples containing a broad range of acid, base and neutral compounds...

  17. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  18. Measurement of acceleration while walking as an automated method for gait assessment in dairy cattle

    DEFF Research Database (Denmark)

    Chapinal, N.; de Passillé, A.M.; Pastell, M.;

    2011-01-01

    Three-dimensional accelerometers, 1 attached to each leg and 1 to the back, were used, and acceleration data were collected while cows walked in a straight line on concrete (experiment 1) or on both concrete and rubber (experiment 2). Cows were video-recorded while walking to assess overall gait, asymmetry of the steps, and walking speed. … Cows had lower gait scores (…, measured on a 5-point scale) and lower scores for asymmetry of the steps (18.0 vs. 23.1; SED = 2.2, measured on a continuous 100-unit scale) when they walked on rubber compared with concrete, and their walking speed increased (1.28 vs. 1.22 m/s; SED = 0.02). The acceleration of the front (1.67 vs. 1.72 g …

  19. Taking account of human factors for interface assessment and design in monitoring automated systems

    International Nuclear Information System (INIS)

    An optimum balance between the control means and the operator's capacities is sought when computerizing man-machine interfaces. Observation of the diagnostic activity of populations of operators working on simulators enables design criteria to be defined that are well suited to the characteristics of the tasks with which the operators are confronted. This observation provides an assessment of the interfaces from the standpoint of the graphic layer, of the human behaviour induced by the machine, and of the nature of the interaction between these two systems. It requires an original approach that dialectically combines cognitive psychology and the dynamic management of knowledge bases (artificial intelligence) in a critical industrial control and monitoring application. (author)

  20. Lung ventilation-perfusion imbalance in pulmonary emphysema. Assessment with automated V/Q quotient SPECT

    International Nuclear Information System (INIS)

    Tc-99m-Technegas-macro-aggregated albumin (MAA) single photon emission computed tomography (SPECT)-derived ventilation (V)/perfusion (Q) quotient SPECT was used to assess lung V-Q imbalance in patients with pulmonary emphysema. V/Q quotient SPECT and V/Q profile were automatically built in 38 patients with pulmonary emphysema and 12 controls, and V/Q distribution and V/Q profile parameters were compared. V/Q distribution on V/Q quotient SPECT was correlated with low attenuation areas (LAA) on density-mask computed tomography (CT). Parameters of V/Q profile such as the median, standard deviation (SD), kurtosis and skewness were proposed to objectively evaluate the severity of lung V-Q imbalance. In contrast to uniform V/Q distribution on V/Q quotient SPECT and a sharp peak with symmetrical V/Q distribution on V/Q profile in controls, lung areas showing heterogeneously high or low V/Q and flattened peaks with broadened V/Q distribution were frequently seen in patients with emphysema, including lung areas with only slight LAA. V/Q distribution was also often asymmetric regardless of symmetric LAA. All the proposed parameters of V/Q profile in entire lungs of patients with emphysema showed large variations compared with controls; SD and kurtosis were significantly different from controls (P<0.0001 and P<0.001, respectively), and a significant correlation was found between SD and A-aDO2 (P<0.0001). V/Q quotient SPECT appears to be more sensitive to detect emphysematous lungs compared with morphologic CT in patients with emphysema. SD and kurtosis of V/Q profile can be adequate parameters to assess the severity of lung V-Q imbalance causing gas-exchange impairment in patients with emphysema. (author)

  1. A computer-based automated algorithm for assessing acinar cell loss after experimental pancreatitis.

    Directory of Open Access Journals (Sweden)

    John F Eisses

    Full Text Available The change in exocrine mass is an important parameter to follow in experimental models of pancreatic injury and regeneration. However, at present, the quantitative assessment of exocrine content by histology is tedious and operator-dependent, requiring manual assessment of acinar area on serial pancreatic sections. In this study, we utilized a novel computer-generated learning algorithm to construct an accurate and rapid method of quantifying acinar content. The algorithm works by learning differences in pixel characteristics from input examples provided by human experts. H&E-stained pancreatic sections were obtained from mice recovering from a 2-day, hourly caerulein hyperstimulation model of experimental pancreatitis. For training data, a pathologist carefully outlined discrete regions of acinar and non-acinar tissue in 21 sections at various stages of pancreatic injury and recovery (termed the "ground truth"). After the expert defined the ground truth, the computer was able to develop a prediction rule that was then applied to a unique set of high-resolution images in order to validate the process. For baseline, non-injured pancreatic sections, the software demonstrated close agreement with the ground truth in identifying baseline acinar tissue area, with a difference of only 1% ± 0.05% (p = 0.21). Within regions of injured tissue, the software reported a difference of 2.5% ± 0.04% in acinar area compared with the pathologist (p = 0.47). Surprisingly, on detailed morphological examination, the discrepancy was primarily because the software outlined acini and excluded inter-acinar and luminal white space with greater precision. The findings suggest that the software will be of great potential benefit to both clinicians and researchers in quantifying pancreatic acinar cell flux in the injured and recovering pancreas.
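
    A minimal version of such a supervised pixel classifier can be sketched with scikit-learn. This is an illustrative stand-in for the study's software, assuming RGB pixel values as the only features and the expert-outlined masks as training labels.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def train_acinar_classifier(images, acinar_masks):
            """images: list of HxWx3 arrays; acinar_masks: matching boolean
            masks outlined by the expert (the 'ground truth')."""
            X = np.vstack([img.reshape(-1, 3) for img in images])
            y = np.concatenate([m.ravel() for m in acinar_masks])
            clf = RandomForestClassifier(n_estimators=50, n_jobs=-1)
            clf.fit(X, y)
            return clf

        def acinar_area_fraction(clf, image):
            # fraction of pixels the learned rule classifies as acinar
            return clf.predict(image.reshape(-1, 3)).mean()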

  2. 296-B-5 Stack monitoring and sampling system annual system assessment report

    International Nuclear Information System (INIS)

    The B Plant Administration Manual requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 296-B-5 at B Plant. The sampling and monitoring system associated with stack 296-B-5 is functional and performing satisfactorily. This document is an annual assessment report of the systems associated with the 296-B-5 stack.

  3. 296-B-5 Stack monitoring and sampling system annual system assessment report

    Energy Technology Data Exchange (ETDEWEB)

    Ridge, T.M.

    1995-02-01

    The B Plant Administration Manual requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 296-B-5 at B Plant. The sampling and monitoring system associated with stack 296-B-5 is functional and performing satisfactorily. This document is an annual assessment report of the systems associated with the 296-B-5 stack.

  4. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    Energy Technology Data Exchange (ETDEWEB)

    Gwynne, Sarah, E-mail: Sarah.Gwynne2@wales.nhs.uk [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Spezi, Emiliano; Wills, Lucy [Department of Medical Physics, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Nixon, Lisette; Hurt, Chris [Wales Cancer Trials Unit, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom); Joseph, George [Department of Diagnostic Radiology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Evans, Mererid [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Griffiths, Gareth [Wales Cancer Trials Unit, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom); Crosby, Tom [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Staffurth, John [Division of Cancer, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom)

    2012-11-15

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), which can localize areas of maximal discordance, was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of gold standard-observer variation, with the JCI being the most useful as a single metric. Additional information is provided by the l-CI, which can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.
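
    For reference, the Jaccard conformity index used as the primary metric reduces to a few lines over boolean volumes. The geographical miss index shown alongside follows the common definition (fraction of the gold-standard volume missed by the observer), which is an assumption here rather than a detail taken from the paper.

        import numpy as np

        def jaccard_ci(observer, gold):
            """JCI = |A intersect B| / |A union B| for boolean GTV masks."""
            inter = np.logical_and(observer, gold).sum()
            union = np.logical_or(observer, gold).sum()
            return inter / union

        def geographical_miss_index(observer, gold):
            """Fraction of the gold-standard volume not covered by the observer."""
            missed = np.logical_and(gold, np.logical_not(observer)).sum()
            return missed / gold.sum()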

  5. Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection

    International Nuclear Information System (INIS)

    The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining the results from bounding deterministic simulations with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize the risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation, origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, materials, and location data are added when available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimization methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents such as vehicle impact, fire, fuel explosions, excessive containment pressure, flooding, etc. are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (i.e., for an impact event: velocity, orientations, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions relating accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision

  6. Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection

    Energy Technology Data Exchange (ETDEWEB)

    Barbara H. Dolphin; William D. Richins; Stephen R. Novascone

    2010-10-01

    The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining the results from bounding deterministic simulations with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize the risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation, origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, materials, and location data are added when available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimization methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents such as vehicle impact, fire, fuel explosions, excessive containment pressure, flooding, etc. are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (i.e., for an impact event: velocity, orientations, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions relating accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision
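
    The Monte Carlo core of such an assessment can be illustrated in a few lines: sample accident parameters from assumed distributions, map them through a damage correlation (here a hypothetical threshold function standing in for the hydrocode-derived correlations), and count releases. Every distribution and threshold below is a placeholder, not METEOR data.

        import numpy as np

        rng = np.random.default_rng(42)

        def release_probability(n=100_000):
            """Toy estimate of P(release) along a route."""
            crash = rng.random(n) < 1e-3              # accident occurs on trip
            velocity = rng.normal(30.0, 10.0, n)      # impact velocity, m/s
            hard_soil = rng.random(n) < 0.4           # soil/rock type draw
            # hypothetical damage correlation: containment fails at high
            # impact energy, with a lower threshold on hard ground
            threshold = np.where(hard_soil, 45.0, 60.0)
            release = crash & (velocity > threshold)
            return release.mean()

        print(f"Estimated release probability per shipment: {release_probability():.2e}")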

  7. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography.

    Science.gov (United States)

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-03-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters including pore size, pore shape, strut size, surface area, porosity, and interconnectivity were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. We conclude that this method will be a key tool for the non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
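
    Two of the quantified parameters, porosity and interconnectivity, can be illustrated on a binarised OCT volume with scipy.ndimage. The sketch assumes a boolean array in which True marks scaffold material, and it defines interconnectivity as the fraction of pore space belonging to the largest connected pore network - one common convention, assumed here.

        import numpy as np
        from scipy import ndimage

        def porosity_and_interconnectivity(material):
            """material: 3D boolean array, True = hydrogel strut voxels."""
            pores = ~material
            porosity = pores.mean()
            labels, n = ndimage.label(pores)   # 6-connectivity by default
            if n == 0:
                return porosity, 0.0
            sizes = ndimage.sum(pores, labels, range(1, n + 1))
            return porosity, sizes.max() / pores.sum()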

  8. An automated model for rooftop PV systems assessment in ArcGIS using LIDAR

    Directory of Open Access Journals (Sweden)

    Mesude Bayrakci Boz

    2015-08-01

    Full Text Available As photovoltaic (PV) systems have become less expensive, building rooftops have come to be attractive for local power production. Identifying rooftops suitable for solar energy systems over large geographic areas is needed for cities to obtain more accurate assessments of production potential and likely patterns of development. This paper presents a new method for extracting roof segments and locating suitable areas for PV systems using Light Detection and Ranging (LIDAR) data and building footprints. Rooftop segments are created using seven slope (tilt) and five aspect (azimuth) classes and 6 different building types. Moreover, direct beam shading caused by nearby objects and the surrounding terrain is taken into account on a monthly basis. Finally, the method is implemented as an ArcGIS model in ModelBuilder and a tool is created. In order to show its validity, the method is applied to the city of Philadelphia, PA, USA with the criteria of slope, aspect, shading and area used to locate suitable areas for PV system installation. The results show that 33.7% of the building footprint area and 48.6% of the rooftop segments identified are suitable for PV systems. Overall, this study provides a replicable model using commercial software that is capable of extracting individual roof segments with more detailed criteria across an urban area.
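
    The slope and aspect classification step can be approximated outside ArcGIS as well. The sketch below derives slope and aspect from a LIDAR-derived elevation grid with NumPy and bins them into seven slope and five aspect classes; the class boundaries are chosen arbitrarily for illustration, the aspect formula follows one common GIS convention, and the north wrap across 0/360 degrees is ignored.

        import numpy as np

        def slope_aspect_classes(dem, cell_size=1.0):
            """dem: 2D elevation array (m); returns slope/aspect class rasters."""
            dz_dy, dz_dx = np.gradient(dem, cell_size)
            slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
            aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
            slope_cls = np.digitize(slope, [5, 10, 15, 20, 30, 45])   # 7 classes
            aspect_cls = np.digitize(aspect, [45, 135, 225, 315])     # 5 classes
            return slope_cls, aspect_cls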

  9. Using Teacher Work Samples to Develop and Assess Best Practices in Physical Education Teacher Education

    Science.gov (United States)

    Sariscsany, Mary Jo

    2010-01-01

    Teacher work samples (TWS) are an integrated, comprehensive assessment tool that can be used as evidence of a beginning teacher's readiness to teach. Unlike linear assessments used to determine teaching effectiveness, TWS are relevant and reflective of "real" teaching. They are designed to exhibit a clear relationship among teacher candidate…

  10. Technical assessment of compliance with work place air sampling requirements at T Plant. Revision No. 1

    International Nuclear Information System (INIS)

    The US DOE requires its contractors to conduct air sampling to detect and evaluate airborne radioactive material in the workplace. Hanford Reservation T Plant compliance with workplace air sampling requirements has been assessed. Requirements, the basis for determining compliance, and recommendations are included.

  11. Accurate and Precise in Situ Zircon U-Pb age Dating With High Sample Throughput by Automated LA-SF-ICP-MS

    Science.gov (United States)

    Frei, D.; Gerdes, A.; Schersten, A.; Hollis, J. A.; Martina, F.; Knudsen, C.

    2006-12-01

    Zircon is a ubiquitous mineral in most crystalline rocks as well as clastic sediments. Its high resistance to thermal resetting and physical erosion makes zircon an exceptionally useful mineral for precise and accurate dating of thermal geological events. For example, the analysis of the U-Pb ages of detrital zircon grains in clastic sediments is a powerful tool in sedimentary provenance studies. Accurate and precise U-Pb ages of > 100 zircon grains in a sample usually allow all major sedimentary source age components to be detected with statistical confidence. U-Pb age dating of detrital zircons is generally the domain of high resolution ion microprobe techniques (high resolution SIMS), where relatively rapid in situ analysis can be achieved. The major limitations of these techniques are sample throughput (about 75 zircon age dates per 24 hours), the very high purchasing and operating costs of the equipment and the need for highly specialised personnel, resulting in high cost. These high costs usually impose uncomfortable restrictions on the number of samples that can be analysed in a provenance study. Here, we present a high sample throughput technique for highly accurate and precise U-Pb dating of zircons by laser ablation magnetic sector field inductively coupled plasma mass spectrometry (LA-SF-ICP-MS). This technique takes advantage of recent progress in laser technology and the introduction of magnetic sector field ICP-MS instruments. Based on a ThermoFinnigan Element2 magnetic sector field ICP-MS and a New Wave UP 213 laser ablation system, this technique allows U-Pb dating of zircon grains with precision, accuracy and spatial resolution comparable to high resolution SIMS. Because an individual analysis is carried out in less than two minutes and all data are acquired automatically in pre-set mode with only minimal operator presence, the sample throughput is an order of magnitude higher compared with high resolution SIMS. Furthermore, the purchasing and operating costs of

  12. Automated image segmentation and registration of vessel wall MRI for quantitative assessment of carotid artery vessel wall dimensions and plaque composition

    OpenAIRE

    Klooster, Ronald van 't

    2014-01-01

    The main goal of this thesis was to develop methods for automated segmentation, registration and classification of the carotid artery vessel wall and plaque components using multi-sequence MR vessel wall images to assess atherosclerosis. First, a general introduction into atherosclerosis and different stages of the disease were described including the importance to differentiate between stable and vulnerable plaques. Several non-invasive imaging techniques were discussed and the advantages of...

  13. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  14. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    Energy Technology Data Exchange (ETDEWEB)

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    2011-08-01

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  15. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    International Nuclear Information System (INIS)

    In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)
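
    The efficiency gain reported here rests on the standard importance-sampling identity: draw from a biased density g concentrated on the important region and reweight each sample by f/g. The sketch below uses a toy one-dimensional risk model, not the SYVAC A/C system.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        def consequence(x):
            """Toy model: 'failure' when the sampled parameter is extreme."""
            return (x > 4.0).astype(float)

        n = 10_000
        f = stats.norm(0, 1)   # nominal parameter distribution
        g = stats.norm(4, 1)   # biased density centred on the important region

        x = g.rvs(n, random_state=rng)
        weights = f.pdf(x) / g.pdf(x)           # likelihood ratio f/g
        estimate = np.mean(consequence(x) * weights)
        # Crude Monte Carlo would need millions of samples to resolve
        # P(x > 4) ~ 3.2e-5; the weighted estimate converges with far fewer.
        print(estimate)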

  16. Testing of an automated online EA-IRMS method for fast and simultaneous carbon content and stable isotope measurement of aerosol samples

    Science.gov (United States)

    Major, István; Gyökös, Brigitta; Túri, Marianna; Futó, István; Filep, Ágnes; Hoffer, András; Molnár, Mihály

    2016-04-01

    Comprehensive atmospheric studies have demonstrated that carbonaceous aerosol is one of the main components of atmospheric particulate matter over Europe. Various methods, considering optical or thermal properties, have been developed for quantification of the accurate amount of both organic and elemental carbon constituents of atmospheric aerosol. The aim of our work was to develop an alternative fast and easy method for determination of the total carbon content of individual aerosol samples collected on prebaked quartz filters, whereby the mass and surface concentration become simply computable. We applied the conventional "elemental analyzer (EA) coupled online with an isotope ratio mass spectrometer (IRMS)" technique, which is ubiquitously used in mass spectrometry. Using this technique we are able to measure the carbon stable isotope ratio of the samples simultaneously as well. During the development process, we compared the EA-IRMS technique with an off-line catalytic combustion method worked out previously at the Hertelendi Laboratory of Environmental Studies (HEKAL). We tested the combined online total carbon content and stable isotope ratio measurement both on standard materials and on real aerosol samples. The test results show that the novel method assures, on the one hand, a carbon recovery yield of at least 95% over a broad total carbon mass range (between 100 and 3000 µg) and, on the other hand, good reproducibility of the stable isotope measurements, with an uncertainty of ± 0.2 per mil. Comparing the total carbon results obtained by the EA-IRMS and the off-line catalytic combustion methods, we found a very good correlation (R2 = 0.94), which proves the applicability of both preparation methods. Advantages of the novel method are the fast and simplified sample preparation steps and the fully automated, simultaneous carbon stable isotope ratio measurement. Furthermore, the stable isotope ratio results can effectively be applied in the source apportionment

  17. Radiostrontium and radium analysis in low-level environmental samples following a multi-stage semi-automated chromatographic sequential separation

    International Nuclear Information System (INIS)

    The strontium isotopes 89Sr and 90Sr and the radium isotope 226Ra, being radiotoxic when ingested, are routinely monitored in milk and drinking water samples collected from different regions in Canada. In order to monitor environmental levels of activity, a novel semi-automated sensitive method has been developed at the Radiation Protection Bureau of Health Canada (Ottawa, Canada). This method allows the separation and quantification of both 89Sr and 90Sr and has also been adapted to quantify 226Ra during the same sample preparation procedure. The method uses a 2-stage purification process during which matrix constituents, such as magnesium and calcium, which are abundant in milk, are removed, as well as the main beta-interferences (e.g., 40K, 87Rb, 134Cs, 137Cs, and 140Ba). The first purification step uses strong cation exchange (SCX) chromatography with commercially available resins. In a second step, fractions containing the radiostrontium analytes are further purified using high-performance ion chromatography (HPIC). While 89Sr is quantified by Cerenkov counting immediately after the second purification stage, the same vial is counted again after a latent period of 10-14 days to quantify the 90Sr activity based on 90Y ingrowth. Similarly, the activity of 226Ra, which is separated by SCX only, is determined via the emanation of 222Rn in a 2-phase aqueous/cocktail system using liquid scintillation counting. The minimum detectable concentration (MDC) for 89Sr and 90Sr for a 200 min count time at the 95% confidence level is 0.03 and 0.02 Bq/L, respectively. The MDC for 226Ra for a 100 min count time is 0.002 Bq/L. Semi-annual intercomparison samples from the US Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) were used to validate the method for 89Sr and 90Sr. Spiked water samples prepared in-house and by the International Atomic Energy Agency (IAEA) were used to validate the 226Ra assay.
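
    The delayed second count exploits simple ingrowth kinetics: after the Sr/Y separation, 90Y activity grows in as A_Y(t) = A_Sr * (1 - exp(-lambda_Y * t)). A short calculation with the accepted 90Y half-life of about 64.1 h shows why a 10-14 day delay suffices; the script below is illustrative only.

        import numpy as np

        T_HALF_Y90_H = 64.1                    # 90Y half-life in hours
        lam = np.log(2) / T_HALF_Y90_H

        def ingrowth_fraction(days):
            """Fraction of equilibrium 90Y activity present after separation."""
            return 1.0 - np.exp(-lam * days * 24.0)

        for d in (10, 12, 14):
            print(f"{d} days: {ingrowth_fraction(d):.3f}")
        # about 93% of equilibrium activity after 10 days and 97% after
        # 14 days, so the 90Sr activity can be inferred from the 90Y count.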

  18. Comparison of sampling methods for the assessment of indoor microbial exposure

    DEFF Research Database (Denmark)

    Frankel, M; Timm, Michael; Hansen, E W; Madsen, A M

    2012-01-01

    Abstract Indoor microbial exposure has been related to allergy and respiratory disorders. However, the lack of standardized sampling methodology is problematic when investigating dose-response relationships between exposure and health effects. In this study, different sampling methods were compared regarding their assessment of microbial exposures, including culturable fungi and bacteria, endotoxin, as well as the total inflammatory potential (TIP) of dust samples from Danish homes. The Gesamtstaubprobenahme (GSP) filter sampler and BioSampler were used for sampling of airborne dust, whereas the dust … with those from GSP. Settled dust from the EDC was most representative of airborne dust and may thus be considered as a surrogate for the assessment of indoor airborne microbial exposure. PRACTICAL IMPLICATIONS: Significant discrepancies between sampling methods regarding indoor microbial exposures …

  19. Particle analysis for uranium isotopes on swipe samples using new generation Cameca IMS 7f SIMS supported by SEM automated uranium detection

    International Nuclear Information System (INIS)

    … triangle pieces. This internal reference enables the determination of parameters in the transformation of coordinates relative to the SEM to coordinates relative to the SIMS sample stage with a precision better than 50 μm. Uranium-bearing particle detection - The main difficulty in particle detection arises because the programs which are commonly used for SIMS automated uranium-bearing particle search (e.g. P-search by Evans Analytical) still have to be updated to a version compatible with the IMS 7f software. In this study, the automated detection of uranium-bearing particles has been performed using a FEI XL 30 environmental SEM fitted with an EDAX system. An adaptation of the Gun Shot Residue (GSR) forensic software allows the automatic search for uranium-containing particles using back-scattered electron image analysis and qualitative micro-analysis of major elemental composition by energy dispersive X-ray spectrometry. In addition, secondary electron images of uranium-containing particles can be acquired in order to characterize their morphology. An overnight GSR run may investigate a ∼ 1 cm2 deposition area, detecting with a high probability all uranium-bearing particles with diameter > 1 μm. The GSR program provides a listing of uranium-bearing particle coordinates relative to the SEM sample stage. Compared with SIMS detection, this lower cost method presents some advantages: it is non-destructive, not susceptible to isobaric interferences, and provides some additional relevant information on individual particles (e.g. volume, morphology, and major elemental composition). Compared with SIMS particle detection, the main drawback of this technique is that it is not sensitive to the 235U-enrichment of the detected particles. As a consequence, no priority ranking can be established among the particles to be analyzed for isotopic ratios. SIMS analysis of uranium isotopic ratios - About 40 particles selected among the uranium-bearing particles previously detected by SEM could be analyzed

  20. Automated solid-phase extraction for the determination of polybrominated diphenyl ethers and polychlorinated biphenyls in serum--application on archived Norwegian samples from 1977 to 2003.

    Science.gov (United States)

    Thomsen, Cathrine; Liane, Veronica Horpestad; Becher, Georg

    2007-02-01

    An analytical method comprising automated solid-phase extraction and determination by gas chromatography-mass spectrometry (single quadrupole) has been developed for the determination of 12 polybrominated diphenyl ethers (PBDEs), 26 polychlorinated biphenyls (PCBs), two organochlorine compounds (OCs) (hexachlorobenzene and octachlorostyrene) and two brominated phenols (pentabromophenol and tetrabromobisphenol-A (TBBP-A)). The analytes were extracted using a sorbent of polystyrene-divinylbenzene, and an additional clean-up was performed on a sulphuric acid-silica column to remove lipids. The method has been validated by spiking horse serum at five levels. The mean accuracy, given as recovery relative to internal standards, was 95%, 99%, 93% and 109% for the PBDEs, PCBs, OCs and brominated phenols, respectively. The mean repeatability, given as RSDs, was 6.9%, 8.7%, 7.5% and 15%, respectively. Estimated limits of detection (S/N = 3) were in the range 0.2-1.8 pg/g serum for the PBDEs and phenols, and from 0.1 pg/g to 56 pg/g serum for the PCBs and OCs. The validated method has been used to investigate the levels of PBDEs and PCBs in 21 pooled serum samples from the general Norwegian population. In serum from men (age 40-50 years) the sum of seven PBDE congeners (IUPAC No. 28, 47, 99, 100, 153, 154 and 183) increased from 1977 (0.5 ng/g lipids) to 1998 (4.8 ng/g lipids). From 1999 to 2003 the concentration of PBDEs seems to have stabilised. On the other hand, the sum of five PCBs (IUPAC No. 101, 118, 138, 153 and 180) in these samples decreased steadily from 1977 (666 ng/g lipids) to 2003 (176 ng/g lipids). Tetrabromobisphenol-A and BDE-209 were detected in almost all samples, but no temporal trends similar to those seen for the PBDEs were observed for these compounds, which might be due to the short half-lives of these brominated flame retardants (FRs) in humans. PMID:17023223

  1. Determination of aflatoxins in food samples by automated on-line in-tube solid-phase microextraction coupled with liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Nonaka, Y; Saito, K; Hanioka, N; Narimatsu, S; Kataoka, H

    2009-05-15

    A simple and sensitive automated method for the determination of aflatoxins (B1, B2, G1, and G2) in nuts, cereals, dried fruits, and spices was developed, consisting of in-tube solid-phase microextraction (SPME) coupled with liquid chromatography-mass spectrometry (LC-MS). Aflatoxins were separated within 8 min by high-performance liquid chromatography using a Zorbax Eclipse XDB-C8 column with methanol/acetonitrile (60/40, v/v): 5 mM ammonium formate (45:55) as the mobile phase. Electrospray ionization conditions in the positive ion mode were optimized for MS detection of aflatoxins. The pseudo-molecular ions [M+H](+) were used to detect aflatoxins in selected ion monitoring (SIM) mode. The optimum in-tube SPME conditions were 25 draw/eject cycles of 40 microL of sample using a Supel-Q PLOT capillary column as the extraction device. The extracted aflatoxins were readily desorbed from the capillary by passage of the mobile phase, and no carryover was observed. Using the in-tube SPME LC-MS with SIM method, good linearity of the calibration curve (r > 0.9994) was obtained in the concentration range of 0.05-2.0 ng/mL using aflatoxin M1 as an internal standard, and the detection limits (S/N = 3) of aflatoxins were 2.1-2.8 pg/mL. The in-tube SPME method showed >23-fold higher sensitivity than the direct injection method (10 microL injection volume). The within-day and between-day precision (relative standard deviations) at a concentration of 1 ng/mL aflatoxin mixture were below 3.3% and 7.7% (n = 5), respectively. This method was applied successfully to the analysis of food samples without interference peaks. The recoveries of aflatoxins spiked into nuts and cereals were >80%, and the relative standard deviations were …. Aflatoxins were detected at <10 ng/g in several commercial food samples. PMID:19328492

  2. An automated system for access to derived climate indices in support of ecological impacts assessments and resource management

    Science.gov (United States)

    Walker, J.; Morisette, J. T.; Talbert, C.; Blodgett, D. L.; Kunicki, T.

    2012-12-01

    A U.S. Geological Survey team is working with several providers to establish standard data services for the climate projection data they host. To meet the needs of climate adaptation science and landscape management communities, the team is establishing a set of climate index calculation algorithms that will consume data from various providers and provide directly useful data derivatives. Climate projections coming from various scenarios, modeling centers, and downscaling methods are increasing in number and size. Global change impact modeling and assessment, generally, requires inputs in the form of climate indices or values derived from raw climate projections. This requirement puts a large burden on a community not familiar with climate data formats, semantics, and processing techniques and requires storage capacity and computing resources out of the reach of most. In order to fully understand the implications of our best available climate projections, assessments must take into account an ensemble of climate projections and potentially a range of parameters for calculation of climate indices. These requirements around data access and processing are not unique from project to project, or even among projected climate data sets, pointing to the need for a reusable tool to generate climate indices. The U.S. Geological Survey has developed a pilot application and supporting web service framework that automates the generation of climate indices. The web service framework consists of standards-based data servers and a data integration broker. The resulting system allows data producers to publish and maintain ownership of their data and data consumers to access climate derivatives via a simple to use "data product ordering" workflow. Data access and processing is completed on enterprise "cloud" computing resources and only the relatively small, derived climate indices are delivered to the scientist or land manager. These services will assist the scientific and land

  3. A support vector density-based importance sampling for reliability assessment

    International Nuclear Information System (INIS)

    An importance sampling method based on the adaptive Markov chain simulation and support vector density estimation is developed in this paper for efficient structural reliability assessment. The methodology involves the generation of samples that can adaptively populate the important region by the adaptive Metropolis algorithm, and the construction of importance sampling density by support vector density. The use of the adaptive Metropolis algorithm may effectively improve the convergence and stability of the classical Markov chain simulation. The support vector density can approximate the sampling density with fewer samples in comparison to the conventional kernel density estimation. The proposed importance sampling method can effectively reduce the number of structural analysis required for achieving a given accuracy. Examples involving both numerical and practical structural problems are given to illustrate the application and efficiency of the proposed methodology.
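
    The first ingredient, adaptive Markov chain simulation, can be sketched compactly: a random-walk Metropolis sampler whose proposal covariance is periodically re-estimated from the chain history. This is a generic, Haario-style textbook version, not necessarily the exact algorithm used in the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        def adaptive_metropolis(log_target, x0, n_steps=5000, adapt_every=200):
            d = len(x0)
            cov = np.eye(d) * 0.1
            chain = [np.asarray(x0, dtype=float)]
            logp = log_target(chain[0])
            for i in range(1, n_steps):
                prop = rng.multivariate_normal(chain[-1], cov)
                logp_prop = log_target(prop)
                if np.log(rng.random()) < logp_prop - logp:
                    chain.append(prop)
                    logp = logp_prop
                else:
                    chain.append(chain[-1].copy())
                if i % adapt_every == 0 and i > 2 * d:
                    # re-estimate the proposal covariance from the history
                    cov = np.cov(np.array(chain).T) * (2.38**2 / d) + 1e-8 * np.eye(d)
            return np.array(chain)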

  4. Assessing terpene content variability of whitebark pine in order to estimate representative sample size

    Directory of Open Access Journals (Sweden)

    Stefanović Milena

    2013-01-01

    Full Text Available In studies of population variability, particular attention has to be paid to the selection of a representative sample. The aim of this study was to assess the size of a new representative sample on the basis of the variability of the chemical content of the initial sample, using a whitebark pine population as an example. The statistical analysis included the content of 19 characteristics (terpene hydrocarbons and their derivatives) of the initial sample of 10 elements (trees). It was determined that the new sample should contain 20 trees so that the mean value calculated from it represents the basic set with a probability higher than 95%. Determination of the lower limit of the representative sample size that guarantees a satisfactory reliability of generalization proved to be very important in order to achieve cost efficiency of the research. [Project of the Ministry of Science of the Republic of Serbia, No. OI-173011, No. TR-37002 and No. III-43007]
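
    The underlying calculation is the classical sample-size formula for estimating a mean to within a tolerable error E at confidence level 1 - alpha: n >= (t * s / E)^2, iterated because t depends on n. The values below are made up for illustration and are not the study's terpene data.

        import math
        from scipy import stats

        def sample_size(s, E, conf=0.95, n0=10):
            """Iteratively solve n >= (t_{alpha/2, n-1} * s / E)**2,
            starting from the pilot sample size n0."""
            n = n0
            for _ in range(100):
                t = stats.t.ppf(1 - (1 - conf) / 2, df=n - 1)
                n_new = math.ceil((t * s / E) ** 2)
                if n_new == n:
                    return n
                n = n_new
            return n

        # e.g. a pilot standard deviation s = 4.2 and a tolerable error of
        # +/- 2 units (both illustrative) give a required sample size:
        print(sample_size(s=4.2, E=2.0))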

  5. A content validated questionnaire for assessment of self reported venous blood sampling practices

    Directory of Open Access Journals (Sweden)

    Bölenius Karin

    2012-01-01

    Full Text Available Abstract Background Venous blood sampling is a common procedure in health care. It is strictly regulated by national and international guidelines. Deviations from guidelines due to human mistakes can cause patient harm. Validated questionnaires for health care personnel can be used to assess preventable "near misses" -- i.e. potential errors and nonconformities during venous blood sampling practices that could transform into adverse events. However, no validated questionnaire that assesses nonconformities in venous blood sampling has previously been presented. The aim was to test a recently developed questionnaire on self-reported venous blood sampling practices for validity and reliability. Findings We developed a questionnaire to assess deviations from best practices during venous blood sampling. The questionnaire contained questions about patient identification, test request management, test tube labeling, test tube handling, information search procedures and frequencies of error reporting. For content validity, the questionnaire was confirmed by experts on questionnaires and venous blood sampling. For reliability, test-retest statistics were used on the questionnaire answered twice. The final venous blood sampling questionnaire included 19 questions, of which 9 had in total 34 underlying items. It was found to have content validity. The test-retest analysis demonstrated that the items were generally stable. In total, 82% of the items fulfilled the reliability acceptance criteria. Conclusions The questionnaire could be used for assessment of "near miss" practices that could jeopardize patient safety, and it offers several benefits over assessing rare adverse events only. The higher frequency of "near miss" practices allows for quantitative analysis of the effect of corrective interventions and for benchmarking preanalytical quality not only at the laboratory/hospital level but also at the health care unit/hospital ward level.

  6. Feasibility of hair sampling to assess levels of organophosphate metabolites in rural areas of Sri Lanka

    Science.gov (United States)

    Knipe, D.W.; Jayasumana, C.; Siribaddana, S.; Priyadarshana, C.; Pearson, M.; Gunnell, D.; Metcalfe, C.; Tzatzarakis, M.N.; Tsatsakis, A.M.

    2016-01-01

    Measuring chronic pesticide exposure is important in order to investigate the associated health effects. Traditional biological samples (blood/urine) are difficult to collect, store and transport in large epidemiological studies in settings such as rural Asia. We assessed the acceptability of collecting hair samples from a rural Sri Lankan population and found that this method of data collection was feasible. We also assessed the level of non-specific metabolites (DAPs) of organophosphate pesticides in the hair samples. The median concentration (pg/mg) of each DAP was: diethyl phosphate: 83.3 (IQI 56.0, 209.4); diethyl thiophosphate: 34.7 (IQI 13.8, 147.9); diethyl dithiophosphate: 34.5 (IQI 23.4, 55.2); and dimethyl phosphate: 3 (IQI 3, 109.7). Total diethylphosphates were recovered in >80% of samples and were positively correlated with self-reported pesticide exposure. PMID:26894816

  7. Feasibility of hair sampling to assess levels of organophosphate metabolites in rural areas of Sri Lanka.

    Science.gov (United States)

    Knipe, D W; Jayasumana, C; Siribaddana, S; Priyadarshana, C; Pearson, M; Gunnell, D; Metcalfe, C; Tzatzarakis, M N; Tsatsakis, A M

    2016-05-01

    Measuring chronic pesticide exposure is important in order to investigate the associated health effects. Traditional biological samples (blood/urine) are difficult to collect, store and transport in large epidemiological studies in settings such as rural Asia. We assessed the acceptability of collecting hair samples from a rural Sri Lankan population and found that this method of data collection was feasible. We also assessed the level of non-specific metabolites (DAPs) of organophosphate pesticides in the hair samples. The median concentration (pg/mg) of each DAP was: diethyl phosphate: 83.3 (IQI 56.0, 209.4); diethyl thiophosphate: 34.7 (IQI 13.8, 147.9); diethyl dithiophosphate: 34.5 (IQI 23.4, 55.2); and dimethyl phosphate: 3 (IQI 3, 109.7). Total diethylphosphates were recovered in >80% of samples and were positively correlated with self-reported pesticide exposure. PMID:26894816

  8. An integrative pharmacological approach to radio telemetry and blood sampling in pharmaceutical drug discovery and safety assessment

    Directory of Open Access Journals (Sweden)

    Kamendi Harriet W

    2011-01-01

    Full Text Available Abstract Background A successful integration of the automated blood sampling (ABS) and telemetry (ABST) system is described. The new ABST system facilitates concomitant collection of physiological variables with blood and urine samples for determination of drug concentrations and other biochemical measures in the same rat without handling artifact. Method Integration was achieved by designing a 13 inch circular receiving antenna that operates as a plug-in replacement for the existing pair of DSI's orthogonal antennas and is compatible with the rotating cage and open floor design of the BASi Culex® ABS system. The circular receiving antenna's electrical configuration consists of a pair of electrically orthogonal half-toroids that reinforce reception of a dipole transmitter operating within the coil's interior while reducing both external noise pickup and interference from other adjacent dipole transmitters. Results For validation, measured baclofen concentration (ABST vs. satellite, μM: 69.6 ± 23.8 vs. 76.6 ± 19.5, p = NS) and mean arterial pressure (ABST vs. traditional DSI telemetry, mm Hg: 150 ± 5 vs. 147 ± 4, p = NS) variables were quantitatively and qualitatively similar between rats housed in the ABST system and traditional home cage approaches. Conclusion The ABST system offers unique advantages over traditional between-group study paradigms, including improved data quality and significantly reduced animal use. The superior within-group model facilitates assessment of multiple physiological and biochemical responses to test compounds in the same animal. The ABST also provides opportunities to evaluate temporal relations between parameters and to investigate anomalous outlier events, because drug concentrations and physiological and biochemical measures for each animal are available for comparison.

  9. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory, to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff...

  10. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    International Nuclear Information System (INIS)

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. This report discusses residual stresses in the 3013 outer, the SRS/Hanford and RFETS/LLNL inner containers, and teardrop samples used in studies to assess the potential for SCC in 3013 containers. Residual tensile stresses in the heat affected zones of the closure welds are of particular concern.

  11. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    Energy Technology Data Exchange (ETDEWEB)

    Stroud, Mary Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Prime, Michael Bruce [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Veirs, Douglas Kirk [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Berg, John M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Clausen, Bjorn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Worl, Laura Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); DeWald, Adrian T. [Hill Engineering, LLC, Rancho Cordova, CA (United States)

    2015-12-08

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. This report discusses residual stresses in the 3013 outer, the SRS/Hanford and RFETS/LLNL inner containers, and teardrop samples used in studies to assess the potential for SCC in 3013 containers. Residual tensile stresses in the heat affected zones of the closure welds are of particular concern.

  12. Comparison of Endotoxin Exposure Assessment by Bioaerosol Impinger and Filter-Sampling Methods

    OpenAIRE

    Duchaine, Caroline; Thorne, Peter S.; Mériaux, Anne; Grimard, Yan; Whitten, Paul; Cormier, Yvon

    2001-01-01

    Environmental assessment data collected in two prior occupational hygiene studies of swine barns and sawmills allowed the comparison of concurrent, triplicate, side-by-side endotoxin measurements using air sampling filters and bioaerosol impingers. Endotoxin concentrations in impinger solutions and filter eluates were assayed using the Limulus amebocyte lysate assay. In sawmills, impinger sampling yielded significantly higher endotoxin concentration measurements and lower variances than filte...

  13. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    After radioactive sea debris had been found on beaches near the BNFL, Sellafield, plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  14. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies library provides several new materials, media and mode of storing and communicating the information. Library Automation reduces the drudgery of repeated manual efforts in library routine. By use of library automation collection, Storage, Administration, Processing, Preservation and communication etc.

  15. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop of list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  16. 296-B-10 stack monitoring and sampling system annual system assessment report

    International Nuclear Information System (INIS)

    B Plant Administration Manual, requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with stack 296-B-10 at B Plant. The ventilation system of WESF (Waste Encapsulation and Storage Facility) is designed to provide airflow patterns so that air movement throughout the building is from areas of lesser radioactivity to areas of greater radioactivity. All potentially contaminated areas are maintained at a negative pressure with respect to the atmosphere so that air flows into the building at all times. The exhaust discharging through the 296-B-10 stack is continuously monitored and sampled using a sampling and monitoring probe assembly located approximately 17.4 meters (57 feet) above the base of the stack. The probe assembly consists of 5 nozzles for the sampling probe and 2 nozzles to monitor the flow. The sampling and monitoring system associated with Stack 296-B-10 is functional and performing satisfactorily

  17. 296-B-10 stack monitoring and sampling system annual system assessment report

    Energy Technology Data Exchange (ETDEWEB)

    Ridge, T.M.

    1995-04-26

    B Plant Administration Manual, requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with stack 296-B-10 at B Plant. The ventilation system of WESF (Waste Encapsulation and Storage Facility) is designed to provide airflow patterns so that air movement throughout the building is from areas of lesser radioactivity to areas of greater radioactivity. All potentially contaminated areas are maintained at a negative pressure with respect to the atmosphere so that air flows into the building at all times. The exhaust discharging through the 296-B-10 stack is continuously monitored and sampled using a sampling and monitoring probe assembly located approximately 17.4 meters (57 feet) above the base of the stack. The probe assembly consists of 5 nozzles for the sampling probe and 2 nozzles to monitor the flow. The sampling and monitoring system associated with Stack 296-B-10 is functional and performing satisfactorily.

  18. Preparation and validation of gross alpha/beta samples used in EML's quality assessment program

    International Nuclear Information System (INIS)

    A set of water and filter samples have been incorporated into the existing Environmental Measurements Laboratory's (EML) Quality Assessment Program (QAP) for gross alpha/beta determinations by participating DOE laboratories. The participating laboratories are evaluated by comparing their results with the EML value. The preferred EML method for measuring water and filter samples, described in this report, uses gas flow proportional counters with 2 in. detectors. Procedures for sample preparation, quality control and instrument calibration are presented. Liquid scintillation (LS) counting is an alternative technique that is suitable for quantifying both the alpha (241Am, 230Th and 238Pu) and beta (90Sr/90Y) activity concentrations in the solutions used to prepare the QAP water and air filter samples. Three LS counting techniques (Cerenkov, dual dpm and full spectrum analysis) are compared. These techniques may be used to validate the activity concentrations of each component in the alpha/beta solution before the QAP samples are actually prepared

  19. 296-B-13 stack monitoring and sampling system: Annual system assessment report

    International Nuclear Information System (INIS)

    This report presents the details of the annual system assessment of the air pollution monitoring and sampling system for the 296-13 stack at the Hanford site. Topics discussed include; system description, system status, system aging, spare parts considerations, long term maintenance plan, trends, and items requiring action

  20. 296-B-13 stack monitoring and sampling system: Annual system assessment report

    Energy Technology Data Exchange (ETDEWEB)

    Ridge, T.M.

    1995-05-16

    This report presents the details of the annual system assessment of the air pollution monitoring and sampling system for the 296-13 stack at the Hanford site. Topics discussed include; system description, system status, system aging, spare parts considerations, long term maintenance plan, trends, and items requiring action.

  1. Technical assessment of workplace air sampling requirements at tank farm facilities. Revision 2

    International Nuclear Information System (INIS)

    Tank Farm facilities compliance with the workplace air sampling (WPAS) program has been assessed. Requirements bases for determining compliance and recommendations are included. In the current condition all buildings are in compliance with the WPAS program. This document also supersedes WHC-SD-SQA-TA-20012, revision 0

  2. Mood disorders in everyday life : A systematic review of experience sampling and ecological momentary assessment studies

    NARCIS (Netherlands)

    Aan het Rot, M.; Hogenelst, Koen; Schoevers, R.A.

    2012-01-01

    In the past two decades, the study of mood disorder patients using experience sampling methods (ESM) and ecological momentary assessment (EMA) has yielded important findings. In patients with major depressive disorder (MDD), the dynamics of their everyday mood have been associated with various aspec

  3. Assessment of Emotional Intelligence in a Sample of Prospective Secondary Education Teachers

    Science.gov (United States)

    Gutiérrez-Moret, Margarita; Ibáñez-Martinez, Raquel; Aguilar-Moya, Remedios; Vidal-Infer, Antonio

    2016-01-01

    In the past few years, skills related to emotional intelligence (EI) have acquired special relevance in the educational domain. This study assesses EI in a sample of 155 students of 5 different specialities of a Master's degree in Teacher Training for Secondary Education. Data collection was conducted through the administration of the Trait Meta…

  4. Using Structural Equation Modeling to Assess Functional Connectivity in the Brain: Power and Sample Size Considerations

    Science.gov (United States)

    Sideridis, Georgios; Simos, Panagiotis; Papanicolaou, Andrew; Fletcher, Jack

    2014-01-01

    The present study assessed the impact of sample size on the power and fit of structural equation modeling applied to functional brain connectivity hypotheses. The data consisted of time-constrained minimum norm estimates of regional brain activity during performance of a reading task obtained with magnetoencephalography. Power analysis was first…

  5. Validation of an automated ELISA system for detection of antibodies to Aleutian mink disease virus using blood samples collected in filter paper strips

    OpenAIRE

    Knuuttila, Anna; Aronen, Pirjo; Eerola, Majvor; Gardner, Ian A; Virtala, Anna-Maija K; Vapalahti, Olli

    2014-01-01

    Background Aleutian mink disease virus (AMDV) is the cause of a chronic immune complex disease, Aleutian disease (AD), which is common in mink-producing countries. In 2005, implementation of an AMDV eradication programme in Finland created a need for an automated high-throughput assay. The aim of this study was to validate an AMDV-VP2 -recombinant antigen ELISA, which we developed earlier, in an automated assay format for the detection of anti-AMDV antibodies in mink blood and to determine th...

  6. Oral Samples as Non-Invasive Proxies for Assessing the Composition of the Rumen Microbial Community

    Science.gov (United States)

    Tapio, Ilma; Shingfield, Kevin J.; McKain, Nest; Bonin, Aurélie; Fischer, Daniel; Bayat, Ali R.; Vilkki, Johanna; Taberlet, Pierre; Snelling, Timothy J.; Wallace, R. John

    2016-01-01

    Microbial community analysis was carried out on ruminal digesta obtained directly via rumen fistula and buccal fluid, regurgitated digesta (bolus) and faeces of dairy cattle to assess if non-invasive samples could be used as proxies for ruminal digesta. Samples were collected from five cows receiving grass silage based diets containing no additional lipid or four different lipid supplements in a 5 x 5 Latin square design. Extracted DNA was analysed by qPCR and by sequencing 16S and 18S rRNA genes or the fungal ITS1 amplicons. Faeces contained few protozoa, and bacterial, fungal and archaeal communities were substantially different to ruminal digesta. Buccal and bolus samples gave much more similar profiles to ruminal digesta, although fewer archaea were detected in buccal and bolus samples. Bolus samples overall were most similar to ruminal samples. The differences between both buccal and bolus samples and ruminal digesta were consistent across all treatments. It can be concluded that either proxy sample type could be used as a predictor of the rumen microbial community, thereby enabling more convenient large-scale animal sampling for phenotyping and possible use in future animal breeding programs aimed at selecting cattle with a lower environmental footprint. PMID:26986467

  7. Gluten-containing grains skew gluten assessment in oats due to sample grind non-homogeneity.

    Science.gov (United States)

    Fritz, Ronald D; Chen, Yumin; Contreras, Veronica

    2017-02-01

    Oats are easily contaminated with gluten-rich kernels of wheat, rye and barley. These contaminants are like gluten 'pills', shown here to skew gluten analysis results. Using R-Biopharm R5 ELISA, we quantified gluten in gluten-free oatmeal servings from an in-market survey. For samples with a 5-20ppm reading on a first test, replicate analyses provided results ranging 160ppm. This suggests sample grinding may inadequately disperse gluten to allow a single accurate gluten assessment. To ascertain this, and characterize the distribution of 0.25-g gluten test results for kernel contaminated oats, twelve 50g samples of pure oats, each spiked with a wheat kernel, showed that 0.25g test results followed log-normal-like distributions. With this, we estimate probabilities of mis-assessment for a 'single measure/sample' relative to the <20ppm regulatory threshold, and derive an equation relating the probability of mis-assessment to sample average gluten content. PMID:27596406

  8. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    Directory of Open Access Journals (Sweden)

    Guillermo Rey Gozalo

    2016-05-01

    Full Text Available Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods were analysed and compared using the city of Talca (Chile as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.

  9. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population.

    Science.gov (United States)

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel

    2016-01-01

    Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population. PMID:27187429

  10. Threshold-dependent sample sizes for selenium assessment with stream fish tissue

    Science.gov (United States)

    Hitt, Nathaniel P.; Smith, David

    2013-01-01

    Natural resource managers are developing assessments of selenium (Se) contamination in freshwater ecosystems based on fish tissue concentrations. We evaluated the effects of sample size (i.e., number of fish per site) on the probability of correctly detecting mean whole-body Se values above a range of potential management thresholds. We modeled Se concentrations as gamma distributions with shape and scale parameters fitting an empirical mean-to-variance relationship in data from southwestern West Virginia, USA (63 collections, 382 individuals). We used parametric bootstrapping techniques to calculate statistical power as the probability of detecting true mean concentrations up to 3 mg Se/kg above management thresholds ranging from 4-8 mg Se/kg. Sample sizes required to achieve 80% power varied as a function of management thresholds and type-I error tolerance (α). Higher thresholds required more samples than lower thresholds because populations were more heterogeneous at higher mean Se levels. For instance, to assess a management threshold of 4 mg Se/kg, a sample of 8 fish could detect an increase of ∼ 1 mg Se/kg with 80% power (given α = 0.05), but this sample size would be unable to detect such an increase from a management threshold of 8 mg Se/kg with more than a coin-flip probability. Increasing α decreased sample size requirements to detect above-threshold mean Se concentrations with 80% power. For instance, at an α-level of 0.05, an 8-fish sample could detect an increase of ∼ 2 units above a threshold of 8 mg Se/kg with 80% power, but when α was relaxed to 0.2 this sample size was more sensitive to increasing mean Se concentrations, allowing detection of an increase of ∼ 1.2 units with equivalent power. Combining individuals into 2- and 4-fish composite samples for laboratory analysis did not decrease power because the reduced number of laboratory samples was compensated by increased precision of composites for estimating mean

  11. Differential proteomic analysis of mouse macrophages exposed to adsorbate-loaded heavy fuel oil derived combustion particles using an automated sample-preparation workflow.

    Science.gov (United States)

    Kanashova, Tamara; Popp, Oliver; Orasche, Jürgen; Karg, Erwin; Harndorf, Horst; Stengel, Benjamin; Sklorz, Martin; Streibel, Thorsten; Zimmermann, Ralf; Dittmar, Gunnar

    2015-08-01

    Ship diesel combustion particles are known to cause broad cytotoxic effects and thereby strongly impact human health. Particles from heavy fuel oil (HFO) operated ships are considered as particularly dangerous. However, little is known about the relevant components of the ship emission particles. In particular, it is interesting to know if the particle cores, consisting of soot and metal oxides, or the adsorbate layers, consisting of semi- and low-volatile organic compounds and salts, are more relevant. We therefore sought to relate the adsorbates and the core composition of HFO combustion particles to the early cellular responses, allowing for the development of measures that counteract their detrimental effects. Hence, the semi-volatile coating of HFO-operated ship diesel engine particles was removed by stepwise thermal stripping using different temperatures. RAW 264.7 macrophages were exposed to native and thermally stripped particles in submersed culture. Proteomic changes were monitored by two different quantitative mass spectrometry approaches, stable isotope labeling by amino acids in cell culture (SILAC) and dimethyl labeling. Our data revealed that cells reacted differently to native or stripped HFO combustion particles. Cells exposed to thermally stripped particles showed a very differential reaction with respect to the composition of the individual chemical load of the particle. The cellular reactions of the HFO particles included reaction to oxidative stress, reorganization of the cytoskeleton and changes in endocytosis. Cells exposed to the 280 °C treated particles showed an induction of RNA-related processes, a number of mitochondria-associated processes as well as DNA damage response, while the exposure to 580 °C treated HFO particles mainly induced the regulation of intracellular transport. In summary, our analysis based on a highly reproducible automated proteomic sample-preparation procedure shows a diverse cellular response, depending on the

  12. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  13. Utilizing Internal Standard Responses to Assess Risk on Reporting Bioanalytical Results from Hemolyzed Samples.

    Science.gov (United States)

    Fung, Eliza N; Aubry, Anne-Françoise; Allentoff, Alban; Ji, Qin C

    2015-09-01

    Bioanalytical analysis of toxicokinetic and pharmacokinetic samples is an integral part of small molecule drugs development and liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been the technique of choice. One important consideration is the matrix effect, in which ionization of the analytes of interest is affected by the presence of co-eluting interfering components present in the sample matrix. Hemolysis, which results in additional endogenous components being released from the lysed red blood cells, may cause additional matrix interferences. The effects of the degree of hemolysis on the accuracy and precision of the method and the reported sample concentrations from hemolyzed study samples have drawn increasing attention in recent years, especially in cases where the sample concentrations are critical for pharmacokinetic calculation. Currently, there is no established procedure to objectively assess the risk of reporting potentially inaccurate bioanalytical results from hemolyzed study samples. In this work, we evaluated the effect of different degrees of hemolysis on the internal standard peak area, accuracy, and precision of the analyses of BMS-906024 and its metabolite, BMS-911557, in human plasma by LC-MS/MS. In addition, we proposed the strategy of using the peak area of the stable isotope-labeled internal standard (SIL-IS) from the LC-MS/MS measurement as the surrogate marker for risk assessment. Samples with peak areas outside of the pre-defined acceptance criteria, e.g., less than 50% or more than 150% of the average IS response in study samples, plasma standards, and QC samples when SIL-IS is used, are flagged out for further investigation. PMID:25975617

  14. Determination of appropriate grid dimension and sampling plot size for assessment of woody species diversity in Zagros Forest, Iran

    OpenAIRE

    ALI ASGHAR ZOHREVANDI; HASSAN POURBABAEI; REZA AKHAVAN; AMIR ESLAM BONYAD

    2016-01-01

    Abstract. Zohrevandi AA, Pourbabaei H, Akhavan R, Bonyad AE. 2015. Determination of appropriate grid dimension and sampling plot size for assessment of woody species diversity in Zagros Forest, Iran. Biodiversitas 17: 24-30. This research was conducted to determine the most suitable grid (dimensions for sampling) and sampling plot size for assessment of woody species diversity in protected Zagros forests, west of Iran. Sampling was carried out using circular sample plots with areas of 1000 m2...

  15. Ticks in the wrong boxes: assessing error in blanket-drag studies due to occasional sampling

    OpenAIRE

    Dobson, Andrew DM

    2013-01-01

    Background The risk posed by ticks as vectors of disease is typically assessed by blanket-drag sampling of host-seeking individuals. Comparisons of peak abundance between plots – either in order to establish their relative risk or to identify environmental correlates – are often carried out by sampling on one or two occasions during the period of assumed peak tick activity. Methods This paper simulates this practice by ‘re-sampling’ from model datasets derived from an empirical field study. R...

  16. Automated assessment of β-cell area and density per islet and patient using TMEM27 and BACE2 immunofluorescence staining in human pancreatic β-cells.

    Directory of Open Access Journals (Sweden)

    Markus P Rechsteiner

    Full Text Available In this study we aimed to establish an unbiased automatic quantification pipeline to assess islet specific features such as β-cell area and density per islet based on immunofluorescence stainings. To determine these parameters, the in vivo protein expression levels of TMEM27 and BACE2 in pancreatic islets of 32 patients with type 2 diabetes (T2D and in 28 non-diabetic individuals (ND were used as input for the automated pipeline. The output of the automated pipeline was first compared to a previously developed manual area scoring system which takes into account the intensity of the staining as well as the percentage of cells which are stained within an islet. The median TMEM27 and BACE2 area scores of all islets investigated per patient correlated significantly with the manual scoring and with the median area score of insulin. Furthermore, the median area scores of TMEM27, BACE2 and insulin calculated from all T2D were significantly lower compared to the one of all ND. TMEM27, BACE2, and insulin area scores correlated as well in each individual tissue specimen. Moreover, islet size determined by costaining of glucagon and either TMEM27 or BACE2 and β-cell density based either on TMEM27 or BACE2 positive cells correlated significantly. Finally, the TMEM27 area score showed a positive correlation with BMI in ND and an inverse pattern in T2D. In summary, automated quantification outperforms manual scoring by reducing time and individual bias. The simultaneous changes of TMEM27, BACE2, and insulin in the majority of the β-cells suggest that these proteins reflect the total number of functional insulin producing β-cells. Additionally, β-cell subpopulations may be identified which are positive for TMEM27, BACE2 or insulin only. Thus, the cumulative assessment of all three markers may provide further information about the real β-cell number per islet.

  17. The Automated Geospatial Watershed Assessment Tool (AGWA): Developing Post-Fire Model Parameters Using Precipitation and Runoff Records from Gauged Watersheds

    Science.gov (United States)

    Sheppard, B. S.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.; Canfield, E.; Sidman, G.

    2014-12-01

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impacts of wildfire on runoff and erosion. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of a suite of hydrologic and erosion models (RHEM, WEPP, KINEROS2 and SWAT). Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM). The watershed model elements are then intersected with terrain, soils, and land cover data layers to derive the requisite model input parameters. With the addition of a burn severity map AGWA can be used to model post wildfire changes to a catchment. By applying the same design storm to burned and unburned conditions a rapid assessment of the watershed can be made and areas that are the most prone to flooding can be identified. Post-fire precipitation and runoff records from gauged forested watersheds are now being used to make improvements to post fire model input parameters. Rainfall and runoff pairs have been selected from these records in order to calibrate parameter values for surface roughness and saturated hydraulic conductivity used in the KINEROS2 model. Several objective functions will be tried in the calibration process. Results will be validated. Currently Department of Interior Burn Area Emergency Response (DOI BAER) teams are using the AGWA-KINEROS2 modeling interface to assess hydrologically imposed risk immediately following wild fire. These parameter refinements are being made to further improve the quality of these assessments.

  18. Assessing genetic polymorphisms using DNA extracted from cells present in saliva samples

    Directory of Open Access Journals (Sweden)

    Nemoda Zsofia

    2011-12-01

    Full Text Available Abstract Background Technical advances following the Human Genome Project revealed that high-quality and -quantity DNA may be obtained from whole saliva samples. However, usability of previously collected samples and the effects of environmental conditions on the samples during collection have not been assessed in detail. In five studies we document the effects of sample volume, handling and storage conditions, type of collection device, and oral sampling location, on quantity, quality, and genetic assessment of DNA extracted from cells present in saliva. Methods Saliva samples were collected from ten adults in each study. Saliva volumes from .10-1.0 ml, different saliva collection devices, sampling locations in the mouth, room temperature storage, and multiple freeze-thaw cycles were tested. One representative single nucleotide polymorphism (SNP in the catechol-0-methyltransferase gene (COMT rs4680 and one representative variable number of tandem repeats (VNTR in the serotonin transporter gene (5-HTTLPR: serotonin transporter linked polymorphic region were selected for genetic analyses. Results The smallest tested whole saliva volume of .10 ml yielded, on average, 1.43 ± .77 μg DNA and gave accurate genotype calls in both genetic analyses. The usage of collection devices reduced the amount of DNA extracted from the saliva filtrates compared to the whole saliva sample, as 54-92% of the DNA was retained on the device. An "adhered cell" extraction enabled recovery of this DNA and provided good quality and quantity DNA. The DNA from both the saliva filtrates and the adhered cell recovery provided accurate genotype calls. The effects of storage at room temperature (up to 5 days, repeated freeze-thaw cycles (up to 6 cycles, and oral sampling location on DNA extraction and on genetic analysis from saliva were negligible. Conclusions Whole saliva samples with volumes of at least .10 ml were sufficient to extract good quality and quantity DNA. Using

  19. Methods for collecting benthic invertebrate samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of tax a within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection and methods and equipment for qualitative multihabitat sampling and semi-quantitative single habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.

  20. Molecular Method To Assess the Diversity of Burkholderia Species in Environmental Samples

    OpenAIRE

    Salles, J; Souza, de, H.R.; Elsas, van, J.D.

    2002-01-01

    In spite of the importance of many members of the genus Burkholderia in the soil microbial community, no direct method to assess the diversity of this genus has been developed so far. The aim of this work was the development of soil DNA-based PCR-denaturing gradient gel electrophoresis (DGGE), a powerful tool for studying the diversity of microbial communities, for detection and analysis of the Burkholderia diversity in soil samples. Primers specific for the genus Burkholderia were developed ...

  1. Assessing the diagnostic validity of a structured psychiatric interview in a first-admission hospital sample

    OpenAIRE

    NORDGAARD, JULIE; REVSBECH, RASMUS; Sæbye, Ditte; Parnas, Josef

    2012-01-01

    The use of structured psychiatric interviews performed by non-clinicians is frequent for research purposes and is becoming increasingly common in clini-cal practice. The validity of such interviews has rarely been evaluated empirically. In this study of a sample of 100 diagnostically heterogeneous, first-admitted inpatients, the results of an assessment with the Structured Clinical Interview for DSM-IV (SCID), yielding a DSM-IV diagnosis and performed by a trained non-clinic...

  2. Assessing decentering: Validation, psychometric properties and clinical usefulness of the Experiences Questionnaire in a Spanish sample

    OpenAIRE

    Soler Ribaudi, Joaquim; Franquesa, Alba; Feliu-Soler, Albert; Cebolla i Martí, Ausiàs Josep; García Campayo, Javier; Tejedor, Rosa; Demarzo, Marcelo; Baños Rivera, Rosa María; Pascual, Juan Carlos; Portella, María J.

    2014-01-01

    Decentering is defined as the ability to observe one’s thoughts and feelings in a detached manner. The Experiences Questionnaire (EQ) is a self-report instrument that originally assessed decentering and rumination. The purpose of this study was to evaluate the psychometric properties of the Spanish version of EQ-Decentering and to explore its clinical usefulness. The 11-item EQ-Decentering subscale was translated into Spanish and psychometric properties were examined in a sample of 921 adult ...

  3. PREVALENCE AND ANTIMICROBIAL RESISTANCE ASSESSMENT OF SUBCLINICAL MASTITIS IN MILK SAMPLES FROM SELECTED DAIRY FARMS

    OpenAIRE

    Murugaiyah Marimuthu; Faez Firdaus Jesse Abdullah; Konto Mohammed; Sangeetha D/O Sarvananthan Poshpum; Lawan Adamu; Abdinasir Yusuf Osman; Yusuf Abba; Abdulnasir Tijjani

    2014-01-01

    This study was conducted in order to determine the prevalence and bacteriological assessment of subclinical mastitis and antimicrobial resistance of bacterial isolates from dairy cows in different farms around Selangor, Malaysia. A total of 120 milk samples from 3 different farms were randomly collected and tested for subclinical mastitis using California Mastitis Test (CMT), as well as for bacterial culture for isolation, identification and antimicrobial resistance. The most prevalent bacter...

  4. Central Colorado Assessment Project (CCAP)-Geochemical data for rock, sediment, soil, and concentrate sample media

    Science.gov (United States)

    Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.

    2010-01-01

    This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. In addition, geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have been included in this database. In addition to data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. In addition, USGS data that have been excluded previously from the NGDB because the data predate earliest USGS geochemical databases, or were once excluded for programmatic reasons

  5. Delayed matching-to-sample: A tool to assess memory and other cognitive processes in pigeons.

    Science.gov (United States)

    Zentall, Thomas R; Smith, Aaron P

    2016-02-01

    Delayed matching-to-sample is a versatile task that has been used to assess the nature of animal memory. Although once thought to be a relatively passive process, matching research has demonstrated considerable flexibility in how animals actively represent events in memory. But delayed matching can also demonstrate how animals fail to maintain representations in memory when they are cued that they will not be tested (directed forgetting) and how the outcome expected can serve as a choice cue. When pigeons have shown divergent retention functions following training without a delay, it has been taken as evidence of the use of a single-code/default coding strategy but in many cases an alternative account may be involved. Delayed matching has also been used to investigate equivalence learning (how animals represent stimuli when they learn that the same comparison response is correct following the presentation of two different samples) and to test for metamemory (the ability of pigeons to indicate that they understand what they know) by allowing animals to decline to be tested when they are uncertain that they remember a stimulus. How animals assess the passage of time has also been studied using the matching task. And there is evidence that when memory for the sample is impaired by a delay, rather than use the probability of being correct for choice of each of the comparison stimuli, pigeons tend to choose based on the overall sample frequency (base-rate neglect). Finally, matching has been used to identify natural color categories as well as dimensional categories in pigeons. Overall, matching to sample has provided an excellent methodology for assessing an assortment of cognitive processes in animals. PMID:26165174

  6. Assessment of metal concentrations in sediment samples from Billings reservoir, Rio Grande tributary, Sao Paulo, Brazil

    International Nuclear Information System (INIS)

    The present study chemically characterized sediment samples from the Billings reservoir, Rio Grande tributary, in the Metropolitan region of Sao Paulo, by determining metal concentration and other elements of interest. The chosen chemical parameters for this characterization were Aluminum, Arsenic, Barium, Cadmium, Copper, Chromium, Iron, Lead, Manganese, Mercury, Nickel, Selenium and Zinc. These parameters are also used in the water quality index, with the exception of Selenium. The concentrations were determined through different analytical techniques such as atomic absorption spectrometry (FAAS, GFAAS and CVAAS), optical emission spectrometry (ICP OES) and neutron activation analysis. These analytical methodologies were assessed for precision, accuracy and detection and/or quantification limits for the sediment elements in question. Advantages and disadvantages of each technique for each element and its concentration were also discussed. From these assessment the most adequate technique was selected for the routine analysis of sediment samples for each element concentration determination. This assessment verified also that digestion in a closed microwave system with nitric acid is efficient for the evaluation of extracted metals of environmental interest. The analytical techniques chosen were equally efficient for metals determination. In the case of Cd and Pb, the FAAS technique was selected due to better results than ICP OES, as it does not present matrix interference. The concentration values obtained for metals As, Cd, Cu, Cr, Hg, Ni, Pb and Zn in the sediment samples were compared to Canadian Council of Minister of the Environment (CCME) TEL and PEL values. (author)

  7. Assessment of metal concentrations in sediment samples from Billings Reservoir, Rio Grande tributary, Sao Paulo, Brazil

    International Nuclear Information System (INIS)

    The present study chemically characterized sediment samples from the Billings reservoir, Rio Grande tributary, in the Metropolitan region of Sao Paulo, by determining metal concentration and other elements of interest. The chosen chemical parameters for this characterization were Aluminum, Arsenic, Barium, Cadmium, Copper, Chromium, Iron, Lead, Manganese, Mercury, Nickel, Selenium and Zinco. These parameters are also used in the water quality index, with the exception of Selenium. The concentrations were determined through different analytical techniques such as atomic absorption spectrometry (FAAS, GFAAS and CVAAS), optical emission spectrometry (ICP OES) and neutron activation analysis. These analytical methodologies were assessed for precision, accuracy and detection and/or quantification limits for the sediment elements in question. Advantages and disadvantages of each technique for each element and its concentration were also discussed. From these assessments the most adequate technique was selected for the routine analysis of sediment samples for each element concentration determination. This assessment verified also that digestion in a closed microwave system with nitric acid is efficient for the evaluation of extracted metals of environmental interest. The analytical techniques chosen were equally efficient for metals determination. In the case of Cd and Pb, the FAAS technique was selected due to better results than ICP OES, as it does not present matrix interference. The concentration values obtained for metals As, Cd, Cu, Cr, Hg, Ni, Pb and Zn in the sediment samples were compared to Canadian Council of Minister of the Environment (CCME) TEL and PEL values. (author)

  8. Using Experience Sampling Methods/Ecological Momentary Assessment (ESM/EMA) in Clinical Assessment and Clinical Research: Introduction to the Special Section

    OpenAIRE

    Trull, Timothy J.; Ebner-Priemer, Ulrich W.

    2009-01-01

    This article introduces the special section on experience sampling methods and ecological momentary assessment in clinical assessment. We review the conceptual basis for experience sampling methods (ESM; Csikszentmihalyi & Larson, 1987) and ecological momentary assessment (EMA; Stone & Shiffman, 1994). Next, we highlight several advantageous features of ESM/EMA as applied to psychological assessment and clinical research. We provide a brief overview of the articles in this special section, ea...

  9. Assessment of fish assemblages in coastal lagoon habitats: Effect of sampling method

    Science.gov (United States)

    Franco, A.; Pérez-Ruzafa, A.; Drouineau, H.; Franzoi, P.; Koutrakis, E. T.; Lepage, M.; Verdiell-Cubedo, D.; Bouchoucha, M.; López-Capel, A.; Riccato, F.; Sapounidis, A.; Marcos, C.; Oliva-Paterna, F. J.; Torralva-Forero, M.; Torricelli, P.

    2012-10-01

    The structure of fish assemblages accounted for by different sampling methods (namely fyke net, seine nets, visual census) applied to vegetated and unvegetated lagoon habitats was investigated in terms of species composition, functional groups (ecological and trophic guilds), and fish size distribution. Significant differences were detected among methods, even among similar ones (seine nets). Visual census and fyke net detected more easily pelagic species, allowing the sampling of larger fish, whereas seine nets targeted more efficiently benthic-demersal species, with a dominance of 2-10 cm size classes in the fish catches. Differences were detected also among habitats, reflecting the different fish assemblages associated to vegetated and unvegetated habitats in coastal lagoons and transitional waters. However a different ability of discriminating between habitat-associated fish assemblages was recorded for the sampling methods. The different selectivity and functioning of the tested sampling methods confirm the importance of considering the targeted scale at which the research is being carried out, as well as the method that will be used to assess the ecological status of lagoon fish assemblages when choosing the most appropriate sampling method. A cross-validation of fish sampling methodologies in transitional waters is necessary to cope with the mandatory of the Water Framework Directive of standardization and comparability of monitoring methods.

  10. On the assessment of extremely low breakdown probabilities by an inverse sampling procedure [gaseous insulation

    DEFF Research Database (Denmark)

    Thyregod, Poul; Vibholm, Svend

    1991-01-01

    First breakdown voltages obtained under the inverse sampling procedure assuming a double exponential flashover probability function are discussed. An inverse sampling procedure commences the voltage application at a very low level, followed by applications at stepwise increased levels until a...... breakdown occurs. Following a breakdown, the procedure is restarted at the initial level. The procedure is repeated until a predetermined number of breakdowns have occurred, and the average and standard deviation of the observed first breakdown levels are recorded. The authors derive the relation between...... the flashover probability function and the corresponding distribution of first breakdown voltages under the inverse sampling procedure, and show how this relation may be utilized to assess the single-shot flashover probability corresponding to the observed average first breakdown voltage. Since the...

  11. The influence of sampling interval on the accuracy of trail impact assessment

    Science.gov (United States)

    Leung, Y.-F.; Marion, J.L.

    1999-01-01

    Trail impact assessment and monitoring (IA&M) programs have been growing in importance and application in recreation resource management at protected areas. Census-based and sampling-based approaches have been developed in such programs, with systematic point sampling being the most common survey design. This paper examines the influence of sampling interval on the accuracy of estimates for selected trail impact problems. A complete census of four impact types on 70 trails in Great Smoky Mountains National Park was utilized as the base data set for the analyses. The census data were resampled at increasing intervals to create a series of simulated point data sets. Estimates of frequency of occurrence and lineal extent for the four impact types were compared with the census data set. The responses of accuracy loss on lineal extent estimates to increasing sampling intervals varied across different impact types, while the responses on frequency of occurrence estimates were consistent, approximating an inverse asymptotic curve. These findings suggest that systematic point sampling may be an appropriate method for estimating the lineal extent but not the frequency of trail impacts. Sample intervals of less than 100 m appear to yield an excellent level of accuracy for the four impact types evaluated. Multiple regression analysis results suggest that appropriate sampling intervals are more likely to be determined by the type of impact in question rather than the length of trail. The census-based trail survey and the resampling-simulation method developed in this study can be a valuable first step in establishing long-term trail IA&M programs, in which an optimal sampling interval range with acceptable accuracy is determined before investing efforts in data collection.

  12. Assessing cereal grain quality with a fully automated instrument using artificial neural network processing of digitized color video images

    Science.gov (United States)

    Egelberg, Peter J.; Mansson, Olle; Peterson, Carsten

    1995-01-01

    A fully integrated instrument for cereal grain quality assessment is presented. Color video images of grains fed onto a belt are digitized. These images are then segmented into kernel entities, which are subject to the analysis. The number of degrees of freedom for each such object is decreased to a suitable level for Artificial Neural Network (ANN) processing. Feed- forward ANN's with one hidden layer are trained with respect to desired features such as purity and flour yield. The resulting performance is compatible with that of manual human ocular inspection and alternative measuring methods. A statistical analysis of training and test set population densities is used to estimate the prediction reliabilities and to set appropriate alarm levels. The instrument containing feeder belts, balance and CCD video camera is physically separated from the 90 MHz Pentium PC computer which is used to perform the segmentation, ANN analysis and for controlling the instrument under the Unix operating system. A user-friendly graphical user interface is used to operate the instrument. The processing time for a 50 g grain sample is approximately 2 - 3 minutes.

  13. Validating the use of biopsy sampling in contamination assessment studies of small cetaceans.

    Science.gov (United States)

    Méndez-Fernandez, Paula; Galluzzi Polesi, Paola; Taniguchi, Satie; de O Santos, Marcos C; Montone, Rosalinda C

    2016-06-15

    Remote biopsy sampling is the most common technique for acquiring samples from free-ranging marine mammals. However, such techniques may result in variable sampling being sometimes superficial skin and blubber biopsies. For decades, blubber has been used to monitor the exposure of marine mammals to persistent organic pollutants (POPs), but little is known regarding the variability of POPs as a function of blubber depth in small cetaceans and the available literature offers variable results. Thus, the aim of the present study was to validate biopsy sampling for monitoring contaminant concentrations in small, free-ranging cetaceans. Samples from the dorsal blubber of 10 incidentally captured Atlantic spotted dolphins (Stenella frontalis) were separated into two different layers (outer and inner) to investigate the influence of sampling depth on POP concentrations. POP concentrations were compared to those of the full blubber layer. The results revealed no significant differences in lipid content between males and females or among the inner, outer and full blubber layers (p>0.05). Moreover, the wet and lipid weight concentrations of all POP classes analysed [i.e. polychlorinated biphenyls (PCBs), dichlorodiphenyltrichloroethanes (DDTs), polybrominated diphenyl ethers (PBDEs), hexachlorobenzene (HCB), hexachlorocyclohexanes (HCHs), chlordanes (CHLs) and mirex] did not differ significantly with blubber depth (p>0.05). POP classes followed the same decreasing order of wet weight concentrations in blubber layers and full blubber: PCBs>DDTs>PBDEs>mirex>HCB>HCHs>CHLs. Moreover, there was a low degree of differentiation in the accumulation of POP congeners. The present findings indicated that the distribution of contaminants was homogenous with blubber depth, which validates the use of biopsy sampling for the assessment of contaminants in small cetaceans. PMID:27113024

  14. Assessing the effects of sampling design on water quality status classification

    Science.gov (United States)

    Lloyd, Charlotte; Freer, Jim; Johnes, Penny; Collins, Adrian

    2013-04-01

    The Water Framework Directive (WFD) requires continued reporting of the water quality status of all European waterbodies, with this status partly determined by the time a waterbody exceeds different pollution concentration thresholds. Routine water quality monitoring most commonly takes place at weekly to monthly time steps meaning that potentially important pollution events can be missed. This has the potential to result in the misclassification of water quality status. Against this context, this paper investigates the implications of sampling design on a range of existing water quality status metrics routinely applied to WFD compliance assessments. Previous research has investigated the effect of sampling design on the calculation of annual nutrient and sediment loads using a variety of different interpolation and extrapolation models. This work builds on this foundation, extending the analysis to include the effects of sampling regime on flow- and concentration-duration curves as well as threshold-exceedance statistics, which form an essential part of WFD reporting. The effects of sampling regime on both the magnitude of the summary metrics and their corresponding uncertainties are investigated. This analysis is being undertaken on data collected as part of the Hampshire Avon Demonstration Test Catchment (DTC) project; a DEFRA funded initiative investigating cost-effective solutions for reducing diffuse pollution from agriculture. The DTC monitoring platform is collecting water quality data at a variety of temporal resolutions and using differing collection methods, including weekly grab samples, daily ISCO autosamples and high resolution samples (15-30 min time step) using analysers in situ on the river bank. Datasets collected during 2011-2013 were used to construct flow- and concentration-duration curves. A bootstrapping methodology was employed to resample randomly the individual datasets and produce distributions of the curves in order to quantify the

  15. Bayesian Reliability Modeling and Assessment Solution for NC Machine Tools under Small-sample Data

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaojun; KAN Yingnan; CHEN Fei; XU Binbin; CHEN Chuanhai; YANG Chuangui

    2015-01-01

    Although Markov chain Monte Carlo(MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control(NC) machine tools is proposed on the basis of Bayes theories. An expert-judgment process of fusing multi-source prior information is developed to obtain the Weibull parameters’ prior distributions and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to two-parameter Weibull distribution to derive the formulas for the parameters’ posterior distributions and solve the calculation difficulty of high-dimensional integration. The method is then applied to the real data of a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures(MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
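
    A toy version of the grid approximation can make the procedure concrete. The sketch below uses hypothetical failure times and flat priors (the paper instead fuses multi-source expert priors) and reads off the posterior-mean MTBF as scale x Gamma(1 + 1/shape).

        import numpy as np
        from scipy.special import gamma as gamma_fn
        from scipy.stats import weibull_min

        # Hypothetical times between failures in hours (small-sample setting).
        t = np.array([120.0, 340.0, 95.0, 410.0, 230.0, 180.0])

        shapes = np.linspace(0.5, 3.0, 200)      # Weibull shape grid
        scales = np.linspace(50.0, 800.0, 200)   # Weibull scale grid
        K, S = np.meshgrid(shapes, scales, indexing="ij")

        # Log-likelihood on the grid; flat priors are used purely for
        # illustration, standing in for the paper's expert-judgment priors.
        loglik = np.zeros_like(K)
        for ti in t:
            loglik += weibull_min.logpdf(ti, c=K, scale=S)

        post = np.exp(loglik - loglik.max())
        post /= post.sum()                        # normalised grid posterior

        mtbf = S * gamma_fn(1.0 + 1.0 / K)        # MTBF per (shape, scale) pair
        print(f"Posterior-mean MTBF: {np.sum(post * mtbf):.1f} h")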

  16. Bayesian reliability modeling and assessment solution for NC machine tools under small-sample data

    Science.gov (United States)

    Yang, Zhaojun; Kan, Yingnan; Chen, Fei; Xu, Binbin; Chen, Chuanhai; Yang, Chuangui

    2015-11-01

    Although Markov chain Monte Carlo(MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control(NC) machine tools is proposed on the basis of Bayes theories. An expert-judgment process of fusing multi-source prior information is developed to obtain the Weibull parameters' prior distributions and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to two-parameter Weibull distribution to derive the formulas for the parameters' posterior distributions and solve the calculation difficulty of high-dimensional integration. The method is then applied to the real data of a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures(MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.

  17. A Simulation Approach to Assessing Sampling Strategies for Insect Pests: An Example with the Balsam Gall Midge

    OpenAIRE

    Carleton, R. Drew; Heard, Stephen B.; Silk, Peter J.

    2013-01-01

    Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using ...

  18. Automated volumetric assessment of the Achilles tendon (AVAT) using a 3D T2 weighted SPACE sequence at 3 T in healthy and pathologic cases

    International Nuclear Information System (INIS)

    Purpose: Achilles tendinopathy has been reported to be frequently associated with increasing volume of the tendon. This work aims at reliable and accurate volumetric quantification of the Achilles tendon using a newly developed contour detection algorithm applied to high-resolution MRI data sets recorded at 3 T. Materials and methods: A total of 26 healthy tendons and 4 degenerated tendons were examined for this study. Automated identification (AI) of tendon boundaries was performed in transverse slices with isotropic resolution (0.8 mm) gained with a T2-weighted SPACE sequence at 3 T. For AI a snake algorithm was applied and compared to manual tracing (MT). Results: AI was feasible in all examined tendons without further correction. AI of both tendons was performed in each participant within 2 min (2 × 37 slices) compared to MT lasting 20 min. MT and AI showed excellent agreement and correlation (R2 = 0.99, p < 0.001), with smaller volume deviations for AI (0.3 cm3 vs. 0.5 cm3) and a lower coefficient of variation (1% vs. 2%). Discussion: Compared to MT, AI allows assessment of tendon volumes in highly resolved MRI data in a more accurate, reliable and time-saving way. Therefore automated volume detection is seen as a helpful clinical tool for evaluation of small volumetric changes of the Achilles tendon.
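
    Snake-based contour detection of the kind applied here is available off the shelf. The sketch below is a minimal stand-in using scikit-image's active_contour on a synthetic disc (not the authors' tuned algorithm); summing cross-sectional area times slice thickness over all slices would then give a volume.

        import numpy as np
        from skimage.draw import disk
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        # Synthetic transverse "tendon" slice: a bright disc on a dark background.
        img = np.zeros((128, 128))
        img[disk((64, 64), 25)] = 1.0
        img = gaussian(img, sigma=3)

        # Initial circular snake placed loosely around the structure of interest.
        theta = np.linspace(0, 2 * np.pi, 200)
        init = np.column_stack([64 + 40 * np.sin(theta), 64 + 40 * np.cos(theta)])

        snake = active_contour(img, init, alpha=0.015, beta=10, gamma=0.001)

        # Cross-sectional area via the shoelace formula (in pixel units).
        x, y = snake[:, 1], snake[:, 0]
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        print(f"Segmented cross-sectional area: {area:.0f} px^2")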

  19. Quality Assessment of Attribute Data in GIS Based on Simple Random Sampling

    Institute of Scientific and Technical Information of China (English)

    LIU Chun; SHI Wenzhong; LIU Dajie

    2003-01-01

    On the basis of the principles of simple random sampling, a statistical model of the rate of disfigurement (RD) is put forward and described in detail. According to the definition of simple random sampling for attribute data in GIS, the mean and variance of the RD are derived as the characteristic values of the statistical model, in order to demonstrate the feasibility of measuring the accuracy of attribute data in GIS with the RD. Moreover, on the basis of the mean and variance of the RD, a quality assessment method for the attribute data of vector maps during data collection is discussed. An RD spread graph is also drawn to show whether the quality of the attribute data is under control. The RD model can judge the quality of attribute data synthetically, which distinguishes it from other measurement coefficients that only address classification accuracy.
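
    The record does not reproduce the paper's exact RD formula, so the sketch below makes the common textbook assumption that RD behaves like a defect proportion estimated from a simple random sample; all counts are hypothetical.

        import numpy as np

        rng = np.random.default_rng(7)

        N = 10_000   # attribute records in the GIS layer (hypothetical)
        n = 400      # simple random sample size

        # 1 = record with a defective attribute value, 0 = correct record.
        population = (rng.random(N) < 0.03).astype(int)
        sample = rng.choice(population, size=n, replace=False)

        rd_hat = sample.mean()                                   # estimated rate
        # SRS variance of a proportion, with finite-population correction.
        var_rd = (1 - n / N) * rd_hat * (1 - rd_hat) / (n - 1)
        print(f"RD estimate: {rd_hat:.4f}  (SE {np.sqrt(var_rd):.4f})")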

  20. Regional flood impact assessment based on local land use patterns and sample damage records

    International Nuclear Information System (INIS)

    Increasing land consumption and land demand particularly in mountainous regions entail further expansion of settlements to known hazard-prone areas. Potential impacts as well as regionally defined levels of 'acceptable risk' are often not transparently communicated and residual risks are not perceived by the public. Analysing past events and assessing regional damage potentials can help planners on all levels to improve comprehensive and sustainable risk management. In this letter, a geospatial and statistical approach to regional damage cost assessment is presented, integrating information on actual conditions in terms of land use disparities and recorded damage data from a documented severe flooding event. In a first step building objects are categorized according to their function and use. Tabular company information is linked to the building model via geocoded postal address data, enabling classification of building types in terms of predominant uses. For the disaster impact assessment the flood plain is delineated based on post-disaster aerial imagery and a digital terrain model distinguishing areas of long and short term flooding. Finally, four regional damage cost assessment scenarios on different levels of detail are calculated. The damage cost projection relies on available sample building-level damage records, allowing rough damage averaging for distinct building uses. Results confirm that consideration of local land use patterns is essential for optimizing regional damage cost projections.

  1. Urban air quality assessment using monitoring data of fractionized aerosol samples, chemometrics and meteorological conditions.

    Science.gov (United States)

    Yotova, Galina I; Tsitouridou, Roxani; Tsakovski, Stefan L; Simeonov, Vasil D

    2016-06-01

    The present article deals with the assessment of urban air using monitoring data for 10 different aerosol fractions (0.015-16 μm) collected at a typical urban site in the City of Thessaloniki, Greece. The data set was subject to multivariate statistical analysis (cluster analysis and principal components analysis) and, additionally, to HYSPLIT back-trajectory modeling in order to better assess the impact of weather conditions on the pollution sources identified. A specific element of the study is the effort to clarify the role of outliers in the data set. The appearance of outliers is strongly related to the atmospheric conditions on the particular sampling days, which lead to enhanced concentrations of pollutants (secondary emissions, sea sprays, road and soil dust, combustion processes), especially for ultrafine and coarse particles. It is also shown that three major sources affect the urban air quality of the location studied: sea sprays, mineral dust and anthropogenic influences (agricultural activity, combustion processes, and industrial sources). The level of impact is related, to a certain extent, to the aerosol fraction size. The assessment of the meteorological conditions leads to the definition of four downwind patterns affecting air quality (Pelagic; Western and Central Europe; Eastern and Northeastern Europe; and Africa and Southern Europe). Thus, the present study offers a complete urban air assessment taking into account weather conditions, pollution sources and aerosol fractioning. PMID:26942452
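
    The chemometric core of such a study (PCA plus cluster analysis on fraction-resolved concentration data) can be sketched in a few lines; the data matrix below is a random stand-in, not the Thessaloniki measurements.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Rows = sampling days; columns = element concentrations per aerosol fraction.
        X = rng.lognormal(mean=0.0, sigma=1.0, size=(60, 10))

        Z = StandardScaler().fit_transform(np.log(X))   # log-transform, then autoscale
        pca = PCA(n_components=3).fit(Z)
        print("Explained variance ratios:", pca.explained_variance_ratio_.round(2))

        # Hierarchical (Ward) clustering of sampling days; outlier days show up
        # as small clusters that merit a back-trajectory look.
        labels = fcluster(linkage(Z, method="ward"), t=4, criterion="maxclust")
        print("Cluster sizes:", np.bincount(labels)[1:])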

  2. Review on the fire risk evaluation items and sample fire models for its assessment

    International Nuclear Information System (INIS)

    NFPA-803, the prescriptive Fire Protection Standard for Nuclear Power Plants (NPPs), is to be replaced with NFPA-805, whose main tenet is a probabilistic, quantitative approach. With this in mind, this paper introduces the evaluation items that must be reviewed and selected for fire risk evaluation, and a sample fire model for their assessment, when the new standard is applied in NPPs. In addition, it is suggested that some parts of the fire modeling programs require modification and complementary renewal if such tools are to be used comprehensively and with validity

  3. The use of ESR technique for assessment of heating temperatures of archaeological lentil samples

    Science.gov (United States)

    Aydaş, Canan; Engin, Birol; Dönmez, Emel Oybak; Belli, Oktay

    2010-01-01

    Heat-induced paramagnetic centers in modern and archaeological lentils (Lens culinaris, Medik.) were studied by the X-band (9.3 GHz) electron spin resonance (ESR) technique. The modern red lentil samples were heated in an electrical furnace at increasing temperatures in the range 70-500 °C. The ESR spectral parameters (intensity, g-value and peak-to-peak line width) of the heat-induced organic radicals were investigated for the modern red lentil samples. The obtained ESR spectra indicate that the relative number of heat-induced paramagnetic species and the peak-to-peak line widths depend on the temperature and heating time of the modern lentil. The g-values also depend on the heating temperature but not on the heating time. Heated modern red lentils produced a range of organic radicals with g-values from g = 2.0062 to 2.0035. ESR signals of carbonised archaeological lentil samples from two archaeological deposits of the Van province in Turkey were studied, and g-values, peak-to-peak line widths, intensities and elemental compositions were compared with those obtained for modern samples in order to assess the temperatures at which these archaeological lentils were heated in prehistoric sites. The maximum temperatures of the previous heating of the carbonised UA5 and Y11 lentil seeds are about 500 °C and above 500 °C, respectively.
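
    For context, the quoted g-values follow from the standard ESR resonance condition hν = gμB B0; a minimal calculation is shown below (the resonance field is a hypothetical value, not taken from the study).

        from scipy.constants import h, physical_constants

        mu_B = physical_constants["Bohr magneton"][0]   # J/T

        nu = 9.3e9      # X-band microwave frequency used in the study, Hz
        B0 = 0.3316     # hypothetical resonance field, T

        g = h * nu / (mu_B * B0)
        print(f"g = {g:.4f}")   # values near 2.003-2.006 indicate organic radicals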

  4. Automated system for surveillance, assessment and prediction as a basis for complex protection of environment and population health

    International Nuclear Information System (INIS)

    A concept is formulated for an improved system of protecting the environment and population health in the regions of industrial and power complexes, comprising interrelated elements: an automated information system, a control system and a control object. A unified automated information system is suggested as a means of increasing the efficiency and expediency of decisions taken on the protection of labour conditions and of the life and health of the population against the effects of chemical and other harmful environmental factors. The main requirements, principles and ways of constructing a system that permits dynamic surveillance, analysis, estimation and forecasting in the source-environment-population health system are outlined. Some results of constructing the surveillance system, a unified data bank, a program package and a model family are presented

  5. A preliminary study to assess the construct validity of a cultural intelligence measure on a South African sample

    OpenAIRE

    Bright Mahembe; Amos S. Engelbrecht

    2014-01-01

    Orientation: Cultural intelligence is an essential social competence for effective individual interaction in a cross-cultural context. The cultural intelligence scale (CQS) is used extensively for assessing cultural intelligence; nevertheless, its reliability and validity on a South African sample are yet to be ascertained. Research purpose: The purpose of the current study was to assess the construct validity of the CQS on a South African sample. The results of the psychometric assessment off...

  6. Genotoxicity assessment of water sampled from R-11 reservoir by means of allium test

    International Nuclear Information System (INIS)

    Slides of root-tip meristem were stained with aceto-orcein. Approximately 150 ana-telophases were scored for each root, and 20-40 roots were analyzed for each water sample; in total, 3000-6000 ana-telophases were analyzed per water sample. Chromosome aberrations in ana-telophases (chromatid and chromosomal bridges and fragments) and mitotic abnormalities (multipolar mitoses and laggards) were scored. The data were analyzed using R. The aberration frequency in water samples from the natural control reservoir (0.46 ± 0.12%) insignificantly exceeded the frequencies of aberrations in distilled (0.15 ± 0.08%) and bottled (0.33 ± 0.08%) water. The average frequency of aberrant cells in the root meristem of onions germinated in water samples from the R-11 reservoir (1.36 ± 0.24%) was about 3 times higher than in the controls. Mitotic activity in the root meristem was slightly inhibited in bulbs germinated in the R-11 sample, but this effect was statistically insignificant. The water samples differed only in the frequency of abnormalities, not in the types of aberrations. Thus, the genotoxicity assessment of water sampled from the R-11 reservoir by means of the allium test shows the presence of a genotoxic factor in water from the reservoir. Document available in abstract form only. (authors)

  7. Genotoxicity assessment of water sampled from R-11 reservoir by means of allium test

    Energy Technology Data Exchange (ETDEWEB)

    Bukatich, E.; Pryakhin, E. [Urals Research Center for Radiation Medicine (Russian Federation); Geraskin, S. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)

    2014-07-01

    Slides of root-tip meristem were stained with aceto-orcein. Approximately 150 ana-telophases were scored for each root, and 20-40 roots were analyzed for each water sample; in total, 3000-6000 ana-telophases were analyzed per water sample. Chromosome aberrations in ana-telophases (chromatid and chromosomal bridges and fragments) and mitotic abnormalities (multipolar mitoses and laggards) were scored. The data were analyzed using R. The aberration frequency in water samples from the natural control reservoir (0.46 ± 0.12%) insignificantly exceeded the frequencies of aberrations in distilled (0.15 ± 0.08%) and bottled (0.33 ± 0.08%) water. The average frequency of aberrant cells in the root meristem of onions germinated in water samples from the R-11 reservoir (1.36 ± 0.24%) was about 3 times higher than in the controls. Mitotic activity in the root meristem was slightly inhibited in bulbs germinated in the R-11 sample, but this effect was statistically insignificant. The water samples differed only in the frequency of abnormalities, not in the types of aberrations. Thus, the genotoxicity assessment of water sampled from the R-11 reservoir by means of the allium test shows the presence of a genotoxic factor in water from the reservoir. Document available in abstract form only. (authors)

  8. Determination of Organochlorine Pesticides in Water Samples by Fully Automated Quantitative Concentrator-Gas Chromatography

    Institute of Scientific and Technical Information of China (English)

    曹旭静

    2016-01-01

    Organochlorine pesticides in surface water were extracted with n-hexane, and the extract was concentrated to a final volume of 1 mL using a fully automated quantitative evaporative concentrator at a water-bath temperature of 35 °C and a vacuum of 300 mbar; a single sample requires only 25 min. The organochlorine pesticides were then determined by gas chromatography after this pretreatment by liquid-liquid extraction and automated concentration. The detection limits of the method were 0.001-0.008 μg/L, and the average recoveries ranged from 78.6% to 104%. The method offers low detection limits, good precision, savings in time and labour, and a high degree of automation, and it is suitable for monitoring large batches of samples.

  9. Assessment of neuropsychological function through use of the Cambridge Neuropsychological Testing Automated Battery: performance in 4- to 12-year-old children.

    Science.gov (United States)

    Luciana, Monica; Nelson, Charles A

    2002-01-01

    In this article, children's performance on subtasks from the Cambridge Neuropsychological Testing Automated Battery (CANTAB) is described. Two samples were recruited, one of which included children who spoke English as a second language. Children in this group also completed subtests from the Wechsler Intelligence Scale for Children-Third Revision (WISC-III). Despite the fact that ESL children scored over 1 SD below the norm on the WISC-III Vocabulary subtest, there were no CANTAB performance distinctions between primary and secondary English-language speakers. In addition, several aspects of CANTAB performance were significantly correlated with verbal and nonverbal IQ. When developmental trends were examined, findings indicated that several aspects of frontal lobe function (memory span, working memory, and planning skills) are not functionally mature by the age of 12 years. Implications for use of the CANTAB in clinical studies are discussed. PMID:12661972

  10. A machine vision system for automated non-invasive assessment of cell viability via dark field microscopy, wavelet feature selection and classification

    Directory of Open Access Journals (Sweden)

    Friehs Karl

    2008-10-01

    Background: Cell viability is one of the basic properties indicating the physiological state of the cell; thus, it has long been one of the major considerations in biotechnological applications. Conventional methods for extracting information about cell viability usually need reagents to be applied to the targeted cells. These reagent-based techniques are reliable and versatile; however, some of them may be invasive and even toxic to the target cells. In support of automated noninvasive assessment of cell viability, a machine vision system has been developed. Results: This system is based on a supervised learning technique. It learns from images of certain kinds of cell populations and trains classifiers, which are then employed to evaluate images of given cell populations obtained via dark field microscopy. Wavelet decomposition is performed on the cell images. Energy and entropy are computed for each wavelet subimage as features. A feature selection algorithm is implemented to achieve better performance. Correlation between the results from the machine vision system and commonly accepted gold standards becomes stronger if wavelet features are utilized. The best performance is achieved with a selected subset of wavelet features. Conclusion: The machine vision system based on dark field microscopy in conjunction with supervised machine learning and wavelet feature selection automates the cell viability assessment and yields results comparable to commonly accepted methods. Wavelet features are found to be suitable for describing the discriminative properties of live and dead cells in viability classification. According to the analysis, live cells exhibit more morphological detail and are intracellularly more organized than dead ones, which display more homogeneous and diffuse gray values throughout the cells. Feature selection increases the system's performance. The reason lies in the fact that feature
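
    The wavelet energy/entropy features described are straightforward with PyWavelets. The sketch below is a minimal stand-in operating on a random image rather than dark-field micrographs; the wavelet family and decomposition level are arbitrary choices.

        import numpy as np
        import pywt

        def wavelet_features(img, wavelet="db2", level=3):
            """Energy and Shannon entropy of every wavelet subimage."""
            coeffs = pywt.wavedec2(img, wavelet=wavelet, level=level)
            subimages = [coeffs[0]] + [c for detail in coeffs[1:] for c in detail]
            feats = []
            for sub in subimages:
                e = np.sum(sub ** 2)                       # subband energy
                p = sub ** 2 / e if e > 0 else np.ones_like(sub) / sub.size
                feats += [e, -np.sum(p * np.log2(p + 1e-12))]  # energy, entropy
            return np.array(feats)

        # Stand-in for a dark-field cell image; real use would crop single cells.
        img = np.random.default_rng(1).random((64, 64))
        print(wavelet_features(img).shape)   # (3*level + 1) subimages x 2 features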

  11. Development of a Sampling Method for a Radionuclide Assessment of a Spent HEPA Filter Waste

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Young-Yong; Hong, Dae-Seok; Kang, Il-Sig; Kim, Tae-Kuk; Lee, Young-Hee; Shon, Jong-Sik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2007-07-01

    About 2,160 spent filter units have been stored in the waste storage facility of the Korea Atomic Energy Research Institute since the start of its operation. These are generally HEPA filters that have captured contaminants from the gas streams generated during the operation of the HANARO research reactor and the nuclear fuel cycle facilities. At the moment, to secure storage space, it is necessary to reduce the volume of the stored radioactive wastes through compression treatment or regulatory clearance. These methods are considered with a view to reducing management and disposal costs and securing free space in a waste storage facility approaching saturation. In order to dispose of the spent filters, it is first necessary to conduct a radionuclide assessment of them. To do that, a sampling procedure should be prepared for obtaining a representative sample from each spent filter. Based on the nuclide analysis of this representative sample, the corresponding spent filter can be sorted as either a regulatory clearance waste or a radioactive waste. In this study, the spent filter wastes were classified according to their generating facilities, their generation date and their surface dose rate. After selecting several HEPA filters, they were dismantled into a frame part and a filter medium part, and a quantitative analysis of the nuclides existing in the filter medium was conducted. From the analysis results, it was possible to divide the filter medium into three specific regions with respect to the nuclide distribution. As a result, these three regions can serve as a sampling guide for taking a representative sample of a spent HEPA filter waste prior to its treatment.

  12. Development of a Sampling Method for a Radionuclide Assessment of a Spent HEPA Filter Waste

    International Nuclear Information System (INIS)

    About 2,160 spent filter units have been stored in the waste storage facility of the Korea Atomic Energy Research Institute since the start of its operation. These are generally HEPA filters that have captured contaminants from the gas streams generated during the operation of the HANARO research reactor and the nuclear fuel cycle facilities. At the moment, to secure storage space, it is necessary to reduce the volume of the stored radioactive wastes through compression treatment or regulatory clearance. These methods are considered with a view to reducing management and disposal costs and securing free space in a waste storage facility approaching saturation. In order to dispose of the spent filters, it is first necessary to conduct a radionuclide assessment of them. To do that, a sampling procedure should be prepared for obtaining a representative sample from each spent filter. Based on the nuclide analysis of this representative sample, the corresponding spent filter can be sorted as either a regulatory clearance waste or a radioactive waste. In this study, the spent filter wastes were classified according to their generating facilities, their generation date and their surface dose rate. After selecting several HEPA filters, they were dismantled into a frame part and a filter medium part, and a quantitative analysis of the nuclides existing in the filter medium was conducted. From the analysis results, it was possible to divide the filter medium into three specific regions with respect to the nuclide distribution. As a result, these three regions can serve as a sampling guide for taking a representative sample of a spent HEPA filter waste prior to its treatment

  13. The impact of genetic heterogeneity on biomarker development in kidney cancer assessed by multiregional sampling

    International Nuclear Information System (INIS)

    Primary clear cell renal cell carcinoma (ccRCC) genetic heterogeneity may lead to an underestimation of the mutational burden detected from a single site evaluation. We sought to characterize the extent of clonal branching involving key tumor suppressor mutations in primary ccRCC and determine whether genetic heterogeneity could limit the mutation profiling from a single region assessment. Ex vivo core needle biopsies were obtained from three to five different regions of resected renal tumors at a single institution from 2012 to 2013. DNA was extracted and targeted sequencing was performed on five genes associated with ccRCC (von Hippel-Lindau [VHL], PBRM1, SETD2, BAP1, and KDM5C). We constructed phylogenetic trees by inferring clonal evolution based on the mutations present within each core and estimated the predictive power of detecting a mutation for each successive tumor region sampled. We obtained 47 ex vivo biopsy cores from 14 primary ccRCCs (median tumor size 4.5 cm, IQR 4.0–5.9 cm). Branching patterns of various complexities were observed in tumors with three or more mutations. A VHL mutation was detected in nine tumors (64%), each time being present ubiquitously throughout the tumor. Other genes had various degrees of regional mutational variation. Based on the mutations' prevalence, we estimated that three different tumor regions should be sampled to detect mutations in PBRM1, SETD2, BAP1, and/or KDM5C with 90% certainty. The mutational burden of renal tumors varies by region sampled. Single site assessment of key tumor suppressor mutations in primary ccRCC may not adequately capture the genetic predictors of tumor behavior
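
    Under an independence assumption across regions, the sampling-power estimate reduces to 1 − (1 − p)^k for k sampled regions, where p is the per-region detection prevalence; a quick check with an illustrative p (not the study's fitted value):

        # Probability of detecting a regionally variable mutation in >= 1 of k cores,
        # assuming independent regions; p is the per-region detection prevalence.
        def detection_prob(p: float, k: int) -> float:
            return 1.0 - (1.0 - p) ** k

        for k in range(1, 6):
            print(k, round(detection_prob(0.55, k), 3))  # p = 0.55 is illustrative
        # With p around 0.5-0.6, three regions already push detection past ~90%.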

  14. The impact of genetic heterogeneity on biomarker development in kidney cancer assessed by multiregional sampling.

    Science.gov (United States)

    Sankin, Alexander; Hakimi, Abraham A; Mikkilineni, Nina; Ostrovnaya, Irina; Silk, Mikhail T; Liang, Yupu; Mano, Roy; Chevinsky, Michael; Motzer, Robert J; Solomon, Stephen B; Cheng, Emily H; Durack, Jeremy C; Coleman, Jonathan A; Russo, Paul; Hsieh, James J

    2014-12-01

    Primary clear cell renal cell carcinoma (ccRCC) genetic heterogeneity may lead to an underestimation of the mutational burden detected from a single site evaluation. We sought to characterize the extent of clonal branching involving key tumor suppressor mutations in primary ccRCC and determine whether genetic heterogeneity could limit the mutation profiling from a single region assessment. Ex vivo core needle biopsies were obtained from three to five different regions of resected renal tumors at a single institution from 2012 to 2013. DNA was extracted and targeted sequencing was performed on five genes associated with ccRCC (von Hippel-Lindau [VHL], PBRM1, SETD2, BAP1, and KDM5C). We constructed phylogenetic trees by inferring clonal evolution based on the mutations present within each core and estimated the predictive power of detecting a mutation for each successive tumor region sampled. We obtained 47 ex vivo biopsy cores from 14 primary ccRCCs (median tumor size 4.5 cm, IQR 4.0-5.9 cm). Branching patterns of various complexities were observed in tumors with three or more mutations. A VHL mutation was detected in nine tumors (64%), each time being present ubiquitously throughout the tumor. Other genes had various degrees of regional mutational variation. Based on the mutations' prevalence, we estimated that three different tumor regions should be sampled to detect mutations in PBRM1, SETD2, BAP1, and/or KDM5C with 90% certainty. The mutational burden of renal tumors varies by region sampled. Single site assessment of key tumor suppressor mutations in primary ccRCC may not adequately capture the genetic predictors of tumor behavior. PMID:25124064

  15. Automated sequential injection-microcolumn approach with on-line flame atomic absorption spectrometric detection for implementing metal fractionation schemes of homogeneous and non-homogeneous solid samples of environmental interest

    DEFF Research Database (Denmark)

    Chomchoei, Roongrat; Miró, Manuel; Hansen, Elo Harald; Shiowatana, Juwadee

    2005-01-01

    An automated sequential injection (SI) system incorporating a dual-conical microcolumn is proposed as a versatile approach for the accommodation of both single and sequential extraction schemes for metal fractionation of solid samples of environmental concern. Coupled to flame atomic absorption...... Testing sequential extraction method has also been performed in a dynamic fashion and critically compared with the conventional batch-wise protocols. The ecotoxicological relevance of the data provided by both methods with different operationally defined conditions is thoroughly discussed. As compared to... traditional batch systems, the developed SI assembly offers minimum risks of sample contamination, absence of metal re-distribution/re-adsorption, and dramatic savings in operational time (from 16 h to 40-80 min per partitioning step). It readily facilitates the accurate manipulation of the extracting

  16. Automation of TL brick dating by ADAM-1

    International Nuclear Information System (INIS)

    supralinearity of the response at low doses, the first group of nine samples is irradiated by doses of beta radiation after the measurement of the value 'N'. All procedures of alpha and beta irradiation by varying doses and TL-signal measurement, as well as the age evaluation and error assessment, are programmable and fully automated. (author)

  17. Evaluation of an alternate method for sampling benthic macroinvertebrates in low-gradient streams sampled as part of the National Rivers and Streams Assessment

    Science.gov (United States)

    Benthic macroinvertebrates are sampled in streams and rivers as one of the assessment elements of the U.S. Environmental Protection Agency’s National Aquatic Resource Surveys. In a 2006 report, the recommendation was made that different yet comparable methods be evaluated for di...

  18. Automated Microbial Metabolism Laboratory

    Science.gov (United States)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  19. Automated uranium assays

    International Nuclear Information System (INIS)

    Precise, timely inventories of enriched uranium stocks are vital to help prevent the loss, theft, or diversion of this material for illicit use. A wet-chemistry analyzer has been developed at LLL to assist in these inventories by performing automated analyses of uranium samples from different stages in the nuclear fuel cycle. These assays offer improved accuracy, reduced costs, significant savings in manpower, and lower radiation exposure for personnel compared with present techniques

  20. Automated determination of the stable carbon isotopic composition (δ13C) of total dissolved inorganic carbon (DIC) and total nonpurgeable dissolved organic carbon (DOC) in aqueous samples: RSIL lab codes 1851 and 1852

    Science.gov (United States)

    Révész, Kinga M.; Doctor, Daniel H.

    2014-01-01

    The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
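
    For orientation, the δ13C values these lab codes report are conventional delta notation relative to the VPDB standard, δ13C = (R_sample/R_VPDB − 1) × 1000 ‰, with R the 13C/12C ratio. The sketch below shows the arithmetic with hypothetical ratios; the RSIL procedure itself is instrumental, not code.

        R_VPDB = 0.011180   # 13C/12C of the VPDB standard (a commonly used value)

        def delta13C(r_sample: float, r_standard: float = R_VPDB) -> float:
            """Per-mil delta value from isotope amount ratios."""
            return (r_sample / r_standard - 1.0) * 1000.0

        print(f"{delta13C(0.011068):+.2f} per mil")   # hypothetical DIC sample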

  1. Assessing the Role of Automation in Managing of Iranian E-banking and its Impact on Social Benefit

    Directory of Open Access Journals (Sweden)

    Hamidreza Salmani MOJAVERI

    2011-06-01

    Banks, in the context of commercial developments, have turned their attention to creating structural changes in receiving and payment systems and to facilitating service processes for customers. In fact, one of the reasons for the general tendency toward electronic business is bank managers' recognition of the importance and necessity of this phenomenon, which has led to their serious attention to providing a banking structure based on electronic methods. What makes e-banking services different from other, conventional methods is the quantitative and qualitative expansion of customer service. In other words, e-banking enables customers to receive wider and more diverse services. Furthermore, temporal and spatial constraints no longer reduce or restrict services to customers. The customer can also control his or her financial activities at any time and anywhere, without attending a bank's branches. The aim of this paper is to illustrate the status of banking automation and its social and organizational consequences in the Iranian e-banking system, and to provide appropriate recommendations.

  2. Automated extraction and assessment of functional features of areal measured microstructures using a segmentation-based evaluation method

    International Nuclear Information System (INIS)

    In addition to currently available surface parameters, according to ISO 4287:2010 and ISO 25178-2:2012—which are defined particularly for stochastic surfaces—a universal evaluation procedure is provided for geometrical, well-defined, microstructured surfaces. Since several million features (such as diameters, depths, etc.) are present on microstructured surfaces, segmentation techniques are used to automate the feature-based dimensional evaluation. By applying an additional extended 3D evaluation after the segmentation and classification procedure, the accuracy of the evaluation is improved compared to the direct evaluation of segments, and additional functional parameters can be derived. Advantages of the extended segmentation-based evaluation method include not only the ability to evaluate the manufacturing process statistically (e.g. by capability indices, according to ISO 21747:2007 and ISO 3534-2:2013) and to derive statistically reliable values for the correction of microstructuring processes, but also the direct re-use of the evaluated parameters (including their statistical distributions) in simulations for the calculation of probabilities with respect to the functionality of the microstructured surface. The practical suitability of this method is demonstrated using examples of microstructures for the improvement of sliding and ink transfer for printing machines. (paper)
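
    The capability indices mentioned reduce to simple formulas, Cp = (USL − LSL)/(6σ) and Cpk = min(USL − μ, μ − LSL)/(3σ). A minimal sketch with hypothetical feature measurements and tolerance limits:

        import numpy as np

        rng = np.random.default_rng(3)
        # Hypothetical diameters of one microstructure feature class, micrometres.
        d = rng.normal(loc=50.2, scale=0.4, size=5000)

        LSL, USL = 49.0, 51.0          # hypothetical tolerance limits
        mu, sigma = d.mean(), d.std(ddof=1)

        cp = (USL - LSL) / (6 * sigma)               # potential capability
        cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # capability incl. centring
        print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")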

  3. use of nuclear spectroscopic techniques for assessment of polluting elements in environmental samples

    International Nuclear Information System (INIS)

    The concentrations of elements and radioisotopes in sediment, soil, water and wild plant samples collected from Burullus Lake, Egypt, have been studied in order to understand current contamination due to agricultural and industrial wastewaters. Multiple approaches were applied to properly assess sediment contamination in the Burullus Lake. The distributions of Al, Fe and Mn in the lake's sediments are relatively homogeneous, with the exception of three locations with significantly high levels of Al and Fe in close proximity in the southeastern part. Sediments collected from the lake can be categorized as unpolluted, with the exception of three locations that were slightly polluted with Sr based on the geo-accumulation indices. High enrichment factors were obtained for Mn, Co, Cr, Cu and Zn. The MPIs indicate that one of the drains may have a major role in mobilizing major and trace metals in the lake environment, while cluster analysis indicates possible pollution from only three of the drainage channels. Comparisons with consensus-based sediment quality guidelines revealed that 100%, ∼69%, ∼92% and ∼15% of the samples exceeded the threshold effect concentration for Cr, Cu, Ni and Zn, respectively, with over 15% of the sample concentrations for Cr and Ni falling above the probable effect concentration. On the other hand, no samples exceeded either level for Pb. The concentration of 40K is uniform, and that of 137Cs is generally higher in the eastern part of the lake. The results indicate that 226Ra is less soluble in the lake environment than 232Th. Elemental concentrations in water have uniform distributions, and Fe, Mn, Co, Cr, Cu and Ni are more likely to exist in the soluble phase in the lake environment.
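
    The indices used above have standard definitions: the geo-accumulation index Igeo = log2(Cn/(1.5 Bn)) and the enrichment factor EF = (C/Al)sample/(C/Al)background. The sketch below applies them to hypothetical concentrations; the choice of Al as the reference element is an assumption of this illustration.

        import numpy as np

        # Hypothetical sediment and background concentrations (mg/kg).
        sample = {"Al": 62000.0, "Zn": 140.0, "Cu": 55.0}
        background = {"Al": 80000.0, "Zn": 95.0, "Cu": 45.0}

        for metal in ("Zn", "Cu"):
            igeo = np.log2(sample[metal] / (1.5 * background[metal]))
            ef = (sample[metal] / sample["Al"]) / (background[metal] / background["Al"])
            # Igeo < 0 indicates an unpolluted class in the usual Mueller scale.
            print(f"{metal}: Igeo = {igeo:+.2f}, EF = {ef:.2f}")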

  4. Comparison of soil solution sampling techniques to assess metal fluxes from contaminated soil to groundwater.

    Science.gov (United States)

    Coutelot, F; Sappin-Didier, V; Keller, C; Atteia, O

    2014-12-01

    The unsaturated zone plays a major role in elemental fluxes in terrestrial ecosystems. A representative chemical analysis of soil pore water is required for the interpretation of soil chemical phenomena, and particularly to assess trace element (TE) mobility. This requires an optimal sampling system that avoids modification of the extracted soil water chemistry and allows an accurate estimation of solute fluxes. In this paper, the chemical composition of soil solutions sampled by Rhizon® samplers connected to a standard syringe was compared to that obtained with two other types of suction probes (Rhizon® + vacuum tube and Rhizon® + diverted flow system). We investigated the effects of different vacuum application procedures on the concentrations of spiked elements (Cr, As, Zn) mixed as powder into the first 20 cm of 100-cm columns and of non-spiked elements (Ca, Na, Mg) in two types of columns (SiO2 sand and a mixture of kaolinite + SiO2 sand substrates). Rhizon® samplers were installed at different depths. The metal concentrations showed that (i) in sand, peak concentrations cannot be correctly sampled, so the flux cannot be estimated, and the errors can easily reach a factor of 2; (ii) in sand + clay columns, peak concentrations were larger, indicating that they could be sampled, but, due to sorption on clay, it was not possible to compare fluxes at different depths. The different samplers tested were not able to reflect the elemental flux to groundwater, and, although the Rhizon® + syringe device was more accurate, the best solution remains the use of a lysimeter whose bottom is kept continuously at a suction close to that existing in the soil. PMID:25277861

  5. Sampling Procedure for a Radionuclide Assessment of a Spent HEPA Filter Waste

    International Nuclear Information System (INIS)

    According to the operation of nuclear facilities and their continuous construction, a great number of used high efficiency particulate air (HEPA) filters, which are widely used in ventilation systems in the nuclear industry, have been generated as spent filter waste. All the HEPA filter wastes generated at KAERI have been stored in their original form without any treatment. However, to secure space in a waste storage facility approaching saturation, it is desirable to treat them by compaction, in view of radioactive waste treatment and storage, and finally to repack the compacted spent filters into 200 liter drums for sending them to a disposal site. In order to dispose of the HEPA filters, it is first necessary to conduct a radionuclide assessment of them before compaction. However, it is difficult to directly measure the radioactive concentration level of the nuclides captured in a HEPA filter because of its great bulk and specific shape. Therefore, a representative sample is taken from a HEPA filter, and the analysis results for it are regarded as representative values for the corresponding HEPA filter. To use this method, it is essential to confirm the validity of the sampling procedure and of the representative value. In this study, the depth distribution of the captured nuclides in a HEPA filter waste was first investigated. From the results, it was possible to obtain a representative sample from the intake part and the outlet part of a HEPA filter without dismantlement. A punch device with a diameter of 2 inches was then developed for taking a representative sample of regular size

  6. Sampling Procedure for a Radionuclide Assessment of a Spent HEPA Filter Waste

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Young-Yong; Hong, Dae-Seok; Kang, Il-Sik; Kim, Tae-Kuk; Lee, Young-Hee; Shon, Jong-Sik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2007-10-15

    According to the operation of nuclear facilities and their continuous construction, a great number of used high efficiency particulate air (HEPA) filters, which are widely used in ventilation systems in the nuclear industry, have been generated as spent filter waste. All the HEPA filter wastes generated at KAERI have been stored in their original form without any treatment. However, to secure space in a waste storage facility approaching saturation, it is desirable to treat them by compaction, in view of radioactive waste treatment and storage, and finally to repack the compacted spent filters into 200 liter drums for sending them to a disposal site. In order to dispose of the HEPA filters, it is first necessary to conduct a radionuclide assessment of them before compaction. However, it is difficult to directly measure the radioactive concentration level of the nuclides captured in a HEPA filter because of its great bulk and specific shape. Therefore, a representative sample is taken from a HEPA filter, and the analysis results for it are regarded as representative values for the corresponding HEPA filter. To use this method, it is essential to confirm the validity of the sampling procedure and of the representative value. In this study, the depth distribution of the captured nuclides in a HEPA filter waste was first investigated. From the results, it was possible to obtain a representative sample from the intake part and the outlet part of a HEPA filter without dismantlement. A punch device with a diameter of 2 inches was then developed for taking a representative sample of regular size.

  7. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based automated process control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  8. Assessments of Cancer Risk from Soil Samples in Gebeng Industrial Estate, Pahang and Amang Samples in Perak

    International Nuclear Information System (INIS)

    Industrial activities such as tin tailings and rare earth processing contribute radiological risk to human health and the environment. Such activities can accumulate naturally occurring radioactive material (NORM) at significant concentrations in the environment. The aim of this study was to determine the activity concentrations of thorium-232 (232Th), uranium-238 (238U) and potassium-40 (40K) in soil samples around the Gebeng Industrial Estate, Pahang, and in samples of ilmenite and monazite from three tin tailings processing plants in Perak, using gamma-ray spectrometry. The terrestrial gamma dose rate, the annual dose and the cancer risk were also determined. The activity concentrations of 232Th, 238U and 40K in the Gebeng soil samples were found in the ranges of 14.3 - 102.4, 23.8 - 81.3 and 73.3 - 451 Bq kg-1, respectively, while those for the ilmenite and monazite samples were in the ranges of 259 - 166500, 194 - 28750 and 26.4 - 11991 Bq kg-1, respectively. The terrestrial gamma dose rate was 22 - 108 nGy h-1 at the Gebeng Industrial Estate and 390 - 6650 nGy h-1 at the tin tailings processing plants, whereas the annual doses were 0.02 - 0.15 and 0.47 - 68 mSv y-1, respectively. The study showed that the cancer risk was 4 people per million in the Gebeng industrial area and 3,702 people per million at the tin tailings processing plants. The activity concentrations of soil from the industrial area were within the range of the Malaysian soil background reported by UNSCEAR 2000. The activity concentration, terrestrial gamma dose rate, annual dose and cancer risk were lower in the industrial area than at the tin tailings processing plants, owing to the high thorium content of the monazite processed there. Continuous monitoring of the environmental dose is recommended in order to ensure the
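
    The dose chain behind figures like these follows widely used UNSCEAR-style formulas. The sketch below shows the arithmetic with hypothetical activity concentrations and standard coefficient choices (outdoor occupancy 0.2, 0.7 Sv/Gy, 70-year duration, 0.05/Sv risk factor), which may differ from the study's exact assumptions.

        # Hypothetical activity concentrations (Bq/kg); not the study's data.
        C_U, C_Th, C_K = 60.0, 90.0, 300.0

        # UNSCEAR 2000 conversion: absorbed dose rate in air at 1 m, nGy/h.
        D = 0.462 * C_U + 0.604 * C_Th + 0.0417 * C_K

        occupancy, conv = 0.2, 0.7       # outdoor occupancy; Sv/Gy conversion
        aede = D * 8760 * occupancy * conv * 1e-6    # annual effective dose, mSv/y

        elcr = aede * 1e-3 * 70 * 0.05   # excess lifetime risk (70 y, 0.05/Sv)
        print(f"D = {D:.1f} nGy/h, AEDE = {aede:.3f} mSv/y, ELCR = {elcr:.2e}")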

  9. A simulation approach to assessing sampling strategies for insect pests: an example with the balsam gall midge.

    Directory of Open Access Journals (Sweden)

    R Drew Carleton

    Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with "pre-sampling" data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex) attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n ∼ 100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand) was the most efficient, with sample means converging on true mean density for sample sizes of n ∼ 25-40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required by sequential sampling methods.
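
    The pre-sampling simulation idea is easy to reproduce: draw counts from an assumed negative binomial and watch how quickly the running sample mean converges as trees are added. The parameters below are hypothetical, not the Christmas-tree-farm fits.

        import numpy as np

        rng = np.random.default_rng(11)

        # Negative binomial parameterised by mean m and clumping parameter k:
        # numpy uses (n, p) with n = k and p = k / (k + m).
        m, k = 8.0, 0.7                 # hypothetical galls/tree and aggregation
        counts = rng.negative_binomial(n=k, p=k / (k + m), size=(2000, 40))

        # Error of the running mean as trees are added, averaged over simulations.
        running_mean = np.cumsum(counts, axis=1) / np.arange(1, 41)
        rel_err = np.abs(running_mean - m) / m
        for n_trees in (10, 25, 40):
            print(f"n = {n_trees:2d}: mean |error| = "
                  f"{rel_err[:, n_trees - 1].mean():.2%}")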

  10. Quality-control design for surface-water sampling in the National Water-Quality Assessment Program

    Science.gov (United States)

    Mueller, David K.; Martin, Jeffrey D.; Lopes, Thomas J.

    1997-01-01

    The data-quality objectives of the National Water-Quality Assessment Program include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of chemical analyses of surface-water samples. The quality-control samples used to make these estimates include field blanks, field matrix spikes, and replicates. This report describes the design for collection of these quality-control samples in National Water-Quality Assessment Program studies and the data management needed to properly identify these samples in the U.S. Geological Survey's national data base.

  11. Automated fault-management in a simulated spaceflight micro-world

    Science.gov (United States)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examines the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) through a computerized fault-finding guide, at a medium LOA through an automated diagnosis and recovery advisory, and at a high LOA through automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification times, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources through automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.

  12. 291-B-1 stack monitoring and sampling system annual system assessment report

    International Nuclear Information System (INIS)

    The B Plant 291-B-1 main stack exhausts gaseous effluents to the atmosphere from the 221-B Building canyon and cells, the No. 1 Vessel Ventilation System (VVS1), the 212-B Cask Station cell ventilation system, and, to a limited capacity, the 224-B Building. VVS1 collects offgases from various process tanks in 221-B Building, while the 224-B system maintains a negative pressure in out-of-service, sealed process tanks. B Plant Administration Manual, WHC-CM-7-5, Section 5.30 requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 291-B-1 (System Number B977A) at B Plant. The system is functional and performing satisfactorily

  13. 291-B-1 stack monitoring and sampling system annual system assessment report

    Energy Technology Data Exchange (ETDEWEB)

    Ridge, T.M.

    1994-12-16

    The B Plant 291-B-1 main stack exhausts gaseous effluents to the atmosphere from the 221-B Building canyon and cells, the No. 1 Vessel Ventilation System (VVS1), the 212-B Cask Station cell ventilation system, and, to a limited capacity, the 224-B Building. VVS1 collects offgases from various process tanks in 221-B Building, while the 224-B system maintains a negative pressure in out-of-service, sealed process tanks. B Plant Administration Manual, WHC-CM-7-5, Section 5.30 requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 291-B-1 (System Number B977A) at B Plant. The system is functional and performing satisfactorily.

  14. Systematic assessment of reduced representation bisulfite sequencing to human blood samples

    DEFF Research Database (Denmark)

    Wang, Li; Sun, Jihua; Wu, Honglong;

    2012-01-01

    systematically assessed the genomic coverage, coverage depth and reproducibility of this technology, as well as the concordance of DNA methylation levels measured by RRBS and direct bisulfite sequencing for the detected CpG sites. Our results suggest that RRBS can cover more than half of CpG islands and promoter...... regions with a good coverage depth, and the proportion of CpG sites covered by the biological replicates reaches 80-90%, indicating good reproducibility. Given a smaller data quantity, RRBS enjoys much better coverage depth than direct bisulfite sequencing, and the concordance of DNA methylation levels...... between the two methods is high. It can be concluded that RRBS is a time- and cost-effective sequencing method for unbiased DNA methylation profiling of CpG islands and promoter regions on a genome-wide scale, and it is the method of choice for assaying certain genomic regions in multiple samples in a rapid

  15. Natural radioactivity measurements and dose assessments in sand samples collected from Zonguldak beaches in Turkey

    International Nuclear Information System (INIS)

    In this study, measurements of the gamma activity concentrations of 238U, 232Th and 40K natural radionuclides and associated dose assessments in sand samples collected from Zonguldak beaches (Turkey) were carried out. In order to measure the activity concentrations an Ortec GEM30 P4 model gamma spectrometer with HPGe detector was used. The activity concentrations for 238U, 232Th and 40K radionuclides were found to vary from 9.98 ± 0.82 to 56.81 ± 2.44, from 9.93 ± 1.46 to 48.87 ± 3.81 and from 103.00 ± 6.73 to 610.50 ± 24.24 Bq kg-1, respectively. The gamma dose rates were estimated to be from 16.67 ± 1.63 to 79.44 ± 4.37 nGy h-1. (author)
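
Dose-rate figures like those above are conventionally derived from the measured activity concentrations using the UNSCEAR conversion coefficients. The sketch below shows that calculation; the coefficients are the widely used UNSCEAR (2000) values, and the sample inputs are simply the upper ends of the ranges reported in the record, so this illustrates the arithmetic rather than reproducing the authors' exact procedure.

```python
def absorbed_dose_rate(c_u: float, c_th: float, c_k: float) -> float:
    """Outdoor absorbed gamma dose rate in air (nGy/h) at 1 m above ground,
    from 238U(226Ra), 232Th and 40K activity concentrations in Bq/kg,
    using the UNSCEAR (2000) conversion coefficients."""
    return 0.462 * c_u + 0.604 * c_th + 0.0417 * c_k

def radium_equivalent(c_ra: float, c_th: float, c_k: float) -> float:
    """Radium equivalent activity (Bq/kg); 370 Bq/kg is the usual limit."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

# Upper ends of the activity ranges reported in the record (Bq/kg):
d = absorbed_dose_rate(56.81, 48.87, 610.50)
print(f"dose rate ≈ {d:.1f} nGy/h")   # comes out close to the reported maximum
print(f"Ra-eq ≈ {radium_equivalent(56.81, 48.87, 610.50):.1f} Bq/kg")
```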

  16. Assessing the diagnostic validity of a structured psychiatric interview in a first-admission hospital sample

    DEFF Research Database (Denmark)

    Frederiksen, Julie Elisabeth Nordgaard; Revsbech, Rasmus; Sæbye, Ditte;

    2012-01-01

    The use of structured psychiatric interviews performed by non-clinicians is frequent for research purposes and is becoming increasingly common in clinical practice. The validity of such interviews has rarely been evaluated empirically. In this study of a sample of 100 diagnostically heterogeneous......, first-admitted inpatients, the results of an assessment with the Structured Clinical Interview for DSM-IV (SCID), yielding a DSM-IV diagnosis and performed by a trained non-clinician, were compared with a consensus lifetime best diagnostic estimate (DSM-IV) by two experienced research clinicians, based...... on multiple sources of information, which included videotaped comprehensive semi-structured narrative interviews. The overall kappa agreement was 0.18. The sensitivity and specificity for the diagnosis of schizophrenia by SCID were 19% and 100%, respectively. It is concluded that structured...

  17. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
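
As a rough illustration of the LagSVD idea, the following sketch builds nonparametric lag-distributions of an angle sequence (a 2D histogram of angle pairs separated by a given lag), stacks them over several lags, and summarizes them with a singular value decomposition. The binning, lag range and toy data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def lag_distribution(angles: np.ndarray, lag: int, bins: int = 36) -> np.ndarray:
    """2D histogram of (angle_i, angle_{i+lag}) pairs, normalized to a density."""
    hist, _, _ = np.histogram2d(angles[:-lag], angles[lag:],
                                bins=bins, range=[[-180, 180], [-180, 180]])
    return hist / hist.sum()

def lag_svd(angles: np.ndarray, max_lag: int = 5, bins: int = 36):
    """Stack vectorized lag-distributions and take their SVD; the leading
    singular vectors summarize how the joint distribution changes with lag."""
    m = np.vstack([lag_distribution(angles, k, bins).ravel()
                   for k in range(1, max_lag + 1)])
    return np.linalg.svd(m, full_matrices=False)

# Toy dihedral-angle trace in degrees, in place of a real protein chain:
rng = np.random.default_rng(0)
phi = rng.uniform(-180, 180, size=5000)
u, s, vt = lag_svd(phi)
print(s)  # singular values: how much structure each lag component carries
```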

  18. Osteoporosis Self-Assessment Tool Performance in a Large Sample of Postmenopausal Women of Mendoza, Argentina

    Directory of Open Access Journals (Sweden)

    Fernando D. Saraví

    2013-01-01

    Full Text Available The Osteoporosis Self-assessment Tool (OST) is a clinical instrument designed to select patients at risk of osteoporosis, who would benefit from a bone mineral density measurement. The OST only takes into account the age and weight of the subject. It was developed for Asian women and later validated for European and North American white women. The performance of the OST in a sample of 4343 women from Greater Mendoza, a large metropolitan area of Argentina, was assessed. Dual X-ray absorptiometry (DXA) scans of lumbar spine and hip were obtained. Patients were classified as either osteoporotic (n=1830) or nonosteoporotic (n=2513) according to their lowest T-score at any site. Osteoporotic patients had lower OST scores (P<0.0001). A receiver operating characteristic (ROC) curve showed an area under the curve of 71% (P<0.0001), with a sensitivity of 83.7% and a specificity of 44% for a cut-off value of 2. Positive predictive value was 52% and negative predictive value was 79%. The odds ratio for the diagnosis of osteoporosis was 4.06 (CI95 3.51 to 4.71; P<0.0001). It is concluded that the OST is useful for selecting postmenopausal women for DXA testing in the studied population.
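
Since the OST uses only age and weight, it is easy to compute. The commonly cited formulation, assumed in the sketch below, truncates 0.2 x (weight in kg - age in years) to an integer; with the cut-off of 2 mentioned in the abstract, patients scoring below it would be flagged for DXA referral (whether the study counted scores equal to the cut-off as screen-positive is not stated in the record).

```python
import math

def ost_score(weight_kg: float, age_years: float) -> int:
    """Osteoporosis Self-assessment Tool: truncate 0.2 * (weight - age)."""
    return math.trunc(0.2 * (weight_kg - age_years))

def refer_for_dxa(weight_kg: float, age_years: float, cutoff: int = 2) -> bool:
    """Flag patients scoring below the cut-off as candidates for DXA testing."""
    return ost_score(weight_kg, age_years) < cutoff

print(ost_score(58, 71))      # 0.2 * (58 - 71) = -2.6 -> -2, would be referred
print(refer_for_dxa(80, 55))  # 0.2 * (80 - 55) = 5 -> not referred
```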

  19. Philadelphia Brief Assessment of Cognition in healthy and clinical Brazilian sample

    Directory of Open Access Journals (Sweden)

    Danilo Assis Pereira

    2012-03-01

    Full Text Available The Philadelphia Brief Assessment of Cognition (PBAC) is a neuropsychological screening instrument that assesses five cognitive domains: working memory, visuospatial functioning, language, episodic memory and comportment. The aim is to verify whether the PBAC can properly be used in a Brazilian sample. Two groups participated in this study: (a) 200 healthy volunteers, 100 young [21.6 (2.5) years old] and 100 older adults [70.1 (7.3) years old], all with >12 years of education; and (b) 30 Alzheimer's patients (AD) [73.7 (5.7) years old] with 4-11 years of education. The PBAC scores, (a) 95.8 (2.6) and 90.0 (4.4), and (b) 65.0 (10.8), were correlated with the Mini-Mental State Examination (MMSE) for the young [29.1 (0.9)], older adult [28.3 (1.4)] and AD [18.4 (3.0)] groups. A positive correlation between MMSE and PBAC (r=0.9, p<0.001) was found. Negative correlations were observed between PBAC domains [memory (-0.63), visuospatial abilities (-0.44) and working memory (-0.3) tasks]. MANOVA showed a better male performance in visuospatial functioning (F=8.5, p=0.004). The Brazilian version of the PBAC proved to be a promising screening instrument for clinical purposes.

  20. A cost-effective technique for integrating personal radiation dose assessment with personal gravimetric sampling

    International Nuclear Information System (INIS)

    During recent years there has been an increasing awareness internationally of radiation levels in the mining and milling of radioactive ores, including those from non-uranium mines. A major aspect of radiation control is concerned with the measurement of radiation levels and the assessment of radiation doses incurred by individual workers. Current techniques available internationally for personnel monitoring of radiation exposures are expensive and there is a particular need to reduce the cost of personal radiation monitoring in South African gold mines because of the large labour force employed. In this regard the obvious benefits of integrating personal radiation monitoring with existing personal monitoring systems already in place in South African gold mines should be exploited. A system which can be utilized for this purpose is personal gravimetric sampling. A new cost-effective technique for personal radiation monitoring, which can be fully integrated with the personal gravimetric sampling strategy being implemented on mines, has been developed in South Africa. The basic principles of this technique and its potential in South African mines are described. 9 refs., 7 figs

  1. SU-E-I-89: Assessment of CT Radiation Dose and Image Quality for An Automated Tube Potential Selection Algorithm Using Pediatric Anthropomorphic and ACR Phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Mahmood, U; Erdi, Y; Wang, W [Memorial Sloan Kettering Cancer Center, NY, NY (United States)

    2014-06-01

    Purpose: To assess the impact of General Electric's automated tube potential selection algorithm, kV assist (kVa), on radiation dose and image quality, with an emphasis on optimizing protocols based on noise texture. Methods: Radiation dose was assessed by inserting optically stimulated luminescence dosimeters (OSLs) throughout the body of a pediatric anthropomorphic phantom (CIRS). The baseline protocol was: 120 kVp, 80 mA, 0.7 s rotation time. Image quality was assessed by calculating the contrast to noise ratio (CNR) and noise power spectrum (NPS) from the ACR CT accreditation phantom. CNRs were calculated according to the steps described in the ACR CT phantom testing document. NPS was determined by taking the 3D FFT of the uniformity section of the ACR phantom. NPS and CNR were evaluated with and without kVa and for all available adaptive statistical iterative reconstruction (ASiR) settings, ranging from 0 to 100%. Each NPS was also evaluated for its peak frequency difference (PFD) with respect to the baseline protocol. Results: For the baseline protocol, CNR was found to decrease from 0.460 ± 0.182 to 0.420 ± 0.057 when kVa was activated. When compared against the baseline protocol, the PFD at ASiR of 40% yielded a decrease in noise magnitude as realized by the increase in CNR = 0.620 ± 0.040. The liver dose decreased by 30% with kVa activation. Conclusion: Application of kVa reduces the liver dose by up to 30%. However, reduction in image quality for abdominal scans occurs when using the automated tube voltage selection feature at the baseline protocol. As demonstrated by the CNR and NPS analysis, the texture and magnitude of the noise in reconstructed images at ASiR 40% was found to be the same as our baseline images. We have demonstrated that 30% dose reduction is possible when using 40% ASiR with kVa in pediatric patients.

  2. Comparison of Individual and Pooled Stool Samples for the Assessment of Soil-Transmitted Helminth Infection Intensity and Drug Efficacy

    OpenAIRE

    Zeleke Mekonnen; Selima Meka; Mio Ayana; Johannes Bogers; Jozef Vercruysse; Bruno Levecke

    2013-01-01

    BACKGROUND: In veterinary parasitology samples are often pooled for a rapid assessment of infection intensity and drug efficacy. Currently, studies evaluating this strategy in large-scale drug administration programs to control human soil-transmitted helminths (STHs; Ascaris lumbricoides, Trichuris trichiura, and hookworm), are absent. Therefore, we developed and evaluated a pooling strategy to assess intensity of STH infections and drug efficacy. METHODS/PRINCIPAL FINDINGS: Stool samples fro...

  3. Assessment of natural radioactivity levels and associated dose rates in soil samples from Northern Rajasthan, India.

    Science.gov (United States)

    Duggal, Vikas; Rani, Asha; Mehra, Rohit; Ramola, R C

    2014-01-01

    The analysis of naturally occurring radionuclides ((226)Ra, (232)Th and (40)K) has been carried out in 40 soil samples collected from four districts of the Northern Rajasthan, India using gamma-ray spectrometry with an NaI(Tl) detector. The activity concentrations of the samples range from 38±9 to 65±11 Bq kg(-1) with a mean value of 52 Bq kg(-1) for (226)Ra, from 8±8 to 32±9 Bq kg(-1) with a mean value of 19 Bq kg(-1) for (232)Th and from 929±185 to 1894±249 Bq kg(-1) with a mean value of 1627 Bq kg(-1) for (40)K. The measured activity concentration of (226)Ra and (40)K in soil was higher and for (232)Th was lower than the worldwide range. Radium equivalent activities were calculated for the soil samples to assess the radiation hazards arising due to the use of these soils in the construction of buildings. The calculated average radium equivalent activity was 205±20 Bq kg(-1), which is less than the recommended limit of 370 Bq kg(-1) by the Organization for Economic Cooperation and Development. The total absorbed dose rate calculated from the activity concentration of (226)Ra, (232)Th and (40)K ranges from 77 to 123 nGy h(-1) with an average value of 103 nGy h(-1). The mean external (Hex) and internal hazard indices (Hin) for the area under study were determined to be 0.55 and 0.69, respectively. The corresponding average annual effective dose was found to be 0.63 mSv. PMID:23943368

  4. Automated activation-analysis system

    International Nuclear Information System (INIS)

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described.

  5. Macrothrombocytopenia in North India: Role of Automated Platelet Data in the Detection of an Under Diagnosed Entity

    OpenAIRE

    Kakkar, Naveen; John, M. Joseph; Mathew, Amrith

    2014-01-01

    Congenital macrothrombocytopenia is being increasingly recognised because of the increasing availability of automated platelet counts during routine complete blood count. If not recognised, these patients may be unnecessarily investigated or treated. The study was done to assess the occurrence of macrothrombocytopenia in the North Indian population and the role of automated platelet parameters in its detection. This prospective study was done on patients whose blood samples were sent for CBC ...

  6. Relationships between Narrative Language Samples and Norm-Referenced Test Scores in Language Assessments of School-Age Children

    Science.gov (United States)

    Ebert, Kerry Danahy; Scott, Cheryl M.

    2014-01-01

    Purpose: Both narrative language samples and norm-referenced language tests can be important components of language assessment for school-age children. The present study explored the relationship between these 2 tools within a group of children referred for language assessment. Method: The study is a retrospective analysis of clinical records from…

  7. A comparison of three macroinvertebrate sampling devices for use in conducting rapid-assessment procedures of Delmarva Peninsula wetlands

    Science.gov (United States)

    Lowe, Terrence (Peter); Tebbs, Kerry; Sparling, Donald W.

    2016-01-01

    Three types of macroinvertebrate collecting devices, Gerking box traps, D-shaped sweep nets, and activity traps, have commonly been used to sample macroinvertebrates when conducting rapid biological assessments of North American wetlands. We compared collections of macroinvertebrates identified to the family level made with these devices in 6 constructed and 2 natural wetlands on the Delmarva Peninsula of Maryland. We also assessed their potential efficacy in comparisons among wetlands using several proportional and richness attributes. Differences in median diversity among samples from the 3 devices were significant; the sweep-net samples had the greatest diversity and the activity-trap samples had the least diversity. Differences in median abundance were not significant between the Gerking box-trap samples and sweep-net samples, but median abundance among activity-trap samples was significantly lower than among samples of the other 2 devices. Within samples, the proportions of median diversity composed of major class and order groupings were similar among the 3 devices. However the proportions of median abundance composed of the major class and order groupings within activity-trap samples were not similar to those of the other 2 devices. There was a slight but significant increase in the total number of families captured when we combined activity-trap samples with Gerking box-trap samples or with sweep-net samples, and the per-sample median numbers of families of the combined activity-trap and sweep-net samples was significantly higher than that of the combined activity-trap and Gerking box-trap samples. We detected significant differences among wetlands for 4 macroinvertebrate attributes with the Gerking box-trap data, 6 attributes with sweep-net data, and 5 attributes with the activity-trap data. A small, but significant increase in the number of attributes showing differences among wetlands occurred when we combined activity-trap samples with those of the

  8. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection

    International Nuclear Information System (INIS)

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is on-line generated by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 μg mL-1 of propoxur, with a detection limit of 5 ng mL-1. The repeatability was 0.82% expressed as relative standard deviation (n = 10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL-1 levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L-1 using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 μg kg-1.
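
Figures of merit such as the 5 ng mL-1 detection limit quoted above are typically derived from the calibration line, for example as three times the blank (or residual) standard deviation divided by the slope. The sketch below shows that generic calculation with invented calibration data; it is not the paper's data, nor necessarily its exact LOD definition.

```python
import numpy as np

# Hypothetical calibration data for a flow-injection CL method:
conc = np.array([0.05, 0.1, 0.5, 1.0, 2.5, 5.0])      # propoxur, µg/mL
signal = np.array([0.9, 1.8, 9.1, 18.3, 45.0, 91.2])  # CL intensity (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)          # least-squares line
residuals = signal - (slope * conc + intercept)
s_noise = residuals.std(ddof=2)  # residual SD as a stand-in for blank noise

lod = 3 * s_noise / slope        # 3-sigma detection limit
print(f"slope = {slope:.2f} a.u. per µg/mL, LOD ≈ {lod * 1000:.1f} ng/mL")
```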

  9. Field trial of applicability of lot quality assurance sampling survey method for rapid assessment of prevalence of active trachoma.

    OpenAIRE

    2003-01-01

    OBJECTIVE: To test the applicability of lot quality assurance sampling (LQAS) for the rapid assessment of the prevalence of active trachoma. METHODS: Prevalence of active trachoma in six communities was found by examining all children aged 2-5 years. Trial surveys were conducted in these communities. A sampling plan appropriate for classifying communities with prevalences <20% or ≥40% was applied to the survey data. Operating characteristic and average sample number curves were plo...
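
An LQAS plan is defined by a sample size n and a decision threshold d: examine n children and classify the community as low-prevalence only if at most d are positive. The probability of each classification at any true prevalence (the operating characteristic curve mentioned above) follows directly from the binomial distribution, as in this sketch; n and d here are illustrative, not the trial's plan.

```python
from math import comb

def binom_cdf(k: int, n: int, p: float) -> float:
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def oc_curve(n: int, d: int, prevalences):
    """Probability of classifying a community as LOW prevalence
    (at most d positives among n sampled) at each true prevalence."""
    return {p: binom_cdf(d, n, p) for p in prevalences}

# Illustrative plan: sample 50 children, classify 'low' if <= 14 are positive.
for p, prob in oc_curve(50, 14, [0.10, 0.20, 0.30, 0.40]).items():
    print(f"prevalence {p:.0%}: P(classified low) = {prob:.3f}")
```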

  10. Assessing Ankle-Brachial Index (ABI) by using automated oscillometric devices [Índice Tornozelo-Braquial (ITB) determinado por esfigmomanômetros oscilométricos automáticos]

    Directory of Open Access Journals (Sweden)

    Takao Kawamura

    2008-05-01

    Full Text Available BACKGROUND: Assessing the Ankle-Brachial Index (ABI) is an essential procedure in clinical settings, but since its measurement by the gold-standard Doppler ultrasound (DU) technique is impaired by technical difficulties, it is underperformed. OBJECTIVE: The aim of this study was to assess the efficacy of automated oscillometric devices (AOD) in performing ABI assessments and to suggest delta brachial-brachial (delta-BB) and delta-ABI as markers of cardiovascular risk. METHODS: In this observational and descriptive study, 247 outpatients (56.2% female, mean age 62.0 years) underwent ABI determination with simultaneous blood pressure (BP) measurement in the upper and lower limbs using two AODs (OMRON HEM-705CP). Doppler was used whenever BP could not be measured in at least one of the lower limbs. Patients were divided into Group N (normal ABI: 0.91 to 1.30) and Group A (abnormal ABI: <0.91 or >1.30) and compared with respect to delta-ABI (absolute ABI difference between lower limbs) and delta-BB (absolute systolic BP difference between upper limbs). RESULTS: Using the AODs, the ABI could be determined in 90.7% of patients. From Group N data, 95th-percentile reference values were determined for delta-ABI (0-0.13) and delta-BB (0-8 mmHg). Compared with Group N, Group A showed a higher prevalence of elevated delta-ABI (30/52 versus 10/195; odds ratio: 25.23; p...
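
The indices compared in the study reduce to simple arithmetic on the simultaneous pressure readings. The sketch below assumes the usual ABI convention (ankle systolic pressure divided by the higher of the two brachial systolic pressures); the numbers are illustrative.

```python
def abi(ankle_sys: float, brachial_sys_l: float, brachial_sys_r: float) -> float:
    """Ankle-Brachial Index: ankle systolic BP over the higher arm systolic BP."""
    return ankle_sys / max(brachial_sys_l, brachial_sys_r)

def delta_bb(brachial_sys_l: float, brachial_sys_r: float) -> float:
    """Absolute inter-arm systolic difference (mmHg)."""
    return abs(brachial_sys_l - brachial_sys_r)

def delta_abi(abi_left: float, abi_right: float) -> float:
    """Absolute ABI difference between the two legs."""
    return abs(abi_left - abi_right)

left, right = abi(128, 142, 138), abi(120, 142, 138)
print(f"ABI L/R: {left:.2f}/{right:.2f}, delta-ABI = {delta_abi(left, right):.2f}")
print(f"delta-BB = {delta_bb(142, 138):.0f} mmHg")
```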

  11. Assessment of outdoor radiofrequency electromagnetic field exposure through hotspot localization using kriging-based sequential sampling.

    Science.gov (United States)

    Aerts, Sam; Deschrijver, Dirk; Verloock, Leen; Dhaene, Tom; Martens, Luc; Joseph, Wout

    2013-10-01

    In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF) in an extensive urban region (or, hotspots), and that would allow local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots, while at the same time developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method makes use of an iterative sampling strategy that selects new measurement locations at spots which are deemed to contain the most valuable information (inside hotspots or in search of them) based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium with a size of approximately 1 km2. In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2 dB (factor 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96. PMID:23759207
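
The core loop of the methodology, fitting a kriging (Gaussian-process) surrogate to the measurements taken so far and then measuring next wherever the model is both high-valued and uncertain, can be sketched compactly. The squared-exponential kernel, the exploration weight kappa and the toy data below are generic assumptions, not the authors' exact model.

```python
import numpy as np

def rbf(a: np.ndarray, b: np.ndarray, length=100.0, var=1.0) -> np.ndarray:
    """Squared-exponential kernel on 2D coordinates (metres)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_predict(x_obs, y_obs, x_new, noise=1e-4):
    """Kriging mean and standard deviation at new locations."""
    k = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    k_star = rbf(x_obs, x_new)
    mean = k_star.T @ np.linalg.solve(k, y_obs)
    v = np.linalg.solve(k, k_star)
    var = np.clip(rbf(x_new, x_new).diagonal() - (k_star * v).sum(0), 0, None)
    return mean, np.sqrt(var)

def next_location(x_obs, y_obs, candidates, kappa=2.0):
    """Pick the candidate maximizing mean + kappa * sd: this favours points
    inside suspected hotspots and points the model knows little about."""
    mean, sd = gp_predict(x_obs, y_obs, candidates)
    return candidates[np.argmax(mean + kappa * sd)]

rng = np.random.default_rng(1)
x_obs = rng.uniform(0, 1000, size=(20, 2))   # 20 initial locations (m)
y_obs = np.log10(0.2 + rng.random(20))       # toy log field strengths
grid = np.stack(np.meshgrid(np.linspace(0, 1000, 40),
                            np.linspace(0, 1000, 40)), -1).reshape(-1, 2)
print("measure next at:", next_location(x_obs, y_obs, grid))
```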

  12. Enumeration of total aerobic bacteria and Escherichia coli in minced meat and on carcass surface samples with an automated most-probable-number method compared with colony count protocols.

    Science.gov (United States)

    Paulsen, P; Schopf, E; Smulders, F J M

    2006-10-01

    An automated most-probable-number (MPN) system for the enumeration of total bacterial flora and Escherichia coli was compared with plate count agar and tryptone-bile-glucuronide (TBX) and ColiID (in-house method) agar methodology. The MPN partitioning of sample aliquots was done automatically on a disposable card containing 48 wells of 3 different volumes, i.e., 16 replicates per volume. Bacterial growth was detected by the formation of fluorescent 4-methylumbelliferone. After incubation, the number of fluorescent wells was read with a separate device, and the MPN was calculated automatically. A total of 180 naturally contaminated samples were tested (pig and cattle carcass surfaces, n = 63; frozen minced meat, n = 62; and refrigerated minced meat, n = 55). Plate count agar results and MPN were highly correlated (r = 0.99), with log MPN = -0.25 + 1.05 x log CFU (plate count agar) (n = 163; range, 2.2 to 7.5 log CFU/g or cm2). Only a few discrepancies were recorded. In two samples (1.1%), the differences were ≥1.0 log; in three samples (1.7%), the differences were ≥0.5 log. For E. coli, regression analysis was done for all three methods for 80 minced meat samples, which were above the limit of detection (1.0 log CFU/g): log MPN = 0.18 + 0.98 x log CFU (TBX), r = 0.96, and log MPN = -0.02 + 0.99 x log CFU (ColiID), r = 0.99 (range, 1.0 to 4.2 log CFU/g). Four discrepant results were recorded, with differences of > 0.5 but < 1.0 log unit. These results suggest that the automated MPN method described is a suitable and labor-saving alternative to colony count techniques for total bacterial flora and E. coli determination in minced meat or on carcass surfaces. PMID:17066934
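
A most-probable-number estimate is the concentration that maximizes the likelihood of the observed pattern of positive wells, where a well of volume v is positive with probability 1 - exp(-c*v). A minimal maximum-likelihood sketch follows; the well volumes and counts are invented rather than taken from the commercial card.

```python
import math

# Hypothetical MPN card layout: 3 well volumes, 16 replicates each.
volumes_ml = [0.5, 0.05, 0.005]
replicates = 16
positives = [16, 9, 2]  # observed fluorescent wells per volume (invented)

def log_likelihood(conc_per_ml: float) -> float:
    """Binomial log-likelihood: P(well positive) = 1 - exp(-c * v)."""
    ll = 0.0
    for v, pos in zip(volumes_ml, positives):
        p = 1.0 - math.exp(-conc_per_ml * v)
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        ll += pos * math.log(p) + (replicates - pos) * math.log(1.0 - p)
    return ll

# Grid search over a log-spaced range of candidate concentrations.
candidates = [10 ** (i / 100.0) for i in range(-100, 500)]
mpn = max(candidates, key=log_likelihood)
print(f"MPN ≈ {mpn:.1f} organisms/mL")
```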

  13. Use of pooled urine samples and automated DNA isolation to achieve improved sensitivity and cost-effectiveness of large-scale testing for Chlamydia trachomatis in pregnant women.

    NARCIS (Netherlands)

    Rours, G.I.J.G.; Verkooyen, R.P.; Willemse, H.F.M.; Zwaan, E.A. van der; Belkum, A. van; Groot, R. de; Verbrugh, H.A.; Ossewaarde, J.M.

    2005-01-01

    The success of large-scale screening for Chlamydia trachomatis depends on the availability of noninvasive samples, low costs, and high-quality testing. To evaluate C. trachomatis testing with pregnant women, first-void urine specimens from 750 consecutive asymptomatic pregnant women from the Rotterd

  14. ACCELERATED SOLVENT EXTRACTION COMBINED WITH AUTOMATED SOLID PHASE EXTRACTION-GC/MS FOR ANALYSIS OF SEMIVOLATILE COMPOUNDS IN HIGH MOISTURE CONTENT SOLID SAMPLES

    Science.gov (United States)

    A research project was initiated to address a recurring problem of elevated detection limits above required risk-based concentrations for the determination of semivolatile organic compounds in high moisture content solid samples. This project was initiated, in cooperation with t...

  15. Use of pooled urine samples and automated DNA isolation to achieve improved sensitivity and cost-effectiveness of large-scale testing for Chlamydia trachomatis in pregnant women.

    NARCIS (Netherlands)

    G.I.J.G. Rours (Ingrid); R.P.A.J. Verkooyen (Roel); H.F. Willemse; E.A.E. van der Zwaan (Elizabeth); A.F. van Belkum (Alex); R. de Groot (Ronald); H.A. Verbrugh (Henri); J.M. Ossewaarde (Jacobus)

    2005-01-01

    The success of large-scale screening for Chlamydia trachomatis depends on the availability of noninvasive samples, low costs, and high-quality testing. To evaluate C. trachomatis testing with pregnant women, first-void urine specimens from 750 consecutive asymptomatic pregnant women from

  16. Determination of benzoylureas in ground water samples by fully automated on-line pre-concentration and liquid chromatography-fluorescence detection.

    Science.gov (United States)

    Gil García, M D; Martínez Galera, M; Barranco Martínez, D; Gisbert Gallego, J

    2006-01-27

    An on-line pre-concentration method for the analysis of five benzoylureas (diflubenzuron, triflumuron, hexaflumuron, lufenuron and flufenoxuron) in ground water samples was evaluated using two C(18) columns, and fluorescence detection after photochemical induced fluorescence (PIF) post-column derivatization. The trace enrichment was carried out with 35 mL of ground water modified with 15 mL of MeOH on a 50 mm x 4.6 mm I.D. first enrichment column (C-1) packed with 5 microm Hypersil Elite C(18). Retention properties of pesticides and humic acids usually contained in ground water were studied on C-1 at concentration levels ranging between 0.04 and 14.00 microg/L in water samples. The results obtained in this study show that the pesticides are pre-concentrated in the first short column while the humic acids contained in the ground water samples are eluted to waste. Pesticides recoveries ranged between 92.3 and 109.5%. The methodology proposed was used to determine benzoylureas in ground water samples at levels lower than 0.1 microg/L (maximum levels established by the European Union). PMID:16337641

  17. Detection of Giardia lamblia, Cryptosporidium spp. and Entamoeba histolytica in clinical stool samples by using multiplex real-time PCR after automated DNA isolation

    NARCIS (Netherlands)

    Van Lint, P; Rossen, J W; Vermeiren, S; Ver Elst, K; Weekx, S; Van Schaeren, J; Jeurissen, A

    2013-01-01

    Diagnosis of intestinal parasites in stool samples is generally still carried out by microscopy; however, this technique is known to suffer from a low sensitivity and is unable to discriminate between certain protozoa. In order to overcome these limitations, a real-time multiplex PCR was evaluated a

  18. Radioactivity concentrations and dose assessment for soil samples around nuclear power plant IV in Taiwan.

    Science.gov (United States)

    Tsai, Tsuey-Lin; Lin, Chun-Chih; Wang, Tzu-Wen; Chu, Tieh-Chi

    2008-09-01

    Activity concentrations and distributions of natural and man-made radionuclides in soil samples collected around nuclear power plant IV, Taiwan, were investigated for five years to assess the environmental radioactivity and characterisation of radiological hazard prior to commercial operation. The activity concentrations of radionuclides were determined via gamma-ray spectrometry using an HPGe detector. Data obtained show that the average concentrations of the (238)U and (232)Th series, and (40)K, were within world median ranges in the UNSCEAR report. The (137)Cs ranged from 2.46 +/- 0.55 to 12.13 +/- 1.31 Bq kg(-1). The terrestrial absorbed dose rate estimated by soil activity and directly measured with a thermoluminescence dosemeter (excluding cosmic rays), and the annual effective doses, were 45.63, 57.34 nGy h(-1) and 57.19 microSv, respectively. Experimental results were compared with international recommended values. Since the soil in this area is an important building material, the mean radium equivalent activity, external and inhalation hazard indices and the representative level index using various models given in the literature for the study area were 98.18 Bq kg(-1), 0.27, 0.34 and 0.73, respectively, which were below the recommended limits. Analytical results demonstrate that no radiological anomaly exists. The baseline data will prove useful and important in estimating the collective dose near the new nuclear power plant under construction in Taiwan. PMID:18714131

  19. OCT as a convenient tool to assess the quality and application of organotypic retinal samples

    Science.gov (United States)

    Gater, Rachel; Khoshnaw, Nicholas; Nguyen, Dan; El Haj, Alicia J.; Yang, Ying

    2016-03-01

    Eye diseases such as macular degeneration and glaucoma have profound consequences on the quality of human life. Without treatment, these diseases can lead to loss of sight. To develop better treatments for retinal diseases, including cell therapies and drug intervention, establishment of an efficient and reproducible 3D native retinal tissue system, enabled over a prolonged culture duration, will be valuable. The retina is a complex tissue, consisting of ten layers with a different density and cellular composition to each. Uniquely, as a light transmitting tissue, retinal refraction of light differs among the layers, forming a good basis to use optical coherence tomography (OCT) in assessing the layered structure of the retina and its change during the culture and treatments. In this study, we develop a new methodology to generate retinal organotypic tissues and compare two substrates: filter paper and collagen hydrogel, to culture the organotypic tissue. Freshly slaughtered pig eyes have been obtained for use in this study. The layered morphology of intact organotypic retinal tissue cultured on two different substrates has been examined by spectral domain OCT. The viability of the tissues has been examined by live/dead fluorescence dye kit to cross validate the OCT images. For the first time, it is demonstrated that the use of a collagen hydrogel supports the viability of retinal organotypic tissue, capable of prolonged culture up to 2 weeks. OCT is a convenient tool for appraising the quality and application of organotypic retinal samples and is important in the development of current organotypic models.

  20. Radiometric assessment of natural radioactivity levels of agricultural soil samples collected in Dakahlia, Egypt.

    Science.gov (United States)

    Issa, Shams A M

    2013-01-01

    Determination of the natural radioactivity has been carried out, by using a gamma-ray spectrometry [NaI (Tl) 3″ × 3″] system, in surface soil samples collected from various locations in Dakahlia governorate, Egypt. These locations form the agriculturally important regions of Egypt. The study area has many industries such as chemical, paper, organic fertilisers and construction materials, and the soils of the study region are used as a construction material. Therefore, it becomes necessary to study the natural radioactivity levels in soil to assess the dose for the population in order to know the health risks. The activity concentrations of (226)Ra, (232)Th and (40)K in the soil ranged from 5.7 ± 0.3 to 140 ± 7, from 9.0 ± 0.4 to 139 ± 7 and from 22 ± 1 to 319 ± 16 Bq kg(-1), respectively. The absorbed dose rate, annual effective dose rate, radium equivalent (Req), excess lifetime cancer risk, hazard indices (Hex and Hin) and annual gonadal dose equivalent, which resulted from the natural radionuclides in the soil were calculated. PMID:23509393

  1. PREVALENCE AND ANTIMICROBIAL RESISTANCE ASSESSMENT OF SUBCLINICAL MASTITIS IN MILK SAMPLES FROM SELECTED DAIRY FARMS

    Directory of Open Access Journals (Sweden)

    Murugaiyah Marimuthu

    2014-01-01

    Full Text Available This study was conducted in order to determine the prevalence and bacteriological assessment of subclinical mastitis and the antimicrobial resistance of bacterial isolates from dairy cows in different farms around Selangor, Malaysia. A total of 120 milk samples from 3 different farms were randomly collected and tested for subclinical mastitis using the California Mastitis Test (CMT), as well as cultured for bacterial isolation, identification and antimicrobial resistance. The most prevalent bacterium was Staphylococcus sp. (55%), followed by Bacillus sp. (21%) and Corynebacterium sp. (7%); Yersinia sp. and Neisseria sp. both showed 5% prevalence, and other species with prevalence below 5% were Acinetobacter sp., Actinobacillus sp., Vibrio sp., Pseudomonas sp., E. coli, Klebsiella sp. and Chromobacter sp. Selected Staphylococcus sp. isolates showed a mean antimicrobial resistance of 73.3% to Ampicillin, 26.7% each to Penicillin, Methicillin and Compound Sulphonamide, 20% to Oxacillin, Amoxycillin and Cefuroxime, 13.3% to Polymyxin B, Erythromycin, Ceftriaxone and Azithromycin, and 6.7% each to Streptomycin, Clindamycin, Lincomycin and Tetracycline. This study indicates the need for urgent and effective control measures to tackle the increase in prevalence of subclinical mastitis and antimicrobial resistance in the study area.

  2. Analysis of terrestrial natural radionuclides in soil samples and assessment of average effective dose

    International Nuclear Information System (INIS)

    Radionuclides that are present in soil significantly affect terrestrial gamma radiation levels, which in turn can be used for the assessment of terrestrial gamma dose rates. Natural radioactivity analysis has been done for the soil samples collected from different villages/towns of Hoshiarpur district of Punjab, India. The measurements have been carried out using an HPGe detector based high-resolution gamma spectrometry system. The calculated activity concentration values for the terrestrial gamma emitters, viz. 238U, 232Th and 40K, have been found to vary from 8.89 to 56.71 Bq kg-1, from 137.32 to 334.47 Bq kg-1 and from 823.62 to 1064.97 Bq kg-1, respectively. The total average absorbed dose rate in the study areas is 185.32 nGy h-1. The calculated value of average radium equivalent activity (401.13 Bq kg-1) exceeds the permissible limit (370 Bq kg-1) recommended by the Organisation for Economic Co-operation and Development (OECD). The calculated average value of the external hazard index (Hex) is 1.097. The calculated values of indoor and outdoor annual effective doses vary from 0.61 to 1.28 mSv and from 0.15 to 0.32 mSv, respectively. A positive correlation (R2 = 0.71) has also been observed between the concentrations of 232Th and 40K. (author)

  3. Effect of size and heterogeneity of samples on biomarker discovery: synthetic and real data assessment.

    Directory of Open Access Journals (Sweden)

    Barbara Di Camillo

    Full Text Available MOTIVATION: The identification of robust lists of molecular biomarkers related to a disease is a fundamental step for early diagnosis and treatment. However, methodologies for the discovery of biomarkers using microarray data often provide results with limited overlap. These differences are imputable to 1) dataset size (few subjects with respect to the number of features); 2) heterogeneity of the disease; 3) heterogeneity of experimental protocols and computational pipelines employed in the analysis. In this paper, we focus on the first two issues and assess, both on simulated (through an in silico regulation network model) and real clinical datasets, the consistency of candidate biomarkers provided by a number of different methods. METHODS: We extensively simulated the effect of heterogeneity characteristic of complex diseases on different sets of microarray data. Heterogeneity was reproduced by simulating both intrinsic variability of the population and the alteration of regulatory mechanisms. Population variability was simulated by modeling evolution of a pool of subjects; then, a subset of them underwent alterations in regulatory mechanisms so as to mimic the disease state. RESULTS: The simulated data allowed us to outline advantages and drawbacks of different methods across multiple studies and varying number of samples and to evaluate precision of feature selection on a benchmark with known biomarkers. Although comparable classification accuracy was reached by different methods, the use of external cross-validation loops is helpful in finding features with a higher degree of precision and stability. Application to real data confirmed these results.
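
The paper's point about resampling, that feature lists should be re-derived inside each resample and checked for stability, can be made concrete with a small sketch. The t-statistic ranking and Jaccard overlap below are generic choices standing in for whichever selection method is under evaluation, not the authors' exact pipeline.

```python
import numpy as np

def top_features_by_t(x, y, k=20):
    """Rank features by absolute two-class t-statistic; return top-k indices."""
    a, b = x[y == 0], x[y == 1]
    se = np.sqrt(a.var(0, ddof=1) / len(a) + b.var(0, ddof=1) / len(b)) + 1e-12
    t = np.abs(a.mean(0) - b.mean(0)) / se
    return set(np.argsort(-t)[:k])

def selection_stability(x, y, k=20, n_resamples=50, seed=0):
    """Mean pairwise Jaccard overlap of top-k lists across bootstrap resamples."""
    rng = np.random.default_rng(seed)
    lists = []
    for _ in range(n_resamples):
        idx = rng.choice(len(y), size=len(y), replace=True)
        lists.append(top_features_by_t(x[idx], y[idx], k))
    overlaps = [len(s & t) / len(s | t)
                for i, s in enumerate(lists) for t in lists[i + 1:]]
    return float(np.mean(overlaps))

rng = np.random.default_rng(1)
x = rng.normal(size=(60, 1000))   # 60 subjects, 1000 features
y = np.repeat([0, 1], 30)
x[y == 1, :10] += 1.0             # 10 genuinely differential features
print(f"selection stability: {selection_stability(x, y):.2f}")
```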

  4. Dynamic three-dimensional echocardiography combined with semi-automated border detection offers advantages for assessment of resynchronization therapy

    Directory of Open Access Journals (Sweden)

    Voormolen Marco M

    2003-10-01

    Full Text Available Abstract Simultaneous electrical stimulation of both ventricles in patients with interventricular conduction disturbance and advanced heart failure improves hemodynamics and results in increased exercise tolerance and quality of life. We have developed a novel technique for the assessment and optimization of resynchronization therapy. Our approach is based on transthoracic dynamic three-dimensional (3D) echocardiography and allows determination of the most delayed contraction site of the left ventricle (LV) together with global LV function data. Our initial results suggest that fast reconstruction of the LV is feasible for the selection of the optimal pacing site and allows identification of LV segments with dyssynchrony.

  5. Using the Sampling Margin of Error to Assess the Interpretative Validity of Student Evaluations of Teaching

    Science.gov (United States)

    James, David E.; Schraw, Gregory; Kuch, Fred

    2015-01-01

    We present an equation, derived from standard statistical theory, that can be used to estimate sampling margin of error for student evaluations of teaching (SETs). We use the equation to examine the effect of sample size, response rates and sample variability on the estimated sampling margin of error, and present results in four tables that allow…
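
The record does not reproduce the authors' equation, but the standard result it builds on is the margin of error for a mean with a finite-population correction, which captures exactly the interplay of sample size, response rate (n respondents out of a class of N) and rating variability discussed above. A sketch under that assumption:

```python
import math

def set_margin_of_error(n_respondents: int, class_size: int,
                        sd: float, z: float = 1.96) -> float:
    """Margin of error for a mean rating from n of N students,
    with the finite-population correction (z = 1.96 for 95% confidence)."""
    fpc = math.sqrt((class_size - n_respondents) / (class_size - 1))
    return z * (sd / math.sqrt(n_respondents)) * fpc

# A 30-student class, 18 responses, ratings with SD 0.9 on a 5-point scale:
print(f"±{set_margin_of_error(18, 30, 0.9):.2f} points")
```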

  6. Automation of a high-speed imaging setup for differential viscosity measurements

    International Nuclear Information System (INIS)

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of an electronic printed circuit board (PCB) incorporating a microcontroller and its synchronization with a commercial high-speed camera operating at 10 000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with the calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an "unknown" solution of hydroxyethyl cellulose
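
The measurement this setup automates is, at bottom, a falling-sphere viscometry experiment: the high-speed camera yields the sphere's velocity, and Stokes' law converts it to viscosity. The sketch below shows that conversion with illustrative parameters; it is only valid in the low-Reynolds-number regime and omits the wall and inertia corrections a real setup would apply.

```python
def stokes_viscosity(radius_m: float, rho_sphere: float, rho_fluid: float,
                     terminal_velocity_m_s: float, g: float = 9.81) -> float:
    """Dynamic viscosity (Pa*s) from Stokes' law:
    eta = 2 r^2 (rho_s - rho_f) g / (9 v)."""
    return (2 * radius_m**2 * (rho_sphere - rho_fluid) * g
            / (9 * terminal_velocity_m_s))

# 2 mm glass sphere (r = 1 mm) falling at 12 cm/s through a glycerol/water mix:
eta = stokes_viscosity(1e-3, 2500.0, 1100.0, 0.12)
print(f"eta ≈ {eta * 1000:.0f} mPa*s")
```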

  7. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  8. Automated extraction of DNA from reference samples from various types of biological materials on the Qiagen BioRobot EZ1 Workstation

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Jørgensen, Mads; Hansen, Anders Johannes;

    2009-01-01

    We have validated and implemented a protocol for DNA extraction from various types of biological materials using a Qiagen BioRobot EZ1 Workstation. The sample materials included whole blood, blood from deceased, buccal cells on Omni swabs and FTA Cards, blood on FTA Cards and cotton swabs, and...... muscle biopsies. The DNA extraction was validated according to EN/ISO 17025 for the STR kits AmpFlSTR® Identifiler® and AmpFlSTR® Yfiler® (Applied Biosystems). Of 298 samples extracted, 11 (4%) did not yield acceptable results. In conclusion, we have demonstrated that extraction of DNA from various types...... of biological material can be performed quickly and without the use of hazardous chemicals, and that the DNA may be successfully STR typed according to the requirements of forensic genetic investigations accredited according to EN/ISO 17025

  9. Use of Pooled Urine Samples and Automated DNA Isolation To Achieve Improved Sensitivity and Cost-Effectiveness of Large-Scale Testing for Chlamydia trachomatis in Pregnant Women

    OpenAIRE

    2005-01-01

    The success of large-scale screening for Chlamydia trachomatis depends on the availability of noninvasive samples, low costs, and high-quality testing. To evaluate C. trachomatis testing with pregnant women, first-void urine specimens from 750 consecutive asymptomatic pregnant women from the Rotterdam area (The Netherlands) were collected. Initially, we investigated the performance of three different DNA isolation methods with 350 of these urines and 70 pools of 5 of the same subset of urine ...

  10. Automated In-Injector Derivatization Combined with High-Performance Liquid Chromatography-Fluorescence Detection for the Determination of Semicarbazide in Fish and Bread Samples.

    Science.gov (United States)

    Wang, Yinan; Chan, Wan

    2016-04-01

    Semicarbazide (1) is a widespread genotoxic food contaminant originating as a metabolic byproduct of the antibiotic nitrofurazone used in fish farming or as a thermal degradation product of the common flour additive azodicarbonamide. The goal of this study is to develop a simple and sensitive high-performance liquid chromatography coupled with fluorescence detection (HPLC-FLD) method for the detection of compound 1 in food products. In comparison to existing methods for the determination of compound 1, the reported method combining online precolumn derivatization and HPLC-FLD is less labor-intensive, produces higher sample throughput, and does not require the use of expensive analytical instruments. After validation of accuracy and precision, this method was applied to determine the amount of compound 1 in fish and bread samples. Comparative studies using an established liquid chromatography coupled with tandem mass spectrometry method did not yield systematically different results, indicating that the developed HPLC-FLD method is accurate and suitable for the determination of compound 1 in fish and bread samples. PMID:26985968

  11. Automated detection of breast tumor in MRI and comparison of kinetic features for assessing tumor response to chemotherapy

    Science.gov (United States)

    Aghaei, Faranak; Tan, Maxine; Zheng, Bin

    2015-03-01

    Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) is used increasingly in diagnosis of breast cancer and assessment of treatment efficacy in current clinical practice. The purpose of this preliminary study is to develop and test a new quantitative kinetic image feature analysis method and biomarker to predict response of breast cancer patients to neoadjuvant chemotherapy using breast MR images acquired before the chemotherapy. For this purpose, we developed a computer-aided detection scheme to automatically segment breast areas and tumors depicting on the sequentially scanned breast MR images. From a contrast-enhancement map generated by subtraction of two image sets scanned pre- and post-injection of contrast agent, our scheme computed 38 morphological and kinetic image features from both tumor and background parenchymal regions. We applied a number of statistical data analysis methods to identify effective image features in predicting response of the patients to the chemotherapy. Based on the performance assessment of individual features and their correlations, we applied a fusion method to generate a final image biomarker. A breast MR image dataset involving 68 patients was used in this study. Among them, 25 had complete response and 43 had partial response to the chemotherapy based on the RECIST guideline. Using this image feature fusion based biomarker, the area under a receiver operating characteristic curve is AUC = 0.850±0.047. This study demonstrated that a biomarker developed from the fusion of kinetic image features computed from breast MR images acquired pre-chemotherapy has potentially higher discriminatory power in predicting response of the patients to the chemotherapy.
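
The AUC quoted above has a useful rank interpretation: it is the probability that a randomly chosen responder scores higher on the fused biomarker than a randomly chosen non-responder (ties counting one half). A dependency-free sketch with toy scores:

```python
def auc(scores_pos, scores_neg):
    """AUC as P(score_pos > score_neg), with ties counted as 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy fused-biomarker scores for responders vs. non-responders:
print(auc([0.9, 0.8, 0.7, 0.4], [0.6, 0.5, 0.3, 0.2]))  # -> 0.875
```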

  12. Assessment of outdoor radiofrequency electromagnetic field exposure through hotspot localization using kriging-based sequential sampling

    Energy Technology Data Exchange (ETDEWEB)

    Aerts, Sam, E-mail: sam.aerts@intec.ugent.be; Deschrijver, Dirk; Verloock, Leen; Dhaene, Tom; Martens, Luc; Joseph, Wout

    2013-10-15

    In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF) in an extensive urban region (or, hotspots), and that would allow local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots, while at the same time developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method makes use of an iterative sampling strategy that selects new measurement locations at spots which are deemed to contain the most valuable information—inside hotspots or in search of them—based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium with a size of approximately 1 km{sup 2}. In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2 dB (factor 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96. -- Highlights: • We present an

  13. Assessment of outdoor radiofrequency electromagnetic field exposure through hotspot localization using kriging-based sequential sampling

    International Nuclear Information System (INIS)

    In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF) in an extensive urban region (or, hotspots), and that would allow local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots, while at the same time developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method makes use of an iterative sampling strategy that selects new measurement locations at spots which are deemed to contain the most valuable information—inside hotspots or in search of them—based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium with a size of approximately 1 km2. In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2 dB (factor 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96. -- Highlights: • We present an iterative

  14. Radiological assessment of fish samples due to natural radionuclides in river Yobe, Northern Nigeria

    International Nuclear Information System (INIS)

    Assessment of the natural radioactivity of fish samples from river Yobe was conducted using gamma spectroscopy with a NaI(Tl) detector. Radioactivity is a phenomenon that leads to the production of radiation, and radiation is known to trigger or induce cancer. The fish were analyzed to estimate the activity concentrations due to the natural radionuclides 226Ra, 232Th and 40K. The results show that the activity concentration of 226Ra in all the fish samples collected ranges from 15.23±2.45 BqKg-1 to 67.39±2.13 BqKg-1, with an average value of 34.13±1.34 BqKg-1. That of 232Th ranges from 42.66±0.81 BqKg-1 to 201.18±3.82 BqKg-1, with an average value of 96.01±3.82 BqKg-1. The activity concentration of 40K ranges between 243.3±1.56 BqKg-1 and 618.2±2.81 BqKg-1, with an average of 413.92±1.7 BqKg-1. The study indicates that the average daily intake of natural activity from the fish is 0.913 Bq/day, 2.577 Bq/day and 11.088 Bq/day for 226Ra, 232Th and 40K, respectively. Most of the fish activity concentrations are within acceptable limits. However, fish from locations F02, F07 and F12 were outliers, with significant effective dose values of 112.53 μSvy-1, 121.11 μSvy-1 and 114.32 μSvy-1, respectively. This could be attributed to variation in geological formations within the river as well as to the feeding habits of these fish. The work shows that consumers of fish from river Yobe face no significant risk of radioactivity ingestion, even though no amount of radiation is assumed to be totally safe.

  15. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  16. Assessing the efficacy of hair snares as a method for noninvasive sampling of Neotropical felids

    Directory of Open Access Journals (Sweden)

    Tatiana P. Portella

    2013-02-01

    Full Text Available Hair snares have been used in North and Central America for a long time in assessment and monitoring studies of several mammalian species. This method can provide a cheap, suitable, and efficient way to monitor mammals because it combines characteristics that are not present in most alternative techniques. However, despite their usefulness, hair snares are rarely used in other parts of the world. The aim of our study was to evaluate the effectiveness of hair snares and three scent lures (cinnamon, catnip, and vanilla in the detection of felids in one of the largest remnants of the Brazilian Atlantic Forest. We performed tests with six captive felid species - Panthera onca (Linnaeus, 1758, Leopardus pardalis (Linnaeus, 1758, L. tigrinus (Schreber, 1775, L. wiedii (Schinz, 1821, Puma concolor (Linnaeus, 1771, and P. yagouaroundi (É. Geoffroy Saint-Hilaire, 1803 - to examine their responses to the attractants, and to correlate those with lure efficiency in the field. The field tests were conducted at the Parque Estadual Pico do Marumbi, state of Paraná, Brazil. Hair traps were placed on seven transects. There were equal numbers of traps with each scent lure, for a total of 1,551 trap-days. In captivity, vanilla provided the greatest response, yet no felids were detected in the field with any of the tested lures, although other species were recorded. Based on the sampling of non-target species, and the comparison with similar studies elsewhere, this study points to a possible caveat of this method when rare species or small populations are concerned. Meanwhile, we believe that improved hair snares could provide important results with several species in the location tested and others.

  17. Assessment of polychlorinated biphenyls and organochlorine pesticides in water samples from the Yamuna River

    Directory of Open Access Journals (Sweden)

    Bhupander Kumar

    2012-07-01

    Full Text Available Polychlorinated biphenyls (PCBs, hexachlorocyclohexane (HCH and dichlorodiphenyltrichloroethane (DDT are toxic, persistent and bioaccumulative long-range atmospheric transport pollutants. These are transported worldwide affecting remote regions far from their original sources, and can transfer into food webs with a wide range of acute and chronic health effects. India ratified the Stockholm Convention with the intention of reducing and eliminating persistent organic pollutants (POPs, and encouraged the support of research on POPs. Despite the ban and restriction on the use of these chemicals in India, their contamination of air, water, sediment, biota and humans has been reported. In this study, surface water samples were collected during January 2012 from the Yamuna River in Delhi, India, and analyzed for PCBs and organochlorine pesticides (OCPs. The concentrations of ΣPCBs and ΣOCPs ranged from 2 to 779 ng L–1 and from less than 0.1 to 618 ng L–1 (mean 99±38 ng L–1 and 221±50 ng L–1, respectively. The PCB homolog profile was dominated by 3-4 chlorinated biphenyls. In calculating the toxicity equivalent of dioxin-like PCBs (dl-PCBs) using World Health Organization toxic equivalency factors, dl-PCBs accounted for 10% of a total of 27 PCBs. The concentration of ΣHCH ranged between less than 0.1 and 285 ng L–1 (mean 151±32 ng L–1. However, ΣDDTs concentrations varied between less than 0.1 and 354 ng L–1 (mean 83±26 ng L–1. The concentrations were lower than the US guideline values; however, levels of lindane exceeded those recommended in guidelines. Further in-depth study is proposed to determine the bioaccumulation of these pollutants through aquatic biota to assess the risk of contaminants to human health.
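
The dioxin-like toxicity equivalent mentioned above is a weighted sum: each congener's concentration multiplied by its WHO toxic equivalency factor (TEF). A sketch with hypothetical concentrations follows; the TEF values are of the order commonly tabulated for these congeners but should be checked against the current WHO list before real use.

```python
# WHO-2005-style TEFs for a few dioxin-like PCB congeners (treat as
# placeholders; verify against the official WHO list before real use).
TEF = {"PCB-77": 0.0001, "PCB-126": 0.1, "PCB-169": 0.03, "PCB-118": 0.00003}

def teq(concentrations_ng_per_l: dict) -> float:
    """Toxicity equivalent: sum of concentration x TEF over dl-congeners."""
    return sum(c * TEF[name] for name, c in concentrations_ng_per_l.items())

# Hypothetical congener concentrations (ng/L), not the study's data:
sample = {"PCB-77": 1.2, "PCB-126": 0.05, "PCB-169": 0.02, "PCB-118": 3.4}
print(f"TEQ = {teq(sample):.5f} ng TEQ/L")
```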

  18. AUtomated Risk Assessment for Stroke in Atrial Fibrillation (AURAS-AF) - an automated software system to promote anticoagulation and reduce stroke risk: study protocol for a cluster randomised controlled trial

    OpenAIRE

    Holt, Tim A; Fitzmaurice, David A; Marshall, Tom; Fay, Matthew; Qureshi, Nadeem; Dalton, Andrew R H; Hobbs, F D Richard; Lasserson, Daniel S; Kearley, Karen; Hislop, Jenny; Jin, Jing

    2013-01-01

    Background Patients with atrial fibrillation (AF) are at significantly increased risk of stroke. Oral anticoagulants (OACs) substantially reduce this risk, with gains seen across the spectrum of baseline risk. Despite the benefit to patients, OAC prescribing remains suboptimal in the United Kingdom (UK). We will investigate whether an automated software system, operating within primary care electronic medical records, can improve the management of AF by identifying patients eligible for OAC t...

  19. An artifacts removal post-processing for epiphyseal region-of-interest (EROI) localization in automated bone age assessment (BAA)

    Directory of Open Access Journals (Sweden)

    Salleh Sh-Hussain

    2011-09-01

    Full Text Available Abstract Background Segmentation is the most crucial part of computer-aided bone age assessment. A well-known type of segmentation performed in such systems is adaptive segmentation. While providing better results than global thresholding, adaptive segmentation produces a lot of unwanted noise that can affect the subsequent epiphysis extraction. Methods A method using anisotropic diffusion as pre-processing and a novel Bounded Area Elimination (BAE) post-processing algorithm to improve the ossification site localization technique is designed, with the intent of improving the adaptive segmentation result and the region-of-interest (ROI) localization accuracy. Results The results are evaluated by quantitative and qualitative analysis using texture feature evaluation. Image homogeneity after anisotropic diffusion improved by an average of 17.59% across the age groups. Experiments showed that smoothness improved by an average of 35% after the BAE algorithm and that ROI localization improved by an average of 8.19%. The MSSIM improved by an average of 10.49% after performing the BAE algorithm on the adaptively segmented hand radiographs. Conclusions The results indicate that hand radiographs which have undergone anisotropic diffusion have greatly reduced noise in the segmented image, and that the proposed BAE algorithm is capable of removing the artifacts generated in adaptive segmentation.
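
The anisotropic diffusion pre-processing named above is commonly the Perona-Malik scheme: the image is diffused iteratively with a conductance that vanishes across strong gradients, so noise is smoothed while bone edges survive. A minimal numpy sketch follows; the iteration count, kappa and lambda are illustrative, as the record does not give the paper's settings.

```python
import numpy as np

def anisotropic_diffusion(img: np.ndarray, n_iter=15, kappa=30.0, lam=0.2):
    """Perona-Malik diffusion with exponential conductance g = exp(-(|dI|/kappa)^2).
    Neighbour differences use np.roll, which wraps at the borders; acceptable
    for a sketch, while a production version would replicate the borders."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        dn = np.roll(u, 1, 0) - u   # north neighbour difference
        ds = np.roll(u, -1, 0) - u  # south
        de = np.roll(u, -1, 1) - u  # east
        dw = np.roll(u, 1, 1) - u   # west
        u += lam * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u

# Toy noisy radiograph-like image: a bright square on a dark background.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[16:48, 16:48] = 100.0
smoothed = anisotropic_diffusion(img + rng.normal(0, 10, img.shape))
```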

  20. Allium test peculiarities for toxicity assessment of water and soil samples from radioactively contaminated sites

    International Nuclear Information System (INIS)

    The results of applying the Allium test for toxicity estimation of water and soil samples from the Semipalatinsk Experimental Test Site (SET) are presented. A comparative analysis of the cyto- and genotoxicity of water and soil samples from SET areas contrasting in radionuclide composition was carried out. The results show the need to modify the biotesting procedure to take into account external gamma irradiation in the sampling area