WorldWideScience

Sample records for automated sampling assessment

  1. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B;

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics, and the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood, or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to these parameters...

  2. AUTOMATING GROUNDWATER SAMPLING AT HANFORD

    Energy Technology Data Exchange (ETDEWEB)

    CONNELL CW; HILDEBRAND RD; CONLEY SF; CUNNINGHAM DE

    2009-01-16

    Until this past October, Fluor Hanford managed Hanford's integrated groundwater program for the U.S. Department of Energy (DOE). With the new contract awards at the Site, however, the CH2M HILL Plateau Remediation Company (CHPRC) has assumed responsibility for the groundwater-monitoring programs at the 586-square-mile reservation in southeastern Washington State. These programs are regulated by the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). The purpose of monitoring is to track existing groundwater contamination from past practices, as well as other potential contamination that might originate from RCRA treatment, storage, and disposal (TSD) facilities. An integral part of the groundwater-monitoring program involves taking samples of the groundwater and measuring the water levels in wells scattered across the site. More than 1,200 wells are sampled each year. Historically, field personnel or 'samplers' have been issued pre-printed forms that have information about the well(s) for a particular sampling evolution. This information is taken from the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS)--official electronic databases. The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and the collected information was posted onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. This is a pilot project for automating this tedious process by providing an electronic tool for automating water-level measurements and groundwater field-sampling activities. The automation will eliminate the manual forms and associated data entry, improve the

  3. Technology modernization assessment flexible automation

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  4. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.
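
The core idea of the abstract (capturing only the overall flow of a field at a user-chosen scale) can be illustrated with a much simpler stand-in for the paper's sampling technique: block-averaging a velocity array so that only variation coarser than the chosen block size survives. This sketch is illustrative only; the paper's samples are spatio-temporally continuous and support keyframe matching.

```python
# Minimal sketch (not the paper's algorithm): sample a 1-D velocity field at a
# user-chosen scale by averaging non-overlapping blocks, discarding detail
# finer than the block size.
def sample_at_scale(field, block):
    """Average non-overlapping blocks of `block` consecutive values."""
    return [sum(field[i:i + block]) / block
            for i in range(0, len(field) - block + 1, block)]

velocity = [1.0, 3.0, 2.0, 4.0, 10.0, 12.0, 11.0, 13.0]
coarse = sample_at_scale(velocity, 4)  # the "overall flow" at a coarser scale
```

Control could then match a high-resolution simulation against `coarse` while leaving finer detail free.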

  5. Automating Spreadsheet Discovery & Risk Assessment

    CERN Document Server

    Perry, Eric

    2008-01-01

    Many articles have been published about the risks of uncontrolled spreadsheets in today's business environment, including non-compliance, operational risk, errors, and fraud, all leading to significant loss events. Spreadsheets fall into the realm of end-user-developed applications and often lack the proper safeguards and controls an IT organization would enforce for enterprise applications. There is also an overall lack of software programming discipline enforced in how spreadsheets are developed. However, before an organization can apply proper controls and discipline to critical spreadsheets, an accurate and living inventory of spreadsheets across the enterprise must be created, and all critical spreadsheets must be identified. As such, this paper proposes an automated approach to the initial stages of the spreadsheet management lifecycle - discovery, inventory and risk assessment. Without the use of technology, these phases are often treated as a one-off project. By leveraging techn...
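
A minimal sketch of the discovery and risk-assessment stages the paper proposes might look like the following, where the extension list and the size/macro heuristics are purely illustrative assumptions (the paper does not specify its scoring rules):

```python
# Sketch of spreadsheet discovery + coarse risk scoring over a file inventory.
# The extensions and heuristics below are illustrative assumptions only.
SPREADSHEET_EXTS = (".xls", ".xlsx", ".xlsm", ".csv")

def assess_inventory(files):
    """files: list of (path, size_bytes, has_macros); returns risk-ranked list."""
    findings = []
    for path, size, has_macros in files:
        if not path.lower().endswith(SPREADSHEET_EXTS):
            continue  # discovery: keep only spreadsheet files
        risk = 0
        if size > 1_000_000:
            risk += 2  # large workbooks tend to hide complexity
        if has_macros:
            risk += 3  # end-user code outside normal IT controls
        findings.append({"path": path, "risk": risk})
    return sorted(findings, key=lambda f: f["risk"], reverse=True)

inventory = [
    ("finance/model.xlsm", 2_500_000, True),
    ("notes/readme.txt", 1_000, False),
    ("hr/list.xlsx", 50_000, False),
]
ranked = assess_inventory(inventory)
```

In practice the inventory would come from a filesystem or share crawl rather than a hard-coded list.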

  6. Automated Assessment, Face to Face

    OpenAIRE

    Rizik M. H. Al-Sayyed; Amjad Hudaib; Muhannad AL-Shboul; Yousef Majdalawi; Mohammed Bataineh

    2010-01-01

    This research paper evaluates the usability of automated exams and compares them with traditional paper-and-pencil ones. It presents the results of a detailed study conducted at The University of Jordan (UoJ) that comprised students from 15 faculties. A set of 613 students was asked for their opinions concerning automated exams, and the responses were analyzed in depth. The results indicate that most students reported being satisfied with using automated exams, but they have sugg...

  7. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfi... (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints of a robot, a production cell, a production line, and the final product. The case study results illustrate that, depending on the actor and the level he/she acts at, sustainability and the actions that can be taken to contribute to a more sustainable product are perceived differently: even though the robot...

  8. Automated Data Quality Assessment of Marine Sensors

    OpenAIRE

    Smith, Daniel V; Leon Reznik; Paulo A. Souza; Timms, Greg P.

    2011-01-01

    The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) means that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected is fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classific...
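
A hard-classification QC test of the kind the abstract refers to can be sketched as a fixed range check; the variable names and plausibility limits below are illustrative assumptions, not from the paper:

```python
# Sketch of a hard-classification QA/QC test: each sensor reading is flagged
# pass/fail against fixed plausibility limits. Limits are illustrative.
LIMITS = {"sea_temp_c": (-2.0, 35.0), "salinity_psu": (2.0, 42.0)}

def qc_flag(variable, value):
    """Return 'pass' if the reading is inside the plausible range, else 'fail'."""
    lo, hi = LIMITS[variable]
    return "pass" if lo <= value <= hi else "fail"

flags = [qc_flag("sea_temp_c", v) for v in (12.3, 48.0, -1.5)]
```

A "soft" alternative would instead return a graded confidence rather than a binary flag, which is the direction such work tends to explore.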

  9. Automated Training Sample Extraction for Global Land Cover Mapping

    Directory of Open Access Journals (Sweden)

    Julien Radoux

    2014-05-01

    Land cover is one of the essential climate variables of the ESA Climate Change Initiative (CCI). In this context, the Land Cover CCI (LC CCI) project aims at building global land cover maps suitable for climate modeling based on Earth observation by satellite sensors. The challenge is to generate a set of successive maps that are both accurate and consistent over time. To do so, operational methods for the automated classification of optical images are investigated. The proposed approach consists of a locally trained classification using an automated selection of training samples from existing, but outdated, land cover information. Combinations of local extraction (based on spatial criteria) and self-cleaning of training samples (based on spectral criteria) are quantitatively assessed. Two large study areas, one in Eurasia and the other in South America, are considered. The proposed morphological cleaning of the training samples leads to higher accuracies than statistical outlier removal in the spectral domain. An optimal neighborhood has been identified for the local sample extraction. The results are coherent for the two test areas, showing an improvement of the overall accuracy compared with the original reference datasets and a significant reduction of macroscopic errors. More importantly, the proposed method partly controls the reliability of existing land cover maps as sources of training samples for supervised classification.
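
The statistical outlier removal used here as a baseline for self-cleaning can be sketched as follows; the class labels, spectral values, and threshold `k` are illustrative assumptions, and the paper's preferred morphological cleaning is not shown:

```python
from collections import defaultdict

# Sketch of spectral "self-cleaning" of training samples via per-class
# statistical outlier removal: drop samples more than k standard deviations
# from their class mean. Values and k are illustrative, not from the paper.
def clean_training_samples(samples, k=1.5):
    """samples: list of (class_label, spectral_value); returns kept samples."""
    by_class = defaultdict(list)
    for label, value in samples:
        by_class[label].append(value)
    stats = {}
    for label, values in by_class.items():
        mean = sum(values) / len(values)
        std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
        stats[label] = (mean, std)
    return [(label, value) for label, value in samples
            if stats[label][1] == 0
            or abs(value - stats[label][0]) <= k * stats[label][1]]

samples = [("forest", 0.30), ("forest", 0.32), ("forest", 0.31),
           ("forest", 0.90),  # spectrally atypical, likely outdated label
           ("water", 0.05), ("water", 0.06), ("water", 0.04)]
cleaned = clean_training_samples(samples)
```

The spectrally atypical "forest" sample is removed while consistent samples in both classes are retained.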

  10. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    This study investigates automated data accuracy assessment, as described in the data quality literature, for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry over a 10-year publication period, retrieved from two bibliometric data sources, Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard), and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects truer accuracy scores. The automated assessment method would need to be extended by additional rules that reflect the specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to research on finding a standardized assessment method for bibliographic data accuracy, as well as on defining the impact of data accuracy on the citation matching process.
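
The contrast between per-field and per-record accuracy noted in the abstract can be sketched like this; the field set and exact-match rule are simplifying assumptions:

```python
# Sketch of field-level vs. record-level accuracy scoring against a gold
# standard: one wrong field fails the whole record, so per-field accuracy is
# typically higher. Field names and exact matching are illustrative.
def accuracy_scores(records, gold, fields=("title", "year", "journal")):
    field_hits = 0
    record_hits = 0
    for rec, ref in zip(records, gold):
        matches = [rec.get(f) == ref.get(f) for f in fields]
        field_hits += sum(matches)
        record_hits += all(matches)
    n = len(records)
    return field_hits / (n * len(fields)), record_hits / n

records = [
    {"title": "A", "year": 2001, "journal": "X"},
    {"title": "B", "year": 2002, "journal": "Y"},
    {"title": "C", "year": 1999, "journal": "Z"},  # wrong year
]
gold = [
    {"title": "A", "year": 2001, "journal": "X"},
    {"title": "B", "year": 2002, "journal": "Y"},
    {"title": "C", "year": 2003, "journal": "Z"},
]
field_acc, record_acc = accuracy_scores(records, gold)
```

Here a single incorrect year gives 8/9 field accuracy but only 2/3 record accuracy, mirroring the study's per-field vs. per-record observation.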

  11. Six Key Topics for Automated Assessment Utilisation and Acceptance

    Directory of Open Access Journals (Sweden)

    Torsten REINERS

    2011-04-01

    Automated assessment technologies have been used in education for decades (e.g., computerised multiple-choice tests). In contrast, Automated Essay Grading (AEG) technologies: have existed for decades; are 'good in theory' (e.g., as accurate as humans, temporally and financially efficient, and able to enhance formative feedback); and yet are ostensibly used comparatively infrequently in Australian universities. To empirically examine these experiential observations, we conducted a national survey to explore the use of automated assessment in Australian universities and examine why adoption of AEG is limited. Quantitative and qualitative data were collected in an online survey from a sample of 265 staff and students from 5 Australian universities. The type of assessment used by the greatest proportion of respondents was essays/reports (82.6%); however, very few respondents had used AEG (3.8%). Recommendations are made regarding methods to promote technology utilisation, including the use of innovative dissemination channels such as 3D Virtual Worlds.

  12. Constructing Aligned Assessments Using Automated Test Construction

    Science.gov (United States)

    Porter, Andrew; Polikoff, Morgan S.; Barghaus, Katherine M.; Yang, Rui

    2013-01-01

    We describe an innovative automated test construction algorithm for building aligned achievement tests. By incorporating the algorithm into the test construction process, along with other test construction procedures for building reliable and unbiased assessments, the result is much more valid tests than result from current test construction…
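
One simple way an automated test construction algorithm can enforce alignment is to fill a content blueprint greedily from an item pool; this sketch is a generic illustration, not the authors' algorithm:

```python
# Sketch of blueprint-driven item selection: greedily pick items from a pool
# until each content area's target count is met. Pool and targets are
# illustrative; real algorithms also balance difficulty and reliability.
def build_test(pool, targets):
    """pool: list of (item_id, content_area); targets: {area: item_count}."""
    remaining = dict(targets)
    test = []
    for item_id, area in pool:
        if remaining.get(area, 0) > 0:
            test.append(item_id)
            remaining[area] -= 1
    return test

pool = [(1, "algebra"), (2, "geometry"), (3, "algebra"),
        (4, "algebra"), (5, "geometry")]
test = build_test(pool, {"algebra": 2, "geometry": 1})
```

The resulting test matches the blueprint's proportions, which is the alignment property the abstract emphasizes.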

  13. Automated Autonomy Assessment System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA has expressed the need to assess crew autonomy relative to performance and evaluate an optimal level of autonomy that maximizes individual and team...

  14. Automated Bone Age Assessment: Motivation, Taxonomies, and Challenges

    Directory of Open Access Journals (Sweden)

    Marjan Mansourvar

    2013-01-01

    Bone age assessment (BAA) of unknown people is one of the most important topics in clinical procedures for evaluating the biological maturity of children. BAA is usually performed by comparing an X-ray of the left hand and wrist with an atlas of known sample bones. Recently, BAA has gained remarkable ground in academia and medicine. Manual methods of BAA are time-consuming and prone to observer variability. This is a motivation for developing automated methods of BAA. However, although there is considerable research on automated assessment, much of it is still in the experimental stage. This survey provides a taxonomy of automated BAA approaches and discusses the challenges. Finally, we present suggestions for future research.

  15. An Automated Home Made Low Cost Vibrating Sample Magnetometer

    Science.gov (United States)

    Kundu, S.; Nath, T. K.

    2011-07-01

    The design and operation of a homemade, low-cost vibrating sample magnetometer is described here. The sensitivity of this instrument is better than 10^-2 emu, and it is found to be very efficient for measuring the magnetization of most ferromagnetic and other magnetic materials as a function of temperature down to 77 K and magnetic field up to 800 Oe. Both M(H) and M(T) data acquisition are fully automated, employing a computer and LabVIEW software.

  16. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    Science.gov (United States)

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately but throughput is still a major issue and automation is very essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome-aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput.
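
For context, the dose-estimation step behind a dicentric assay fits a linear-quadratic dose-response curve; the sketch below solves such a curve for dose, with illustrative (not validated) coefficients:

```python
# Sketch of dicentric-assay dose estimation: given a measured dicentric yield
# Y and a linear-quadratic calibration curve Y = c + alpha*D + beta*D^2,
# solve for the absorbed dose D (positive root). The coefficients here are
# illustrative assumptions, not a validated laboratory calibration.
def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Solve beta*D^2 + alpha*D + (c - y) = 0 for the non-negative root."""
    disc = alpha ** 2 - 4 * beta * (c - y)
    return (-alpha + disc ** 0.5) / (2 * beta)

dose = dose_from_yield(0.25)  # estimated dose for a yield of 0.25 dicentrics/cell
```

In an automated pipeline this calculation would sit downstream of the image-analysis step that counts aberrations per cell.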

  18. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  19. AUTOMATING GROUNDWATER SAMPLING AT HANFORD THE NEXT STEP

    Energy Technology Data Exchange (ETDEWEB)

    CONNELL CW; CONLEY SF; HILDEBRAND RD; CUNNINGHAM DE; R_D_Doug_Hildebrand@rl.gov; DeVon_E_Cunningham@rl.gov

    2010-01-21

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very "people intensive." Approximately 1500 wells are sampled each year by field personnel or "samplers." These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from two official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process where the data is transferred to the database and the electronic form is filed in managed records, thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.
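
The kind of up-front validation an electronic field form enables before data reaches a database can be sketched as below; the field names and rules are hypothetical illustrations, not Hanford's actual HEIS/HWIS schema:

```python
# Sketch of validating an electronic field-sampling record before posting it,
# replacing error-prone manual transcription. Fields and rules are
# hypothetical, chosen only to illustrate the workflow.
REQUIRED = ("well_id", "sample_date", "water_level_m")

def validate_record(record):
    """Return a list of validation errors; an empty list means acceptable."""
    errors = ["missing %s" % f for f in REQUIRED if f not in record]
    if "water_level_m" in record and record["water_level_m"] < 0:
        errors.append("water level must be non-negative")
    return errors

good = {"well_id": "299-W10-1", "sample_date": "2009-01-16",
        "water_level_m": 12.4}
bad = {"well_id": "299-W10-1", "water_level_m": -3.0}
```

Only records that pass such checks would be transferred to the database, with the form itself filed in managed records.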

  20. Automated assessment of mobility in bedridden patients.

    Science.gov (United States)

    Bennett, Stephanie; Goubran, Rafik; Rockwood, Kenneth; Knoefel, Frank

    2013-01-01

    Immobility in older patients is a costly problem for both patients and healthcare workers. The Hierarchical Assessment of Balance and Mobility (HABAM) is a clinical tool able to assess immobile patients and predict morbidity, yet could become more reliable and informative through automation. This paper proposes an algorithm to automatically determine which of three enacted HABAM scores (associated with bedridden patients) had been performed by volunteers. A laptop was used to gather pressure data from three mats placed on a standard hospital bed frame while five volunteers performed three enactments each. A system of algorithms was created, consisting of three subsystems. The first subsystem used mattress data to calculate individual sensor sums and eliminate the weight of the mattress. The second subsystem established a baseline pressure reading for each volunteer and used percentage change to identify and distinguish between two enactments. The third subsystem used calculated weight distribution ratios to determine if the data represented the remaining enactment. The system was tested for accuracy by inputting the volunteer data and recording the assessment output (a score per data set). The system identified 13 of 15 sets of volunteer data as expected. Examination of these results indicated that the two sets of data were not misidentified; rather, the volunteers had made mistakes in performance. These results suggest that this system of algorithms is effective in distinguishing between the three HABAM score enactments examined here, and emphasizes the potential for pervasive computing to improve traditional healthcare. PMID:24110676
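
The three subsystems described above can be sketched as a single classifier; the tare value, thresholds, and two-zone mat layout are illustrative assumptions rather than the paper's parameters:

```python
# Sketch of the three-subsystem idea: (1) subtract the mattress's own weight,
# (2) compare total pressure against a per-patient baseline via % change,
# (3) use the head/foot weight-distribution ratio to pick an enactment.
# All constants and labels are illustrative assumptions.
MATTRESS_TARE = 5.0  # pressure attributable to the empty mattress (assumed)

def classify_enactment(head_zone, foot_zone, baseline_total):
    """head_zone/foot_zone: summed sensor readings; baseline_total: supine total."""
    total = head_zone + foot_zone - MATTRESS_TARE           # subsystem 1: tare
    change = (total - baseline_total) / baseline_total       # subsystem 2: % change
    if change < -0.5:
        return "sitting on edge"  # much of the weight has left the mat
    ratio = head_zone / total if total else 0.0              # subsystem 3: ratio
    return "head elevated" if ratio > 0.6 else "lying flat"
```

For example, a balanced load at baseline classifies as lying flat, a head-heavy load as head elevated, and a large pressure drop as sitting on the bed's edge.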

  1. Validation of Automated Scoring of Science Assessments

    Science.gov (United States)

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  2. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.;

    2009-01-01

    Biomimetic membrane experiments generate large amounts of data, requiring data processing software to analyze and organize them. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time-series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management. The combined solution provides a cost-efficient and fast way to acquire, process, and administrate large amounts of voltage clamp data that may be too laborious and time-consuming to handle manually.

  3. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to provide the data contained on reporting forms also on a diskette, for uploading into databases used for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples; data entry, checking, and transmission; data assessment and qualification for use; and report generation that will tie the analytical data quality back to the performance criteria defined prior to sample collection. Industry-standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and protect LANL's investment for the future

  4. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  5. Fast detection of Noroviruses using a real-time PCR assay and automated sample preparation

    Directory of Open Access Journals (Sweden)

    Schmid Michael

    2004-06-01

    Background: Noroviruses (NoV) have become one of the most commonly reported causative agents of large outbreaks of non-bacterial acute gastroenteritis worldwide, as well as of sporadic gastroenteritis in the community. Currently, reverse transcriptase polymerase chain reaction (RT-PCR) assays have been implemented in NoV diagnosis, but improvements that simplify and standardize sample preparation, amplification, and detection will be further needed. The combination of automated sample preparation and real-time PCR offers such refinements. Methods: We have designed a new real-time RT-PCR assay on the LightCycler (LC) with SYBR Green detection and melting curve analysis (Tm) to detect NoV RNA in patient stool samples. The performance of the real-time PCR assay was compared with that obtained in parallel with a commercially available enzyme immunoassay (ELISA) for antigen detection by testing a panel of 52 stool samples. Additionally, in a collaborative study with the Baden-Wuerttemberg State Health Office, Stuttgart (Germany), the real-time PCR results were blindly assessed using a previously well-established nested PCR (nPCR) as the reference method, since PCR-based techniques are now considered the "gold standard" for NoV detection in stool specimens. Results: Analysis of 52 clinical stool samples by real-time PCR yielded results that were consistent with the reference nPCR results, while marked differences between the two PCR-based methods and the antigen ELISA were observed. Our results indicate that PCR-based procedures are more sensitive and specific than antigen ELISA for detecting NoV in stool specimens. Conclusions: The combination of automated sample preparation and real-time PCR provided reliable diagnostic results in less time than conventional RT-PCR assays. These benefits make it a valuable tool for routine laboratory practice, especially in terms of rapid and appropriate outbreak-control measures in health-care facilities and other settings.
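
The melting curve analysis (Tm) step can be sketched numerically: Tm is read off where the negative derivative of fluorescence with respect to temperature peaks. The data points below are invented for illustration:

```python
# Sketch of melting-curve analysis: estimate Tm as the temperature where
# -dF/dT (negative derivative of fluorescence vs. temperature) peaks.
# The temperature/fluorescence values are illustrative, not assay data.
def melting_temperature(temps, fluorescence):
    """Return the midpoint temperature of the interval with maximal -dF/dT."""
    derivs = [-(fluorescence[i + 1] - fluorescence[i]) /
              (temps[i + 1] - temps[i])
              for i in range(len(temps) - 1)]
    i_max = max(range(len(derivs)), key=derivs.__getitem__)
    return (temps[i_max] + temps[i_max + 1]) / 2

temps = [70, 72, 74, 76, 78, 80]          # degrees Celsius
fluor = [100, 98, 90, 55, 30, 25]         # SYBR Green fluorescence (arbitrary)
tm = melting_temperature(temps, fluor)
```

A product-specific Tm peak is what lets SYBR Green assays distinguish specific amplicons from primer-dimers.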

  6. Automated washing of FTA Card punches and PCR setup for reference samples using a LIMS-controlled Sias Xantus automated liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Olsen, Addie Nina; Frøslev, Tobias G.;

    2009-01-01

    We have implemented and validated automated methods for washing FTA Card punches containing buccal samples and subsequent PCR setup using a Sias Xantus automated liquid handler. The automated methods were controlled by worklists generated by our LabWare Laboratory Information Management System...

  7. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others]

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  8. Evaluation of the measurement uncertainty in automated long-term sampling of PCDD/PCDFs.

    Science.gov (United States)

    Vicaretti, M; D'Emilia, G; Mosca, S; Guerriero, E; Rotatori, M

    2013-12-01

    Since the publication of the first version of European standard EN-1948 in 1996, long-term sampling equipment has been improved to a high standard for the sampling and analysis of polychlorodibenzo-p-dioxin (PCDD)/polychlorodibenzofuran (PCDF) emissions from industrial sources. Current automated PCDD/PCDF sampling systems make it possible to extend the measurement time from 6-8 h to 15-30 days, so that the data better represent the plant's actual pollutant emissions over the long term. EN-1948:2006 is still the European technical reference standard for the determination of PCDD/PCDF from stationary source emissions. In this paper, a methodology to estimate the measurement uncertainty of long-term automated sampling is presented. The methodology has been tested on a set of high-concentration sampling data resulting from a specific experience; it is proposed with the intent that it be applied in further similar studies and generalized. A comparison between short-term sampling data resulting from manual and automated parallel measurements has also been considered, in order to verify the feasibility and usefulness of automated systems and to establish correlations between the results of the two methods, so that the manual method can be used to calibrate the automated long-term one. The uncertainty components of the manual method are analyzed, following the requirements of EN-1948-3:2006, allowing a preliminary evaluation of the corresponding uncertainty components of the automated system. Then, a comparison between experimental data coming from parallel sampling campaigns carried out over short- and long-term sampling periods is presented. Long-term sampling is more reliable for monitoring PCDD/PCDF emissions than occasional short-term sampling. Automated sampling systems can assure very useful emission data in both short and long sampling periods. Despite this, due to the different application of the long-term sampling systems, the automated results could not be
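The record above combines several uncertainty components into an overall measurement uncertainty. As a minimal sketch, assuming independent components expressed as relative standard uncertainties (the component names and values below are hypothetical illustrations, not figures from EN-1948), a GUM-style root-sum-of-squares combination looks like:

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares combination of independent standard uncertainties (GUM)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical relative standard uncertainties: sampling, extraction, instrumental analysis
u_components = [0.08, 0.05, 0.10]
u_combined = combined_uncertainty(u_components)
U_expanded = 2 * u_combined  # expanded uncertainty with coverage factor k = 2
```

The expanded uncertainty with k = 2 corresponds to a roughly 95% coverage interval, which is the form in which emission uncertainties are usually reported.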

  9. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.
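The ranking step described above, ordering portfolio elements by quantitative relative risk reduction, is straightforward to sketch; the element names and scores here are notional, echoing the record's own caveat that its products are notional:

```python
def rank_by_risk_reduction(scores):
    """Return element names sorted by estimated relative risk reduction, best first."""
    return sorted(scores, key=scores.get, reverse=True)

# Notional portfolio elements with notional risk-reduction scores
scores = {"crew-alerting aid": 0.12, "mode-awareness display": 0.31, "training module": 0.07}
```

A sensitivity analysis would then perturb each score and re-rank, flagging elements whose position is unstable.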

  10. Automated assessment of medical training evaluation text.

    Science.gov (United States)

    Zhang, Rui; Pakhomov, Serguei; Gladding, Sophia; Aylward, Michael; Borman-Shoap, Emily; Melton, Genevieve B

    2012-01-01

    Medical post-graduate residency training and medical student training increasingly utilize electronic systems to evaluate trainee performance based on defined training competencies, using quantitative and qualitative data, the latter of which typically consists of text comments. Medical education is concomitantly becoming a growing area of clinical research. While electronic systems have proliferated in number, little work has been done to help manage and analyze qualitative data from these evaluations. We explored the use of text-mining techniques to assist medical education researchers in sentiment analysis and topic analysis of residency evaluations, with a sample of 812 evaluation statements. While comments were predominantly positive, sentiment analysis improved the ability to discriminate statements, with 93% accuracy. As in other domains, Latent Dirichlet Allocation and Information Gain revealed groups of core subjects and appear to be useful for identifying topics in these data.
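The Information Gain measure mentioned above can be sketched with a toy computation: how much knowing that a word occurs in a comment reduces the uncertainty about its sentiment label. The comment strings and labels are invented for illustration, and a real pipeline would use proper tokenization rather than `str.split`:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label list."""
    if not labels:
        return 0.0
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(docs, labels, word):
    """Reduction in label entropy from knowing whether `word` occurs in a comment."""
    with_word = [l for d, l in zip(docs, labels) if word in d.split()]
    without_word = [l for d, l in zip(docs, labels) if word not in d.split()]
    n = len(labels)
    conditional = (len(with_word) / n) * entropy(with_word) + \
                  (len(without_word) / n) * entropy(without_word)
    return entropy(labels) - conditional

# Invented example comments with positive (1) / negative (0) sentiment labels
docs = ["excellent clinical reasoning", "poor documentation",
        "excellent rapport", "poor punctuality"]
labels = [1, 0, 1, 0]
```

A word that perfectly separates positive from negative comments (here "excellent") scores the maximum gain of 1 bit for a balanced binary label.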

  11. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; ,; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  12. Automated sample preparation and analysis using a sequential-injection-capillary electrophoresis (SI-CE) interface.

    Science.gov (United States)

    Kulka, Stephan; Quintás, Guillermo; Lendl, Bernhard

    2006-06-01

    A fully automated sequential-injection-capillary electrophoresis (SI-CE) system was developed using commercially available components such as the syringe pump, the selection and injection valves, and the high-voltage power supply. The interface connecting the SI with the CE unit consisted of two T-pieces, where the capillary was inserted in one T-piece and a Pt electrode in the other (grounded) T-piece. By pressurising the whole system with the syringe pump, hydrodynamic injection was feasible. For characterisation, the system was applied to a mixture of adenosine and adenosine monophosphate at different concentrations. The calibration curve obtained gave a detection limit of 0.5 microg g(-1) (correlation coefficient of 0.997). The reproducibility of the injection was also assessed, resulting in an RSD value (5 injections) of 5.4%. The total time of analysis, from injection, conditioning and separation to cleaning the capillary again, was 15 minutes. In another application, employing the full power of the automated SI-CE system, myoglobin was mixed directly in the flow system with different concentrations of sodium dodecyl sulfate (SDS), a known denaturing agent. The different conformations obtained in this way were analysed with the CE system, and a distinct shift in migration time and a decrease of the native myoglobin (Mb) peak could be observed. The protein samples prepared were also analysed with off-line infrared spectroscopy (IR), confirming these results. PMID:16732362
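Two figures of merit reported above, injection repeatability (RSD) and the calibration fit, are easy to reproduce. A minimal sketch with invented peak-area values (not the paper's data):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%) of replicate injections."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def linear_fit(x, y):
    """Least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Invented replicate peak areas for 5 injections of the same standard
areas = [102.0, 98.5, 105.1, 99.7, 101.2]
repeatability = rsd_percent(areas)
```

The detection limit would then follow from the calibration slope and the baseline noise (e.g. 3 times the blank standard deviation divided by the slope).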

  13. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    Science.gov (United States)

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used for clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding links between samples is more important there than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards, and was developed to increase the amount of impurities extracted while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention-time shift and response deviation introduced by sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient between two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
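The similarity assessment described above reduces to computing a Pearson correlation coefficient between the impurity-peak response vectors of two samples. A minimal sketch; the peak vectors are invented, whereas a real profile would contain the aligned, normalized responses of the target impurities:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length peak-response vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented impurity profiles: sample_b is roughly a scaled copy of sample_a
# (same origin), while sample_c has a different impurity pattern (different origin)
sample_a = [0.9, 2.1, 0.3, 4.8, 1.1]
sample_b = [1.8, 4.1, 0.7, 9.5, 2.3]
sample_c = [3.0, 0.2, 2.5, 0.4, 0.1]
```

Pairs scoring above a fixed cutoff (0.99 in the study) would be flagged as candidates for a common origin and reviewed visually.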

  14. Rapid and automated determination of plutonium and neptunium in environmental samples

    International Nuclear Information System (INIS)

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The methodology development in this work comprises five subjects: (1) development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with inductively coupled plasma mass spectrometry detection (Paper II); (2) methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the methods developed in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium, as demanded in situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  15. Rapid and automated determination of plutonium and neptunium in environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, J.

    2011-03-15

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The methodology development in this work comprises five subjects: (1) development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with inductively coupled plasma mass spectrometry detection (Paper II); (2) methodology development and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples (Paper IV); (4) investigation of the suitability and applicability of 242Pu as a tracer for rapid neptunium determination using anion exchange chromatography in an SI-network coupled with inductively coupled plasma mass spectrometry (Paper V); (5) exploration of macro-porous anion exchange chromatography for rapid and simultaneous determination of plutonium and neptunium within an SI system (Paper VI). The results demonstrate that the methods developed in this study are reliable and efficient for accurate assays of trace levels of plutonium and neptunium, as demanded in situations including environmental risk monitoring and assessment, emergency preparedness and surveillance of contaminated areas. (Author)

  16. An automated atmospheric sampling system operating on 747 airliners

    Science.gov (United States)

    Perkins, P. J.; Gustafsson, U. R. C.

    1976-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape-recorded in flight for later computer processing on the ground.

  17. Automated Research Impact Assessment: A New Bibliometrics Approach

    Science.gov (United States)

    Drew, Christina H.; Pettibone, Kristianna G.; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-01-01

    As federal programs are held more accountable for their research investments, The National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in “important” research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS funded research. PMID:26989272

  18. SASSI: Subsystems for Automated Subsurface Sampling Instruments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Autonomous surface sampling systems are necessary, near term, to construct a historical view of planetary significant events; as well as allow for the...

  19. Automated biowaste sampling system, solids subsystem operating model, part 2

    Science.gov (United States)

    Fogal, G. L.; Mangialardi, J. K.; Stauffer, R. E.

    1973-01-01

    The detailed design and fabrication of the Solids Subsystem were implemented. The subsystem's capacity for the collection, storage, or sampling of feces and vomitus from six subjects was tested and verified.

  20. SASSI: Subsystems for Automated Subsurface Sampling Instruments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future robotic planetary exploration missions will benefit greatly from the ability to capture rock and/or regolith core samples that deliver the stratigraphy of...

  1. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.;

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat...

  2. Experiences of Using Automated Assessment in Computer Science Courses

    Directory of Open Access Journals (Sweden)

    John English

    2015-10-01

    Full Text Available In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students type in their own answers). Students were allowed to correct errors based on feedback provided by the system and resubmit their answers. A total of 141 students were surveyed to assess their opinions of this approach, and we analysed their responses. Analysis of the questionnaire showed a low correlation between questions, indicating the statistical independence of the individual questions. As a whole, student feedback on using Checkpoint was very positive, emphasizing the benefits of multiple attempts, impartial marking, and a quick turnaround time for submissions. Many students said that Checkpoint gave them confidence in learning and motivation to practise. Students also said that the detailed feedback that Checkpoint generated when their programs failed helped them understand their mistakes and how to correct them.

  3. Automated biowaste sampling system urine subsystem operating model, part 1

    Science.gov (United States)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.

  4. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    Science.gov (United States)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP aims to produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high-resolution impervious cover data set is not only significant for urbanization studies but also needed for global carbon, hydrology, and energy balance research. A supervised classification method, the regression tree, is applied in this project, and a set of accurate training samples is the key to supervised classification. Here we developed global-scale training samples from fine-resolution (about 1 m) satellite data (Quickbird and Worldview2), and then aggregated the fine-resolution impervious cover map to 30 m resolution. To improve classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally. For example, in Europe alone there are 174 training sites, with site sizes ranging from 4.5 km by 4.5 km to 8.1 km by 3.6 km and more than six million training samples in total. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: site and scene. At the site level, all training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel, the samples falling in each 10% interval forming one group. For each group, both univariate and multivariate outliers are detected and removed. The screening process then escalates to the scene level, where a similar procedure with a looser threshold is applied to allow for possible variance due to site differences.
We do not perform the screening process across scenes because the scenes might vary due to
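The grouping and univariate step of the site-level screening described above can be sketched as follows; the z-score threshold and sample values are hypothetical stand-ins, and the multivariate step (e.g. a Mahalanobis-distance test) is omitted:

```python
import statistics

def group_by_impervious_fraction(samples):
    """Bin (percent_impervious, value) samples into 10 groups by 10% intervals."""
    groups = {i: [] for i in range(10)}
    for pct, value in samples:
        # 100% falls into the last (90-100%) group
        groups[min(int(pct // 10), 9)].append(value)
    return groups

def screen_univariate(values, z_max=3.0):
    """Drop values whose z-score against their group exceeds z_max."""
    mu = statistics.mean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return list(values)
    return [v for v in values if abs(v - mu) / sd <= z_max]
```

Scene-level screening would reuse `screen_univariate` with a looser `z_max` applied per scene.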

  5. Automated Genotyping of Biobank Samples by Multiplex Amplification of Insertion/Deletion Polymorphisms

    OpenAIRE

    Lucy Mathot; Elin Falk-Sörqvist; Lotte Moens; Marie Allen; Tobias Sjöblom; Mats Nilsson

    2012-01-01

    The genomic revolution in oncology will entail mutational analyses of vast numbers of patient-matched tumor and normal tissue samples. This has meant an increased risk of patient sample mix-up due to manual handling. Therefore, scalable genotyping and sample identification procedures are essential to pathology biobanks. We have developed an efficient alternative to traditional genotyping methods suited for automated analysis. By targeting 53 prevalent deletions and insertions found in human p...

  6. Assessing Working Memory in Spanish-Speaking Children: Automated Working Memory Assessment Battery Adaptation

    Science.gov (United States)

    Injoque-Ricle, Irene; Calero, Alejandra D.; Alloway, Tracy P.; Burin, Debora I.

    2011-01-01

    The Automated Working Memory Assessment battery was designed to assess verbal and visuospatial passive and active working memory processing in children and adolescents. The aim of this paper is to present the adaptation and validation of the AWMA battery to Argentinean Spanish-speaking children aged 6 to 11 years. Verbal subtests were adapted and…

  7. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blank and carry-over for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.
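A programmed purification method of the kind described reduces to a table of steps with user-defined volumes and flow rates. The step names, reagents, and numbers below are hypothetical illustrations, not the actual prepFAST-MC protocol:

```python
# Hypothetical method table: (step, reagent, volume_mL, flow_rate_mL_per_min)
METHOD = [
    ("load",      "sample in 3 M HNO3", 1.0, 0.5),
    ("wash",      "3 M HNO3",           2.0, 1.0),
    ("condition", "6 M HCl",            1.5, 1.0),
    ("elute",     "0.1 M HCl",          1.0, 0.5),
]

def run_time_minutes(method):
    """Total time for one column pass: sum of volume / flow rate per step."""
    return sum(volume / flow for _, _, volume, flow in method)

def overnight_capacity(method, hours=12.0):
    """How many sequential samples fit in an unattended run of `hours`."""
    return int(hours * 60 / run_time_minutes(method))
```

This kind of arithmetic is how one checks whether a proposed method table fits the stated throughput (e.g. 60 samples overnight) before committing reagents.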

  8. Security Measures in Automated Assessment System for Programming Courses

    Directory of Open Access Journals (Sweden)

    Jana Šťastná

    2015-12-01

    Full Text Available A desirable characteristic of programming code assessment is to provide the learner with the most appropriate information regarding the code's functionality, as well as a chance to improve. This can hardly be achieved when the number of learners is high (500 or more). In this paper we address the problem of risky code testing and the availability of an assessment platform, Arena, dealing with potential security risks when providing automated assessment for a large set of source code. Looking at students' programs as if they were potentially malicious inspired us to investigate separated execution environments, used by security experts for secure software analysis. The results also show that availability issues of our assessment platform can be conveniently resolved with task queues. Special attention is paid to Docker, a container-based virtualization tool ensuring that no risky code can affect the security of the assessment system. The assessment platform Arena makes it possible to regularly, effectively and securely assess students' source code in various programming courses. In addition, it is a motivating factor and helps students engage in the educational process.
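The availability fix mentioned above, queuing submissions rather than assessing them synchronously, can be sketched with Python's standard library; the sandboxed execution itself (containerized in Arena) is reduced to a placeholder function here:

```python
import queue
import threading

def assess(submission):
    """Placeholder for sandboxed compilation and testing of one submission."""
    return f"assessed:{submission}"

def worker(tasks, results, lock):
    while True:
        item = tasks.get()
        if item is None:          # sentinel: shut this worker down
            tasks.task_done()
            break
        student_id, code = item
        outcome = assess(code)
        with lock:
            results[student_id] = outcome
        tasks.task_done()

tasks = queue.Queue()
results = {}
lock = threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(3)]
for t in threads:
    t.start()
for sid in ("s1", "s2", "s3", "s4"):
    tasks.put((sid, f"solution-{sid}"))
for _ in threads:
    tasks.put(None)               # one sentinel per worker
tasks.join()
for t in threads:
    t.join()
```

Bounding the number of workers caps the load from bursts of submissions, which is the availability property the record describes.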

  9. Automated Video Quality Assessment for Deep-Sea Video

    Science.gov (United States)

    Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.

    2015-12-01

    Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source, and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): the rate of absorption of light in water varies by wavelength, and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating
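Frame-level checks for the challenges listed above can be sketched with a few simple statistics; the tiny grayscale frames below are toy data, and a production pipeline would compute these on decoded video frames:

```python
def mean_luminance(frame):
    """Average pixel intensity of a grayscale frame (list of pixel rows)."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def michelson_contrast(frame):
    """(Lmax - Lmin) / (Lmax + Lmin); values near 0 flag the low-contrast case."""
    pixels = [p for row in frame for p in row]
    lo, hi = min(pixels), max(pixels)
    return 0.0 if hi + lo == 0 else (hi - lo) / (hi + lo)

def luminance_uniformity(frame):
    """Ratio of darkest to brightest row mean; low values flag uneven lighting."""
    row_means = [sum(row) / len(row) for row in frame]
    return min(row_means) / max(row_means)
```

Frames scoring below chosen thresholds on these metrics would be excluded from, or corrected before, the full automated analysis.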

  10. Quantification of Human Movement for Assessment in Automated Exercise Coaching

    CERN Document Server

    Hagler, Stuart; Bajczy, Ruzena; Pavel, Misha

    2016-01-01

    Quantification of human movement is a challenge in many areas, ranging from physical therapy to robotics. We quantify human movement for the purpose of providing automated exercise coaching in the home. We developed a model-based assessment and inference process that combines biomechanical constraints with movement assessment based on the Microsoft Kinect camera. To illustrate the approach, we quantify the performance of a simple squatting exercise using two model-based metrics that are related to strength and endurance, and provide an estimate of the strength and energy expenditure of each exercise session. We look at data for 5 subjects, and show that for some subjects the metrics indicate a trend consistent with improved exercise performance.

  11. Quantitative Vulnerability Assessment of Cyber Security for Distribution Automation Systems

    Directory of Open Access Journals (Sweden)

    Xiaming Ye

    2015-06-01

    Full Text Available The distribution automation system (DAS is vulnerable to cyber-attacks due to the widespread use of terminal devices and standard communication protocols. On account of the cost of defense, it is impossible to ensure the security of every device in the DAS. Given this background, a novel quantitative vulnerability assessment model of cyber security for DAS is developed in this paper. In the assessment model, the potential physical consequences of cyber-attacks are analyzed from two levels: terminal device level and control center server level. Then, the attack process is modeled based on game theory and the relationships among different vulnerabilities are analyzed by introducing a vulnerability adjacency matrix. Finally, the application process of the proposed methodology is illustrated through a case study based on bus 2 of the Roy Billinton Test System (RBTS. The results demonstrate the reasonability and effectiveness of the proposed methodology.
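The vulnerability adjacency matrix idea above can be illustrated with a transitive-closure computation: if entry (i, j) means that exploiting vulnerability i enables vulnerability j, the closure shows every vulnerability ultimately reachable from an initial compromise. The 3-node matrix is a made-up example, not data from the paper:

```python
def transitive_closure(adj):
    """Warshall's algorithm: closure[i][j] = 1 if j is reachable from i."""
    n = len(adj)
    closure = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if closure[i][k] and closure[k][j]:
                    closure[i][j] = 1
    return closure

# Made-up chain: terminal device (0) -> communication link (1) -> control center server (2)
adj = [[0, 1, 0],
       [0, 0, 1],
       [0, 0, 0]]
```

Weighting the closed matrix by consequence severity at each node would then yield a quantitative vulnerability score per entry point.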

  12. Application of bar codes to the automation of analytical sample data collection

    International Nuclear Information System (INIS)

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling is accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC, data collection is done on a central VAX 11/730 (Digital Equipment Corp.). Bar code readers are used to log-in samples to be analyzed on liquid scintillation counters. The VAX 11/730 processes the data and generates reports, data storage is on the VAX 11/730 and backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented

  13. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    OpenAIRE

    Qiao, Jixin

    2011-01-01

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development in this work consists of 5 subjects stated as follows: 1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combina...

  14. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    Science.gov (United States)

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences.

  15. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33.

    Science.gov (United States)

    Round, A R; Franke, D; Moritz, S; Huchler, R; Fritsche, M; Malthan, D; Klaering, R; Svergun, D I; Roessle, M

    2008-10-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client-server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  16. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies.

    Directory of Open Access Journals (Sweden)

    Asad Abdi

    Full Text Available Summarization is a process to select important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be important as a tool to improve comprehension, it has attracted interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess the students' summaries and these tasks are very time-consuming. Thus, a computer-assisted assessment can be used to help teachers to conduct this task more effectively.This paper aims to propose an algorithm based on the combination of semantic relations between words and their syntactic composition to identify summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm for the automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing.

  17. An automated system for assessing cognitive function in any environment

    Science.gov (United States)

    Wesnes, Keith A.

    2005-05-01

    The Cognitive Drug Research (CDR) computerized assessment system has been in use in worldwide clinical trials for over 20 years. It is a computer based system which assesses core aspects of human cognitive function including attention, information, working memory and long-term memory. It has been extensively validated and can be performed by a wide range of clinical populations including patients with various types of dementia. It is currently in worldwide use in clinical trials to evaluate new medicines, as well as a variety of programs involving the effects of age, stressors illnesses and trauma upon human cognitive function. Besides being highly sensitive to drugs which will impair or improve function, its utility has been maintained over the last two decades by constantly increasing the number of platforms upon which it can operate. Besides notebook versions, the system can be used on a wrist worn device, PDA, via tht telephone and over the internet. It is the most widely used automated cognitive function assessment system in worldwide clinical research. It has dozens of parallel forms and requires little training to use or administer. The basic development of the system wil be identified, and the huge databases (normative, patient population, drug effects) which have been built up from hundreds of clinical trials will be described. The system is available for use in virtually any environment or type of trial.

  18. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies

    Science.gov (United States)

    Abdi, Asad; Idris, Norisma; Alguliyev, Rasim M.; Aliguliyev, Ramiz M.

    2016-01-01

    Background Summarization is a process to select important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be important as a tool to improve comprehension, it has attracted interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess the students' summaries and these tasks are very time-consuming. Thus, a computer-assisted assessment can be used to help teachers to conduct this task more effectively. Design/Results This paper aims to propose an algorithm based on the combination of semantic relations between words and their syntactic composition to identify summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm for the automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing. PMID:26735139

  19. Functional profiling of live melanoma samples using a novel automated platform.

    Directory of Open Access Journals (Sweden)

    Adam Schayowitz

    Full Text Available AIMS: This proof-of-concept study was designed to determine if functional, pharmacodynamic profiles relevant to targeted therapy could be derived from live human melanoma samples using a novel automated platform. METHODS: A series of 13 melanoma cell lines was briefly exposed to a BRAF inhibitor (PLX-4720 on a platform employing automated fluidics for sample processing. Levels of the phosphoprotein p-ERK in the mitogen-activated protein kinase (MAPK pathway from treated and untreated sample aliquots were determined using a bead-based immunoassay. Comparison of these levels provided a determination of the pharmacodynamic effect of the drug on the MAPK pathway. A similar ex vivo analysis was performed on fine needle aspiration (FNA biopsy samples from four murine xenograft models of metastatic melanoma, as well as 12 FNA samples from patients with metastatic melanoma. RESULTS: Melanoma cell lines with known sensitivity to BRAF inhibitors displayed marked suppression of the MAPK pathway in this system, while most BRAF inhibitor-resistant cell lines showed intact MAPK pathway activity despite exposure to a BRAF inhibitor (PLX-4720. FNA samples from melanoma xenografts showed comparable ex vivo MAPK activity as their respective cell lines in this system. FNA samples from patients with metastatic melanoma successfully yielded three categories of functional profiles including: MAPK pathway suppression; MAPK pathway reactivation; MAPK pathway stimulation. These profiles correlated with the anticipated MAPK activity, based on the known BRAF mutation status, as well as observed clinical responses to BRAF inhibitor therapy. CONCLUSION: Pharmacodynamic information regarding the ex vivo effect of BRAF inhibitors on the MAPK pathway in live human melanoma samples can be reproducibly determined using a novel automated platform. Such information may be useful in preclinical and clinical drug development, as well as predicting response to targeted therapy in

  20. Development of automated preparation system for isotopocule analysis of N2O in various air samples

    Science.gov (United States)

    Toyoda, Sakae; Yoshida, Naohiro

    2016-05-01

    Nitrous oxide (N2O), an increasingly abundant greenhouse gas in the atmosphere, is the most important stratospheric ozone-depleting gas of this century. Natural abundance ratios of isotopocules of N2O, NNO molecules substituted with stable isotopes of nitrogen and oxygen, are a promising index of various sources or production pathways of N2O and of its sink or decomposition pathways. Several automated methods have been reported to improve the analytical precision for the isotopocule ratio of atmospheric N2O and to reduce the labor necessary for complicated sample preparation procedures related to mass spectrometric analysis. However, no method accommodates flask samples with limited volume or pressure. Here we present an automated preconcentration system which offers flexibility with respect to the available gas volume, pressure, and N2O concentration. The shortest processing time for a single analysis of typical atmospheric sample is 40 min. Precision values of isotopocule ratio analysis are < 0.1 ‰ for δ15Nbulk (average abundances of 14N15N16O and 15N14N16O relative to 14N14N16O), < 0.2 ‰ for δ18O (relative abundance of 14N14N18O), and < 0.5 ‰ for site preference (SP; difference between relative abundance of 14N15N16O and 15N14N16O). This precision is comparable to that of other automated systems, but better than that of our previously reported manual measurement system.

  1. An instrument for automated purification of nucleic acids from contaminated forensic samples.

    Science.gov (United States)

    Broemeling, David J; Pel, Joel; Gunn, Dylan C; Mai, Laura; Thompson, Jason D; Poon, Hiron; Marziali, Andre

    2008-02-01

    Forensic crime scene sample analysis, by its nature, often deals with samples in which there are low amounts of nucleic acids, on substrates that often lead to inhibition of subsequent enzymatic reactions such as PCR amplification for STR profiling. Common substrates include denim from blue jeans, which yields indigo dye as a PCR inhibitor, and soil, which yields humic substances as inhibitors. These inhibitors frequently co-extract with nucleic acids in standard column or bead-based preps, leading to frequent failure of STR profiling. We present a novel instrument for DNA purification of forensic samples that is capable of highly effective concentration of nucleic acids from soil particulates, fabric, and other complex samples including solid components. The novel concentration process, known as SCODA, is inherently selective for long charged polymers such as DNA, and therefore is able to effectively reject known contaminants. We present an automated sample preparation instrument based on this process, and preliminary results based on mock forensic samples.

  2. Automated mango fruit assessment using fuzzy logic approach

    Science.gov (United States)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In term of value and volume of production, mango is the third most important fruit product next to pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified to the grade accordingly. However, the current practice in mango industry is grading the mango fruit manually using human graders. This method is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using RGB fiber optic sensor and fuzzy logic approach. The calculation of maximum, minimum and mean values based on RGB fiber optic sensor and the decision making development using minimum entropy formulation to analyse the data and make the classification for the mango fruit. This proposed method is capable to differentiate three different grades of mango fruit automatically with 77.78% of overall accuracy compared to human graders sorting. This method was found to be helpful for the application in the current agricultural industry.

  3. Assessment of a five-color flow cytometric assay for verifying automated white blood cell differentials

    Institute of Scientific and Technical Information of China (English)

    HUANG Chun-mei; YU Lian-hui; PU Cheng-wei; WANG Xin; WANG Geng; SHEN Li-song; WANG Jian-zhong

    2013-01-01

    Background White blood cell (WBC) counts and differentials performed using an automated cell counter typically require manual microscopic review.However,this last step is time consuming and requires experienced personnel.We evaluated the clinical efficiency of using flow cytometry (FCM) employing a six-antibody/five-color reagent for verifying automated WBC differentials.Methods A total of 56 apparently healthy samples were assessed using a five-color flow cytometer to verify the normal reference ranges of WBC differentials.WBC differentials of 622 samples were also determined using both a cell counter and FCM.These results were then confirmed using manual microscopic methods.Results The probabilities for all of the parameters of WBC differentials exceeded the corresponding normal reference ranges by no more than 7.5%.The resulting WBC differentials were well correlated between FCM and the cell counter (r >0.88,P <0.001),except in the case of basophils.Neutrophils,lymphocytes,and eosinophils were well correlated between FCM and standard microscopic cytology assessment (r >0.80,P <0.001).The sensitivities of FCM for identification of immature granulocytes and blast cells (72.03% and 22.22%,respectively) were higher than those of the cell counter method (44.92% and 11.11%,respectively).The specificities of FCM were all above 85%,substantially better than those of the cell counter method.Conclusion These five-color FCM assays could be applied to accurately verify abnormal results of automated assessment of WBC differentials.

  4. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  5. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments.

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A

    2016-08-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  6. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments.

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A

    2016-08-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically.

  7. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method. PMID:25033319

  8. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    Energy Technology Data Exchange (ETDEWEB)

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above water settler sludge sampling methods to determine if the added cost for underwater sampling for the sole purpose of worker dose reductions is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling for settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above water sampling to the underwater method that is planned by the Sludge Treatment Project (STP) and determine if settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA) and eliminate the need for costly redesigns, testing and personnel retraining.

  9. Are Flow Injection-based Approaches Suitable for Automated Handling of Solid Samples?

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Cerdà, Victor

    , multisyringe flow injection, and micro-Lab-on-valve are presented as appealing approaches for on-line handling of solid samples. Special emphasis is given to the capability of flow systems to accommodate sequential extraction protocols for partitioning of trace elements and nutrients in environmental solids (e......Flow-based approaches were originally conceived for liquid-phase analysis, implying that constituents in solid samples generally had to be transferred into the liquid state, via appropriate batch pretreatment procedures, prior to analysis. Yet, in recent years, much effort has been focused...... on the design and characterisation of sample processings units coupled with flowing systems aiming to enable the direct introduction and treatment of solid samples of environmental and agricultural origin in an automated fashion [1]. In this respect, various sample pre-treatment techniques including...

  10. Assessment of organic matter resistance to biodegradation in volcanic ash soils assisted by automated interpretation of infrared spectra from humic acid and whole soil samples by using partial least squares

    Science.gov (United States)

    Hernández, Zulimar; Pérez Trujillo, Juan Pedro; Hernández-Hernández, Sergio Alexander; Almendros, Gonzalo; Sanz, Jesús

    2014-05-01

    From a practical viewpoint, the most interesting possibilities of applying infrared (IR) spectroscopy to soil studies lie on processing IR spectra of whole soil (WS) samples [1] in order to forecast functional descriptors at high organizational levels of the soil system, such as soil C resilience. Currently, there is a discussion on whether the resistance to biodegradation of soil organic matter (SOM) depends on its molecular composition or on environmental interactions between SOM and mineral components, such could be the case with physical encapsulation of particulate SOM or organo-mineral derivatives, e.g., those formed with amorphous oxides [2]. A set of about 200 dependent variables from WS and isolated, ash free, humic acids (HA) [3] was obtained in 30 volcanic ash soils from Tenerife Island (Spain). Soil biogeochemical properties such as SOM, allophane (Alo + 1 /2 Feo), total mineralization coefficient (TMC) or aggregate stability were determined in WS. In addition, structural information on SOM was obtained from the isolated HA fractions by visible spectroscopy and analytical pyrolysis (Py-GC/MS). Aiming to explore the potential of partial least squares regression (PLS) in forecasting soil dependent variables, exclusively using the information extracted from WS and HA IR spectral profiles, data were processed by using ParLeS [4] and Unscrambler programs. Data pre-treatments should be carefully chosen: the most significant PLS models from IR spectra of HA were obtained after second derivative pre-treatment, which prevented effects of intrinsically broadband spectral profiles typical in macromolecular heterogeneous material such as HA. Conversely, when using IR spectra of WS, the best forecasting models were obtained using linear baseline correction and maximum normalization pre-treatment. 
With WS spectra, the most successful prediction models were obtained for SOM, magnetite, allophane, aggregate stability, clay and total aromatic compounds, whereas the PLS

  11. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. 
The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  12. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed by using an electromagnet and an audio loud speaker; where both are controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 KG at room temperature. The apparatus has been calibrated and tested by using magnetic hysteresis data of some ferrite samples measured by two scientifically calibrated magnetometers; model (Lake Shore 7410) and model (LDJ Electronics Inc. Troy, MI). Our VSM lab-built new design proved success and reliability. - Highlights: • A low cost automated vibrating sample magnetometer VSM has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 KG at room temperature. • The VSM has been calibrated and tested by using some measured ferrite samples. • Our VSM lab-built new design proved success and reliability.

  13. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    International Nuclear Information System (INIS)

    A low-cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured by two calibrated commercial magnetometers (Lake Shore model 7410 and LDJ Electronics Inc., Troy, MI). Our lab-built VSM design proved successful and reliable. - Highlights: • A low-cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our lab-built VSM design proved successful and reliable.

  14. Rapid and Automated Determination of Plutonium and Neptunium in Environmental Samples

    DEFF Research Database (Denmark)

    Qiao, Jixin

    This thesis presents improved analytical methods for rapid and automated determination of plutonium and neptunium in environmental samples using sequential injection (SI) based chromatography and inductively coupled plasma mass spectrometry (ICP-MS). The progress of methodology development...... in this work consists of 5 subjects stated as follows: (1) Development and optimization of an SI-anion exchange chromatographic method for rapid determination of plutonium in environmental samples in combination with inductively coupled plasma mass spectrometry detection (Paper II); (2) Methodology development...... and optimization for rapid determination of plutonium in environmental samples using SI-extraction chromatography prior to inductively coupled plasma mass spectrometry (Paper III); (3) Development of an SI-chromatographic method for simultaneous determination of plutonium and neptunium in environmental samples...

  15. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33

    OpenAIRE

    Round, A R; D. Franke; S. Moritz; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D I; Roessle, M.

    2008-01-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein so...

  16. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    Science.gov (United States)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages, and the resulting quantitative index achieved an area under the ROC curve above 0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  17. Deuterium and oxygen-18 determination of microliter quantities of a water sample using an automated equilibrator.

    Science.gov (United States)

    Uemura, Ryu; Matsui, Yohei; Motoyama, Hideaki; Yoshida, Naohiro

    2007-01-01

    We describe a modified version of the equilibration method and a correction algorithm for isotope ratio measurements of small quantities of water samples. The deltaD and the delta(18)O of the same water sample can both be analyzed using an automated equilibrator with sample sizes as small as 50 microL, whereas conventional equilibration techniques generally require water samples of several milliliters. That limitation is attributable mainly to changes in the isotope ratio ((18)O/(16)O or D/H) of water samples during isotopic exchange between the equilibration gas (CO(2) or H(2)) and water; the technique for microliter quantities of water therefore requires a mass-balance correction, using the water/gas (CO(2) or H(2)) mole ratio, to correct this isotopic effect. We quantitatively evaluate the factors controlling the variability of the isotopic effect with sample size. Theoretical consideration shows that a simple linear equation corrects for the effects without requiring parameters such as isotope fractionation factors and water/gas mole ratios. For 50-microL meteoric water samples with isotopic compositions of -1.4 to -396.2 per thousand (deltaD) and -0.37 to -51.37 per thousand (delta(18)O), precisions (1-sigma) are +/-0.5 to +/-0.6 per thousand and +/-0.01 to +/-0.11 per thousand, respectively. PMID:17487828
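
    The central point of the abstract is that the sample-size isotopic effect can be removed with a simple linear equation rather than with explicit fractionation factors and mole ratios. One common form of such a correction is a two-point linear normalization against two reference waters measured at the same sample size; the sketch below illustrates that idea with entirely hypothetical measured/true pairs, and is not the paper's specific equation:

```python
def linear_correction(delta_measured, std1, std2):
    """Two-point linear correction: std1 and std2 are (measured, true) pairs
    (per mil) for two reference waters analysed at the same sample size."""
    (m1, t1), (m2, t2) = std1, std2
    slope = (t2 - t1) / (m2 - m1)
    intercept = t1 - slope * m1
    return slope * delta_measured + intercept

# Hypothetical reference waters spanning the deltaD range of the samples.
corrected = linear_correction(-120.0,
                              std1=(-2.0, -1.4),
                              std2=(-390.0, -396.2))
```

    Because the correction is linear, the two reference waters are mapped exactly onto their accepted values, and every sample in between is interpolated on the same line.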

  18. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which validated the process and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
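
    The purge logic behind a process like Robowell can be summarized as: keep pumping until the monitored field parameters stabilize across consecutive readings, then collect the sample. A sketch of that stabilization criterion, in which the parameter names, tolerances, and readings are illustrative assumptions rather than the USGS protocol values:

```python
def purge_until_stable(read_params, tolerances, n_stable=3, max_readings=60):
    """Purge until each monitored parameter varies by less than its tolerance
    across n_stable consecutive readings; give up after max_readings."""
    history = []
    for _ in range(max_readings):
        history.append(read_params())
        if len(history) >= n_stable:
            window = history[-n_stable:]
            stable = all(
                max(r[k] for r in window) - min(r[k] for r in window) <= tol
                for k, tol in tolerances.items()
            )
            if stable:
                return True, history
    return False, history

# Simulated sensor readings converging as the well is purged.
readings = iter([
    {"pH": 7.30, "sc": 510.0},
    {"pH": 7.10, "sc": 502.0},
    {"pH": 7.05, "sc": 500.0},
    {"pH": 7.04, "sc": 500.0},
])
ok, log = purge_until_stable(lambda: next(readings), {"pH": 0.1, "sc": 5.0})
```

    Recording the full purge history, as the prototype did, lets the quality-assurance step verify after the fact that the stabilization criterion was genuinely met before each sample.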

  19. Automated high-throughput in vitro screening of the acetylcholine esterase inhibiting potential of environmental samples, mixtures and single compounds.

    Science.gov (United States)

    Froment, Jean; Thomas, Kevin V; Tollefsen, Knut Erik

    2016-08-01

    A high-throughput and automated assay for testing the presence of acetylcholine esterase (AChE) inhibiting compounds was developed, validated and applied to screen different types of environmental samples. Automation involved using the assay in 96-well plates and adapting it for use with an automated workstation. Validation was performed by comparing the results of the automated assay with those of a previously validated and standardised assay for two known AChE inhibitors (paraoxon and dichlorvos). The results show that the assay provides similar concentration-response curves (CRCs) when run according to the manual and automated protocols. Automation of the assay resulted in a reduction in assay run time as well as in intra- and inter-assay variations. High-quality CRCs were obtained for both of the model AChE inhibitors (dichlorvos IC50 = 120 µM and paraoxon IC50 = 0.56 µM) when tested alone. The effects of co-exposure to an equipotent binary mixture of the two chemicals were consistent with predictions of additivity and best described by the concentration addition model for combined toxicity. Extracts of different environmental samples (landfill leachate, wastewater treatment plant effluent, and road tunnel construction run-off) were then screened for AChE inhibiting activity using the automated bioassay, with only landfill leachate shown to contain potential AChE inhibitors. Potential uses and limitations of the assay were discussed based on the present results. PMID:27085000
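
    The concentration addition model used to judge the binary mixture predicts the mixture IC50 from the components' IC50s and their fixed concentration fractions. A sketch using the IC50s reported above; the equipotent mixture design is assumed here to fix the fractions proportional to the individual IC50s:

```python
def ca_mixture_ec50(fractions, ec50s):
    """Concentration-addition prediction of a mixture EC50 from the
    components' EC50s and their fixed concentration fractions:
    EC50_mix = 1 / sum(p_i / EC50_i)."""
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

# IC50s from the study: dichlorvos 120 uM, paraoxon 0.56 uM.
ic50 = [120.0, 0.56]
# Equipotent design: concentration fractions proportional to the IC50s.
fracs = [v / sum(ic50) for v in ic50]
mix_ic50 = ca_mixture_ec50(fracs, ic50)  # total mixture concentration, uM
```

    For an equipotent binary mixture this reduces to half the sum of the two IC50s, which is the additivity benchmark the observed mixture response is compared against.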

  20. Automated Generation and Assessment of Autonomous Systems Test Cases

    Science.gov (United States)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

    This slide presentation reviews issues in the verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the testing performed for the Dawn mission as an example. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies before, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable coverage properties, for example, generating cases for all possible fault monitors and across all state-change boundaries. The scope of coverage is, of course, determined by the capabilities of the test environment: a faster-than-real-time, high-fidelity, software-only simulation allows the broadest coverage, while real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, also provide an excellent resource for automated testing. Making detailed predictions for the outcomes of such tests can be difficult: when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, and generating them with tools requires executable models of the design and environment that themselves require a complete test program. Evaluating the results of a large number of mission-scenario tests therefore poses special challenges. A good approach to address this problem is to automatically score the results.
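
    The coverage idea described, generating cases for all fault monitors across all state-change boundaries and injection timings, is essentially a Cartesian product over the test dimensions. A minimal sketch with hypothetical monitors and transitions (the names are illustrative, not Dawn's):

```python
from itertools import product

def generate_test_cases(monitors, state_changes,
                        timings=("before", "during", "after")):
    """Enumerate fault-injection cases across every monitor, every
    state-change boundary, and every injection timing, so that coverage
    gaps cannot silently be overlooked."""
    return [
        {"monitor": m, "transition": s, "inject": t}
        for m, s, t in product(monitors, state_changes, timings)
    ]

cases = generate_test_cases(
    ["thruster_fault", "sensor_dropout"],            # hypothetical monitors
    [("cruise", "approach"), ("approach", "orbit")]  # hypothetical transitions
)
```

    The combinatorial growth of such an enumeration is exactly why automated scoring of the results, rather than hand-written per-case predictions, becomes necessary.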

  1. Automated negotiation in environmental resource management: Review and assessment.

    Science.gov (United States)

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

    Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management along with the need for further studies to consolidate the potential of this modeling approach. PMID:26241930
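
    Automated negotiation frameworks of the kind reviewed build on offer-generation tactics; a classic example (offered here as background, not necessarily one of the machine learning techniques compared in the paper) is the time-dependent concession tactic, sketched for a hypothetical resource-allocation negotiation:

```python
def concession_offer(t, deadline, best, worst, beta=1.0):
    """Time-dependent tactic: the offer moves from the agent's best value
    toward its reservation value as the deadline approaches; beta controls
    how early the agent concedes (beta > 1 concedes sooner)."""
    frac = min(t / deadline, 1.0) ** (1.0 / beta)
    return best + frac * (worst - best)

# Toy example: an agent opens at 100 allocation units and will accept 60.
offers = [round(concession_offer(t, 10, 100.0, 60.0), 1)
          for t in range(0, 11, 2)]
print(offers)  # [100.0, 92.0, 84.0, 76.0, 68.0, 60.0]
```

    Learning-based approaches typically replace the fixed beta schedule with a model of the opponent's preferences estimated from the history of exchanged offers.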

  3. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry, as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection-geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions, about 10% of laser-ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent-resistant surfaces.

  4. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    Science.gov (United States)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  5. ASSESSING SMALL SAMPLE WAR-GAMING DATASETS

    Directory of Open Access Journals (Sweden)

    W. J. HURLEY

    2013-10-01

    One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However, this approach does not allow an assessment of the force multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments, it is expensive, both in terms of time and dollars, to generate a large number of sample observations. This puts a premium on the statistical methodology used to examine these small datasets. In this paper we compare the power of three tests to assess population differences: the Wald-Wolfowitz test, the Mann-Whitney U test, and re-sampling. We employ a series of Monte Carlo simulation experiments. Not unexpectedly, we find that the Mann-Whitney test performs better than the Wald-Wolfowitz test. Re-sampling is judged to perform slightly better than the Mann-Whitney test.
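
    The re-sampling test compared in the paper can be implemented as a permutation test on the difference in group means: shuffle the pooled observations many times and count how often the shuffled difference is at least as extreme as the observed one. A sketch with hypothetical war-game scores (the data and the choice of the mean difference as the test statistic are illustrative assumptions):

```python
import random

def permutation_test(a, b, n_perm=10000, seed=1):
    """Two-sample permutation (re-sampling) test on the absolute
    difference in means; returns an approximate two-sided p-value."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical exchange-ratio scores from two small war-game configurations.
p = permutation_test([3.1, 2.8, 3.4, 3.0], [2.1, 2.4, 2.0, 2.3])
```

    With only four observations per group there are just 70 distinct splits, which is precisely the small-sample regime where such exact or re-sampled tests retain power that parametric assumptions cannot justify.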

  6. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of the daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and confirmed that samples remain suitable for analysis at least 24 hours after collection. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  7. Image cytometer method for automated assessment of human spermatozoa concentration

    DEFF Research Database (Denmark)

    Egeberg, D L; Kjaerulff, S; Hansen, C;

    2013-01-01

    In the basic clinical work-up of infertile couples, a semen analysis is mandatory and the sperm concentration is one of the most essential variables to be determined. Sperm concentration is usually assessed by manual counting using a haemocytometer and is hence labour intensive and may be subjected...... to investigator bias. Here we show that image cytometry can be used to accurately measure the sperm concentration of human semen samples with great ease and reproducibility. The impact of several factors (pipetting, mixing, round cell content, sperm concentration), which can influence the read-out as well....... Moreover, by evaluation of repeated measurements it appeared that image cytometry produced more consistent and accurate measurements than manual counting of human spermatozoa concentration. In conclusion, image cytometry provides an appealing substitute of manual counting by providing reliable, robust...

  8. Automation of Workplace Lifting Hazard Assessment for Musculoskeletal Injury Prevention

    OpenAIRE

    Spector, June T.; Lieblich, Max; Bao, Stephen; McQuade, Kevin; Hughes, Margaret

    2014-01-01

    Objectives Existing methods for practically evaluating musculoskeletal exposures such as posture and repetition in workplace settings have limitations. We aimed to automate the estimation of parameters in the revised United States National Institute for Occupational Safety and Health (NIOSH) lifting equation, a standard manual observational tool used to evaluate back injury risk related to lifting in workplace settings, using depth camera (Microsoft Kinect) and skeleton algorithm technology. ...

  9. Automated Three-Dimensional Microbial Sensing and Recognition Using Digital Holography and Statistical Sampling

    Directory of Open Access Journals (Sweden)

    Inkyu Moon

    2010-09-01

    We overview an approach to providing automated three-dimensional (3D) sensing and recognition of biological micro/nanoorganisms, integrating Gabor digital holographic microscopy and statistical sampling methods. For 3D data acquisition of biological specimens, a coherent beam propagates through the specimen, and its transversely and longitudinally magnified diffraction pattern observed by the microscope objective is optically recorded with an image sensor array interfaced with a computer. 3D visualization of the biological specimen from the magnified diffraction pattern is accomplished by using the computational Fresnel propagation algorithm. For 3D recognition of the biological specimen, a watershed image segmentation algorithm is applied to automatically remove the unnecessary background parts in the reconstructed holographic image. Statistical estimation and inference algorithms are then applied to the automatically segmented holographic image. Preliminary experimental results illustrate how the holographic image reconstructed from the Gabor digital hologram of a biological specimen contains important information for microbial recognition.
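
    The computational Fresnel propagation step can be implemented with two FFTs and a quadratic-phase transfer function. A minimal sketch, in which the grid size, pixel pitch, wavelength, and propagation distance are illustrative values rather than those of the experiments:

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Propagate a complex field (e.g. a recorded hologram) a distance z
    using the transfer-function form of the Fresnel (paraxial) integral."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial frequencies, cycles/m
    FX, FY = np.meshgrid(fx, fx)
    # Fresnel transfer function: unit modulus, quadratic phase.
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy hologram: a point-like aperture propagated 1 mm at 633 nm.
field = np.zeros((256, 256), dtype=complex)
field[128, 128] = 1.0
out = fresnel_propagate(field, wavelength=633e-9, dx=2e-6, z=1e-3)
```

    Because the transfer function has unit modulus, the propagation is energy-conserving, and sweeping z numerically refocuses different depth planes of the specimen from a single recorded hologram.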

  10. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  11. Quantitative reliability assessment in the safety case of computer-based automation systems

    International Nuclear Information System (INIS)

    An essential issue in the construction of new, or the replacement of old analogue, automation applications in nuclear power plants is the reliability of computer-based systems, and especially the question of how to assess their reliability. The reliability issue is particularly important when the system under assessment is considered a safety-critical system, such as the reactor protection system. To build sufficient confidence in the reliability of computer-based systems, appropriate reliability assessment methods should be developed and applied. The assessment methods should provide useful and plausible reliability estimates, while taking the special characteristics of the reliability assessment of computer-based systems into consideration. Bayesian inference has proved to be an efficient methodology in the reliability assessment of computer-based automation applications. Bayesian networks, a practical implementation of Bayesian inference, allow the combination of the different safety arguments concerning the system and its development process into a unified reliability estimate. Bayesian networks are also a convenient way to communicate on the safety argumentation between the various participants in systems design and implementation, as well as between the participants in the licensing processes of computer-based automation systems. This study is part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). The project aimed to provide support for the authorities and utilities in the licensing problems of computer-based automation systems. A particular objective of the project was to acquire, develop and test new and more cost-effective methods and tools for safety and reliability assessment, and to gather practical experience on their use in order to achieve a more streamlined licensing process for computer-based automation systems.
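
    A simple instance of the Bayesian inference advocated here is the conjugate Beta-binomial update for a system's probability of failure on demand: the prior encodes the process-based safety arguments, and the data are (ideally failure-free) test demands. The sketch below uses hypothetical prior parameters and demand counts, and is a single-node simplification of what a full Bayesian network would combine:

```python
def posterior_pfd(alpha0, beta0, demands, failures):
    """Conjugate Bayesian update for probability of failure on demand:
    a Beta(alpha0, beta0) prior updated with observed test demands.
    Returns the posterior mean failure probability."""
    a = alpha0 + failures
    b = beta0 + demands - failures
    return a / (a + b)

# Hypothetical: a weakly informative prior and 5000 failure-free demands.
pfd = posterior_pfd(alpha0=1.0, beta0=100.0, demands=5000, failures=0)
```

    The appeal for licensing discussions is transparency: the prior, the evidence, and the resulting estimate are all explicit, so each participant can see which safety argument contributes what to the final number.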

  12. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  13. Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing

    Science.gov (United States)

    Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt

    2012-01-01

    The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…

  14. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry

    OpenAIRE

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Background: Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Methods: Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspens...

  15. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single-capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at approximately 60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis are simultaneously accomplished at a cross assembly set at approximately 70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  16. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Science.gov (United States)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  17. Fully automated algorithm for wound surface area assessment.

    Science.gov (United States)

    Deana, Alessandro Melo; de Jesus, Sérgio Henrique Costa; Sampaio, Brunna Pileggi Azevedo; Oliveira, Marcelo Tavares; Silva, Daniela Fátima Teixeira; França, Cristiane Miranda

    2013-01-01

    Worldwide, clinicians, dentists, nurses, researchers, and other health professionals need to monitor the wound healing progress and to quantify the rate of wound closure. The aim of this study is to demonstrate, step by step, a fully automated numerical method to estimate the size of the wound and the percentage damaged relative to the body surface area (BSA) in images, without the requirement for human intervention. We included the formula for BSA in rats in the algorithm. The methodology was validated in experimental wounds and human ulcers and was compared with the analysis of an experienced pathologist, with good agreement. Therefore, this algorithm is suitable for experimental wounds and burns and human ulcers, as they have a high contrast with adjacent normal skin.
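    The abstract states that a BSA formula for rats was built into the algorithm but does not give it; the sketch below assumes Meeh's formula (BSA = k * W^(2/3), with k = 9.83 an assumed constant) and a known pixel size, so it is illustrative rather than the paper's method:

```python
import numpy as np

def rat_bsa_cm2(weight_g, k=9.83):
    """Meeh's formula for rat body surface area, BSA = k * W^(2/3).
    The constant k = 9.83 is an assumption; the paper's exact formula
    is not given in the abstract."""
    return k * weight_g ** (2.0 / 3.0)

def wound_percent_of_bsa(mask, pixel_area_cm2, weight_g):
    """Percentage of BSA covered by wound pixels in a binary mask."""
    wound_area_cm2 = float(np.count_nonzero(mask)) * pixel_area_cm2
    return 100.0 * wound_area_cm2 / rat_bsa_cm2(weight_g)
```

    Given a segmented wound mask and the camera's pixel-to-cm calibration, this returns the percentage of body surface affected, which is the quantity the algorithm reports.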

  18. An automated maze task for assessing hippocampus-sensitive memory in mice.

    Science.gov (United States)

    Pioli, Elsa Y; Gaskill, Brianna N; Gilmour, Gary; Tricklebank, Mark D; Dix, Sophie L; Bannerman, David; Garner, Joseph P

    2014-03-15

    Memory deficits associated with hippocampal dysfunction are a key feature of a number of neurodegenerative and psychiatric disorders. The discrete-trial rewarded alternation T-maze task is highly sensitive to hippocampal dysfunction. Normal mice have spontaneously high levels of alternation, whereas hippocampal-lesioned mice are dramatically impaired. However, this is a hand-run task: handling has been shown to have a crucial impact on behavioural responses, and hand-running is labour-intensive and therefore unsuitable for high-throughput studies. To overcome this, a fully automated maze was designed. The maze was attached to the mouse's home cage and the subject earned all of its food by running through the maze. In this study the hippocampal dependence of rewarded alternation in the automated maze was assessed. Bilateral hippocampal-lesioned mice were assessed in the standard, hand-run, discrete-trial rewarded alternation paradigm and in the automated paradigm, according to a cross-over design. A similarly robust lesion effect on alternation performance was found in both mazes, confirming the sensitivity of the automated maze to hippocampal lesions. Moreover, the performance of the animals in the automated maze was not affected by their handling history, whereas performance in the hand-run maze was affected by prior testing history. By having more stable performance and by decreasing human contact, the automated maze may offer opportunities to reduce extraneous experimental variation and therefore increase the reproducibility within and/or between laboratories. Furthermore, automation potentially allows for greater experimental throughput and hence suitability for use in assessment of cognitive function in drug discovery.

  19. Development and evaluation of a partially-automated approach to the assessment of undergraduate mathematics

    OpenAIRE

    Rowlett, Peter

    2014-01-01

    This research explored assessment and e-assessment in undergraduate mathematics and proposed a novel, partially-automated approach, in which assessment is set via computer but completed and marked offline. This potentially offers: reduced efficiency of marking but increased validity compared with examination, via deeper and more open-ended questions; increased reliability compared with coursework, by reduction of plagiarism through individualised questions; increased efficiency for setting qu...

  20. IHC Profiler: An Open Source Plugin for the Quantitative Evaluation and Automated Scoring of Immunohistochemistry Images of Human Tissue Samples

    Science.gov (United States)

    Malhotra, Renu; De, Abhijit

    2014-01-01

    In anatomic pathology, immunohistochemistry (IHC) serves as a diagnostic and prognostic method for identification of disease markers in tissue samples, which directly influences classification and grading of the disease and, in turn, patient management. However, in most of the world pathological analysis of tissue samples remains a time-consuming and subjective procedure, wherein the intensity of antibody staining is manually judged and the scoring decision is thus directly influenced by visual bias. This motivated us to design a simple automated digital IHC image analysis algorithm for an unbiased, quantitative assessment of antibody staining intensity in tissue sections. As a first step, we adopted the spectral deconvolution method of DAB/hematoxylin color spectra by using optimized optical density vectors of the color deconvolution plugin for proper separation of the DAB color spectra. The DAB-stained image is then displayed in a new window wherein it undergoes pixel-by-pixel analysis, and the full profile is displayed along with its scoring decision. Based on the mathematical formula conceptualized, the algorithm was thoroughly tested by analyzing scores assigned to thousands (n = 1703) of DAB-stained IHC images, including sample images taken from the Human Protein Atlas web resource. The IHC Profiler plugin developed is compatible with the open-source digital image analysis software ImageJ; it creates a pixel-by-pixel analysis profile of a digital IHC image and assigns a score in a four-tier system. A comparison study between manual pathological analysis and IHC Profiler showed a match of 88.6% (P<0.0001, CI = 95%). This new tool for clinical histopathological sample analysis can be adopted globally for scoring most protein targets where the marker protein expression is of cytoplasmic and/or nuclear type. We foresee that this method will minimize the problem of inter-observer variations across labs and further help in
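    The pixel-by-pixel, four-tier scoring idea can be sketched as follows; the intensity thresholds and tier weights below are illustrative assumptions, not the plugin's published values:

```python
import numpy as np

def ihc_four_tier_score(dab_intensity):
    """Four-tier scoring of a DAB-channel grayscale image (0 = darkest,
    i.e. strongest staining). Thresholds and tier weights here are
    illustrative assumptions, not the plugin's published values."""
    px = np.asarray(dab_intensity).ravel()
    tiers = {
        "high positive": float((px < 60).mean()),
        "positive":      float(((px >= 60) & (px < 120)).mean()),
        "low positive":  float(((px >= 120) & (px < 180)).mean()),
        "negative":      float((px >= 180).mean()),
    }
    weights = {"high positive": 4, "positive": 3, "low positive": 2, "negative": 1}
    # Weighted average over tier fractions collapses the profile to one score in [1, 4]
    score = sum(weights[t] * frac for t, frac in tiers.items())
    return tiers, score
```

    The returned tier fractions correspond to the "full profile" the plugin displays, and the weighted score stands in for its four-tier scoring decision.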

  1. Performance of Three Mode-Meter Block-Processing Algorithms for Automated Dynamic Stability Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel J.; Pierre, John W.; Zhou, Ning; Hauer, John F.; Parashar, Manu

    2008-05-31

    The frequency and damping of electromechanical modes offer considerable insight into the dynamic stability properties of a power system. The performance properties of three block-processing algorithms are demonstrated and examined from the perspective of near-real-time automated stability assessment. The algorithms are: the extended modified Yule Walker (YW); the extended modified Yule Walker with Spectral analysis (YWS); and the numerical state-space subspace system identification (N4SID) algorithm. The YW and N4SID have been introduced in previous publications, while the YWS is introduced here. Issues addressed include: stability assessment requirements; automated selection of subsets of identified modes; using the algorithms in an automated format; data assumptions and quality; and expected algorithm estimation performance.
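    As a simplified stand-in for the Yule-Walker estimators (a plain least-squares AR fit rather than the YW equations themselves), mode frequency and damping can be recovered from the discrete-time poles of a fitted autoregressive model:

```python
import numpy as np

def ar_modes(y, order, dt):
    """Estimate modal frequencies (Hz) and damping ratios from a ringdown
    signal via a plain least-squares AR fit, a simplified stand-in for
    the block-processing estimators discussed above."""
    y = np.asarray(y, dtype=float)
    # Linear prediction problem: y[n] = a1*y[n-1] + ... + ap*y[n-p]
    A = np.array([y[i - order:i][::-1] for i in range(order, len(y))])
    a, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    # Discrete-time poles: roots of z^p - a1*z^(p-1) - ... - ap
    poles = np.roots(np.concatenate(([1.0], -a)))
    modes = []
    for z in poles:
        if z.imag <= 0:                 # keep one pole per conjugate pair
            continue
        sigma = np.log(abs(z)) / dt     # continuous-time decay rate
        omega = np.angle(z) / dt        # continuous-time frequency (rad/s)
        modes.append((omega / (2 * np.pi),               # frequency in Hz
                      -sigma / np.hypot(sigma, omega)))  # damping ratio
    return modes
```

    On a noiseless damped cosine exp(-0.1t)·cos(2π·0.5t) sampled at 10 Hz, an order-2 fit recovers the 0.5 Hz mode and a damping ratio near 0.032.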

  2. Assessing office automation effect on Innovation Case study: Education Organizations and Schools in Esfahan Province, Iran

    Directory of Open Access Journals (Sweden)

    Hajar Safari

    2013-09-01

    Full Text Available Today, organizations act in dynamic, highly ambiguous and changing environments, so each organization has to deliver high-quality services and benefit from innovative systems to succeed in such an environment. This research explores the relationship between the implementation of office automation and innovation using the structural equation modeling (SEM) method. The research is applied in purpose and survey-descriptive in method. The statistical population comprises managers of education organizations and schools in the Esfahan and Lenjan cities; 130 individuals were selected as the sample by random sampling. Content and construct validity were used to evaluate the validity of the questionnaire, and the relations between the variables of this research were confirmed by the results of the SEM method. Based on the obtained results, the standardized effect of office automation on innovation is estimated at 0.24, which supports the main hypothesis of the research for the studied organizations.

  3. High-Throughput Serum 25-Hydroxy Vitamin D Testing with Automated Sample Preparation.

    Science.gov (United States)

    Stone, Judy

    2016-01-01

    Serum from bar-coded tubes, and then internal standard, are pipetted to 96-well plates with an 8-channel automated liquid handler (ALH). The first precipitation reagent (methanol:ZnSO4) is added and mixed with the 8-channel ALH. A second protein precipitating agent, 1 % formic acid in acetonitrile, is added and mixed with a 96-channel ALH. After a 4-min delay for larger precipitates to settle to the bottom of the plate, the upper 36 % of the precipitate/supernatant mix is transferred with the 96-channel ALH to a Sigma Hybrid SPE(®) plate and vacuumed through for removal of phospholipids and precipitated proteins. The filtrate is collected in a second 96-well plate (collection plate) which is foil-sealed, placed in the autosampler (ALS), and injected into a multiplexed LC-MS/MS system running AB Sciex Cliquid(®) and MPX(®) software. Two Shimadzu LC stacks, with multiplex timing controlled by MPX(®) software, inject alternately to one AB Sciex API-5000 MS/MS using positive atmospheric pressure chemical ionization (APCI) and a 1.87 min water/acetonitrile LC gradient with a 2.1 × 20 mm, 2.7 μm, C18 fused core particle column (Sigma Ascentis Express). LC-MS/MS throughput is ~44 samples/h per LC-MS/MS system with dual-LC channel multiplexing. Plate maps are transferred electronically from the ALH and reformatted into LC-MS/MS sample table format using the Data Innovations LLC (DI) Instrument Manager middleware application. Before collection plates are loaded into the ALS, the plate bar code is manually scanned to download the sample table from the DI middleware to the LC-MS/MS. After acquisition, LC-MS/MS data are analyzed with AB Sciex Multiquant(®) software using customized queries, and then results are transferred electronically via a DI interface to the LIS. 2500 samples/day can be extracted by two analysts using four ALHs in 4-6 h. LC-MS/MS analysis of those samples on three dual-channel LC multiplexed LC-MS/MS systems requires 19-21 h and data analysis can be
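    The stated instrument time is consistent with back-of-the-envelope throughput arithmetic (a sanity check, not part of the protocol):

```python
# Back-of-the-envelope check of the stated run time: 2500 samples/day on
# three dual-channel multiplexed LC-MS/MS systems at ~44 samples/h each.
samples_per_day = 2500
systems = 3
samples_per_hour_per_system = 44

hours_needed = samples_per_day / (systems * samples_per_hour_per_system)
print(round(hours_needed, 1))  # ~18.9 h, consistent with the reported 19-21 h
```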

  4. Two Methods for High-Throughput NGS Template Preparation for Small and Degraded Clinical Samples Without Automation

    OpenAIRE

    Kamberov, E.; Tesmer, T.; Mastronardi, M.; Langmore, John

    2012-01-01

    Clinical samples are difficult to prepare for NGS, because of the small amounts or degraded states of formalin-fixed tissue, plasma, urine, and single-cell DNA. Conventional whole genome amplification methods are too biased for NGS applications, and the existing NGS preparation kits require intermediate purifications and excessive time to prepare hundreds of samples in a day without expensive automation. We have tested two 96-well manual methods to make NGS templates from FFPE tissue, plasma,...

  5. Automated peroperative assessment of stents apposition from OCT pullbacks.

    Science.gov (United States)

    Dubuisson, Florian; Péry, Emilie; Ouchchane, Lemlih; Combaret, Nicolas; Kauffmann, Claude; Souteyrand, Géraud; Motreff, Pascal; Sarry, Laurent

    2015-04-01

    This study's aim was to assess stent apposition by automatically analyzing endovascular optical coherence tomography (OCT) sequences. The lumen is detected using threshold, morphological and gradient operators to run a Dijkstra algorithm. Wrong detections tagged by the user and caused by bifurcations, struts' presence, thrombotic lesions or dissections can be corrected using a morphing algorithm. Struts are also segmented by computing symmetrical and morphological operators. The Euclidean distance between detected struts and the artery wall initializes a complete distance map of the stent, and missing data are interpolated with thin-plate spline functions. Rejection of detected outliers, regularization of parameters by generalized cross-validation and use of the one-sided cyclic property of the map further optimize accuracy. Several indices computed from the map provide quantitative values of malapposition. The algorithm was run on four in-vivo OCT sequences including different cases of incomplete stent apposition. Comparison with manual expert measurements validates the segmentation's accuracy and shows an almost perfect concordance of automated results. PMID:25700272

  6. Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment

    Directory of Open Access Journals (Sweden)

    Rashmi Mukherjee

    2014-01-01

    Full Text Available The aim of this paper was to develop a computer assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images grabbed by normal digital camera were first transformed into HSI (hue, saturation, and intensity) color space and subsequently the "S" component of HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely, Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that SVM with 3rd order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with highest kappa statistic value (0.793).

  7. Automated tissue classification framework for reproducible chronic wound assessment.

    Science.gov (United States)

    Mukherjee, Rashmi; Manohar, Dhiraj Dhane; Das, Dev Kumar; Achar, Arun; Mitra, Analava; Chakraborty, Chandan

    2014-01-01

    The aim of this paper was to develop a computer assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images grabbed by normal digital camera were first transformed into HSI (hue, saturation, and intensity) color space and subsequently the "S" component of HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely, Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that SVM with 3rd order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with highest kappa statistic value (0.793).
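    The RGB-to-HSI conversion is a standard transform; a minimal sketch of extracting the 'S' (saturation) component, which the authors selected for its higher contrast:

```python
import numpy as np

def hsi_saturation(rgb):
    """Saturation ('S') channel of the standard RGB-to-HSI transform:
    S = 1 - 3*min(R,G,B)/(R+G+B). Input is an array of RGB triplets."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1)
    total = np.where(total == 0, 1e-12, total)  # guard against black pixels
    return 1.0 - 3.0 * rgb.min(axis=-1) / total
```

    A fully saturated pixel such as pure red maps to S = 1, while any gray pixel maps to S = 0, which is why this channel separates stained tissue from neutral background well.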

  8. Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™

    Science.gov (United States)

    Balfour, Stephen P.

    2013-01-01

    Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for the way they will score and provide feedback on essays students submit. EdX, MIT and Harvard's non-profit MOOC federation, recently announced that they will use a machine-based Automated Essay Scoring (AES) application to assess written work in…

  9. Automated Spacecraft Conjunction Assessment at Mars and the Moon

    Science.gov (United States)

    Berry, David; Guinn, Joseph; Tarzi, Zahi; Demcak, Stuart

    2012-01-01

    Conjunction assessment and collision avoidance are areas of current high interest in space operations. Most current conjunction assessment activity focuses on the Earth orbital environment. Several of the world's space agencies have satellites in orbit at Mars and the Moon, and avoiding collisions there is important too. There are fewer assets and fewer organizations involved than at Earth, but the consequences are similar to Earth scenarios. This presentation will examine conjunction assessment processes implemented at JPL for spacecraft in orbit at Mars and the Moon.

  10. Assessment of Automated Measurement and Verification (M&V) Methods

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Custodio, Claudine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jump, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  11. QUAliFiER: An automated pipeline for quality assessment of gated flow cytometry data

    Directory of Open Access Journals (Sweden)

    Finak Greg

    2012-09-01

    pipeline constructed from two new R packages for importing manually gated flow cytometry data and performing flexible and robust quality assessment checks. The pipeline addresses the increasing demand for tools capable of performing quality checks on large flow data sets generated in typical clinical trials. The QUAliFiER tool objectively, efficiently, and reproducibly identifies outlier samples in an automated manner by monitoring cell population statistics from gated or ungated flow data conditioned on experiment-level metadata.

  12. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    OpenAIRE

    Constantinos Georgiou; Georgakopoulos, Dimitrios G.; Gerasimos Kremmydas; Efstathios Vasiliou; Efstratios Komaitis

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. Response registered is % inhibition of biosensor biol...
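    The biosensor response reported is percent inhibition of bioluminescence; a minimal sketch of the standard formula (assumed here, as the analyzer's exact calculation is not given in the abstract):

```python
def percent_inhibition(sample_light, control_light):
    """Percent inhibition of Vibrio fischeri bioluminescence relative to a
    non-toxic control. This is the standard toxicity response metric and
    is assumed to match what the analyzer registers."""
    return 100.0 * (1.0 - sample_light / control_light)
```

    A sample whose luminescence reading is half that of the control therefore registers 50% inhibition.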

  13. Access to information: assessment of the use of automated interaction technologies in call centers

    Directory of Open Access Journals (Sweden)

    Fernando de Souza Meirelles

    2011-01-01

    Full Text Available With the purpose of lowering costs and rendering the demanded information available to users with no access to the internet, service companies have adopted automated interaction technologies in their call centers, which may or may not meet the expectations of users. Based on different areas of knowledge (man-machine interaction, consumer behavior and use of IT), 13 propositions are raised and research is carried out in three parts: a focus group, a field study with users, and interviews with experts. Eleven automated service characteristics which support the explanation for user satisfaction are listed, a preferences model is proposed, and evidence for or against each of the 13 propositions is brought in. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future works, the propositions may become verifiable hypotheses through conclusive empirical research.

  14. Evaluation of two automated enzyme-immunoassays for detection of thermophilic campylobacters in faecal samples from cattle and swine

    DEFF Research Database (Denmark)

    Hoorfar, Jeffrey; Nielsen, E.M.; Stryhn, H.;

    1999-01-01

    We evaluated the performance of two enzyme-immunoassays (EIA) for the detection of naturally occurring, thermophilic Campylobacter spp. found in faecal samples from cattle (n = 21 and n = 26) and swine (n = 43) relative to the standard culture method, and also assuming that none of the tests… The EIA-2 method resulted in a rather low specificity (32%). This seemed to be partially due to the isolation of nonthermophilic species. In conclusion, the EIA-1 method may provide a simple and fast tool with good accuracy in cattle and swine samples for automated screening of large numbers of samples....

  15. Bayesian Stratified Sampling to Assess Corpus Utility

    OpenAIRE

    Hochberg, Judith; Scovel, Clint; Thomas, Timothy; Hall, Sam

    1998-01-01

    This paper describes a method for asking statistical questions about a large text corpus. We exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" We estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Stratified sampling is used to reduce the sampling uncertainty of the estimate from over 3100 documents t...

  16. Feasibility studies of safety assessment methods for programmable automation systems. Final report of the AVV project

    International Nuclear Information System (INIS)

    Feasibility studies of two different groups of methodologies for safety assessment of programmable automation systems were executed at the Technical Research Centre of Finland (VTT). The studies concerned dynamic testing methods and the fault tree (FT) and failure mode and effects analysis (FMEA) methods. In order to get real experience in the application of these methods, experimental testing of two realistic pilot systems was executed and an FT/FMEA analysis of a programmable safety function was accomplished. The purpose of the studies was not to assess the object systems, but to get experience in the application of the methods and assess their potentials and development needs. (46 refs., 21 figs.)

  17. Donor disc attachment assessment with intraoperative spectral optical coherence tomography during descemet stripping automated endothelial keratoplasty

    Directory of Open Access Journals (Sweden)

    Edward Wylegala

    2013-01-01

    Full Text Available Optical coherence tomography has already been proven to be useful for pre- and post-surgical anterior eye segment assessment, especially in lamellar keratoplasty procedures. There is no evidence for the intraoperative usefulness of optical coherence tomography (OCT). We present a case report of intraoperative donor disc attachment assessment with spectral-domain optical coherence tomography during Descemet stripping automated endothelial keratoplasty (DSAEK) surgery combined with corneal incisions. The effectiveness of the performed corneal stab incisions was visualized directly by OCT scan analysis. OCT-assisted DSAEK allows assessment of the accuracy of the Descemet stripping and donor disc attachment.

  18. Automated Peripheral Neuropathy Assessment Using Optical Imaging and Foot Anthropometry.

    Science.gov (United States)

    Siddiqui, Hafeez-U R; Spruce, Michelle; Alty, Stephen R; Dudley, Sandra

    2015-08-01

    A large proportion of individuals who live with type-2 diabetes suffer from plantar sensory neuropathy. Regular testing and assessment for the condition is required to avoid ulceration or other damage to patients' feet. Currently accepted practice involves a trained clinician testing a patient's feet manually with a hand-held nylon monofilament probe. The procedure is time-consuming, labor-intensive, requires special training, is prone to error, and is difficult to repeat. With the vast increase in type-2 diabetes, the number of plantar sensory neuropathy sufferers has already grown to such an extent as to make a traditional manual test problematic. This paper presents the first investigation of a novel approach to automatically identify the pressure points on a given patient's foot for the examination of sensory neuropathy via optical image processing incorporating plantar anthropometry. The method automatically selects suitable test points on the plantar surface that correspond to those repeatedly chosen by a trained podiatrist. The proposed system automatically identifies the specific pressure points at different locations, namely the toe (hallux), metatarsal heads and heel (calcaneum) areas. The approach is generic and has shown 100% reliability on the database used. The database consists of Chinese, Asian, African, and Caucasian foot images. PMID:26186748

  19. Bayesian Stratified Sampling to Assess Corpus Utility

    CERN Document Server

    Hochberg, J; Thomas, T; Hall, S; Hochberg, Judith; Scovel, Clint; Thomas, Timothy; Hall, Sam

    1998-01-01

    This paper describes a method for asking statistical questions about a large text corpus. We exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" We estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Stratified sampling is used to reduce the sampling uncertainty of the estimate from over 3100 documents to fewer than 1000. The stratification is based on observed characteristics of real documents, while the sampling procedure incorporates a Bayesian version of Neyman allocation. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
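    The Bayesian variant of Neyman allocation used in the paper is not fully specified in the abstract; the classical allocation rule it builds on can be sketched as follows (the stratum sizes and within-stratum standard deviations below are illustrative, not the paper's values):

```python
import numpy as np

def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Classical Neyman allocation of a fixed sampling budget:
    n_h is proportional to N_h * sigma_h. The paper's Bayesian version
    places priors on the within-stratum variances; this sketch shows
    only the base rule."""
    w = np.asarray(stratum_sizes, float) * np.asarray(stratum_sds, float)
    return np.round(n_total * w / w.sum()).astype(int)
```

    With two illustrative strata of 30,000 and 15,820 documents and guessed within-stratum standard deviations of 0.5 and 0.1, a 200-document budget splits 181/19, concentrating effort where the outcome is most variable.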

  20. Automated Assessment of Right Ventricular Volumes and Function Using Three-Dimensional Transesophageal Echocardiography.

    Science.gov (United States)

    Nillesen, Maartje M; van Dijk, Arie P J; Duijnhouwer, Anthonie L; Thijssen, Johan M; de Korte, Chris L

    2016-02-01

    Assessment of right ventricular (RV) function is known to be of diagnostic value in patients with RV dysfunction. Because of its complex anatomic shape, automated determination of the RV volume is difficult, and strong reliance on geometric assumptions is not desired. A method for automated RV assessment was developed using three-dimensional (3-D) echocardiography without relying on a priori knowledge of the cardiac anatomy. A 3-D adaptive filtering technique that optimizes the discrimination between blood and myocardium was applied to facilitate endocardial border detection. Filtered image data were incorporated in a segmentation model to automatically detect the endocardial RV border. End-systolic and end-diastolic RV volumes, as well as ejection fraction, were computed from the automatically segmented endocardial surfaces and compared against reference volumes manually delineated by two expert cardiologists. The results showed good performance in terms of correlation and agreement with the reference volumes.
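    Ejection fraction follows directly from the two segmented volumes; a minimal sketch of the standard definition:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction (%) from end-diastolic (EDV) and end-systolic
    (ESV) volumes: EF = 100 * (EDV - ESV) / EDV, the standard definition."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml
```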

  1. Automated sample preparation for radiogenic and non-traditional metal isotope analysis by MC-ICP-MS

    Science.gov (United States)

    Field, M. P.; Romaniello, S. J.; Gordon, G. W.; Anbar, A. D.

    2012-12-01

    High-throughput analysis is becoming increasingly important for many applications of radiogenic and non-traditional metal isotopes. While MC-ICP-MS instruments offer the potential for very high sample throughput, the requirement for labor-intensive sample preparation and purification procedures remains a substantial bottleneck. Current purification protocols require manually feeding gravity-driven separation columns, a process that is both costly and time consuming. This bottleneck is eliminated with the prepFAST-MC™, an automated, low-pressure ion exchange chromatography system that can process from 1 to 60 samples in unattended operation. The syringe-driven system allows the sample loading, multiple acid washes, column conditioning and elution cycles necessary to isolate elements of interest, and automatically collects up to 3 discrete eluent fractions at user-defined intervals (time, volume and flow rate). Newly developed protocols for automated purification of uranium illustrate high throughput (>30 per run), multiple samples processed per column (>30), complete (>99%) matrix removal, high recovery (>98%, n = 25), and excellent precision (2σ = 0.03‰, n = 10). The prepFAST-MC™ maximizes sample throughput and minimizes costs associated with personnel and consumables, providing an opportunity to greatly expand research horizons in fields where large isotopic data sets are required, including archeology, geochemistry, and climate/environmental science.

  2. Assessing Pulmonary Perfusion in Emphysema Automated Quantification of Perfused Blood Volume in Dual-Energy CTPA

    OpenAIRE

    Meinel, Felix G.; Graef, Anita; Thieme, Sven F.; Bamberg, Fabian; Schwarz, Florian; Sommer, Wieland; Helck, Andreas D.; Neurohr, Claus; Reiser, Maximilian F.; Johnson, Thorsten R. C.

    2013-01-01

    Objectives: The objective of this study was to determine whether automated quantification of lung perfused blood volume (PBV) in dual-energy computed tomographic pulmonary angiography (DE-CTPA) can be used to assess the severity and regional distribution of pulmonary hypoperfusion in emphysema. Materials and Methods: We retrospectively analyzed 40 consecutive patients (mean age, 67 ± 13 years) with pulmonary emphysema and no cardiopulmonary comorbidities, and a DE-CTPA negative for pulmo...

  3. Automating the aviation command safety assessment survey as an Enterprise Information System (EIS)

    OpenAIRE

    Held, Jonathan S.; Mingo, Fred J.

    1999-01-01

    The Aviation Command Safety Assessment (ACSA) is a questionnaire survey methodology developed to evaluate a Naval Aviation Command's safety climate, culture, and safety program effectiveness. This survey was a manual process first administered in the fall of 1996. The primary goal of this thesis is to design, develop, and test an Internet-based, prototype model for administering this survey using new technologies that allow automated survey submission and analysis. The result of this thesis i...

  4. Using neural networks to assess flight deck human–automation interaction

    International Nuclear Information System (INIS)

    The increased complexity and interconnectivity of flight deck automation has made the prediction of human–automation interaction (HAI) difficult and has resulted in a number of accidents and incidents. There is a need to develop objective and robust methods by which the changes in HAI brought about by the introduction of new automation into the flight deck could be predicted and assessed prior to implementation and without the use of extensive simulation. This paper presents a method to model a parametrization of flight deck automation known as HART and link it to HAI consequences using a backpropagation neural network approach. The transformation of the HART into a computational model suitable for modeling as a neural network is described. To test and train the network, data were collected from 40 airline pilots for six HAI consequences based on one scenario family consisting of a baseline and four variants. For a binary classification of HAI consequences, the neural network successfully classified 62–78.5% of cases, depending on the consequence. The results were verified using a decision tree analysis.
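
    The abstract above describes a standard backpropagation classifier applied to pilot-survey features. As a minimal sketch of that technique, the following one-hidden-layer network is trained on synthetic stand-in data (the HART parametrization and the pilots' responses are not public, so the features and labels here are invented):

```python
import numpy as np

# Minimal one-hidden-layer backpropagation network for binary
# classification. Illustrative synthetic data stands in for the
# paper's HART pilot-survey features.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, hidden=8, lr=0.5, epochs=2000):
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2).ravel()
        # backward pass (gradient of mean cross-entropy loss)
        dp = (p - y)[:, None] / n
        dW2 = h.T @ dp; db2 = dp.sum(0)
        dh = dp @ W2.T * h * (1 - h)
        dW1 = X.T @ dh; db1 = dh.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return lambda X: sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()

# Toy stand-in data: two HART-like numeric features, one binary
# HAI-consequence label per respondent.
X = rng.normal(0, 1, (200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
predict = train(X, y)
accuracy = float(((predict(X) > 0.5) == y).mean())
```

    In the study, one such binary classifier would be trained per HAI consequence; the 62–78.5% figures quoted above are per-consequence classification rates.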

  5. Automation impact study of Army training management 2: Extension of sampling and collection of installation resource data

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, T.F.; McCallum, M.C.; Hunt, P.S.; Slavich, A.L.; Underwood, J.A.; Toquam, J.L.; Seaver, D.A.

    1989-05-01

    This automation impact study of Army training management (TM) was performed for the Army Development and Employment Agency (ADEA) and the Combined Arms Training Activity (CATA) by the Battelle Human Affairs Research Centers and the Pacific Northwest Laboratory. The primary objective of the study was to provide the Army with information concerning the potential costs and savings associated with automating the TM process. This study expands the sample of units surveyed in Phase I of the automation impact effort (Sanquist et al., 1988), and presents data concerning installation resource management in relation to TM. The structured interview employed in Phase I was adapted to a self-administered survey. The data collected were compatible with that of Phase I, and both were combined for analysis. Three US sites, one reserve division, one National Guard division, and one unit in the active component outside the continental US (OCONUS) (referred to in this report as forward deployed) were surveyed. The total sample size was 459, of which 337 respondents contributed the most detailed data. 20 figs., 62 tabs.

  6. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches for automated assessment of the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed for assessment in the conceptual model: metadata and data. The metadata component comprises indicators for the hosting websites and the sources of data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess these components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess the correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality of data contributed by web citizen providers.

  7. Defining food sampling strategy for chemical risk assessment

    OpenAIRE

    Wesolek, Nathalie; Roudot, Alain-Claude

    2012-01-01

    International audience; Collection of accurate and reliable data is a prerequisite for informed risk assessment and risk management. For chemical contaminants in food, contamination assessments enable consumer protection and exposure assessments. And yet, the accuracy of a contamination assessment depends on both chemical analysis and sampling plan performance. A sampling plan is always used when the contamination level of a food lot is evaluated, due to the fact that the whole lot can not be...

  8. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  9. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    International Nuclear Information System (INIS)

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (2/LiBr melts were used. The use of a M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg-1 for 5-300 mg of sample.

  10. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake. usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/ pqlx/).
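
    The station-baseline idea above (accumulate many PSD estimates per channel, then characterize their distribution) can be sketched as follows. This is a simplified stand-in for the PQLX probability-density-function approach, using synthetic white noise rather than real GSN/ANSS data; the windowing and normalization choices here are illustrative:

```python
import numpy as np

# Percentile-based station noise baseline: many overlapping power
# spectral density (PSD) estimates are computed from a continuous
# trace, and low/high percentiles per frequency form the station's
# nominal noise envelope against which new data can be screened.
def psd_segments(trace, fs, nperseg):
    """Hann-windowed one-sided periodograms of 50%-overlapping segments."""
    win = np.hanning(nperseg)
    norm = fs * (win ** 2).sum()
    psds = []
    for start in range(0, len(trace) - nperseg + 1, nperseg // 2):
        seg = trace[start:start + nperseg] * win
        spec = np.fft.rfft(seg)
        psds.append((np.abs(spec) ** 2) / norm)
    return np.fft.rfftfreq(nperseg, 1 / fs), np.array(psds)

def noise_baseline(psds, lo=10, hi=90):
    """Per-frequency percentile envelope across all PSD estimates."""
    return np.percentile(psds, lo, axis=0), np.percentile(psds, hi, axis=0)

fs = 40.0                               # Hz, a typical broadband rate
rng = np.random.default_rng(1)
trace = rng.normal(0, 1, 40 * 3600)     # one hour of synthetic noise
freqs, psds = psd_segments(trace, fs, nperseg=4096)
base_lo, base_hi = noise_baseline(psds)
```

    A new hourly PSD falling outside the [base_lo, base_hi] envelope over some frequency band would flag the channel as out-of-nominal for quality control.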

  11. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    Science.gov (United States)

    Williams, Alex C.; Hitt, Austin; Voisin, Sophie; Tourassi, Georgia

    2013-03-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.
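
    A common way to quantify the fluctuating asymmetry discussed above is the normalized left-right volume difference. The study's exact statistic is not given in the abstract, so the formula below is an illustrative convention from the asymmetry literature:

```python
# Fluctuating-asymmetry index for paired volume measurements:
# |L - R| normalized by the mean of the pair, so the index is
# dimensionless and comparable across breast sizes.
def fluctuating_asymmetry(left_vol, right_vol):
    mean = (left_vol + right_vol) / 2.0
    if mean <= 0:
        raise ValueError("volumes must be positive")
    return abs(left_vol - right_vol) / mean

# e.g. left 600 cm^3, right 500 cm^3 -> 100 / 550
fa = fluctuating_asymmetry(600.0, 500.0)
```

    The automated pipeline would compute this index from the segmented breast volumes in each view and compare its distribution between cancer and non-cancer groups.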

  12. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alex C [ORNL]; Hitt, Austin N [ORNL]; Voisin, Sophie [ORNL]; Tourassi, Georgia [ORNL]

    2013-01-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.

  13. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)

    2013-02-15

    To assess the feasibility of commercially available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes, by comparison with interactive manual volumetry and with measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated a right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale, with grading performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and with measured ex-vivo liver volume (converted from weight), using analysis of variance and Pearson's or Spearman correlation tests. Processing time for the automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for the total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107) of cases, respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes, except between the automated and manual volumes of the total liver (p = 0.011). There were good correlations between automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver); both correlation coefficients were higher than those obtained with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec; right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  14. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study

    Directory of Open Access Journals (Sweden)

    Paul Otten

    2015-08-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient’s upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods.

  15. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Directory of Open Access Journals (Sweden)

    Marjan Mansourvar

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural networks (ANNs) models. The experimental results signify improvement in assessment accuracy over GP and ANN, while generalization capability is possible with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the newly presented method has the capacity to learn many hundreds of times faster than traditional learning methods and has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age.

  16. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    Science.gov (United States)

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural networks (ANNs) models. The experimental results signify improvement in assessment accuracy over GP and ANN, while generalization capability is possible with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the newly presented method has the capacity to learn many hundreds of times faster than traditional learning methods and has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795
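
    The training-speed claim in the two records above follows from how an ELM is fitted: hidden-layer weights are drawn at random and never trained, so fitting reduces to a single linear least-squares solve for the output weights. A minimal numpy sketch, with synthetic data standing in for the hand-radiograph features:

```python
import numpy as np

# Minimal extreme learning machine (ELM): random, fixed hidden layer;
# output weights solved in closed form via the Moore-Penrose
# pseudoinverse. No iterative training loop is needed.
rng = np.random.default_rng(0)

class ELM:
    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        d = X.shape[1]
        self.W = rng.normal(0, 1, (d, self.n_hidden))   # random, never updated
        self.b = rng.normal(0, 1, self.n_hidden)
        self.beta = np.linalg.pinv(self._hidden(X)) @ y  # one linear solve
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Synthetic stand-in for image-derived features and a bone-age-like
# continuous target.
X = rng.uniform(-1, 1, (300, 3))
y = X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2]
model = ELM(n_hidden=50).fit(X, y)
rmse = float(np.sqrt(np.mean((model.predict(X) - y) ** 2)))
```

    Because `fit` is one matrix factorization rather than thousands of gradient steps, an ELM trains orders of magnitude faster than a comparably sized backpropagation network.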

  17. Interdisciplinary development of manual and automated product usability assessments for older adults with dementia: lessons learned.

    Science.gov (United States)

    Boger, Jennifer; Taati, Babak; Mihailidis, Alex

    2016-10-01

    The changes in cognitive abilities that accompany dementia can make it difficult to use everyday products that are required to complete activities of daily living. Products that are inherently more usable for people with dementia could facilitate independent activity completion, thus reducing the need for caregiver assistance. The objectives of this research were to: (1) gain an understanding of how water tap design impacted tap usability and (2) create an automated computerized tool that could assess tap usability. 27 older adults, who ranged from cognitively intact to advanced dementia, completed 1309 trials on five tap designs. Data were manually analyzed to investigate tap usability and were also used to develop an automated usability analysis tool. Researchers collaborated to modify existing techniques and to create novel ones to accomplish both goals. This paper presents lessons learned through the course of this research, which could be applicable to the development of other usability studies, automated vision-based assessments, and assistive technologies for cognitively impaired older adults. Collaborative interdisciplinary teamwork, which included older adults with dementia as participants, was key to enabling the innovative advances that achieved the project's research goals. Implications for Rehabilitation: Products that are implicitly familiar and usable by older adults could foster independent activity completion, potentially reducing reliance on a caregiver. The computer-based automated tool can significantly reduce the time and effort required to perform product usability analysis, making this type of analysis more feasible. Interdisciplinary collaboration can result in a more holistic understanding of assistive technology research challenges and enable innovative solutions.

  18. High-resolution laboratory lysimeter for automated sampling of tracers through a 0.5 m soil block

    OpenAIRE

    K. N. Andrew; Worsfold, P. J.; Matthews, G. P.; Patel, D.; Mathews, T.J.; Johnson, A

    2003-01-01

    A computer-controlled, automated sample collection system for a 0.5-m lysimeter, designed to give superior temporal and spatial resolution for monitoring the movement of chemical tracers through a large undisturbed soil block, is described. The soil block, 0.5 × 0.5 × 0.5 m, was monitored for saturation using eight time domain reflectometry probes. Rainfall was applied at approximately 1600 ml h-1 using a 12 × 12 array of 23-gauge (0.318 mm internal diameter) hypodermic needles. Soil leachates were colle...

  19. SU-E-I-94: Automated Image Quality Assessment of Radiographic Systems Using An Anthropomorphic Phantom

    International Nuclear Information System (INIS)

    Purpose: In a large, academic medical center, consistent radiographic imaging performance is difficult to routinely monitor and maintain, especially for a fleet consisting of multiple vendors, models, software versions, and numerous imaging protocols. Thus, an automated image quality control methodology has been implemented using routine image quality assessment with a physical, stylized anthropomorphic chest phantom. Methods: The “Duke” Phantom (Digital Phantom 07-646, Supertech, Elkhart, IN) was imaged twice on each of 13 radiographic units from a variety of vendors at 13 primary care clinics. The first acquisition used the clinical PA chest protocol to acquire the post-processed “FOR PRESENTATION” image. The second image was acquired without an antiscatter grid followed by collection of the “FOR PROCESSING” image. Manual CNR measurements were made from the largest and thickest contrast-detail inserts in the lung, heart, and abdominal regions of the phantom in each image. An automated image registration algorithm was used to estimate the CNR of the same insert using similar ROIs. Automated measurements were then compared to the manual measurements. Results: Automatic and manual CNR measurements obtained from “FOR PRESENTATION” images had average percent differences of 0.42%±5.18%, −3.44%±4.85%, and 1.04%±3.15% in the lung, heart, and abdominal regions, respectively; measurements obtained from “FOR PROCESSING” images had average percent differences of -0.63%±6.66%, −0.97%±3.92%, and −0.53%±4.18%, respectively. The maximum absolute difference in CNR was 15.78%, 10.89%, and 8.73% in the respective regions. In addition to CNR assessment of the largest and thickest contrast-detail inserts, the automated method also provided CNR estimates for all 75 contrast-detail inserts in each phantom image. Conclusion: Automated analysis of a radiographic phantom has been shown to be a fast, robust, and objective means for assessing radiographic
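
    The core quantity compared between the manual and automated measurements above is the contrast-to-noise ratio of an insert against nearby background. A minimal sketch of that measurement on synthetic pixel data, using the common convention CNR = (mean_ROI − mean_background) / SD_background; the automated ROI placement via image registration is omitted, and the authors' exact formula may differ:

```python
import numpy as np

# Contrast-to-noise ratio (CNR) from two regions of interest (ROIs):
# the signal ROI over a contrast-detail insert and a background ROI,
# with background standard deviation as the noise estimate.
def cnr(roi, background):
    roi = np.asarray(roi, dtype=float)
    background = np.asarray(background, dtype=float)
    sd = background.std()
    if sd == 0:
        raise ValueError("background has zero variance")
    return (roi.mean() - background.mean()) / sd

# Synthetic example: insert pixels at 110, background alternating
# 99/101 (mean 100, population SD exactly 1), so CNR = 10.
insert_roi = np.full((5, 5), 110.0)
bg_roi = np.tile([99.0, 101.0], 50)
value = float(cnr(insert_roi, bg_roi))
```

    Running this per insert and per acquisition, and trending the values over time, is what turns a single phantom image into an automated quality-control metric.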

  20. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Science.gov (United States)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. The samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRM and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on `Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software developed automatically generates a sample code for each sample in one batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  1. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    International Nuclear Information System (INIS)

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. The samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRM and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software developed automatically generates a sample code for each sample in one batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
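
    The per-batch sample-code generation described in both records above can be illustrated with a short sketch. The actual LabVIEW implementation and its ID format are not published, so the naming scheme below (prefix, date stamp, batch and sequence numbers) is entirely hypothetical:

```python
from datetime import date

# Hypothetical batch registration: one call yields sequential,
# zero-padded sample codes for every sample in a batch, replacing
# manual logbook entry. Format: PREFIX-YYYYMMDD-Bnn-sss (invented).
def register_batch(batch_no, n_samples, prefix="NAA", when=None):
    when = when or date.today()
    stamp = when.strftime("%Y%m%d")
    return [f"{prefix}-{stamp}-B{batch_no:02d}-{i:03d}"
            for i in range(1, n_samples + 1)]

codes = register_batch(7, 3, when=date(2015, 4, 1))
# codes[0] -> 'NAA-20150401-B07-001'
```

    Generated codes can then be printed on registration forms and carried through unchanged to the analysis program, which is the redundancy the software removes.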

  2. COST-WORTH ASSESSMENT OF AUTOMATED RADIAL DISTRIBUTION SYSTEM BASED ON RELIABILITY

    Directory of Open Access Journals (Sweden)

    E. Vidya Sagar

    2010-11-01

    Power reliability and quality are gaining greater importance than ever in today's power and industrial markets, and reliable power has become essential to the successful delivery of quality products and services in any industry. The reliability of a power distribution network can be greatly enhanced by automation of its feeder system and other associated parts. Remotely controlled and automated restoration services avoid the need to execute manual switching schedules and bring about remarkable improvements in system reliability and interruption costs. Reliability cost-worth analysis is an excellent tool for the evaluation of interruption costs, and it is significant in power system planning, operation, maintenance and expansion, for it takes customer concerns into account. This paper deals in detail with the reliability-based cost-worth analysis of an automated radial distribution network. The direct method of assessing the worth of reliability is to calculate the user costs associated with interruptions in the power supply. By applying failure mode effect analysis and cost-worth analysis using the direct costing method, the reliability of a radial distribution network can be evaluated. The customer interruption cost indices of a radial distribution network were calculated using the analytical method applied to an Indian utility network with 2000 KVA and sixteen nodes, and the related results of the study are discussed in this paper.

  3. Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.

    Science.gov (United States)

    Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A

    2016-04-01

    Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. PMID:26894596
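
    The classification step above pairs a small feature set with a support vector machine. As an illustrative sketch, the following trains a linear SVM with the Pegasos sub-gradient method on synthetic three-dimensional features; the QUARTZ vessel-map features and UK Biobank images are not reproduced here, and the published system may use a kernel SVM rather than this linear variant:

```python
import numpy as np

# Linear SVM trained with the Pegasos stochastic sub-gradient method.
# Labels: +1 = adequate image, -1 = inadequate image.
rng = np.random.default_rng(0)

def pegasos_svm(X, y, lam=0.01, epochs=20):
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w) < 1:         # inside margin: hinge-loss step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                              # outside margin: shrink only
                w = (1 - eta * lam) * w
    return w

# Separable synthetic "quality features" (3 per image, as in the
# 3-dimensional feature set mentioned above).
X = rng.normal(0, 1, (400, 3))
y = np.where(X @ np.array([1.0, -2.0, 0.5]) > 0, 1.0, -1.0)
w = pegasos_svm(X, y)
train_acc = float((np.sign(X @ w) == y).mean())
```

    In the deployed pipeline, images classified as inadequate would simply be excluded before vessel morphometry is computed, which is what makes fully unattended processing of tens of thousands of images feasible.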

  4. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    Science.gov (United States)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  5. Non-destructive automated sampling of mycotoxins in bulk food and feed - A new tool for required harmonization.

    Science.gov (United States)

    Spanjer, M; Stroka, J; Patel, S; Buechler, S; Pittet, A; Barel, S

    2001-06-01

    Mycotoxin contamination is highly non-uniformly distributed, as is well recognized by the EC, which has not only set legal limits for a series of commodities but also scheduled a sampling plan that takes this heterogeneity into account. In practice, however, it turns out to be very difficult to carry out this sampling plan in a harmonised way. Applying the sampling plan to a container filled with pallets of bags (e.g. of nuts or coffee beans) ranges from very laborious to almost impossible. The non-destructive automated method presented here for sampling bulk food could help to overcome these practical problems and to enforce EC directives. It is derived from a tested and approved technology for the detection of illicit substances in security applications, and is capable of collecting and identifying ultra-trace contaminants, i.e. the chemical fingerprint of a substance within a bulk of goods such as a cargo pallet load (~1000 kg) of boxes and commodities. The technology, patented for explosives detection, uses physical and chemical processes for excitation and rapid, enhanced remote release of contaminant residues, vapours and particulates from the inner and outer surfaces of the inspected bulk, collecting them on selective probes. The process is automated, takes only 10 minutes, is non-destructive, and leaves the bulk itself unharmed. The system design is based on the applicable international regulations for the handling and transportation of shipped cargo by road, sea and air. After this process the pallet can be loaded onto a truck, ship or plane, and analysis can be carried out before the cargo leaves the place of shipping. The potential of this technology for mycotoxin detection has been demonstrated by preliminary feasibility experiments: aflatoxins were detected in bulk pistachios and ochratoxin A in bulk green coffee beans. Both commodities were naturally contaminated, previously found and confirmed by the common methods used in routine inspections.
Once the contaminants are extracted from a

  6. Automation and environment of a sample of the modernized installation YuMO

    International Nuclear Information System (INIS)

    New capabilities of the modernized YuMO installation, arising from the automation of its individual units, are described, and the principal unique devices resulting from the modernization are presented, along with the advantages of the upgraded spectrometer. The basic approaches to building control systems for the actuating mechanisms of spectrometers, based on their unification and standardization, are formulated. Circuit diagrams are presented for the stepper-motor control block, the stepper-motor switchboard-amplifier, the system stabilizing the period and phase of the chopper, and the block diagram of the control system for the actuating mechanisms of the YuMO spectrometer. The main technical parameters of the principal original mechanical devices are given. (author)

  7. A self-contained polymeric cartridge for automated biological sample preparationa

    OpenAIRE

    Xu, Guolin; Lee, Daniel Yoke San; Xie, Hong; Chiew, Deon; Hsieh, Tseng-Ming; Ali, Emril Mohamed; Lun Looi, Xing; Li, Mo-Huang; Ying, Jackie Y.

    2011-01-01

    Sample preparation is one of the most crucial processes for nucleic acids based disease diagnosis. Several steps are required for nucleic acids extraction, impurity washes, and DNA/RNA elution. Careful sample preparation is vital to obtaining a reliable diagnosis, especially with low copies of pathogens and cells. This paper describes a low-cost, disposable lab cartridge for automatic sample preparation, which is capable of handling flexible sample volumes of 10 μl to 1 ml. This plastic ...

  8. Qualification of an automated device to objectively assess the effect of hair care products on hair shine.

    Science.gov (United States)

    Hagens, Ralf; Wiersbinski, Tim; Becker, Michael E; Weisshaar, Jürgen; Schreiner, Volker; Wenck, Horst

    2011-01-01

    The authors developed and qualified an automated routine screening tool to quantify hair shine. This tool is able to separately record individual properties of hair shine such as specular reflection and multiple reflection, as well as additional features such as sparkle, parallelism of hair fibers, and hair color, which strongly affect the subjective ranking by individual readers. A side-by-side comparison of different hair care and styling products with regard to hair shine using the automated screening tool in parallel with standard panel assessment showed that the automated system provides an almost identical ranking and the same statistical significances as the panel assessment. Provided stringent stratification of hair fibers for color and parallelism, the automated tool competes favorably with panel assessments of hair shine. In this case, data generated with the opsira Shine-Box are clearly superior over data generated by panel assessment in terms of reliability and repeatability, workload and time consumption, and sensitivity and specificity to detect differences after shampoo, conditioner, and leave-in treatment. The automated tool is therefore well suited to replace standard panel assessments in claim support, at least as a screening tool. A further advantage of the automated system over panel assessments is the fact that absolute numeric values are generated for a given hair care product, whereas panel assessments can only give rankings of a series of hair care products included in the same study. Thus, the absolute numeric data generated with the automated system allow comparison of hair care products between studies or at different time points after treatment.

  9. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. 
This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
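The core of an automated inundation map like the one described above is simple raster arithmetic: wherever the modeled surge height exceeds the ground elevation, the difference is the flood depth. A minimal sketch (nested lists stand in for the high-resolution rasters; this is not Baron's actual pipeline):

```python
def inundation_depth(surge_m, elevation_m):
    """Per-cell flood depth in metres: max(surge - elevation, 0)."""
    return [[max(s - e, 0.0) for s, e in zip(s_row, e_row)]
            for s_row, e_row in zip(surge_m, elevation_m)]

surge = [[3.0, 3.0], [3.0, 3.0]]      # modeled surge height above datum
terrain = [[1.0, 2.5], [3.5, 4.0]]    # ground elevation from topography
depth = inundation_depth(surge, terrain)
print(depth)  # -> [[2.0, 0.5], [0.0, 0.0]]
```

In a production GIS workflow the same operation would run on gridded arrays, with the probabilistic surge values parsed from the RSS feed and the depth raster then joined to tract-level vulnerability indices.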

  10. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
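The 'basic' two-stage scheme described above can be sketched compactly. The stopping rule used here (stop early when the first-stage prevalence estimate is decisively far from the pass/fail threshold) is a hypothetical stand-in for the paper's actual decision criteria, and all numbers are invented:

```python
def two_stage_classify(lame1, n1, lame2, n2, threshold, margin):
    """Two-stage sequential pass/fail classification of a herd.

    Stage 1 scores n1 cows (half the Welfare Quality sample size); if the
    estimated prevalence is more than `margin` from `threshold`, stop early.
    Otherwise score n2 more cows and decide on the pooled estimate.
    Returns (verdict, animals_sampled).
    """
    p1 = lame1 / n1
    if abs(p1 - threshold) > margin:
        return ("fail" if p1 > threshold else "pass"), n1
    p = (lame1 + lame2) / (n1 + n2)
    return ("fail" if p > threshold else "pass"), n1 + n2

# Decisively low prevalence at stage 1: stop after 30 animals.
print(two_stage_classify(1, 30, 0, 30, threshold=0.20, margin=0.10))
# Borderline stage 1: continue to the full sample before deciding.
print(two_stage_classify(5, 30, 9, 30, threshold=0.20, margin=0.10))
```

Running many simulated herds through such a rule is how the paper estimates accuracy and the average sample size saved relative to the fixed-size scheme.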

  12. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    The process of evaluating the quality of radiographic images in general, and mammography images in particular, can be made much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (DIP) techniques, using an existing image processing environment (ImageJ). With the application of DIP techniques it was possible to extract geometric and radiometric characteristics of the images evaluated. The evaluated parameters include spatial resolution, high-contrast detail, low contrast threshold, linear detail of low contrast, tumour masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. This comparison demonstrated that the automated methodology is a promising alternative for reducing or eliminating the subjectivity present in the visual assessment methodology currently in use. (author)
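Two of the parameters listed above, optical density and contrast ratio, are direct calculations once pixel intensities are read from the phantom image. A sketch using the standard definitions (the readings below are hypothetical, not from the study):

```python
import math

def optical_density(incident, transmitted):
    """Optical density: OD = log10(I0 / I)."""
    return math.log10(incident / transmitted)

def contrast(od_detail, od_background):
    """Radiographic contrast as the OD difference, detail vs. background."""
    return od_detail - od_background

# Hypothetical intensity readings from a phantom region and its background.
od_bg = optical_density(1000.0, 10.0)       # background: OD 2.0
od_detail = optical_density(1000.0, 25.0)   # a low-contrast detail
print(round(od_bg, 2), round(contrast(od_detail, od_bg), 2))  # -> 2.0 -0.4
```

In the automated methodology, such values would be computed per region of interest by the ImageJ pipeline and compared against the acceptance limits used in the visual protocol.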

  13. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    Directory of Open Access Journals (Sweden)

    Constantinos Georgiou

    2010-07-01

    Full Text Available This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+ solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. Response registered is % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting by injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration related levels of toxicity. The biosensor’s response to carrier solutions of different pHs was tested. Vibrio fischeri’s bioluminescence is promoted in the pH 5–10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions.
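The response registered by the analyzer, percent inhibition of bioluminescence relative to the deionised-water blank, is a one-line calculation. A sketch with hypothetical relative-light-unit (RLU) readings:

```python
def percent_inhibition(rlu_toxic, rlu_blank):
    """Bioluminescence inhibition (%) relative to the deionised-water blank."""
    return 100.0 * (1.0 - rlu_toxic / rlu_blank)

# Hypothetical readings: blank injection vs. a heavy-metal carrier solution.
print(percent_inhibition(25.0, 100.0))  # -> 75.0
```

Plotting this value against metal concentration gives the concentration-related toxicity curves the paper reports for Pb2+, Hg2+ and Cu2+.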

  14. Development of a fully automated Flow Injection analyzer implementing bioluminescent biosensors for water toxicity assessment.

    Science.gov (United States)

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb(2+), Hg(2+) and Cu(2+)) solutions. One hundred μL of a Vibrio fischeri suspension are injected in a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. Response registered is % inhibition of biosensor bioluminescence due to heavy metal toxicity in comparison to that resulting by injecting the Vibrio fischeri suspension in deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals show concentration related levels of toxicity. The biosensor's response to carrier solutions of different pHs was tested. Vibrio fischeri's bioluminescence is promoted in the pH 5-10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592

  15. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    Full Text Available The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.
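Technical reproducibility of the kind compared across the three processing levels above is commonly summarised as a coefficient of variation over replicate runs. A sketch with invented peak areas (not the paper's data):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%): sample std dev / mean * 100."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical peak areas for one peptide across four replicate runs.
replicates = [9.8e5, 1.02e6, 1.01e6, 9.9e5]
print(round(cv_percent(replicates), 1))  # -> 1.8
```

A low CV across replicates is what distinguishes a genuinely sensitive fractionation workflow from one whose gains are negated by process variability.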

  16. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    Science.gov (United States)

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation process were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good accordance. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample. PMID:26759433
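The abstract's conclusion that minimum sample amounts must scale with particle size can be illustrated with a cube-law heuristic in the spirit of classical sampling theory. The constant below is purely illustrative and is not taken from the paper or from any standard; real sampling standards derive such factors from bulk density, particle shape and the required precision:

```python
def min_sample_mass_kg(d95_mm, k_kg_per_cm3=0.0625):
    """Cube-law heuristic: minimum increment mass grows with d95**3.

    k_kg_per_cm3 is an invented illustrative constant.
    """
    d95_cm = d95_mm / 10.0
    return k_kg_per_cm3 * d95_cm ** 3

# A coarse SRF (d95 ~ 100 mm) needs ~37x the sample mass of a 30 mm one.
print(min_sample_mass_kg(100.0), min_sample_mass_kg(30.0))  # -> 62.5 1.6875
```

Whatever the exact constants, the cubic dependence is why a legal minimum calibrated for fine materials under-samples an inhomogeneous d95 ≈ 100 mm fuel.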

  17. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Boat sampling technique (BST) is a surface sampling technique developed for obtaining, in situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive in nature, and the sample is obtained without plastic deformation or thermal degradation of the parent material. The shape and size of the sample depend upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and subjected to various tests (metallurgical evaluation, metallographic evaluation, micro-hardness evaluation, sensitisation testing, small punch testing, etc.) to confirm the integrity and assess the safe operating life of the component. This paper highlights the design objectives of the boat sampling technique; the description of the sampling module, the sampling cutter and its performance evaluation; the cutting process; boat samples; the operational sequence of the sampling module; qualification of the sampling module, of the sampling technique and of the scooped region of the parent material; the sample retrieval system; and the inspection, testing and examination to be carried out on the boat samples and the scooped region. (author)

  18. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range

    NARCIS (Netherlands)

    Duijn, E. van; Sandman, H.; Grossouw, D.; Mocking, J.A.J.; Coulier, L.; Vaes, W.H.J.

    2014-01-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. H

  19. Development and evaluation of a virtual microscopy application for automated assessment of Ki-67 expression in breast cancer

    Directory of Open Access Journals (Sweden)

    Turpeenniemi-Hujanen Taina

    2011-01-01

    Full Text Available Abstract Background The aim of the study was to develop a virtual microscopy enabled method for assessment of Ki-67 expression and to study the prognostic value of the automated analysis in a comprehensive series of patients with breast cancer. Methods Using a previously reported virtual microscopy platform and an open source image processing tool, ImageJ, a method for assessment of immunohistochemically (IHC) stained area and intensity was created. A tissue microarray (TMA) series of breast cancer specimens from 1931 patients was immunostained for Ki-67, digitized with a whole slide scanner and uploaded to an image web server. The extent of Ki-67 staining in the tumour specimens was assessed both visually and with the image analysis algorithm. The prognostic value of the computer vision assessment of Ki-67 was evaluated by comparison of distant disease-free survival in patients with low, moderate or high expression of the protein. Results 1648 evaluable image files from 1334 patients were analysed in less than two hours. Visual and automated Ki-67 extent of staining assessments showed a percentage agreement of 87% and weighted kappa value of 0.57. The hazard ratio for distant recurrence for patients with a computer determined moderate Ki-67 extent of staining was 1.77 (95% CI 1.31-2.37) and for high extent 2.34 (95% CI 1.76-3.10), compared to patients with a low extent. In multivariate survival analyses, automated assessment of Ki-67 extent of staining was retained as a significant prognostic factor. Conclusions Running high-throughput automated IHC algorithms on a virtual microscopy platform is feasible. Comparison of visual and automated assessments of Ki-67 expression shows moderate agreement. In multivariate survival analysis, the automated assessment of Ki-67 extent of staining is a significant and independent predictor of outcome in breast cancer.
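Agreement statistics like the 87% percentage agreement and weighted kappa of 0.57 above are computed from a visual-vs-automated contingency table over the ordinal categories (low/moderate/high). A linearly weighted kappa sketch follows; the abstract does not state which weighting scheme the study used, and the example tables are invented:

```python
def linear_weighted_kappa(cm):
    """Linearly weighted kappa for a square contingency table.

    cm[i][j] = cases rated category i by one method and j by the other
    (e.g. low/moderate/high Ki-67 extent of staining).
    """
    k = len(cm)
    n = sum(sum(row) for row in cm)
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    row_tot = [sum(cm[i]) for i in range(k)]
    col_tot = [sum(cm[i][j] for i in range(k)) for j in range(k)]
    obs = sum(w[i][j] * cm[i][j] for i in range(k) for j in range(k)) / n
    exp = sum(w[i][j] * row_tot[i] * col_tot[j]
              for i in range(k) for j in range(k)) / n ** 2
    return 1.0 - obs / exp

perfect = [[10, 0, 0], [0, 10, 0], [0, 0, 10]]
print(linear_weighted_kappa(perfect))  # -> 1.0 (perfect agreement)
```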

  20. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-01

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
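The spike-addition calibration described above amounts to deriving an overall measurement efficiency from a spiked/unspiked pair, then converting net count rates into activity concentrations. A sketch with hypothetical count rates (the 0.495 mL sample volume is from the abstract; everything else is invented):

```python
def measurement_efficiency(cps_spiked, cps_sample, spike_bq):
    """Overall efficiency (counts/s per Bq) from a spike-addition pair."""
    return (cps_spiked - cps_sample) / spike_bq

def activity_bq_per_ml(net_cps, efficiency, volume_ml):
    """Sample activity concentration from the net count rate."""
    return net_cps / (efficiency * volume_ml)

eff = measurement_efficiency(cps_spiked=5.0, cps_sample=1.0, spike_bq=10.0)
conc = activity_bq_per_ml(1.0, eff, 0.495)   # 0.495 mL sample volume
print(eff, round(conc, 2))  # -> 0.4 5.05
```

Because the efficiency is recomputed with every hourly spike addition, a drift in its value doubles as the self-diagnostic parameter for the separation and detector.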

  1. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    Science.gov (United States)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables as well as RMS error were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.
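The throughput variables mentioned above are, in ANAM-style batteries, conventionally computed as accuracy multiplied by response speed (correct responses per minute). The exact formula used in this study is not given in the abstract, so the sketch below uses that conventional definition with invented numbers:

```python
def throughput(correct, total, mean_rt_ms):
    """Correct responses per minute: accuracy x responses per minute.

    This is the conventional ANAM-style throughput score; whether the
    study computed it exactly this way is an assumption.
    """
    accuracy = correct / total
    responses_per_min = 60000.0 / mean_rt_ms
    return accuracy * responses_per_min

# 45 of 50 correct at a 1.2 s mean response time.
print(throughput(correct=45, total=50, mean_rt_ms=1200.0))  # -> 45.0
```

A drop in throughput during the simulated exposure, relative to the subject's baseline, is the kind of signal the medical monitor would watch for alongside Doppler VGE grades.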

  2. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    Directory of Open Access Journals (Sweden)

    K. Schneider-Zapp

    2014-02-01

    Full Text Available In order to advance understanding of the role of seawater surfactants in the air–sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw, we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with MilliQ water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air–sea gas exchange process.
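The normalisation to a constant Schmidt number mentioned above is a standard power-law scaling. The exponent of -1/2, usual for a wavy unbroken water surface, is an assumption here since the abstract does not state it:

```python
def k600(kw, sc):
    """Normalise a measured gas transfer velocity to Schmidt number 600.

    Assumes kw proportional to Sc**-0.5 (the common choice for an
    unbroken, wavy surface); the study's exact exponent is an assumption.
    """
    return kw * (sc / 600.0) ** 0.5

# A gas with Sc = 2400 exchanges half as fast as one at Sc = 600,
# so its measured kw is doubled on normalisation.
print(k600(10.0, 600.0), k600(10.0, 2400.0))  # -> 10.0 20.0
```

Applying this scaling to the SF6, CH4 and N2O estimates puts the three independent kw values on a common footing for the close-agreement comparison the paper reports.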

  3. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    Science.gov (United States)

    Schneider-Zapp, K.; Salter, M. E.; Upstill-Goddard, R. C.

    2014-07-01

    In order to advance understanding of the role of seawater surfactants in the air-sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with Milli-Q water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air-sea gas exchange process.

  4. Rapid DNA analysis for automated processing and interpretation of low DNA content samples

    OpenAIRE

    Turingan, Rosemary S.; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F.

    2016-01-01

    Background Casework samples with low DNA content submitted for short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle, or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or f...

  5. An instrument for automated purification of nucleic acids from contaminated forensic samples

    OpenAIRE

    Broemeling, David J; Pel, Joel; Gunn, Dylan C; Mai, Laura; Thompson, Jason D.; Poon, Hiron; Marziali, Andre

    2008-01-01

    Forensic crime scene sample analysis, by its nature, often deals with samples in which there are low amounts of nucleic acids, on substrates that often lead to inhibition of subsequent enzymatic reactions such as PCR amplification for STR profiling. Common substrates include denim from blue jeans, which yields indigo dye as a PCR inhibitor, and soil, which yields humic substances as inhibitors. These inhibitors frequently co-extract with nucleic acids in standard column or bead-based preps, l...

  6. The use of automated assessments in internet-based CBT: The computer will be with you shortly

    Directory of Open Access Journals (Sweden)

    Elizabeth C. Mason

    2014-10-01

    Full Text Available There is evidence from randomized controlled trials that internet-based cognitive behavioral therapy (iCBT) is efficacious in the treatment of anxiety and depression, and recent research demonstrates the effectiveness of iCBT in routine clinical care. The aims of this study were to implement and evaluate a new pathway by which patients could access online treatment by completing an automated assessment, rather than seeing a specialist health professional. We compared iCBT treatment outcomes in patients who received an automated pre-treatment questionnaire assessment with patients who were assessed by a specialist psychiatrist prior to treatment. Participants were treated as part of routine clinical care and were therefore not randomized. The results showed that symptoms of anxiety and depression decreased significantly with iCBT, and that the mode of assessment did not affect outcome. That is, a pre-treatment assessment by a psychiatrist conferred no additional treatment benefits over an automated assessment. These findings suggest that iCBT is effective in routine care and may be implemented with an automated assessment. By providing wider access to evidence-based interventions and reducing waiting times, the use of iCBT within a stepped-care model is a cost-effective way to reduce the burden of disease caused by these common mental disorders.

  7. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou;

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure co...

  8. Automated Broad-Range Molecular Detection of Bacteria in Clinical Samples.

    Science.gov (United States)

    Budding, Andries E; Hoogewerf, Martine; Vandenbroucke-Grauls, Christina M J E; Savelkoul, Paul H M

    2016-04-01

    Molecular detection methods, such as quantitative PCR (qPCR), have found their way into clinical microbiology laboratories for the detection of an array of pathogens. Most routinely used methods, however, are directed at specific species. Thus, anything that is not explicitly searched for will be missed. This greatly limits the flexibility and universal application of these techniques. We investigated the application of a rapid universal bacterial molecular identification method, IS-pro, to routine patient samples received in a clinical microbiology laboratory. IS-pro is a eubacterial technique based on the detection and categorization of 16S-23S rRNA gene interspace regions with lengths that are specific for each microbial species. As this is an open technique, clinicians do not need to decide in advance what to look for. We compared routine culture to IS-pro using 66 samples sent in for routine bacterial diagnostic testing. The samples were obtained from patients with infections in normally sterile sites (without a resident microbiota). The results were identical in 20 (30%) samples, IS-pro detected more bacterial species than culture in 31 (47%) samples, and five of the 10 culture-negative samples were positive with IS-pro. The case histories of the five patients from whom these culture-negative/IS-pro-positive samples were obtained suggest that the IS-pro findings are highly clinically relevant. Our findings indicate that an open molecular approach, such as IS-pro, may have a high added value for clinical practice. PMID:26763956

  9. The T-lock: automated compensation of radio-frequency induced sample heating

    International Nuclear Information System (INIS)

    Modern high-field NMR spectrometers can stabilize the nominal sample temperature at a precision of less than 0.1 K. However, the actual sample temperature may differ from the nominal value by several degrees because the sample heating caused by high-power radio frequency pulses is not readily detected by the temperature sensors. Without correction, transfer of chemical shifts between different experiments causes problems in the data analysis. In principle, the temperature differences can be corrected by manual procedures but this is cumbersome and not fully reliable. Here, we introduce the concept of a 'T-lock', which automatically maintains the sample at the same reference temperature over the course of different NMR experiments. The T-lock works by continuously measuring the resonance frequency of a suitable spin and simultaneously adjusting the temperature control, thus locking the sample temperature at the reference value. For three different nuclei, 13C, 17O and 31P in the compounds alanine, water, and phosphate, respectively, the T-lock accuracy was found to be <0.1 K. The use of dummy scan periods with variable lengths allows a reliable establishment of the thermal equilibrium before the acquisition of an experiment starts
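    The control loop described above (continuously measuring a reference resonance and nudging the temperature regulator until the sample sits at the reference value) can be sketched as a single feedback step. The temperature coefficient and gain below are hypothetical placeholders, not the paper's calibration:

```python
def t_lock_step(measured_shift_ppm, reference_shift_ppm, setpoint_c,
                d_shift_d_temp=-0.010, gain=0.5):
    """One iteration of a T-lock-style feedback loop (illustrative sketch).

    d_shift_d_temp is an assumed temperature coefficient of the lock
    resonance in ppm/K (the water 1H shift is of this order); gain < 1
    damps the correction so the loop does not oscillate.
    """
    # Temperature error inferred from how far the resonance has drifted
    temp_error_k = (measured_shift_ppm - reference_shift_ppm) / d_shift_d_temp
    # Move the temperature-control setpoint against the inferred error
    return setpoint_c - gain * temp_error_k
```

    Iterating this step between dummy scans drives the inferred temperature error toward zero, which is what locks the sample at the reference temperature across experiments with different radio-frequency duty cycles.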

  10. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Directory of Open Access Journals (Sweden)

    Kamfai Chan

    Full Text Available Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.
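    The tube-shuttling scheme described above amounts to generating printer motion commands (G-code) that alternate the tube holder between two bath positions. A minimal generator is sketched below; all coordinates and dwell times are made-up placeholders, since the real values depend on the printer frame, bath positions and assay protocol:

```python
def pcr_shuttle_gcode(cycles=35,
                      denat_xy=(20, 50), anneal_xy=(140, 50),
                      denat_s=15, anneal_s=45,
                      z_down=10, z_up=60):
    """Emit G-code that shuttles a PCR-tube holder between two water baths.

    Each cycle lowers the tubes into the denaturation bath, dwells, then
    moves them to the annealing/extension bath and dwells again.
    """
    lines = ["G28 ; home all axes", "G90 ; absolute positioning"]
    for _ in range(cycles):
        for (x, y), dwell in ((denat_xy, denat_s), (anneal_xy, anneal_s)):
            lines += [f"G0 Z{z_up} ; lift tubes clear of the baths",
                      f"G0 X{x} Y{y} ; move over the next bath",
                      f"G0 Z{z_down} ; lower tubes into the water",
                      f"G4 S{dwell} ; hold for this incubation step"]
    return "\n".join(lines)
```

    Because the baths are held at fixed temperatures, the only "ramping" is the few seconds of travel between them, which is the source of the 33% run-time saving the abstract reports.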

  11. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424

  12. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    Science.gov (United States)

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424

  13. [Automated serial diagnosis of donor blood samples. Ergonomic and economic organization structure].

    Science.gov (United States)

    Stoll, T; Fischer-Fröhlich, C L; Mayer, G; Hanfland, P

    1990-01-01

    A comprehensive computer-aided administration system for blood donors is presented. Encoded information in barcode labels allows automatic yet selective pipetting of samples by pipetting robots. Analysis results are transferred automatically to a host computer in order to update a donor database.

  14. Assessing the accuracy of an inter-institutional automated patient-specific health problem list

    Directory of Open Access Journals (Sweden)

    Taylor Laurel

    2010-02-01

    Full Text Available Abstract Background Health problem lists are a key component of electronic health records and are instrumental in the development of decision-support systems that encourage best practices and optimal patient safety. Most health problem lists require initial clinical information to be entered manually and few integrate information across care providers and institutions. This study assesses the accuracy of a novel approach to create an inter-institutional automated health problem list in a computerized medical record (MOXXI) that integrates three sources of information for an individual patient: diagnostic codes from medical services claims from all treating physicians, therapeutic indications from electronic prescriptions, and single-indication drugs. Methods Data for this study were obtained from 121 general practitioners and all medical services provided for 22,248 of their patients. At the opening of a patient's file, all health problems detected through medical service utilization or single-indication drug use were flagged to the physician in the MOXXI system. Each newly arising health problem was presented as 'potential' and physicians were prompted to specify whether the health problem was valid (Y) or not (N), or if they preferred to reassess its validity at a later time. Results A total of 263,527 health problems, representing 891 unique problems, were identified for the group of 22,248 patients. Medical services claims contributed the majority of problems identified (77%), followed by therapeutic indications from electronic prescriptions (14%), and single-indication drugs (9%). Physicians actively chose to assess 41.7% (n = 106,950) of health problems. Overall, 73% of the problems assessed were considered valid; 42% originated from medical service diagnostic codes, 11% from single-indication drugs, and 47% from prescription indications. Twelve percent of problems identified through other treating physicians were considered valid compared to 28

  15. A self-contained polymeric cartridge for automated biological sample preparation.

    Science.gov (United States)

    Xu, Guolin; Lee, Daniel Yoke San; Xie, Hong; Chiew, Deon; Hsieh, Tseng-Ming; Ali, Emril Mohamed; Lun Looi, Xing; Li, Mo-Huang; Ying, Jackie Y

    2011-09-01

    Sample preparation is one of the most crucial processes for nucleic acid-based disease diagnosis. Several steps are required for nucleic acid extraction, impurity washes, and DNA/RNA elution. Careful sample preparation is vital to obtaining a reliable diagnosis, especially with low copy numbers of pathogens and cells. This paper describes a low-cost, disposable lab cartridge for automatic sample preparation, which is capable of handling flexible sample volumes of 10 μl to 1 ml. This plastic cartridge contains all the necessary reagents for pathogen and cell lysis, DNA/RNA extraction, impurity washes, DNA/RNA elution and waste processing in a completely sealed cartridge. The entire sample preparation process is conducted automatically within the cartridge on a desktop unit using a pneumatic fluid manipulation approach. Reagent transport is achieved with a combination of push and pull forces (with compressed air and vacuum, respectively), which are connected to the pneumatic inlets at the bottom of the cartridge. These pneumatic forces are regulated by a pinch valve manifold and two pneumatic syringe pumps within the desktop unit. The performance of this pneumatic reagent delivery method was examined. We have demonstrated the capability of on-cartridge RNA extraction and cancer-specific gene amplification from 10 copies of MCF-7 breast cancer cells. The on-cartridge DNA recovery efficiency was 54-63%, which was comparable to or better than the conventional manual approach using a silica spin column. The lab cartridge would be suitable for integration with lab-chip real-time polymerase chain reaction devices in providing a portable system for decentralized disease diagnosis. PMID:22662036

  16. Automated structural design with aeroelastic constraints - A review and assessment of the state of the art

    Science.gov (United States)

    Stroud, W. J.

    1974-01-01

    A review and assessment of the state of the art in automated aeroelastic design is presented. Most of the aeroelastic design studies appearing in the literature deal with flutter, and, therefore, this paper also concentrates on flutter. The flutter design problem is divided into three cases: an isolated flutter mode, neighboring flutter modes, and a hump mode which can rise and cause a sudden, discontinuous change in the flutter velocity. Synthesis procedures are presented in terms of techniques that are appropriate for problems of various levels of difficulty. Current trends, which should result in more efficient, powerful and versatile design codes, are discussed. Approximate analysis procedures and the need for simultaneous consideration of multiple design requirements are emphasized.

  17. Control Performance Management in Industrial Automation Assessment, Diagnosis and Improvement of Control Loop Performance

    CERN Document Server

    Jelali, Mohieddine

    2013-01-01

    Control Performance Management in Industrial Automation provides a coherent and self-contained treatment of a group of methods and applications of burgeoning importance to the detection and solution of problems with control loops that are vital in maintaining product quality, operational safety, and efficiency of material and energy consumption in the process industries. The monograph deals with all aspects of control performance management (CPM), from controller assessment (minimum-variance-control-based and advanced methods), to detection and diagnosis of control loop problems (process non-linearities, oscillations, actuator faults), to the improvement of control performance (maintenance, re-design of loop components, automatic controller re-tuning). It provides a contribution towards the development and application of completely self-contained and automatic methodologies in the field. Moreover, within this work, many CPM tools have been developed that go far beyond available CPM packages. Control Perform...

  18. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    Science.gov (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of both the whole ductal tree and the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus largely used by scientists studying rodent mammary gland morphology. PMID:26910307

  19. Small sample Bayesian analyses in assessment of weapon performance

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analyses in small-sample circumstances should be considered, and the test data should be provided by simulations. Several Bayesian approaches are discussed and some limitations are identified. After the limitations of the available Bayesian approaches are analyzed, an improvement is put forward, and the improved approach is applied to the assessment of the performance of a new weapon.

  20. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    Science.gov (United States)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at a regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a stretch of road 7 km long. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points where no individual trees could be identified. Using voxels, the data volume could be reduced by about 97% in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but still falls short of the acquisition rate, estimated at 50 km/h.
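    The voxel-based downsampling step above (the source of the roughly 97% data reduction) can be sketched with NumPy: every point is assigned to a cubic voxel, and each occupied voxel is replaced by the centroid of its points. The voxel size is a placeholder, not the authors' parameter:

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.5):
    """Replace all points falling in one voxel by their centroid.

    points: (N, 3) array of x, y, z coordinates (e.g. metres).
    """
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel key; `inverse` maps each point to its voxel group
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.ravel()
    sums = np.zeros((counts.size, points.shape[1]))
    np.add.at(sums, inverse, points)      # accumulate coordinates per voxel
    return sums / counts[:, None]         # centroid of each occupied voxel
```

    For a half-million points per 100 m, retaining one centroid per voxel keeps the coarse tree geometry needed for segmentation while shrinking the data volume by orders of magnitude.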

  1. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  2. Genomic data sampling and its effect on classification performance assessment

    Directory of Open Access Journals (Sweden)

    Azuaje Francisco

    2003-01-01

    Full Text Available Abstract Background Supervised classification is fundamental in bioinformatics. Machine learning models, such as neural networks, have been applied to discover genes and expression patterns. This process is achieved by implementing training and test phases. In the training phase, a set of cases and their respective labels are used to build a classifier. During testing, the classifier is used to predict new cases. One approach to assessing its predictive quality is to estimate its accuracy during the test phase. Key limitations appear when dealing with small-data samples. This paper investigates the effect of data sampling techniques on the assessment of neural network classifiers. Results Three data sampling techniques were studied: Cross-validation, leave-one-out, and bootstrap. These methods are designed to reduce the bias and variance of small-sample estimations. Two prediction problems based on small-sample sets were considered: Classification of microarray data originating from a leukemia study and from small, round blue-cell tumours. A third problem, the prediction of splice-junctions, was analysed to perform comparisons. Different accuracy estimations were produced for each problem. The variations are accentuated in the small-data samples. The quality of the estimates depends on the number of train-test experiments and the amount of data used for training the networks. Conclusion The predictive quality assessment of biomolecular data classifiers depends on the data size, sampling techniques and the number of train-test experiments. Conservative and optimistic accuracy estimations can be obtained by applying different methods. Guidelines are suggested to select a sampling technique according to the complexity of the prediction problem under consideration.
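    The three resampling schemes compared in the abstract can be sketched as index generators; leave-one-out is simply k-fold with k equal to the sample size. This is a minimal stdlib sketch, not the authors' implementation:

```python
import random

def kfold_splits(n, k, seed=0):
    """Index splits for k-fold cross-validation (k = n gives leave-one-out)."""
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    return [([j for f in folds[:i] + folds[i + 1:] for j in f], folds[i])
            for i in range(k)]

def bootstrap_splits(n, rounds, seed=0):
    """Bootstrap splits: draw n cases with replacement, test on the rest."""
    rng = random.Random(seed)
    splits = []
    for _ in range(rounds):
        train = [rng.randrange(n) for _ in range(n)]
        oob = [j for j in range(n) if j not in set(train)]  # out-of-bag cases
        splits.append((train, oob))
    return splits
```

    A classifier is trained on each `train` list and scored on the held-out part; averaging accuracy over the splits is what reduces the bias and variance of small-sample estimates that the abstract discusses.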

  3. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    Science.gov (United States)

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

    Automated high performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of Hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between people who are carriers and people who are not, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed.
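    Precision and accuracy of this kind of assay are typically summarized from replicate runs of a control material as a coefficient of variation and a bias. The sketch below shows the two statistics; the replicate values and the 3.5% control target are hypothetical, not from the recommendations:

```python
import statistics

def assess_hba2_run(replicates_pct, target_pct):
    """Within-run imprecision (CV%) and bias (%) for replicate HbA2 results."""
    mean = statistics.fmean(replicates_pct)
    cv = 100.0 * statistics.stdev(replicates_pct) / mean
    bias = 100.0 * (mean - target_pct) / target_pct
    return mean, cv, bias

# Hypothetical replicate measurements of a 3.5% HbA2 control material:
mean, cv, bias = assess_hba2_run([3.5, 3.6, 3.4, 3.5], 3.5)
```

    Because carrier and non-carrier HbA2 levels differ by well under one percentage point, even a modest CV or bias at the decision threshold can misclassify a sample, which is why both statistics matter here.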

  4. Adjustable virtual pore-size filter for automated sample preparation using acoustic radiation force

    Energy Technology Data Exchange (ETDEWEB)

    Jung, B; Fisher, K; Ness, K; Rose, K; Mariella, R

    2008-05-22

    We present a rapid and robust size-based separation method for high throughput microfluidic devices using acoustic radiation force. We developed a finite element modeling tool to predict the two-dimensional acoustic radiation force field perpendicular to the flow direction in microfluidic devices. Here we compare the results from this model with experimental parametric studies including variations of the PZT driving frequencies and voltages as well as various particle sizes, compressibilities and densities. These experimental parametric studies also provide insight into the development of an adjustable 'virtual' pore-size filter as well as optimal operating conditions for various microparticle sizes. We demonstrated the separation of Saccharomyces cerevisiae and MS2 bacteriophage using acoustic focusing. The acoustic radiation force did not affect the MS2 viruses, and their concentration profile remained unchanged. With optimized design of our microfluidic flow system we were able to achieve yields of > 90% for the MS2 with > 80% of the S. cerevisiae being removed in this continuous-flow sample preparation device.
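    Whether a particle moves to a pressure node or antinode in such a device is governed by the standard acoustic contrast factor, which depends on exactly the density and compressibility ratios the parametric studies vary. The sketch below uses the textbook formula (not taken from this paper), plus a size-scaling comment on why the MS2 virions are unaffected:

```python
def acoustic_contrast(rho_p, rho_f, beta_p, beta_f):
    """Acoustic contrast factor φ (standard textbook formula).

    φ > 0: particles collect at pressure nodes; φ < 0: at antinodes.
    rho = density (kg/m^3), beta = compressibility (1/Pa);
    _p = particle, _f = suspending fluid.
    """
    rho_t = rho_p / rho_f
    return (5 * rho_t - 2) / (2 * rho_t + 1) - beta_p / beta_f

# The primary radiation force scales with particle volume (radius cubed),
# so a ~25 nm MS2 virion of similar contrast feels a force several million
# times smaller than a ~5 um yeast cell - consistent with the virus passing
# through the 'virtual filter' unaffected.
ratio = (2.5e-6 / 12.5e-9) ** 3   # (yeast radius / virion radius)^3
```

    Tuning the PZT drive (and hence the force magnitude) relative to this volume scaling is what makes the pore size of the "virtual filter" adjustable.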

  5. Assessment of paclitaxel induced sensory polyneuropathy with "Catwalk" automated gait analysis in mice.

    Directory of Open Access Journals (Sweden)

    Petra Huehnchen

    Full Text Available Neuropathic pain as a symptom of sensory nerve damage is a frequent side effect of chemotherapy. The most common behavioral observation in animal models of chemotherapy induced polyneuropathy is the development of mechanical allodynia, which is quantified with von Frey filaments. The data from one study, however, cannot be easily compared with other studies owing to influences of environmental factors, inter-rater variability and differences in test paradigms. To overcome these limitations, automated quantitative gait analysis was proposed as an alternative, but its usefulness for assessing animals suffering from polyneuropathy has remained unclear. In the present study, we used a novel mouse model of paclitaxel induced polyneuropathy to compare results from electrophysiology and the von Frey method to gait alterations measured with the Catwalk test. To mimic recently improved clinical treatment strategies of gynecological malignancies, we established a mouse model of dose-dense paclitaxel therapy on the common C57Bl/6 background. In this model paclitaxel treated animals developed mechanical allodynia as well as reduced caudal sensory nerve action potential amplitudes indicative of a sensory polyneuropathy. Gait analysis with the Catwalk method detected distinct alterations of gait parameters in animals suffering from sensory neuropathy, revealing a minimized contact of the hind paws with the floor. Treatment of mechanical allodynia with gabapentin improved altered dynamic gait parameters. This study establishes a novel mouse model for investigating the side effects of dose-dense paclitaxel therapy and underlines the usefulness of automated gait analysis as an additional easy-to-use objective test for evaluating painful sensory polyneuropathy.

  6. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons

  7. Automated column liquid chromatographic determination of amoxicillin and cefadroxil in bovine serum and muscle tissue using on-line dialysis for sample preparation

    NARCIS (Netherlands)

    Snippe, N; van de Merbel, N C; Ruiter, F P; Steijger, O M; Lingeman, H; Brinkman, U A

    1994-01-01

    A fully automated method is described for the determination of amoxicillin and cefadroxil in bovine serum and muscle tissue. The method is based on the on-line combination of dialysis and solid-phase extraction for sample preparation, and column liquid chromatography with ultraviolet detection. In o

  8. Automated on-line liquid-liquid extraction system for temporal mass spectrometric analysis of dynamic samples.

    Science.gov (United States)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L

    2015-09-24

    Most real samples cannot directly be infused to mass spectrometers because they could contaminate delicate parts of ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit temporal resolution of analysis. We have developed an automated liquid-liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053-2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h(-1)). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. PMID:26423626
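
    The authors characterize release rates by fitting exponential functions to the extraction-MS dissolution profiles (e.g. k = 0.43 ± 0.01 h⁻¹ for ibuprofen). A minimal sketch of that kind of first-order release fit on synthetic data; the model form, C_max, and noise level here are illustrative assumptions, with only the rate constant matching the reported ibuprofen value:

```python
import numpy as np
from scipy.optimize import curve_fit

def release(t, c_max, k):
    """First-order release model: C(t) = C_max * (1 - exp(-k*t))."""
    return c_max * (1.0 - np.exp(-k * t))

# Synthetic dissolution profile sampled once per 17-min extraction cycle over 10 h
t = np.arange(0, 10, 17 / 60)                  # time points, hours
rng = np.random.default_rng(0)
c = release(t, 1.8, 0.43) + rng.normal(0, 0.02, t.size)

popt, pcov = curve_fit(release, t, c, p0=(1.0, 0.1))
c_max_fit, k_fit = popt
k_sigma = np.sqrt(np.diag(pcov))[1]            # 1-sigma uncertainty on k
print(f"k = {k_fit:.2f} +/- {k_sigma:.2f} h^-1")
```

The fitted rate constant and its covariance-derived uncertainty mirror the "k = 0.43 ± 0.01 h⁻¹" style of reporting in the abstract.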

  9. Examples of Optical Assessment of Surface Cleanliness of Genesis Samples

    Science.gov (United States)

    Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.; Gonzalez, C. P.

    2013-01-01

    Optical microscope assessment of Genesis solar wind collector surfaces is a coordinated part of the effort to obtain an assessed clean subset of flown wafer material for the scientific community. Microscopic survey is typically done at 50X magnification at selected approximately 1 square millimeter areas on the fragment surface. This survey is performed each time a principal investigator (PI) returns a sample to JSC for documentation as part of the established cleaning plan. The cleaning plan encompasses sample handling and analysis by Genesis science team members, and optical survey is done at each step in the process. Sample surface cleaning is performed at JSC (ultrapure water [1] and UV ozone cleaning [2]) and experimentally by other science team members (acid etch [3], acetate replica peels [4], CO2 snow [5], etc.). The documentation of each cleaning method can potentially be assessed with optical observation utilizing Image Pro Plus software [6]. Differences in particle counts can be studied and discussed within analysis groups. Approximately 25 samples have been identified as part of the cleaning matrix effort to date.

  10. Automated Cognitive Health Assessment From Smart Home-Based Behavior Data.

    Science.gov (United States)

    Dawadi, Prafulla Nath; Cook, Diane Joyce; Schmitter-Edgecombe, Maureen

    2016-07-01

    Smart home technologies offer potential benefits for assisting clinicians by automating health monitoring and well-being assessment. In this paper, we examine the actual benefits of smart home-based analysis by monitoring daily behavior in the home and predicting clinical scores of the residents. To accomplish this goal, we propose a clinical assessment using activity behavior (CAAB) approach to model a smart home resident's daily behavior and predict the corresponding clinical scores. CAAB uses statistical features that describe characteristics of a resident's daily activity performance to train machine learning algorithms that predict the clinical scores. We evaluate the performance of CAAB utilizing smart home sensor data collected from 18 smart homes over two years. We obtain a statistically significant correlation (r = 0.72) between CAAB-predicted and clinician-provided cognitive scores and a statistically significant correlation (r = 0.45) between CAAB-predicted and clinician-provided mobility scores. These prediction results suggest that it is feasible to predict clinical scores using smart home sensor data and learning-based data analysis.
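
    The reported r = 0.72 and r = 0.45 are Pearson correlations between predicted and clinician-provided scores. A small sketch of how such a correlation is computed; the clinician scores and predictions below are invented stand-ins, not the CAAB study data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical data: clinician-provided cognitive scores for 18 residents and
# the scores a CAAB-style regressor might predict from activity features
rng = np.random.default_rng(42)
clinician = np.linspace(10, 30, 18)
predicted = clinician + rng.normal(0, 3, 18)

r = pearson_r(predicted, clinician)
print(f"r = {r:.2f}")
```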

  11. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Full Text Available Quantifying the environmental impacts and simulating the energy consumption of building’s components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative that would lead to a more energy efficient building. Building Information Modeling (BIM offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA strategies and systems are attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation is within developing plug-ins on BIM tool capable of measuring the environmental impacts (EI and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and to identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of the proposed buildings based on Leadership in Energy and Environmental Design (LEED rating system. An actual building project will be used to illustrate the workability of the proposed methodology.

  12. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine if analytical bias is present when comparing methods, the use of commutable samples, or samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effect. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt to determine not only country specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that of the 27 general chemistry analytes studied, 19 showed sufficiently small between method biases as to not prevent harmonisation of reference intervals. Application of evidence based approaches including the determination of analytical bias using commutable material is necessary when seeking to harmonise reference intervals.

  13. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    Science.gov (United States)

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. PMID:27006022
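
    The "center of gravity" peak picking step can be sketched as an intensity-weighted centroid over each contiguous region of the spectrum above a threshold. This is a generic illustration of the algorithm class, not Agilent's MicroLab implementation; the synthetic spectrum, band positions, and threshold are assumptions:

```python
import numpy as np

def center_of_gravity_peaks(wavenumbers, absorbance, threshold=0.1):
    """Report each contiguous above-threshold region's intensity-weighted
    centroid as the peak position ('center of gravity' peak picking)."""
    above = absorbance > threshold
    peaks, i, n = [], 0, len(absorbance)
    while i < n:
        if above[i]:
            j = i
            while j < n and above[j]:
                j += 1                           # extend to end of the band
            seg_x, seg_y = wavenumbers[i:j], absorbance[i:j]
            peaks.append(float((seg_x * seg_y).sum() / seg_y.sum()))
            i = j
        else:
            i += 1
    return peaks

# Synthetic spectrum with two Gaussian bands; positions and widths are invented
x = np.linspace(1000, 2000, 2000)
y = np.exp(-((x - 1712) / 8) ** 2) + 0.8 * np.exp(-((x - 1275) / 10) ** 2)
peaks = center_of_gravity_peaks(x, y)
print(peaks)  # approximately [1275.0, 1712.0]
```

A screening rule like the one described would then report cocaine only when enough of the found centroids fall within tolerance of the expected characteristic frequencies.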

  15. On-site detection of foot-and-mouth disease virus using a portable, automated sample preparation and PCR system

    International Nuclear Information System (INIS)

    Full text: Foot-and-mouth disease (FMD) is a highly contagious and economically devastating disease of farm livestock. The etiological agent, FMD virus (FMDV), is a single-stranded, positive-sense RNA virus belonging to the genus Aphthovirus within the family Picornaviridae. Rapid and accurate confirmation of the presence of FMDV is needed for effective control and eradication of the disease. An on-site detection test would be highly advantageous as the time taken to transport suspect clinical material to a central laboratory can often be lengthy, thus delaying a definitive diagnosis in the event of an outbreak. This study describes the development of a molecular assay for the detection of all seven serotypes of FMDV using novel technology, namely: Linear-After-The-Exponential (LATE)-PCR, for transfer onto a portable, easy-to-use, fully automated sample preparation and RT-PCR instrument. Primers and a mismatch tolerant probe were designed from consensus sequences in the FMDV 3D (RNA polymerase) gene to detect the target and its variants at low temperature. An internal control (IC) was included to validate negative results. After demonstrating that the LATE RT-PCR signal at end-point was proportional to number of target molecules over the range 10 to 1 million copies, the assay was compared with a one-step real-time RT-PCR (rRT-PCR) assay (also targeting the 3D) used routinely by reference laboratories. The LATE RT-PCR assay amplified RNA extracted from multiple strains of all FMDV serotypes. Of the 121 FMDV-positive samples tested, 119 were positive by both rRT-PCR and LATE RT-PCR tests while 118 had tested positive by virus isolation at the time of receipt. Twenty-eight FMDV-negative samples failed to react in all 3 tests. There were no false positive signals with RNA from other vesicular disease-causing viruses. Each FMDV-negative sample generated a signal from the IC, ruling out amplification failures.
A dilution series of an FMDV reference strain demonstrated

  16. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    Science.gov (United States)

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper discovers that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  17. Depth-stratified soil sampling for assessing nematode communities

    Directory of Open Access Journals (Sweden)

    Giovani de Oliveira Arieira

    2016-04-01

    Full Text Available This study assessed the importance of stratified soil sampling on the detection (and therefore the distribution of nematode communities and the differentiation of ecosystems by collecting stratified soil samples at intervals of 10 cm and non-stratified samples from 0 to 30 cm in two soil management systems (no-tillage and conventional tillage and in a native forest fragment. The nematode frequency and prominence values were obtained after extraction by successive screening operations, sugar floatation clarification and the identification of nematodes to the genus level. The nematode communities were compared two-by-two based on Sorensen’s community coefficient (CC and the percentage similarity (PS. Relative abundances of functional guilds were subjected to a principal component analysis (PCA and classified in dendrograms. Thirty-two edaphic nematode genera were found, and the nematode communities sampled on a non-stratified basis in the soil profile exhibited a high level of similarity because they could not be accurately characterized. Genera with low abundances were not detected. In the stratified samples, we were able to classify and group the nematodes present at different depths, mainly from 0 to 10 cm. Stratified soil sampling allowed a more accurate characterization and greater differentiation of nematode communities, identifying taxa that occurred at lower abundance levels, irrespective of frequency.
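
    The pairwise comparisons above rest on two standard indices: Sorensen's community coefficient, CC = 2a / (2a + b + c) with a the number of shared taxa, and percentage similarity, the sum of minimum relative abundances across taxa. A sketch of both, using invented genus counts for two hypothetical soil layers:

```python
def sorensen_cc(taxa_a, taxa_b):
    """Sorensen's community coefficient: 2a / (2a + b + c), a = shared taxa."""
    shared = len(taxa_a & taxa_b)
    return 2 * shared / (len(taxa_a) + len(taxa_b))

def percentage_similarity(abund_a, abund_b):
    """Percentage similarity: sum of the minimum relative abundances (%)."""
    tot_a, tot_b = sum(abund_a.values()), sum(abund_b.values())
    taxa = set(abund_a) | set(abund_b)
    return 100 * sum(min(abund_a.get(t, 0) / tot_a, abund_b.get(t, 0) / tot_b)
                     for t in taxa)

# Hypothetical genus counts from two depth strata (genera and counts invented)
layer_0_10  = {"Rhabditis": 40, "Aphelenchus": 25, "Pratylenchus": 10}
layer_10_20 = {"Rhabditis": 20, "Aphelenchus": 30, "Dorylaimus": 15}

cc = sorensen_cc(set(layer_0_10), set(layer_10_20))
ps = percentage_similarity(layer_0_10, layer_10_20)
print(f"CC = {cc:.2f}, PS = {ps:.1f}%")
```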

  18. Assessing the Bias in Communication Networks Sampled from Twitter

    CERN Document Server

    González-Bailón, Sandra; Rivero, Alejandro; Borge-Holthoefer, Javier; Moreno, Yamir

    2012-01-01

    We collect and analyse messages exchanged in Twitter using two of the platform's publicly available APIs (the search and stream specifications). We assess the differences between the two samples, and compare the networks of communication reconstructed from them. The empirical context is given by political protests taking place in May 2012: we track online communication around these protests for the period of one month, and reconstruct the network of mentions and re-tweets according to the two samples. We find that the search API over-represents the more central users and does not offer an accurate picture of peripheral activity; we also find that the bias is greater for the network of mentions. We discuss the implications of this bias for the study of diffusion dynamics and collective action in the digital era, and advocate the need for more uniform sampling procedures in the study of online communication.
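
    Detecting the over-representation of central users comes down to reconstructing a network from each API's edge sample and comparing centrality rankings. A toy sketch measuring how much the most-mentioned users overlap between two samples; all handles and edges are invented and this is not the authors' analysis pipeline:

```python
from collections import Counter

def degree_centrality(edges):
    """In-degree counts from (source, target) mention edges."""
    return Counter(target for _, target in edges)

def top_k_overlap(edges_a, edges_b, k=3):
    """Fraction of the k most-mentioned users shared between two samples."""
    top = lambda edges: {u for u, _ in degree_centrality(edges).most_common(k)}
    return len(top(edges_a) & top(edges_b)) / k

# Invented mention edges standing in for the search- and stream-API samples
search = [("a", "hub"), ("b", "hub"), ("c", "hub"),
          ("d", "x"), ("e", "y"), ("f", "x")]
stream = [("a", "hub"), ("b", "hub"), ("g", "z"),
          ("h", "z"), ("i", "z"), ("j", "x")]
print(top_k_overlap(search, stream))  # 2 of the top-3 users coincide here
```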

  19. Assessing Community Health Risks: Proactive Vs Reactive Sampling

    Directory of Open Access Journals (Sweden)

    Sarah Taylor

    2009-01-01

    Full Text Available Problem statement: A considerable number of native birds died in the West Australian coastal town of Esperance and surroundings during late 2006 and early 2007, which raised community concerns about environmental contamination. Forensic investigations of dead birds suggested that lead may have been the causative agent. At the time, lead and nickel, as well as iron ore and other materials, were being exported through the Port of Esperance (port. Government agencies undertook a targeted environmental sampling programme to identify the exposure sources and the extent of contamination. Results of ambient air monitoring, blood lead level investigations and analysis of metals in rainwater tanks suggested widespread contamination of the Esperance town site with lead and nickel. The Department of Environment and Conservation (DEC retained Golder Associates Pty Ltd., (Golder to undertake a human health and ecological risk assessment (risk assessment using the information collected through the investigation of lead and nickel contamination in Esperance. The quantity and quality of exposure data are an important contributor to the uncertainty associated with the outcomes of a risk assessment. Conclusion: As the data were collected essentially as part of the emergency response to the events in Esperance, there was some uncertainty about the suitability and completeness of the data for risk assessment. The urgent nature of the emergency response meant that sampling was opportunistic and not necessarily sufficient or suitable for risk assessment from a methodical and scientific perspective. This study demonstrated the need for collecting ‘meaningful and reliable’ data for assessing risks from environmental contamination.

  20. Assessment of the 296-S-21 Stack Sampling Probe Location

    Energy Technology Data Exchange (ETDEWEB)

    Glissmeyer, John A.

    2006-09-08

    Tests were performed to assess the suitability of the location of the air sampling probe on the 296-S-21 stack according to the criteria of ANSI N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities. Pacific Northwest National Laboratory conducted most tests on a 3.67:1 scale model of the stack. CH2MHill also performed some limited confirmatory tests on the actual stack. The tests assessed the capability of the air-monitoring probe to extract a sample representative of the effluent stream. The tests were conducted for the practical combinations of operating fans and addressed: (1) Angular Flow--The purpose is to determine whether the velocity vector is aligned with the sampling nozzle. The average yaw angle relative to the nozzle axis should not be more than 20 degrees. The measured values ranged from 5 to 11 degrees on the scale model and 10 to 12 degrees on the actual stack. (2) Uniform Air Velocity--The gas momentum across the stack cross section where the sample is extracted should be well mixed or uniform. The uniformity is expressed as the variability of the measurements about the mean, the coefficient of variance (COV). The lower the COV value, the more uniform the velocity. The acceptance criterion is that the COV of the air velocity must be ≤20% across the center two-thirds of the area of the stack. At the location simulating the sampling probe, the measured values ranged from 4 to 11%, which are within the criterion. To confirm the validity of the scale model results, air velocity uniformity measurements were made both on the actual stack and on the scale model at the test ports 1.5 stack diameters upstream of the sampling probe. The results ranged from 6 to 8% COV on the actual stack and 10 to 13% COV on the scale model. The average difference for the eight runs was 4.8% COV, which is within the validation criterion. The fact that the scale model results were slightly higher than the
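
    The uniformity criterion is a simple statistic: the coefficient of variance (COV) of the velocity traverse, i.e. the sample standard deviation as a percentage of the mean, compared against the 20% limit. A minimal sketch with hypothetical velocity readings (the values below are invented, not test data):

```python
import numpy as np

def cov_percent(values):
    """Coefficient of variance (COV): sample standard deviation as % of mean."""
    v = np.asarray(values, float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical velocity traverse (m/s) over the center two-thirds of the stack
velocities = [12.1, 12.8, 13.0, 12.5, 11.9, 12.6, 12.4, 12.7]
cov = cov_percent(velocities)
print(f"COV = {cov:.1f}% (criterion: <= 20%)")
```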

  1. Interim assessment of the VAL automated guideway transit system. Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Anagnostopoulos, G.

    1981-11-01

    This report describes an interim assessment of the VAL (Vehicules Automatiques Legers or Light Automated Vehicle) AGT system which is currently under construction in Lille, France, and which is to become fully operational in December 1983. This report contains a technical description and performance data resulting from a demonstration test program performed concurrently in August 1980. VAL is the first driverless AGT urban system application in France. The system operates at grade, elevated, and in tunnels on an exclusive concrete dual-lane guideway that is 12.7 kilometers long. The configuration of the system is a push-pull loop operating between 17 on-line stations. The system is designed to provide scheduled operation at 60-second headways and a normal one-way capacity of 7440 passengers per hour per direction with 55 percent of the passengers seated. Two pneumatic-tired vehicles are coupled into a single vehicle capable of carrying 124 passengers at line speeds of 60 km/hr. During the course of the demonstration test program, VAL demonstrated that it could achieve high levels of dependability and availability and could perform safely under all perceivable conditions.

  2. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li, Xiang; Frush, Donald; Samei, Ehsan (Clinical Imaging Physics Group and Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705, United States; Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705, United States; Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705, United States; Departments of Physics and Biomedical Engineering, Duke University, Durham, North Carolina, United States)

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj that differed by up to 44% from effective dose
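
    The dose pipeline described above multiplies the dose-length product by a protocol-specific k-factor and then applies a size correction derived from patient thickness. The sketch below uses an assumed exponential size correction with an invented coefficient; the paper's actual conversion model and calibration are not reproduced here:

```python
import math

def effective_dose_mSv(k_factor, dlp_mGy_cm):
    """ED = k * DLP, with a protocol-specific k-factor in mSv/(mGy*cm)."""
    return k_factor * dlp_mGy_cm

def size_adjusted_dose(ed_mSv, thickness_cm, ref_thickness_cm=30.0, coeff=0.04):
    """Illustrative size adjustment: scale ED exponentially with the deviation
    of patient thickness from a reference size. The exponential form and the
    coefficient are assumptions for this sketch, not the paper's calibration."""
    return ed_mSv * math.exp(coeff * (ref_thickness_cm - thickness_cm))

dlp = 450.0   # dose-length product from the dose report (hypothetical value)
k = 0.015     # typical adult abdomen k-factor, mSv/(mGy*cm)
ed = effective_dose_mSv(k, dlp)
ed_adj = size_adjusted_dose(ed, thickness_cm=24.0)
print(f"ED = {ed:.2f} mSv, size-adjusted ED = {ed_adj:.2f} mSv")
```

A thinner-than-reference patient receives a larger adjusted dose per unit DLP, which is the direction of effect that produces the up-to-44% differences reported.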

  3. Beyond crosswalks: reliability of exposure assessment following automated coding of free-text job descriptions for occupational epidemiology.

    Science.gov (United States)

    Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L

    2014-05-01

    Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias of exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate the performance of a computer algorithm to translate the narrative description of occupational codes into standard classification of jobs (2010 Standard Occupational Classification) in an epidemiological context. The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that are in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probability of exceeding the OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). 
In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on presence of ambiguity in assigned job classification (κ
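
    Agreement between manual and automated exposure assignment is summarized with chance-corrected statistics such as Cohen's kappa. A sketch computing kappa on invented binary exposure calls (the data are illustrative, not the study's):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c]
                   for c in set(rater_a) | set(rater_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented exposure calls (1 = exposed to asthmagen): manual vs automated coding
manual    = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
automated = [1, 0, 0, 1, 0, 0, 0, 1, 1, 0]
kappa = cohens_kappa(manual, automated)
print(round(kappa, 2))  # 0.58 for this invented data: moderate agreement
```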

  4. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    Directory of Open Access Journals (Sweden)

    Alicja Puścian

    2014-04-01

    Full Text Available Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order and cognitive rigidity (higher-order. Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repetitive behaviors during reversal learning in mice in the automated IntelliCage system. During the reward-motivated place preference reversal learning, designed to assess cognitive abilities of mice, visits to the previously rewarded places were recorded to measure cognitive flexibility. Thereafter, emotional flexibility was assessed by measuring conditioned fear extinction. Additionally, to look for neuronal correlates of cognitive impairments, we measured CA3-CA1 hippocampal long term potentiation (LTP. To standardize the designed tests we used C57BL/6 and BALB/c mice, representing two genetic backgrounds, for induction of autism by prenatal exposure to the sodium valproate. We found impairments of place learning related to perseveration and no LTP impairments in C57BL/6 valproate-treated mice. In contrast, BALB/c valproate-treated mice displayed severe deficits of place learning not associated with perseverative behaviors and accompanied by hippocampal LTP impairments. Alterations of cognitive flexibility observed in C57BL/6 valproate-treated mice were related to neither restricted exploration pattern nor to emotional flexibility. Altogether, we showed that the designed tests of cognitive performance and perseverative behaviors are efficient and highly replicable. Moreover, the results suggest that genetic background is crucial for the behavioral effects of prenatal valproate treatment.

  5. Assessment of anti-Salmonella activity of boot dip samples.

    Science.gov (United States)

    Rabie, André J; McLaren, Ian M; Breslin, Mark F; Sayers, Robin; Davies, Rob H

    2015-01-01

    The introduction of pathogens from the external environment into poultry houses via the boots of farm workers and visitors presents a significant risk. The use of boot dips containing disinfectant to help prevent this from happening is common practice, but the effectiveness of these boot dips as a preventive measure can vary. The aim of this study was to assess the anti-Salmonella activity of boot dips that are being used on poultry farms. Boot dip samples were collected from commercial laying hen farms in the UK and tested within 24 hours of receipt at the laboratory to assess their anti-Salmonella activity. All boot dip samples were tested against a field strain of Salmonella enterica serovar Enteritidis using three test models: pure culture, paper disc surface matrix and yeast suspension model. Of the 112 boot dip samples tested 83.6% were effective against Salmonella in pure culture, 37.3% in paper disc surface matrix and 44.5% in yeast suspension model. Numerous factors may influence the efficacy of the disinfectants. Disinfectants used in the dips may not always be fully active against surface or organic matter contamination; they may be inaccurately measured or diluted to a concentration other than that specified or recommended; dips may not be changed regularly or may have been exposed to rain and other environmental elements. This study showed that boot dips in use on poultry farms are frequently ineffective. PMID:25650744

  6. Influence of commonly used primer systems on automated ribosomal intergenic spacer analysis of bacterial communities in environmental samples.

    Directory of Open Access Journals (Sweden)

    Witoon Purahong

    Full Text Available Due to the high diversity of bacteria in many ecosystems, their slow generation times, specific but mostly unknown nutrient requirements and syntrophic interactions, isolation-based approaches in microbial ecology mostly fail to describe microbial community structure. Thus, cultivation-independent techniques, which rely on directly extracted nucleic acids from the environment, are a widely used alternative. For example, bacterial automated ribosomal intergenic spacer analysis (B-ARISA) is one of the widely used methods for fingerprinting bacterial communities after PCR-based amplification of selected regions of the operon coding for rRNA genes using community DNA. However, B-ARISA alone does not provide any taxonomic information and the results may be severely biased by the primer set selection. Furthermore, amplified DNA stemming from mitochondrial or chloroplast templates might strongly bias the obtained fingerprints. In this study, we determined the applicability of three different B-ARISA primer sets to the study of bacterial communities. The results from in silico analysis harnessing publicly available sequence databases showed that all three primer sets tested are specific to bacteria, but only two primer sets assure high bacterial taxa coverage (1406f/23Sr and ITSF/ITSReub). Considering the study of bacteria at the plant interface, the primer set ITSF/ITSReub was found to amplify (in silico) sequences of some important crop species such as Sorghum bicolor and Zea mays. Bacterial genera and plant species potentially amplified by the different primer sets are given. These data were confirmed when DNA extracted from soil and plant samples was analyzed. The presented information could be useful when interpreting existing B-ARISA results and planning B-ARISA experiments, especially when plant DNA can be expected.
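
    The in silico coverage analysis described above amounts to matching degenerate primer sequences against a reference database. The sketch below is not the authors' pipeline; the primer and sequences are invented for illustration, using the standard IUPAC degenerate-base codes.

```python
import re

# IUPAC degenerate nucleotide codes mapped to regex character classes
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "[AG]", "Y": "[CT]", "S": "[CG]", "W": "[AT]",
    "K": "[GT]", "M": "[AC]", "B": "[CGT]", "D": "[AGT]",
    "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]",
}

def primer_to_regex(primer):
    """Compile a degenerate primer into a regular expression."""
    return re.compile("".join(IUPAC[base] for base in primer.upper()))

def coverage(primer, sequences):
    """Fraction of reference sequences with a perfect primer match."""
    pattern = primer_to_regex(primer)
    hits = sum(1 for seq in sequences if pattern.search(seq.upper()))
    return hits / len(sequences)

# Toy "database" of three reference sequences
seqs = ["TTGTACACACCGCCC", "TTGCACACACCGCCC", "GGGGGGGGGGGGGGG"]
print(coverage("TTGYACACACCGCCC", seqs))  # Y matches C or T -> 2 of 3 hit
```

    A real coverage study would also score mismatches and amplicon length, but the principle, translating degeneracy into alternatives and counting hits, is the same.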

  7. Equilibrium sampling for a thermodynamic assessment of contaminated sediments

    DEFF Research Database (Denmark)

    of polluted sediments. Glass jars with µm-thin silicone coatings on the inner walls can be used for ex situ equilibration while a device housing several silicone-coated fibers can be used for in situ equilibration. In both cases, parallel sampling with varying silicone thicknesses can be applied to confirm...... of the biota relative to the sediment. Furthermore, concentrations in lipid at thermodynamic equilibrium with sediment (Clip?Sed) can be calculated via lipid/silicone partition ratios CSil × KLip:Sil, which has been done in studies with limnic, river and marine sediments. The data can then be compared to lipid...... will focus on the latest developments in equilibrium sampling concepts and methods. Further, we will explain how these approaches can provide a new basis for a thermodynamic assessment of polluted sediments.

  8. Assessing Office Automation Effect on Performance Using Balanced Scorecard approach Case Study: Esfahan Education Organizations and Schools

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Moshref Javadi

    2013-09-01

    Full Text Available Survival of each organization depends on its dynamic interaction with the internal and external environment. Given the development of technology and its effect on organizational performance, organizations need to implement these technologies in order to be successful. This research explores the relationship between the implementation of office automation and performance using the structural equation modeling (SEM) method. The study is an applied survey with a descriptive method. The statistical population consisted of managers of offices and schools of the Ministry of Education in Esfahan and Lenjan city; 130 individuals were selected randomly as the sample. Content and construct validity were used to evaluate the validity of the questionnaire, and the relations between the variables of this research were confirmed based on the results of the SEM method. Structural equation modeling was used for data analysis. Based on the obtained results, the estimated standardized effect of office automation on performance was 83%. The results of the main hypothesis test confirm that the office automation in place in the studied organizations can improve organizational performance.

  9. Performance assessment of automated tissue characterization for prostate H and E stained histopathology

    Science.gov (United States)

    DiFranco, Matthew D.; Reynolds, Hayley M.; Mitchell, Catherine; Williams, Scott; Allan, Prue; Haworth, Annette

    2015-03-01

    Reliable automated prostate tumor detection and characterization in whole-mount histology images is sought in many applications, including post-resection tumor staging and as ground-truth data for multi-parametric MRI interpretation. In this study, an ensemble-based supervised classification algorithm for high-resolution histology images was trained on tile-based image features including histogram and gray-level co-occurrence statistics. The algorithm was assessed using different combinations of H and E prostate slides from two separate medical centers and at two different magnifications (400x and 200x), with the aim of applying tumor classification models to new data. Slides from both datasets were annotated by expert pathologists in order to identify homogeneous cancerous and non-cancerous tissue regions of interest, which were then categorized as (1) low-grade tumor (LG-PCa), including Gleason 3 and high-grade prostatic intraepithelial neoplasia (HG-PIN), (2) high-grade tumor (HG-PCa), including various Gleason 4 and 5 patterns, or (3) non-cancerous, including benign stroma and benign prostatic hyperplasia (BPH). Classification models for both LG-PCa and HG-PCa were separately trained using a support vector machine (SVM) approach, and per-tile tumor prediction maps were generated from the resulting ensembles. Results showed high sensitivity for predicting HG-PCa, with an AUC up to 0.822 using training data from both medical centres, while LG-PCa showed a lower AUC of 0.763 with the same training data. Visual inspection of cancer probability heatmaps from 9 patients showed that 17/19 tumors were detected, and HG-PCa generally produced fewer false positives than LG-PCa.

  10. Fully Automated Assessment of the Severity of Parkinson's Disease from Speech.

    Science.gov (United States)

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2015-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson's disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks: the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and found that ridge regression performs better than lasso and support vector regression for our task. We refined the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than the diadochokinetic or sustained phonation tasks. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in the clinical domain. PMID:25382935
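
    The regularized regression the authors favor, ridge regression, has a simple closed form. Below is a toy sketch on synthetic data, not openSMILE features or the study's recordings; all names and values are illustrative.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*P)^-1 X'y.
    A column of ones is appended for the intercept; the intercept
    term is left unpenalized."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    d = Xb.shape[1]
    P = np.eye(d)
    P[-1, -1] = 0.0  # do not shrink the intercept
    return np.linalg.solve(Xb.T @ Xb + lam * P, Xb.T @ y)

def ridge_predict(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xb @ w

# Synthetic regression problem with known coefficients
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = ridge_fit(X, y, lam=0.1)
mae = np.mean(np.abs(ridge_predict(X, w) - y))
print("coefficients:", np.round(w[:5], 2), "MAE:", round(mae, 3))
```

    The penalty `lam` trades variance for bias; in a study like the one above it would be tuned by cross-validation across clinics rather than fixed.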

  11. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    Energy Technology Data Exchange (ETDEWEB)

    Kertesz, Vilmos [ORNL; Van Berkel, Gary J [ORNL

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.

  12. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    Science.gov (United States)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler, leading to a reduction of manual work and increased quality and throughput. PMID:21609694

  13. Automated determination of nitrate plus nitrite in aqueous samples with flow injection analysis using vanadium (III) chloride as reductant.

    Science.gov (United States)

    Wang, Shu; Lin, Kunning; Chen, Nengwang; Yuan, Dongxing; Ma, Jian

    2016-01-01

    Determination of nitrate in aqueous samples is an important analytical objective for environmental monitoring and assessment. Here we report the first automatic flow injection analysis (FIA) method for nitrate (plus nitrite) using VCl3 as reductant instead of the well-known but toxic cadmium column for reducing nitrate to nitrite. The reduced nitrate plus the nitrite originally present in the sample react with the Griess reagent (sulfanilamide and N-1-naphthylethylenediamine dihydrochloride) under acidic conditions. The resulting pink azo dye can be detected at 540 nm. The Griess reagent and VCl3 are used as a single mixed reagent solution to simplify the system. The various parameters of the FIA procedure, including reagent composition, temperature, volume of the injection loop, and flow rate, were carefully investigated and optimized via univariate experimental design. Under the optimized conditions, the linear range and detection limit of this method are 0-100 µM (R² = 0.9995) and 0.1 µM, respectively. The targeted analytical range can be easily extended to higher concentrations by selecting alternative detection wavelengths or increasing the flow rate. The FIA system provides a sample throughput of 20 h⁻¹, which is much higher than that of previously reported manual methods based on the same chemistry. National reference solutions and different kinds of aqueous samples were analyzed with our method as well as the cadmium column reduction method. The results from our method agree well with both the certified values and the results from the cadmium column reduction method (no significant difference at P = 0.95). The spiked recovery varies from 89% to 108% for samples with different matrices, showing insignificant matrix interference in this method. PMID:26695325
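
    Reporting a linear range with an R² value implies a calibration step along these lines: fit a line to standard/absorbance pairs, then invert it to report unknowns. The sketch below uses invented numbers, not the paper's data.

```python
import numpy as np

# Hypothetical calibration standards (µM) and absorbance readings at 540 nm
conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
absorbance = np.array([0.002, 0.052, 0.101, 0.251, 0.499, 1.001])

# Least-squares calibration line: absorbance = slope * conc + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)

# Coefficient of determination (R^2) of the fit
pred = slope * conc + intercept
ss_res = np.sum((absorbance - pred) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

def to_concentration(a):
    """Invert the calibration line to report a sample concentration."""
    return (a - intercept) / slope

print("R^2:", round(r2, 5), "0.25 AU ->", round(to_concentration(0.25), 1), "µM")
```

    In practice the detection limit would be derived from the blank's standard deviation (e.g. 3σ/slope) rather than read off the fit.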

  14. Risk Assessment on the Transition Program for Air Traffic Control Automation System Upgrade

    Directory of Open Access Journals (Sweden)

    Li Dong Bin

    2016-01-01

    Full Text Available We analyzed the safety risks of the transition program for an Air Traffic Control (ATC) automation system upgrade using the event tree analysis method. We decomposed the progression of the three transition phases and built the event trees corresponding to each stage, then determined the probability of success of each factor and calculated the probability of success of the ATC automation system upgrade transition. In conclusion, we illustrate the transition program's safety risk based on these results.
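
    On the all-success path of an event tree, the phase succeeds only if every contributing factor succeeds, so probabilities multiply along the branch and across phases. A minimal sketch; the phase names and probabilities below are hypothetical, not taken from the paper.

```python
# Hypothetical per-factor success probabilities for three transition phases
phases = {
    "pre-transition":  [0.99, 0.98, 0.995],
    "switch-over":     [0.97, 0.99],
    "post-transition": [0.995, 0.99, 0.98],
}

def phase_success(probs):
    """Probability that every factor on the success branch succeeds."""
    p = 1.0
    for x in probs:
        p *= x
    return p

overall = 1.0
for name, probs in phases.items():
    p = phase_success(probs)
    print(f"{name}: {p:.4f}")
    overall *= p

print(f"overall transition success: {overall:.4f}")
print(f"overall risk of failure:    {1 - overall:.4f}")
```

    A full event tree would also enumerate the failure branches and their consequences; this sketch only scores the single path where every factor succeeds.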

  15. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    Science.gov (United States)

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis.
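
    The core of what AutoVAR automates, fitting candidate VAR models and ranking them by information criteria, can be sketched with a least-squares VAR fit. This is a simplified illustration under assumed formulas, not AutoVAR's actual code; the simulated data and lag-order search are hypothetical.

```python
import numpy as np

def fit_var(y, p):
    """Least-squares fit of a VAR(p) to data y of shape (T, k).
    Returns the residual covariance, parameter count and sample size."""
    T, k = y.shape
    rows = T - p
    # Regressors: intercept plus p lags of all series
    Z = np.hstack([np.ones((rows, 1))] +
                  [y[p - i - 1:T - i - 1] for i in range(p)])
    Y = y[p:]
    B, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ B
    sigma = resid.T @ resid / rows
    return sigma, B.size, rows

def var_ic(sigma, n_params, n_obs):
    """Multivariate AIC and BIC based on log|Sigma| plus a penalty."""
    _, logdet = np.linalg.slogdet(sigma)
    aic = logdet + 2.0 * n_params / n_obs
    bic = logdet + np.log(n_obs) * n_params / n_obs
    return aic, bic

# Simulate a bivariate VAR(1) and score candidate lag orders
rng = np.random.default_rng(1)
A = np.array([[0.5, 0.1], [0.0, 0.4]])
y = np.zeros((400, 2))
for t in range(1, 400):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=2)

aics, bics = {}, {}
for p in (1, 2, 3, 4):
    aics[p], bics[p] = var_ic(*fit_var(y, p))
best = min(bics, key=bics.get)
print("BIC-selected lag order:", best)
```

    AutoVAR additionally screens models for residual diagnostics and reports Granger-causality tests; the sketch stops at the information-criterion ranking.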

  16. Performance of automated software in the assessment of segmental left ventricular function in cardiac CT: Comparison with cardiac magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Rui [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Capital Medical University, Department of Radiology, Beijing Anzhen Hospital, Beijing (China); Meinel, Felix G. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Canstein, Christian [Siemens Medical Solutions USA, Malvern, PA (United States); Spearman, James V. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); De Cecco, Carlo N. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome 'Sapienza', Departments of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2015-12-15

    To evaluate the accuracy, reliability and time-saving potential of a novel cardiac CT (CCT)-based, automated software for the assessment of segmental left ventricular function compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated. Segmental LV wall motion was automatically calculated and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in both systolic and diastolic phases on the polar map, CCT, and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5 %, 64.8 ± 1.4 % and 67.1 ± 1.4 %, respectively. Agreement for the assessment of LV wall motion between CCT, CMR and polar maps was good. Bland-Altman plots and ICC indicated good agreement between CCT, CMR and automated polar maps for the diastolic and systolic segmental wall thickness and thickening. The processing time using the polar map was significantly shorter than with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides similar measurements to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. (orig.)
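
    The wall-thickening figures above follow from the paired thickness measurements, assuming the common definition of systolic thickening relative to the diastolic wall. A one-line sketch (the input values are merely in the range reported above):

```python
def wall_thickening_percent(diastolic_mm, systolic_mm):
    """Segmental wall thickening, assuming the common definition
    (systolic - diastolic) / diastolic * 100."""
    return (systolic_mm - diastolic_mm) / diastolic_mm * 100.0

# Illustrative values in the range reported in the abstract
print(round(wall_thickening_percent(9.2, 14.9), 1))  # -> 62.0
```

    Per-segment values computed this way are what the polar map colour-codes; averaging them over the standard 17-segment model gives the summary figures quoted in the abstract.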

  17. Assessing tiger population dynamics using photographic capture-recapture sampling

    Science.gov (United States)

    Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E.

    2006-01-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, 'robust design' capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of γ′ = γ″ = 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size Nt varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as λ = 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, Bt, varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km2 to 21.73 ± 1.7 tigers/100 km2 during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole.
Our results are consistent with the hypothesis that protected wild tiger populations can remain
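
    The robust-design models used in the study are considerably more sophisticated, but the core capture-recapture idea can be illustrated with the simplest two-session estimator, Chapman's bias-corrected form of the Lincoln-Petersen index. The session counts below are hypothetical, not from the study.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator:
    N = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1,
    where n1 animals are photo-captured in the first session, n2 in
    the second session, and m2 of the second-session animals are
    recaptures recognized from the first session."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical camera-trap sessions
n1, n2, m2 = 30, 25, 10
print(round(chapman_estimate(n1, n2, m2), 1))  # -> 72.3
```

    The estimator assumes a closed population with equal catchability; relaxing those assumptions (heterogeneity, trap response, temporary emigration) is exactly what the likelihood-based robust-design framework in the paper is for.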

  18. Assessing Racial Microaggression Distress in a Diverse Sample.

    Science.gov (United States)

    Torres-Harding, Susan; Turner, Tasha

    2015-12-01

    Racial microaggressions are everyday subtle or ambiguous racially related insults, slights, mistreatment, or invalidations. Racial microaggressions are a type of perceived racism that may negatively impact the health and well-being of people of color in the United States. This study examined the reliability and validity of the Racial Microaggression Scale distress subscales, which measure the perceived stressfulness of six types of microaggression experiences in a racially and ethnically diverse sample. These subscales exhibited acceptable to good internal consistency. The distress subscales also evidenced good convergent validity; the distress subscales were positively correlated with additional measures of stressfulness due to experiencing microaggressions or everyday discrimination. When controlling for the frequency of one's exposure to microaggression incidents, some racial/ethnic group differences were found. Asian Americans reported comparatively lower distress and Latinos reported comparatively higher distress in response to Foreigner, Low-Achieving, Invisibility, and Environmental microaggressions. African Americans reported higher distress than the other groups in response to Environmental microaggressions. Results suggest that the Racial Microaggressions Scale distress subscales may aid health professionals in assessing the distress elicited by different types of microaggressions. In turn, this may facilitate diagnosis and treatment planning in order to provide multiculturally competent care for African American, Latino, and Asian American clients. PMID:25237154

  19. Assessing uncertainty in DNA evidence caused by sampling effects.

    Science.gov (United States)

    Curran, J M; Buckleton, J S; Triggs, C M; Weir, B S

    2002-01-01

    Sampling error estimation in forensic DNA testimony is discussed: is an estimate necessary, and how should it be made? The authors find that all modern methods have areas of strength and weakness. The assessment of which is 'best' is subjective and depends on the performance of the method, the type of problem (criminal work or paternity), the database size, and the availability of computing software and support. The authors preferred the highest posterior density approach for performance; however, the other methods all have areas where their performance is adequate. For single-contributor stains, normal approximation methods are suitable, as are the bootstrap and the highest posterior density method. For multiple-contributor stains or other complex situations, the match probability expressions become quite complex and it may not be possible to derive the necessary variance expressions. The highest posterior density method or the bootstrap provides a better general method, with non-zero theta. The size-bias correction and the factor-of-10 approaches may be considered acceptable by many forensic scientists as long as their limitations are understood.
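
    Of the methods compared, the bootstrap is the easiest to sketch: resample the allele database with replacement and recompute the match probability each time. The toy below is a single-locus illustration under Hardy-Weinberg assumptions with an invented database, far simpler than casework with multiple loci and a non-zero theta correction.

```python
import random

def match_probability(db, a, b):
    """Single-locus genotype frequency under Hardy-Weinberg,
    from allele counts in a reference database."""
    pa = db.count(a) / len(db)
    pb = db.count(b) / len(db)
    return pa * pa if a == b else 2 * pa * pb

def bootstrap_interval(db, a, b, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap interval for the match probability,
    resampling the allele database with replacement."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        resample = [rng.choice(db) for _ in db]
        stats.append(match_probability(resample, a, b))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical allele database for one locus (100 alleles)
db = ["12"] * 30 + ["13"] * 50 + ["14"] * 20
point = match_probability(db, "12", "13")   # 2 * 0.3 * 0.5 = 0.3
lo, hi = bootstrap_interval(db, "12", "13")
print(point, round(lo, 3), round(hi, 3))
```

    The width of the interval is the sampling-error component the abstract debates reporting; the highest posterior density approach the authors prefer replaces the percentile step with a Bayesian posterior over allele frequencies.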

  20. Assessment of Automated Analyses of Cell Migration on Flat and Nanostructured Surfaces

    Directory of Open Access Journals (Sweden)

    Hans A Kestler

    2012-07-01

    Full Text Available Motility studies of cells often rely on computer software that analyzes time-lapse recorded movies and establishes cell trajectories fully automatically. This raises the question of reproducibility of results, since different programs could yield significantly different results of such automated analysis. The fact that the segmentation routines of such programs are often challenged by nanostructured surfaces makes the question more pertinent. Here we illustrate how it is possible to track cells on bright field microscopy images with image analysis routines implemented in an open-source cell tracking program, PACT (Program for Automated Cell Tracking). We compare the automated motility analysis of three cell tracking programs, PACT, Autozell, and TLA, using the same movies as input for all three programs. We find that different programs track overlapping, but different subsets of cells due to different segmentation methods. Unfortunately, population averages based on such different cell populations differ significantly in some cases. Thus, results obtained with one software package are not necessarily reproducible by other software.

  1. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. PMID:26806135

  2. Automated ambulatory assessment of cognitive performance, environmental conditions, and motor activity during military operations

    Science.gov (United States)

    Lieberman, Harris R.; Kramer, F. Matthew; Montain, Scott J.; Niro, Philip; Young, Andrew J.

    2005-05-01

    Until recently scientists had limited opportunities to study human cognitive performance in non-laboratory, fully ambulatory situations. Recently, advances in technology have made it possible to extend behavioral assessment to the field environment. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device, now widely employed, can acquire minute-by-minute information on an individual's level of motor activity. Actigraphs can, with reasonable accuracy, distinguish sleep from waking, the most critical and basic aspect of human behavior. However, rapid technologic advances have provided the opportunity to collect much more information from fully ambulatory humans. Our laboratory has developed a series of wrist-worn devices, not much larger than a watch, which can assess simple and choice reaction time, vigilance and memory. In addition, the devices can concurrently assess motor activity with much greater temporal resolution than the standard actigraph. Furthermore, they continuously monitor multiple environmental variables including temperature, humidity, sound and light. We have employed these monitors during training and simulated military operations to collect information that would typically be unavailable under such circumstances. In this paper we describe various versions of the vigilance monitor and how each successive version extended the capabilities of the device. Samples of data from several studies are presented, including studies conducted in harsh field environments during simulated infantry assaults, a Marine Corps Officer training course and mechanized infantry (Stryker) operations. The monitors have been useful for documenting environmental conditions experienced by wearers, studying patterns of sleep and activity and examining the effects of nutritional manipulations on warfighter performance.

  3. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    to handle 1,073 of 1,092 (98.3%) samples of whole blood from forensic material, including postmortem samples, without any need for repeating sample preparation. Only three samples required special treatment such as dilution. The addition of internal and calibration standards were validated by pipetting...

  4. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    Science.gov (United States)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of the scanned microscopic image, a FISH-probed specimen needs to be scanned in multiple layers, which generates huge image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, representing images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D confocal image. The CAD scheme was applied to each confocal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In the four scanned specimen slides, CAD generated 1676 confocal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cytogeneticists in detecting cervical cancers.
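
    The top-hat transform used for spot detection subtracts a morphological opening from the image, so small bright features survive while smooth background is removed. The NumPy-only sketch below is a generic illustration of the technique, not the authors' implementation; the toy image and threshold are invented.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def _filter(img, size, reduce_fn):
    """Grey-level min/max filter with a size x size square window."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    windows = sliding_window_view(padded, (size, size))
    return reduce_fn(windows, axis=(-2, -1))

def white_tophat(img, size=5):
    """White top-hat: image minus its morphological opening
    (erosion then dilation) with a square structuring element.
    Bright spots smaller than the element, such as FISH signal
    dots, survive; the smooth background is removed."""
    eroded = _filter(img, size, np.min)
    opened = _filter(eroded, size, np.max)
    return img - opened

# Toy image: smooth background gradient plus one bright 2x2 "signal"
img = np.tile(np.linspace(0, 1, 32), (32, 1))
img[10:12, 20:22] += 5.0

spots = white_tophat(img, size=5) > 2.5
print(np.argwhere(spots))  # only the four signal pixels survive
```

    Libraries such as SciPy (`scipy.ndimage.white_tophat`) provide the same operation ready-made; the manual version above just makes the erosion/dilation steps explicit.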

  5. An approach for the automated risk assessment of structural differences between spreadsheets (DiffXL)

    CERN Document Server

    Hunt, John

    2009-01-01

    This paper outlines an approach to manage and quantify the risks associated with changes made to spreadsheets. The methodology focuses on structural differences between spreadsheets and suggests a technique by which a risk analysis can be achieved in an automated environment. The paper offers an example that demonstrates how contiguous ranges of data can be mapped into a generic list of formulae, data and metadata. The example then shows that comparison of these generic lists can establish the structural differences between spreadsheets and quantify the level of risk that each change has introduced. Lastly the benefits, drawbacks and limitations of the technique are discussed in a commercial context.

  6. Assessing acceptance sampling application in manufacturing electrical and electronic products

    Directory of Open Access Journals (Sweden)

    B.M. Deros

    2008-12-01

    Full Text Available Purpose: This paper discusses the use of acceptance sampling technique as a practical tool for quality assuranceapplications to decide whether the lot is to be accepted or rejected.Design/methodology/approach: In Malaysia, single attribute acceptance sampling plan is widely practicedfor quality assurance purposes in manufacturing companies. Literature showed that majority of past studies onacceptance sampling had focused on the development and establishment of new methods for acceptance-samplingapplication. However, there is none that had investigated the relationship between acceptance sampling planselection and effectiveness of the selection. Therefore, in this study, the authors had analyzed the effectivenessthe acceptance sampling plan application method and its implementation problems in manufacturing electricaland electronics products. The study was conducted by using case study methodology at three manufacturingcompanies’ coded names: company A, B and C. In this paper, the authors would like to share the case studycompanies’ experienced of acceptance sampling plan selection and difficulties that they had faced during thecourse of implementing acceptance sampling in their production lines.Findings: The result from the three case study companies showed by implementing acceptance samplingthey could easily investigate and diagnose their suppliers’ product quality immediately upon their arrivalat the company premise.Practical implications: The continuous improvement and review of acceptance sampling plan is important toimprove the products quality and ensure continuous customer satisfaction.Originality/value: All the three case study companies agreed that acceptance sampling implementation hadimproved their product’s quality in the market place.

  7. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune;

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO...... 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes minimizing the risk of misplacing samples. The tubes that originally contained...... the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was setup in 96-well microtiter plates. The methods were validated for the kits: AmpFlSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI...

  8. A timer inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California

    Science.gov (United States)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrating that a timber inventory based on manual and automated analysis of ERTS-1, supporting aircraft data and ground data was made using multistage sampling techniques. The inventory proved to be a timely, cost effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. Costs per acre for the inventory procedure at 1.1 cent/acre compared favorably with the costs of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low altitude photo plots indicated no significant differences in the overall classification accuracies.

  9. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    An automated air sampling station has recently been developed by Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-lime radiation monitoring and sample authentication whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to the headquarters without the presence of authorized personnel in the monitoring site. The operation of the station and the remote monitoring system were reliable. (orig.)

  10. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  11. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    Science.gov (United States)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients  >  0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r  =  0.84) and anterior (r  =  0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F  angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the potential to improve analyses of cam-type lesions of the FHN junction for large-scale morphometric and clinical MR

  12. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    International Nuclear Information System (INIS)

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint.Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18–49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system.High intra- and inter-rater reliability (intra-class correlation coefficients  >  0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r  =  0.84) and anterior (r  =  0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F  <  0.01, p  =  0.98).Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the

  13. Equilibrium sampling for a thermodynamic assessment of contaminated sediments

    DEFF Research Database (Denmark)

    Mayer, Philipp; Nørgaard Schmidt, Stine; Mäenpää, Kimmo;

    Hydrophobic organic contaminants (HOCs) reaching the aquatic environment are largely stored in sediments. The risk of contaminated sediments is challenging to assess since traditional exhaustive extraction methods yield total HOC concentrations, whereas freely dissolved concentrations (Cfree...

  14. Mass asymmetry and tricyclic wobble motion assessment using automated launch video analysis

    Institute of Scientific and Technical Information of China (English)

    Ryan DECKER; Joseph DONINI; William GARDNER; Jobin JOHN; Walter KOENIG

    2016-01-01

    This paper describes an approach to identify epicyclic and tricyclic motion during projectile flight caused by mass asymmetries in spin-stabilized projectiles. Flight video was captured following projectile launch of several M110A2E1 155 mm artillery projectiles. These videos were then analyzed using the automated flight video analysis method to attain their initial position and orientation histories. Examination of the pitch and yaw histories clearly indicates that in addition to epicyclic motion’s nutation and precession oscillations, an even faster wobble amplitude is present during each spin revolution, even though some of the amplitudes of the oscillation are smaller than 0.02 degree. The results are compared to a sequence of shots where little appreciable mass asymmetries were present, and only nutation and precession frequencies are predominantly apparent in the motion history results. Magnitudes of the wobble motion are estimated and compared to product of inertia measurements of the asymmetric projectiles.

  15. A Robust and Automated Hyperspectral Damage Assessment System Under Varying Illumination Conditions and Viewing Geometry Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Some target signatures of interest in drought monitoring, flooding assessment, fire damage assessment, coastal changes, urban changes, etc. may need to be tracked...

  16. Exploring trait assessment of samples, persons, and cultures.

    Science.gov (United States)

    McCrae, Robert R

    2013-01-01

    I present a very broad overview of what I have learned about personality trait assessment at different levels and offer some views on future directions for research and clinical practice. I review some basic principles of scale development and argue that internal consistency has been overemphasized; more attention to retest reliability is needed. Because protocol validity is crucial for individual assessment and because validity scales have limited utility, I urge combining assessments from multiple informants, and I present some statistical tools for that purpose. As culture-level traits, I discuss ethos, national character stereotypes, and aggregated personality traits, and summarize evidence for the validity of the latter. Our understanding of trait profiles of cultures is limited, but it can guide future exploration. PMID:23924211

  17. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspirations from topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus local topological view of the simulation space, comparing several different strategies for adaptive sampling in both

  18. Genesis Solar Wind Collector Cleaning Assessment: 60366 Sample Case Study

    Science.gov (United States)

    Goreva, Y. S.; Gonzalez, C. P.; Kuhlman, K. R.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, M. C.; Burkett, P. J.

    2014-01-01

    In order to recognize, localize, characterize and remove particle and thin film surface contamination, a small subset of Genesis mission collector fragments are being subjected to extensive study via various techniques [1-5]. Here we present preliminary results for sample 60336, a Czochralski silicon (Si-CZ) based wafer from the bulk array (B/C).

  19. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Brink, Carsten, E-mail: carsten.brink@rsyd.dk [Institute of Clinical Research, University of Southern Denmark (Denmark); Laboratory of Radiation Physics, Odense University Hospital (Denmark); Bernchou, Uffe [Institute of Clinical Research, University of Southern Denmark (Denmark); Laboratory of Radiation Physics, Odense University Hospital (Denmark); Bertelsen, Anders [Laboratory of Radiation Physics, Odense University Hospital (Denmark); Hansen, Olfred [Institute of Clinical Research, University of Southern Denmark (Denmark); Department of Oncology, Odense University Hospital (Denmark); Schytte, Tine [Department of Oncology, Odense University Hospital (Denmark); Bentzen, Soren M. [Division of Biostatistics and Bioinformatics, University of Maryland Greenebaum Cancer Center, and Department of Epidemiology and Public Health, University of Maryland School of Medicine, Baltimore, MD (United States)

    2014-07-15

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early into the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival. This was significant on patient with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response adaptive therapy.

  20. Automation system risk assessment; Gestao de riscos de sistemas de automacao

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Felipe; Furuta, Margarete [PricewaterhouseCoopers, Sao Paulo, SP (Brazil)

    2008-07-01

    In spite of what was learnt with the history of several industrial accidents and the initiative in relation to the protection measures taken by several Organizations, even today extremely serious accidents keep happening in the automation environment. If on one hand, the growing competitive demands an increase in the productivity, which several times is only possible through more complex processes that make the facilities operate in their limits, on the other, it is noticeable that the control, automation and security related to these more complex processes are also more difficult to manage. The incessant investigation of past accidents resulted in the prevention of specific dangerous events in relation to industrial facilities but it also brought to light the importance of actions related to the Risk Management Process. Without doubt, the consequences resulting from the materialization of an event can reach disastrous and unrecoverable levels, taking into account the comprehensiveness of the potential risks. Studies carried out by international entities illustrate that the inadequate management of risks is the factor that contributes more for the occurrence of accidents. The initial phase of the risk management results from the analysis of the risks inherent to the process (e.g. to determine the probability of occurring each different potential failure), the study of the consequences if these failures occur, the definition of the risk considered acceptable according to the appetite for risks established by the Organization, and the identification of the action on the risk that can vary in a spectrum that can involve decreasing, transferring, avoiding or accepting the risk. This work has as objective to exploit the aspects for the implementation of Risk Management in the Oil and Gas segment. 
The study also seeks to explicit, based on the systematic registry of the measured items, how it is possible to evaluate the financial exposure of the risk to which a project

  1. Universality of Generalized Bunching and Efficient Assessment of Boson Sampling

    Science.gov (United States)

    Shchesnovich, V. S.

    2016-03-01

    It is found that identical bosons (fermions) show a generalized bunching (antibunching) property in linear networks: the absolute maximum (minimum) of the probability that all N input particles are detected in a subset of K output modes of any nontrivial linear M -mode network is attained only by completely indistinguishable bosons (fermions). For fermions K is arbitrary; for bosons it is either (i) arbitrary for only classically correlated bosons or (ii) satisfies K ≥N (or K =1 ) for arbitrary input states of N particles. The generalized bunching allows us to certify in a polynomial in N number of runs that a physical device realizing boson sampling with an arbitrary network operates in the regime of full quantum coherence compatible only with completely indistinguishable bosons. The protocol needs only polynomial classical computations for the standard boson sampling, whereas an analytic formula is available for the scattershot version.

  2. Development of Genesis Solar Wind Sample Cleanliness Assessment: Initial Report on Sample 60341 Optical Imagery and Elemental Mapping

    Science.gov (United States)

    Gonzalez, C. P.; Goreva, Y. S.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, P. J.; Burkett, P. J.

    2014-01-01

    Since 2005 the Genesis science team has experimented with techniques for removing the contaminant particles and films from the collection surface of the Genesis fragments. A subset of 40 samples have been designated as "cleaning matrix" samples. These are small samples to which various cleaning approaches are applied and then cleanliness is assessed optically, by TRXRF, SEM, ToF-SIMS, XPS, ellipsometry or other means [1-9]. Most of these sam-ples remain available for allocation, with cleanliness assessment data. This assessment allows evaluation of various cleaning techniques and handling or analytical effects. Cleaning techniques investigated by the Genesis community include acid/base etching, acetate replica peels, ion beam, and CO2 snow jet cleaning [10-16]. JSC provides surface cleaning using UV ozone exposure and ultra-pure water (UPW) [17-20]. The UPW rinse is commonly used to clean samples for handling debris between processing by different researchers. Optical microscopic images of the sample taken before and after UPW cleaning show what has been added or removed during the cleaning process.

  3. A new approach to automated assessment of fractionation of endocardial electrograms during atrial fibrillation

    International Nuclear Information System (INIS)

    Complex fractionated atrial electrograms (CFAEs) may represent the electrophysiological substrate for atrial fibrillation (AF). Progress in signal processing algorithms to identify sites of CFAEs is crucial for the development of AF ablation strategies. A novel algorithm for automated description of fractionation of atrial electrograms (A-EGMs) based on the wavelet transform has been proposed. The algorithm was developed and validated using a representative set of 1.5 s A-EGM (n = 113) ranked by three experts into four categories: 1—organized atrial activity; 2—mild; 3—intermediate; 4—high degree of fractionation. A tight relationship between a fractionation index and expert classification of A-EGMs (Spearman correlation ρ = 0.87) was documented with a sensitivity of 82% and specificity of 90% for the identification of highly fractionated A-EGMs. This operator-independent description of A-EGM complexity may be easily incorporated into mapping systems to facilitate CFAE identification and to guide AF substrate ablation

  4. An automated method for assessing routine radiographs of patients with total hip replacements.

    Science.gov (United States)

    Redhead, A L; Kotcheff, A C; Taylor, C J; Porter, M L; Hukins, D W

    1997-01-01

    This paper describes a new, fully automated method of locating objects on radiographs of patients with total joint replacements (TJRs). A statistical computer model, known as an active shape model, was trained to identify the position of the femur, pelvis, stem and cup marker wire on radiographs of patients with Charnley total hip prostheses. Once trained, the model was able to locate these objects through a process of automatic image searching, despite their appearance depending on the orientation and anatomy of the patient. Experiments were carried out to test the accuracy with which the model was able to fit to previously unseen data and with which reference points could be calculated from the model points. The model was able to locate the femur and stem with a mean error of approximately 0.8 mm and a 95 per cent confidence limit of 1.7 mm. Once the model had successfully located these objects, the midpoint of the stem head could be calculated with a mean error of approximately 0.2 mm. Although the model has been trained on Charnley total hip replacements, the method is generic and so can be applied to radiographs of patients with any TJR. This paper shows that computer models can form the basis of a quick, automatic method of taking measurements from standard clinical radiographs.

  5. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    Directory of Open Access Journals (Sweden)

    Colin J Torney

    Full Text Available Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future.

  6. A neural-symbolic system for automated assessment in training simulators - A position paper

    NARCIS (Netherlands)

    Penning, H.L.H. de; Kappé, B.; Bosch, K. van den

    2009-01-01

    Performance assessment in training simulators is a complex task. It requires monitoring and interpreting the student’s behaviour in the simulator using knowledge of the training task, the environment and a lot of experience. Assessment in simulators is therefore generally done by human observers. To

  7. Locoregional control of non-small cell lung cancer in relation to automated early assessment of tumor regression on cone beam computed tomography

    DEFF Research Database (Denmark)

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders;

    2014-01-01

    PURPOSE: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its...

  8. Automated image segmentation and registration of vessel wall MRI for quantitative assessment of carotid artery vessel wall dimensions and plaque composition

    NARCIS (Netherlands)

    Klooster, Ronald van 't

    2014-01-01

    The main goal of this thesis was to develop methods for automated segmentation, registration and classification of the carotid artery vessel wall and plaque components using multi-sequence MR vessel wall images to assess atherosclerosis. First, a general introduction into atherosclerosis and differe

  9. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    Science.gov (United States)

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-01

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  10. Automated large scale parameter extraction of road-side trees sampled by a laser mobile mapping system

    NARCIS (Netherlands)

    Lindenbergh, R.C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-01-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surrounding including notable roadsid

  11. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    Science.gov (United States)

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414

  12. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    Energy Technology Data Exchange (ETDEWEB)

    Rosato, Antonio [University of Florence, Department of Chemistry and Magnetic Resonance Center (Italy); Vranken, Wim [Vrije Universiteit Brussel, Structural Biology Brussels (Belgium); Fogh, Rasmus H.; Ragan, Timothy J. [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom); Tejero, Roberto [Universidad de Valencia, Departamento de Química Física (Spain); Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H. [University of Georgia, Complex Carbohydrate Research Center and Northeast Structural Genomics Consortium (United States); Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H. [University of Toronto, Department of Medical Biophysics, Cancer Genomics and Proteomics, Ontario Cancer Institute, Northeast Structural Genomics Consortium (Canada); Kennedy, Michael [Miami University, Department of Chemistry and Biochemistry, Northeast Structural Genomics Consortium (United States); Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, Northeast Structural Genomics Consortium, Rutgers (United States); Vuister, Geerten W., E-mail: gv29@le.ac.uk [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom)

    2015-08-15

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e., manually refined) and un-curated (i.e., automatically generated) forms. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71% of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results, with up to 100% of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90% of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  13. Shorter sampling periods and accurate estimates of milk volume and components are possible for pasture based dairy herds milked with automated milking systems.

    Science.gov (United States)

    Kamphuis, Claudia; Burke, Jennie K; Taukiri, Sarah; Petch, Susan-Fay; Turner, Sally-Anne

    2016-08-01

    Dairy cows grazing pasture and milked using automated milking systems (AMS) have lower milking frequencies than indoor fed cows milked using AMS. Therefore, milk recording intervals used for herd testing indoor fed cows may not be suitable for cows on pasture based farms. We hypothesised that accurate standardised 24 h estimates could be determined for AMS herds with milk recording intervals of less than the Gold Standard (48 h), but that the optimum milk recording interval would depend on the herd average for milking frequency. The Gold Standard protocol was applied on five commercial dairy farms with AMS, between December 2011 and February 2013. From 12 milk recording test periods, involving 2211 cow-test days and 8049 cow milkings, standardised 24 h estimates for milk volume and milk composition were calculated for the Gold Standard protocol and compared with those collected during nine alternative sampling scenarios, including six shorter sampling periods and three in which a fixed number of milk samples per cow were collected. Results indicate that a 48 h milk recording protocol is unnecessarily long for collecting accurate estimates during milk recording on pasture based AMS farms. Collection of only two milk samples per cow was optimal in terms of high concordance correlation coefficients for milk volume and components and a low proportion of missed cow-test days. Further research is required to determine the effects of diurnal variations in milk composition on standardised 24 h estimates for milk volume and components, before a protocol based on a fixed number of samples could be considered. Based on the results of this study, New Zealand has adopted a split protocol for herd testing based on the average milking frequency for the herd (NZ Herd Test Standard 8100:2015). PMID:27600967
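    The accuracy criterion above is agreement between the shortened-protocol estimate and the 48 h Gold Standard, measured by the concordance correlation coefficient. Lin's CCC penalizes both scatter and systematic offset; a small sketch (the milk-volume values below are invented, not the study's data):

```python
def concordance_cc(x, y):
    """Lin's concordance correlation coefficient between two paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n          # population variances
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # CCC = 1 only for perfect agreement; offsets and scatter both reduce it
    return 2 * cov / (vx + vy + (mx - my) ** 2)

est_24h = [22.0, 25.5, 19.8, 30.1]   # shortened-protocol estimates (litres, invented)
gold    = [21.5, 26.0, 20.0, 30.5]   # 48 h Gold Standard values (invented)
print(round(concordance_cc(est_24h, gold), 3))
```

    Unlike the Pearson correlation, a constant bias (e.g. `y = x + 1`) lowers the CCC even though the two series are perfectly correlated.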

  14. An Automated Grass-Based Procedure to Assess the Geometrical Accuracy of the Openstreetmap Paris Road Network

    Science.gov (United States)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.

    2016-06-01

    OpenStreetMap (OSM) is the largest spatial database in the world. One of the most frequently occurring geospatial elements within this database is the road network, whose quality is crucial for applications such as routing and navigation. Several methods have been proposed for the assessment of OSM road network quality; however, they are often tightly coupled to the characteristics of the authoritative dataset involved in the comparison. This makes it hard to replicate and extend these methods. This study relies on an automated procedure which was recently developed for comparing OSM with any road network dataset. It is based on three Python modules for the open source GRASS GIS software and provides measures of OSM road network spatial accuracy and completeness. Provided that the user is familiar with the authoritative dataset used, they can adjust the values of the parameters involved thanks to the flexibility of the procedure. The method is applied to assess the quality of the Paris OSM road network dataset through a comparison against the French official dataset provided by the French National Institute of Geographic and Forest Information (IGN). The results show that the Paris OSM road network has both a high completeness and spatial accuracy. It has a greater length than the IGN road network, and is found to be suitable for applications requiring spatial accuracies up to 5-6 m. Also, the results confirm the flexibility of the procedure for supporting users in carrying out their own comparisons between OSM and reference road datasets.
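    The GRASS modules themselves are not reproduced in the abstract, but the underlying completeness idea — what fraction of the OSM network lies within a tolerance of the reference network — can be sketched in pure Python (planar coordinates in metres; the tolerance, sampling step, and toy geometries are illustrative assumptions):

```python
import math

def _pt_seg_dist(p, a, b):
    """Distance from point p to segment a-b (planar coordinates)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def completeness(osm_lines, ref_lines, tol=5.0, step=1.0):
    """Fraction of points sampled along OSM polylines within tol of the reference."""
    segs = [(ln[i], ln[i + 1]) for ln in ref_lines for i in range(len(ln) - 1)]
    matched = total = 0
    for line in osm_lines:
        for a, b in zip(line, line[1:]):
            n = max(2, int(math.hypot(b[0] - a[0], b[1] - a[1]) / step) + 1)
            for k in range(n):
                t = k / (n - 1)
                p = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
                total += 1
                matched += any(_pt_seg_dist(p, s0, s1) <= tol for s0, s1 in segs)
    return matched / total

osm = [[(0, 0), (10, 0)]]          # one 10 m OSM road segment (toy data)
ref = [[(0, 2), (10, 2)]]          # reference road offset by 2 m
print(completeness(osm, ref, tol=5.0))  # 1.0 - fully matched at a 5 m tolerance
```

    A production implementation would use buffered geometries (e.g. in GRASS or a GIS library) rather than point sampling, but the measure is the same.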

  15. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    Science.gov (United States)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g. descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
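    Both candidate detectors reduce to simple trigger rules on the IMU acceleration stream. A toy sketch of the two rules (the sample rate, window length, thresholds, and signal below are invented for illustration; the flight algorithm's actual parameters come from the DSS and LS-DYNA studies):

```python
def accel_spike(accel_mag, threshold):
    """Acceleration-magnitude spike: index of the first sample above threshold."""
    for i, a in enumerate(accel_mag):
        if a > threshold:
            return i
    return None

def delta_v_spike(accel_mag, dt, window, threshold):
    """Accumulated velocity change over a sliding window exceeding threshold."""
    acc = 0.0
    for i, a in enumerate(accel_mag):
        acc += a * dt                            # add newest sample's delta-v
        if i >= window:
            acc -= accel_mag[i - window] * dt    # drop the sample leaving the window
        if i >= window - 1 and acc > threshold:
            return i
    return None

# simulated stream: steady ~1 g descent, then an impact spike at sample 10
sig = [9.8] * 10 + [80.0, 120.0, 60.0]
print(accel_spike(sig, 50.0))                                 # 10
print(delta_v_spike(sig, dt=0.01, window=5, threshold=1.0))   # 10
```

    The windowed delta-v rule is less sensitive to a single noisy sample, which is why it trades detection speed for robustness relative to the raw magnitude test.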

  16. Automating the single crystal x-ray diffraction experiment

    OpenAIRE

    Light, Mark

    2004-01-01

    Ever decreasing data collection times and an explosion in demand present us with the situation where an automated single crystal instrument is not only advantageous but essential. With recent developments in software, instrumentation and robotics it has been possible to fully automate structure determination from mounted crystal to completed crystal structure. In Southampton we have developed a system that takes pre-mounted samples, loads them onto the diffractometer, assesses their diff...

  17. Automated extraction of BI-RADS final assessment categories from radiology reports with natural language processing.

    Science.gov (United States)

    Sippo, Dorothy A; Warden, Graham I; Andriole, Katherine P; Lacson, Ronilda; Ikuta, Ichiro; Birdwell, Robyn L; Khorasani, Ramin

    2013-10-01

    The objective of this study is to evaluate a natural language processing (NLP) algorithm that determines American College of Radiology Breast Imaging Reporting and Data System (BI-RADS) final assessment categories from radiology reports. This HIPAA-compliant study was granted institutional review board approval with waiver of informed consent. This cross-sectional study involved 1,165 breast imaging reports in the electronic medical record (EMR) from a tertiary care academic breast imaging center from 2009. Reports included screening mammography, diagnostic mammography, breast ultrasound, combined diagnostic mammography and breast ultrasound, and breast magnetic resonance imaging studies. Over 220 reports were included from each study type. The recall (sensitivity) and precision (positive predictive value) of a NLP algorithm to collect BI-RADS final assessment categories stated in the report final text was evaluated against a manual human review standard reference. For all breast imaging reports, the NLP algorithm demonstrated a recall of 100.0 % (95 % confidence interval (CI), 99.7, 100.0 %) and a precision of 96.6 % (95 % CI, 95.4, 97.5 %) for correct identification of BI-RADS final assessment categories. The NLP algorithm demonstrated high recall and precision for extraction of BI-RADS final assessment categories from the free text of breast imaging reports. NLP may provide an accurate, scalable data extraction mechanism from reports within EMRs to create databases to track breast imaging performance measures and facilitate optimal breast cancer population management strategies. PMID:23868515
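    The study does not publish its NLP pipeline, but the core extraction step can be approximated with a pattern match over the report's final text. A hedged sketch (the phrasing variants matched below are assumptions about typical report wording, not the study's actual grammar):

```python
import re

# Matches e.g. "BI-RADS 2", "BIRADS: 4", "BI-RADS category 5",
# "BI-RADS assessment 0" - hypothetical variants, case-insensitive.
BIRADS_RE = re.compile(
    r"BI-?RADS(?:\s+(?:category|assessment))?\s*[:#]?\s*([0-6])",
    re.IGNORECASE)

def extract_birads(report_text):
    """Return all BI-RADS final assessment categories found in a report."""
    return [int(m) for m in BIRADS_RE.findall(report_text)]

print(extract_birads("IMPRESSION: BI-RADS category 2. Benign finding."))  # [2]
```

    A real system must also handle negation, historical mentions, and addenda, which is where an NLP toolkit earns its keep over a bare regular expression.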

  18. Test-retest reliability analysis of the Cambridge Neuropsychological Automated Tests for the assessment of dementia in older people living in retirement homes.

    Science.gov (United States)

    Gonçalves, Marta Matos; Pinho, Maria Salomé; Simões, Mário R

    2016-01-01

    The validity of the Cambridge Neuropsychological Automated Tests has been widely studied, but their reliability has not. This study aimed to estimate the test-retest reliability of these tests in a sample of 34 older adults, aged 69 to 90 years, without neuropsychiatric diagnoses and living in retirement homes in the district of Lisbon, Portugal. The battery was administered twice, with a 4-week interval between sessions. The Paired Associates Learning (PAL), Spatial Working Memory (SWM), Rapid Visual Information Processing, and Reaction Time tests revealed measures with high-to-adequate test-retest correlations (.71-.89), although several PAL and SWM measures showed susceptibility to practice effects. Two estimated standardized regression-based methods were found to be more efficient at correcting for practice effects than a method of fixed correction. We also found weak test-retest correlations (.56-.68) for several measures. These results suggest that some, but not all, measures are suitable for cognitive assessment and monitoring in this population.

  19. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  20. Automated detection of ambiguity in BI-RADS assessment categories in mammography reports.

    Science.gov (United States)

    Bozkurt, Selen; Rubin, Daniel

    2014-01-01

    An unsolved challenge in biomedical natural language processing (NLP) is detecting ambiguities in the reports that can help physicians to improve report clarity. Our goal was to develop NLP methods to tackle the challenges of identifying ambiguous descriptions of the laterality of BI-RADS Final Assessment Categories in mammography radiology reports. We developed a text processing system that uses a BI-RADS ontology we built as a knowledge source for automatic annotation of the entities in mammography reports relevant to this problem. We used the GATE NLP toolkit and developed customized processing resources for report segmentation, named entity recognition, and detection of mismatches between BI-RADS Final Assessment Categories and mammogram laterality. Our system detected 55 mismatched cases in 190 reports and the accuracy rate was 81%. We conclude that such NLP techniques can detect ambiguities in mammography reports and may reduce discrepancy and variability in reporting. PMID:24743074

  1. Computer Man Simulation of Incapacitation: An Automated Approach to Wound Ballistics and Associated Medical Care Assessments

    OpenAIRE

    Clare, V.; Ashman, W.; Broome, P; Jameson, J.; Lewis, J.; Merkler, J.; Mickiewicz, A.; Sacco, W.; Sturdivan, L.

    1981-01-01

    Wound ballistics assessments traditionally have been based on correlations between some quantification of “ballistic dose” and an empirical/subjective medical quantification of human functional degradation. Although complicated by the highly inhomogeneous nature of the human body and by the voluminous data handling requirements, these correlation values were obtained by manual methods. The procedure required a substantial commitment of time and resources, thereby restricting the data base from

  2. Automated breast tissue density assessment using high order regional texture descriptors in mammography

    Science.gov (United States)

    Law, Yan Nei; Lieng, Monica Keiko; Li, Jingmei; Khoo, David Aik-Aun

    2014-03-01

    Breast cancer is the most common cancer and the second leading cause of cancer death among women in the US. The relative survival rate is lower among women with a more advanced stage at diagnosis. Early detection through screening is vital. Mammography is the most widely used and only proven screening method for reliably and effectively detecting abnormal breast tissues. In particular, mammographic density is one of the strongest breast cancer risk factors, after age and gender, and can be used to assess the future risk of disease before individuals become symptomatic. A reliable method for automatic density assessment would be beneficial and could assist radiologists in the evaluation of mammograms. To address this problem, we propose a density classification method which uses statistical features from different parts of the breast. Our method is composed of three parts: breast region identification, feature extraction and building ensemble classifiers for density assessment. It explores the potential of features extracted from second- and higher-order statistical information for mammographic density classification. We further investigate the registration of bilateral pairs and time-series of mammograms. The experimental results on 322 mammograms demonstrate that (1) a classifier using features from dense regions has higher discriminative power than a classifier using only features from the whole breast region; (2) these high-order features can be effectively combined to boost the classification accuracy; (3) a classifier using these statistical features from dense regions achieves 75% accuracy, which is a significant improvement over the 70% accuracy obtained by existing approaches.
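    The "second- and higher-order statistical information" mentioned above typically starts from a gray-level co-occurrence matrix (GLCM). A minimal pure-Python sketch of a GLCM and two classic second-order texture features (the image, offset, and gray-level count below are illustrative, not the paper's configuration):

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for a single pixel offset."""
    counts = [[0.0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    n = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                counts[image[y][x]][image[y2][x2]] += 1
                n += 1
    return [[c / n for c in row] for row in counts]

def contrast(p):
    """Second-order contrast: large when neighbouring gray levels differ."""
    k = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(k) for j in range(k))

def energy(p):
    """Angular second moment: large for homogeneous texture."""
    return sum(c * c for row in p for c in row)

# vertical stripes of gray levels 0 and 1: every horizontal neighbour pair differs
stripes = [[0, 1, 0, 1],
           [0, 1, 0, 1]]
p = glcm(stripes, dx=1, dy=0, levels=2)
print(contrast(p))  # 1.0 - all co-occurring pairs have |i - j| = 1
```

    Feature vectors built from several offsets and statistics like these would then feed the ensemble classifier described in the abstract.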

  3. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    Science.gov (United States)

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  4. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. I then present a real-time house automation research project, designed and implemented in both software and hardware, that is capable of automating a house's electrical systems and providing a security system to detect the presence of unexpected behavior.

  5. An automated procedure for the assessment of white matter hyperintensities by multispectral (T1, T2, PD) MRI and an evaluation of its between-centre reproducibility based on two large community databases

    International Nuclear Information System (INIS)

    An automated procedure for the detection, quantification, localization and statistical mapping of white matter hyperintensities (WMH) on T2-weighted magnetic resonance (MR) images is presented and validated based on the results of a between-centre reproducibility study. The first step is the identification of white matter (WM) tissue using a multispectral (T1, T2, PD) segmentation. In a second step, WMH are identified within the WM tissue by segmenting T2 images, isolating two different classes of WMH voxels - low- and high-contrast WMH voxels, respectively. The reliability of the whole procedure was assessed by applying it to the analysis of two large MR imaging databases (n = 650 and n = 710, respectively) of healthy elderly subjects matched for demographic characteristics. Average overall WMH load and spatial distribution were found to be similar in the two samples (1.81 and 1.79% of the WM volume, respectively). White matter hyperintensity load was found to be significantly associated with both age and high blood pressure, with similar effects in both samples. With specific reference to the 650-subject cohort, we also found that the WMH load provided by this automated procedure was significantly associated with visual grading of the severity of WMH, as assessed by a trained neurologist. The results show that this method is sensitive, well correlated with semi-quantitative visual rating and highly reproducible. (orig.)
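    The two-step labelling described above — restrict to the WM mask, then split WMH voxels into low- and high-contrast classes by T2 intensity — can be sketched as follows (the flattened voxel lists and threshold values are invented for illustration; the paper derives its thresholds from the segmentation itself):

```python
def classify_wmh(t2_values, wm_mask, low_thr, high_thr):
    """Label each voxel: 0 = not WMH, 1 = low-contrast WMH, 2 = high-contrast WMH."""
    labels = []
    for val, in_wm in zip(t2_values, wm_mask):
        if not in_wm or val < low_thr:
            labels.append(0)          # outside WM, or normal-appearing WM
        elif val < high_thr:
            labels.append(1)          # low-contrast WMH
        else:
            labels.append(2)          # high-contrast WMH
    return labels

def wmh_load_percent(labels, wm_mask):
    """WMH volume expressed as a percentage of total WM volume."""
    return 100.0 * sum(1 for l in labels if l > 0) / sum(wm_mask)

t2 = [10, 60, 90, 100]   # voxel T2 intensities (arbitrary units, invented)
wm = [1, 1, 1, 0]        # WM mask: the last voxel is outside WM
labels = classify_wmh(t2, wm, low_thr=50, high_thr=80)
print(labels)            # [0, 1, 2, 0]
```

    The study's reported quantity (e.g. 1.81% of WM volume) corresponds to `wmh_load_percent` computed over a whole brain volume.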

  6. Automated solvent concentrator

    Science.gov (United States)

    Griffith, J. S.; Stuart, J. L.

    1976-01-01

    Designed for the automated drug identification system (AUDRI), the device increases sample concentration by a factor of 100. The sample is first filtered, removing particulate contaminants and reducing the water content of the sample. The sample is then extracted from the filtered residue by a specific solvent. The concentrator provides input material to the analysis subsystem.

  7. Calibration of a liquid scintillation counter to assess tritium levels in various samples

    CERN Document Server

    Al-Haddad, M N; Abu-Jarad, F A

    1999-01-01

    An LKB-Wallac 1217 Liquid Scintillation Counter (LSC) was calibrated with a newly adopted cocktail. The LSC was then used to measure tritium levels in various samples to assess the compliance of tritium levels with the recommended international levels. The counter was calibrated to measure both biological and operational samples for personnel and for an accelerator facility at KFUPM. The biological samples include the bioassay (urine), saliva, and nasal tests. The operational samples of the light ion linear accelerator include target cooling water, organic oil, fomblin oil, and smear samples. Sets of standards, which simulate the various samples, were fabricated using traceable certified tritium standards. The efficiency of the counter was obtained for each sample; typical efficiencies ranged from 33% for smear samples down to 1.5% for organic oil samples. A quenching curve for each sample is presented. The minimum detectable activity for each sample was established. Typical tritium levels in bio...
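    The abstract reports efficiencies and minimum detectable activities but not the formulas behind them. A common approach pairs a standards-based efficiency calibration with the Currie MDA expression; a sketch under that assumption (the numeric values are illustrative, not the paper's):

```python
import math

def counting_efficiency(net_cpm, standard_dpm):
    """Efficiency = net count rate of a traceable standard / its known activity."""
    return net_cpm / standard_dpm

def currie_mda(background_counts, count_time_min, efficiency):
    """Currie MDA in dpm: (2.71 + 4.65 * sqrt(B)) / (efficiency * t)."""
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (efficiency * count_time_min)

eff = counting_efficiency(33.0, 100.0)       # a smear-like sample at 33% efficiency
print(round(currie_mda(400, 10, eff), 1))    # 29.0 dpm for 400 bkg counts in 10 min
```

    Quench correction (the quenching curves mentioned above) would adjust `efficiency` per sample before the MDA is evaluated.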

  8. Automated modal tracking and fatigue assessment of a wind turbine based on continuous dynamic monitoring

    Directory of Open Access Journals (Sweden)

    Oliveira Gustavo

    2015-01-01

    The paper describes the implementation of a dynamic monitoring system at a 2.0 MW onshore wind turbine. The system is composed of two components aimed at structural integrity and fatigue assessment. The first component enables the continuous tracking of modal characteristics of the wind turbine (natural frequency values, modal damping ratios and mode shapes) in order to detect abnormal deviations of these properties, which may be caused by the occurrence of structural damage. The second component allows the estimation of the remaining fatigue lifetime of the structure based on the analysis of the measured cycles of structural vibration.
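    The second component turns counted vibration cycles into a damage estimate; the standard tool for that step is Palmgren-Miner linear damage accumulation over an S-N curve. A sketch (the Basquin-type S-N constants and cycle counts below are invented for illustration, not the turbine's data):

```python
def miners_damage(cycle_counts, capacity):
    """Palmgren-Miner sum: damage = sum over stress ranges of n_i / N_i."""
    return sum(n / capacity(s) for s, n in cycle_counts.items())

def remaining_life_years(annual_damage, accumulated=0.0):
    """Years until the damage sum reaches 1.0, assuming the measured rate persists."""
    return (1.0 - accumulated) / annual_damage

# Basquin-type S-N curve N(S) = C * S^-m with invented constants C, m
capacity = lambda s: 1e12 * s ** -3.0

# counted cycles per year at each stress range (MPa), e.g. from rainflow counting
annual = miners_damage({50.0: 1e6, 100.0: 1e5}, capacity)
print(round(annual, 6))  # 0.225 of the fatigue budget consumed per year
```

    In practice the cycle counts come from a rainflow count of the measured strain or acceleration-derived stress history.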

  9. Enabling automated magnetic resonance imaging-based targeting assessment during dipole field navigation

    Science.gov (United States)

    Latulippe, Maxime; Felfoul, Ouajdi; Dupont, Pierre E.; Martel, Sylvain

    2016-02-01

    The magnetic navigation of drugs in the vascular network promises to increase the efficacy and reduce the secondary toxicity of cancer treatments by targeting tumors directly. Recently, dipole field navigation (DFN) was proposed as the first method achieving both high field and high navigation gradient strengths for whole-body interventions in deep tissues. This is achieved by introducing large ferromagnetic cores around the patient inside a magnetic resonance imaging (MRI) scanner. However, doing so distorts the static field inside the scanner, which prevents imaging during the intervention. This limitation constrains DFN to open-loop navigation, thus posing a risk of harmful toxicity in the event of a navigation failure. Here, we are interested in periodically assessing drug targeting efficiency using MRI even in the presence of a core. We demonstrate, using a clinical scanner, that it is in fact possible to acquire, in specific regions around a core, images of sufficient quality to perform this task. We show that the core can be moved inside the scanner to a position minimizing the distortion effect in the region of interest for imaging. Moving the core can be done automatically using the gradient coils of the scanner, which then also enables the core to be repositioned to perform navigation to additional targets. The feasibility and potential of the approach are validated in an in vitro experiment demonstrating navigation and assessment at two targets.

  10. Assessing breast cancer masking risk with automated texture analysis in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lilholm, Martin; Diao, Pengfei;

    2015-01-01

    PURPOSE The goal of this work is to develop a method to assess the risk of breast cancer masking, based on image characteristics beyond breast density. METHOD AND MATERIALS From the Dutch breast cancer screening program we collected 285 screen detected cancers, and 109 cancers that were screen...... (Quartile 1/2) versus high (Quartile 3/4) texture risk score. We computed odds ratios (OR) for breast cancer masking risk (i.e. interval versus screen detected cancer) for each of the subgroups. The OR was 1.63 (1.04-2.53 95%CI) for the high dense group (as compared to the low dense group), whereas...... that a breast cancer is masked in regular mammography, independently of breast density. As such it offers opportunities to further enhance personalized breast cancer screening, beyond breast density....

  11. Measurement of acceleration while walking as an automated method for gait assessment in dairy cattle

    DEFF Research Database (Denmark)

    Chapinal, N.; de Passillé, A.M.; Pastell, M.;

    2011-01-01

    -dimensional accelerometers, 1 attached to each leg and 1 to the back, and acceleration data were collected while cows walked in a straight line on concrete (experiment 1) or on both concrete and rubber (experiment 2). Cows were video-recorded while walking to assess overall gait, asymmetry of the steps......, and walking speed. In experiment 1, cows were selected to maximize the range of gait scores, whereas no clinically lame cows were enrolled in experiment 2. For each accelerometer location, overall acceleration was calculated as the magnitude of the 3-dimensional acceleration vector and the variance of overall...... to be a promising tool for lameness detection on farm and to study walking surfaces, especially when attached to a leg....

  12. Automated texture scoring for assessing breast cancer masking risk in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Petersen, Kersten; Lilholm, Martin;

    validation. To assess the independency of the texture scores of breast density, density was determined for each image using Volpara. RESULTS: The odds ratios for interval cancer were 1.59 (95%CI: 0.76-3.32), 2.07 (1.02-4.20), and 3.14 (1.60-6.17) for quartile 2, 3 and 4 respectively, relative to quartile 1....... Correlation between the texture scores and breast density was 0.59 (0.52-0.64). Breast density adjusted odds ratios, as determined with logistic regression, were 1.49 (0.71-3.13), 1.58 (0.75-3.33), and 1.97 (0.91-4.27). CONCLUSIONS: The CSAE texture score is independently associated with the risk of having...

  13. A computer-based automated algorithm for assessing acinar cell loss after experimental pancreatitis.

    Directory of Open Access Journals (Sweden)

    John F Eisses

Full Text Available The change in exocrine mass is an important parameter to follow in experimental models of pancreatic injury and regeneration. However, at present, the quantitative assessment of exocrine content by histology is tedious and operator-dependent, requiring manual assessment of acinar area on serial pancreatic sections. In this study, we utilized a novel computer-generated learning algorithm to construct an accurate and rapid method of quantifying acinar content. The algorithm works by learning differences in pixel characteristics from input examples provided by human experts. H&E-stained pancreatic sections were obtained from mice recovering from a 2-day, hourly caerulein hyperstimulation model of experimental pancreatitis. For training data, a pathologist carefully outlined discrete regions of acinar and non-acinar tissue in 21 sections at various stages of pancreatic injury and recovery (termed the "ground truth"). After the expert defined the ground truth, the computer was able to develop a prediction rule that was then applied to a unique set of high-resolution images in order to validate the process. For baseline, non-injured pancreatic sections, the software demonstrated close agreement with the ground truth in identifying baseline acinar tissue area, with a difference of only 1% ± 0.05% (p = 0.21). Within regions of injured tissue, the software reported a difference of 2.5% ± 0.04% in acinar area compared with the pathologist (p = 0.47). Surprisingly, on detailed morphological examination, the discrepancy arose primarily because the software outlined acini and excluded inter-acinar and luminal white space with greater precision. The findings suggest that the software will be of great potential benefit to both clinicians and researchers in quantifying pancreatic acinar cell flux in the injured and recovering pancreas.
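The learn-from-examples workflow described above can be sketched in a few lines. This is a hypothetical stand-in, not the paper's software: a nearest-centroid rule on synthetic RGB pixel values, with invented class mean colors.

```python
import random

random.seed(0)

def make_pixels(mean, n):
    """Synthetic (R, G, B) pixel intensities scattered around a class mean."""
    return [tuple(random.gauss(m, 0.05) for m in mean) for _ in range(n)]

# "Ground truth" examples outlined by the expert (class mean colors invented).
train = {"acinar": make_pixels((0.8, 0.3, 0.5), 200),
         "non_acinar": make_pixels((0.4, 0.6, 0.8), 200)}

# "Learning" step: summarize each class by its centroid.
centroids = {label: tuple(sum(p[i] for p in pts) / len(pts) for i in range(3))
             for label, pts in train.items()}

def classify(pixel):
    """Prediction rule: assign the pixel to the nearest class centroid."""
    return min(centroids,
               key=lambda c: sum((pixel[i] - centroids[c][i]) ** 2
                                 for i in range(3)))

# Apply the rule to a new "image" and estimate its acinar area fraction.
image = make_pixels((0.8, 0.3, 0.5), 50) + make_pixels((0.4, 0.6, 0.8), 50)
acinar_fraction = sum(classify(p) == "acinar" for p in image) / len(image)
```

In the real pipeline the pixel features, the classifier, and the expert-outlined training regions are far richer; the sketch only illustrates the train-then-predict structure.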

  14. Lung ventilation-perfusion imbalance in pulmonary emphysema. Assessment with automated V/Q quotient SPECT

    International Nuclear Information System (INIS)

    Tc-99m-Technegas-macro-aggregated albumin (MAA) single photon emission computed tomography (SPECT)-derived ventilation (V)/perfusion (Q) quotient SPECT was used to assess lung V-Q imbalance in patients with pulmonary emphysema. V/Q quotient SPECT and V/Q profile were automatically built in 38 patients with pulmonary emphysema and 12 controls, and V/Q distribution and V/Q profile parameters were compared. V/Q distribution on V/Q quotient SPECT was correlated with low attenuation areas (LAA) on density-mask computed tomography (CT). Parameters of V/Q profile such as the median, standard deviation (SD), kurtosis and skewness were proposed to objectively evaluate the severity of lung V-Q imbalance. In contrast to uniform V/Q distribution on V/Q quotient SPECT and a sharp peak with symmetrical V/Q distribution on V/Q profile in controls, lung areas showing heterogeneously high or low V/Q and flattened peaks with broadened V/Q distribution were frequently seen in patients with emphysema, including lung areas with only slight LAA. V/Q distribution was also often asymmetric regardless of symmetric LAA. All the proposed parameters of V/Q profile in entire lungs of patients with emphysema showed large variations compared with controls; SD and kurtosis were significantly different from controls (P<0.0001 and P<0.001, respectively), and a significant correlation was found between SD and A-aDO2 (P<0.0001). V/Q quotient SPECT appears to be more sensitive to detect emphysematous lungs compared with morphologic CT in patients with emphysema. SD and kurtosis of V/Q profile can be adequate parameters to assess the severity of lung V-Q imbalance causing gas-exchange impairment in patients with emphysema. (author)
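The four V/Q-profile parameters proposed above (median, SD, skewness, and kurtosis) are ordinary distribution statistics and can be computed directly from a per-voxel V/Q quotient distribution. A minimal sketch on simulated values (the distributions are invented, not patient data):

```python
import random
import statistics

random.seed(42)
# Control: a sharp, symmetric peak around V/Q = 1.
control_vq = [random.gauss(1.0, 0.15) for _ in range(10_000)]
# Emphysema-like: a broadened, heterogeneous (bimodal) V/Q distribution.
emphysema_vq = ([random.gauss(0.6, 0.4) for _ in range(5_000)]
                + [random.gauss(1.6, 0.5) for _ in range(5_000)])

def vq_profile_params(vq):
    m = statistics.fmean(vq)
    sd = statistics.pstdev(vq)
    z = [(v - m) / sd for v in vq]
    return {"median": statistics.median(vq),
            "sd": sd,
            "skewness": statistics.fmean(x ** 3 for x in z),
            "kurtosis": statistics.fmean(x ** 4 for x in z) - 3.0}  # excess

p_ctrl = vq_profile_params(control_vq)
p_emph = vq_profile_params(emphysema_vq)
```

A broadened, bimodal distribution yields a larger SD than the sharp control peak, mirroring the SD difference the study reports between emphysema patients and controls.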

  15. Strong Prognostic Value of Tumor-infiltrating Neutrophils and Lymphocytes Assessed by Automated Digital Image Analysis in Early Stage Cervical Cancer

    DEFF Research Database (Denmark)

    Carus, Andreas; Donskov, Frede; Switten Nielsen, Patricia;

    2014-01-01

    INTRODUCTION Manual observer-assisted stereological (OAS) assessments of tumor-infiltrating neutrophils and lymphocytes are prognostic, accurate, but cumbersome. We assessed the applicability of automated digital image analysis (DIA). METHODS Visiomorph software was used to obtain DIA densities...... to lymphocyte (TA–NL) index accurately predicted the risk of relapse, ranging from 8% to 52% (P = 0.001). CONCLUSIONS DIA is a potential assessment technique. The TA–NL index obtained by DIA is a strong prognostic variable with possible routine clinical application....

  16. Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection

    Energy Technology Data Exchange (ETDEWEB)

Barbara H. Dolphin; William D. Richins; Stephen R. Novascone

    2010-10-01

The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining the results of bounding deterministic simulations with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize the risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation, origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, material, and location data are added when available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimization methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents such as vehicle impact, fire, fuel explosions, excessive containment pressure, flooding, etc. are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (e.g., for an impact event: velocity, orientation, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions relating accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision
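The core Monte Carlo loop described above can be illustrated with a toy route model. The segment lengths, crash rates, and conditional release probabilities below are invented for illustration, not METEOR inputs:

```python
import random

random.seed(1)

# Toy route: (segment length in km, crashes per vehicle-km, P(release | crash)).
route = [(120, 1e-6, 0.05), (40, 5e-6, 0.10), (200, 8e-7, 0.02)]

def simulate_shipment(route):
    """Return True if this simulated shipment releases material.

    length_km * crash_rate approximates P(crash on segment) for small rates.
    """
    for length_km, crash_rate, p_release in route:
        if random.random() < length_km * crash_rate:   # crash on this segment?
            if random.random() < p_release:            # containment breached?
                return True
    return False

n = 200_000
p_release_est = sum(simulate_shipment(route) for _ in range(n)) / n
```

A real assessment would replace the fixed conditional release probability with the damage correlation functions derived from the hydrocode runs, and would compare the estimate across alternate routes and vehicles.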

  17. Evaluation of a Portable Automated Serum Chemistry Analyzer for Field Assessment of Harlequin Ducks, Histrionicus histrionicus.

    Science.gov (United States)

    Stoskopf, Michael K; Mulcahy, Daniel M; Esler, Daniel

    2010-01-01

A portable analytical chemistry analyzer was used to make field assessments of wild harlequin ducks (Histrionicus histrionicus) in association with telemetry studies of winter survival in Prince William Sound, Alaska. We compared serum chemistry results obtained on site with results from a traditional laboratory. Particular attention was paid to serum glucose and potassium concentrations as potential indicators of high-risk surgical candidates, based on evaluation of the field data. The median differential for glucose values (N = 82) between methods was 0.6 mmol/L (quartiles 0.3 and 0.9 mmol/L), with the median value higher when assayed on site. Analysis of potassium on site returned a median of 2.7 mmol/L (N = 88; quartiles 2.4 and 3.0 mmol/L). Serum potassium values were too low for quantitation by the traditional laboratory. Changes in several serum chemistry values following a three-day storm during the study support the value of on-site evaluation of serum potassium to identify presurgical patients with increased anesthetic risk.
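The paired on-site-versus-laboratory comparison reported above (a median differential with quartiles) can be reproduced on any paired data set; the values below are synthetic, not the duck serum data:

```python
import statistics

# Hypothetical paired measurements of one analyte (same sample, two methods).
onsite = [10.2, 9.8, 11.1, 10.5, 9.9, 10.8, 10.1, 10.6]
lab    = [ 9.6, 9.3, 10.4, 10.0, 9.2, 10.2,  9.5, 10.0]

# Per-sample differential: on-site minus laboratory.
diffs = sorted(o - l for o, l in zip(onsite, lab))
median_diff = statistics.median(diffs)
qs = statistics.quantiles(diffs, n=4)   # [Q1, Q2, Q3]
q1, q3 = qs[0], qs[2]
```

A positive median differential, as in the study's glucose comparison, means the on-site assay reads systematically higher than the laboratory.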

  18. An automated model for rooftop PV systems assessment in ArcGIS using LIDAR

    Directory of Open Access Journals (Sweden)

    Mesude Bayrakci Boz

    2015-08-01

Full Text Available As photovoltaic (PV) systems have become less expensive, building rooftops have become attractive sites for local power production. Identifying rooftops suitable for solar energy systems over large geographic areas is needed for cities to obtain more accurate assessments of production potential and likely patterns of development. This paper presents a new method for extracting roof segments and locating areas suitable for PV systems using Light Detection and Ranging (LIDAR) data and building footprints. Rooftop segments are created using seven slope (tilt) classes, five aspect (azimuth) classes, and six building types. Moreover, direct beam shading caused by nearby objects and the surrounding terrain is taken into account on a monthly basis. Finally, the method is implemented as an ArcGIS model in ModelBuilder and a tool is created. To demonstrate its validity, the method is applied to the city of Philadelphia, PA, USA, with the criteria of slope, aspect, shading, and area used to locate areas suitable for PV system installation. The results show that 33.7% of the building footprint area and 48.6% of the identified rooftop segments are suitable for PV systems. Overall, this study provides a replicable model using commercial software that is capable of extracting individual roof segments with more detailed criteria across an urban area.
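A suitability filter of the kind described can be expressed compactly. The slope, aspect, and area thresholds below are illustrative assumptions, not the paper's exact class definitions:

```python
def suitable(slope_deg, aspect_deg, area_m2,
             max_slope=45.0, min_area=10.0):
    """Hypothetical rule: usable area, moderate tilt, and a flat or roughly
    south-facing orientation (northern hemisphere convention, 180° = south)."""
    if area_m2 < min_area or slope_deg > max_slope:
        return False
    if slope_deg <= 10.0:                   # near-flat roofs: aspect irrelevant
        return True
    return 90.0 <= aspect_deg <= 270.0      # east through south to west

# (slope°, aspect°, area m²) for four hypothetical roof segments.
segments = [(5, 0, 30), (30, 180, 25), (30, 0, 25), (20, 180, 5)]
flags = [suitable(*s) for s in segments]
```

In the GIS model these attributes would come from LIDAR-derived slope/aspect rasters intersected with building footprints, with monthly shading applied as a further filter.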

  19. Accurate and Precise in Situ Zircon U-Pb age Dating With High Sample Throughput by Automated LA-SF-ICP-MS

    Science.gov (United States)

    Frei, D.; Gerdes, A.; Schersten, A.; Hollis, J. A.; Martina, F.; Knudsen, C.

    2006-12-01

Zircon is a ubiquitous mineral in most crystalline rocks as well as clastic sediments. Its high resistance to thermal resetting and physical erosion makes zircon an exceptionally useful mineral for precise and accurate dating of thermal geological events. For example, the analysis of the U-Pb ages of detrital zircon grains in clastic sediments is a powerful tool in sedimentary provenance studies. Accurate and precise U-Pb ages of > 100 zircon grains in a sample usually allow detection of all major sedimentary source age components with statistical confidence. U-Pb age dating of detrital zircons is generally the domain of high-resolution ion microprobe techniques (high-resolution SIMS), where relatively rapid in situ analysis can be achieved. The major limitations of these techniques are sample throughput (about 75 zircon age dates per 24 hours), the very high purchasing and operating costs of the equipment, and the need for highly specialised personnel, resulting in high cost. These high costs usually impose uncomfortable restrictions on the number of samples that can be analysed in a provenance study. Here, we present a high-sample-throughput technique for highly accurate and precise U-Pb dating of zircons by laser ablation magnetic sector field inductively coupled plasma mass spectrometry (LA-SF-ICP-MS). This technique takes advantage of recent progress in laser technology and the introduction of magnetic sector field ICP-MS instruments. Based on a ThermoFinnigan Element2 magnetic sector field ICP-MS and a New Wave UP 213 laser ablation system, this technique allows U-Pb dating of zircon grains with precision, accuracy and spatial resolution comparable to high-resolution SIMS. Because an individual analysis is carried out in less than two minutes and all data are acquired automatically in a pre-set mode with only minimal operator presence, the sample throughput is an order of magnitude higher compared to high-resolution SIMS.
Furthermore, the purchasing and operating costs of

  20. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  1. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    Energy Technology Data Exchange (ETDEWEB)

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    2011-08-01

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  2. Testing of an automated online EA-IRMS method for fast and simultaneous carbon content and stable isotope measurement of aerosol samples

    Science.gov (United States)

    Major, István; Gyökös, Brigitta; Túri, Marianna; Futó, István; Filep, Ágnes; Hoffer, András; Molnár, Mihály

    2016-04-01

Comprehensive atmospheric studies have demonstrated that carbonaceous aerosol is one of the main components of atmospheric particulate matter over Europe. Various methods, considering optical or thermal properties, have been developed for accurate quantification of both the organic and elemental carbon constituents of atmospheric aerosol. The aim of our work was to develop an alternative fast and easy method for determining the total carbon content of individual aerosol samples collected on prebaked quartz filters, whereby the mass and surface concentration become simply computable. We applied the conventional "elemental analyzer (EA) coupled online with an isotope ratio mass spectrometer (IRMS)" technique, which is ubiquitously used in mass spectrometry. Using this technique we are also able to measure the carbon stable isotope ratio of the samples simultaneously. During development, we compared the EA-IRMS technique with an off-line catalytic combustion method developed previously at the Hertelendi Laboratory of Environmental Studies (HEKAL). We tested the combined online total carbon content and stable isotope ratio measurement on both standard materials and real aerosol samples. The test results show that the novel method assures, on the one hand, a carbon recovery yield of at least 95% over a broad total carbon mass range (between 100 and 3000 µg) and, on the other hand, good reproducibility of stable isotope measurements with an uncertainty of ±0.2‰. Comparing the total carbon results obtained by the EA-IRMS and the off-line catalytic combustion method, we found a very good correlation (R2=0.94) that proves the applicability of both preparation methods. Advantages of the novel method are the fast and simplified sample preparation steps and the fully automated, simultaneous carbon stable isotope ratio measurement processes. Furthermore stable isotope ratio results can effectively be applied in the source apportionment
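The two headline figures of merit above, recovery yield and the method-to-method correlation (R²), are straightforward to compute from paired measurements. The numbers below are synthetic stand-ins, not the HEKAL data:

```python
# Hypothetical paired total-carbon results (µg): reference method vs. new method.
reference_ug = [100, 500, 1000, 2000, 3000]
measured_ug  = [97, 480, 965, 1930, 2890]

# Per-sample recovery yield of the new method.
recovery = [m / r for m, r in zip(measured_ug, reference_ug)]

# R² of the method-vs-method correlation (manual least squares, no deps).
n = len(reference_ug)
mx = sum(reference_ug) / n
my = sum(measured_ug) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(reference_ug, measured_ug))
sxx = sum((x - mx) ** 2 for x in reference_ug)
syy = sum((y - my) ** 2 for y in measured_ug)
r2 = sxy * sxy / (sxx * syy)
```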

  3. Radiostrontium and radium analysis in low-level environmental samples following a multi-stage semi-automated chromatographic sequential separation

    International Nuclear Information System (INIS)

Strontium isotopes (89Sr and 90Sr) and 226Ra, being radiotoxic when ingested, are routinely monitored in milk and drinking water samples collected from different regions in Canada. In order to monitor environmental levels of activity, a novel semi-automated sensitive method has been developed at the Radiation Protection Bureau of Health Canada (Ottawa, Canada). This method allows the separation and quantification of both 89Sr and 90Sr and has also been adapted to quantify 226Ra during the same sample preparation procedure. The method uses a 2-stage purification process during which matrix constituents, such as the magnesium and calcium that are rich in milk, are removed, as well as the main beta interferences (e.g., 40K, 87Rb, 134Cs, 137Cs, and 140Ba). The first purification step uses strong cation exchange (SCX) chromatography with commercially available resins. In a second step, fractions containing the radiostrontium analytes are further purified using high-performance ion chromatography (HPIC). While 89Sr is quantified by Cerenkov counting immediately after the second purification stage, the same vial is counted again after a latent period of 10-14 days to quantify the 90Sr activity based on 90Y ingrowth. Similarly, the activity of 226Ra, which is separated by SCX only, is determined via the emanation of 222Rn in a 2-phase aqueous/cocktail system using liquid scintillation counting. The minimum detectable concentrations (MDC) for 89Sr and 90Sr for a 200 min count time at a 95% confidence interval are 0.03 and 0.02 Bq/L, respectively. The MDC for 226Ra for a 100 min count time is 0.002 Bq/L. Semi-annual intercomparison samples from the U.S. Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) were used to validate the method for 89Sr and 90Sr. Spiked water samples prepared in-house and by the International Atomic Energy Agency (IAEA) were used to validate the 226Ra assay.
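The abstract does not state how the MDCs were derived; one standard choice is the Currie detection-limit formulation, sketched here with invented counting parameters (background, efficiency, and sample volume are not the study's values):

```python
import math

def currie_mdc(background_counts, count_time_min, efficiency, volume_l):
    """MDC in Bq/L from the Currie detection limit
    L_D = 2.71 + 4.65 * sqrt(B) counts, converted to an activity
    concentration via counting efficiency, count time, and sample volume."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    count_time_s = count_time_min * 60.0
    return l_d / (efficiency * count_time_s * volume_l)

# Hypothetical parameters: 120 background counts in 200 min, 40% Cerenkov
# counting efficiency, 0.1 L of sample on the detector.
mdc = currie_mdc(background_counts=120, count_time_min=200,
                 efficiency=0.4, volume_l=0.1)
```

Lower backgrounds, longer count times, or larger processed volumes all push the MDC down, which is how sub-0.1 Bq/L limits like those reported become achievable.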

  4. Fully automated ionic liquid-based headspace single drop microextraction coupled to GC-MS/MS to determine musk fragrances in environmental water samples.

    Science.gov (United States)

    Vallecillos, Laura; Pocurull, Eva; Borrull, Francesc

    2012-09-15

A fully automated ionic liquid-based headspace single drop microextraction (IL-HS-SDME) procedure has been developed for the first time to preconcentrate trace amounts of ten musk fragrances extensively used in personal care products (six polycyclic musks, three nitro musks and one polycyclic musk degradation product) from wastewater samples prior to analysis by gas chromatography and ion trap tandem mass spectrometry (GC-IT-MS/MS). Due to the low volatility of the ILs, a large internal diameter liner (3.4 mm i.d.) was used to improve IL evaporation. Furthermore, a piece of glass wool was introduced into the liner to prevent the ILs from entering the GC column, and a guard column was used to prevent damage to the analytical column. The main factors influencing the IL-HS-SDME were optimized. For all species, the highest enrichment factors were achieved using 1 μL of 1-octyl-3-methylimidazolium hexafluorophosphate ([OMIM][PF(6)]) ionic liquid exposed in the headspace of 10 mL water samples containing 300 g L(-1) of NaCl and stirred at 750 rpm and 60 °C for 45 min. All compounds were determined by direct injection GC-IT-MS/MS with a chromatographic time of 19 min. Method detection limits were between 0.010 ng mL(-1) and 0.030 ng mL(-1), depending on the target analyte. Also, under optimized conditions, the method gave good levels of intra-day and inter-day repeatability in wastewater samples, with relative standard deviations varying between 3% and 6% and between 5% and 11%, respectively (n=3, 1 ng mL(-1)). The applicability of the method was tested with wastewater samples from the influents and effluents of urban wastewater treatment plants (WWTPs) and one potable treatment plant (PTP).
The analysis of influent urban wastewater revealed the presence of galaxolide and tonalide at concentrations of between 0.29 ng mL(-1) and 2.10 ng mL(-1) and up to 0.32 ng mL(-1), respectively; in waters from the PTP, only galaxolide was found at a concentration higher than the MQL.

  5. Development and Validation of an Admission Test Designed to Assess Samples of Performance on Academic Tasks

    Science.gov (United States)

    Tanilon, Jenny; Segers, Mien; Vedder, Paul; Tillema, Harm

    2009-01-01

    This study illustrates the development and validation of an admission test, labeled as Performance Samples on Academic Tasks in Educational Sciences (PSAT-Ed), designed to assess samples of performance on academic tasks characteristic of those that would eventually be encountered by examinees in an Educational Sciences program. The test was based…

  6. Contamination Rates of Three Urine-Sampling Methods to Assess Bacteriuria in Pregnant Women

    NARCIS (Netherlands)

    Schneeberger, Caroline; van den Heuvel, Edwin R.; Erwich, Jan Jaap H. M.; Stolk, Ronald P.; Visser, Caroline E.; Geerlings, Suzanne E.

    2013-01-01

    OBJECTIVE: To estimate and compare contamination rates of three different urine-sampling methods in pregnant women to assess bacteriuria. METHODS: In this cross-sectional study, 113 pregnant women collected three different midstream urine samples consecutively: morning (first void); midstream (void

  7. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    Science.gov (United States)

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  8. Particle analysis for uranium isotopes on swipe samples using new generation Cameca IMS 7f SIMS supported by SEM automated uranium detection

    International Nuclear Information System (INIS)

triangle pieces. This internal reference enables determination of the parameters for the transformation from coordinates relative to the SEM to coordinates relative to the SIMS sample stage with a precision better than 50 μm. Uranium-bearing particle detection - The main difficulty in particle detection arises because the programs commonly used for SIMS automated uranium-bearing particle searches (e.g. P-search by Evans Analytical) still have to be updated to a version compatible with the IMS 7f software. In this study, the automated detection of uranium-bearing particles was performed using a FEI XL 30 environmental SEM fitted with an EDAX system. An adaptation of the Gun Shot Residue (GSR) forensic software allows the automatic search for uranium-containing particles using back-scattered electron image analysis and qualitative micro-analysis of major elemental composition by energy-dispersive X-ray spectrometry. In addition, secondary electron images of uranium-containing particles can be acquired in order to characterize their morphology. An overnight GSR run may investigate a ∼ 1 cm2 deposition area, detecting with high probability all uranium-bearing particles with diameter > 1 μm. The GSR program provides a listing of uranium-bearing particle coordinates relative to the SEM sample stage. Compared to SIMS detection, this lower-cost method presents some advantages: it is non-destructive, not susceptible to isobaric interferences, and provides additional relevant information on individual particles (e.g. volume, morphology, and major elemental composition). Compared to SIMS particle detection, the main drawback of this technique is that it is not sensitive to the 235U enrichment of the detected particles. As a consequence, no prioritization can be made among the particles to be analyzed for isotopic ratios.
SIMS analysis of uranium isotopic ratios -- About 40 particles selected among the uranium-bearing particles previously detected by SEM could be analyzed

  9. Determination of aflatoxins in food samples by automated on-line in-tube solid-phase microextraction coupled with liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Nonaka, Y; Saito, K; Hanioka, N; Narimatsu, S; Kataoka, H

    2009-05-15

A simple and sensitive automated method for the determination of aflatoxins (B1, B2, G1, and G2) in nuts, cereals, dried fruits, and spices was developed, consisting of in-tube solid-phase microextraction (SPME) coupled with liquid chromatography-mass spectrometry (LC-MS). Aflatoxins were separated within 8 min by high-performance liquid chromatography using a Zorbax Eclipse XDB-C8 column with methanol/acetonitrile (60/40, v/v): 5 mM ammonium formate (45:55) as the mobile phase. Electrospray ionization conditions in the positive ion mode were optimized for MS detection of aflatoxins. The pseudo-molecular ions [M+H](+) were used to detect aflatoxins in selected ion monitoring (SIM) mode. The optimum in-tube SPME conditions were 25 draw/eject cycles of 40 microL of sample using a Supel-Q PLOT capillary column as the extraction device. The extracted aflatoxins were readily desorbed from the capillary by passage of the mobile phase, and no carryover was observed. Using the in-tube SPME LC-MS with SIM method, good linearity of the calibration curve (r>0.9994) was obtained in the concentration range of 0.05-2.0 ng/mL using aflatoxin M1 as an internal standard, and the detection limits (S/N=3) of aflatoxins were 2.1-2.8 pg/mL. The in-tube SPME method showed >23-fold higher sensitivity than the direct injection method (10 microL injection volume). The within-day and between-day precision (relative standard deviations) at a concentration of 1 ng/mL aflatoxin mixture were below 3.3% and 7.7% (n=5), respectively. This method was applied successfully to the analysis of food samples without interference peaks. The recoveries of aflatoxins spiked into nuts and cereals were >80%, with low relative standard deviations. Aflatoxins were detected at <10 ng/g in several commercial food samples. PMID:19328492
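The internal-standard calibration described above (analyte/IS response ratio fitted against concentration, linear over 0.05-2.0 ng/mL) has this general shape; the response ratios below are simulated, not the paper's aflatoxin data:

```python
# Hypothetical calibration: analyte peak area divided by internal-standard
# peak area at each standard concentration.
conc  = [0.05, 0.1, 0.5, 1.0, 2.0]            # ng/mL
ratio = [0.026, 0.051, 0.252, 0.498, 1.003]   # analyte area / IS area

# Manual least-squares fit of ratio = slope * conc + intercept.
n = len(conc)
mx, my = sum(conc) / n, sum(ratio) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, ratio))
sxx = sum((x - mx) ** 2 for x in conc)
slope = sxy / sxx
intercept = my - slope * mx

# Quantify an unknown sample from its measured response ratio.
unknown_ng_ml = (0.40 - intercept) / slope
```

Ratioing against the internal standard (aflatoxin M1 in the study) compensates for extraction and injection variability before the line is fitted.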

  10. An automated system for access to derived climate indices in support of ecological impacts assessments and resource management

    Science.gov (United States)

    Walker, J.; Morisette, J. T.; Talbert, C.; Blodgett, D. L.; Kunicki, T.

    2012-12-01

    A U.S. Geological Survey team is working with several providers to establish standard data services for the climate projection data they host. To meet the needs of climate adaptation science and landscape management communities, the team is establishing a set of climate index calculation algorithms that will consume data from various providers and provide directly useful data derivatives. Climate projections coming from various scenarios, modeling centers, and downscaling methods are increasing in number and size. Global change impact modeling and assessment, generally, requires inputs in the form of climate indices or values derived from raw climate projections. This requirement puts a large burden on a community not familiar with climate data formats, semantics, and processing techniques and requires storage capacity and computing resources out of the reach of most. In order to fully understand the implications of our best available climate projections, assessments must take into account an ensemble of climate projections and potentially a range of parameters for calculation of climate indices. These requirements around data access and processing are not unique from project to project, or even among projected climate data sets, pointing to the need for a reusable tool to generate climate indices. The U.S. Geological Survey has developed a pilot application and supporting web service framework that automates the generation of climate indices. The web service framework consists of standards-based data servers and a data integration broker. The resulting system allows data producers to publish and maintain ownership of their data and data consumers to access climate derivatives via a simple to use "data product ordering" workflow. Data access and processing is completed on enterprise "cloud" computing resources and only the relatively small, derived climate indices are delivered to the scientist or land manager. These services will assist the scientific and land
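A derived climate index of the kind such a service would automate can be as simple as growing degree days; the base temperature and the daily series below are illustrative, not tied to any particular projection data set:

```python
def growing_degree_days(tmin_tmax_c, base_c=10.0):
    """Sum of max(0, mean daily temperature - base) over the series (°C-days)."""
    return sum(max(0.0, (tmin + tmax) / 2.0 - base_c)
               for tmin, tmax in tmin_tmax_c)

# Hypothetical daily (Tmin, Tmax) pairs in °C.
daily = [(8, 18), (10, 22), (5, 11), (12, 26)]
gdd = growing_degree_days(daily)
```

In the framework described, a calculation like this would run server-side over an ensemble of downscaled projections, so only the small derived index, not the raw climate grids, is delivered to the scientist or land manager.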

  11. Differential proteomic analysis of mouse macrophages exposed to adsorbate-loaded heavy fuel oil derived combustion particles using an automated sample-preparation workflow.

    Science.gov (United States)

    Kanashova, Tamara; Popp, Oliver; Orasche, Jürgen; Karg, Erwin; Harndorf, Horst; Stengel, Benjamin; Sklorz, Martin; Streibel, Thorsten; Zimmermann, Ralf; Dittmar, Gunnar

    2015-08-01

    Ship diesel combustion particles are known to cause broad cytotoxic effects and thereby strongly impact human health. Particles from heavy fuel oil (HFO) operated ships are considered as particularly dangerous. However, little is known about the relevant components of the ship emission particles. In particular, it is interesting to know if the particle cores, consisting of soot and metal oxides, or the adsorbate layers, consisting of semi- and low-volatile organic compounds and salts, are more relevant. We therefore sought to relate the adsorbates and the core composition of HFO combustion particles to the early cellular responses, allowing for the development of measures that counteract their detrimental effects. Hence, the semi-volatile coating of HFO-operated ship diesel engine particles was removed by stepwise thermal stripping using different temperatures. RAW 264.7 macrophages were exposed to native and thermally stripped particles in submersed culture. Proteomic changes were monitored by two different quantitative mass spectrometry approaches, stable isotope labeling by amino acids in cell culture (SILAC) and dimethyl labeling. Our data revealed that cells reacted differently to native or stripped HFO combustion particles. Cells exposed to thermally stripped particles showed a very differential reaction with respect to the composition of the individual chemical load of the particle. The cellular reactions of the HFO particles included reaction to oxidative stress, reorganization of the cytoskeleton and changes in endocytosis. Cells exposed to the 280 °C treated particles showed an induction of RNA-related processes, a number of mitochondria-associated processes as well as DNA damage response, while the exposure to 580 °C treated HFO particles mainly induced the regulation of intracellular transport. In summary, our analysis based on a highly reproducible automated proteomic sample-preparation procedure shows a diverse cellular response, depending on the

  12. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    International Nuclear Information System (INIS)

In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)
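The generic technique is easy to demonstrate outside SYVAC A/C: estimate a small tail probability by sampling from a shifted proposal and reweighting. The target, proposal, and shift below are textbook choices, not the DoE assessment's actual distributions:

```python
import math
import random

random.seed(0)

# Target quantity: P(X > 4) for X ~ N(0, 1), about 3.17e-5.

def crude(n):
    """Plain Monte Carlo: usually sees no tail events at modest n."""
    return sum(random.gauss(0, 1) > 4 for _ in range(n)) / n

def importance(n, shift=4.0):
    """Sample from N(shift, 1) and reweight by the density ratio."""
    total = 0.0
    for _ in range(n):
        y = random.gauss(shift, 1)
        if y > 4:
            # weight = target density / proposal density = exp(8 - 4y)
            total += math.exp(-y * y / 2) / math.exp(-(y - shift) ** 2 / 2)
    return total / n

est_crude = crude(20_000)        # noisy: often 0.0, the event is too rare
est_is = importance(20_000)      # close to 3.17e-5 with small variance
```

The crude estimator needs millions of runs to see even a handful of tail events, while the reweighted estimator converges with orders of magnitude fewer samples, which is the efficiency gain the study exploits.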

  13. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies library provides several new materials, media and mode of storing and communicating the information. Library Automation reduces the drudgery of repeated manual efforts in library routine. By use of library automation collection, Storage, Administration, Processing, Preservation and communication etc.

  14. An Assessment of the State of the Art of Curriculum Materials and a Status Assessment of Training Programs for Robotics/Automated Systems Technicians. Task Analysis and Descriptions of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This report presents the results of research conducted to determine the current state of the art of robotics/automated systems technician (RAST) training offered in the United States. Section I discusses the RAST curriculum project, of which this state-of-the-art review is a part, and offers a RAST job description. Section II describes the…

  15. An integrative pharmacological approach to radio telemetry and blood sampling in pharmaceutical drug discovery and safety assessment

    Directory of Open Access Journals (Sweden)

    Kamendi Harriet W

    2011-01-01

    Full Text Available Abstract Background A successful integration of the automated blood sampling (ABS and telemetry (ABST system is described. The new ABST system facilitates concomitant collection of physiological variables with blood and urine samples for determination of drug concentrations and other biochemical measures in the same rat without handling artifact. Method Integration was achieved by designing a 13 inch circular receiving antenna that operates as a plug-in replacement for the existing pair of DSI's orthogonal antennas which is compatible with the rotating cage and open floor design of the BASi Culex® ABS system. The circular receiving antenna's electrical configuration consists of a pair of electrically orthogonal half-toroids that reinforce reception of a dipole transmitter operating within the coil's interior while reducing both external noise pickup and interference from other adjacent dipole transmitters. Results For validation, measured baclofen concentration (ABST vs. satellite (μM: 69.6 ± 23.8 vs. 76.6 ± 19.5, p = NS and mean arterial pressure (ABST vs. traditional DSI telemetry (mm Hg: 150 ± 5 vs.147 ± 4, p = NS variables were quantitatively and qualitatively similar between rats housed in the ABST system and traditional home cage approaches. Conclusion The ABST system offers unique advantages over traditional between-group study paradigms that include improved data quality and significantly reduced animal use. The superior within-group model facilitates assessment of multiple physiological and biochemical responses to test compounds in the same animal. The ABST also provides opportunities to evaluate temporal relations between parameters and to investigate anomalous outlier events because drug concentrations, physiological and biochemical measures for each animal are available for comparisons.

  16. Validation of an automated ELISA system for detection of antibodies to Aleutian mink disease virus using blood samples collected in filter paper strips

    OpenAIRE

    Knuuttila, Anna; Aronen, Pirjo; Eerola, Majvor; Gardner, Ian A; Virtala, Anna-Maija K; Vapalahti, Olli

    2014-01-01

    Background Aleutian mink disease virus (AMDV) is the cause of a chronic immune complex disease, Aleutian disease (AD), which is common in mink-producing countries. In 2005, implementation of an AMDV eradication programme in Finland created a need for an automated high-throughput assay. The aim of this study was to validate an AMDV-VP2 -recombinant antigen ELISA, which we developed earlier, in an automated assay format for the detection of anti-AMDV antibodies in mink blood and to determine th...

  17. A content validated questionnaire for assessment of self reported venous blood sampling practices

    Directory of Open Access Journals (Sweden)

    Bölenius Karin

    2012-01-01

    Full Text Available Abstract Background Venous blood sampling is a common procedure in health care. It is strictly regulated by national and international guidelines. Deviations from guidelines due to human mistakes can cause patient harm. Validated questionnaires for health care personnel can be used to assess preventable "near misses"--i.e. potential errors and nonconformities during venous blood sampling practices that could transform into adverse events. However, no validated questionnaire that assesses nonconformities in venous blood sampling has previously been presented. The aim was to test a recently developed questionnaire in self reported venous blood sampling practices for validity and reliability. Findings We developed a questionnaire to assess deviations from best practices during venous blood sampling. The questionnaire contained questions about patient identification, test request management, test tube labeling, test tube handling, information search procedures and frequencies of error reporting. For content validity, the questionnaire was confirmed by experts on questionnaires and venous blood sampling. For reliability, test-retest statistics were used on the questionnaire answered twice. The final venous blood sampling questionnaire included 19 questions out of which 9 had in total 34 underlying items. It was found to have content validity. The test-retest analysis demonstrated that the items were generally stable. In total, 82% of the items fulfilled the reliability acceptance criteria. Conclusions The questionnaire could be used for assessment of "near miss" practices that could jeopardize patient safety and gives several benefits instead of assessing rare adverse events only. The higher frequencies of "near miss" practices allows for quantitative analysis of the effect of corrective interventions and to benchmark preanalytical quality not only at the laboratory/hospital level but also at the health care unit/hospital ward.

  18. Gluten-containing grains skew gluten assessment in oats due to sample grind non-homogeneity.

    Science.gov (United States)

    Fritz, Ronald D; Chen, Yumin; Contreras, Veronica

    2017-02-01

    Oats are easily contaminated with gluten-rich kernels of wheat, rye and barley. These contaminants are like gluten 'pills', shown here to skew gluten analysis results. Using R-Biopharm R5 ELISA, we quantified gluten in gluten-free oatmeal servings from an in-market survey. For samples with a 5-20ppm reading on a first test, replicate analyses provided results ranging 160ppm. This suggests sample grinding may inadequately disperse gluten to allow a single accurate gluten assessment. To ascertain this, and characterize the distribution of 0.25-g gluten test results for kernel contaminated oats, twelve 50g samples of pure oats, each spiked with a wheat kernel, showed that 0.25g test results followed log-normal-like distributions. With this, we estimate probabilities of mis-assessment for a 'single measure/sample' relative to the gluten content. PMID:27596406

  19. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  20. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    International Nuclear Information System (INIS)

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. This report discusses residual stresses in the 3013 outer, the SRS/Hanford and RFETS/LLNL inner containers, and teardrop samples used in studies to assess the potential for SCC in 3013 containers. Residual tensile stresses in the heat affected zones of the closure welds are of particular concern.

  1. Assessment of Residual Stresses in 3013 Inner and Outer Containers and Teardrop Samples

    Energy Technology Data Exchange (ETDEWEB)

    Stroud, Mary Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Prime, Michael Bruce [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Veirs, Douglas Kirk [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Berg, John M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Clausen, Bjorn [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Worl, Laura Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); DeWald, Adrian T. [Hill Engineering, LLC, Rancho Cordova, CA (United States)

    2015-12-08

    This report is an assessment performed by LANL that examines packaging for plutonium-bearing materials and the resilience of its design. This report discusses residual stresses in the 3013 outer, the SRS/Hanford and RFETS/LLNL inner containers, and teardrop samples used in studies to assess the potential for SCC in 3013 containers. Residual tensile stresses in the heat affected zones of the closure welds are of particular concern.

  2. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    After radioactive sea debris had been found on beaches near the BNFL, Sellafield, plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  3. Hydrochemical assessment of Semarang area using multivariate statistics: A sample based dataset

    OpenAIRE

    Irawan, Dasapta Erwin; Putranto, Thomas Triadi

    2016-01-01

    The following paper describes in brief the data set related to our project "Hydrochemical assessment of Semarang Groundwater Quality". All of 58 samples were taken in 1992, 1993, 2003, 2006, and 2007 using well point data from several reports from Ministry of Energy and Min- eral Resources and independent consultants. We provided 20 parameters in each samples (sample id, coord X, coord Y, well depth, water level, water elevation, TDS, pH, EC, K, Ca, Na, Mg, Cl, SO4, HCO3, ye...

  4. Automated headspace-solid-phase micro extraction-retention time locked-isotope dilution gas chromatography-mass spectrometry for the analysis of organotin compounds in water and sediment samples.

    Science.gov (United States)

    Devosa, Christophe; Vliegen, Maarten; Willaert, Bart; David, Frank; Moens, Luc; Sandra, Pat

    2005-06-24

    An automated method for the simultaneous determination of six important organotin compounds namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT) in water and sediment samples is described. The method is based on derivatization with sodium tetraethylborate followed by automated headspace-solid-phase micro extraction (SPME) combined with GC-MS under retention time locked (RTL) conditions. Home-synthesized deuterated organotin analogues were used as internal standards. Two high abundant fragment ions corresponding to the main tin isotopes Sn118 and Sn120 were chosen; one for quantification and one as qualifier ion. The method was validated and excellent figures of merit were obtained. Limits of quantification (LOQs) are from 1.3 to 15 ng l(-1) (ppt) for water samples and from 1.0 to 6.3 microg kg(-1) (ppb) for sediment samples. Accuracy for sediment samples was tested on spiked real-life sediment samples and on a reference PACS-2 marine harbor sediment. The developed method was used in a case-study at the harbor of Antwerp where sediment samples in different areas were taken and subsequently screened for TBT contamination. Concentrations ranged from 15 microg kg(-1) in the port of Antwerp up to 43 mg kg(-1) near a ship repair unit. PMID:16038329

  5. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop of list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  6. Preparation and validation of gross alpha/beta samples used in EML's quality assessment program

    International Nuclear Information System (INIS)

    A set of water and filter samples have been incorporated into the existing Environmental Measurements Laboratory's (EML) Quality Assessment Program (QAP) for gross alpha/beta determinations by participating DOE laboratories. The participating laboratories are evaluated by comparing their results with the EML value. The preferred EML method for measuring water and filter samples, described in this report, uses gas flow proportional counters with 2 in. detectors. Procedures for sample preparation, quality control and instrument calibration are presented. Liquid scintillation (LS) counting is an alternative technique that is suitable for quantifying both the alpha (241Am, 230Th and 238Pu) and beta (90Sr/90Y) activity concentrations in the solutions used to prepare the QAP water and air filter samples. Three LS counting techniques (Cerenkov, dual dpm and full spectrum analysis) are compared. These techniques may be used to validate the activity concentrations of each component in the alpha/beta solution before the QAP samples are actually prepared

  7. Assessment of Emotional Intelligence in a Sample of Prospective Secondary Education Teachers

    Science.gov (United States)

    Gutiérrez-Moret, Margarita; Ibáñez-Martinez, Raquel; Aguilar-Moya, Remedios; Vidal-Infer, Antonio

    2016-01-01

    In the past few years, skills related to emotional intelligence (EI) have acquired special relevance in the educational domain. This study assesses EI in a sample of 155 students of 5 different specialities of a Master's degree in Teacher Training for Secondary Education. Data collection was conducted through the administration of the Trait Meta…

  8. Mood disorders in everyday life : A systematic review of experience sampling and ecological momentary assessment studies

    NARCIS (Netherlands)

    Aan het Rot, M.; Hogenelst, Koen; Schoevers, R.A.

    2012-01-01

    In the past two decades, the study of mood disorder patients using experience sampling methods (ESM) and ecological momentary assessment (EMA) has yielded important findings. In patients with major depressive disorder (MDD), the dynamics of their everyday mood have been associated with various aspec

  9. Sample Size Requirements for Assessing Statistical Moments of Simulated Crop Yield Distributions

    NARCIS (Netherlands)

    Lehmann, N.; Finger, R.; Klein, T.; Calanca, P.

    2013-01-01

    Mechanistic crop growth models are becoming increasingly important in agricultural research and are extensively used in climate change impact assessments. In such studies, statistics of crop yields are usually evaluated without the explicit consideration of sample size requirements. The purpose of t

  10. Gluten-containing grains skew gluten assessment in oats due to sample grind non-homogeneity.

    Science.gov (United States)

    Fritz, Ronald D; Chen, Yumin; Contreras, Veronica

    2017-02-01

    Oats are easily contaminated with gluten-rich kernels of wheat, rye and barley. These contaminants are like gluten 'pills', shown here to skew gluten analysis results. Using R-Biopharm R5 ELISA, we quantified gluten in gluten-free oatmeal servings from an in-market survey. For samples with a 5-20ppm reading on a first test, replicate analyses provided results ranging 160ppm. This suggests sample grinding may inadequately disperse gluten to allow a single accurate gluten assessment. To ascertain this, and characterize the distribution of 0.25-g gluten test results for kernel contaminated oats, twelve 50g samples of pure oats, each spiked with a wheat kernel, showed that 0.25g test results followed log-normal-like distributions. With this, we estimate probabilities of mis-assessment for a 'single measure/sample' relative to the <20ppm regulatory threshold, and derive an equation relating the probability of mis-assessment to sample average gluten content.

  11. Automated assessment of Pavlovian conditioned freezing and shock reactivity in mice using the VideoFreeze system

    Directory of Open Access Journals (Sweden)

    Stephan G Anagnostaras

    2010-09-01

    Full Text Available The Pavlovian conditioned freezing paradigm has become a prominent mouse and rat model of learning and memory, as well as of pathological fear. Due to its efficiency, reproducibility, and well-defined neurobiology, the paradigm has become widely adopted in large-scale genetic and pharmacological screens. However, one major shortcoming of the use of freezing behavior has been that it has required the use of tedious hand scoring, or a variety of proprietary automated methods that are often poorly validated or difficult to obtain and implement. Here we report an extensive validation of the Video Freeze system in mice, a turn-key all-inclusive system for fear conditioning in small animals. Using digital video and near-infrared lighting, the system achieved outstanding performance in scoring both freezing and movement. Given the large-scale adoption of the conditioned freezing paradigm, we encourage similar validation of other automated systems for scoring freezing, or other behaviors.

  12. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population.

    Science.gov (United States)

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel

    2016-01-01

    Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population. PMID:27187429

  13. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    Directory of Open Access Journals (Sweden)

    Guillermo Rey Gozalo

    2016-05-01

    Full Text Available Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods were analysed and compared using the city of Talca (Chile as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.

  14. Threshold-dependent sample sizes for selenium assessment with stream fish tissue

    Science.gov (United States)

    Hitt, Nathaniel P.; Smith, David

    2013-01-01

    Natural resource managers are developing assessments of selenium (Se) contamination in freshwater ecosystems based on fish tissue concentrations. We evaluated the effects of sample size (i.e., number of fish per site) on the probability of correctly detecting mean whole-body Se values above a range of potential management thresholds. We modeled Se concentrations as gamma distributions with shape and scale parameters fitting an empirical mean-to-variance relationship in data from southwestern West Virginia, USA (63 collections, 382 individuals). We used parametric bootstrapping techniques to calculate statistical power as the probability of detecting true mean concentrations up to 3 mg Se/kg above management thresholds ranging from 4-8 mg Se/kg. Sample sizes required to achieve 80% power varied as a function of management thresholds and type-I error tolerance (α). Higher thresholds required more samples than lower thresholds because populations were more heterogeneous at higher mean Se levels. For instance, to assess a management threshold of 4 mg Se/kg, a sample of 8 fish could detect an increase of ∼ 1 mg Se/kg with 80% power (given α = 0.05), but this sample size would be unable to detect such an increase from a management threshold of 8 mg Se/kg with more than a coin-flip probability. Increasing α decreased sample size requirements to detect above-threshold mean Se concentrations with 80% power. For instance, at an α-level of 0.05, an 8-fish sample could detect an increase of ∼ 2 units above a threshold of 8 mg Se/kg with 80% power, but when α was relaxed to 0.2 this sample size was more sensitive to increasing mean Se concentrations, allowing detection of an increase of ∼ 1.2 units with equivalent power. Combining individuals into 2- and 4-fish composite samples for laboratory analysis did not decrease power because the reduced number of laboratory samples was compensated by increased precision of composites for estimating mean

  15. Automated assessment of β-cell area and density per islet and patient using TMEM27 and BACE2 immunofluorescence staining in human pancreatic β-cells.

    Directory of Open Access Journals (Sweden)

    Markus P Rechsteiner

    Full Text Available In this study we aimed to establish an unbiased automatic quantification pipeline to assess islet specific features such as β-cell area and density per islet based on immunofluorescence stainings. To determine these parameters, the in vivo protein expression levels of TMEM27 and BACE2 in pancreatic islets of 32 patients with type 2 diabetes (T2D and in 28 non-diabetic individuals (ND were used as input for the automated pipeline. The output of the automated pipeline was first compared to a previously developed manual area scoring system which takes into account the intensity of the staining as well as the percentage of cells which are stained within an islet. The median TMEM27 and BACE2 area scores of all islets investigated per patient correlated significantly with the manual scoring and with the median area score of insulin. Furthermore, the median area scores of TMEM27, BACE2 and insulin calculated from all T2D were significantly lower compared to the one of all ND. TMEM27, BACE2, and insulin area scores correlated as well in each individual tissue specimen. Moreover, islet size determined by costaining of glucagon and either TMEM27 or BACE2 and β-cell density based either on TMEM27 or BACE2 positive cells correlated significantly. Finally, the TMEM27 area score showed a positive correlation with BMI in ND and an inverse pattern in T2D. In summary, automated quantification outperforms manual scoring by reducing time and individual bias. The simultaneous changes of TMEM27, BACE2, and insulin in the majority of the β-cells suggest that these proteins reflect the total number of functional insulin producing β-cells. Additionally, β-cell subpopulations may be identified which are positive for TMEM27, BACE2 or insulin only. Thus, the cumulative assessment of all three markers may provide further information about the real β-cell number per islet.

  16. The Automated Geospatial Watershed Assessment Tool (AGWA): Developing Post-Fire Model Parameters Using Precipitation and Runoff Records from Gauged Watersheds

    Science.gov (United States)

    Sheppard, B. S.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.; Canfield, E.; Sidman, G.

    2014-12-01

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impacts of wildfire on runoff and erosion. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of a suite of hydrologic and erosion models (RHEM, WEPP, KINEROS2 and SWAT). Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM). The watershed model elements are then intersected with terrain, soils, and land cover data layers to derive the requisite model input parameters. With the addition of a burn severity map AGWA can be used to model post wildfire changes to a catchment. By applying the same design storm to burned and unburned conditions a rapid assessment of the watershed can be made and areas that are the most prone to flooding can be identified. Post-fire precipitation and runoff records from gauged forested watersheds are now being used to make improvements to post fire model input parameters. Rainfall and runoff pairs have been selected from these records in order to calibrate parameter values for surface roughness and saturated hydraulic conductivity used in the KINEROS2 model. Several objective functions will be tried in the calibration process. Results will be validated. Currently Department of Interior Burn Area Emergency Response (DOI BAER) teams are using the AGWA-KINEROS2 modeling interface to assess hydrologically imposed risk immediately following wild fire. These parameter refinements are being made to further improve the quality of these assessments.

  17. A hybrid DNA extraction method for the qualitative and quantitative assessment of bacterial communities from poultry production samples.

    Science.gov (United States)

    Rothrock, Michael J; Hiett, Kelli L; Gamble, John; Caudill, Andrew C; Cicconi-Hogan, Kellie M; Caporaso, J Gregory

    2014-01-01

    The efficacy of DNA extraction protocols can be highly dependent upon both the type of sample being investigated and the types of downstream analyses performed. Considering that the use of new bacterial community analysis techniques (e.g., microbiomics, metagenomics) is becoming more prevalent in the agricultural and environmental sciences and many environmental samples within these disciplines can be physiochemically and microbiologically unique (e.g., fecal and litter/bedding samples from the poultry production spectrum), appropriate and effective DNA extraction methods need to be carefully chosen. Therefore, a novel semi-automated hybrid DNA extraction method was developed specifically for use with environmental poultry production samples. This method is a combination of the two major types of DNA extraction: mechanical and enzymatic. A two-step intense mechanical homogenization step (using bead-beating specifically formulated for environmental samples) was added to the beginning of the "gold standard" enzymatic DNA extraction method for fecal samples to enhance the removal of bacteria and DNA from the sample matrix and improve the recovery of Gram-positive bacterial community members. Once the enzymatic extraction portion of the hybrid method was initiated, the remaining purification process was automated using a robotic workstation to increase sample throughput and decrease sample processing error. In comparison to the strict mechanical and enzymatic DNA extraction methods, this novel hybrid method provided the best overall combined performance when considering quantitative (using 16S rRNA qPCR) and qualitative (using microbiomics) estimates of the total bacterial communities when processing poultry feces and litter samples. PMID:25548939

  18. Meconium samples used to assess infant exposure to the components of ETS during pregnancy

    Directory of Open Access Journals (Sweden)

    Sylwia Narkowicz

    2015-12-01

    Full Text Available Objectives: The aim of the study was to use meconium samples to assess fetal exposure to compounds present in environmental tobacco smoke (ETS. Material and Methods: In order to assess fetal exposure to toxic tobacco smoke compounds, samples of meconium from the offspring of women with different levels of tobacco smoke exposure, and the samples of saliva from the mothers were analyzed. Thiocyanate ion as a biomarker of tobacco smoke exposure, and other ions that are indices of such exposure were determined by means of ion chromatography. Results: The results of ion chromatography analysis of the meconium and maternal saliva samples for the presence of cations and anions (including thiocyanate ion indicate that the concentration level of specific ions depends on the intensity of environmental tobacco smoke exposure of pregnant women. Conclusions: Based on the results, it can be concluded that meconium samples can be used to determine the substances from tobacco smoke. The results confirm the effect of smoking during pregnancy on the presence and content of substances from tobacco smoke.

  19. Assessing genetic polymorphisms using DNA extracted from cells present in saliva samples

    Directory of Open Access Journals (Sweden)

    Nemoda Zsofia

    2011-12-01

    Full Text Available Abstract Background Technical advances following the Human Genome Project revealed that high-quality and -quantity DNA may be obtained from whole saliva samples. However, the usability of previously collected samples and the effects of environmental conditions on the samples during collection have not been assessed in detail. In five studies we document the effects of sample volume, handling and storage conditions, type of collection device, and oral sampling location on quantity, quality, and genetic assessment of DNA extracted from cells present in saliva. Methods Saliva samples were collected from ten adults in each study. Saliva volumes from 0.10 to 1.0 ml, different saliva collection devices, sampling locations in the mouth, room temperature storage, and multiple freeze-thaw cycles were tested. One representative single nucleotide polymorphism (SNP) in the catechol-O-methyltransferase gene (COMT, rs4680) and one representative variable number of tandem repeats (VNTR) in the serotonin transporter gene (5-HTTLPR: serotonin transporter linked polymorphic region) were selected for genetic analyses. Results The smallest tested whole saliva volume of 0.10 ml yielded, on average, 1.43 ± 0.77 μg DNA and gave accurate genotype calls in both genetic analyses. The usage of collection devices reduced the amount of DNA extracted from the saliva filtrates compared to the whole saliva sample, as 54-92% of the DNA was retained on the device. An "adhered cell" extraction enabled recovery of this DNA and provided good quality and quantity DNA. The DNA from both the saliva filtrates and the adhered cell recovery provided accurate genotype calls. The effects of storage at room temperature (up to 5 days), repeated freeze-thaw cycles (up to 6 cycles), and oral sampling location on DNA extraction and on genetic analysis from saliva were negligible. Conclusions Whole saliva samples with volumes of at least 0.10 ml were sufficient to extract good quality and quantity DNA. Using

  20. Methods for collecting benthic invertebrate samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection, as well as methods and equipment for qualitative multihabitat sampling and semi-quantitative single-habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.

  1. Molecular Method To Assess the Diversity of Burkholderia Species in Environmental Samples

    OpenAIRE

    Salles, J.; de Souza, H.R.; van Elsas, J.D.

    2002-01-01

    In spite of the importance of many members of the genus Burkholderia in the soil microbial community, no direct method to assess the diversity of this genus has been developed so far. The aim of this work was the development of soil DNA-based PCR-denaturing gradient gel electrophoresis (DGGE), a powerful tool for studying the diversity of microbial communities, for detection and analysis of the Burkholderia diversity in soil samples. Primers specific for the genus Burkholderia were developed ...

  2. PREVALENCE AND ANTIMICROBIAL RESISTANCE ASSESSMENT OF SUBCLINICAL MASTITIS IN MILK SAMPLES FROM SELECTED DAIRY FARMS

    OpenAIRE

    Murugaiyah Marimuthu; Faez Firdaus Jesse Abdullah; Konto Mohammed; Sangeetha D/O Sarvananthan Poshpum; Lawan Adamu; Abdinasir Yusuf Osman; Yusuf Abba; Abdulnasir Tijjani

    2014-01-01

    This study was conducted in order to determine the prevalence and bacteriological assessment of subclinical mastitis and antimicrobial resistance of bacterial isolates from dairy cows in different farms around Selangor, Malaysia. A total of 120 milk samples from 3 different farms were randomly collected and tested for subclinical mastitis using California Mastitis Test (CMT), as well as for bacterial culture for isolation, identification and antimicrobial resistance. The most prevalent bacter...

  3. HydroCrowd: Citizen-empowered snapshot sampling to assess the spatial distribution of stream

    Science.gov (United States)

    Kraft, Philipp; Breuer, Lutz; Bach, Martin; Aubert, Alice H.; Frede, Hans-Georg

    2016-04-01

    Large parts of the groundwater bodies in Central Europe show elevated nitrate concentrations. While groundwater samples characterize water quality over a longer period, surface water resources, in particular streams, may be subject to fast concentration fluctuations, so measurements distributed over time cannot be compared. Thus, sampling should be done within a short time frame (snapshot sampling). To describe the nitrogen status of streams in Germany, we organized a crowdsourcing experiment in the form of a snapshot sampling on a single day. We selected a national holiday in fall 2013 (Oct 3rd) to ensure that a) volunteers have time to take a sample, b) stream water is unlikely to be influenced by recent agricultural fertilizer application, and c) low flow conditions are likely. We distributed 570 cleaned sample flasks to volunteers and received 280 filled flasks back with coordinates and other metadata about the sampled stream. The volunteers were asked to visit any stream outside of settlements and fill the flask with water from that stream. The samples were analyzed in our lab for concentrations of nitrate, ammonium and dissolved organic nitrogen (DON); results are presented as a map on the web site http://www.uni-giessen.de/hydrocrowd. The measured results are related to catchment features such as population density, soil properties, and land use derived from national geodata sources. The statistical analyses revealed a significant correlation between nitrate and the fraction of arable land (0.46), as well as soil humus content (0.37), but a weak correlation with population density (0.12). DON correlations were weak but significant with humus content (0.14) and arable land (0.13). The mean contribution of DON to total dissolved nitrogen was 22%. Crowdsourcing turned out to be a useful method to assess the spatial distribution of stream solutes, as a considerable number of samples was collected with comparatively little effort on a single day.
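
    The correlation step reported above can be sketched in a few lines of Python. The per-catchment nitrate concentrations and arable-land fractions below are hypothetical stand-ins for illustration, not the HydroCrowd data:

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical per-catchment values: stream nitrate (mg/L) against the
    # fraction of arable land in the catchment (made-up numbers).
    nitrate = [2.1, 5.4, 8.9, 3.3, 12.0, 7.5]
    arable = [0.10, 0.35, 0.60, 0.20, 0.80, 0.50]
    r = pearson(nitrate, arable)
    ```

    The same function applies unchanged to the other reported pairs (nitrate vs. humus content, DON vs. arable land), one catchment per list element.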

  4. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory, to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff....... Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory...... with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  5. Central Colorado Assessment Project (CCAP)-Geochemical data for rock, sediment, soil, and concentrate sample media

    Science.gov (United States)

    Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.

    2010-01-01

    This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. In addition, geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have been included in this database. Along with data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. In addition, USGS data that have been excluded previously from the NGDB because the data predate earliest USGS geochemical databases, or were once excluded for programmatic reasons

  6. Assessment of metal concentrations in sediment samples from Billings reservoir, Rio Grande tributary, Sao Paulo, Brazil

    International Nuclear Information System (INIS)

    The present study chemically characterized sediment samples from the Billings reservoir, Rio Grande tributary, in the Metropolitan region of Sao Paulo, by determining metal concentrations and other elements of interest. The chosen chemical parameters for this characterization were Aluminum, Arsenic, Barium, Cadmium, Copper, Chromium, Iron, Lead, Manganese, Mercury, Nickel, Selenium and Zinc. These parameters are also used in the water quality index, with the exception of Selenium. The concentrations were determined through different analytical techniques such as atomic absorption spectrometry (FAAS, GFAAS and CVAAS), optical emission spectrometry (ICP OES) and neutron activation analysis. These analytical methodologies were assessed for precision, accuracy and detection and/or quantification limits for the sediment elements in question. Advantages and disadvantages of each technique for each element and its concentration were also discussed. From these assessments, the most suitable technique was selected for the routine determination of each element in sediment samples. This assessment also verified that digestion in a closed microwave system with nitric acid is efficient for the evaluation of extracted metals of environmental interest. The analytical techniques chosen were equally efficient for metals determination. In the case of Cd and Pb, the FAAS technique was selected due to better results than ICP OES, as it does not present matrix interference. The concentration values obtained for the metals As, Cd, Cu, Cr, Hg, Ni, Pb and Zn in the sediment samples were compared to the Canadian Council of Ministers of the Environment (CCME) TEL and PEL values. (author)

  8. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  9. Contamination assessment in microbiological sampling of the Eyreville core, Chesapeake Bay impact structure

    Science.gov (United States)

    Gronstal, A.L.; Voytek, M.A.; Kirshtein, J.D.; von der Heyde; Lowit, M.D.; Cockell, C.S.

    2009-01-01

    Knowledge of the deep subsurface biosphere is limited due to difficulties in recovering materials. Deep drilling projects provide access to the subsurface; however, contamination introduced during drilling poses a major obstacle in obtaining clean samples. To monitor contamination during the 2005 International Continental Scientific Drilling Program (ICDP)-U.S. Geological Survey (USGS) deep drilling of the Chesapeake Bay impact structure, four methods were utilized. Fluorescent microspheres were used to mimic the ability of contaminant cells to enter samples through fractures in the core material during retrieval. Drilling mud was infused with a chemical tracer (Halon 1211) in order to monitor penetration of mud into cores. Pore water from samples was examined using excitation-emission matrix (EEM) fluorescence spectroscopy to characterize dissolved organic carbon (DOC) present at various depths. DOC signatures at depth were compared to signatures from drilling mud in order to identify potential contamination. Finally, microbial contaminants present in drilling mud were identified through 16S ribosomal deoxyribonucleic acid (rDNA) clone libraries and compared to species cultured from core samples. Together, these methods allowed us to categorize the recovered core samples according to the likelihood of contamination. Twenty-two of the 47 subcores that were retrieved were free of contamination by all the methods used and were subsequently used for microbiological culture and culture-independent analysis. Our approach provides a comprehensive assessment of both particulate and dissolved contaminants that could be applied to any environment with low biomass. © 2009 The Geological Society of America.

  10. Validating the use of biopsy sampling in contamination assessment studies of small cetaceans.

    Science.gov (United States)

    Méndez-Fernandez, Paula; Galluzzi Polesi, Paola; Taniguchi, Satie; de O Santos, Marcos C; Montone, Rosalinda C

    2016-06-15

    Remote biopsy sampling is the most common technique for acquiring samples from free-ranging marine mammals. However, such techniques can yield variable samples, sometimes only superficial skin and blubber biopsies. For decades, blubber has been used to monitor the exposure of marine mammals to persistent organic pollutants (POPs), but little is known regarding the variability of POPs as a function of blubber depth in small cetaceans, and the available literature offers variable results. Thus, the aim of the present study was to validate biopsy sampling for monitoring contaminant concentrations in small, free-ranging cetaceans. Samples from the dorsal blubber of 10 incidentally captured Atlantic spotted dolphins (Stenella frontalis) were separated into two different layers (outer and inner) to investigate the influence of sampling depth on POP concentrations. POP concentrations were compared to those of the full blubber layer. The results revealed no significant differences in lipid content between males and females or among the inner, outer and full blubber layers (p>0.05). Moreover, the wet and lipid weight concentrations of all POP classes analysed [i.e. polychlorinated biphenyls (PCBs), dichlorodiphenyltrichloroethanes (DDTs), polybrominated diphenyl ethers (PBDEs), hexachlorobenzene (HCB), hexachlorocyclohexanes (HCHs), chlordanes (CHLs) and mirex] did not differ significantly with blubber depth (p>0.05). POP classes followed the same decreasing order of wet weight concentrations in blubber layers and full blubber: PCBs>DDTs>PBDEs>mirex>HCB>HCHs>CHLs. Moreover, there was a low degree of differentiation in the accumulation of POP congeners. The present findings indicated that the distribution of contaminants was homogeneous with blubber depth, which validates the use of biopsy sampling for the assessment of contaminants in small cetaceans. PMID:27113024

  11. Assessing the effects of sampling design on water quality status classification

    Science.gov (United States)

    Lloyd, Charlotte; Freer, Jim; Johnes, Penny; Collins, Adrian

    2013-04-01

    The Water Framework Directive (WFD) requires continued reporting of the water quality status of all European waterbodies, with this status partly determined by the time a waterbody exceeds different pollution concentration thresholds. Routine water quality monitoring most commonly takes place at weekly to monthly intervals, meaning that potentially important pollution events can be missed. This has the potential to result in the misclassification of water quality status. Against this context, this paper investigates the implications of sampling design for a range of existing water quality status metrics routinely applied to WFD compliance assessments. Previous research has investigated the effect of sampling design on the calculation of annual nutrient and sediment loads using a variety of different interpolation and extrapolation models. This work builds on this foundation, extending the analysis to include the effects of sampling regime on flow- and concentration-duration curves as well as threshold-exceedance statistics, which form an essential part of WFD reporting. The effects of sampling regime on both the magnitude of the summary metrics and their corresponding uncertainties are investigated. This analysis is being undertaken on data collected as part of the Hampshire Avon Demonstration Test Catchment (DTC) project, a DEFRA-funded initiative investigating cost-effective solutions for reducing diffuse pollution from agriculture. The DTC monitoring platform is collecting water quality data at a variety of temporal resolutions and using differing collection methods, including weekly grab samples, daily ISCO autosamples and high-resolution samples (15-30 min time step) using analysers in situ on the river bank. Datasets collected during 2011-2013 were used to construct flow- and concentration-duration curves. A bootstrapping methodology was employed to resample randomly the individual datasets and produce distributions of the curves in order to quantify the
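
    A minimal sketch of the bootstrap idea described above, applied to a threshold-exceedance statistic. The concentration values and threshold are hypothetical illustrations, not DTC data:

    ```python
    import random

    def exceedance_fraction(conc, threshold):
        """Fraction of samples whose concentration exceeds the threshold."""
        return sum(c > threshold for c in conc) / len(conc)

    def bootstrap_ci(conc, threshold, n_boot=2000, alpha=0.05, seed=42):
        """Percentile-bootstrap confidence interval for the exceedance fraction:
        resample the dataset with replacement, recompute the statistic each time,
        and read off the empirical percentiles."""
        rng = random.Random(seed)
        stats = sorted(
            exceedance_fraction([rng.choice(conc) for _ in conc], threshold)
            for _ in range(n_boot)
        )
        lo = stats[int(alpha / 2 * n_boot)]
        hi = stats[int((1 - alpha / 2) * n_boot) - 1]
        return lo, hi

    # Hypothetical nitrate concentrations (mg/L) and a WFD-style threshold.
    conc = [1.2, 3.4, 0.8, 5.6, 2.2, 7.1, 0.9, 4.4, 3.0, 6.2, 1.5, 2.8]
    point = exceedance_fraction(conc, 3.0)
    lo, hi = bootstrap_ci(conc, 3.0)
    ```

    The same resampling loop can wrap any summary metric (e.g. a percentile of a concentration-duration curve) in place of `exceedance_fraction`.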

  12. Determination of Organochlorine Pesticides in Surface Water Samples by Fully Automated Quantitative Concentrator-Gas Chromatography

    Institute of Scientific and Technical Information of China (English)

    曹旭静

    2016-01-01

    Organochlorine pesticides in surface water were extracted with n-hexane, and the extract was concentrated to 1 mL with a fully automated quantitative evaporative concentrator at a water bath temperature of 35 °C and a vacuum of 300 mbar; a single sample requires only 25 min. The organochlorine pesticides were then determined by gas chromatography after sample pre-treatment by liquid-liquid extraction with n-hexane and concentration with the fully automated quantitative concentrator. The detection limits of the method for organochlorine pesticides were in the range of 0.001-0.008 μg/L, and the average recoveries were 78.6%-104%. With its low detection limits, good precision, time and labor savings, and high degree of automation, the method is suitable for monitoring large batches of samples.

  13. Bayesian Reliability Modeling and Assessment Solution for NC Machine Tools under Small-sample Data

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaojun; KAN Yingnan; CHEN Fei; XU Binbin; CHEN Chuanhai; YANG Chuangui

    2015-01-01

    Although Markov chain Monte Carlo (MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control (NC) machine tools is proposed on the basis of Bayesian theory. An expert-judgment process for fusing multi-source prior information is developed to obtain the Weibull parameters' prior distributions and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to the two-parameter Weibull distribution to derive formulas for the parameters' posterior distributions and overcome the computational difficulty of high-dimensional integration. The method is then applied to real data from a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures (MTBF). The relative error of the proposed method is 5.8020×10^-4 compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
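
    The grid-approximation step can be illustrated with a small sketch. The failure-time data, grid ranges, and flat prior below are assumptions made for illustration; the paper instead fuses multi-source expert-judgment prior information:

    ```python
    import math

    def weibull_loglik(data, k, lam):
        """Log-likelihood of failure times under a two-parameter Weibull(k, lam)."""
        return sum(
            math.log(k / lam) + (k - 1) * math.log(t / lam) - (t / lam) ** k
            for t in data
        )

    # Hypothetical time-between-failure data (hours); not the NC machine-tool data.
    data = [120.0, 340.0, 95.0, 410.0, 230.0, 180.0, 300.0]

    # Grid approximation: discretize (shape, scale), evaluate the posterior on
    # the grid, and normalize.  A flat prior is assumed here for simplicity.
    shapes = [0.5 + 0.05 * i for i in range(60)]   # shape k in [0.5, 3.45]
    scales = [50.0 + 10.0 * j for j in range(60)]  # scale lambda in [50, 640]
    loglik = {(k, lam): weibull_loglik(data, k, lam)
              for k in shapes for lam in scales}
    m = max(loglik.values())                       # log-sum-exp for stability
    post = {p: math.exp(v - m) for p, v in loglik.items()}
    z = sum(post.values())
    post = {p: w / z for p, w in post.items()}

    # Posterior mean of MTBF, using MTBF = lambda * Gamma(1 + 1/k) for a Weibull.
    mtbf = sum(w * lam * math.gamma(1 + 1 / k) for (k, lam), w in post.items())
    ```

    Summing the normalized grid weights replaces the high-dimensional integration that makes the exact posterior intractable; the grid resolution and bounds trade accuracy against cost.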

  14. Efficient assessment of modified nucleoside stability under conditions of automated oligonucleotide synthesis: characterization of the oxidation and oxidative desulfurization of 2-thiouridine.

    Science.gov (United States)

    Sochacka, E

    2001-01-01

    In order to efficiently assess the chemical stability of modified nucleosides to the reagents and conditions of automated oligonucleotide synthesis, we designed, developed and tested a scheme in which the modified nucleoside, directly attached to a solid support, is exposed to the cyclic chemistry of the instrument. The stability of 2-thiouridine against different oxidizers was investigated. tert-Butyl hydroperoxide (1 M) in anhydrous acetonitrile was a more effective oxidizer for the incorporation of 2-thiouridine into oligonucleotide chains than the same oxidizer in methylene chloride. Carbon tetrachloride/water in the presence of a basic catalyst was superior in maintaining the thiocarbonyl function, but its utility for RNA synthesis has yet to be fully tested, whereas 2-phenylsulfonyloxaziridine was a very efficient reagent for oxidative desulfurization of 2-thiouridine. PMID:11720000

  15. The potential for automated question answering in the context of genomic medicine: an assessment of existing resources and properties of answers.

    Science.gov (United States)

    Overby, Casey Lynnette; Tarczy-Hornoch, Peter; Demner-Fushman, Dina

    2009-01-01

    Knowledge gained in studies of genetic disorders is reported in a growing body of biomedical literature containing reports of genetic variation in individuals that map to medical conditions and/or response to therapy. These scientific discoveries need to be translated into practical applications to optimize patient care. Translating research into practice can be facilitated by supplying clinicians with research evidence. We assessed the role of existing tools in extracting answers to translational research questions in the area of genomic medicine. We evaluate the coverage of translational research terms in the Unified Medical Language System (UMLS) Metathesaurus; determine where answers are most often found in full-text articles; and determine common answer patterns. Findings suggest that we will be able to leverage the UMLS in the development of natural language processing algorithms for automated extraction of answers to translational research questions from biomedical text in the area of genomic medicine. PMID:19761578

  16. A machine vision system for automated non-invasive assessment of cell viability via dark field microscopy, wavelet feature selection and classification

    Directory of Open Access Journals (Sweden)

    Friehs Karl

    2008-10-01

    Full Text Available Abstract Background Cell viability is one of the basic properties indicating the physiological state of the cell; thus, it has long been one of the major considerations in biotechnological applications. Conventional methods for extracting information about cell viability usually need reagents to be applied on the targeted cells. These reagent-based techniques are reliable and versatile; however, some of them might be invasive and even toxic to the target cells. In support of automated noninvasive assessment of cell viability, a machine vision system has been developed. Results This system is based on a supervised learning technique. It learns from images of certain kinds of cell populations and trains classifiers. These trained classifiers are then employed to evaluate the images of given cell populations obtained via dark field microscopy. Wavelet decomposition is performed on the cell images. Energy and entropy are computed for each wavelet subimage as features. A feature selection algorithm is implemented to achieve better performance. Correlation between the results from the machine vision system and commonly accepted gold standards becomes stronger if wavelet features are utilized. The best performance is achieved with a selected subset of wavelet features. Conclusion The machine vision system based on dark field microscopy in conjunction with supervised machine learning and wavelet feature selection automates the cell viability assessment, and yields comparable results to commonly accepted methods. Wavelet features are found to be suitable to describe the discriminative properties of the live and dead cells in viability classification. According to the analysis, live cells exhibit morphologically more details and are intracellularly more organized than dead ones, which display more homogeneous and diffuse gray values throughout the cells. Feature selection increases the system's performance. The reason lies in the fact that feature
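
    The feature-extraction stage described above (wavelet subbands, then energy and entropy per subband) can be sketched with a hand-rolled one-level Haar transform. The tiny image patch and the specific Haar filter are illustrative assumptions; the paper does not specify the wavelet family:

    ```python
    import math

    def haar2d(img):
        """One-level 2D Haar transform: returns LL, LH, HL, HH subband
        coefficients, computed per non-overlapping 2x2 block.
        Assumes even image dimensions."""
        sub = {"LL": [], "LH": [], "HL": [], "HH": []}
        for r in range(0, len(img), 2):
            for c in range(0, len(img[0]), 2):
                a, b = img[r][c], img[r][c + 1]
                d, e = img[r + 1][c], img[r + 1][c + 1]
                sub["LL"].append((a + b + d + e) / 2)  # local average
                sub["LH"].append((a - b + d - e) / 2)  # horizontal detail
                sub["HL"].append((a + b - d - e) / 2)  # vertical detail
                sub["HH"].append((a - b - d + e) / 2)  # diagonal detail
        return sub

    def energy(coeffs):
        return sum(x * x for x in coeffs)

    def entropy(coeffs):
        """Shannon entropy of the normalized squared coefficients."""
        e = energy(coeffs)
        if e == 0:
            return 0.0
        ps = [x * x / e for x in coeffs if x != 0]
        return -sum(p * math.log2(p) for p in ps)

    # A tiny hypothetical dark-field image patch (gray values).
    img = [[10, 10, 50, 52],
           [10, 10, 51, 49],
           [12, 11, 80, 82],
           [11, 12, 81, 79]]
    features = {name: (energy(c), entropy(c)) for name, c in haar2d(img).items()}
    ```

    The resulting eight numbers (energy and entropy for each of the four subbands) would form one feature vector per cell image for the downstream classifier.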

  17. Quality Assessment of Attribute Data in GIS Based on Simple Random Sampling

    Institute of Scientific and Technical Information of China (English)

    LIU Chun; SHI Wenzhong; LIU Dajie

    2003-01-01

    On the basis of the principles of simple random sampling, the statistical model of rate of disfigurement (RD) is put forward and described in detail. According to the definition of simple random sampling for the attribute data in GIS, the mean and variance of the RD are deduced as the characteristic value of the statistical model in order to explain the feasibility of the accuracy measurement of the attribute data in GIS by using the RD. Moreover, on the basis of the mean and variance of the RD, the quality assessment method for attribute data of vector maps during the data collecting is discussed. The RD spread graph is also drawn to see whether the quality of the attribute data is under control. The RD model can synthetically judge the quality of attribute data, which is different from other measurement coefficients that only discuss accuracy of classification.
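
    Assuming the RD behaves as a simple error proportion under simple random sampling without replacement (the paper's exact statistical model may differ), its mean and variance can be estimated as follows:

    ```python
    import math

    def rd_estimate(errors, n, N):
        """Estimated rate of disfigurement (error rate) from a simple random
        sample of n attribute records out of a population of N, with 'errors'
        records found defective.  Returns (mean, variance) of the estimator
        under SRS without replacement."""
        p = errors / n
        fpc = 1 - n / N  # finite population correction
        var = fpc * p * (1 - p) / (n - 1)
        return p, var

    # Hypothetical check of 200 of 5000 attribute records, 14 found erroneous.
    p, var = rd_estimate(14, 200, 5000)
    se = math.sqrt(var)
    ```

    Plotting successive (p, se) pairs over time would give the "RD spread graph" used to judge whether attribute quality is under control.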

  18. Review on the fire risk evaluation items and sample fire models for its assessment

    International Nuclear Information System (INIS)

    NFPA-803, the prescriptive regulation serving as the Fire Protection Standard for Nuclear Power Plants (NPPs), has to be replaced with NFPA-805, whose main tenet is based on probabilistic analysis and a quantitative approach. With this insight, this paper introduces the evaluation items that must be reviewed and selected for the fire risk evaluation, and the sample Fire Models for their assessment, when the new Standard is applied in NPPs. In addition, it is suggested that some parts of the fire modeling programs need modification as well as complementary renewal if these kinds of tools are to be used comprehensively and with validity.

  19. The use of ESR technique for assessment of heating temperatures of archaeological lentil samples

    Science.gov (United States)

    Aydaş, Canan; Engin, Birol; Dönmez, Emel Oybak; Belli, Oktay

    2010-01-01

    Heat-induced paramagnetic centers in modern and archaeological lentils (Lens culinaris Medik.) were studied by the X-band (9.3 GHz) electron spin resonance (ESR) technique. The modern red lentil samples were heated in an electrical furnace at increasing temperatures in the range 70-500 °C. The ESR spectral parameters (intensity, g-value and peak-to-peak line width) of the heat-induced organic radicals were investigated for the modern red lentil samples. The obtained ESR spectra indicate that the relative number of heat-induced paramagnetic species and the peak-to-peak line widths depend on the temperature and heating time of the modern lentil. The g-values also depend on the heating temperature but not on the heating time. Heated modern red lentils produced a range of organic radicals with g-values from g = 2.0062 to 2.0035. ESR signals of carbonised archaeological lentil samples from two archaeological deposits of the Van province in Turkey were studied, and g-values, peak-to-peak line widths, intensities and elemental compositions were compared with those obtained for modern samples in order to assess at which temperatures these archaeological lentils were heated in prehistoric sites. The maximum temperatures of previous heating of the carbonised UA5 and Y11 lentil seeds are about 500 °C and above 500 °C, respectively.

  20. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based automated process control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  1. Genotoxicity assessment of water sampled from R-11 reservoir by means of allium test

    Energy Technology Data Exchange (ETDEWEB)

    Bukatich, E.; Pryakhin, E. [Urals Research Center for Radiation Medicine (Russian Federation); Geraskin, S. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)

    2014-07-01

    slides of root tip meristem were dyed with aceto-orcein. Approximately 150 ana-telophases were scored for each root, and 20-40 roots were analyzed for each water sample; in total, 3000-6000 ana-telophases were analyzed per water sample. Chromosome aberrations in ana-telophases (chromatid and chromosomal bridges and fragments) and mitotic abnormalities (multipolar mitoses and laggards) were scored. Data analysis was performed using R statistical software. The aberration frequency in water samples from the natural control reservoir (0.46 ± 0.12%) insignificantly exceeded the frequency of aberrations in distilled (0.15 ± 0.08%) and bottled water (0.33 ± 0.08%). The average frequency of aberrant cells in the root meristem of onions germinated in water samples from the R-11 reservoir (1.36 ± 0.24%) was about 3 times higher than in the controls. Mitotic activity in the root meristem was slightly inhibited in bulbs germinated in the R-11 sample, but this effect was statistically insignificant. There was no difference in the types of aberrations among the water samples, only in the frequency of abnormalities. Thus, genotoxicity assessment of water sampled from the R-11 reservoir by means of the allium test shows the presence of a genotoxic factor in water from the reservoir. Document available in abstract form only. (authors)
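
    The frequencies with standard errors quoted above can be reproduced with simple binomial counting statistics. A minimal sketch (the counts below are illustrative, not the study's raw data):

```python
# Sketch of the aberrant-cell frequency calculation used in an allium test,
# assuming binomial counting statistics. Counts are hypothetical.

def aberration_frequency(aberrant: int, scored: int) -> tuple:
    """Return the aberration frequency (%) and its binomial standard error (%)."""
    p = aberrant / scored
    se = (p * (1 - p) / scored) ** 0.5
    return 100 * p, 100 * se

# e.g. 41 aberrant ana-telophases out of 3000 scored (hypothetical counts):
freq, se = aberration_frequency(41, 3000)
print(f"{freq:.2f} +/- {se:.2f} %")
```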

  2. A preliminary study to assess the construct validity of a cultural intelligence measure on a South African sample

    OpenAIRE

    Bright Mahembe; Amos S. Engelbrecht

    2014-01-01

    Orientation: Cultural intelligence is an essential social competence for effective individual interaction in a cross-cultural context. The cultural intelligence scale (CQS) is used extensively for assessing cultural intelligence; nevertheless, its reliability and validity on a South African sample are yet to be ascertained.Research purpose: The purpose of the current study was to assess the construct validity of the CQS on a South African sample. The results of the psychometric assessment off...

  3. Automated extraction and assessment of functional features of areal measured microstructures using a segmentation-based evaluation method

    Science.gov (United States)

    Hartmann, Wito; Loderer, Andreas

    2014-10-01

    In addition to currently available surface parameters according to ISO 4287:2010 and ISO 25178-2:2012, which are defined particularly for stochastic surfaces, a universal evaluation procedure is provided for geometrical, well-defined, microstructured surfaces. Since several million features (such as diameters, depths, etc.) are present on microstructured surfaces, segmentation techniques are used to automate the feature-based dimensional evaluation. By applying an additional extended 3D evaluation after the segmentation and classification procedure, the accuracy of the evaluation is improved compared with the direct evaluation of segments, and additional functional parameters can be derived. Advantages of the extended segmentation-based evaluation method include not only the ability to evaluate the manufacturing process statistically (e.g. by capability indices according to ISO 21747:2007 and ISO 3534-2:2013) and to derive statistically reliable values for the correction of microstructuring processes, but also the direct re-use of the evaluated parameters (including their statistical distributions) in simulations for the calculation of probabilities with respect to the functionality of the microstructured surface. The practical suitability of this method is demonstrated using examples of microstructures for the improvement of sliding and ink transfer for printing machines.
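
    A minimal sketch of the kind of process-capability computation (Cp, Cpk) that ISO 21747 formalizes, applied to a measured microstructure feature. The specification limits and measurement values below are hypothetical:

```python
# Sketch of Cp/Cpk capability indices for a measured feature (e.g. pocket
# depth in micrometres). All numbers are hypothetical illustration values.
import statistics

def capability(values, lsl: float, usl: float) -> tuple:
    """Return (Cp, Cpk) for measurements against lower/upper spec limits."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

depths = [49.8, 50.1, 50.0, 49.9, 50.2, 50.0, 49.7, 50.3]  # µm, hypothetical
cp, cpk = capability(depths, lsl=49.0, usl=51.0)
```

    A process is conventionally called capable when Cpk exceeds about 1.33; the statistical distribution of the feature can then be reused directly in functional simulations, as the abstract describes.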

  4. Assessing the Role of Automation in Managing of Iranian E-banking and its Impact on Social Benefit

    Directory of Open Access Journals (Sweden)

    Hamidreza Salmani MOJAVERI

    2011-06-01

    Full Text Available In the course of commercial development, banks have paid attention to creating structural changes in their receiving and payment systems and to easing the process of delivering services to customers. In fact, one reason for the general tendency toward electronic business is bank managers' attention to the importance and necessity of this phenomenon, which has led to their serious commitment to building a banking structure based on electronic methods. What makes E-banking services different from conventional methods is the quantitative and qualitative expansion of customer service. In other words, E-banking prepares the situation for the customer to receive wider and more diverse services. Furthermore, time and location no longer reduce or increase the services available to customers: the customer can control his or her financial activities at any time and anywhere, without visiting a bank branch. The aim of this paper is to illustrate the status of banking automation and its social and organizational consequences in the Iranian E-banking system, and to provide appropriate recommendations.

  5. Automated determination of the stable carbon isotopic composition (δ13C) of total dissolved inorganic carbon (DIC) and total nonpurgeable dissolved organic carbon (DOC) in aqueous samples: RSIL lab codes 1851 and 1852

    Science.gov (United States)

    Révész, Kinga M.; Doctor, Daniel H.

    2014-01-01

    The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
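
    The δ13C values that the CF-IRMS reports are expressed in the standard delta notation relative to the VPDB reference. A minimal sketch of that conversion (the measured ratio below is a hypothetical example, not RSIL data):

```python
# Sketch of delta notation: δ13C = (R_sample / R_VPDB - 1) * 1000, in per mill,
# where R = 13C/12C and R_VPDB is the Vienna Pee Dee Belemnite reference ratio.
R_VPDB = 0.011180  # 13C/12C of the VPDB standard

def delta13C(r_sample: float) -> float:
    """Return δ13C in per mill (‰) relative to VPDB."""
    return (r_sample / R_VPDB - 1) * 1000.0

# A hypothetical measured ratio, slightly depleted in 13C:
print(round(delta13C(0.010900), 2))
```

    The quoted precisions (0.3 ‰ for DIC, 0.5 ‰ for DOC) apply to repeated determinations of this delta value.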

  6. Goblet cells of the normal human bulbar conjunctiva and their assessment by impression cytology sampling.

    Science.gov (United States)

    Doughty, Michael J

    2012-07-01

    Goblet cells of the conjunctiva are the main source of mucus for the ocular surface. The objectives of this review are to consider the goblet cells as assessed by various histological, cytological and electron microscopy methods, and to assess the consistency of published reports (over more than 25 years) of goblet cell density (GCD) from impression cytology specimens from nominally healthy human subjects. Reported GCD values have been notably variable, with a range from 24 to 2226 cells/mm² for average values. Data analysis suggests that a high density of goblet cells should be expected for the healthy human conjunctiva, with a tendency toward higher values in samples taken from normally covered locations (inferior and superior bulbar conjunctiva) of the open eye (at 973 +/- 789 cells/mm²) than in samples taken from exposed (interpalpebral) locations (at 427 +/- 376 cells/mm²). No obvious change in GCD was found with respect to age, perhaps because the variability of the data did not allow detection of any age-related decline in GCD. Analyses of published data from 33 other sources indicated a trend for GCD to be lower than normal across a spectrum of ocular surface diseases.

  7. GY SAMPLING THEORY IN ENVIRONMENTAL STUDIES 1: ASSESSING SOIL SPLITTING PROTOCOLS

    Science.gov (United States)

    Five soil sample splitting methods (riffle splitting, paper cone riffle splitting, fractional shoveling, coning and quartering, and grab sampling) were evaluated with synthetic samples to verify Pierre Gy sampling theory expectations. Individually prepared samples consisting of l...

  8. The impact of genetic heterogeneity on biomarker development in kidney cancer assessed by multiregional sampling

    International Nuclear Information System (INIS)

    Primary clear cell renal cell carcinoma (ccRCC) genetic heterogeneity may lead to an underestimation of the mutational burden detected from a single site evaluation. We sought to characterize the extent of clonal branching involving key tumor suppressor mutations in primary ccRCC and determine if genetic heterogeneity could limit the mutation profiling from a single region assessment. Ex vivo core needle biopsies were obtained from three to five different regions of resected renal tumors at a single institution from 2012 to 2013. DNA was extracted and targeted sequencing was performed on five genes associated with ccRCC (von Hippel-Lindau [VHL], PBRM1, SETD2, BAP1, and KDM5C). We constructed phylogenetic trees by inferring clonal evolution based on the mutations present within each core and estimated the predictive power of detecting a mutation for each successive tumor region sampled. We obtained 47 ex vivo biopsy cores from 14 primary ccRCCs (median tumor size 4.5 cm, IQR 4.0–5.9 cm). Branching patterns of various complexities were observed in tumors with three or more mutations. A VHL mutation was detected in nine tumors (64%), each time being present ubiquitously throughout the tumor. Other genes had various degrees of regional mutational variation. Based on the mutations' prevalence we estimated that three different tumor regions should be sampled to detect mutations in PBRM1, SETD2, BAP1, and/or KDM5C with 90% certainty. The mutational burden of renal tumors varies by region sampled. Single site assessment of key tumor suppressor mutations in primary ccRCC may not adequately capture the genetic predictors of tumor behavior.

  9. The impact of genetic heterogeneity on biomarker development in kidney cancer assessed by multiregional sampling.

    Science.gov (United States)

    Sankin, Alexander; Hakimi, Abraham A; Mikkilineni, Nina; Ostrovnaya, Irina; Silk, Mikhail T; Liang, Yupu; Mano, Roy; Chevinsky, Michael; Motzer, Robert J; Solomon, Stephen B; Cheng, Emily H; Durack, Jeremy C; Coleman, Jonathan A; Russo, Paul; Hsieh, James J

    2014-12-01

    Primary clear cell renal cell carcinoma (ccRCC) genetic heterogeneity may lead to an underestimation of the mutational burden detected from a single site evaluation. We sought to characterize the extent of clonal branching involving key tumor suppressor mutations in primary ccRCC and determine if genetic heterogeneity could limit the mutation profiling from a single region assessment. Ex vivo core needle biopsies were obtained from three to five different regions of resected renal tumors at a single institution from 2012 to 2013. DNA was extracted and targeted sequencing was performed on five genes associated with ccRCC (von Hippel-Lindau [VHL], PBRM1, SETD2, BAP1, and KDM5C). We constructed phylogenetic trees by inferring clonal evolution based on the mutations present within each core and estimated the predictive power of detecting a mutation for each successive tumor region sampled. We obtained 47 ex vivo biopsy cores from 14 primary ccRCCs (median tumor size 4.5 cm, IQR 4.0-5.9 cm). Branching patterns of various complexities were observed in tumors with three or more mutations. A VHL mutation was detected in nine tumors (64%), each time being present ubiquitously throughout the tumor. Other genes had various degrees of regional mutational variation. Based on the mutations' prevalence we estimated that three different tumor regions should be sampled to detect mutations in PBRM1, SETD2, BAP1, and/or KDM5C with 90% certainty. The mutational burden of renal tumors varies by region sampled. Single site assessment of key tumor suppressor mutations in primary ccRCC may not adequately capture the genetic predictors of tumor behavior. PMID:25124064
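
    The "three regions for 90% certainty" estimate follows from a simple detection model: if a mutation is present in a fraction p of tumor regions, the probability that at least one of n sampled regions carries it is 1 - (1 - p)^n. A minimal sketch (the prevalence value is hypothetical, chosen only to illustrate the ~3-region result):

```python
# Sketch of the multiregion detection estimate: smallest n such that
# 1 - (1 - p)**n >= certainty. The prevalence p below is hypothetical.
import math

def regions_for_certainty(p: float, certainty: float = 0.90) -> int:
    """Smallest number of regions giving detection probability >= certainty."""
    return math.ceil(math.log(1 - certainty) / math.log(1 - p))

print(regions_for_certainty(0.55))  # e.g. mutation present in 55% of regions
```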

  10. Determination of furan levels in commercial samples of baby food from Brazil and preliminary risk assessment.

    Science.gov (United States)

    Pavesi Arisseto, A; Vicente, E; De Figueiredo Toledo, M C

    2010-08-01

    Commercial baby food samples available on the Brazilian market (n = 31) were analysed for furan content using a gas chromatography-mass spectrometry method preceded by solid-phase microextraction. A limit of detection of 0.7 microg kg(-1), a limit of quantitation of 2.4 microg kg(-1), mean recoveries varying from 80% to 107%, and coefficients of variation ranging from 5.6% to 9.4% for repeatability and from 7.4% to 12.4% for within-laboratory reproducibility were obtained during an in-house validation. The levels of furan found in the samples were from not detected to 95.5 microg kg(-1). Samples containing vegetables and meat showed higher furan levels as compared with those containing only fruits. An exposure assessment showed furan intakes up to 2.4 microg kg(-1) body weight day(-1) (99th percentile) for babies fed exclusively with commercial baby foods. Margins of exposure obtained from intakes estimated in this work indicated a potential public health concern.
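
    The per-body-weight intake figures behind the exposure assessment follow from a simple calculation: concentration times daily consumption divided by body weight. A minimal sketch, using the maximum furan level reported above but assumed (hypothetical) consumption and body-weight values:

```python
# Sketch of a dietary intake estimate in µg per kg body weight per day.
# The 95.5 µg/kg concentration is from the abstract; the consumption and
# body weight are assumptions for illustration only.
def daily_intake(conc_ug_per_kg: float, food_kg_per_day: float, bw_kg: float) -> float:
    """Intake (µg/kg bw/day) from food concentration, consumption, and weight."""
    return conc_ug_per_kg * food_kg_per_day / bw_kg

# e.g. 95.5 µg/kg furan, 0.2 kg of baby food per day, 8 kg infant:
intake = daily_intake(95.5, 0.2, 8.0)
```

    Intakes of this order are what get compared against a benchmark dose to derive the margins of exposure mentioned in the abstract.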

  11. Assessments of Cancer Risk from Soil Samples in Gebeng Industrial Estate, Pahang and Amang Samples in Perak

    International Nuclear Information System (INIS)

    Industrial activities such as tin tailings and rare earth processing contribute radiological risk to human health and the environment. These activities can accumulate naturally occurring radioactive materials (NORM) at significant concentrations in the environment. The aim of this study was to determine the activity concentrations of thorium-232 (232Th), uranium-238 (238U) and potassium-40 (40K) in soil samples around the Gebeng Industrial Estate, Pahang, and in samples of ilmenite and monazite from three tin tailings processing plants in Perak, using gamma-ray spectrometry. The terrestrial gamma dose rate, the annual dose and the cancer risk were also determined. The activity concentrations of 232Th, 238U and 40K in the Gebeng soil samples were in the ranges 14.3 - 102.4, 23.8 - 81.3 and 73.3 - 451 Bq kg-1, respectively, while those for the ilmenite and monazite samples were in the ranges 259 - 166500, 194 - 28750 and 26.4 - 11991 Bq kg-1, respectively. The terrestrial gamma dose rate was 22 - 108 nGy h-1 at the Gebeng Industrial Estate and 390 - 6650 nGy h-1 at the tin tailings processing plants, and the corresponding annual doses were 0.02 - 0.15 and 0.47 - 68 mSv y-1, respectively. The study showed a cancer risk of 4 people per million in the Gebeng industrial area and of 3702 people per million at the tin tailings processing plants. The activity concentrations of soil from the industrial area were within the range of the Malaysian soil background reported by UNSCEAR 2000. The activity concentration, terrestrial gamma dose rate, annual dose and cancer risk were all lower in the industrial area than at the tin tailings processing plants, owing to the high thorium content of the monazite handled there. It is recommended that the environmental dose be monitored continuously in order to ensure the
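
    A minimal sketch of the standard conversion from soil activity concentrations to a terrestrial dose rate and annual effective dose, using the UNSCEAR 2000 dose-rate coefficients. The activity values below are illustrative mid-range numbers, not the study's data, and the 0.2 outdoor occupancy factor is an assumption:

```python
# Sketch: absorbed dose rate in air from soil activities (UNSCEAR 2000
# coefficients), then annual effective dose. Inputs are illustrative only.
def dose_rate_nGy_h(c_u238: float, c_th232: float, c_k40: float) -> float:
    """Absorbed dose rate in air (nGy/h) from activities in Bq/kg."""
    return 0.462 * c_u238 + 0.604 * c_th232 + 0.0417 * c_k40

def annual_dose_mSv(d_nGy_h: float, occupancy: float = 0.2) -> float:
    """Annual effective dose (mSv/y), using a 0.7 Sv/Gy conversion factor."""
    return d_nGy_h * 8760 * occupancy * 0.7 * 1e-6

d = dose_rate_nGy_h(50.0, 60.0, 250.0)  # Bq/kg, illustrative values
annual = annual_dose_mSv(d)
```

    With these inputs the dose rate comes out near 70 nGy/h and the annual dose below 0.1 mSv/y, i.e. within the Gebeng range quoted above.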

  12. Use of nuclear spectroscopic techniques for assessment of polluting elements in environmental samples

    International Nuclear Information System (INIS)

    The concentrations of elements and radioisotopes in sediment, soil, water and wild plant samples collected from Burullus Lake, Egypt, have been studied in order to understand current contamination due to agricultural and industrial wastewaters. Multiple approaches were applied to properly assess sediment contamination in the Burullus Lake. The distributions of Al, Fe and Mn in the lake's sediments are relatively homogeneous, with the exception of three locations in close proximity in the southeastern part with significantly high levels of Al and Fe. Based on the geo-accumulation indices, sediments collected from the lake can be categorized as unpolluted, with the exception of three locations that were very slightly polluted with Sr. High enrichment factors were obtained for Mn, Co, Cr, Cu and Zn. The MPIs indicate that one of the drains may have a major role in mobilizing major and trace metals in the lake environment, while cluster analysis indicates possible pollution from only three of the drainage channels. Comparisons with consensus-based sediment quality guidelines revealed that 100%, ∼69%, ∼92% and ∼15% of the samples exceeded the threshold effect concentration for Cr, Cu, Ni and Zn, respectively, with over 15% of the sample concentrations for Cr and Ni falling above the probable effect concentration; no samples exceeded either level for Pb. The concentration of 40K is uniform, and that of 137Cs is generally higher in the eastern part of the lake. The results indicate that 226Ra is less soluble in the lake environment than 232Th. Elemental concentrations in water have uniform distributions, and Fe, Mn, Co, Cr, Cu and Ni are more likely to exist in the soluble phase in the lake environment.
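
    The geo-accumulation index used for the "unpolluted" classification above is conventionally defined as Igeo = log2(Cn / (1.5·Bn)), with Cn the measured concentration and Bn the geochemical background. A minimal sketch with hypothetical values:

```python
# Sketch of the geo-accumulation index; both concentration values below are
# hypothetical illustration numbers, not the study's measurements.
import math

def igeo(measured: float, background: float) -> float:
    """Igeo = log2(Cn / (1.5 * Bn)); <= 0 is conventionally 'unpolluted'."""
    return math.log2(measured / (1.5 * background))

# e.g. 120 mg/kg measured against a 100 mg/kg background:
i = igeo(120.0, 100.0)
```

    A negative result, as here, falls in the "unpolluted" class; values between 0 and 1 indicate "unpolluted to moderately polluted", matching the scheme applied to the Sr anomalies.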

  13. Comparison of soil solution sampling techniques to assess metal fluxes from contaminated soil to groundwater.

    Science.gov (United States)

    Coutelot, F; Sappin-Didier, V; Keller, C; Atteia, O

    2014-12-01

    The unsaturated zone plays a major role in elemental fluxes in terrestrial ecosystems. A representative chemical analysis of soil pore water is required for the interpretation of soil chemical phenomena, and particularly to assess trace element (TE) mobility. This requires an optimal sampling system that avoids modifying the extracted soil water chemistry and allows an accurate estimation of solute fluxes. In this paper, the chemical composition of soil solutions sampled by Rhizon® samplers connected to a standard syringe was compared to that obtained with two other types of suction probes (Rhizon® + vacuum tube and Rhizon® + diverted flow system). We investigated the effects of different vacuum application procedures on the concentrations of spiked elements (Cr, As, Zn), mixed as powder into the first 20 cm of 100-cm columns, and of non-spiked elements (Ca, Na, Mg) in two types of columns (SiO2 sand and a mixture of kaolinite + SiO2 sand substrates). Rhizon® samplers were installed at different depths. The metal concentrations showed that (i) in sand, peak concentrations cannot be correctly sampled, so the flux cannot be estimated, and the errors can easily reach a factor of 2; (ii) in sand + clay columns, peak concentrations were larger, indicating that they could be sampled, but, due to sorption on clay, it was not possible to compare fluxes at different depths. The different samplers tested were not able to reflect the elemental flux to groundwater and, although the Rhizon® + syringe device was more accurate, the best solution remains a lysimeter whose bottom is kept continuously at a suction close to that existing in the soil. PMID:25277861

  14. Automated activation-analysis system

    International Nuclear Information System (INIS)

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  15. SU-E-I-89: Assessment of CT Radiation Dose and Image Quality for An Automated Tube Potential Selection Algorithm Using Pediatric Anthropomorphic and ACR Phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Mahmood, U; Erdi, Y; Wang, W [Memorial Sloan Kettering Cancer Center, NY, NY (United States)

    2014-06-01

    Purpose: To assess the impact of General Electric's automated tube potential selection algorithm, kV Assist (kVa), on radiation dose and image quality, with an emphasis on optimizing protocols based on noise texture. Methods: Radiation dose was assessed by inserting optically stimulated luminescence dosimeters (OSLs) throughout the body of a pediatric anthropomorphic phantom (CIRS). The baseline protocol was: 120 kVp, 80 mA, 0.7 s rotation time. Image quality was assessed by calculating the contrast-to-noise ratio (CNR) and noise power spectrum (NPS) from the ACR CT accreditation phantom. CNRs were calculated according to the steps described in the ACR CT phantom testing document. The NPS was determined by taking the 3D FFT of the uniformity section of the ACR phantom. NPS and CNR were evaluated with and without kVa and for all available adaptive statistical iterative reconstruction (ASiR) settings, ranging from 0 to 100%. Each NPS was also evaluated for its peak frequency difference (PFD) with respect to the baseline protocol. Results: For the baseline protocol, the CNR decreased from 0.460 ± 0.182 to 0.420 ± 0.057 when kVa was activated. When compared against the baseline protocol, the PFD at 40% ASiR yielded a decrease in noise magnitude, as realized by the increase in CNR to 0.620 ± 0.040. The liver dose decreased by 30% with kVa activation. Conclusion: Application of kVa reduces the liver dose by up to 30%. However, a reduction in image quality for abdominal scans occurs when using the automated tube voltage selection feature with the baseline protocol. As demonstrated by the CNR and NPS analysis, the texture and magnitude of the noise in images reconstructed at 40% ASiR were found to be the same as in our baseline images. We have demonstrated that a 30% dose reduction is possible when using 40% ASiR with kVa in pediatric patients.
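
    A minimal 2D sketch of the NPS estimate described above (the abstract uses a 3D FFT of the uniformity section; here a stack of synthetic 2D noise ROIs stands in, so the pixel size, ROI size, and noise level are all assumptions):

```python
# Sketch of a 2D noise power spectrum: detrend uniform ROIs, FFT, and average
# |FFT|^2 over realizations. Synthetic white noise is used, so the NPS is flat.
import numpy as np

def nps_2d(rois: np.ndarray, pixel_mm: float = 0.5) -> np.ndarray:
    """rois: (n, N, N) stack of uniform ROIs. Returns the 2D NPS (HU^2 mm^2)."""
    n, N, _ = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)  # remove DC term
    spectra = np.abs(np.fft.fft2(detrended)) ** 2
    return spectra.mean(axis=0) * (pixel_mm ** 2) / (N * N)

rng = np.random.default_rng(0)
rois = rng.normal(0.0, 10.0, size=(64, 32, 32))  # synthetic noise, sigma = 10
nps = nps_2d(rois)
# Parseval check: integrating the NPS recovers the noise variance (~100):
var_est = nps.sum() / (0.5 ** 2 * 32 * 32)
```

    With a real reconstruction, the location of the NPS peak frequency is what the PFD metric above compares between protocols.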

  16. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    21 CFR 864.5680 Automated heparin analyzer. (a) Identification. An automated heparin analyzer is a device used to determine the heparin level in a blood sample by mixing the sample with protamine (a...

  17. A simulation approach to assessing sampling strategies for insect pests: an example with the balsam gall midge.

    Directory of Open Access Journals (Sweden)

    R Drew Carleton

    Full Text Available Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with "pre-sampling" data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n ∼ 100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand was the most efficient, with sample means converging on true mean density for sample sizes of n ∼ 25-40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods.
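
    As a toy illustration of the pre-sampling simulation idea above: draw clumped (negative binomial) gall counts per tree, then watch the running sample mean converge as the sample size grows. All parameters here are hypothetical, not fitted to the P. tumifex data:

```python
# Sketch of simulating clumped pest counts via a gamma-Poisson mixture
# (negative binomial with clumping parameter k) and tracking running means.
import math
import random

def neg_binomial(mean: float, k: float, rng: random.Random) -> int:
    """One negative binomial draw: Poisson with gamma-distributed rate."""
    lam = rng.gammavariate(k, mean / k)
    # Knuth's Poisson inversion; fine for the small rates used here
    L, p, n = math.exp(-lam), 1.0, 0
    while True:
        p *= rng.random()
        if p <= L:
            return n
        n += 1

rng = random.Random(42)
counts = [neg_binomial(mean=5.0, k=0.8, rng=rng) for _ in range(2000)]
running = [sum(counts[:i]) / i for i in (25, 100, 2000)]  # running sample means
```

    Comparing such running means against the true mean, across simulated or pre-sampled sites, is how one can judge when a simple fixed-size or transect design has effectively converged.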

  18. Systematic assessment of reduced representation bisulfite sequencing to human blood samples

    DEFF Research Database (Denmark)

    Wang, Li; Sun, Jihua; Wu, Honglong;

    2012-01-01

    Complementary to the time- and cost-intensive direct bisulfite sequencing, we applied reduced representation bisulfite sequencing (RRBS) to the human peripheral blood mononuclear cells (PBMC) from YH, the Asian individual whose genome and epigenome has been deciphered in the YH project...... and systematically assessed the genomic coverage, coverage depth and reproducibility of this technology as well as the concordance of DNA methylation levels measured by RRBS and direct bisulfite sequencing for the detected CpG sites. Our result suggests that RRBS can cover more than half of CpG islands and promoter...... between the two methods is high. It can be concluded that RRBS is a time and cost-effective sequencing method for unbiased DNA methylation profiling of CpG islands and promoter regions in a genome-wide scale and it is the method of choice to assay certain genomic regions for multiple samples in a rapid...

  19. Assessing the diagnostic validity of a structured psychiatric interview in a first-admission hospital sample

    DEFF Research Database (Denmark)

    Frederiksen, Julie Elisabeth Nordgaard; Revsbech, Rasmus; Sæbye, Ditte;

    2012-01-01

    The use of structured psychiatric interviews performed by non-clinicians is frequent for research purposes and is becoming increasingly common in clinical practice. The validity of such interviews has rarely been evaluated empirically. In this study of a sample of 100 diagnostically heterogeneous, first-admitted inpatients, the results of an assessment with the Structured Clinical Interview for DSM-IV (SCID), yielding a DSM-IV diagnosis and performed by a trained non-clinician, were compared with a consensus lifetime best diagnostic estimate (DSM-IV) by two experienced research clinicians, based on multiple sources of information, which included videotaped comprehensive semi-structured narrative interviews. The overall kappa agreement was 0.18. The sensitivity and specificity for the diagnosis of schizophrenia by SCID were 19% and 100%, respectively. It is concluded that structured interviews...
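
    A minimal sketch of the agreement statistics reported above, computed from a 2x2 diagnosis table (structured interview vs. best-estimate consensus). The cell counts are hypothetical, chosen only to reproduce the 19% sensitivity / 100% specificity pattern for schizophrenia; the resulting kappa is for this one diagnosis, not the reported overall 0.18:

```python
# Sketch: Cohen's kappa, sensitivity, and specificity from a 2x2 table.
# tp/fp/fn/tn counts are hypothetical illustration values.
def agreement(tp: int, fp: int, fn: int, tn: int) -> tuple:
    n = tp + fp + fn + tn
    po = (tp + tn) / n                          # observed agreement
    pe = ((tp + fp) / n) * ((tp + fn) / n) \
       + ((fn + tn) / n) * ((fp + tn) / n)      # chance agreement
    kappa = (po - pe) / (1 - pe)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return kappa, sens, spec

kappa, sens, spec = agreement(tp=6, fp=0, fn=26, tn=68)
```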

  20. Macrothrombocytopenia in North India: Role of Automated Platelet Data in the Detection of an Under Diagnosed Entity

    OpenAIRE

    Kakkar, Naveen; John, M. Joseph; Mathew, Amrith

    2014-01-01

    Congenital macrothrombocytopenia is being increasingly recognised because of the increasing availability of automated platelet counts during routine complete blood count. If not recognised, these patients may be unnecessarily investigated or treated. The study was done to assess the occurrence of macrothrombocytopenia in the North Indian population and the role of automated platelet parameters in its detection. This prospective study was done on patients whose blood samples were sent for CBC ...

  1. Psychometric Properties of the Theory of Mind Assessment Scale in a Sample of Adolescents and Adults.

    Science.gov (United States)

    Bosco, Francesca M; Gabbatore, Ilaria; Tirassa, Maurizio; Testa, Silvia

    2016-01-01

    This research aimed at the evaluation of the psychometric properties of the Theory of Mind Assessment Scale (Th.o.m.a.s.). Th.o.m.a.s. is a semi-structured interview meant to evaluate a person's Theory of Mind (ToM). It is composed of several questions organized in four scales, each focusing on one of the areas of knowledge in which such faculty may manifest itself: Scale A (I-Me) investigates first-order first-person ToM; Scale B (Other-Self) investigates third-person ToM from an allocentric perspective; Scale C (I-Other) again investigates third-person ToM, but from an egocentric perspective; and Scale D (Other-Me) investigates second-order ToM. The psychometric properties of Th.o.m.a.s. were evaluated in a sample of 156 healthy persons: 80 preadolescents and adolescents (aged 11-17 years, 42 females) and 76 adults (aged from 20 to 67 years, 35 females). Th.o.m.a.s. scores show good inter-rater agreement and internal consistency; the scores increase with age. Evidence of criterion validity was found, as Scale B scores were correlated with those of an independent instrument for the evaluation of ToM, the Strange Stories task. Confirmatory factor analysis (CFA) showed good fit of the four-factor theoretical model to the data, although the four factors were highly correlated. For each of the four scales, Rasch analyses showed that, with few exceptions, items fitted the Partial Credit Model and their functioning was invariant for gender and age. The results of this study, along with those of previous research with clinical samples, show that Th.o.m.a.s. is a promising instrument to assess ToM in different populations. PMID:27242563

  2. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
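
    A rough sketch of the LagSVD idea as described above: bin the joint distribution of an angle with its lag-l successor into a 2D histogram, then inspect the singular-value spectrum of that histogram; a fast-decaying spectrum indicates low-rank (simple) lag dependence. The angles here are synthetic i.i.d. values, not real dihedrals, and the bin count is an assumption:

```python
# Sketch: singular values of a nonparametric (histogram) lag-distribution.
# For independent angles the histogram is near rank 1, so the first singular
# value dominates the rest.
import numpy as np

def lag_svd(angles: np.ndarray, lag: int, bins: int = 36) -> np.ndarray:
    """Singular values of the normalized lag-`lag` 2D histogram of angles (deg)."""
    x, y = angles[:-lag], angles[lag:]
    hist, _, _ = np.histogram2d(x, y, bins=bins, range=[[-180, 180], [-180, 180]])
    hist /= hist.sum()
    return np.linalg.svd(hist, compute_uv=False)

rng = np.random.default_rng(1)
angles = rng.uniform(-180, 180, size=5000)  # synthetic, independent angles
s = lag_svd(angles, lag=1)
ratio = s[0] / s[1]  # large ratio -> near-rank-1 lag dependence
```

    Structured (e.g. Ramachandran-like) angle sequences would instead spread mass across several singular values, which is what the paper's graphical diagnostics monitor across lags.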

  3. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Ruiz, Tomas [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain)]. E-mail: tpr@um.es; Martinez-Lozano, Carmen [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain); Garcia, Maria Dolores [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain)

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is generated on-line by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 µg mL⁻¹ of propoxur, with a detection limit of 5 ng mL⁻¹. The repeatability was 0.82% expressed as relative standard deviation (n = 10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL⁻¹ levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L⁻¹ using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 µg kg⁻¹.
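
    Figures of merit like the 0.82% repeatability and the detection limit are straightforward to reproduce from raw signals. A hedged sketch: the replicate intensities, blank noise, and calibration slope below are invented for illustration, and the 3σ/slope rule is one common convention, not necessarily the definition used in this paper.

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (repeatability) as a percentage."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def detection_limit(blank_sd, slope):
    """IUPAC-style detection limit: 3 x blank standard deviation / slope."""
    return 3 * blank_sd / slope

# Hypothetical replicate CL intensities for one propoxur standard.
replicates = [1002, 998, 1005, 995, 1001, 999, 1003, 997, 1000, 1004]
print(round(rsd_percent(replicates), 2))  # ~0.3 % for this toy data

# Hypothetical calibration slope (signal per ng/mL) and blank noise.
print(detection_limit(blank_sd=1.5, slope=0.9))  # in ng/mL
```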

  4. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection.

    Science.gov (United States)

    Pérez-Ruiz, Tomás; Martínez-Lozano, Carmen; García, María Dolores

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is on-line generated by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 microg mL(-1) of propoxur, with a detection limit of 5 ng mL(-1). The repeatability was 0.82% expressed as relative standard deviation (n=10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL(-1) levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L(-1) using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 microg kg(-1).

  5. Osteoporosis Self-Assessment Tool Performance in a Large Sample of Postmenopausal Women of Mendoza, Argentina

    Directory of Open Access Journals (Sweden)

    Fernando D. Saraví

    2013-01-01

    The Osteoporosis Self-assessment Tool (OST) is a clinical instrument designed to select patients at risk of osteoporosis who would benefit from a bone mineral density measurement. The OST takes into account only the age and weight of the subject. It was developed for Asian women and later validated for European and North American white women. The performance of the OST was assessed in a sample of 4343 women from Greater Mendoza, a large metropolitan area of Argentina. Dual X-ray absorptiometry (DXA) scans of the lumbar spine and hip were obtained. Patients were classified as either osteoporotic (n=1830) or nonosteoporotic (n=2513) according to their lowest T-score at any site. Osteoporotic patients had lower OST scores (P<0.0001). A receiver operating characteristic (ROC) curve showed an area under the curve of 71% (P<0.0001), with a sensitivity of 83.7% and a specificity of 44% for a cut-off value of 2. The positive predictive value was 52% and the negative predictive value was 79%. The odds ratio for the diagnosis of osteoporosis was 4.06 (95% CI 3.51 to 4.71; P<0.0001). It is concluded that the OST is useful for selecting postmenopausal women for DXA testing in the studied population.
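
    Because the OST uses only age and weight, the screening rule is trivial to compute. The sketch below assumes the standard OST/OSTA formulation, 0.2 × (weight in kg − age in years) truncated to an integer, and applies the cut-off of 2 reported above; the convention that scores below the cut-off flag a patient for DXA referral is also an assumption.

```python
import math

def ost_index(weight_kg, age_years):
    """Osteoporosis Self-assessment Tool index:
    0.2 * (weight - age), truncated to an integer (standard OSTA form)."""
    return math.trunc(0.2 * (weight_kg - age_years))

def refer_for_dxa(weight_kg, age_years, cutoff=2):
    """Flag a patient for DXA when the OST index falls below the cut-off
    (the study above used a cut-off value of 2)."""
    return ost_index(weight_kg, age_years) < cutoff

print(ost_index(58, 70))       # -2 -> below the cut-off
print(refer_for_dxa(58, 70))   # True
print(refer_for_dxa(85, 55))   # index 6 -> False
```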

  6. Violence risk assessment and women: predictive accuracy of the HCR-20 in a civil psychiatric sample.

    Science.gov (United States)

    Garcia-Mansilla, Alexandra; Rosenfeld, Barry; Cruise, Keith R

    2011-01-01

    Research to date has not adequately demonstrated whether the HCR-20 Violence Risk Assessment Scheme (HCR-20; Webster, Douglas, Eaves, & Hart, 1997), a structured violence risk assessment measure with a robust literature supporting its validity in male samples, is a valid indicator of violence risk in women. This study utilized data from the MacArthur Study of Mental Disorder and Violence to retrospectively score an abbreviated version of the HCR-20 in 827 civil psychiatric patients. HCR-20 scores and predictive accuracy of community violence were compared for men and women. Results suggested that the HCR-20 is slightly, but not significantly, better at evaluating future risk of violence in men than in women, although the magnitude of the gender differences was small and was largely limited to historical factors. The results do not indicate that the HCR-20 needs to be tailored for use in women or that it should not be used in women, but they do highlight that the HCR-20 should be used cautiously and with full awareness of its potential limitations in women.

  7. A cost-effective technique for integrating personal radiation dose assessment with personal gravimetric sampling

    International Nuclear Information System (INIS)

    During recent years there has been an increasing awareness internationally of radiation levels in the mining and milling of radioactive ores, including those from non-uranium mines. A major aspect of radiation control is concerned with the measurement of radiation levels and the assessment of radiation doses incurred by individual workers. Current techniques available internationally for personnel monitoring of radiation exposures are expensive and there is a particular need to reduce the cost of personal radiation monitoring in South African gold mines because of the large labour force employed. In this regard the obvious benefits of integrating personal radiation monitoring with existing personal monitoring systems already in place in South African gold mines should be exploited. A system which can be utilized for this purpose is personal gravimetric sampling. A new cost-effective technique for personal radiation monitoring, which can be fully integrated with the personal gravimetric sampling strategy being implemented on mines, has been developed in South Africa. The basic principles of this technique and its potential in South African mines are described. 9 refs., 7 figs

  8. An automated image analysis framework for segmentation and division plane detection of single live Staphylococcus aureus cells which can operate at millisecond sampling time scales using bespoke Slimfield microscopy

    CERN Document Server

    Wollman, Adam J M; Foster, Simon; Leake, Mark C

    2016-01-01

    Staphylococcus aureus is an important pathogen that has given rise to antimicrobial-resistant strains such as methicillin-resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We combine several existing image analysis tools to detect cellular and subcellular morphological features relevant to cell division from images of live pathogens sampled at millisecond time scales, at single-molecule detection precision. We demonstrate this approach using a fluorescent reporter, GFP fused to the protein EzrA, which localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials which target the cell division machinery, but may also have more general application in detecting morphological...

  9. Assessment of natural radioactivity levels and associated dose rates in soil samples from Northern Rajasthan, India.

    Science.gov (United States)

    Duggal, Vikas; Rani, Asha; Mehra, Rohit; Ramola, R C

    2014-01-01

    The analysis of naturally occurring radionuclides ((226)Ra, (232)Th and (40)K) has been carried out in 40 soil samples collected from four districts of Northern Rajasthan, India, using gamma-ray spectrometry with an NaI(Tl) detector. The activity concentrations of the samples range from 38±9 to 65±11 Bq kg(-1) with a mean value of 52 Bq kg(-1) for (226)Ra, from 8±8 to 32±9 Bq kg(-1) with a mean value of 19 Bq kg(-1) for (232)Th and from 929±185 to 1894±249 Bq kg(-1) with a mean value of 1627 Bq kg(-1) for (40)K. The measured activity concentrations of (226)Ra and (40)K in soil were higher, and that of (232)Th lower, than the worldwide range. Radium equivalent activities were calculated for the soil samples to assess the radiation hazards arising from the use of these soils in the construction of buildings. The calculated average radium equivalent activity was 205±20 Bq kg(-1), which is less than the limit of 370 Bq kg(-1) recommended by the Organization for Economic Cooperation and Development. The total absorbed dose rate calculated from the activity concentrations of (226)Ra, (232)Th and (40)K ranges from 77 to 123 nGy h(-1) with an average value of 103 nGy h(-1). The mean external (Hex) and internal (Hin) hazard indices for the area under study were determined to be 0.55 and 0.69, respectively. The corresponding average annual effective dose was found to be 0.63 mSv. PMID:23943368
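
    The hazard quantities quoted above follow standard closed-form expressions, so the reported numbers can be checked directly from the mean activity concentrations. The coefficients below are the widely used UNSCEAR-style values (assumed here, since the paper's exact formulas are not reproduced in the abstract); plugging in the means reproduces the reported dose rate, Hex, and Hin.

```python
def radium_equivalent(c_ra, c_th, c_k):
    """Radium equivalent activity (Bq/kg): Raeq = C_Ra + 1.43 C_Th + 0.077 C_K."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

def absorbed_dose_rate(c_ra, c_th, c_k):
    """Outdoor absorbed gamma dose rate in air (nGy/h) using standard
    activity-to-dose conversion coefficients."""
    return 0.462 * c_ra + 0.604 * c_th + 0.0417 * c_k

def hazard_indices(c_ra, c_th, c_k):
    """External (Hex) and internal (Hin) hazard indices; both should stay
    below unity for safe use as building material."""
    hex_ = c_ra / 370 + c_th / 259 + c_k / 4810
    hin = c_ra / 185 + c_th / 259 + c_k / 4810
    return hex_, hin

# Mean activity concentrations reported above (Bq/kg).
c_ra, c_th, c_k = 52, 19, 1627
print(radium_equivalent(c_ra, c_th, c_k))   # ~204.4 (the abstract's 205 averages per-sample values)
print(absorbed_dose_rate(c_ra, c_th, c_k))  # ~103.3 nGy/h
hex_, hin = hazard_indices(c_ra, c_th, c_k)
print(round(hex_, 2), round(hin, 2))        # 0.55 0.69
```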

  10. Visual and automated assessment of matrix metalloproteinase-14 tissue expression for the evaluation of ovarian cancer prognosis.

    Science.gov (United States)

    Trudel, Dominique; Desmeules, Patrice; Turcotte, Stéphane; Plante, Marie; Grégoire, Jean; Renaud, Marie-Claude; Orain, Michèle; Bairati, Isabelle; Têtu, Bernard

    2014-10-01

    The purpose of this study was to evaluate whether membrane type 1 matrix metalloproteinase-14 (MMP-14, or MT1-MMP) tissue expression, as assessed visually on digital slides and by digital image analysis, could predict outcomes in women with ovarian carcinoma. Tissue microarrays from a cohort of 211 women with ovarian carcinoma who underwent debulking surgery between 1993 and 2006 at the CHU de Québec (Canada) were immunostained for matrix metalloproteinase-14. The percentage of MMP-14 staining was assessed visually and with the Calopix software. Progression was evaluated using the CA-125 and/or the RECIST criteria according to the GCIG criteria. Dates of death were obtained by record linkage with the Québec mortality files. Adjusted hazard ratios of death and progression with their 95% confidence intervals were estimated using the Cox model. Comparisons between the two modalities of MMP-14 assessment were made using box plots and the Kruskal-Wallis test. The highest levels of MMP-14 immunostaining were associated with nonserous histology, early FIGO stage, and low preoperative CA-125 levels. MMP-14 staining (>40% of MMP-14-positive cells) was inversely associated with progression on visual assessment (hazard ratio=0.39; 95% confidence interval: 0.18-0.82). A similar association was observed with the highest quartile of MMP-14-positive area assessed by digital image analysis (hazard ratio=0.48; 95% confidence interval: 0.28-0.82). After adjustment for standard prognostic factors, these associations were no longer significant in the ovarian carcinoma cohort. However, in women with serous carcinoma, the highest quartile of MMP-14-positive area was associated with progression (adjusted hazard ratio=0.48; 95% confidence interval: 0.24-0.99). There was no association with overall survival. The digital image analysis of MMP-14-positive area matched the visual assessment using three categories (>40% vs 21-40% vs <20%). Higher levels of MMP-14 immunostaining were associated with standard

  11. Comparison of Individual and Pooled Stool Samples for the Assessment of Soil-Transmitted Helminth Infection Intensity and Drug Efficacy

    OpenAIRE

    Zeleke Mekonnen; Selima Meka; Mio Ayana; Johannes Bogers; Jozef Vercruysse; Bruno Levecke

    2013-01-01

    BACKGROUND: In veterinary parasitology samples are often pooled for a rapid assessment of infection intensity and drug efficacy. Currently, studies evaluating this strategy in large-scale drug administration programs to control human soil-transmitted helminths (STHs; Ascaris lumbricoides, Trichuris trichiura, and hookworm), are absent. Therefore, we developed and evaluated a pooling strategy to assess intensity of STH infections and drug efficacy. METHODS/PRINCIPAL FINDINGS: Stool samples fro...

  12. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  13. Detection of Giardia lamblia, Cryptosporidium spp. and Entamoeba histolytica in clinical stool samples by using multiplex real-time PCR after automated DNA isolation

    NARCIS (Netherlands)

    Van Lint, P; Rossen, J W; Vermeiren, S; Ver Elst, K; Weekx, S; Van Schaeren, J; Jeurissen, A

    2013-01-01

    Diagnosis of intestinal parasites in stool samples is generally still carried out by microscopy; however, this technique is known to suffer from a low sensitivity and is unable to discriminate between certain protozoa. In order to overcome these limitations, a real-time multiplex PCR was evaluated a

  14. Use of pooled urine samples and automated DNA isolation to achieve improved sensitivity and cost-effectiveness of large-scale testing for Chlamydia trachomatis in pregnant women.

    NARCIS (Netherlands)

    Rours, G.I.J.G.; Verkooyen, R.P.; Willemse, H.F.M.; Zwaan, E.A. van der; Belkum, A. van; Groot, R. de; Verbrugh, H.A.; Ossewaarde, J.M.

    2005-01-01

    The success of large-scale screening for Chlamydia trachomatis depends on the availability of noninvasive samples, low costs, and high-quality testing. To evaluate C. trachomatis testing with pregnant women, first-void urine specimens from 750 consecutive asymptomatic pregnant women from the Rotterd

  15. Use of pooled urine samples and automated DNA isolation to achieve improved sensitivity and cost-effectiveness of large-scale testing for Chlamydia trachomatis in pregnant women.

    NARCIS (Netherlands)

    G.I.J.G. Rours (Ingrid); R.P.A.J. Verkooyen (Roel); H.F. Willemse; E.A.E. van der Zwaan (Elizabeth); A.F. van Belkum (Alex); R. de Groot (Ronald); H.A. Verbrugh (Henri); J.M. Ossewaarde (Jacobus)

    2005-01-01

    textabstractThe success of large-scale screening for Chlamydia trachomatis depends on the availability of noninvasive samples, low costs, and high-quality testing. To evaluate C. trachomatis testing with pregnant women, first-void urine specimens from 750 consecutive asymptomatic pregnant women from

  16. Determination of benzoylureas in ground water samples by fully automated on-line pre-concentration and liquid chromatography-fluorescence detection.

    Science.gov (United States)

    Gil García, M D; Martínez Galera, M; Barranco Martínez, D; Gisbert Gallego, J

    2006-01-27

    An on-line pre-concentration method for the analysis of five benzoylureas (diflubenzuron, triflumuron, hexaflumuron, lufenuron and flufenoxuron) in ground water samples was evaluated using two C(18) columns and fluorescence detection after photochemically induced fluorescence (PIF) post-column derivatization. The trace enrichment was carried out with 35 mL of ground water modified with 15 mL of MeOH on a 50 mm x 4.6 mm I.D. first enrichment column (C-1) packed with 5 microm Hypersil Elite C(18). Retention properties of pesticides and humic acids usually contained in ground water were studied on C-1 at concentration levels ranging between 0.04 and 14.00 microg/L in water samples. The results obtained in this study show that the pesticides are pre-concentrated in the first short column while the humic acids contained in the ground water samples are eluted to waste. Pesticide recoveries ranged between 92.3 and 109.5%. The methodology proposed was used to determine benzoylureas in ground water samples at levels lower than 0.1 microg/L (the maximum level established by the European Union). PMID:16337641

  17. The accuracy of platelet counting in thrombocytopenic blood samples distributed by the UK National External Quality Assessment Scheme for General Haematology.

    Science.gov (United States)

    De la Salle, Barbara J; McTaggart, Paul N; Briggs, Carol; Harrison, Paul; Doré, Caroline J; Longair, Ian; Machin, Samuel J; Hyde, Keith

    2012-01-01

    A knowledge of the limitations of automated platelet counting is essential for the effective care of thrombocytopenic patients and management of platelet stocks for transfusion. For this study, 29 external quality assessment specimen pools with platelet counts between 5 and 64 × 10(9)/L were distributed to more than 1,100 users of 23 different hematology analyzer models. The same specimen pools were analyzed by the international reference method (IRM) for platelet counting at 3 reference centers. The IRM values were on average lower than the all-methods median values returned by the automated analyzers. The majority (~67%) of the automated analyzer results overestimated the platelet count compared with the IRM, with significant differences in 16.5% of cases. Performance differed between analyzer models. The observed differences may depend in part on the nature of the survey material and analyzer technology, but the findings have implications for the interpretation of platelet counts at levels of clinical decision making.

  18. A comparison of three macroinvertebrate sampling devices for use in conducting rapid-assessment procedures of Delmarva Peninsula wetlands

    Science.gov (United States)

    Lowe, Terrence (Peter); Tebbs, Kerry; Sparling, Donald W.

    2016-01-01

    Three types of macroinvertebrate collecting devices, Gerking box traps, D-shaped sweep nets, and activity traps, have commonly been used to sample macroinvertebrates when conducting rapid biological assessments of North American wetlands. We compared collections of macroinvertebrates identified to the family level made with these devices in 6 constructed and 2 natural wetlands on the Delmarva Peninsula of Maryland. We also assessed their potential efficacy in comparisons among wetlands using several proportional and richness attributes. Differences in median diversity among samples from the 3 devices were significant; the sweep-net samples had the greatest diversity and the activity-trap samples had the least diversity. Differences in median abundance were not significant between the Gerking box-trap samples and sweep-net samples, but median abundance among activity-trap samples was significantly lower than among samples of the other 2 devices. Within samples, the proportions of median diversity composed of major class and order groupings were similar among the 3 devices. However the proportions of median abundance composed of the major class and order groupings within activity-trap samples were not similar to those of the other 2 devices. There was a slight but significant increase in the total number of families captured when we combined activity-trap samples with Gerking box-trap samples or with sweep-net samples, and the per-sample median numbers of families of the combined activity-trap and sweep-net samples was significantly higher than that of the combined activity-trap and Gerking box-trap samples. We detected significant differences among wetlands for 4 macroinvertebrate attributes with the Gerking box-trap data, 6 attributes with sweep-net data, and 5 attributes with the activity-trap data. A small, but significant increase in the number of attributes showing differences among wetlands occurred when we combined activity-trap samples with those of the

  19. Dynamic three-dimensional echocardiography combined with semi-automated border detection offers advantages for assessment of resynchronization therapy

    Directory of Open Access Journals (Sweden)

    Voormolen Marco M

    2003-10-01

    Simultaneous electrical stimulation of both ventricles in patients with interventricular conduction disturbance and advanced heart failure improves hemodynamics and results in increased exercise tolerance and quality of life. We have developed a novel technique for the assessment and optimization of resynchronization therapy. Our approach is based on transthoracic dynamic three-dimensional (3D) echocardiography and allows determination of the most delayed contraction site of the left ventricle (LV) together with global LV function data. Our initial results suggest that fast reconstruction of the LV is feasible for the selection of the optimal pacing site and allows identification of LV segments with dyssynchrony.

  20. US Environmental Protection Agency Method 314.1, an automated sample preconcentration/matrix elimination suppressed conductivity method for the analysis of trace levels (0.50 microg/L) of perchlorate in drinking water.

    Science.gov (United States)

    Wagner, Herbert P; Pepich, B V; Pohl, C; Later, D; Joyce, R; Srinivasan, K; Thomas, D; Woodruff, A; Deborba, B; Munch, D J

    2006-06-16

    Since 1997 there has been increasing interest in the development of analytical methods for the analysis of perchlorate. The US Environmental Protection Agency (EPA) Method 314.0, which was used during the first Unregulated Contaminant Monitoring Regulation (UCMR) cycle, supports a method reporting limit (MRL) of 4.0 microg/L. The non-selective nature of conductivity detection, combined with very high ionic strength matrices, can create conditions that make the determination of perchlorate difficult. The objective of this work was to develop an automated, suppressed conductivity method with improved sensitivity for use in the second UCMR cycle. The new method, EPA Method 314.1, uses a 35 mm x 4 mm cryptand concentrator column in the sample loop position to concentrate perchlorate from a 2 mL sample volume, which is subsequently rinsed with 10 mM NaOH to remove interfering anions. The cryptand concentrator column is combined with a primary AS16 analytical column and a confirmation AS20 analytical column. Unique characteristics of the cryptand column allow perchlorate to be desorbed from the cryptand trap and refocused on the head of the guard column for subsequent separation and analysis. EPA Method 314.1 has a perchlorate lowest concentration minimum reporting level (LCMRL) of 0.13 microg/L in both drinking water and laboratory synthetic sample matrices (LSSM) containing up to 1,000 microg/L each of chloride, bicarbonate and sulfate.

  1. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  2. Automation of a high-speed imaging setup for differential viscosity measurements

    International Nuclear Information System (INIS)

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of an electronic printed circuit board (PCB) incorporating a microcontroller and its synchronization with a commercial high-speed camera operating at 10 000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with the calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an "unknown" solution of hydroxyethyl cellulose.
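
    In a falling-sphere viscometer of this kind, the terminal velocity extracted from the high-speed image sequence yields the viscosity through Stokes' law, valid at low Reynolds number. This is a generic sketch, not the calibrated procedure of the paper (which may apply wall and inertia corrections); the densities and velocity below are hypothetical.

```python
def stokes_viscosity(radius_m, v_terminal_m_s, rho_sphere, rho_fluid, g=9.81):
    """Dynamic viscosity (Pa*s) from the terminal velocity of a falling
    sphere, via Stokes' law: eta = 2 r^2 (rho_s - rho_f) g / (9 v)."""
    return 2 * radius_m**2 * (rho_sphere - rho_fluid) * g / (9 * v_terminal_m_s)

# Hypothetical numbers: a 2 mm diameter glass sphere (radius 1 mm) in a
# water/glycerol standard; densities in kg/m^3, velocity in m/s as
# measured from the 10 000 fps image sequence.
eta = stokes_viscosity(radius_m=1e-3, v_terminal_m_s=0.05,
                       rho_sphere=2500, rho_fluid=1150)
print(round(eta, 4))  # Pa*s
```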

  3. Automated extraction of DNA from reference samples from various types of biological materials on the Qiagen BioRobot EZ1 Workstation

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Jørgensen, Mads; Hansen, Anders Johannes;

    2009-01-01

    We have validated and implemented a protocol for DNA extraction from various types of biological materials using a Qiagen BioRobot EZ1 Workstation. The sample materials included whole blood, blood from deceased, buccal cells on Omni swabs and FTA Cards, blood on FTA Cards and cotton swabs......, and muscle biopsies. The DNA extraction was validated according to EN/ISO 17025 for the STR kits AmpFlSTR« Identifiler« and AmpFlSTR« Yfiler« (Applied Biosystems). Of 298 samples extracted, 11 (4%) did not yield acceptable results. In conclusion, we have demonstrated that extraction of DNA from various types...... of biological material can be performed quickly and without the use of hazardous chemicals, and that the DNA may be successfully STR typed according to the requirements of forensic genetic investigations accredited according to EN/ISO 17025...

  4. Full second order chromatographic/spectrometric data matrices for automated sample identification and component analysis by non-data-reducing image analysis

    DEFF Research Database (Denmark)

    Nielsen, Niles-Peter Vest; Smedsgaard, Jørn; Frisvad, Jens Christian

    1999-01-01

    A data analysis method is proposed for identification and for confirmation of classification schemes, based on single- or multiple-wavelength chromatographic profiles. The proposed method works directly on the chromatographic data without data reduction procedures such as peak area or retention index calculation. Chromatographic matrices from analysis of previously identified samples are used for generating a reference chromatogram for each class, and unidentified samples are compared with all reference chromatograms by calculating a resemblance measure for each reference. Once the method...... yielded over 90% agreement with accepted classifications. The method is highly accurate and may be used on all sorts of chromatographic profiles. Characteristic component analysis yielded results in good agreement with existing knowledge of characteristic components, but also succeeded in identifying new...
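
    The abstract does not define the resemblance measure, but a correlation computed over the raw chromatographic matrices is one plausible reduction-free choice and illustrates the idea of matching a sample against per-class reference chromatograms. Everything below (function names, toy two-class data) is an assumption for illustration.

```python
import numpy as np

def resemblance(sample, reference):
    """Correlation-style resemblance between two chromatographic matrices
    (time x wavelength), computed on the flattened intensities without
    any peak detection or retention-index reduction."""
    a = sample.ravel() - sample.mean()
    b = reference.ravel() - reference.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(sample, references):
    """Assign the sample to the class whose reference chromatogram it
    resembles most."""
    scores = {name: resemblance(sample, ref) for name, ref in references.items()}
    return max(scores, key=scores.get), scores

# Toy 100x3 chromatograms (3 wavelengths) with Gaussian elution peaks.
t = np.linspace(0, 10, 100)[:, None]
ref_a = np.exp(-(t - 3) ** 2) * np.array([1.0, 0.5, 0.2])
ref_b = np.exp(-(t - 7) ** 2) * np.array([0.3, 1.0, 0.6])
rng = np.random.default_rng(1)
sample = ref_a + rng.normal(0, 0.02, ref_a.shape)  # noisy class-A profile
best, _ = identify(sample, {"class_A": ref_a, "class_B": ref_b})
print(best)
```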

  5. Automated In-Injector Derivatization Combined with High-Performance Liquid Chromatography-Fluorescence Detection for the Determination of Semicarbazide in Fish and Bread Samples.

    Science.gov (United States)

    Wang, Yinan; Chan, Wan

    2016-04-01

    Semicarbazide (1) is a widespread genotoxic food contaminant originating as a metabolic byproduct of the antibiotic nitrofurazone used in fish farming or as a thermal degradation product of the common flour additive azodicarbonamide. The goal of this study is to develop a simple and sensitive high-performance liquid chromatography coupled with fluorescence detection (HPLC-FLD) method for the detection of compound 1 in food products. In comparison to existing methods for the determination of compound 1, the reported method combining online precolumn derivatization and HPLC-FLD is less labor-intensive, produces higher sample throughput, and does not require the use of expensive analytical instruments. After validation of accuracy and precision, this method was applied to determine the amount of compound 1 in fish and bread samples. Comparative studies using an established liquid chromatography coupled with tandem mass spectrometry method did not yield systematically different results, indicating that the developed HPLC-FLD method is accurate and suitable for the determination of compound 1 in fish and bread samples. PMID:26985968

  6. Local Adaptation in European Firs Assessed through Extensive Sampling across Altitudinal Gradients in Southern Europe

    Science.gov (United States)

    Postolache, Dragos; Lascoux, Martin; Drouzas, Andreas D.; Källman, Thomas; Leonarduzzi, Cristina; Liepelt, Sascha; Piotti, Andrea; Popescu, Flaviu; Roschanski, Anna M.; Zhelev, Peter; Fady, Bruno; Vendramin, Giovanni Giuseppe

    2016-01-01

    Background Local adaptation is a key driver of phenotypic and genetic divergence at loci responsible for adaptive traits variations in forest tree populations. Its experimental assessment requires rigorous sampling strategies such as those involving population pairs replicated across broad spatial scales. Methods A hierarchical Bayesian model of selection (HBM) that explicitly considers both the replication of the environmental contrast and the hierarchical genetic structure among replicated study sites is introduced. Its power was assessed through simulations and compared to classical ‘within-site’ approaches (FDIST, BAYESCAN) and a simplified, within-site, version of the model introduced here (SBM). Results HBM demonstrates that hierarchical approaches are very powerful to detect replicated patterns of adaptive divergence with low false-discovery (FDR) and false-non-discovery (FNR) rates compared to the analysis of different sites separately through within-site approaches. The hypothesis of local adaptation to altitude was further addressed by analyzing replicated Abies alba population pairs (low and high elevations) across the species’ southern distribution range, where the effects of climatic selection are expected to be the strongest. For comparison, a single population pair from the closely related species A. cephalonica was also analyzed. The hierarchical model did not detect any pattern of adaptive divergence to altitude replicated in the different study sites. Instead, idiosyncratic patterns of local adaptation among sites were detected by within-site approaches. Conclusion Hierarchical approaches may miss idiosyncratic patterns of adaptation among sites, and we strongly recommend the use of both hierarchical (multi-site) and classical (within-site) approaches when addressing the question of adaptation across broad spatial scales. PMID:27392065

  7. Automated detection of breast tumor in MRI and comparison of kinetic features for assessing tumor response to chemotherapy

    Science.gov (United States)

    Aghaei, Faranak; Tan, Maxine; Zheng, Bin

    2015-03-01

Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) is used increasingly in the diagnosis of breast cancer and assessment of treatment efficacy in current clinical practice. The purpose of this preliminary study is to develop and test a new quantitative kinetic image feature analysis method and biomarker to predict the response of breast cancer patients to neoadjuvant chemotherapy using breast MR images acquired before the chemotherapy. For this purpose, we developed a computer-aided detection scheme to automatically segment breast areas and tumors depicted on the sequentially scanned breast MR images. From a contrast-enhancement map generated by subtraction of two image sets scanned pre- and post-injection of contrast agent, our scheme computed 38 morphological and kinetic image features from both tumor and background parenchymal regions. We applied a number of statistical data analysis methods to identify image features effective in predicting the response of the patients to the chemotherapy. Based on the performance assessment of individual features and their correlations, we applied a fusion method to generate a final image biomarker. A breast MR image dataset involving 68 patients was used in this study. Among them, 25 had complete response and 43 had partial response to the chemotherapy based on the RECIST guideline. Using this image-feature-fusion-based biomarker, the area under the receiver operating characteristic curve is AUC = 0.850±0.047. This study demonstrated that a biomarker developed from the fusion of kinetic image features computed from breast MR images acquired pre-chemotherapy has potentially higher discriminatory power in predicting the response of the patients to the chemotherapy.
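The record above evaluates its fused biomarker by the area under the ROC curve. The empirical AUC can be computed directly as the probability that a randomly chosen responder scores higher than a randomly chosen non-responder (the Mann-Whitney formulation). A minimal sketch, using hypothetical scores rather than the study's data:

```python
def auc_from_scores(pos, neg):
    """Empirical ROC AUC via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    with ties counted as half a win."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical biomarker scores (illustrative only):
responders = [0.9, 0.8, 0.75, 0.6]
non_responders = [0.7, 0.5, 0.4, 0.3]
print(auc_from_scores(responders, non_responders))  # 0.9375
```

An AUC of 0.5 corresponds to a non-informative biomarker; 1.0 to perfect separation of the two response groups.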

  8. Assessment of respiratory effect of air pollution: study design on general population samples.

    Science.gov (United States)

    Baldacci, S; Carrozzi, L; Viegi, G; Giuntini, C

    1997-01-01

The aim of this paper is to describe an epidemiological model to investigate the relationship between respiratory diseases and environmental air pollution. In the Po Delta prospective study, subjects were investigated before and after a large thermoelectric power plant began operating, in 1980 to 1982 and in 1988 to 1991, respectively. The Pisa prospective study was performed in 1986 to 1988 and in 1991 to 1993, before and after the construction of a new expressway that encircles the city from the North to the Southeast. In each survey, subjects completed the interviewer-administered standardized CNR questionnaire on respiratory symptoms/diseases and risk factors, and performed lung function tests. In the second survey of each study, skin prick tests, total serum IgE determination, methacholine challenge tests and biomarkers (such as sister chromatid exchanges, micronuclei, chromosomal abnormalities, DNA and hemoglobin adducts) were also performed. Concentrations of total suspended particulate and SO2 in both surveys were higher in urban than in rural areas, as were symptom/disease prevalences and bronchial reactivity. Subgroups of subjects from the two samples were enrolled in a specific study on the acute respiratory effects of indoor pollution, with daily recording of symptoms, measurements of peak expiratory flow (PEF), daily activity pattern, and assessment of indoor air quality (particulates), especially in asthmatics. In conclusion, these studies represent a basis for further analyses to better define the relationship between respiratory health and indoor/outdoor pollutant levels.

  9. PREVALENCE AND ANTIMICROBIAL RESISTANCE ASSESSMENT OF SUBCLINICAL MASTITIS IN MILK SAMPLES FROM SELECTED DAIRY FARMS

    Directory of Open Access Journals (Sweden)

    Murugaiyah Marimuthu

    2014-01-01

This study was conducted to determine the prevalence of subclinical mastitis and to assess the antimicrobial resistance of bacterial isolates from dairy cows on different farms around Selangor, Malaysia. A total of 120 milk samples from 3 different farms were randomly collected and tested for subclinical mastitis using the California Mastitis Test (CMT), as well as cultured for bacterial isolation, identification and antimicrobial resistance testing. The most prevalent bacteria were Staphylococcus sp. (55%), followed by Bacillus sp. (21%) and Corynebacterium sp. (7%); Yersinia sp. and Neisseria sp. both showed 5% prevalence, and other species with prevalence below 5% were Acinetobacter sp., Actinobacillus sp., Vibrio sp., Pseudomonas sp., E. coli, Klebsiella sp. and Chromobacter sp. Selected Staphylococcus sp. showed a mean antimicrobial resistance of 73.3% to Ampicillin; 26.7% each to Penicillin, Methicillin and Compound Sulphonamide; 20% to Oxacillin, Amoxycillin and Cefuroxime; 13.3% to Polymyxin B, Erythromycin, Ceftriaxone and Azithromycin; and 6.7% each to Streptomycin, Clindamycin, Lincomycin and Tetracycline. This study indicates the need for urgent and effective control measures to tackle the increase in prevalence of subclinical mastitis and antimicrobial resistance in the study area.

  10. Effect of size and heterogeneity of samples on biomarker discovery: synthetic and real data assessment.

    Directory of Open Access Journals (Sweden)

    Barbara Di Camillo

MOTIVATION: The identification of robust lists of molecular biomarkers related to a disease is a fundamental step for early diagnosis and treatment. However, methodologies for the discovery of biomarkers using microarray data often provide results with limited overlap. These differences are attributable to (1) dataset size (few subjects with respect to the number of features); (2) heterogeneity of the disease; (3) heterogeneity of the experimental protocols and computational pipelines employed in the analysis. In this paper, we focus on the first two issues and assess, on both simulated (through an in silico regulation network model) and real clinical datasets, the consistency of candidate biomarkers provided by a number of different methods. METHODS: We extensively simulated the effect of the heterogeneity characteristic of complex diseases on different sets of microarray data. Heterogeneity was reproduced by simulating both intrinsic variability of the population and alteration of regulatory mechanisms. Population variability was simulated by modeling the evolution of a pool of subjects; then, a subset of them underwent alterations in regulatory mechanisms so as to mimic the disease state. RESULTS: The simulated data allowed us to outline the advantages and drawbacks of different methods across multiple studies and varying numbers of samples, and to evaluate the precision of feature selection on a benchmark with known biomarkers. Although comparable classification accuracy was reached by different methods, the use of external cross-validation loops is helpful in finding features with a higher degree of precision and stability. Application to real data confirmed these results.
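The record above recommends wrapping feature selection in external cross-validation loops to obtain stable biomarker lists. A minimal sketch of the idea, assuming a simple correlation-based filter and measuring stability as the mean pairwise Jaccard overlap of top-k feature lists across resampled subsets (illustrative only, not the paper's pipeline):

```python
import random

def top_k_features(X, y, k):
    """Rank features by absolute Pearson correlation with the label
    and return the indices of the top k (a simple filter selector)."""
    n, d = len(X), len(X[0])
    my = sum(y) / n
    vy = sum((v - my) ** 2 for v in y) or 1e-12
    scores = []
    for j in range(d):
        col = [row[j] for row in X]
        mx = sum(col) / n
        cov = sum((col[i] - mx) * (y[i] - my) for i in range(n))
        vx = sum((c - mx) ** 2 for c in col) or 1e-12
        scores.append(abs(cov) / (vx * vy) ** 0.5)
    return set(sorted(range(d), key=lambda j: -scores[j])[:k])

def selection_stability(X, y, k, n_resamples=20, seed=0):
    """External loop: rerun selection on resampled subsets and report
    the mean pairwise Jaccard overlap of the resulting top-k lists."""
    rng = random.Random(seed)
    idx = list(range(len(X)))
    lists = []
    for _ in range(n_resamples):
        sub = rng.sample(idx, int(0.8 * len(idx)))
        lists.append(top_k_features([X[i] for i in sub],
                                    [y[i] for i in sub], k))
    pairs = [(a, b) for i, a in enumerate(lists) for b in lists[i + 1:]]
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Synthetic data: feature 0 tracks the label, the other 9 are noise.
rng = random.Random(1)
y = [i % 2 for i in range(60)]
X = [[y[i] + 0.1 * rng.random()] + [rng.random() for _ in range(9)]
     for i in range(60)]
print(selection_stability(X, y, k=2))
```

Low stability signals that the selected "biomarkers" depend heavily on which subjects happened to be sampled, which is exactly the limited-overlap problem the abstract describes.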

  11. OCT as a convenient tool to assess the quality and application of organotypic retinal samples

    Science.gov (United States)

    Gater, Rachel; Khoshnaw, Nicholas; Nguyen, Dan; El Haj, Alicia J.; Yang, Ying

    2016-03-01

Eye diseases such as macular degeneration and glaucoma have profound consequences on the quality of human life. Without treatment, these diseases can lead to loss of sight. To develop better treatments for retinal diseases, including cell therapies and drug intervention, establishment of an efficient and reproducible 3D native retinal tissue system, maintained over a prolonged culture duration, will be valuable. The retina is a complex tissue, consisting of ten layers, each with a different density and cellular composition. Uniquely, as a light-transmitting tissue, the retina refracts light differently among the layers, forming a good basis for using optical coherence tomography (OCT) to assess the layered structure of the retina and its change during culture and treatments. In this study, we develop a new methodology to generate retinal organotypic tissues and compare two substrates, filter paper and collagen hydrogel, for culturing the organotypic tissue. Freshly slaughtered pig eyes were obtained for use in this study. The layered morphology of intact organotypic retinal tissue cultured on the two different substrates was examined by spectral domain OCT. The viability of the tissues was examined with a live/dead fluorescence dye kit to cross-validate the OCT images. For the first time, it is demonstrated that the use of a collagen hydrogel supports the viability of retinal organotypic tissue, capable of prolonged culture up to 2 weeks. OCT is a convenient tool for appraising the quality and application of organotypic retinal samples and is important in the development of current organotypic models.

  12. Radiometric assessment of natural radioactivity levels of agricultural soil samples collected in Dakahlia, Egypt.

    Science.gov (United States)

    Issa, Shams A M

    2013-01-01

    Determination of the natural radioactivity has been carried out, by using a gamma-ray spectrometry [NaI (Tl) 3″ × 3″] system, in surface soil samples collected from various locations in Dakahlia governorate, Egypt. These locations form the agriculturally important regions of Egypt. The study area has many industries such as chemical, paper, organic fertilisers and construction materials, and the soils of the study region are used as a construction material. Therefore, it becomes necessary to study the natural radioactivity levels in soil to assess the dose for the population in order to know the health risks. The activity concentrations of (226)Ra, (232)Th and (40)K in the soil ranged from 5.7 ± 0.3 to 140 ± 7, from 9.0 ± 0.4 to 139 ± 7 and from 22 ± 1 to 319 ± 16 Bq kg(-1), respectively. The absorbed dose rate, annual effective dose rate, radium equivalent (Req), excess lifetime cancer risk, hazard indices (Hex and Hin) and annual gonadal dose equivalent, which resulted from the natural radionuclides in the soil were calculated. PMID:23509393

  14. Radioactivity concentrations and dose assessment for soil samples around nuclear power plant IV in Taiwan.

    Science.gov (United States)

    Tsai, Tsuey-Lin; Lin, Chun-Chih; Wang, Tzu-Wen; Chu, Tieh-Chi

    2008-09-01

    Activity concentrations and distributions of natural and man-made radionuclides in soil samples collected around nuclear power plant IV, Taiwan, were investigated for five years to assess the environmental radioactivity and characterisation of radiological hazard prior to commercial operation. The activity concentrations of radionuclides were determined via gamma-ray spectrometry using an HPGe detector. Data obtained show that the average concentrations of the (238)U and (232)Th series, and (40)K, were within world median ranges in the UNSCEAR report. The (137)Cs ranged from 2.46 +/- 0.55 to 12.13 +/- 1.31 Bq kg(-1). The terrestrial absorbed dose rate estimated by soil activity and directly measured with a thermoluminescence dosemeter (excluding cosmic rays), and the annual effective doses, were 45.63, 57.34 nGy h(-1) and 57.19 microSv, respectively. Experimental results were compared with international recommended values. Since the soil in this area is an important building material, the mean radium equivalent activity, external and inhalation hazard indices and the representative level index using various models given in the literature for the study area were 98.18 Bq kg(-1), 0.27, 0.34 and 0.73, respectively, which were below the recommended limits. Analytical results demonstrate that no radiological anomaly exists. The baseline data will prove useful and important in estimating the collective dose near the new nuclear power plant under construction in Taiwan. PMID:18714131

  15. Analysis of terrestrial natural radionuclides in soil samples and assessment of average effective dose

    International Nuclear Information System (INIS)

Radionuclides present in soil significantly affect terrestrial gamma radiation levels, which in turn can be used for the assessment of terrestrial gamma dose rates. Natural radioactivity analysis has been done for soil samples collected from different villages/towns of Hoshiarpur district of Punjab, India. The measurements have been carried out using an HPGe detector based high-resolution gamma spectrometry system. The calculated activity concentration values for the terrestrial gamma emitters 238U, 232Th and 40K have been found to vary from 8.89 to 56.71 Bq kg-1, from 137.32 to 334.47 Bq kg-1 and from 823.62 to 1064.97 Bq kg-1, respectively. The total average absorbed dose rate in the study areas is 185.32 nGy h-1. The calculated value of average radium equivalent activity (401.13 Bq kg-1) exceeds the permissible limit (370 Bq kg-1) recommended by the Organisation for Economic Co-operation and Development (OECD). The calculated average value of the external hazard index (Hex) is 1.097. The calculated values of indoor and outdoor annual effective doses vary from 0.61 to 1.28 mSv and from 0.15 to 0.32 mSv, respectively. A positive correlation (R2 = 0.71) has also been observed between the concentrations of 232Th and 40K. (author)
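Several of the soil-radioactivity records above report radium equivalent activity and the external hazard index. These are computed from the 226Ra (or 238U), 232Th and 40K activity concentrations with the standard weightings Raeq = ARa + 1.43 ATh + 0.077 AK and Hex = ARa/370 + ATh/259 + AK/4810. A small sketch, with hypothetical activity values in the ranges reported above rather than the measured averages:

```python
def radium_equivalent(a_ra, a_th, a_k):
    """Raeq (Bq/kg) from 226Ra, 232Th and 40K activity concentrations,
    weighting Th and K by their relative gamma dose contributions."""
    return a_ra + 1.43 * a_th + 0.077 * a_k

def external_hazard_index(a_ra, a_th, a_k):
    """Hex; values <= 1 roughly correspond to Raeq below the
    370 Bq/kg permissible limit."""
    return a_ra / 370.0 + a_th / 259.0 + a_k / 4810.0

# Hypothetical activities (Bq/kg), chosen inside the ranges above:
print(radium_equivalent(30.0, 230.0, 940.0))       # 431.28
print(external_hazard_index(30.0, 230.0, 940.0))
```

With these illustrative inputs both values exceed the recommended limits, matching the pattern the Hoshiarpur record reports for its averages.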

  16. Automated dynamic hollow fiber liquid-liquid-liquid microextraction combined with capillary electrophoresis for speciation of mercury in biological and environmental samples.

    Science.gov (United States)

    Li, Pingjing; He, Man; Chen, Beibei; Hu, Bin

    2015-10-01

A simple home-made automatic dynamic hollow fiber based liquid-liquid-liquid microextraction (AD-HF-LLLME) device was designed and constructed for the simultaneous extraction of organomercury and inorganic mercury species with the assistance of a programmable flow injection analyzer. With 18-crown-6 as the complexing reagent, mercury species including methyl-, ethyl-, phenyl- and inorganic mercury were extracted into the organic phase (chlorobenzene), and then back-extracted into the acceptor phase of 0.1% (m/v) 3-mercapto-1-propanesulfonic acid (MPS) aqueous solution. Compared with the automatic static (AS)-HF-LLLME system, extraction equilibrium of the target mercury species was reached in a shorter time and with higher extraction efficiency in the AD-HF-LLLME system. On this basis, a new method of AD-HF-LLLME coupled with large volume sample stacking (LVSS)-capillary electrophoresis (CE)/UV detection was developed for the simultaneous analysis of methyl-, phenyl- and inorganic mercury species in biological samples and environmental water. Under the optimized conditions, AD-HF-LLLME provided high enrichment factors (EFs) of 149-253-fold within a relatively short extraction equilibrium time (25 min) and good precision, with RSDs between 3.8 and 8.1%. By combining AD-HF-LLLME with LVSS-CE/UV, EFs were magnified up to 2195-fold and the limits of detection (at S/N=3) for the target mercury species were improved to sub-ppb level.

  17. Fast automated dual-syringe based dispersive liquid-liquid microextraction coupled with gas chromatography-mass spectrometry for the determination of polycyclic aromatic hydrocarbons in environmental water samples.

    Science.gov (United States)

    Guo, Liang; Tan, Shufang; Li, Xiao; Lee, Hian Kee

    2016-03-18

An automated procedure, combining low-density-solvent-based solvent-demulsification dispersive liquid-liquid microextraction (DLLME) with gas chromatography-mass spectrometry analysis, was developed for the determination of polycyclic aromatic hydrocarbons (PAHs) in environmental water samples. Capitalizing on a two-rail commercial autosampler, the procedure enabled fast solvent transfer using a large-volume syringe dedicated to the DLLME process and convenient extract collection using a small-volume microsyringe for better GC performance. Extraction parameters, including the type and volume of extraction solvent, the type and volume of dispersive solvent and demulsification solvent, extraction and demulsification time, and the speed of solvent injection, were investigated and optimized. Under the optimized conditions, the linearity ranged from 0.1 to 50 μg/L, 0.2 to 50 μg/L, or 0.5 to 50 μg/L, depending on the analytes. Limits of detection were determined to be between 0.023 and 0.058 μg/L. The method was applied to determine PAHs in environmental water samples.
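Detection limits like those reported above are commonly estimated from the calibration slope and blank noise as LOD = 3·sd_blank / slope. A minimal sketch with a hypothetical calibration curve (the concentrations, responses, and blank standard deviation are invented for illustration, not the study's data):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_3sigma(blank_sd, slope):
    """Detection limit as 3 x blank standard deviation / calibration slope."""
    return 3.0 * blank_sd / slope

# Hypothetical calibration: concentration (ug/L) vs. detector response
conc = [0.1, 0.5, 1.0, 5.0, 10.0, 50.0]
resp = [12.0, 60.0, 119.0, 601.0, 1198.0, 6003.0]
slope, intercept = fit_line(conc, resp)
print(round(lod_3sigma(1.2, slope), 3))
```

With these made-up numbers the estimate lands near 0.03 μg/L, i.e. in the same order of magnitude as the 0.023-0.058 μg/L range the record reports.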

  18. Radiological assessment of fish samples due to natural radionuclides in river Yobe, Northern Nigeria

    International Nuclear Information System (INIS)

An assessment of the natural radioactivity of fish samples from river Yobe was conducted using gamma spectroscopy with a NaI(Tl) detector. Radioactivity is a phenomenon that leads to the production of radiation, and radiation is known to induce cancer. The fish were analyzed to estimate the activity concentrations due to the natural radionuclides 226Ra, 232Th and 40K. The results show that the activity concentration of 226Ra in all the fish samples collected ranges from 15.23±2.45 Bq kg-1 to 67.39±2.13 Bq kg-1, with an average value of 34.13±1.34 Bq kg-1. That of 232Th ranges from 42.66±0.81 Bq kg-1 to 201.18±3.82 Bq kg-1, with an average of 96.01±3.82 Bq kg-1. The activity concentration of 40K ranges between 243.3±1.56 Bq kg-1 and 618.2±2.81 Bq kg-1, with an average of 413.92±1.7 Bq kg-1. The study indicates that the average daily intake due to natural activity from the fish is 0.913 Bq/day, 2.577 Bq/day and 11.088 Bq/day for 226Ra, 232Th and 40K, respectively. Most of the fish activity concentrations are within acceptable limits. However, fish from locations F02, F07 and F12 were outliers, with significant effective dose values of 112.53 μSv y-1, 121.11 μSv y-1 and 114.32 μSv y-1, respectively. This could be attributed to variation in geological formations within the river as well as to the feeding habits of these fish. The work shows that consumers of fish from river Yobe face no significant risk from radioactivity ingestion, although no amount of radiation is assumed to be totally safe.

  19. A small sample-size automated adiabatic calorimeter from 70 to 580 K——Molar heat capacities of α-Al2O3

    Institute of Scientific and Technical Information of China (English)

    谭志诚; 张际标; 孟霜鹤; 李莉

    1999-01-01

An automatic adiabatic calorimeter for measuring heat capacities in the temperature range 70—580 K, equipped with a small sample cell with an internal volume of 7.4 cm³, has been developed. In order to obtain good adiabatic conditions at high temperature, the calorimeter was surrounded in sequence by two adiabatic shields, three radiation shields and an auxiliary temperature-controlled sheath. The main body of the cell, made of copper, and the lid, made of brass, are silver-soldered, and the cell is sealed with a copper screw cap. A sealing gasket made of Pb-Sn alloy is placed between the cap and the lid to ensure high-vacuum sealing of the cell over the whole experimental temperature range. All the leads are insulated and fixed with W30-11 varnish, so good electrical insulation is obtained at high temperature. All the experimental data, including those for energy and temperature, are collected and processed automatically by a personal computer using a predetermined program. To verify the

  20. Development of an automated sampling-analysis system for simultaneous measurement of reactive oxygen species (ROS) in gas and particle phases: GAC-ROS

    Science.gov (United States)

    Huang, Wei; Zhang, Yuanxun; Zhang, Yang; Zeng, Limin; Dong, Huabin; Huo, Peng; Fang, Dongqing; Schauer, James J.

    2016-06-01

A novel online system, GAC-ROS, for simultaneous measurement of reactive oxygen species (ROS) in both gas and particle phases was developed based on the 2′,7′-dichlorofluorescin (DCFH) assay to provide fast sampling and analysis of atmospheric ROS. The GAC-ROS, composed of a Gas and Aerosol Collector (GAC), a series of reaction and transportation systems, and a fluorescence detector, was tested for instrumental performance in the laboratory. Results showed good performance, with a favorable R2 value for the calibration curve (above 0.998), high penetration efficiencies of ROS (above 99.5%), and low detection limits (gas-phase ROS: 0.16 nmol H2O2 m-3; particle-phase ROS: 0.12 nmol H2O2 m-3). Laboratory comparison between online and offline methods for particle-bound ROS showed significant loss of ROS due to the relatively long off-line treatment time. Field observations in Beijing found that ROS concentrations in winter were significantly higher than those observed in spring. Only a few weak positive correlations were found between ROS and some air pollutants, which reflects the complexities of ROS generation and transformation in the atmosphere. This study was the first to simultaneously obtain concentrations of gas- and particle-phase ROS using an online method. Consequently, it provides a powerful tool to characterize the oxidizing capacity of the atmosphere and its sources.

  1. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  2. Assessment of polychlorinated biphenyls and organochlorine pesticides in water samples from the Yamuna River

    Directory of Open Access Journals (Sweden)

    Bhupander Kumar

    2012-07-01

Polychlorinated biphenyls (PCBs), hexachlorocyclohexane (HCH) and dichlorodiphenyltrichloroethane (DDT) are toxic, persistent and bioaccumulative long-range atmospheric transport pollutants. These are transported worldwide, affecting remote regions far from their original sources, and can transfer into food webs with a wide range of acute and chronic health effects. India ratified the Stockholm Convention with the intention of reducing and eliminating persistent organic pollutants (POPs), and encouraged the support of research on POPs. Despite the ban and restriction on the use of these chemicals in India, their contamination of air, water, sediment, biota and humans has been reported. In this study, surface water samples were collected during January 2012 from the Yamuna River in Delhi, India, and analyzed for PCBs and organochlorine pesticides (OCPs). The concentrations of ΣPCBs and ΣOCPs ranged between 2-779 ng L–1 and from less than 0.1 to 618 ng L–1 (mean 99±38 ng L–1 and 221±50 ng L–1, respectively). The PCB homolog profile was dominated by 3-4 chlorinated biphenyls. In calculating the toxicity equivalent of dioxin-like PCBs (dl-PCBs) using World Health Organization toxic equivalency factors, dl-PCBs accounted for 10% of a total of 27 PCBs. The concentration of ΣHCH ranged between less than 0.1 and 285 ng L–1 (mean 151±32 ng L–1), while ΣDDT concentrations varied between less than 0.1 and 354 ng L–1 (mean 83±26 ng L–1). The concentrations were lower than the US guideline values; however, levels of lindane exceeded those recommended in guidelines. Further in-depth study is proposed to determine the bioaccumulation of these pollutants through aquatic biota to assess the risk of contaminants to human health.
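The toxicity-equivalent calculation mentioned above weights each dioxin-like congener concentration by its WHO toxic equivalency factor (TEF) and sums the results. A minimal sketch, with an illustrative TEF subset and hypothetical concentrations (not the study's measurements):

```python
# WHO-2005 TEFs for a few dioxin-like PCB congeners (illustrative
# subset; the study used the full WHO TEF scheme).
TEF = {"PCB-77": 0.0001, "PCB-81": 0.0003, "PCB-126": 0.1, "PCB-169": 0.03}

def toxicity_equivalent(concs_ng_per_l):
    """TEQ = sum(concentration_i x TEF_i) over dl-PCB congeners."""
    return sum(c * TEF[name] for name, c in concs_ng_per_l.items())

# Hypothetical congener concentrations (ng/L), not measured values:
sample = {"PCB-77": 5.0, "PCB-81": 1.0, "PCB-126": 0.2, "PCB-169": 0.1}
print(toxicity_equivalent(sample))
```

Because TEFs span several orders of magnitude, a low-concentration but high-TEF congener such as PCB-126 typically dominates the TEQ, as in this example.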

  3. Assessing the efficacy of hair snares as a method for noninvasive sampling of Neotropical felids

    Directory of Open Access Journals (Sweden)

    Tatiana P. Portella

    2013-02-01

Hair snares have been used in North and Central America for a long time in assessment and monitoring studies of several mammalian species. This method can provide a cheap, suitable, and efficient way to monitor mammals because it combines characteristics that are not present in most alternative techniques. However, despite their usefulness, hair snares are rarely used in other parts of the world. The aim of our study was to evaluate the effectiveness of hair snares and three scent lures (cinnamon, catnip, and vanilla) in the detection of felids in one of the largest remnants of the Brazilian Atlantic Forest. We performed tests with six captive felid species - Panthera onca (Linnaeus, 1758), Leopardus pardalis (Linnaeus, 1758), L. tigrinus (Schreber, 1775), L. wiedii (Schinz, 1821), Puma concolor (Linnaeus, 1771), and P. yagouaroundi (É. Geoffroy Saint-Hilaire, 1803) - to examine their responses to the attractants, and to correlate those with lure efficiency in the field. The field tests were conducted at the Parque Estadual Pico do Marumbi, state of Paraná, Brazil. Hair traps were placed on seven transects. There were equal numbers of traps with each scent lure, for a total of 1,551 trap-days. In captivity, vanilla provided the greatest response, yet no felids were detected in the field with any of the tested lures, although other species were recorded. Based on the sampling of non-target species, and on comparison with similar studies elsewhere, this study points to a possible caveat of this method when rare species or small populations are concerned. Meanwhile, we believe that improved hair snares could provide important results with several species in the location tested and others.

  4. Assessment of vadose zone sampling methods for detection of preferential herbicide transport

    Directory of Open Access Journals (Sweden)

    N. P. Peranginangin

    2009-11-01

Accurate soil water sampling is needed for monitoring of pesticide leaching through the vadose zone, especially in soils with significant preferential flowpaths. We assessed the effectiveness of wick and gravity pan lysimeters as well as ceramic cups (installed 45–60 cm deep) in strongly-structured silty clay loam (Hudson series) and weakly-structured fine sandy loam (Arkport series) soils. Simulated rainfall (10–14 cm in 4 d, approximately equal to a 10-yr, 24 h storm) was applied following concurrent application of agronomic rates (0.2 g m−2) of atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichloro-phenoxy-acetic acid) immediately following application of a chloride tracer (22–44 g m−2). Preferential flow mechanisms were observed in both soils, with herbicide and tracer mobility greater than would be predicted by uniform flow. Preferential flow was more dominant in the Hudson soil, with earlier breakthroughs observed. Mean wick and gravity pan sampler percolate concentrations at 60 cm depth ranged from 96 to 223 μg L−1 for atrazine and 54 to 78 μg L−1 for 2,4-D at the Hudson site, and from 7 to 22 μg L−1 for atrazine and 0.5 to 2.8 μg L−1 for 2,4-D at the Arkport site. Gravity and wick pan lysimeters had comparably good collection efficiencies at elevated soil moisture levels, whereas wick pan samplers performed better at lower moisture contents. Cup samplers performed poorly, with wide variations in collections and solute concentrations.

  5. Sampling and Analysis Instruction for Assessing Chemical Vulnerability Potential in REDOX and U Plants

    International Nuclear Information System (INIS)

The purpose of this sampling and analysis instruction is to provide the sampling and analytical approach to be used to determine if the constituents that are present pose a threat to human health or the environment. A secondary purpose of this sampling effort is to gather analytical data that will be used to characterize the contents of each tank for waste characterization/disposal.

  6. Spatial scan statistics to assess sampling strategy of antimicrobial resistance monitoring programme

    DEFF Research Database (Denmark)

    Vieira, Antonio; Houe, Hans; Wegener, Henrik Caspar;

    2009-01-01

sampled by the Danish Integrated Antimicrobial Resistance Monitoring and Research Programme (DANMAP), by identifying spatial clusters of samples and detecting areas with significantly high or low sampling rates. These analyses were performed for each year and for the total 5-year study period for all

  7. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    Directory of Open Access Journals (Sweden)

    Richard J. Venedam

    2005-02-01

The capabilities of a “universal platform” for the deployment of analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform was previously used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium (VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed for the purpose of collecting data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the previous sampling and analysis methods employed at the Site.

  8. What can one sample tell us? Stable isotopes can assess complex processes in national assessments of lakes, rivers and streams.

    Science.gov (United States)

    Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...

  9. Application of Fourier Transform Infrared Spectroscopy (FTIR) for assessing biogenic silica sample purity in geochemical analyses and palaeoenvironmental research

    OpenAIRE

    G. E. A. Swann; S. V. Patwardhan

    2011-01-01

    The development of a rapid and non-destructive method to assess purity levels in samples of biogenic silica prior to geochemical/isotope analysis remains a key objective in improving both the quality and use of such data in environmental and palaeoclimatic research. Here a Fourier Transform Infrared Spectroscopy (FTIR) mass-balance method is demonstrated for calculating levels of contamination in cleaned sediment core diatom samples from Lake Baikal, Russia. Following the selection of end-mem...

  10. Application of Fourier Transform Infrared Spectroscopy (FTIR) for assessing biogenic silica sample purity in geochemical analyses and palaeoenvironmental research

    OpenAIRE

    G. E. A. Swann; S. V. Patwardhan

    2010-01-01

    The development of a rapid and non-destructive method to assess levels of purity in samples of biogenic silica prior to geochemical/isotope analysis remains a key objective in improving both the quality and use of such data in environmental and palaeoclimatic research. Here a Fourier Transform Infrared Spectroscopy (FTIR) mass-balance method is demonstrated for calculating levels of contamination in cleaned sediment core diatom samples from Lake Baikal, Russia. Following the selection of end-m...

  11. Assessing Eating Disorder Risk: The Pivotal Role of Achievement Anxiety, Depression and Female Gender in Non-Clinical Samples

    OpenAIRE

    Christos C. Frangos; Fragkos, Konstantinos C.

    2013-01-01

    The objective of the present study was to assess factors predicting eating disorder risk in a sample of undergraduate students. A structured questionnaire was administered to a random sample (n = 1865) consisting of the following sections: demographics, SCOFF (Sick, Control, One stone, Fat, Food) questionnaire for screening eating disorders and the Achievement Anxiety Test and the Depression, Anxiety and Stress Scale. The proportion of students at risk for eating disorders (SCOFF score ≥2) was 39.7%. Eating d...

  12. Murine Automated Urine Sampler (MAUS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal outlines planned development for a low-power, low-mass automated urine sample collection and preservation system for small mammals, capable of...

  13. Microbiological assessment along the fish production chain of the Norwegian pelagic fisheries sector--Results from a spot sampling programme.

    Science.gov (United States)

    Svanevik, Cecilie Smith; Roiha, Irja Sunde; Levsen, Arne; Lunestad, Bjørn Tore

    2015-10-01

    Microbes play an important role in the degradation of fish products, thus better knowledge of the microbiological conditions throughout the fish production chain may help to optimise product quality and resource utilisation. This paper presents the results of a ten-year spot sampling programme (2005-2014) of the commercially most important pelagic fish species harvested in Norway. Fish-, surface-, and storage water samples were collected from fishing vessels and processing factories. In total, 1,181 samples were assessed with respect to microbiological quality, hygiene and food safety. We introduce a quality and safety assessment scheme for fresh pelagic fish recommending limits for heterotrophic plate counts (HPC), thermotolerant coliforms, enterococci and Listeria monocytogenes. According to the scheme, in 25 of 41 samplings, sub-optimal conditions were found with respect to quality, whereas in 21 and 9 samplings, samples were not in compliance concerning hygiene and food safety, respectively. The present study has revealed that the quality of pelagic fish can be optimised by improving the hygiene conditions at some critical points at an early phase of the production chain. Thus, the proposed assessment scheme may provide a useful tool for the industry to optimise quality and maintain consumer safety of pelagic fishery products. PMID:26187839
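A scheme of this kind reduces to checking each sample's counts against parameter-specific limits. The limit values in the sketch below are placeholders for illustration only, not the limits recommended in the study:

```python
# Illustrative classifier for a quality/safety scheme of the kind described;
# every limit value here is a placeholder, not the study's recommendation.
LIMITS = {
    "hpc_cfu_per_g": 5e5,          # heterotrophic plate count
    "coliforms_cfu_per_g": 10,     # thermotolerant coliforms
    "enterococci_cfu_per_g": 10,
    "listeria_per_25g": 0,         # L. monocytogenes: absent in 25 g
}

def assess(sample: dict) -> str:
    """Return 'compliant', or name the parameters exceeding their limits."""
    failures = [k for k, limit in LIMITS.items() if sample.get(k, 0) > limit]
    return "compliant" if not failures else "exceeds: " + ", ".join(failures)

print(assess({"hpc_cfu_per_g": 2e5, "coliforms_cfu_per_g": 3}))  # compliant
print(assess({"hpc_cfu_per_g": 8e5, "listeria_per_25g": 1}))
```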

  14. Active and passive sampling for the assessment of hydrophilic organic contaminants in a river basin-ecotoxicological risk assessment.

    Science.gov (United States)

    Terzopoulou, Evangelia; Voutsa, Dimitra

    2016-03-01

    This study presents a complementary approach for the evaluation of water quality in a river basin by employing active and passive sampling. Thirty-eight hydrophilic organic compounds (HpOCs) (organohalogen herbicides, organophosphorous pesticides, carbamate, triazine, urea, pharmaceuticals, phenols, and industrial chemicals) were studied in grab water samples and in passive samplers (POCIS) collected along Strymonas River, Northern Greece, in three sampling campaigns during 2013. Almost all the target compounds were detected in the periods of high rainfall intensity and/or low flow rate. The most frequently detected compounds were aminocarb, carbaryl, chlorfenvinphos, chlorpropham, 2,4-D, diflubenzuron, diuron, isoproturon, metolachlor, and salicylic acid. Bisphenol A and nonylphenol were also occasionally detected. The use of POCIS allowed the detection of more micropollutants than active sampling. Low discrepancy between the concentrations obtained from the two samplings was observed, at least for compounds with >50 % detection frequency; thus, POCIS could be a valuable tool for the selection and monitoring of the most relevant HpOCs in the river basin. Results showed relatively low risk from the presence of HpOCs; however, the potential risk associated with micropollutants such as carbaryl, dinoseb, diuron, fenthion, isoproturon, metolachlor, nonylphenol, and salicylic acid should not be neglected. PMID:26573318
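POCIS results are typically converted to time-weighted average water concentrations with the standard linear-uptake model C_TWA = N / (Rs · t), where N is the mass accumulated in the sampler, Rs a compound-specific sampling rate from calibration studies, and t the deployment time. A minimal sketch with hypothetical numbers:

```python
def pocis_twa_concentration(mass_ng: float, sampling_rate_l_per_day: float,
                            days: float) -> float:
    """Time-weighted average water concentration (ng/L) from a POCIS
    deployment, using the linear-uptake model C = N / (Rs * t).
    Rs is compound-specific and must come from calibration data."""
    return mass_ng / (sampling_rate_l_per_day * days)

# Hypothetical numbers: 42 ng of a herbicide accumulated over a 14-day
# deployment, assuming Rs = 0.2 L/day.
print(round(pocis_twa_concentration(42.0, 0.2, 14.0), 2))  # 15.0 ng/L
```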

  15. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Though some, at least, of the goals of automation systems for analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, still await practical methods of automation. Important as the clinical diagnostic applications are, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed

  16. Quality assurance guidance for laboratory assessment plates in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    This document is one of several guidance documents developed to support the EM (DOE Environmental Restoration and Waste Management) Analytical Services program. Its purpose is to introduce assessment plates that can be used to conduct performance assessments of an organization's or project's ability to meet quality goals for analytical laboratory activities. These assessment plates are provided as non-prescriptive guidance to EM-support organizations responsible for collection of environmental data for remediation and waste management programs at DOE facilities. The assessments objectively evaluate all components of the analytical laboratory process to determine their proper selection and use.

  17. Assessment of crystalline disorder in cryo-milled samples of indomethacin using atomic pair-wise distribution functions

    DEFF Research Database (Denmark)

    Bøtker, Johan P; Karmwar, Pranav; Strachan, Clare J;

    2011-01-01

    to analyse the cryo-milled samples. The high similarity between the γ-indomethacin cryogenic ball milled samples and the crude γ-indomethacin indicated that milled samples retained residual order of the γ-form. The PDF analysis encompassed the capability of achieving a correlation with the physical...... properties determined from DSC, ss-NMR and stability experiments. Multivariate data analysis (MVDA) was used to visualize the differences in the PDF and XRPD data. The MVDA approach revealed that PDF is more efficient in assessing the introduced degree of disorder in γ-indomethacin after cryo-milling than...

  18. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2010-04-01

    Full Text Available The article presents concepts and definitions from different sources concerning automation. The work approaches automation by virtue of the author’s experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet’s thoughts about and approaches to the problem of automation and its current state are taken and examined, especially that referring to the problem’s relationship with reconciling the level of automation with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  19. SU-E-I-81: Assessment of CT Radiation Dose and Image Quality for An Automated Tube Potential Selection Algorithm Using Adult Anthropomorphic and ACR Phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Mahmood, U; Erdi, Y; Wang, W [Memorial Sloan Kettering Cancer Center, NY, NY (United States)

    2014-06-01

    Purpose: To assess the impact of General Electric's (GE) automated tube potential selection algorithm, kV assist (kVa), on radiation dose and image quality, with an emphasis on optimizing protocols based on noise texture. Methods: Radiation dose was assessed by inserting optically stimulated luminescence dosimeters (OSLs) throughout the body of an adult anthropomorphic phantom (CIRS). The baseline protocol was: 120 kVp, Auto mA (180 to 380 mA), noise index (NI) = 14, adaptive statistical iterative reconstruction (ASiR) of 20%, 0.8 s rotation time. Image quality was evaluated by calculating the contrast to noise ratio (CNR) and noise power spectrum (NPS) from the ACR CT accreditation phantom. CNRs were calculated according to the steps described in the ACR CT phantom testing document. NPS was determined by taking the 3D FFT of the uniformity section of the ACR phantom. NPS and CNR were evaluated with and without kVa and for all available ASiR settings, ranging from 0 to 100%. Each NPS was also evaluated for its peak frequency difference (PFD) with respect to the baseline protocol. Results: The CNR for the adult male was found to decrease from CNR = 0.912 ± 0.045 for the baseline protocol without kVa to CNR = 0.756 ± 0.049 with kVa activated. When compared against the baseline protocol, the PFD at ASiR of 40% yielded a decrease in noise magnitude, as realized by the increase in CNR = 0.903 ± 0.023. The difference in the central liver dose with and without kVa was found to be 0.07%. Conclusion: Dose reduction was insignificant in the adult phantom. As determined by NPS analysis, ASiR of 40% produced images with similar noise texture to the baseline protocol. However, the CNR at ASiR of 40% with kVa fails to meet the current ACR CNR passing requirement of 1.0.
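The CNR figure of merit used above can be sketched as the difference between mean ROI values over the background noise. The ACR testing document prescribes the exact ROIs and rounding, so this generic version with synthetic pixel values is only illustrative:

```python
import numpy as np

def cnr(roi: np.ndarray, background: np.ndarray) -> float:
    """Contrast-to-noise ratio in the spirit of the ACR CT phantom test:
    difference of the mean ROI values over the background standard
    deviation. (A generic sketch, not the exact ACR procedure.)"""
    return abs(roi.mean() - background.mean()) / background.std(ddof=1)

# Synthetic pixel samples (HU) for a low-contrast insert and the
# adjacent background, drawn with a fixed seed for reproducibility.
rng = np.random.default_rng(0)
insert_roi = rng.normal(100.0, 5.0, 1000)
bg_roi = rng.normal(90.0, 5.0, 1000)
print(round(cnr(insert_roi, bg_roi), 2))
```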

  20. Análise de fármacos em material biológico: acoplamento microextração em fase sólida "no tubo" e cromatografia líquida de alta eficiência Analysis of drugs in biological samples: automated "in-tube" solid-phase microextraction and high performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Maria Eugênia C. Queiroz

    2005-10-01

    Full Text Available A new solid phase microextraction (SPME) system, known as in-tube SPME, was recently developed using an open tubular fused-silica capillary column, instead of an SPME fiber, as the SPME device. On-line in-tube SPME is usually used in combination with high performance liquid chromatography. Drugs in biological samples are directly extracted and concentrated in the stationary phase of capillary columns by repeated draw/eject cycles of sample solution, and then directly transferred to the liquid chromatographic column. In-tube SPME is suitable for automation. Automated sample handling procedures not only shorten the total analysis time, but also usually provide better accuracy and precision relative to manual techniques. In-tube SPME has been demonstrated to be a very effective and highly sensitive technique to determine drugs in biological samples for various purposes such as therapeutic drug monitoring, clinical toxicology, bioavailability and pharmacokinetics.

  1. Automated extraction of lysergic acid diethylamide (LSD) and N-demethyl-LSD from blood, serum, plasma, and urine samples using the Zymark RapidTrace with LC/MS/MS confirmation.

    Science.gov (United States)

    de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X

    1998-05-01

    A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) mass spectrometry/mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different day) relative standard deviation (RSD) was 2.2% and 4.4%, respectively.
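The within-run and between-run precision figures quoted above are relative standard deviations. A minimal sketch with hypothetical replicate measurements (not the paper's raw data):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%): sample SD over the mean, times 100."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate LSD measurements (ng/mL) at a 1.0 ng/mL spike;
# within-run replicates from one batch, between-run means from different days.
within_run = [0.99, 1.01, 1.02, 0.98, 1.00]
between_run_means = [1.00, 0.96, 1.04, 1.01]
print(round(rsd_percent(within_run), 2))
print(round(rsd_percent(between_run_means), 2))
```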

  2. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Cabalin, L.M.; Gonzalez, A. [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain); Ruiz, J. [Department of Applied Physics I, University of Malaga, E-29071 Malaga (Spain); Laserna, J.J., E-mail: laserna@uma.e [Department of Analytical Chemistry, University of Malaga, E-29071 Malaga (Spain)

    2010-08-15

    Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m s{sup -1}. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.

  3. Use of CFD for static sampling hood design: An example for methane flux assessment on landfill surfaces.

    Science.gov (United States)

    Lucernoni, Federico; Rizzotto, Matteo; Tapparo, Federica; Capelli, Laura; Sironi, Selena; Busini, Valentina

    2016-11-01

    The work focuses on the principles for the design of a specific static hood and on the definition of an optimal sampling procedure for the assessment of landfill gas (LFG) surface emissions. This is carried out by means of computational fluid dynamics (CFD) simulations to investigate the fluid dynamics conditions of the hood. The study proves that understanding the fluid dynamic conditions is fundamental in order to understand the sampling results and correctly interpret the measured concentration values by relating them to a suitable LFG emission model, and therefore to estimate emission rates. For this reason, CFD is a useful tool for the design and evaluation of sampling systems, among others, to verify the fundamental hypotheses on which the mass balance for the sampling hood is defined. The procedure here discussed, which is specific for the case of the investigated landfill, can be generalized to be applied also to different scenarios, where hood sampling is involved. PMID:27540761

  4. Assessment of Borderline Personality Features in Population Samples: Is the Personality Assessment Inventory-Borderline Features Scale Measurement Invariant across Sex and Age?

    Science.gov (United States)

    De Moor, Marleen H. M.; Distel, Marijn A.; Trull, Timothy J.; Boomsma, Dorret I.

    2009-01-01

    Borderline personality disorder (BPD) is more often diagnosed in women than in men, and symptoms tend to decline with age. Using a large community sample, the authors investigated whether sex and age differences in four main features of BPD, measured with the "Personality Assessment Inventory-Borderline Features" scale (PAI-BOR; Morey, 1991), are…

  5. Assessment of borderline personality features in population samples: Is the Personality Assessment Inventory-Borderline Features scale measurement invariant across sex and age?

    NARCIS (Netherlands)

    Moor, de M.H.M.; Distel, M.A.; Trull, T.J.; Boomsma, D.I.

    2009-01-01

    Borderline personality disorder (BPD) is more often diagnosed in women than in men, and symptoms tend to decline with age. Using a large community sample, the authors investigated whether sex and age differences in four main features of BPD, measured with the Personality Assessment Inventory-Borderline Features scale (PAI-BOR; Morey, 1991), are…

  6. Assessment of electrical charge on airborne microorganisms by a new bioaerosol sampling method.

    Science.gov (United States)

    Lee, Shu-An; Willeke, Klaus; Mainelis, Gediminas; Adhikari, Atin; Wang, Hongxia; Reponen, Tiina; Grinshpun, Sergey A

    2004-03-01

    Bioaerosol sampling is necessary to monitor and control human exposure to harmful airborne microorganisms. An important parameter affecting the collection of airborne microorganisms is the electrical charge on the microorganisms. Using a new design of an electrostatic precipitator (ESP) for bioaerosol sampling, the polarity and relative strength of the electrical charges on airborne microorganisms were determined in several laboratory and field environments by measuring the overall physical collection efficiency and the biological collection efficiency at specific precipitation voltages and polarities. First, bacteria, fungal spores, and dust dispersed from soiled carpets were sampled in a walk-in test chamber. Second, a simulant of anthrax-causing Bacillus anthracis spores was dispersed and sampled in the same chamber. Third, bacteria were sampled in a small office while four adults were engaged in lively discussions. Fourth, bacteria and fungal spores released from hay and horse manure were sampled in a horse barn during cleanup operations. Fifth, bacteria in metalworking fluid droplets were sampled in a metalworking simulator. It was found that the new ESP differentiates between positively and negatively charged microorganisms, and that in most of the tested environments the airborne microorganisms had a net negative charge. This adds a signature to the sampled microorganisms that may assist in their identification or differentiation, for example, in an anti-bioterrorism network.
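The inference of net charge polarity from collection efficiencies at the two precipitation polarities can be sketched as follows. The counts and the simple efficiency definition E = 1 - C_downstream/C_upstream are illustrative assumptions, not the study's data or exact method:

```python
def collection_efficiency(upstream: float, downstream: float) -> float:
    """Fraction of particles removed by the precipitator,
    E = 1 - C_downstream / C_upstream."""
    return 1.0 - downstream / upstream

# Hypothetical concentrations (particles/L) with the collection electrode
# held at each polarity; higher efficiency when collecting onto the
# positive electrode suggests a net negative charge on the microorganisms.
e_pos_electrode = collection_efficiency(upstream=1000, downstream=250)
e_neg_electrode = collection_efficiency(upstream=1000, downstream=700)
polarity = "net negative" if e_pos_electrode > e_neg_electrode else "net positive"
print(e_pos_electrode, e_neg_electrode, polarity)
```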

  7. Frequent sampling by clear venipuncture in unstable angina is a reliable method to assess haemostatic system activity

    NARCIS (Netherlands)

    Biasucci, L.M.; Liuzzo, G.; Caligiuri, G.; Monaco, C.; Quaranta, G.; Sperti, G.; Greef, W. van de; Maseri, A.; Kluft, C.

    1994-01-01

    Sudden limitations in coronary flow account for the majority of cases of unstable angina (UA). Measurement of TAT in peripheral blood represents a reliable marker of an ongoing thrombotic process. The aim of the study was to assess the reliability of frequent blood sampling (Phase A) and to correlate TAT fluctuation t

  8. A Comparison of Momentary Time Sampling and Partial-Interval Recording for Assessment of Effects of Social Skills Training

    Science.gov (United States)

    Radley, Keith C.; O'Handley, Roderick D.; Labrot, Zachary C.

    2015-01-01

    Assessment in social skills training often utilizes procedures such as partial-interval recording (PIR) and momentary time sampling (MTS) to estimate changes in duration in social engagements due to intervention. Although previous research suggests PIR to be more inaccurate than MTS in estimating levels of behavior, treatment analysis decisions…
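The systematic biases of the two estimators can be illustrated with a toy event record: PIR scores an interval if the behavior occurs at any point within it, and so tends to overestimate duration, while MTS scores only the state at the final moment of each interval. All values below are hypothetical:

```python
def true_percent(stream):
    """Actual percentage of observation time the behavior occurred."""
    return 100.0 * sum(stream) / len(stream)

def mts_percent(stream, interval):
    """Momentary time sampling: score only the last moment of each interval."""
    moments = stream[interval - 1 :: interval]
    return 100.0 * sum(moments) / len(moments)

def pir_percent(stream, interval):
    """Partial-interval recording: score an interval if the behavior
    occurred at any point within it (tends to overestimate duration)."""
    chunks = [stream[i : i + interval] for i in range(0, len(stream), interval)]
    return 100.0 * sum(any(c) for c in chunks) / len(chunks)

# Hypothetical 60-second observation (1 = socially engaged), 10 s intervals.
stream = [1]*5 + [0]*15 + [1]*10 + [0]*20 + [1]*5 + [0]*5
print(true_percent(stream), mts_percent(stream, 10), pir_percent(stream, 10))
# True duration is ~33%; PIR overestimates it, MTS here underestimates it.
```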

  9. Assessment of heavy metals in Averrhoa bilimbi and A. carambola fruit samples at two developmental stages.

    Science.gov (United States)

    Soumya, S L; Nair, Bindu R

    2016-05-01

    Though the fruits of Averrhoa bilimbi and A. carambola are economically and medicinally important, they remain underutilized. The present study reports heavy metal quantitation in the fruit samples of A. bilimbi and A. carambola (Oxalidaceae), collected at two stages of maturity. Heavy metals are known to interfere with the functioning of vital cellular components. Although toxic, some elements are considered essential for human health, in trace quantities. Heavy metals such as Cr, Mn, Co, Cu, Zn, As, Se, Pb, and Cd were analyzed by atomic absorption spectroscopy (AAS). The samples under investigation included, A. bilimbi unripe (BU) and ripe (BR), A. carambola sour unripe (CSU) and ripe (CSR), and A. carambola sweet unripe (CTU) and ripe (CTR). Heavy metal analysis showed that relatively higher level of heavy metals was present in BR samples compared to the rest of the samples. The highest amount of As and Se were recorded in BU samples while Mn content was highest in CSU samples and Co in CSR. Least amounts of Cr, Zn, Se, Cd, and Pb were noted in CTU while, Mn, Cu, and As were least in CTR. Thus, the sweet types of A. carambola (CTU, CTR) had comparatively lower heavy metal content. There appears to be no reason for concern since different fruit samples of Averrhoa studied presently showed the presence of various heavy metals in trace quantities.

  10. Assessment of heavy metals in Averrhoa bilimbi and A. carambola fruit samples at two developmental stages.

    Science.gov (United States)

    Soumya, S L; Nair, Bindu R

    2016-05-01

    Though the fruits of Averrhoa bilimbi and A. carambola are economically and medicinally important, they remain underutilized. The present study reports heavy metal quantitation in the fruit samples of A. bilimbi and A. carambola (Oxalidaceae), collected at two stages of maturity. Heavy metals are known to interfere with the functioning of vital cellular components. Although toxic, some elements are considered essential for human health, in trace quantities. Heavy metals such as Cr, Mn, Co, Cu, Zn, As, Se, Pb, and Cd were analyzed by atomic absorption spectroscopy (AAS). The samples under investigation included, A. bilimbi unripe (BU) and ripe (BR), A. carambola sour unripe (CSU) and ripe (CSR), and A. carambola sweet unripe (CTU) and ripe (CTR). Heavy metal analysis showed that relatively higher level of heavy metals was present in BR samples compared to the rest of the samples. The highest amount of As and Se were recorded in BU samples while Mn content was highest in CSU samples and Co in CSR. Least amounts of Cr, Zn, Se, Cd, and Pb were noted in CTU while, Mn, Cu, and As were least in CTR. Thus, the sweet types of A. carambola (CTU, CTR) had comparatively lower heavy metal content. There appears to be no reason for concern since different fruit samples of Averrhoa studied presently showed the presence of various heavy metals in trace quantities. PMID:27080855

  11. Validating the Diagnostic Infant and Preschool Assessment Using a Danish Trauma Sample

    DEFF Research Database (Denmark)

    Schandorph Løkkegaard, Sille; Elklit, Ask

    Background: There is a lack of validated assessment tools for identifying young children with posttraumatic stress disorder (PTSD). One of the few existing tools for children aged 1-6 years is the Diagnostic Infant and Preschool Assessment (DIPA; Scheeringa & Haslett, 2010). Purpose: To validate...... their mother; one third of the children have been exposed to family violence and taken shelter at a women’s shelter with their mother; and one third of the children have been exposed to maltreatment and are undergoing legal assessments by social services. Expected results: If the concurrent criterion validity...

  12. Configuration Management Automation (CMA)

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  13. Assessment of the differential linear coherent scattering coefficient of biological samples

    Science.gov (United States)

    Conceição, A. L. C.; Antoniassi, M.; Poletti, M. E.

    2010-07-01

    New differential linear coherent scattering coefficient, μ CS, data for four biological tissue types (fat pork, tendon chicken, adipose and fibroglandular human breast tissues) covering a large momentum transfer interval (0.07≤ q≤70.5 nm -1), resulting from the combination of WAXS and SAXS data, are presented in order to emphasize the need to update the default database by including molecular interference and large-scale arrangement effects. The results showed that the differential linear coherent scattering coefficient reflects the influence of large-scale arrangements, mainly due to collagen fibrils for the tendon chicken and fibroglandular breast samples, and triacylglycerides for the fat pork and adipose breast samples, in the low momentum transfer region. At high momentum transfer, in contrast, the μ CS reflects effects of molecular interference related to water for the tendon chicken and fibroglandular samples and fatty acids for the fat pork and adipose samples.
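The momentum transfer axis that unifies the SAXS and WAXS measurements follows from q = (4π/λ) sin(θ/2), with θ the scattering angle. A quick check of the quoted upper bound; the wavelength and angle below are assumptions for illustration, not values stated in the record:

```python
import math

def momentum_transfer(theta_deg: float, wavelength_nm: float) -> float:
    """Momentum transfer q = (4*pi / lambda) * sin(theta / 2), with theta
    the scattering angle; q comes out in nm^-1 for lambda in nm."""
    return 4.0 * math.pi / wavelength_nm * math.sin(math.radians(theta_deg) / 2.0)

# For Cu K-alpha radiation (lambda ~ 0.154 nm, an assumed value), a WAXS
# scattering angle of 120 degrees reaches q ~ 70.7 nm^-1, around the upper
# end of the interval quoted in the abstract.
print(round(momentum_transfer(120.0, 0.154), 1))  # 70.7
```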

  14. Assessment of the differential linear coherent scattering coefficient of biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Conceicao, A.L.C.; Antoniassi, M. [Departamento de Fisica e Matematica, FFCLRP, Universidade de Sao Paulo, Ribeirao Preto, 14040-901 Sao Paulo (Brazil); Poletti, M.E., E-mail: poletti@ffclrp.usp.b [Departamento de Fisica e Matematica, FFCLRP, Universidade de Sao Paulo, Ribeirao Preto, 14040-901 Sao Paulo (Brazil)

    2010-07-21

    New differential linear coherent scattering coefficient, {mu}{sub CS}, data for four biological tissue types (fat pork, tendon chicken, adipose and fibroglandular human breast tissues) covering a large momentum transfer interval (0.07{<=}q{<=}70.5 nm{sup -1}), resulting from the combination of WAXS and SAXS data, are presented in order to emphasize the need to update the default database by including molecular interference and large-scale arrangement effects. The results showed that the differential linear coherent scattering coefficient reflects the influence of large-scale arrangements, mainly due to collagen fibrils for the tendon chicken and fibroglandular breast samples, and triacylglycerides for the fat pork and adipose breast samples, in the low momentum transfer region. At high momentum transfer, in contrast, the {mu}{sub CS} reflects effects of molecular interference related to water for the tendon chicken and fibroglandular samples and fatty acids for the fat pork and adipose samples.

  15. MICROBIOLOGICAL FIELD SAMPLING AND INSTRUMENTATION IN THE ASSESSMENT OF SOIL AND GROUND-WATER POLLUTION

    Science.gov (United States)

    This chapter emphasizes the importance of microbiological sampling of soil and ground water with respect to human heath risks, laws and regulations dealing with safe drinking water, and more prevalent subsurface monitoring activities associated with chlorinated organic compounds,...

  16. DNA damage in preserved specimens and tissue samples: a molecular assessment

    Directory of Open Access Journals (Sweden)

    Cantin Elizabeth

    2008-10-01

    Full Text Available Abstract The extraction of genetic information from preserved tissue samples or museum specimens is a fundamental component of many fields of research, including the Barcode of Life initiative, forensic investigations, biological studies using scat sample analysis, and cancer research utilizing formaldehyde-fixed, paraffin-embedded tissue. Efforts to obtain genetic information from these sources are often hampered by an inability to amplify the desired DNA as a consequence of DNA damage. Previous studies have described techniques for improved DNA extraction from such samples or focused on the effect of damaging agents – such as light, oxygen or formaldehyde – on free nucleotides. We present ongoing work to characterize lesions in DNA samples extracted from preserved specimens. The extracted DNA is digested to single nucleosides with a combination of DNase I, Snake Venom Phosphodiesterase, and Antarctic Phosphatase and then analyzed by HPLC-ESI-TOF-MS. We present data for moth specimens that were preserved dried and pinned with no additional preservative and for frog tissue samples that were preserved in either ethanol, or formaldehyde, or fixed in formaldehyde and then preserved in ethanol. These preservation methods represent the most common methods of preserving animal specimens in museum collections. We observe changes in the nucleoside content of these samples over time, especially a loss of deoxyguanosine. We characterize the fragmentation state of the DNA and aim to identify abundant nucleoside lesions. Finally, simple models are introduced to describe the DNA fragmentation based on nicks and double-strand breaks.

  17. When is the best time to sample aquatic macroinvertebrates in ponds for biodiversity assessment?

    Science.gov (United States)

    Hill, M J; Sayer, C D; Wood, P J

    2016-03-01

    Ponds are sites of high biodiversity and conservation value, yet there is little or no statutory monitoring of them across most of Europe. There are clear and standardised protocols for sampling aquatic macroinvertebrate communities in ponds, but the most suitable time(s) to undertake the survey(s) remains poorly specified. This paper examined the aquatic macroinvertebrate communities from 95 ponds within different land use types over three seasons (spring, summer and autumn) to determine the most appropriate time to undertake sampling to characterise biodiversity. The combined samples from all three seasons provided the most comprehensive record of the aquatic macroinvertebrate taxa recorded within ponds (alpha and gamma diversity). Samples collected during the autumn survey yielded significantly greater macroinvertebrate richness (76% of the total diversity) than either spring or summer surveys. Macroinvertebrate diversity was greatest during autumn in meadow and agricultural ponds, but taxon richness among forest and urban ponds did not differ significantly temporally. The autumn survey provided the highest measures of richness for Coleoptera, Hemiptera and Odonata. However, richness of the aquatic insect order Trichoptera was highest in spring and lowest in autumn. The results illustrate that multiple surveys, covering more than one season, provide the most comprehensive representation of macroinvertebrate biodiversity. When sampling can only be undertaken on one occasion, the most appropriate time to undertake surveys to characterise the macroinvertebrate community biodiversity is during autumn, although this may need to be modified if other floral and faunal groups need to be incorporated into the sampling programme. PMID:26920128
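The alpha/gamma comparison underlying the study amounts to set arithmetic on per-season taxon lists: combined (gamma) richness is the union across surveys, and each season's contribution is its share of that union. A toy illustration with invented taxa:

```python
# Hypothetical per-season macroinvertebrate taxon lists for one pond;
# illustrates why combining seasons maximises recorded richness.
seasons = {
    "spring": {"Baetis", "Limnephilus", "Gammarus", "Asellus"},
    "summer": {"Gammarus", "Asellus", "Cloeon", "Notonecta"},
    "autumn": {"Gammarus", "Asellus", "Cloeon", "Notonecta", "Dytiscus", "Sigara"},
}

per_season_richness = {s: len(taxa) for s, taxa in seasons.items()}
combined_richness = len(set().union(*seasons.values()))  # gamma diversity
autumn_share = per_season_richness["autumn"] / combined_richness
print(per_season_richness, combined_richness)
print(round(100 * autumn_share, 1))  # % of all taxa captured by autumn alone
```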

  18. Accuracy of the automated cell counters for management of spontaneous bacterial peritonitis

    Institute of Scientific and Technical Information of China (English)

    Oliviero Riggio; Stefania Angeloni; Antonella Parente; Cinzia Leboffe; Giorgio Pinto; Teresa Aronne; Manuela Merli

    2008-01-01

AIM: To evaluate the accuracy of automated blood cell counters for ascitic polymorphonuclear (PMN) determination for: (1) diagnosis, (2) efficacy of the ongoing antibiotic therapy, and (3) resolution of spontaneous bacterial peritonitis (SBP). METHODS: One hundred and twelve ascitic fluid samples were collected from 52 consecutive cirrhotic patients, 16 of them with SBP. The agreement between the manual and the automated method for PMN count was assessed. The sensitivity/specificity and the positive/negative predictive value of the automated blood cell counter were also calculated by considering the manual method as the "gold standard". RESULTS: The mean ± SD of the difference between manual and automated measurements was 7.8 ± 58 cells/mm3, while the limits of agreement were +124 cells/mm3 [95% confidence interval (CI): +103 to +145] and -108 cells/mm3 (95% CI: -129 to -87). The automated cell counter had a sensitivity of 100% and a specificity of 97.7% in diagnosing SBP, and a sensitivity of 91% and a specificity of 100% for the efficacy of the ongoing antibiotic therapy. The two methods showed complete agreement for the resolution of infection. CONCLUSION: Automated cell counters not only have good diagnostic accuracy, but are also very effective in monitoring the antibiotic treatment of patients with SBP. Because of their quicker performance, they should replace manual counting for PMN determination in the ascitic fluid of patients with SBP.
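The agreement statistics quoted in this record (mean ± SD of the paired differences, plus "limits of agreement") appear to follow the standard Bland-Altman approach, though the abstract does not name it. A minimal sketch of that computation, using hypothetical paired PMN counts since the study's raw data are not given:

```python
import numpy as np

def bland_altman(manual, automated):
    """Bland-Altman agreement between two measurement methods.

    Returns the bias (mean of the paired differences) and the 95%
    limits of agreement, bias +/- 1.96 * SD of the differences.
    """
    diff = np.asarray(manual, dtype=float) - np.asarray(automated, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired PMN counts (cells/mm3), for illustration only
manual_counts = [250, 300, 120, 80, 500]
automated_counts = [240, 310, 115, 90, 480]
bias, lower, upper = bland_altman(manual_counts, automated_counts)
```

If most differences fall inside the limits of agreement and the limits are clinically acceptable, the two methods can be used interchangeably, which is the conclusion the study draws.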

  19. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  20. Assessment of Status of rpoB Gene in FNAC Samples of Tuberculous Lymphadenitis by Real-Time PCR

    Directory of Open Access Journals (Sweden)

    Amita Raoot

    2012-01-01

Full Text Available Introduction. Multidrug-resistant tuberculosis (MDR-TB), the combined resistance of Mycobacterium tuberculosis to isoniazid (INH) and rifampin (RFM), is a major public health problem in India, which ranks second among the MDR-TB high-burden countries worldwide. WHO recommends RFM resistance as a "surrogate marker" for detecting MDR. FNAC is the most widely used noninvasive investigative technique for TB lymphadenitis. Real-time polymerase chain reaction, an extremely versatile technique, can be used for the timely detection and treatment of MDR-TB by assessing RFM resistance status in FNAC samples of TB lymphadenitis. Aim. To assess the status of the rpoB gene by real-time PCR in FNAC samples of TB lymphadenitis. Materials and Methods. Thirty FNAC samples from patients with persistent LAP, or the appearance of new LAP after 5 months or more of antitubercular treatment, were assessed for the status of the rpoB gene by real-time PCR using a probe covering the "hot spot" resistance region of the rpoB gene. Result. Using a probe covering codons 531 and 526 of the rpoB gene, we could detect 17 of 30 (56.7%) rifampin-resistant isolates. The PCR could detect Mtb DNA in 100% of cases. Conclusion. The use of molecular methods such as real-time PCR for the detection of MDR-TB in FNAC samples is a time-saving, logical and economical approach compared with culture-based methods.

  1. Quality Assessment of Artemether-Lumefantrine Samples and Artemether Injections Sold in the Cape Coast Metropolis

    Directory of Open Access Journals (Sweden)

    James Prah

    2016-01-01

Full Text Available Most prescribers and patients in Ghana now opt for the relatively expensive artemether/lumefantrine rather than artesunate-amodiaquine due to undesirable side effects in the treatment of uncomplicated malaria. The study sought to determine the existence of substandard and/or counterfeit artemether-lumefantrine tablets and suspension as well as artemether injection on the market in Cape Coast. Six brands of artemether-lumefantrine tablets, two brands of artemether-lumefantrine suspensions, and two brands of artemether injections were purchased from pharmacies in Cape Coast for the study. The mechanical properties of the tablets were evaluated. The samples were then analyzed for the content of active ingredients using High Performance Liquid Chromatography with a variable wavelength detector. None of the samples was found to be counterfeit. However, the artemether content of the samples was variable (93.22%–104.70% of the content stated by the manufacturer). The lumefantrine content of the artemether/lumefantrine samples was also variable (98.70%–111.87%). Seven of the artemether-lumefantrine brands passed whilst one failed the International Pharmacopoeia content requirements. All brands of artemether injections sampled met the International Pharmacopoeia content requirement. The presence of a substandard artemether-lumefantrine suspension in the market should alert regulatory bodies to be more vigilant and totally flush out counterfeit and substandard drugs from the Ghanaian market.

  2. Quality Assessment of Artemether-Lumefantrine Samples and Artemether Injections Sold in the Cape Coast Metropolis.

    Science.gov (United States)

    Prah, James; Ameyaw, Elvis Ofori; Afoakwah, Richmond; Fiawoyife, Patrick; Oppong-Danquah, Ernest; Boampong, Johnson Nyarko

    2016-01-01

    Most prescribers and patients in Ghana now opt for the relatively expensive artemether/lumefantrine rather than artesunate-amodiaquine due to undesirable side effects in the treatment of uncomplicated malaria. The study sought to determine the existence of substandard and/or counterfeit artemether-lumefantrine tablets and suspension as well as artemether injection on the market in Cape Coast. Six brands of artemether-lumefantrine tablets, two brands of artemether-lumefantrine suspensions, and two brands of artemether injections were purchased from pharmacies in Cape Coast for the study. The mechanical properties of the tablets were evaluated. The samples were then analyzed for the content of active ingredients using High Performance Liquid Chromatography with a variable wavelength detector. None of the samples was found to be counterfeit. However, the artemether content of the samples was variable (93.22%-104.70% of stated content by manufacturer). The lumefantrine content of the artemether/lumefantrine samples was also variable (98.70%-111.87%). Seven of the artemether-lumefantrine brands passed whilst one failed the International Pharmacopoeia content requirements. All brands of artemether injections sampled met the International Pharmacopoeia content requirement. The presence of a substandard artemether-lumefantrine suspension in the market should alert regulatory bodies to be more vigilant and totally flush out counterfeit and substandard drugs from the Ghanaian market. PMID:27006665

  3. Measurement of naturally occurring radionuclides in geothermal samples and assessment of radiological risks and radiation doses.

    Science.gov (United States)

    Parmaksiz, A

    2013-12-01

The analysis of (226)Ra, (232)Th and (40)K radionuclides was carried out in geothermal water and residue samples collected from six wells of a geothermal power plant and a disposal site, using a gamma-spectrometry system equipped with a high-purity germanium detector. The activity concentrations of nine geothermal water samples were found to be lower than the minimum detectable activity (MDA) values. The activity concentrations of the residue samples ranged from 40 ± 4 to 2694 ± 85 Bq kg(-1) for (226)Ra, from 33 ± 4 to 2388 ± 85 Bq kg(-1) for (232)Th, and from the MDA value to 967 ± 30 Bq kg(-1) for (40)K. In the study, some radiological indexes were examined and found to be higher than the reference values for the majority of the residue samples. The annual effective doses arising from some residue samples were calculated to be higher than the permitted dose rate for the public, i.e. 1 mSv y(-1).
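The abstract does not specify which radiological indexes were examined. One widely used index for such residues (an assumption here, not necessarily the study's choice) is the radium equivalent activity, Raeq = C_Ra + 1.43·C_Th + 0.077·C_K, with a conventional reference limit of 370 Bq/kg. A sketch applying it to the highest residue concentrations reported:

```python
def radium_equivalent(c_ra, c_th, c_k):
    """Radium equivalent activity (Bq/kg) from activity concentrations
    of (226)Ra, (232)Th and (40)K; the conventional limit is 370 Bq/kg.
    The 1.43 and 0.077 weights are the standard Raeq coefficients."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

# Highest residue concentrations reported in the abstract (Bq/kg)
raeq = radium_equivalent(2694, 2388, 967)
exceeds = raeq > 370  # well above the 370 Bq/kg reference value
```

For these maxima Raeq is roughly 6180 Bq/kg, consistent with the abstract's finding that several indexes exceed their reference values.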

  4. Assessment of methods to recover DNA from bacteria, fungi and archaea in complex environmental samples.

    Science.gov (United States)

    Guillén-Navarro, Karina; Herrera-López, David; López-Chávez, Mariana Y; Cancino-Gómez, Máximo; Reyes-Reyes, Ana L

    2015-11-01

    DNA extraction from environmental samples is a critical step for metagenomic analysis to study microbial communities, including those considered uncultivable. Nevertheless, obtaining good quality DNA in sufficient quantities for downstream methodologies is not always possible, and it depends on the complexity and stability of each ecosystem, which could be more problematic for samples from tropical regions because those ecosystems are less stable and more complex. Three laboratory methods for the extraction of nucleic acids from samples representing unstable (decaying coffee pulp and mangrove sediments) and relatively stable (compost and soil) environments were tested. The results were compared with those obtained using two commercial DNA extraction kits. The quality of the extracted DNA was evaluated by PCR amplification to verify the recovery of bacterial, archaeal, and fungal genetic material. The laboratory method that gave the best results used a lysis procedure combining physical, chemical, and enzymatic steps.

  5. Assessment of methods to recover DNA from bacteria, fungi and archaea in complex environmental samples.

    Science.gov (United States)

    Guillén-Navarro, Karina; Herrera-López, David; López-Chávez, Mariana Y; Cancino-Gómez, Máximo; Reyes-Reyes, Ana L

    2015-11-01

    DNA extraction from environmental samples is a critical step for metagenomic analysis to study microbial communities, including those considered uncultivable. Nevertheless, obtaining good quality DNA in sufficient quantities for downstream methodologies is not always possible, and it depends on the complexity and stability of each ecosystem, which could be more problematic for samples from tropical regions because those ecosystems are less stable and more complex. Three laboratory methods for the extraction of nucleic acids from samples representing unstable (decaying coffee pulp and mangrove sediments) and relatively stable (compost and soil) environments were tested. The results were compared with those obtained using two commercial DNA extraction kits. The quality of the extracted DNA was evaluated by PCR amplification to verify the recovery of bacterial, archaeal, and fungal genetic material. The laboratory method that gave the best results used a lysis procedure combining physical, chemical, and enzymatic steps. PMID:26014885

  6. Assessing efficiency of spatial sampling using combined coverage analysis in geographical and feature spaces

    Science.gov (United States)

    Hengl, Tomislav

    2015-04-01

Efficiency of spatial sampling largely determines the success of model building. This is especially important for geostatistical mapping, where an initial sampling plan should provide good representation, or coverage, of both geographical space (defined by the study area mask map) and feature space (defined by the multi-dimensional covariates). Otherwise the model will need to extrapolate and, hence, the overall uncertainty of the predictions will be high. In many cases, geostatisticians use point data sets that were produced using unknown or inconsistent sampling algorithms. Many point data sets in environmental sciences suffer from spatial clustering and systematic omission of feature space. But how can these 'representation' problems be quantified, and how can this knowledge be incorporated into model building? The author has developed a generic function called 'spsample.prob' (Global Soil Information Facilities package for R) which simultaneously determines (effective) inclusion probabilities as an average between kernel density estimation (geographical spreading of points; analysed using the spatstat package in R) and MaxEnt analysis (feature-space spreading of points; analysed using the MaxEnt software used primarily for species distribution modelling). The output 'iprob' map indicates whether the sampling plan has systematically missed some important locations and/or features, and can also be used as an input for geostatistical modelling, e.g. as a weight map for geostatistical model fitting. The spsample.prob function can also be used in combination with accessibility analysis (costs of field survey are usually a function of distance from the road network, slope and land cover) to allow for simultaneous maximization of average inclusion probabilities and minimization of total survey costs. The author postulates that, by estimating effective inclusion probabilities using combined geographical and feature-space analysis, and by comparing survey costs to representation…
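spsample.prob itself is an R function; the core idea, averaging a geographic-spreading score with a feature-space-spreading score per sample point, can be sketched in Python. This is a simplified analogue, not the GSIF implementation: plain kernel density estimation stands in for the MaxEnt step, and all data are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(2, 30))   # sample locations (x, y)
covars = rng.normal(size=(2, 30))        # two covariates at the samples

# Geographic spreading: kernel density of the sample locations
geo = gaussian_kde(xy)(xy)
# Feature-space spreading: kernel density in covariate space
# (a crude stand-in for the MaxEnt analysis used by spsample.prob)
feat = gaussian_kde(covars)(covars)

def scaled(v):
    """Min-max rescale to [0, 1] so the two spaces are comparable."""
    return (v - v.min()) / (v.max() - v.min())

# Effective inclusion probability per point: average of the two scores
iprob = (scaled(geo) + scaled(feat)) / 2.0
```

Low `iprob` values flag points in locations or covariate regions the plan has under-covered; in the real function the result is a map rather than per-point scores.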

  7. Campylobacter in broiler slaughter samples assessed by direct count on mCCDA and Campy-Cefex agar.

    Science.gov (United States)

    Gonsalves, Camila Cristina; Borsoi, Anderlise; Perdoncini, Gustavo; Rodrigues, Laura Beatriz; do Nascimento, Vladimir Pinheiro

    2016-01-01

Campylobacter spp. cause foodborne illnesses in humans primarily through the consumption of contaminated chicken. The aim of this study was to evaluate the United States Department of Agriculture's (USDA) recommended methodology, protocol MLG 41.02, for the isolation, identification and direct plate counting of Campylobacter jejuni and C. coli samples from the broiler slaughtering process. A plating method using both mCCDA and Campy-Cefex agars is recommended to recover Campylobacter cells. It is also possible to use this method with different matrices (cloacal swabs and water samples). Cloacal swabs, samples from pre-chiller and post-chiller carcasses and samples of pre-chiller, chiller and direct supply water were collected each week for four weeks from the same flock at an abattoir in southern Brazil. Samples were analyzed by direct counting of Campylobacter spp., and the results showed a high frequency of Campylobacter spp. on Campy-Cefex agar. Of the isolated species, 72% were identified as Campylobacter jejuni and 38% as Campylobacter coli. It was possible to count Campylobacter jejuni and Campylobacter coli from different samples, including the water supply samples, using the two-agar method. These results suggest that slaughterhouses can use direct counting methods with both agars and different matrices as a monitoring tool to assess the presence of Campylobacter bacteria in their products. PMID:27237112

  8. The Cognition Battery of the NIH Toolbox for Assessment of Neurological and Behavioral Function: Validation in an Adult Sample

    OpenAIRE

    Weintraub, Sandra; Dikmen, Sureyya S.; Heaton, Robert K.; Tulsky, David S.; Zelazo, Philip David; Slotkin, Jerry; Carlozzi, Noelle E.; Bauer, Patricia J.; Wallner-Allen, Kathleen; Fox, Nathan; Havlik, Richard; Beaumont, Jennifer L.; Mungas, Dan; Manly, Jennifer J.; Moy, Claudia

    2014-01-01

    This paper introduces a special series on validity studies of the Cognition Battery (CB) from the U.S. National Institutes of Health Toolbox for the Assessment of Neurological and Behavioral Function (NIHTB) (R. C. Gershon et al., 2013) in an adult sample. This first paper in the series describes the sample, each of the seven instruments in the NIHTB-CB briefly, and the general approach to data analysis. Data are provided on test-retest reliability and practice effects, and raw scores (mean, ...

  9. Shoe-String Automation

    Energy Technology Data Exchange (ETDEWEB)

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  10. Actuarial Risk Assessment and Recidivism in a Sample of UK Intellectually Disabled Sexual Offenders

    Science.gov (United States)

    Wilcox, Dan; Beech, Anthony; Markall, Helena F.; Blacker, Janine

    2009-01-01

    This study examines the effectiveness of three risk assessment instruments: Static-99, Risk Matrix 2000 (RM2000) and the Rapid Risk of Sex Offender Recidivism (RRASOR), in predicting sexual recidivism among 27 intellectually disabled sex offenders. The overall sexual offence reconviction rate was 30%, while non-recidivists remained offence-free…

  11. Sampling effects on the assessment of genetic diversity of rhizobia associated with soybean and common bean

    NARCIS (Netherlands)

    Alberton, O.; Kaschuk, G.; Hungria, M.

    2006-01-01

Biological nitrogen fixation plays a key role in agriculture sustainability, and assessment of rhizobial diversity contributes to worldwide knowledge of the biodiversity of soil microorganisms, to the usefulness of rhizobial collections and to the establishment of long-term strategies aimed at increasing…

  12. Molecular method to assess the diversity of Burkholderia species in environmental samples

    NARCIS (Netherlands)

    Salles, J.; Souza, de F.A.; Elsas, van J.D.

    2002-01-01

In spite of the importance of many members of the genus Burkholderia in the soil microbial community, no direct method to assess the diversity of this genus has been developed so far. The aim of this work was the development of soil DNA-based PCR-denaturing gradient gel electrophoresis (DGGE), a powerful…

  13. Molecular method to assess the diversity of Burkholderia species in environmental samples

    NARCIS (Netherlands)

    Salles, J.F.; De Souza, F.A.; Van Elsas, J.D.

    2002-01-01

In spite of the importance of many members of the genus Burkholderia in the soil microbial community, no direct method to assess the diversity of this genus has been developed so far. The aim of this work was the development of soil DNA-based PCR-denaturing gradient gel electrophoresis (DGGE), a powerful…

  14. Assessing the diagnostic validity of a structured psychiatric interview in a first-admission hospital sample

    DEFF Research Database (Denmark)

    Frederiksen, Julie Elisabeth Nordgaard; Revsbech, Rasmus; Sæbye, Ditte;

    2012-01-01

    , first-admitted inpatients, the results of an assessment with the Structured Clinical Interview for DSM-IV (SCID), yielding a DSM-IV diagnosis and performed by a trained non-clinician, were compared with a consensus lifetime best diagnostic estimate (DSM-IV) by two experienced research clinicians, based...

  15. The Management Aspect of the e-Portfolio as an Assessment Tool: Sample of Anadolu University

    Science.gov (United States)

    Ozgur, Aydin Ziya; Kaya, Secil

    2011-01-01

This article intends to introduce an e-portfolio system to help mentors assess teacher candidates' performances and products in a large-scale open and distance learning teacher training program. The Pre-School Teacher Training Program (PSTTP) of Anadolu University is a completely distance-based program that helps around 12,000 students get the…

  16. Sample Tasks from the PISA 2000 Assessment: Reading, Mathematical and Scientific Literacy.

    Science.gov (United States)

    Tamassia, Claudia; Schleicher, Andreas

    In response to the need for internationally comparable student achievement data, the Organisation for Economic Cooperation and Development (OECD) launched the Programme for International Student Assessment (PISA). PISA represents a commitment by the members of OECD to monitor the outcomes of education systems in terms of student achievement. This…

  17. Assessment of bioaerosols in urban and rural primary schools using passive and active sampling methodologies

    Directory of Open Access Journals (Sweden)

    Canha Nuno

    2015-12-01

Full Text Available People spend most of their time in indoor environments and, consequently, these environments contribute more to daily pollutant exposure than outdoor ones. In the case of children, a great part of their time is spent at school. Therefore, evaluations of this microenvironment are important to assess their time-weighted exposure to air pollutants.

  18. UAF RADIORESPIROMETRIC PROTOCOL FOR ASSESSING HYDROCARBON MINERALIZATION POTENTIAL IN ENVIRONMENTAL SAMPLES

    Science.gov (United States)

    Following the EXXON Valdez Oil Spill, a radiorespirometric protocol was developed at the University of Alaska Fairbanks (UAF) to assess the potential for microorganisms in coastal waters and sediments to degrade hydrocarbons. he use of bioremediation to assist in oil spill cleanu...

  19. Assessment of hygienic quality of some types of cheese sampled from retail outlets

    Directory of Open Access Journals (Sweden)

    Elisabetta Di Giannatale

    2010-06-01

Full Text Available The authors evaluated the prevalence of Listeria monocytogenes, Escherichia coli O157:H7, Salmonella spp. and Staphylococcus enterotoxin in 2,132 samples selected from six types of cheese on the basis of recorded consumption in Italy in 2004. In L. monocytogenes-positive samples the precise level of contamination was determined. To define the physical-chemical characteristics of the selected natural cheeses, the pH values, water activity and sodium chloride content were determined. The results suggest that blue and soft cheeses (Brie, Camembert, Gorgonzola and Taleggio) are more likely to be contaminated with L. monocytogenes. The mean prevalence of L. monocytogenes in the six types of cheese was 2.4% (from 0.2% in Asiago and Crescenza to 6.5% in Taleggio), with contamination levels of up to 460 MPN/g. Neither Salmonella spp. nor E. coli O157 was found in any sample. Staphylococcus enterotoxin was found in 0.6% of the samples examined. Physical and chemical parameter values confirmed that all types of cheese are considered capable of supporting the growth of L. monocytogenes. The study confirmed the need to apply effective controls at the production and sales levels to reduce the probability of contamination by L. monocytogenes. This micro-organism can attain high levels of contamination in food products such as cheeses that have a long shelf-life, when associated with difficulties in maintaining appropriate storage temperatures both at sales points and in the home.

  20. Self-Esteem: Assessing Measurement Equivalence in a Multiethnic Sample of Youth

    Science.gov (United States)

    Michaels, Marcia L.; Barr, Alicia; Roosa, Mark W.; Knight, George P.

    2007-01-01

    Global self-worth and five domains of self-esteem (scholastic competence, athletic competence, physical appearance, behavioral conduct, social acceptance) were tested for measurement equivalence in a sample of Anglo American, Mexican American, African American, and Native American youth aged 9 through 14 years. The results revealed that global…

  1. Reconstructing temporal trends in heavy metal deposition: Assessing the value of herbarium moss samples

    Energy Technology Data Exchange (ETDEWEB)

    Shotbolt, L. [Geography Department, Queen Mary, University of London, London, E1 4NS (United Kingdom)]. E-mail: l.shotbolt@qmul.ac.uk; Bueker, P. [Stockholm Environment Institute, University of York, Heslington, YO10 5DD (United Kingdom)]. E-mail: pb25@york.ac.uk; Ashmore, M.R. [Environment Department, University of York, Heslington, YO10 5DD (United Kingdom)]. E-mail: ma512@york.ac.uk

    2007-05-15

The use of the herbarium moss archive for investigating past atmospheric deposition of Ni, Cu, Zn, As, Cd and Pb was evaluated. Moss samples from five UK regions collected over 150 years were analysed for 26 elements using ICP-MS. Principal components analysis identified soil as a significant source of Ni and As, and atmospheric deposition as the main source of Pb and Cu. Sources of Zn and Cd were identified to be at least partly atmospheric, but require further investigation. Temporal and spatial trends in metal concentrations in herbarium mosses showed that the highest Pb and Cu levels were found in Northern England in the late 19th century. Metal concentrations in herbarium moss samples were consistently higher than those in mosses collected from the field in 2000. Herbarium moss samples are concluded to be a useful resource for reconstructing trends in Pb and Cu deposition, but not, without further analysis, for Cd, Zn, As and Ni. - Herbarium moss samples can contribute to the reconstruction of past heavy metal deposition.

  2. Assessment of a Rotating Time Sampling Procedure: Implications for Interobserver Agreement and Response Measurement

    Science.gov (United States)

    Becraft, Jessica L.; Borrero, John C.; Davis, Barbara J.; Mendres-Smith, Amber E.

    2016-01-01

    The current study was designed to evaluate a rotating momentary time sampling (MTS) data collection system. A rotating MTS system has been used to measure activity preferences of preschoolers but not to collect data on responses that vary in duration and frequency (e.g., talking). We collected data on talking for 10 preschoolers using a 5-s MTS…

  3. Assessment of Psychopathic Traits in an Incarcerated Adolescent Sample: A Methodological Comparison

    Science.gov (United States)

    Fink, Brandi C.; Tant, Adam S.; Tremba, Katherine; Kiehl, Kent A.

    2012-01-01

    Analyses of convergent validity and group assignment using self-report, caregiver-report and interview-based measures of adolescent psychopathy were conducted in a sample of 160 incarcerated adolescents. Results reveal significant convergent validity between caregiver-report measures of adolescent psychopathy, significant convergent validity…

  4. Use of a Radon Stripping Algorithm for Retrospective Assessment of Air Filter Samples

    Energy Technology Data Exchange (ETDEWEB)

    Robert Hayes

    2009-01-23

    An evaluation of a large number of air sample filters was undertaken using a commercial alpha and beta spectroscopy system employing a passive implanted planar silicon (PIPS) detector. Samples were only measured after air flow through the filters had ceased. Use of a commercial radon stripping algorithm was implemented to discriminate anthropogenic alpha and beta activity on the filters from the radon progeny. When uncontaminated air filters were evaluated, the results showed that there was a time-dependent bias in both average estimates and measurement dispersion with the relative bias being small compared to the dispersion. By also measuring environmental air sample filters simultaneously with electroplated alpha and beta sources, use of the radon stripping algorithm demonstrated a number of substantial unexpected deviations. Use of the current algorithm is therefore not recommended for assay applications and so use of the PIPS detector should only be utilized for gross counting without appropriate modifications to the curve fitting algorithm. As a screening method, the radon stripping algorithm might be expected to see elevated alpha and beta activities on air sample filters (not due to radon progeny) around the 200 dpm level.

  5. Problems Found Using a Radon Stripping Algorithm for Retrospective Assessment of Air Filter Samples

    Energy Technology Data Exchange (ETDEWEB)

    Robert Hayes

    2008-04-01

    An evaluation of a large number of air sample filters was undertaken using a commercial alpha and beta spectroscopy system employing a passive implanted planar silicon (PIPS) detector. Samples were only measured after air flow through the filters had ceased. Use of a commercial radon stripping algorithm was implemented to discriminate anthropogenic alpha activity on the filters from the radon progeny. When uncontaminated air filters were evaluated, the results showed that there was a time-dependent bias in both average estimates and measurement dispersion of anthropogenic activity estimates with the relative bias being small compared to the dispersion, indicating that the system would not give false positive indications for an appropriately set decision level. By also measuring environmental air sample filters simultaneously with electroplated alpha filters, use of the radon stripping algorithm demonstrated a number of substantial unexpected deviations from calibrated values indicating that the system would give false negative indications. Use of the current algorithm is, therefore, not recommended for general assay applications. Use of the PIPS detector should only be utilized for gross counting without appropriate modifications to the curve-fitting algorithm. As a screening method, the radon stripping algorithm might be expected to see elevated alpha activities on air sample filters (not due to radon progeny) around the 200 disintegrations per minute level.
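The "appropriately set decision level" mentioned above is not defined in the abstract. A common choice in counting applications (an assumption here, not necessarily the report's method) is Currie's decision level for a paired blank, which bounds the false-positive rate at roughly 5%:

```python
import math

def currie_decision_level(blank_counts):
    """Currie decision level Lc (in counts) for a paired blank:
    Lc = 2.33 * sqrt(B).  A net count above Lc is declared
    'detected' at about 5% false-positive risk."""
    return 2.33 * math.sqrt(blank_counts)

lc = currie_decision_level(100)  # 100 background counts -> Lc = 23.3
```

The report's observation that bias was small relative to dispersion is what makes such a threshold workable for screening: a decision level set from the blank dispersion will rarely trip on uncontaminated filters.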

  6. Efficiency of Sampling and Analysis of Asbestos Fibers on Filter Media: Implications for Exposure Assessment

    Science.gov (United States)

    To measure airborne asbestos and other fibers, an air sample must represent the actual number and size of fibers. Typically, mixed cellulose ester (MCE, 0.45 or 0.8 µm pore size) and to a much lesser extent, capillary-pore polycarbonate (PC, 0.4 µm pore size) membrane filters are...

  7. Feasibility of Momentary Sampling Assessment of Cannabis Use in Adolescents and Young Adults

    Science.gov (United States)

    Black, Shimrit K.; de Moor, Carl; Kendall, Ashley D.; Shrier, Lydia A.

    2014-01-01

    This study examines the feasibility of recruiting and retaining adolescents and young adults with frequent cannabis use for a 2-week momentary sampling study of cannabis use. Participants responded to random signals on a handheld computer with reports of their use. Participants also initiated reports pre- and post-cannabis use. Participants had…

  8. Assessment of soil sample quality used for density evaluations through computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Pires, Luiz F.; Arthur, Robson C.J.; Bacchi, Osny O.S. [Centro de Energia Nuclear na Agricultura (CENA), Piracicaba, SP (Brazil)]. E-mail: lfpires@cena.usp.br

    2005-07-01

There are several methods to measure soil bulk density (ρs), such as the paraffin-sealed clod (PS), the volumetric ring (VR), computed tomography (CT), and the neutron-gamma surface gauge (SG). In order to evaluate, in a non-destructive way, the possible modifications in soil structure caused by sampling for the PS and VR methods of ρs evaluation, we proposed to use the gamma-ray CT method. A first-generation tomograph was used, having a (241)Am source and a 3 in x 3 in NaI(Tl) scintillation crystal detector coupled to a photomultiplier tube. Results confirm the effect of soil sampler devices on the structure of soil samples, and that the compaction caused during sampling leads to significant alterations of soil bulk density. Through the use of CT it was possible to determine the level of compaction and to make a detailed analysis of the soil bulk density distribution within the soil sample. (author)

  9. Geostatistics for Mapping Leaf Area Index over a Cropland Landscape: Efficiency Sampling Assessment

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Haro

    2010-11-01

    Full Text Available This paper evaluates the performance of spatial methods to estimate leaf area index (LAI) fields from ground-based measurements at high spatial resolution over a cropland landscape. Three geostatistical variants of the kriging technique are used: ordinary kriging (OK), collocated cokriging (CKC), and kriging with an external drift (KED). The study focused on the influence of the spatial sampling protocol, auxiliary information, and spatial resolution on the estimates. The main advantage of these models lies in the possibility of considering the spatial dependence of the data and, in the case of KED and CKC, the auxiliary information at each location used for prediction. A high-resolution NDVI image computed from SPOT TOA reflectance data is used as an auxiliary variable in LAI predictions. The CKC and KED predictions have proven the relevance of the auxiliary information for reproducing the spatial pattern at local scales, with the KED model proving to be the best estimator when a non-stationary trend is observed. Advantages and limitations of the methods in LAI field predictions for two systematic and two stratified spatial samplings are discussed for high (20 m), medium (300 m), and coarse (1 km) spatial scales. The KED exhibited the best observed local accuracy for all the spatial samplings, while the OK model provides comparable results when a well-stratified sampling scheme is considered by land cover.
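    Of the three variants compared above, ordinary kriging is the simplest. As a rough illustration (not the authors' implementation), a minimal OK predictor might look as follows; the exponential covariance model and the sill and range values are assumptions for the sketch:

    ```python
    import numpy as np

    def ordinary_kriging(xy, z, xy0, sill=1.0, rng=50.0):
        """Predict z at location xy0 from observations (xy, z) by ordinary
        kriging with an exponential covariance C(h) = sill * exp(-h / rng)."""
        def cov(a, b):
            # Pairwise distances between two sets of 2-D points.
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
            return sill * np.exp(-d / rng)

        n = len(z)
        # Kriging system with a Lagrange multiplier enforcing sum(weights) = 1.
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = cov(xy, xy)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = cov(xy, xy0[None, :])[:, 0]
        w = np.linalg.solve(A, b)
        return float(w[:n] @ z)
    ```

    At an observed location the predictor returns the observed value exactly, since OK without a nugget effect is an exact interpolator.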

  10. Validation of three viable-cell counting methods: Manual, semi-automated, and automated

    Directory of Open Access Journals (Sweden)

    Daniela Cadena-Herrera

    2015-09-01

    Full Text Available A viable cell count is essential to evaluate the kinetics of cell growth. Since the hemocytometer was first used for counting blood cells, several variants of the methodology have been developed to reduce the time of analysis and improve accuracy through automation of both sample preparation and counting. The successful implementation of automated techniques relies on the adjustment of cell staining, image display parameters, and cell morphology to obtain precision, accuracy, and linearity equivalent to those of the hemocytometer. In this study we conducted the validation of three trypan blue exclusion-based methods: manual, semi-automated, and fully automated, which were used for the estimation of the density and viability of cells employed in the biosynthesis and bioassays of recombinant proteins. Our results showed that the evaluated attributes of the automated methods remained within the same range as those of the manual method, providing an efficient alternative for analyzing large numbers of samples.
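    For reference, the manual baseline that the automated methods are validated against reduces to simple arithmetic on hemocytometer counts. A sketch under the usual assumptions (each large corner square overlies 0.1 µL, hence the factor of 10⁴; trypan blue stains dead cells):

    ```python
    def cell_density_and_viability(live_counts, dead_counts, dilution_factor=2):
        """Estimate viable-cell density (cells/mL) and percent viability from
        per-square hemocytometer counts of unstained (live) and trypan
        blue-stained (dead) cells."""
        squares = len(live_counts)
        live = sum(live_counts) / squares   # mean live cells per square
        dead = sum(dead_counts) / squares
        density = live * dilution_factor * 1e4   # viable cells per mL
        total = live + dead
        viability = 100.0 * live / total if total else 0.0
        return density, viability
    ```

    For example, a mean of 50 live cells per square at a 1:2 dilution corresponds to 1 x 10⁶ viable cells/mL.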

  11. Automated Dissolution for Enteric-Coated Aspirin Tablets: A Case Study for Method Transfer to a RoboDis II.

    Science.gov (United States)

    Ibrahim, Sarah A; Martini, Luigi

    2014-08-01

    Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control methods. There is an increasing trend toward automation in dissolution testing, particularly at large pharmaceutical companies, to reduce variability and increase personnel efficiency. There is no official guideline for transferring a dissolution testing method from a manual or semi-automated to an automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of the dissolution method transfer from a manual dissolution tester. This study provides a systematic outline for the transfer of a manual dissolution testing protocol to an automated dissolution tester, and further supports the view that automated dissolution testers that are compliant with regulatory requirements and comparable to manual testers facilitate method transfer.

  12. Green Plants in the Red: A Baseline Global Assessment for the IUCN Sampled Red List Index for Plants.

    Directory of Open Access Journals (Sweden)

    Neil A Brummitt

    Full Text Available Plants provide fundamental support systems for life on Earth and are the basis for all terrestrial ecosystems; a decline in plant diversity will be detrimental to all other groups of organisms including humans. Decline in plant diversity has been hard to quantify, due to the huge numbers of known and yet to be discovered species and the lack of an adequate baseline assessment of extinction risk against which to track changes. The biodiversity of many remote parts of the world remains poorly known, and the rate of new assessments of extinction risk for individual plant species approximates the rate at which new plant species are described. Thus the question 'How threatened are plants?' is still very difficult to answer accurately. While completing assessments for each species of plant remains a distant prospect, by assessing a randomly selected sample of species the Sampled Red List Index for Plants gives, for the first time, an accurate view of how threatened plants are across the world. It represents the first key phase of ongoing efforts to monitor the status of the world's plants. More than 20% of plant species assessed are threatened with extinction, and the habitat with the most threatened species is overwhelmingly tropical rain forest, where the greatest threat to plants is anthropogenic habitat conversion, for arable and livestock agriculture, and harvesting of natural resources. Gymnosperms (e.g., conifers and cycads) are the most threatened group, while a third of plant species included in this study have yet to receive an assessment or are so poorly known that we cannot yet ascertain whether they are threatened or not. This study provides a baseline assessment from which trends in the status of plant biodiversity can be measured and periodically reassessed.

  13. Assessment of the Revised 3410 Building Filtered Exhaust Stack Sampling Probe Location

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Recknagle, Kurtis P.; Glissmeyer, John A.

    2013-12-01

    In order to support the air emissions permit for the 3410 Building, Pacific Northwest National Laboratory performed a series of tests in the exhaust air discharge from the reconfigured 3410 Building Filtered Exhaust Stack. The objective was to determine whether the location of the air sampling probe for emissions monitoring meets the applicable regulatory criteria governing such effluent monitoring systems. In particular, the capability of the air sampling probe location to meet the acceptance criteria of ANSI/HPS N13.1-2011, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stack and Ducts of Nuclear Facilities, was determined. The qualification criteria for these types of stacks address 1) uniformity of air velocity, 2) a sufficiently small flow angle with respect to the axis of the duct, 3) uniformity of tracer gas concentration, and 4) uniformity of tracer particle concentration. Testing was performed to conform to the quality requirements of NQA-1-2000. Fan configurations tested included all combinations of any two fans at a time. Most of the tests were conducted at the normal flow rate, while a small subset of tests was performed at a slightly higher flow rate achieved with the laboratory hood sashes fully open. The qualification criteria for an air monitoring probe location are taken from ANSI/HPS N13.1-2011 and are paraphrased as follows, with key results summarized: 1. Angular flow: the average air velocity angle must not deviate from the axis of the stack or duct by more than 20°. Our test results show that the mean flow angles at the center two-thirds of the ducts are smaller than 4.5° for all testing conditions. 2. Uniform air velocity: the acceptance criterion is that the COV of the air velocity must be ≤ 20% across the center two-thirds of the area of the stack. Our results show that the COVs of the air velocity across the center two-thirds of the stack are smaller than 2.9% for all testing conditions. 3
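    The two numeric criteria quoted above are straightforward to check from traverse measurements. A hypothetical helper (function name and data are illustrative, not from the report):

    ```python
    import statistics

    def stack_qualification(velocities, flow_angles_deg):
        """Check two ANSI/HPS N13.1-2011 acceptance criteria from traverse
        measurements over the center two-thirds of the duct:
        velocity COV <= 20% and mean flow angle <= 20 degrees."""
        cov = 100.0 * statistics.stdev(velocities) / statistics.mean(velocities)
        mean_angle = statistics.mean(abs(a) for a in flow_angles_deg)
        return {"velocity_cov_pct": cov,
                "velocity_ok": cov <= 20.0,
                "mean_angle_deg": mean_angle,
                "angle_ok": mean_angle <= 20.0}
    ```

    With velocities clustered around 10 m/s and angles of a few degrees, as reported above, both criteria pass with a wide margin.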

  14. Assessment of selected contaminants in streambed- and suspended-sediment samples collected in Bexar County, Texas, 2007-09

    Science.gov (United States)

    Wilson, Jennifer T.

    2011-01-01

    Elevated concentrations of sediment-associated contaminants are typically associated with urban areas such as San Antonio, Texas, in Bexar County, the seventh most populous city in the United States. This report describes an assessment of selected sediment-associated contaminants in samples collected in Bexar County from sites on the following streams: Medio Creek, Medina River, Elm Creek, Martinez Creek, Chupaderas Creek, Leon Creek, Salado Creek, and San Antonio River. During 2007-09, the U.S. Geological Survey periodically collected surficial streambed-sediment samples during base flow and suspended-sediment (large-volume suspended-sediment) samples from selected streams during stormwater runoff. All sediment samples were analyzed for major and trace elements and for organic compounds including halogenated organic compounds and polycyclic aromatic hydrocarbons (PAHs). Selected contaminants in streambed and suspended sediments in watersheds of the eight major streams in Bexar County were assessed by using a variety of methods: observations of occurrence and distribution, comparison to sediment-quality guidelines and data from previous studies, statistical analyses, and source indicators. Trace element concentrations were low compared to the consensus-based sediment-quality guidelines threshold effect concentration (TEC) and probable effect concentration (PEC). Trace element concentrations were greater than the TEC in 28 percent of the samples and greater than the PEC in 1.5 percent of the samples. Chromium concentrations exceeded sediment-quality guidelines more frequently than concentrations of any other constituent analyzed in this study (greater than the TEC in 69 percent of samples and greater than the PEC in 8 percent of samples). Mean trace element concentrations generally are lower in Bexar County samples compared to concentrations in samples collected during previous studies in the Austin and Fort Worth, Texas, areas, but considering the relatively

  15. A sampling scheme to assess persistence and transport characteristics of xenobiotics within an urban river section

    Science.gov (United States)

    Schwientek, Marc; Guillet, Gaelle; Kuch, Bertram; Rügner, Hermann; Grathwohl, Peter

    2014-05-01

    Xenobiotic contaminants such as pharmaceuticals or personal care products are typically introduced continuously into receiving water bodies via wastewater treatment plant (WWTP) outfalls and, episodically, via combined sewer overflows during precipitation events. Little is known about how these chemicals behave in the environment and how they affect ecosystems and human health. Examples of traditional persistent organic pollutants reveal that they may still be present in the environment even decades after they were released. In this study, a sampling strategy was developed which gives valuable insights into the environmental behaviour of xenobiotic chemicals. The method is based on the Lagrangian sampling scheme, by which a parcel of water is sampled repeatedly as it moves downstream, so that the chemical, physical, and hydrologic processes altering the characteristics of the water mass can be investigated. The Steinlach is a tributary of the River Neckar in southwest Germany with a catchment area of 140 km². It receives the effluents of a WWTP with 99,000 inhabitant equivalents 4 km upstream of its mouth. The varying flow rate of effluents induces temporal patterns of electrical conductivity in the river water which make it possible to track parcels of water along the subsequent urban river section. These parcels of water were sampled a) close to the outlet of the WWTP and b) 4 km downstream at the confluence with the Neckar. Sampling was repeated at a 15 min interval over a complete diurnal cycle, and 2 h composite samples were prepared. A model-based analysis demonstrated, on the one hand, that substances behaved reactively to a varying extent along the studied river section. On the other hand, it revealed that the observed degradation rates are likely dependent on the time of day. Some chemicals were degraded mainly during daytime (e.g. the disinfectant Triclosan or the phosphorous flame retardant TDCP), others as well during nighttime (e.g. the musk fragrance

  16. Psychometric Properties of the Procrastination Assessment Scale-Student (PASS) in a Student Sample of Sabzevar University of Medical Sciences

    OpenAIRE

    Mortazavi, Forough; Mortazavi, Saideh S.; Khosrorad, Razieh

    2015-01-01

    Background: Procrastination is a common behavior which affects different aspects of life. The procrastination assessment scale-student (PASS) evaluates academic procrastination apropos its frequency and reasons. Objectives: The aims of the present study were to translate, culturally adapt, and validate the Farsi version of the PASS in a sample of Iranian medical students. Patients and Methods: In this cross-sectional study, the PASS was translated into Farsi through the forward-backward metho...

  17. RISK-ASSESSMENT PROCEDURES AND ESTABLISHING THE SIZE OF SAMPLES FOR AUDITING FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Daniel Botez

    2014-12-01

    Full Text Available In auditing financial statements, the procedures for assessing risks and calculating materiality differ from one auditor to another, according to audit firm policy or professional-body guidance. All, however, refer to the International Standards on Auditing: ISA 315, "Identifying and assessing the risks of material misstatement through understanding the entity and its environment", and ISA 320, "Materiality in planning and performing an audit". On the basis of the specific practices of auditors in Romania, the article presents detailed examples of these aspects, including the evaluation of general inherent risk, specific inherent risk, control risk, and the calculation of materiality.

  18. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and the spatially-distributed responses are, however, still quite unexplored. Especially for complex models, rigorous parameterization, reduction of the parameter space and use of efficient and effective algorithms are essential to facilitate the calibration process and make it more robust. Moreover, for these models multi-site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed physically-based, integrated hydrological models. A revised version of the generalized likelihood…
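    The GLUE part of such a framework can be summarized in a few lines: sampled parameter sets are scored with an informal likelihood, and only the "behavioral" sets above a threshold are retained with normalized weights. A schematic sketch (the likelihood function and threshold are placeholders, not those of the study):

    ```python
    def glue_weights(param_sets, likelihood, threshold):
        """GLUE: score each sampled parameter set with an informal likelihood,
        keep the 'behavioral' sets above the threshold, and normalize their
        likelihoods into weights used for uncertainty bounds."""
        scored = [(p, likelihood(p)) for p in param_sets]
        behavioral = [(p, s) for p, s in scored if s > threshold]
        if not behavioral:
            return []
        total = sum(s for _, s in behavioral)
        return [(p, s / total) for p, s in behavioral]
    ```

    In the full method, the parameter sets would come from a sampler such as Markov chain Monte Carlo rather than a plain random draw, which is the refinement the title refers to.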

  19. Assessing the efficacy of hair snares as a method for noninvasive sampling of Neotropical felids

    OpenAIRE

    Tatiana P. Portella; Diego R. Bilski; Fernando C. Passos; Pie, Marcio R.

    2013-01-01

    Hair snares have been used in North and Central America for a long time in assessment and monitoring studies of several mammalian species. This method can provide a cheap, suitable, and efficient way to monitor mammals because it combines characteristics that are not present in most alternative techniques. However, despite their usefulness, hair snares are rarely used in other parts of the world. The aim of our study was to evaluate the effectiveness of hair snares and three scent lures (cinn...

  20. The impact of genetic heterogeneity on biomarker development in kidney cancer assessed by multiregional sampling

    OpenAIRE

    Sankin, Alexander; Hakimi, Abraham A; Mikkilineni, Nina; Ostrovnaya, Irina; Silk, Mikhail T; Liang, Yupu; Mano, Roy; Chevinsky, Michael; Motzer, Robert J.; Solomon, Stephen B.; Cheng, Emily H.; Durack, Jeremy C.; Coleman, Jonathan A.; Russo, Paul; Hsieh, James J

    2014-01-01

    Primary clear cell renal cell carcinoma (ccRCC) genetic heterogeneity may lead to an underestimation of the mutational burden detected from a single site evaluation. We sought to characterize the extent of clonal branching involving key tumor suppressor mutations in primary ccRCC and determine if genetic heterogeneity could limit the mutation profiling from a single region assessment. Ex vivo core needle biopsies were obtained from three to five different regions of resected renal tumors at a...

  1. Bioluminescent bioreporter for assessment of arsenic contamination in water samples of India

    Indian Academy of Sciences (India)

    Pratima Sharma; Shahzada Asad; Arif Ali

    2013-06-01

    In the present study the most efficient -factor controlling the ars operon was selected after screening 39 Escherichia coli isolates, by minimum inhibitory concentration (MIC) studies, from water samples of different geographical locations in India. Among all, the strain isolated from the Hooghly River (West Bengal) was found to have the maximum tolerance towards arsenic and was further used for the development of the bioreporter bacterium. Cloning of the ars regulatory element, along with the operator-promoter and luxCDABE from Photobacterium, into an expression vector was accomplished by following recombinant DNA protocols. The bioreporter sensor system developed in this study can measure an estimated range of 0.74–60 µg of As/L and is both specific and selective for sensing bioavailable As. The constructed bacterial biosensor was further used for the determination of arsenic ion concentration in different environmental samples of India.

  2. Assessing phylogenetic relationships of Lycium samples using RAPD and entropy theory

    Institute of Scientific and Technical Information of China (English)

    Xiao-lin YIN; Kai-tai FANG; Yi-zeng LIANG; Ricky NS WONG; Amber WY HA

    2005-01-01

    Aim: To evaluate the phylogenetic relationships among related species of Lycium samples. Methods: Random amplified polymorphic DNA (RAPD) fingerprinting and lab-on-a-chip electrophoresis techniques were used to analyze the characteristics of Lycium species. Seven species and 3 varieties of Lycium were studied. Based on RAPD fingerprint data obtained from 11 primers, we proposed a new index, called dispersivity, using entropy theory and projection methods to depict the diversity of the DNA fingerprints. Results: Using the proposed dispersivity, primers were sorted and the dendrograms of the 7 species and 3 varieties of Lycium were constructed synthetically by merging primer information. Conclusion: Phylogenetic relationships among Lycium samples were constructed synthetically based on RAPD fingerprint data generated from 11 primers.
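    The paper's dispersivity index is not reproduced here; as a hypothetical illustration of entropy-based scoring of RAPD fingerprints, one can compute the Shannon entropy of band-presence frequencies across samples (rows are samples, columns are bands):

    ```python
    import math

    def shannon_entropy(band_presence):
        """Total Shannon entropy (bits) of band-presence frequencies across
        samples; higher entropy suggests a more dispersed, informative
        fingerprint pattern for a primer."""
        h = 0.0
        for column in zip(*band_presence):   # one RAPD band across all samples
            p = sum(column) / len(column)    # frequency of band presence
            for q in (p, 1 - p):
                if q > 0:
                    h -= q * math.log2(q)
        return h
    ```

    A band present in every sample contributes no entropy; a band present in half the samples contributes the maximum of one bit.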

  3. A METHOD OF IMAGE QUALITY ASSESSMENT FOR COMPRESSIVE SAMPLING VIDEO TRANSMISSION

    Institute of Scientific and Technical Information of China (English)

    Chen Shouning; Zheng Baoyu; Li Jing

    2012-01-01

    Based on a compressive sampling transmission model, we demonstrate a method of quality evaluation for reconstructed images, which is promising for the transmission of unstructured signals with reduced dimension. By this method, auxiliary information about the recovered image quality is obtained as feedback to control the number of measurements taken from the compressive sampling video stream. Therefore, the number of measurements can be easily derived in the absence of information about sparsity, and the recovered image quality is effectively improved. Theoretical and experimental results show that this algorithm can estimate the quality of images effectively and is in good agreement with the traditional objective evaluation algorithm.
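    The "traditional objective evaluation algorithm" for image quality is typically PSNR. A minimal PSNR computation, against which such a quality estimator could be checked (flat pixel lists are an assumption of this sketch):

    ```python
    import math

    def psnr(reference, reconstructed, peak=255.0):
        """Peak signal-to-noise ratio (dB) between a reference frame and its
        reconstruction, both given as flat lists of pixel intensities."""
        mse = sum((a - b) ** 2 for a, b in zip(reference, reconstructed)) / len(reference)
        if mse == 0:
            return float("inf")   # identical images
        return 10.0 * math.log10(peak ** 2 / mse)
    ```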

  4. Assessment of natural radioactivity in soil samples of Dez river sides – Khouzestan province

    OpenAIRE

    Nasri Nasrabadi; Hajialiani; Mostajaboddavati

    2015-01-01

    Geographical features of each region of the world affect the activity concentration of natural radionuclides such as uranium, thorium and potassium. In this study, 26 soil samples were randomly collected from sides of the Dez River and transferred to the Laboratory for preparation. Activity concentration of natural radioactive materials was measured using p-type HPGe detector with a 38% relative efficiency. The results indicated that radioactivity concentration of 226Ra, 232Th and 40K in soi...

  5. Assessment of natural radioactivity levels in soil samples from some areas in Assiut, Egypt.

    Science.gov (United States)

    El-Gamal, Hany; Farid, M El-Azab; Abdel Mageed, A I; Hasabelnaby, M; Hassanien, Hassanien M

    2013-12-01

    The natural radioactivity of soil samples from Assiut city, Egypt, was studied. The activity concentrations of 28 samples were measured with a NaI(Tl) detector. The radioactivity concentrations of (226)Ra, (232)Th, and (40)K showed large variations, so the results were classified into two groups (A and B) to facilitate the interpretation of the results. Group A represents samples collected from different locations in Assiut and characterized by low activity concentrations with average values of 46.15 ± 9.69, 30.57 ± 4.90, and 553.14 ± 23.19 for (226)Ra, (232)Th, and (40)K, respectively. Group B represents samples mainly collected from the area around Assiut Thermal Power Plant and characterized by very high activity concentrations with average values of 3,803 ± 145, 1,782 ± 98, and 1,377 ± 78 for (226)Ra, (232)Th, and (40)K, respectively. In order to evaluate the radiological hazard of the natural radioactivity, the radium equivalent activity (Raeq), the absorbed dose rate (D), the annual effective dose rate (E), the external hazard index (Hex), and the annual gonadal dose equivalent (AGDE) have been calculated and compared with the internationally approved values. For group A, the calculated averages of these parameters are in good agreement with the internationally recommended values except for the absorbed dose rate and the AGDE values, which are slightly higher. However, for group B, all obtained averages of these parameters are much higher, by several orders of magnitude, than the internationally recommended values. The present work provides a background of radioactivity concentrations in the soil of Assiut.
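    The radium equivalent activity mentioned above is conventionally computed as a fixed-weight sum of the three activity concentrations (the weights and the commonly cited 370 Bq/kg criterion are the standard convention, assumed here rather than taken from the paper):

    ```python
    def radium_equivalent(c_ra, c_th, c_k):
        """Radium equivalent activity Raeq (Bq/kg) from the activity
        concentrations of 226Ra, 232Th and 40K; 370 Bq/kg is the commonly
        cited limit for building materials and soil."""
        return c_ra + 1.43 * c_th + 0.077 * c_k
    ```

    Applied to the group averages quoted above, group A falls well below 370 Bq/kg while group B exceeds it many times over, consistent with the paper's conclusions.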

  6. The impact of selection, gene conversion, and biased sampling on the assessment of microbial demography

    OpenAIRE

    Lapierre, Marguerite; Blin, Camille; Lambert, Amaury; Achaz, Guillaume; Eduardo P C Rocha

    2016-01-01

    Recent studies have linked demographic changes and epidemiological patterns in bacterial populations using coalescent-based approaches. We identified 26 studies using skyline plots and found that 21 inferred overall population expansion. This surprising result led us to analyze the impact of natural selection, recombination (gene conversion), and sampling biases on demographic inference using skyline plots and site frequency spectra (SFS). Forward simulations based o...

  7. Assessment of Learning Style in a Sample of Saudi Medical Students

    OpenAIRE

    BuAli, Waleed Hamad Al; Balaha, Magdy Hassan; Muhaidab, Nouria Saab Al

    2013-01-01

    Background By knowing students' different learning styles, teachers can plan their instruction carefully in ways that capitalize on student preferences. The current research was done to determine the specific learning styles of students. Method This cross-sectional study was conducted in Al Ahsa College of Medicine from 2011 to 2012. A sample of 518 students completed a questionnaire based on the Kolb inventory (LSI 2) to determine their learning style. A s...

  8. Quality assessment metrics for whole genome gene expression profiling of paraffin embedded samples

    OpenAIRE

    Mahoney, Douglas W.; Terry M. Therneau; Anderson, S. Keith; Jen, Jin; Kocher, Jean-Pierre A.; Reinholz, Monica M; Perez, Edith A.; Eckel-Passow, Jeanette E

    2013-01-01

    Background Formalin-fixed, paraffin-embedded tissues are most commonly used for routine pathology analysis and for long-term tissue preservation in the clinical setting. Many institutions have large archives of formalin-fixed, paraffin-embedded tissues that provide a unique opportunity for understanding genomic signatures of disease. However, genome-wide expression profiling of formalin-fixed, paraffin-embedded samples has been challenging due to RNA degradation. Because of the significant h...

  9. When is the best time to sample aquatic macroinvertebrates in ponds for biodiversity assessment?

    OpenAIRE

    Hill, M. J.; Sayer, C. D.; Wood, P J

    2016-01-01

    Ponds are sites of high biodiversity and conservation value, yet there is little or no statutory monitoring of them across most of Europe. There are clear and standardised protocols for sampling aquatic macroinvertebrate communities in ponds, but the most suitable time(s) to undertake the survey(s) remains poorly specified. This paper examined the aquatic macroinvertebrate communities from 95 ponds within different land use types over three seasons (spring, summer and autumn) to determine the...

  10. Assessing usual dietary intake in complex sample design surveys: the National Dietary Survey

    Directory of Open Access Journals (Sweden)

    Flávia dos Santos Barbosa

    2013-02-01

    Full Text Available The National Cancer Institute (NCI) method allows the distributions of usual intake of nutrients and foods to be estimated. This method can be used in complex surveys. However, the user must perform additional calculations, such as balanced repeated replication (BRR), in order to obtain standard errors and confidence intervals for the percentiles and the mean of the usual intake distribution. The objective is to highlight adaptations of the NCI method using data from the National Dietary Survey. The application of the NCI method was exemplified by analyzing total energy (kcal) and fruit (g) intake, comparing estimates of the mean and standard deviation based on the complex design of the Brazilian survey with those assuming a simple random sample. Although mean point estimates were similar, estimates of the standard error using the complex design increased by up to 60% compared to the simple random sample. Thus, for valid estimates of food and energy intake for the population, all of the sampling characteristics of the surveys should be taken into account; when these characteristics are neglected, statistical analysis may produce underestimated standard errors that would compromise the results and the conclusions of the survey.
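    The BRR calculation the abstract refers to can be sketched in its simplest form: each half-sample replicate re-estimates the statistic, and the variance is the mean squared deviation of the replicate estimates from the full-sample estimate (Fay adjustments and the construction of replicate weights are omitted for brevity):

    ```python
    def brr_standard_error(full_sample_estimate, replicate_estimates):
        """Balanced repeated replication: estimate the standard error of a
        survey statistic as the root of the average squared deviation of the
        half-sample replicate estimates from the full-sample estimate."""
        r = len(replicate_estimates)
        var = sum((t - full_sample_estimate) ** 2 for t in replicate_estimates) / r
        return var ** 0.5
    ```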

  11. Technical rationale and sampling procedures for assessing the effects of subsurface volatile organic contaminants on indoor air quality

    Energy Technology Data Exchange (ETDEWEB)

    Wong, T.T.; Agar, J.G. [O' Connor Associates Environmental Inc., Calgary, AB (Canada); Gregoire, M.Y. [O' Connor Associates Environmental Inc., Winnipeg, MB (Canada)

    2003-07-01

    Volatile organic compounds (VOCs) can affect indoor air quality through vapours released indoors by cigarette smoke, building materials, and common household solvents and cleaners. Therefore, the indoor level of VOC contaminants might not be a reliable indicator of the impact of subsurface contamination. The following method of indirect assessment has been accepted by Canadian and American environmental regulatory bodies. This accepted method involves soil gas sampling close to the basement or ground-floor slab of a building and VOC vapour transport modeling in order to estimate soil gas flow rates and VOC flux into a building. The VOC flux concentration can be used to evaluate potential human exposure to soil- or groundwater-derived VOCs and to estimate the associated human health risks. This paper describes the shallow vapour sampler and sampling procedure developed by O'Connor Associates, specifically designed for collecting representative soil gas samples in the zone adjacent to a building basement or ground-floor slab. The results of numerical modeling and the technical rationale behind the design of a soil gas sampling program for different soil types are presented. Soil type, depth to groundwater, and sampling well construction, and their respective influence, are discussed with reference to soil gas sampling programs and human health risk evaluations. 12 refs., 5 tabs., 6 figs.

  12. Improving Toxicity Assessment of Pesticide Mixtures: The Use of Polar Passive Sampling Devices Extracts in Microalgae Toxicity Tests.

    Science.gov (United States)

    Kim Tiam, Sandra; Fauvelle, Vincent; Morin, Soizic; Mazzella, Nicolas

    2016-01-01

    Complexity of contaminant exposure needs to be taken into account for an appropriate evaluation of risks related to mixtures of pesticides released into ecosystems. Toxicity assessment of such mixtures can be made through a variety of toxicity tests reflecting different levels of biological complexity. This paper reviews recent developments in passive sampling techniques for polar compounds, especially Polar Organic Chemical Integrative Samplers (POCIS) and Chemcatcher®, and the principal assessment techniques using microalgae in laboratory experiments. The progress permitted by the coupled use of such passive samplers and ecotoxicological testing, as well as their limitations, is presented. Case studies combining passive sampling device (PSD) extracts and toxicity assessment toward microorganisms at different biological scales, from single organisms to the community level, are presented. These case studies aimed, respectively, (i) at characterizing the "toxic potential" of waters using dose-response curves, and (ii) at performing microcosm experiments with increased environmental realism in toxicant exposure in terms of cocktail composition and concentration. Finally, perspectives and limitations of such approaches for future applications in the area of environmental risk assessment are discussed. PMID:27667986

  14. Rapid inventory of the ant assemblage in a temperate hardwood forest: species composition and assessment of sampling methods.

    Science.gov (United States)

    Ellison, Aaron M; Record, Sydne; Arguello, Alexander; Gotelli, Nicholas J

    2007-08-01

    Ants are key indicators of ecological change, but few studies have investigated how ant assemblages respond to dramatic changes in vegetation structure in temperate forests. Pests and pathogens are causing widespread loss of dominant canopy tree species; ant species composition and abundance may be very sensitive to such losses. Before the experimental removal of red oak trees to simulate effects of sudden oak death and examine the long-term impact of oak loss at the Black Rock Forest (Cornwall, NY), we carried out a rapid assessment of the ant assemblage in a 10-ha experimental area. We also determined the efficacy in a northern temperate forest of five different collecting methods--pitfall traps, litter samples, tuna fish and cookie baits, and hand collection--routinely used to sample ants in tropical systems. A total of 33 species in 14 genera were collected and identified; the myrmecines, Aphaenogaster rudis and Myrmica punctiventris, and the formicine Formica neogagates were the most common and abundant species encountered. Ninety-four percent (31 of 33) of the species were collected by litter sampling and structured hand sampling together, and we conclude that, in combination, these two methods are sufficient to assess species richness and composition of ant assemblages in northern temperate forests. Using new, unbiased estimators, we project that 38-58 ant species are likely to occur at Black Rock Forest. Loss of oak from these forests may favor Camponotus species that nest in decomposing wood and open habitat specialists in the genus Lasius. PMID:17716467

  15. Western corn rootworm damage assessment in Virginia and adult sampling with commercial yellow sticky traps

    OpenAIRE

    Kuhar, Patrick Thomas

    1996-01-01

    The risk of corn rootworm damage to continuously-grown corn was assessed in 32 fields from seven counties in Virginia in 1993 and 1994, Approximately 28% of the fields examined had economic root damage in corn left untreated with a soil insecticide. In addition, 190/0 of the fields overall had an economic loss in silage due to corn rootworm damage. A second study evaluated the effectiveness of using adult corn rootworm counts on commercial Olson yellow sticky traps and ear-z...

  16. Assessment of the 3410 Building Filtered Exhaust Stack Sampling Probe Location

    Energy Technology Data Exchange (ETDEWEB)

    Glissmeyer, John A.; Flaherty, Julia E.

    2010-07-16

    Pacific Northwest National Laboratory performed several tests in the exhaust air discharge from the new 3410 Building Filtered Exhaust Stack to determine whether the air sampling probe for emissions monitoring for radionuclides is acceptable. The method followed involved adopting the results of a previously performed test series from a system with a similar configuration, followed by several tests on the actual system to verify the applicability of the previously performed tests. The qualification criteria for these types of stacks include metrics concerning 1) uniformity of air velocity, 2) sufficiently small flow angle with respect to the axis of the duct, 3) uniformity of tracer gas concentration, and 4) uniformity tracer particle concentration.

  17. Assessment of the 3420 Building Filtered Exhaust Stack Sampling Probe Location

    Energy Technology Data Exchange (ETDEWEB)

    Glissmeyer, John A.; Flaherty, Julia E.

    2010-07-16

    Pacific Northwest National Laboratory performed several tests in the exhaust air discharge from the new 3420 Building Filtered Exhaust Stack to determine whether the air sampling probe for emissions monitoring for radionuclides is acceptable. The method followed involved adopting the results of a previously performed test series from a system with a similar configuration, followed by several tests on the actual system to verify the applicability of the previously performed tests. The qualification criteria for these types of stacks include metrics concerning 1) uniformity of air velocity, 2) sufficiently small flow angle with respect to the axis of the duct, 3) uniformity of tracer gas concentration, and 4) uniformity tracer particle concentration.

  18. Assessment of radioactivity levels in some oil samples from the western desert, Egypt

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Distributions of natural gamma-emitting radionuclides were determined in 93 oil samples collected from some petroleum fields in the western desert of Egypt. The radioisotope activities in the area under investigation lay in tivities could be on account of differences in TDS, HCO3, and Ba, with high or low pH. In this environment, oil properties differently affected the mobilization of natural radionuclides. The range of 226Ra variation had been compared cation (B) and (E) respectively, which was less than the accepted value.

  19. Assessment of PDMS-water partition coefficients: implications for passive environmental sampling of hydrophobic organic compounds

    Science.gov (United States)

    DiFilippo, Erica L.; Eganhouse, Robert P.

    2010-01-01

    Solid-phase microextraction (SPME) has shown potential as an in situ passive-sampling technique in aquatic environments. The reliability of this method depends upon accurate determination of the partition coefficient between the fiber coating and water (Kf). For some hydrophobic organic compounds (HOCs), Kf values spanning 4 orders of magnitude have been reported for polydimethylsiloxane (PDMS) and water. However, 24% of the published data examined in this review did not pass the criterion for negligible depletion, resulting in questionable Kf values. The range in reported Kf is reduced to just over 2 orders of magnitude for some polychlorinated biphenyls (PCBs) when these questionable values are removed. Other factors that could account for the range in reported Kf, such as fiber-coating thickness and fiber manufacturer, were evaluated and found to be insignificant. In addition to accurate measurement of Kf, an understanding of the impact of environmental variables, such as temperature and ionic strength, on partitioning is essential for application of laboratory-measured Kf values to field samples. To date, few studies have measured Kf for HOCs at conditions other than at 20 degrees or 25 degrees C in distilled water. The available data indicate measurable variations in Kf at different temperatures and different ionic strengths. Therefore, if the appropriate environmental variables are not taken into account, significant error will be introduced into calculated aqueous concentrations using this passive sampling technique. A multiparameter linear solvation energy relationship (LSER) was developed to estimate log Kf in distilled water at 25 degrees C based on published physicochemical parameters. This method provided a good correlation (R2 = 0.94) between measured and predicted log Kf values for several compound classes. Thus, an LSER approach may offer a reliable means of predicting log Kf for HOCs whose experimental log Kf values are presently unavailable. 
Future

  20. Comparative assessment of the Pu content of MOX samples by different techniques

    International Nuclear Information System (INIS)

    The isotopic composition and concentration of Pu in eight 'high-burn-up' mixed-oxide (MOX) fuel samples have been determined by destructive and non-destructive techniques. In addition, the U concentration and U isotopic composition was also available from the destructive techniques. The applied non-destructive techniques were gamma spectrometry, calorimetry and neutron coincidence counting, while the destructive techniques were titration, alpha spectrometry and thermal ionization mass spectrometry combined with isotope dilution. The current study describes the measurements and compares the results obtained by the mentioned techniques. Some lessons learned for the improvement of the non-destructive assay are also discussed.