WorldWideScience

Sample records for fully automated protein

  1. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to correct for run-to-run variations in the extent of back exchange. Second, a designed unique peptide (PPPI) with a slow intrinsic HDX rate is employed as another internal standard to reflect possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model that simulates the deuterium labeling and back-exchange process. The HDX model is implemented in the in-house-developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set, from ion detection and peptide identification to the final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most probable protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by following the conformational transition of a monoclonal antibody induced by increasing concentrations of guanidine.
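
    The back-exchange correction described above can be illustrated with a short numerical sketch. This is a simplified illustration, not the MassAnalyzer implementation; the peptide sequences, uptake values and the assumption that a fully deuterated, fast-exchanging standard reports the run-specific back-exchange level are all hypothetical.

```python
# Simplified sketch of internal-standard-based back-exchange correction in HDX-MS.
# Assumption: a fully deuterated fast-exchanging standard peptide should retain
# `expected_standard_uptake` Da of deuterium; any shortfall reflects back exchange.

def back_exchange_factor(measured_standard_uptake, expected_standard_uptake):
    """Fraction of deuterium retained by the internal standard in this run."""
    return measured_standard_uptake / expected_standard_uptake

def correct_uptake(measured_uptake, factor):
    """Scale a peptide's measured deuterium uptake to compensate for back exchange."""
    return measured_uptake / factor

if __name__ == "__main__":
    # Hypothetical numbers: the standard should carry 4.0 Da but only 3.2 Da survive,
    # i.e. 80% retention in this particular run.
    factor = back_exchange_factor(measured_standard_uptake=3.2,
                                  expected_standard_uptake=4.0)
    for peptide, uptake in {"HTVDGKP": 2.4, "LLQNGER": 1.1}.items():
        print(peptide, round(correct_uptake(uptake, factor), 2))  # 3.0, 1.38
```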

  2. Designing a fully automated multi-bioreactor plant for fast DoE optimization of pharmaceutical protein production.

    Science.gov (United States)

    Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner

    2013-06-01

    The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win with the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, the commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. The DoE optimization procedure doubled intact protein secretion productivity compared with the initial cultivation results. In a next step, robustness against process parameter variability was demonstrated around the determined optimum. A significantly improved pharmaceutical production process was thereby established within seven 24-hour cultivation cycles. Specifically with regard to the regulatory demands set out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
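
    As a rough illustration of how a DoE module can enumerate the parallel cultivation conditions for a multi-bioreactor run, the sketch below builds a simple full-factorial design. The factor names and levels are hypothetical, and the MFCS/win and MODDE® interfaces are not reproduced.

```python
# Minimal sketch of a full-factorial Design of Experiments (DoE) for parallel bioreactors.
# The factor names and levels are hypothetical; a real study would generate and analyse
# the design with dedicated DoE software (e.g. MODDE).
from itertools import product

factors = {
    "temperature_C": [25.0, 30.0],
    "pH": [5.0, 6.0],
    "methanol_feed_g_per_L_h": [1.0, 2.0, 3.0],
}

design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for run_id, condition in enumerate(design, start=1):
    print(f"bioreactor run {run_id}: {condition}")  # 2 x 2 x 3 = 12 conditions
```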

  3. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction ... and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  4. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via Internet. The system was used for 14C, 10Be, 26Al and 129I measurements.

  5. An international crowdsourcing study into people's statements on fully automated driving

    NARCIS (Netherlands)

    Bazilinskyy, P.; Kyriakidis, M.; de Winter, J.C.F.; Ahram, Tareq; Karwowski, Waldemar; Schmorrow, Dylan

    2015-01-01

    Fully automated driving can potentially provide enormous benefits to society. However, it has been unclear whether people will appreciate such far-reaching technology. This study investigated anonymous textual comments regarding fully automated driving, based on data extracted from three online

  6. Fully automated MRI-guided robotics for prostate brachytherapy

    International Nuclear Information System (INIS)

    Stoianovici, D.; Vigaru, B.; Petrisor, D.; Muntener, M.; Patriciu, A.; Song, D.

    2008-01-01

    The uncertainties encountered in the deployment of brachytherapy seeds are related to the commonly used ultrasound imager and the basic instrumentation used for the implant. An alternative solution is under development in which a fully automated robot is used to place the seeds according to the dosimetry plan under direct MRI guidance. Incorporation of MRI guidance creates potential for physiological and molecular image-guided therapies. Moreover, MRI-guided brachytherapy also enables re-estimation of dosimetry during the procedure, because the seeds already implanted can be localised with MRI. An MRI-compatible robot (MrBot) was developed. The robot is designed for transperineal percutaneous prostate interventions and customised for fully automated MRI-guided brachytherapy. With different end-effectors, the robot applies to other image-guided interventions of the prostate. The robot is constructed of non-magnetic and dielectric materials and is electricity-free, using pneumatic actuation and optic sensing. A new motor (PneuStep) was purposely developed to set this robot in motion. The robot fits alongside the patient in closed-bore MRI scanners. It is able to stay fully operational during MR imaging without deteriorating the quality of the scan. In vitro, cadaver and animal tests showed millimetre needle-targeting accuracy and very precise seed placement. The robot was tested without any interference up to 7 T. The robot is the first fully automated robot to function in MRI scanners. Its first application is MRI-guided seed brachytherapy. It is capable of automated, highly accurate needle placement. Extensive testing is in progress prior to clinical trials. Preliminary results show that the robot may become a useful image-guided intervention instrument. (author)

  7. Automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

    Science.gov (United States)

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of applications. Since the usefulness of a model for a specific application is determined by its accuracy, model quality estimation is an essential component of protein structure prediction. Comparative protein modeling has become a routine approach in many areas of life science research, since fully automated modeling systems also allow nonexperts to build reliable models. In this chapter, we describe practical approaches for automated protein structure modeling with SWISS-MODEL Workspace and the Protein Model Portal.

  8. Development of Fully Automated Low-Cost Immunoassay System for Research Applications.

    Science.gov (United States)

    Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien

    2017-10-01

    Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable, fully automated, low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min and can easily be adapted to new testing targets. The running cost is extremely low due to the automation and the reduced material requirements. Details of the system configuration, component selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.

  9. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  10. Intention to use a fully automated car: attitudes and a priori acceptability

    OpenAIRE

    PAYRE, William; CESTAC, Julien; DELHOMME, Patricia

    2014-01-01

    Although previous research has studied the acceptability of partially or highly automated driving, few studies have focused on fully automated driving (FAD), including the ability to master longitudinal control, lateral control and maneuvers. The present study analyzes a priori acceptability, attitudes, personality traits and intention to use a fully automated vehicle. 421 French drivers (153 males, M = 40.2 years, age range 19-73) answered an online questionnaire. 68.1% of the sample a priori accepted FAD. P...

  11. Fully automated segmentation of callus by micro-CT compared to biomechanics.

    Science.gov (United States)

    Bissinger, Oliver; Götz, Carolin; Wolff, Klaus-Dietrich; Hapfelmeier, Alexander; Prodinger, Peter Michael; Tischer, Thomas

    2017-07-11

    A high percentage of closed femur fractures have slight comminution. Using micro-CT (μCT), segmentation of multiple fragments is much more difficult than segmentation of unfractured or osteotomised bone. Manual or semi-automated segmentation has been performed to date. However, such segmentation is extremely laborious, time-consuming and error-prone. Our aim was therefore to apply a fully automated segmentation algorithm to determine μCT parameters and examine their association with biomechanics. The femora of 64 rats, randomised to medication that was either inhibitory or neutral with respect to fracture healing, or serving as controls, were closed-fractured after a Kirschner wire had been inserted. After 21 days, μCT and biomechanical parameters were determined by a fully automated method and correlated (Pearson's correlation). The fully automated segmentation algorithm detected bone and simultaneously separated cortical bone from callus without requiring ROI selection for each single bony structure. We found an association between structural callus parameters obtained by μCT and the biomechanical properties. However, the results were only explicable when the callus location was additionally considered. A large number of slightly comminuted fractures, in combination with therapies that influence the callus qualitatively and/or quantitatively, considerably affects the association between μCT and biomechanics. In the future, contrast-enhanced μCT imaging of the callus cartilage might provide more information to improve the non-destructive and non-invasive prediction of callus mechanical properties. As studies evaluating such important drugs increase, fully automated segmentation appears to be clinically important.

  12. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
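
    A minimal sketch of the kind of orientation-based rheotaxis metric described above: given per-fish heading angles extracted from video frames, count the fraction of fish oriented head-to-current within a tolerance. The heading values, flow direction and tolerance are hypothetical, and the detection and orientation-estimation steps of the actual image-analysis pipeline are not reproduced.

```python
# Sketch: quantify rheotaxis as the fraction of fish whose heading is within a
# tolerance of the upstream (head-to-current) direction. Angles are in degrees.
import math

def angular_difference(a, b):
    """Smallest absolute difference between two angles in degrees (0-180)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def rheotaxis_fraction(headings_deg, flow_from_deg, tolerance_deg=30.0):
    """Fraction of detected fish oriented into the flow within the tolerance."""
    if not headings_deg:
        return 0.0
    aligned = sum(1 for h in headings_deg
                  if angular_difference(h, flow_from_deg) <= tolerance_deg)
    return aligned / len(headings_deg)

# Hypothetical headings from one video frame; flow arrives from 0 degrees.
print(rheotaxis_fraction([5.0, 350.0, 90.0, 12.0, 200.0], flow_from_deg=0.0))  # 0.6
```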

  13. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intracellular and intercellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated against an established manual method. First, regions of interest (membrane fragments) are identified in confocal microscopy images. Next, densitometric intensity profiles are extracted orthogonally to the membrane fragments, following the direction from the plasma membrane to the cytoplasm. Finally, several quantitative descriptors are derived from the densitometric profiles and compared with regard to their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used for translocation quantification in previous publications. The slow manual calculation can be substituted by the fast and unbiased automated method.
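
    A minimal sketch of the profile-extraction step described above: sample image intensities along a line orthogonal to a membrane fragment, from the membrane towards the cytoplasm, and derive one simple descriptor. The test image, coordinates and descriptor choice are illustrative and do not reproduce the published workflow.

```python
# Sketch: densitometric intensity profile orthogonal to a membrane point.
# Uses bilinear interpolation to sample the image along the membrane normal.
import numpy as np
from scipy.ndimage import map_coordinates

def orthogonal_profile(image, point, normal, length=20, step=0.5):
    """Sample intensities from `point` along the unit `normal` (membrane -> cytoplasm)."""
    normal = np.asarray(normal, dtype=float)
    normal /= np.linalg.norm(normal)
    distances = np.arange(0.0, length, step)
    coords = np.array([point[0] + distances * normal[0],
                       point[1] + distances * normal[1]])  # (row, col) sample positions
    return distances, map_coordinates(image, coords, order=1)

def profile_centroid(distances, intensities):
    """Intensity-weighted mean distance: shifts towards the cytoplasm after translocation."""
    return float(np.sum(distances * intensities) / np.sum(intensities))

if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[30:34, :] = 100.0                               # bright "membrane" band
    d, prof = orthogonal_profile(img, point=(30, 32), normal=(1, 0))
    print(profile_centroid(d, prof))
```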

  14. Fully automated radiosynthesis of [11C]PBR28, a radiopharmaceutical for the translocator protein (TSPO) 18 kDa, using a GE TRACERlab FXC-Pro

    International Nuclear Information System (INIS)

    Hoareau, Raphaël; Shao, Xia; Henderson, Bradford D.; Scott, Peter J.H.

    2012-01-01

    In order to image the translocator protein (TSPO) 18 kDa in the clinic using positron emission tomography (PET) imaging, we had cause to prepare [11C]PBR28. In this communication we highlight our novel, recently developed one-pot synthesis of the desmethyl-PBR28 precursor, and present an optimized fully automated preparation of [11C]PBR28 using a GE TRACERlab FXC-Pro. Following radiolabelling, purification is achieved by HPLC and, to the best of our knowledge, we report the first example of reconstituting [11C]PBR28 into ethanolic saline using solid-phase extraction (SPE). This procedure is operationally simple and provides high-quality doses of [11C]PBR28 suitable for use in clinical PET imaging studies. The typical radiochemical yield using the optimized method is 3.6% (EOS, n=3), radiochemical and chemical purity are consistently >99%, and specific activities are 14,523 Ci/mmol. Highlights: ► This paper reports a fully automated synthesis of [11C]PBR28 using a TRACERlab FXC-Pro. ► We report a solid-phase extraction technique for the reconstitution of [11C]PBR28. ► ICP-MS data for the PBR28 precursor are reported, confirming suitability for clinical use.

  15. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. The baseline in a spectrum signal can induce uneven amplitude shifts across different wavenumbers and lead to erroneous results; these amplitude shifts should therefore be compensated before further analysis. Many algorithms are used to remove the baseline; however, fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through a continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly used fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
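
    A rough sketch of the idea is shown below, using SciPy's CWT-based peak detector as a stand-in for the paper's feature-point selection and simple interpolation between peak-free anchor points as the segment interpolation. The parameters and test signal are illustrative; this is not the AWFPSI implementation.

```python
# Rough sketch of wavelet-feature-point baseline correction (not the AWFPSI algorithm):
# 1) locate peaks with a continuous-wavelet-transform detector,
# 2) keep points far from any peak as baseline anchors,
# 3) interpolate the anchors segment-wise and subtract the estimate.
import numpy as np
from scipy.signal import find_peaks_cwt

def estimate_baseline(y, peak_widths=np.arange(5, 30), exclusion=15):
    x = np.arange(len(y))
    peaks = find_peaks_cwt(y, peak_widths)
    is_anchor = np.ones(len(y), dtype=bool)
    for p in peaks:
        lo, hi = max(0, p - exclusion), min(len(y), p + exclusion + 1)
        is_anchor[lo:hi] = False
    if not is_anchor.any():            # degenerate case: everything looked like a peak
        return np.full_like(y, np.median(y))
    return np.interp(x, x[is_anchor], y[is_anchor])

if __name__ == "__main__":
    x = np.linspace(0, 100, 500)
    signal = np.exp(-((x - 40) ** 2) / 4) + np.exp(-((x - 70) ** 2) / 2)  # two peaks
    baseline = 0.02 * x + 0.5                                             # sloped baseline
    y = signal + baseline
    corrected = y - estimate_baseline(y)
    print(round(float(abs(corrected[:100].mean())), 3))  # near 0 in the peak-free region
```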

  16. Automated backbone assignment of labeled proteins using the threshold accepting algorithm

    International Nuclear Information System (INIS)

    Leutner, Michael; Gschwind, Ruth M.; Liermann, Jens; Schwarz, Christian; Gemmecker, Gerd; Kessler, Horst

    1998-01-01

    The sequential assignment of backbone resonances is the first step in the structure determination of proteins by heteronuclear NMR. For larger proteins, an assignment strategy based on proton side-chain information is no longer suitable for use in an automated procedure. Our program PASTA (Protein ASsignment by Threshold Accepting) is therefore designed to partially or fully automate the sequential assignment of proteins, based on the analysis of NMR backbone resonances plus Cβ information. In order to overcome the problems caused by peak overlap and missing signals in an automated assignment process, PASTA uses threshold accepting, a combinatorial optimization strategy, which is superior to simulated annealing due to generally faster convergence and better solutions. The reliability of this algorithm is shown by reproducing the complete sequential backbone assignment of several proteins from published NMR data. The robustness of the algorithm against misassigned signals, noise, spectral overlap and missing peaks is shown by repeating the assignment with reduced sequential information and increased chemical shift tolerances. The performance of the program on real data is finally demonstrated with automatically picked peak lists of human nonpancreatic synovial phospholipase A2, a protein with 124 residues.
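
    Threshold accepting itself is a small, generic optimizer: like simulated annealing it accepts some worsening moves, but it does so deterministically whenever the cost increase stays below a threshold that is lowered over time. The sketch below shows the bare algorithm on a toy permutation problem; it is not PASTA's assignment model, and the cost function, neighbourhood move and threshold schedule are illustrative.

```python
# Generic threshold accepting: accept any move whose cost increase is below the
# current threshold; the threshold is lowered on a fixed schedule.
import random

def threshold_accepting(initial, neighbour, cost, thresholds, steps_per_level=200):
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    for t in thresholds:                      # e.g. a decreasing sequence ending at 0
        for _ in range(steps_per_level):
            candidate = neighbour(current)
            delta = cost(candidate) - current_cost
            if delta < t:                     # deterministic acceptance rule
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = candidate, current_cost
    return best, best_cost

# Toy problem: order numbers to minimise the sum of adjacent differences.
def swap_two(perm):
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

values = [7, 1, 9, 3, 8, 2, 6, 4]
tour_cost = lambda p: sum(abs(a - b) for a, b in zip(p, p[1:]))
print(threshold_accepting(values, swap_two, tour_cost, thresholds=[4, 2, 1, 0]))
```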

  17. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-01-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMATauto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMATman). For all treatment plans, planning target volume (PTV) coverage and sparing of organs-at-risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V95% > 99%). For VMATauto and VMATman plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic.

  18. MannDB – A microbial database of automated protein sequence analyses and evidence integration for protein characterization

    Directory of Open Access Journals (Sweden)

    Kuczmarski Thomas A

    2006-10-01

    Background: MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. Description: MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from GenBank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the GenBank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences, either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. Conclusion: MannDB comprises a large number of genomes and comprehensive protein

  19. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2011-01-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both 15N-labeled and 13C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only 15N-labeled NMR data and avoid the added expense of using 13C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using 15N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.
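
    The core matching step, assigning spin systems to residue positions so that a total compatibility score is optimised, can be illustrated with a much simpler stand-in: if each spin system takes exactly one position and the scores are independent, the problem reduces to linear assignment. The sketch below uses the Hungarian algorithm on a hypothetical score matrix; the paper's actual formulation is an integer program with sequential-connectivity constraints that this toy version ignores.

```python
# Toy stand-in for structure-based resonance assignment: maximise the total
# compatibility score between spin systems and residue positions, ignoring the
# sequential-connectivity constraints handled by the paper's integer programs.
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical compatibility scores (rows: spin systems, cols: residue positions),
# e.g. derived from amino-acid typing and predicted chemical shifts.
scores = np.array([
    [0.9, 0.1, 0.3],
    [0.2, 0.8, 0.4],
    [0.1, 0.6, 0.7],
])

rows, cols = linear_sum_assignment(scores, maximize=True)
for spin_system, residue in zip(rows, cols):
    print(f"spin system {spin_system} -> residue {residue} (score {scores[spin_system, residue]})")
print("total score:", scores[rows, cols].sum())
```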

  1. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    Science.gov (United States)

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  2. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response (DR) programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: manual demand response involves a potentially labor-intensive approach, such as manually turning off equipment or changing comfort set points at each switch or controller; semi-automated demand response involves a pre-programmed demand response strategy initiated by a person via a centralized control system; fully automated demand response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal that triggers pre-programmed demand response strategies (referred to as Auto-DR). This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. It presents the project description and test methodology, a discussion of the Auto-DR strategies used in the field test buildings, a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites had reached their maximum saving simultaneously, a total of approximately 2 MW of DR would have been available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. Field demonstrations and economic evaluations are continuing in order to pursue increasing penetration of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.

  3. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed the sample exchange robot PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) of the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.

  4. A novel method to determine simultaneously methane production during in vitro gas production using fully automated equipment

    NARCIS (Netherlands)

    Pellikaan, W.F.; Hendriks, W.H.; Uwimanaa, G.; Bongers, L.J.G.M.; Becker, P.M.; Cone, J.W.

    2011-01-01

    An adaptation of fully automated gas production equipment was tested for its ability to simultaneously measure methane and total gas. The simultaneous measurement of gas production and gas composition was not possible using fully automated equipment, as the bottles should be kept closed during the

  5. A new fully automated TLD badge reader

    International Nuclear Information System (INIS)

    Kannan, S.; Ratna, P.; Kulkarni, M.S.

    2003-01-01

    At present, personnel monitoring in India is carried out using a number of manual and semiautomatic TLD badge readers and the BARC TL dosimeter badge designed in 1970. Of late, the manual TLD badge readers have been almost completely replaced by semiautomatic readers with a number of performance improvements, such as the use of hot-gas heating to considerably reduce the readout time, a PC-based design with storage of the glow curve for every dosimeter, on-line dose computation, and printout of dose reports. However, the semiautomatic system suffers from the lack of a machine-readable ID code on the badge, and the physical design of the dosimeter card is not readily compatible with automation. This paper describes a fully automated TLD badge reader developed in the RSS Division, using a new TLD badge with a machine-readable ID code. The new PC-based reader has a built-in reader for the ID code, which takes the form of an array of holes on the dosimeter card. The reader has a number of self-diagnostic features to ensure a high degree of reliability. (author)

  6. [18F]FMeNER-D2: Reliable fully-automated synthesis for visualization of the norepinephrine transporter

    International Nuclear Information System (INIS)

    Rami-Mark, Christina; Zhang, Ming-Rong; Mitterhauser, Markus; Lanzenberger, Rupert; Hacker, Marcus; Wadsak, Wolfgang

    2013-01-01

    Purpose: In neurodegenerative diseases and neuropsychiatric disorders dysregulation of the norepinephrine transporter (NET) has been reported. For visualization of NET availability and occupancy in the human brain PET imaging can be used. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [18F]FMeNER-D2 is showing the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [18F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Procedures: Synthesis of [18F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20–30 GBq [18F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[18F]fluoromethane-D2 ([18F]BFM) and reaction of the pure [18F]BFM with unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedure were completed, formulation and sterile filtration were achieved on-line and full quality control was performed. Results: Purified product was obtained in a fully automated synthesis in clinical scale allowing maximum radiation safety and routine production under GMP-like manner. So far, more than 25 fully automated syntheses were successfully performed, yielding 1.0–2.5 GBq of formulated [18F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. Conclusions: A first fully automated [18F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization

  7. [18F]FMeNER-D2: reliable fully-automated synthesis for visualization of the norepinephrine transporter.

    Science.gov (United States)

    Rami-Mark, Christina; Zhang, Ming-Rong; Mitterhauser, Markus; Lanzenberger, Rupert; Hacker, Marcus; Wadsak, Wolfgang

    2013-11-01

    In neurodegenerative diseases and neuropsychiatric disorders dysregulation of the norepinephrine transporter (NET) has been reported. For visualization of NET availability and occupancy in the human brain PET imaging can be used. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [(18)F]FMeNER-D2 is showing the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [(18)F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Synthesis of [(18)F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20-30 GBq [(18)F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[(18)F]fluoromethane-D2 ([(18)F]BFM) and reaction of the pure [(18)F]BFM with unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedure were completed, formulation and sterile filtration were achieved on-line and full quality control was performed. Purified product was obtained in a fully automated synthesis in clinical scale allowing maximum radiation safety and routine production under GMP-like manner. So far, more than 25 fully automated syntheses were successfully performed, yielding 1.0-2.5 GBq of formulated [(18)F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. A first fully automated [(18)F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization. © 2013.

  8. A fully automated microfluidic femtosecond laser axotomy platform for nerve regeneration studies in C. elegans.

    Science.gov (United States)

    Gokce, Sertan Kutal; Guo, Samuel X; Ghorashian, Navid; Everett, W Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner.

  9. Breast Density Estimation with Fully Automated Volumetric Method: Comparison to Radiologists' Assessment by BI-RADS Categories.

    Science.gov (United States)

    Singh, Tulika; Sharma, Madhurima; Singla, Veenu; Khandelwal, Niranjan

    2016-01-01

    The objective of our study was to calculate mammographic breast density with a fully automated volumetric breast density measurement method and to compare it to breast imaging reporting and data system (BI-RADS) breast density categories assigned by two radiologists. A total of 476 full-field digital mammography examinations with standard mediolateral oblique and craniocaudal views were evaluated by two blinded radiologists and BI-RADS density categories were assigned. Using fully automated software, mean fibroglandular tissue volume, mean breast volume, and mean volumetric breast density were calculated. Based on percentage volumetric breast density, a volumetric density grade was assigned from 1 to 4. The weighted overall kappa was 0.895 (almost perfect agreement) for the two radiologists' BI-RADS density estimates. A statistically significant difference in mean volumetric breast density was seen among the BI-RADS density categories, with mean volumetric breast density increasing as the BI-RADS density category increased. A good correlation was seen between the BI-RADS categories and the volumetric density grading by the fully automated software (ρ = 0.728). Agreement between the BI-RADS density category assigned by each observer and the volumetric density grade was fair (κ = 0.398 and 0.388, respectively). In our study, a good correlation was seen between density grading using the fully automated volumetric method and density grading using BI-RADS density categories assigned by the two radiologists. Thus, the fully automated volumetric method may be used to quantify breast density on routine mammography. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
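
    The volumetric measure described above is a simple ratio; the sketch below shows the arithmetic and a hypothetical mapping from percentage density to a four-level volumetric density grade. The abstract does not give the cut-offs used by the commercial software, so the thresholds here are purely illustrative.

```python
# Sketch: volumetric breast density and a hypothetical 4-level density grade.
def volumetric_breast_density(fibroglandular_volume_cm3, breast_volume_cm3):
    """Percentage of the breast volume occupied by fibroglandular tissue."""
    return 100.0 * fibroglandular_volume_cm3 / breast_volume_cm3

def volumetric_density_grade(density_percent, cutoffs=(4.5, 7.5, 15.5)):
    """Map percentage density to grade 1-4; the cut-offs are illustrative only."""
    return 1 + sum(density_percent > c for c in cutoffs)

vbd = volumetric_breast_density(fibroglandular_volume_cm3=60.0, breast_volume_cm3=700.0)
print(round(vbd, 1), volumetric_density_grade(vbd))  # 8.6 -> grade 3
```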

  10. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Elżbieta Pociask

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, thus triggering the need for a new tool, NIRS-IVUS, which can characterize plaque in terms of its chemical and morphologic properties. The new tool in turn calls for new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods. Conclusions. The proposed algorithm performs fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis that could easily be extended with new functions and used in new projects.
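
    The LCBI metrics mentioned above can be sketched as follows, using the conventional definition (lipid-positive pixels per 1000 valid chemogram pixels) and a sliding axial window for the block-wise maxima. The lipid-probability threshold and the pixels-per-millimetre scale are assumptions for illustration, not values from the study.

```python
# Sketch of Lipid Core Burden Index (LCBI) metrics from a NIRS chemogram.
# Conventionally, LCBI = 1000 * (lipid-positive pixels) / (valid pixels); maxLCBI_4mm is
# the largest LCBI over any 4 mm axial window. The probabilities, threshold and
# pixels-per-millimetre scale below are assumptions for illustration.
import numpy as np

def lcbi(prob, valid, threshold=0.6):
    valid_pixels = valid.sum()
    if valid_pixels == 0:
        return 0.0
    return 1000.0 * np.sum((prob > threshold) & valid) / valid_pixels

def max_lcbi_block(prob, valid, block_mm=4, pixels_per_mm=10, threshold=0.6):
    window = block_mm * pixels_per_mm          # window length along the pullback axis
    n_axial = prob.shape[1]
    if n_axial < window:
        return lcbi(prob, valid, threshold)
    return max(lcbi(prob[:, i:i + window], valid[:, i:i + window], threshold)
               for i in range(n_axial - window + 1))

rng = np.random.default_rng(0)
prob = rng.random((360, 200))                  # rows: angle, cols: axial position
valid = np.ones_like(prob, dtype=bool)         # no guide-wire/artifact exclusions here
print(round(lcbi(prob, valid), 1), round(max_lcbi_block(prob, valid), 1))
```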

  11. Approaches to automated protein crystal harvesting

    Energy Technology Data Exchange (ETDEWEB)

    Deller, Marc C., E-mail: mdeller@scripps.edu; Rupp, Bernhard, E-mail: mdeller@scripps.edu

    2014-01-28

    Approaches to automated and robot-assisted harvesting of protein crystals are critically reviewed. While no true turn-key solutions for automation of protein crystal harvesting are currently available, systems incorporating advanced robotics and micro-electromechanical systems represent exciting developments with the potential to revolutionize the way in which protein crystals are harvested.

  12. Comparison of semi-automated center-dot and fully automated endothelial cell analyses from specular microscopy images.

    Science.gov (United States)

    Maruoka, Sachiko; Nakakura, Shunsuke; Matsuo, Naoko; Yoshitomi, Kayo; Katakami, Chikako; Tabuchi, Hitoshi; Chikama, Taiichiro; Kiuchi, Yoshiaki

    2017-10-30

    To evaluate two specular microscopy analysis methods across different endothelial cell densities (ECDs). Endothelial images of one eye from each of 45 patients were taken by using three different specular microscopes (three replicates each). To determine the consistency of the center-dot method, we compared SP-6000 and SP-2000P images. CME-530 and SP-6000 images were compared to assess the consistency of the fully automated method. The SP-6000 images from the two methods were compared. Intraclass correlation coefficients (ICCs) for the three measurements were calculated, and parametric multiple comparisons tests and Bland-Altman analysis were performed. The ECD mean value was 2425 ± 883 (range 516-3707) cells/mm2. ICC values were > 0.9 for all three microscopes for ECD, but the coefficients of variation (CVs) were 0.3-0.6. For ECD measurements, Bland-Altman analysis revealed that the mean difference was 42 cells/mm2 between the SP-2000P and SP-6000 for the center-dot method; 57 cells/mm2 between the SP-6000 measurements from both methods; and -5 cells/mm2 between the SP-6000 and CME-530 for the fully automated method (95% limits of agreement: -201 to 284 cells/mm2, -410 to 522 cells/mm2, and -327 to 318 cells/mm2, respectively). For CV measurements, the mean differences were -3, -12, and 13% (95% limits of agreement -18 to 11, -26 to 2, and -5 to 32%, respectively). Despite using three replicate measurements, the precision of the center-dot method with the SP-2000P and SP-6000 software was only ±10% for ECD data and was even worse for the fully automated method. Japan Clinical Trials Register (http://www.umin.ac.jp/ctr/index/htm9) number UMIN 000015236.
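
    The agreement statistics quoted above (mean difference and 95% limits of agreement) follow the standard Bland-Altman construction, sketched below on hypothetical paired ECD measurements; the ±1.96·SD limits assume approximately normally distributed differences.

```python
# Sketch: Bland-Altman mean difference and 95% limits of agreement for paired
# endothelial cell density (ECD) measurements from two methods (hypothetical data).
import numpy as np

def bland_altman(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    diff = a - b
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)
    return mean_diff, (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

ecd_method_1 = [2500, 2300, 2700, 1800, 3100, 2950]   # cells/mm^2, hypothetical
ecd_method_2 = [2450, 2380, 2650, 1900, 3050, 2900]

mean_diff, (lo, hi) = bland_altman(ecd_method_1, ecd_method_2)
print(f"mean difference {mean_diff:.0f} cells/mm^2, 95% LoA {lo:.0f} to {hi:.0f}")
```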

  13. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  14. A Fully Automated Penumbra Segmentation Tool

    DEFF Research Database (Denmark)

    Nagenthiraja, Kartheeban; Ribe, Lars Riisgaard; Hougaard, Kristina Dupont

    2012-01-01

    Introduction: Perfusion- and diffusion-weighted MRI (PWI/DWI) is widely used to select patients who are likely to benefit from recanalization therapy. The visual identification of PWI-DWI-mismatch tissue depends strongly on the observer, prompting a need for software which estimates potentially salvageable tissue quickly and accurately. We present a fully Automated Penumbra Segmentation (APS) algorithm using PWI and DWI images, and compare automatically generated PWI-DWI mismatch masks to masks outlined manually by experts in 168 patients. Method: The algorithm initially identifies PWI lesions ... at 600∙10-6 mm2/sec. Due to the nature of thresholding, the ADC mask overestimates the DWI lesion volume, and consequently we initialized a level-set algorithm on the DWI image with the ADC mask as prior knowledge. Combining the PWI mask and the inverted DWI mask then yields the PWI-DWI mismatch mask. Four expert raters...
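
    The mask logic described above can be sketched in a few lines: threshold the ADC map to approximate the DWI lesion, take a perfusion-lesion mask (assumed here to come from a Tmax criterion, which the abstract does not specify), and combine the perfusion mask with the inverted diffusion mask. Array names and the perfusion threshold are assumptions, and the level-set refinement step is omitted.

```python
# Sketch: PWI-DWI mismatch mask from an ADC map and a perfusion-lesion mask.
# ADC is in mm^2/s; the 600e-6 mm^2/s cut-off is taken from the abstract, the
# perfusion threshold (Tmax > 6 s) is an assumption, and level-set refinement is omitted.
import numpy as np

def mismatch_mask(adc, tmax, adc_cutoff=600e-6, tmax_cutoff=6.0):
    dwi_lesion = adc < adc_cutoff          # diffusion lesion approximated by ADC threshold
    pwi_lesion = tmax > tmax_cutoff        # perfusion lesion (assumed Tmax criterion)
    return pwi_lesion & ~dwi_lesion        # hypoperfused but not yet infarcted tissue

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    adc = rng.uniform(300e-6, 1200e-6, size=(64, 64))
    tmax = rng.uniform(0.0, 12.0, size=(64, 64))
    mask = mismatch_mask(adc, tmax)
    print("mismatch voxels:", int(mask.sum()))
```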

  15. [18F]FMeNER-D2: Reliable fully-automated synthesis for visualization of the norepinephrine transporter

    Energy Technology Data Exchange (ETDEWEB)

    Rami-Mark, Christina [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Department of Inorganic Chemistry, University of Vienna (Austria); Zhang, Ming-Rong [Molecular Imaging Center, National Institute of Radiological Sciences, Chiba (Japan); Mitterhauser, Markus [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Hospital Pharmacy of the General Hospital of Vienna (Austria); Lanzenberger, Rupert [Department of Psychiatry and Psychotherapy, Medical University of Vienna (Austria); Hacker, Marcus [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Wadsak, Wolfgang [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Department of Inorganic Chemistry, University of Vienna (Austria)

    2013-11-15

    Purpose: In neurodegenerative diseases and neuropsychiatric disorders dysregulation of the norepinephrine transporter (NET) has been reported. For visualization of NET availability and occupancy in the human brain PET imaging can be used. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [18F]FMeNER-D2 is showing the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [18F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Procedures: Synthesis of [18F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20–30 GBq [18F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[18F]fluoromethane-D2 ([18F]BFM) and reaction of the pure [18F]BFM with unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedure were completed, formulation and sterile filtration were achieved on-line and full quality control was performed. Results: Purified product was obtained in a fully automated synthesis in clinical scale allowing maximum radiation safety and routine production under GMP-like manner. So far, more than 25 fully automated syntheses were successfully performed, yielding 1.0–2.5 GBq of formulated [18F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. Conclusions: A first fully automated [18F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization.

  16. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  17. Fully Automated Deep Learning System for Bone Age Assessment.

    Science.gov (United States)

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-08-01

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision supporting system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.
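
    As a reading aid, the tolerance-style accuracies quoted above ("within 1 year", "within 2 years") can be illustrated with a minimal sketch that is not taken from the paper; the function name and the sample bone-age values below are hypothetical.

        # Sketch (not the authors' code): tolerance-based accuracy of the kind
        # reported above, computed from predicted and reference bone ages in years.
        import numpy as np

        def baa_tolerance_accuracy(predicted, reference, tolerance_years):
            """Fraction of cases whose predicted bone age is within the tolerance."""
            predicted = np.asarray(predicted, dtype=float)
            reference = np.asarray(reference, dtype=float)
            return float(np.mean(np.abs(predicted - reference) <= tolerance_years))

        # Hypothetical example values, for illustration only.
        pred = [10.5, 7.0, 13.8, 9.2]
        ref = [11.0, 8.5, 13.5, 9.0]
        print(baa_tolerance_accuracy(pred, ref, 1.0))   # share within 1 year
        print(baa_tolerance_accuracy(pred, ref, 2.0))   # share within 2 years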

  18. FULLY AUTOMATED IMAGE ORIENTATION IN THE ABSENCE OF TARGETS

    Directory of Open Access Journals (Sweden)

    C. Stamatopoulos

    2012-07-01

    Full Text Available Automated close-range photogrammetric network orientation has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. Over the past decade, automated orientation via feature-based matching (FBM) techniques has attracted renewed research attention in both the photogrammetry and computer vision (CV) communities. This is largely due to advances made towards the goal of automated relative orientation of multi-image networks covering untargeted (markerless) objects. There are now a number of CV-based algorithms, with accompanying open-source software, that can achieve multi-image orientation within narrow-baseline networks. From a photogrammetric standpoint, the results are typically disappointing as the metric integrity of the resulting models is generally poor, or even unknown, while the number of outliers within the image matching and triangulation is large, and generally too large to allow relative orientation (RO) via the commonly used coplanarity equations. On the other hand, there are few examples within the photogrammetric research field of automated markerless camera calibration to metric tolerances, and these too are restricted to narrow-baseline, low-convergence imaging geometry. The objective addressed in this paper is markerless automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. By wide-baseline we imply convergent multi-image configurations with convergence angles of up to around 90°. An associated aim is provision of a fast, fully automated process, which can be performed without user intervention. For this purpose, various algorithms require optimisation to allow parallel processing utilising multiple PC cores and graphics processing units (GPUs).

  19. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy

    OpenAIRE

    Sharfo, Abdul Wahab M.; Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.

    2016-01-01

    textabstractPurpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-be...

  20. Fully automated MR liver volumetry using watershed segmentation coupled with active contouring.

    Science.gov (United States)

    Huynh, Hieu Trung; Le-Trong, Ngoc; Bao, Pham The; Oto, Aytek; Suzuki, Kenji

    2017-02-01

    Our purpose is to develop a fully automated scheme for liver volume measurement in abdominal MR images, without requiring any user input or interaction. The proposed scheme is fully automatic for liver volumetry from 3D abdominal MR images, and it consists of three main stages: preprocessing, rough liver shape generation, and liver extraction. The preprocessing stage reduced noise and enhanced the liver boundaries in 3D abdominal MR images. The rough liver shape was revealed fully automatically by using the watershed segmentation, thresholding transform, morphological operations, and statistical properties of the liver. An active contour model was applied to refine the rough liver shape to precisely obtain the liver boundaries. The liver volumes calculated by the proposed scheme were compared to the "gold standard" references which were estimated by an expert abdominal radiologist. The liver volumes computed by using our developed scheme excellently agreed (Intra-class correlation coefficient was 0.94) with the "gold standard" manual volumes by the radiologist in the evaluation with 27 cases from multiple medical centers. The running time was 8.4 min per case on average. We developed a fully automated liver volumetry scheme in MR, which does not require any interaction by users. It was evaluated with cases from multiple medical centers. The liver volumetry performance of our developed system was comparable to that of the gold standard manual volumetry, and it saved radiologists' time for manual liver volumetry of 24.7 min per case.
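
    Two quantities central to the evaluation above, the volume derived from a binary liver mask and the intra-class correlation between automated and manual volumes, can be sketched as follows. This is an illustrative sketch, not the published implementation; the helper names are hypothetical and one common ICC variant, ICC(2,1), is assumed since the abstract does not specify which was used.

        # Sketch under assumptions (not the study's code): volume from a binary
        # mask, and an ICC(2,1) agreement estimate between two sets of volumes.
        import numpy as np

        def mask_volume_ml(mask, voxel_spacing_mm):
            """Volume of a binary 3D mask in millilitres (1 mL = 1000 mm^3)."""
            voxel_mm3 = float(np.prod(voxel_spacing_mm))
            return mask.astype(bool).sum() * voxel_mm3 / 1000.0

        def icc_2_1(x):
            """Two-way random, single-measure ICC(2,1) for an (n_subjects, n_raters) array."""
            x = np.asarray(x, dtype=float)
            n, k = x.shape
            grand = x.mean()
            msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
            msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between raters
            sse = ((x - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
            mse = sse / ((n - 1) * (k - 1))
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Hypothetical paired volumes (mL): column 0 automated, column 1 manual.
        vols = np.array([[1510., 1535.], [1720., 1698.], [1280., 1302.], [1655., 1640.]])
        print(round(icc_2_1(vols), 3))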

  1. TreeRipper web application: towards a fully automated optical tree recognition software

    Directory of Open Access Journals (Sweden)

    Hughes Joseph

    2011-05-01

    Full Text Available Abstract Background Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then determined to detect the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Conclusions Despite the diversity of ways in which phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.
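
    TreeRipper itself is a C++ pipeline, but the OCR step it describes (converting cropped tip-label regions to text with tesseract) can be illustrated in Python using the pytesseract wrapper and Pillow. This is a sketch under assumptions: the function, file name and crop box are hypothetical, and pytesseract is only a stand-in for the tesseract-ocr call made by the original program.

        # Sketch only: OCR of a tip-label region via the tesseract engine,
        # using the pytesseract wrapper rather than TreeRipper's C++ code.
        from PIL import Image, ImageOps
        import pytesseract

        def read_tip_label(image_path, box=None):
            """OCR a (cropped) tip-label region; box is an optional (left, top, right, bottom) crop."""
            img = Image.open(image_path).convert("L")   # greyscale
            if box is not None:
                img = img.crop(box)
            img = ImageOps.autocontrast(img)            # simple cleanup before OCR
            return pytesseract.image_to_string(img).strip()

        # Hypothetical usage on one extracted label region of a scanned tree figure.
        # print(read_tip_label("tree_scan.png", box=(412, 88, 590, 110)))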

  2. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  3. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)

    2013-02-15

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparing with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and measured ex-vivo liver volume, which was converted from weight, using the analysis of variance test and Pearson's or Spearman correlation test. Processing time for both automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between the automated volume and manual volume of the total liver (p = 0.011). There were good correlations between the automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver). Both correlation coefficients were higher than those of the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.
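
    The comparison against the ex-vivo reference described above boils down to simple correlation statistics. A small illustrative sketch, not the study's analysis code, is given below; all volume values are hypothetical and scipy is used for the Pearson and Spearman coefficients.

        # Illustrative sketch: correlating automated and manual CT volumes against
        # ex-vivo reference volumes with Pearson and Spearman coefficients.
        import numpy as np
        from scipy import stats

        ex_vivo = np.array([812.0, 745.0, 901.0, 688.0, 770.0])    # mL, weight-converted reference
        automated = np.array([830.0, 760.0, 887.0, 702.0, 781.0])  # mL, fully automated volumetry
        manual = np.array([845.0, 771.0, 915.0, 720.0, 790.0])     # mL, interactive manual volumetry

        for name, vols in (("automated", automated), ("manual", manual)):
            r, _ = stats.pearsonr(vols, ex_vivo)
            rho, _ = stats.spearmanr(vols, ex_vivo)
            print(f"{name}: Pearson r = {r:.3f}, Spearman rho = {rho:.3f}")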

  4. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    International Nuclear Information System (INIS)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn

    2013-01-01

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparing with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and measured ex-vivo liver volume, which was converted from weight, using the analysis of variance test and Pearson's or Spearman correlation test. Processing time for both automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between the automated volume and manual volume of the total liver (p = 0.011). There were good correlations between the automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver). Both correlation coefficients were higher than those of the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  5. Development of a Fully-Automated Monte Carlo Burnup Code Monteburns

    International Nuclear Information System (INIS)

    Poston, D.I.; Trellue, H.R.

    1999-01-01

    Several computer codes have been developed to perform nuclear burnup calculations over the past few decades. In addition, because of advances in computer technology, it has recently become more desirable to use Monte Carlo techniques for such problems. Monte Carlo techniques generally offer two distinct advantages over discrete ordinate methods: (1) the use of continuous energy cross sections and (2) the ability to model detailed, complex, three-dimensional (3-D) geometries. These advantages allow more accurate burnup results to be obtained, provided that the user possesses the required computing power (which is required for discrete ordinate methods as well). Several linkage codes have been written that combine a Monte Carlo N-particle transport code (such as MCNP) with a radioactive decay and burnup code. This paper describes one such code that was written at Los Alamos National Laboratory: monteburns. Monteburns links MCNP with the isotope generation and depletion code ORIGEN2. The basis for the development of monteburns was the need for a fully automated code that could perform accurate burnup (and other) calculations for any 3-D system (accelerator-driven or a full reactor core). Before the initial development of monteburns, a list of desired attributes was drawn up: the code should be fully automated (that is, after the input is set up, no further user interaction is required); it should allow for the irradiation of several materials concurrently (each material is evaluated collectively in MCNP and burned separately in ORIGEN2); it should allow the transfer of materials (shuffling) between regions in MCNP; it should allow any materials to be added or removed before, during, or after each step in an automated fashion; it should not require the user to provide input for ORIGEN2 and should have minimal MCNP input file requirements (other than a working MCNP deck); and it should be relatively easy to use.
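
    The alternation between a transport calculation and per-material depletion that such a linkage code automates can be sketched structurally in Python. The functions below are placeholders standing in for calls to MCNP and ORIGEN2; they are not real interfaces to either code, and the material names, time steps and return values are hypothetical.

        # Structural sketch only: the transport/depletion loop a linkage code drives.
        def run_transport(materials):
            """Placeholder for an MCNP run: returns per-material fluxes/reaction rates."""
            return {name: 1.0 for name in materials}             # dummy flux of 1.0

        def run_depletion(material, flux, days):
            """Placeholder for an ORIGEN2 burn of one material over one time step."""
            return f"{material}(burned {days} d at flux {flux})"  # dummy updated composition

        def burnup_campaign(materials, step_days, n_steps, shuffle=None):
            """Alternate transport and per-material depletion; optionally shuffle materials."""
            for step in range(n_steps):
                fluxes = run_transport(materials)                 # evaluate all materials together
                materials = [run_depletion(m, fluxes[m], step_days)  # burn each one separately
                             for m in materials]
                if shuffle is not None:
                    materials = shuffle(materials, step)          # e.g. move fuel between regions
            return materials

        print(burnup_campaign(["fuel_inner", "fuel_outer"], step_days=50, n_steps=2))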

  6. The future of fully automated vehicles : opportunities for vehicle- and ride-sharing, with cost and emissions savings.

    Science.gov (United States)

    2014-08-01

    Fully automated or autonomous vehicles (AVs) hold great promise for the future of transportation. By 2020, Google, auto manufacturers and other technology providers intend to introduce self-driving cars to the public with either limited or fully a...

  7. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operations. Data processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)

  8. Automated protein structure calculation from NMR data

    International Nuclear Information System (INIS)

    Williamson, Mike P.; Craven, C. Jeremy

    2009-01-01

    Current software is almost at the stage to permit completely automatic structure determination of small proteins of <15 kDa, from NMR spectra to structure validation with minimal user interaction. This goal is welcome, as it makes structure calculation more objective and therefore more easily validated, without any loss in the quality of the structures generated. Moreover, it releases expert spectroscopists to carry out research that cannot be automated. It should not take much further effort to extend automation to ca 20 kDa. However, there are technological barriers to further automation, of which the biggest are identified as: routines for peak picking; adoption and sharing of a common framework for structure calculation, including the assembly of an automated and trusted package for structure validation; and sample preparation, particularly for larger proteins. These barriers should be the main target for development of methodology for protein structure determination, particularly by structural genomics consortia

  9. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  10. Automation of C-terminal sequence analysis of 2D-PAGE separated proteins

    Directory of Open Access Journals (Sweden)

    P.P. Moerman

    2014-06-01

    Full Text Available Experimental assignment of the protein termini remains essential to define the functional protein structure. Here, we report on the improvement of a proteomic C-terminal sequence analysis method. The approach aims to discriminate the C-terminal peptide in a CNBr digest, where cleavage of Met-Xxx peptide bonds leaves internal peptides ending in a homoserine lactone (hsl) derivative. pH-dependent partial opening of the lactone ring results in the formation of doublets for all internal peptides. C-terminal peptides are distinguished as singlet peaks by MALDI-TOF MS, and MS/MS is then used for their identification. We present a fully automated protocol established on a robotic liquid-handling station.
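
    The singlet-versus-doublet logic described above lends itself to a small sketch: internal peptides appear as mass doublets because opening of the lactone ring adds one water (about 18.011 Da), while the C-terminal peptide has no such partner. This is not the published workflow; the mass tolerance, helper name and peak list below are hypothetical.

        # Sketch: flag MALDI peaks with no +/- one-water partner as C-terminal candidates.
        WATER_DA = 18.0106  # assumed mass difference between lactone and ring-opened forms

        def singlet_peaks(mz_values, tol=0.05):
            """Return peaks with no partner offset by one water mass (candidate C-terminal peptides)."""
            mz = sorted(mz_values)
            def has_partner(m):
                return any(abs(abs(m - other) - WATER_DA) <= tol for other in mz if other != m)
            return [m for m in mz if not has_partner(m)]

        peaks = [1024.52, 1042.53, 1388.70, 1406.71, 1650.84]   # two doublets + one singlet
        print(singlet_peaks(peaks))                             # -> [1650.84]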

  11. Automated multi-dimensional purification of tagged proteins.

    Science.gov (United States)

    Sigrell, Jill A; Eklund, Pär; Galin, Markus; Hedkvist, Lotta; Liljedahl, Pia; Johansson, Christine Markeland; Pless, Thomas; Torstenson, Karin

    2003-01-01

    The capacity for high throughput purification (HTP) is essential in fields such as structural genomics where large numbers of protein samples are routinely characterized in, for example, studies of structural determination, functionality and drug development. Proteins required for such analysis must be pure and homogenous and available in relatively large amounts. AKTA 3D system is a powerful automated protein purification system, which minimizes preparation, run-time and repetitive manual tasks. It has the capacity to purify up to 6 different His6- or GST-tagged proteins per day and can produce 1-50 mg protein per run at >90% purity. The success of automated protein purification increases with careful experimental planning. Protocol, columns and buffers need to be chosen with the final application area for the purified protein in mind.

  12. PASA - A Program for Automated Protein NMR Backbone Signal Assignment by Pattern-Filtering Approach

    International Nuclear Information System (INIS)

    Xu Yizhuang; Wang Xiaoxia; Yang Jun; Vaynberg, Julia; Qin Jun

    2006-01-01

    We present a new program, PASA (Program for Automated Sequential Assignment), for assigning protein backbone resonances based on multidimensional heteronuclear NMR data. Distinct from existing programs, PASA emphasizes a per-residue-based pattern-filtering approach during the initial stage of the automated ¹³Cα and/or ¹³Cβ chemical shift matching. The pattern filter employs one or multiple constraints, such as ¹³Cα/¹³Cβ chemical shift ranges for different amino acid types and side-chain spin systems, which helps to rule out, in a stepwise fashion, improbable assignments resulting from resonance degeneracy or missing signals. Such a stepwise filtering approach substantially minimizes early false linkage problems that often propagate, amplify, and ultimately cause complication or combinatorial explosion of the automation process. Our program (http://www.lerner.ccf.org/moleccard/qin/) was tested on four representative small- to large-sized proteins with various degrees of resonance degeneracy and missing signals, and we show that PASA efficiently and rapidly achieved assignments that are fully consistent with those obtained by laborious manual protocols. The results demonstrate that PASA may be a valuable tool for NMR-based structural analyses, genomics, and proteomics.
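
    The kind of per-residue pattern filter described above can be illustrated with a short sketch in which candidate amino acid types are kept only when the observed Cα/Cβ shifts fall inside type-specific ranges. The ranges below are rough, illustrative values, not the constraints used by PASA, and the function name is hypothetical.

        # Illustrative chemical-shift pattern filter (ranges are rough, not PASA's).
        SHIFT_RANGES = {                  # residue: ((CA min, CA max), (CB min, CB max) or None)
            "GLY": ((42.0, 48.0), None),
            "ALA": ((49.0, 56.0), (15.0, 24.0)),
            "SER": ((55.0, 62.0), (61.0, 67.0)),
            "THR": ((58.0, 66.0), (66.0, 73.0)),
        }

        def filter_types(ca_shift, cb_shift=None):
            """Return residue types whose ranges are consistent with the observed shifts."""
            kept = []
            for residue, (ca_rng, cb_rng) in SHIFT_RANGES.items():
                if not (ca_rng[0] <= ca_shift <= ca_rng[1]):
                    continue
                if cb_shift is not None:
                    if cb_rng is None or not (cb_rng[0] <= cb_shift <= cb_rng[1]):
                        continue
                kept.append(residue)
            return kept

        print(filter_types(52.3, 18.9))   # consistent with ALA only
        print(filter_types(44.1))         # CB missing: GLY remains a candidate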

  13. Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA

    NARCIS (Netherlands)

    Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald

    2017-01-01

    To study fully automated digital joint space width (JSW) and bone mineral density (BMD) in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the

  14. Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery.

    Science.gov (United States)

    Payre, William; Cestac, Julien; Delhomme, Patricia

    2016-03-01

    An experiment was performed in a driving simulator to investigate the impacts of practice, trust, and interaction on manual control recovery (MCR) when employing fully automated driving (FAD). To increase the use of partially or highly automated driving efficiency and to improve safety, some studies have addressed trust in driving automation and training, but few studies have focused on FAD. FAD is an autonomous system that has full control of a vehicle without any need for intervention by the driver. A total of 69 drivers with a valid license practiced with FAD. They were distributed evenly across two conditions: simple practice and elaborate practice. When examining emergency MCR, a correlation was found between trust and reaction time in the simple practice group (i.e., higher trust meant a longer reaction time), but not in the elaborate practice group. This result indicated that to mitigate the negative impact of overtrust on reaction time, more appropriate practice may be needed. Drivers should be trained in how the automated device works so as to improve MCR performance in case of an emergency. The practice format used in this study could be used for the first interaction with an FAD car when acquiring such a vehicle. © 2015, Human Factors and Ergonomics Society.

  15. Fully automated synthesis system of 3'-deoxy-3'-[18F]fluorothymidine

    International Nuclear Information System (INIS)

    Oh, Seung Jun; Mosdzianowski, Christoph; Chi, Dae Yoon; Kim, Jung Young; Kang, Se Hun; Ryu, Jin Sook; Yeo, Jeong Seok; Moon, Dae Hyuk

    2004-01-01

    We developed a new fully automated method for the synthesis of 3'-deoxy-3'-[18F]fluorothymidine ([18F]FLT) by modifying a commercial FDG synthesizer and its disposable fluid pathway. The optimal labeling condition was that 40 mg of precursor in acetonitrile (2 mL) was heated at 150 °C for 100 sec, followed by heating at 85 °C for 450 sec and hydrolysis with 1 N HCl at 105 °C for 300 sec. Using 3.7 GBq of [18F]F⁻ as starting activity, [18F]FLT was obtained with a yield of 50.5±5.2% (n=28, decay corrected) within 60.0±5.4 min including HPLC purification. With 37.0 GBq, we obtained 48.7±5.6% (n=10). The [18F]FLT showed good stability for 6 h. This new automated synthesis procedure combines high and reproducible yields with the benefits of a disposable cassette system.

  16. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  17. Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA.

    Science.gov (United States)

    Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald

    2017-01-01

    To study fully automated digital joint space width (JSW) and bone mineral density (BMD) in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the SWEdish FarmacOTherapy study. Fully automated JSW measurements of bilateral metacarpals 2, 3 and 4 were compared with the joint space narrowing (JSN) score in SHS. Multilevel mixed model statistics were applied to calculate the significance of the association between ΔJSW and ΔBMD over 1 year, and the JSW differences between damaged and undamaged joints as evaluated by the JSN. Based on 576 joints of 96 patients with eRA, a significant reduction from baseline to 1 year was observed in the JSW, from 1.69 (±0.19) mm to 1.66 (±0.19) mm. The JSW also differed between undamaged and damaged (JSN > 0) joints: 1.68 mm (95% CI 1.70 to 1.67) vs 1.54 mm (95% CI 1.63 to 1.46). Similarly, the unadjusted multilevel model showed significant differences in JSW between undamaged (1.68 mm (95% CI 1.72 to 1.64)) and damaged joints (1.63 mm (95% CI 1.68 to 1.58)) (p=0.0048). This difference remained significant in the adjusted model: 1.66 mm (95% CI 1.70 to 1.61) vs 1.62 mm (95% CI 1.68 to 1.56) (p=0.042). Measuring the JSW with this fully automated digital tool may be useful as a quick and observer-independent application for evaluating cartilage damage in eRA. NCT00764725.

  18. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    Science.gov (United States)

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and, among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  19. A fully automated conversational agent for promoting mental well-being: A pilot RCT using mixed methods

    Directory of Open Access Journals (Sweden)

    Kien Hoa Ly

    2017-12-01

    Full Text Available Fully automated self-help interventions can serve as highly cost-effective mental health promotion tools for massive amounts of people. However, these interventions are often characterised by poor adherence. One way to address this problem is to mimic therapy support by a conversational agent. The objectives of this study were to assess the effectiveness and adherence of a smartphone app, delivering strategies used in positive psychology and CBT interventions via an automated chatbot (Shim) for a non-clinical population — as well as to explore participants' views and experiences of interacting with this chatbot. A total of 28 participants were randomized to either receive the chatbot intervention (n = 14) or to a wait-list control group (n = 14). Findings revealed that participants who adhered to the intervention (n = 13) showed significant interaction effects of group and time on psychological well-being (FS) and perceived stress (PSS-10) compared to the wait-list control group, with small to large between effect sizes (Cohen's d range 0.14–1.06). Also, the participants showed high engagement during the 2-week long intervention, with an average open app ratio of 17.71 times for the whole period. This is higher compared to other studies on fully automated interventions claiming to be highly engaging, such as Woebot and the Panoply app. The qualitative data revealed sub-themes which, to our knowledge, have not been found previously, such as the moderating format of the chatbot. The results of this study, in particular the good adherence rate, validated the usefulness of replicating this study in the future with a larger sample size and an active control group. This is important, as the search for fully automated, yet highly engaging and effective digital self-help interventions for promoting mental health is crucial for the public health.

  20. Performance of an Artificial Multi-observer Deep Neural Network for Fully Automated Segmentation of Polycystic Kidneys.

    Science.gov (United States)

    Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J

    2017-08-01

    Deep learning techniques are being rapidly applied to medical imaging tasks-from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solve inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation lies in the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.
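
    Two evaluation ingredients mentioned above, combining several "observer" outputs and reporting a percent volume difference against the reference segmentation, can be illustrated with a minimal sketch. It is not the authors' network or pipeline; the toy binary masks and the voxel volume are hypothetical.

        # Minimal sketch: majority-vote ensemble of masks and percent volume difference.
        import numpy as np

        def majority_vote(masks):
            """Voxel-wise majority vote over binary masks of identical shape."""
            stack = np.stack([np.asarray(m, dtype=bool) for m in masks])
            return stack.mean(axis=0) >= 0.5

        def percent_volume_difference(mask, reference, voxel_mm3):
            """Signed percent difference in segmented volume relative to the reference."""
            vol = mask.sum() * voxel_mm3
            ref = reference.sum() * voxel_mm3
            return 100.0 * (vol - ref) / ref

        # Toy binary masks standing in for single-slice kidney segmentations.
        obs = [np.array([[1, 1], [0, 1]]),
               np.array([[1, 0], [0, 1]]),
               np.array([[1, 1], [1, 1]])]
        ref = np.array([[1, 1], [0, 1]], dtype=bool)
        ens = majority_vote(obs)
        print(percent_volume_difference(ens, ref, voxel_mm3=1.5))   # 0.0 for this toy case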

  1. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use

    DEFF Research Database (Denmark)

    Arnaud, Nicolas; Baldus, Christiane; Elgán, Tobias H.

    2016-01-01

    …of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective: This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use…, and polydrug use. All outcome analyses were conducted with and without Expectation Maximization (EM) imputation of missing follow-up data. Results: In total, 2673 adolescents were screened and 1449 (54.2%) participants were randomized to the intervention or control group. After 3 months, 211 adolescents (14… Conclusions: Although the study is limited by a large drop-out, significant between-group effects for alcohol use indicate that targeted brief motivational intervention in a fully automated Web-based format can be effective to reduce drinking and lessen existing substance use service barriers for at…

  2. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use

    DEFF Research Database (Denmark)

    Arnaud, Nicolas; Baldus, Christiane; Elgán, Tobias H.

    2016-01-01

    …of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective: This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use… methods and screened online for at-risk substance use using the CRAFFT (Car, Relax, Alone, Forget, Friends, Trouble) screening instrument. Participants were randomized to a single session brief motivational intervention group or an assessment-only control group but not blinded. Primary outcome… Conclusions: Although the study is limited by a large drop-out, significant between-group effects for alcohol use indicate that targeted brief motivational intervention in a fully automated Web-based format can be effective to reduce drinking and lessen existing substance use service barriers for at…

  3. Fully Automated Trimethylsilyl (TMS Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Full Text Available Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers and we also evaluate a commercial software, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  4. Fully automated bone mineral density assessment from low-dose chest CT

    Science.gov (United States)

    Liu, Shuang; Gonzalez, Jessica; Zulueta, Javier; de-Torres, Juan P.; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    A fully automated system is presented for bone mineral density (BMD) assessment from low-dose chest CT (LDCT). BMD assessment is central in the diagnosis and follow-up therapy monitoring of osteoporosis, which is characterized by low bone density and is estimated to affect 12.3 million people in the US aged 50 years or older, creating tremendous social and economic burdens. BMD assessment from DXA scans (BMDDXA) is currently the most widely used and gold standard technique for the diagnosis of osteoporosis and bone fracture risk estimation. With the recent large-scale implementation of annual lung cancer screening using LDCT, great potential emerges for concurrent opportunistic osteoporosis screening. In the presented BMDCT assessment system, each vertebral body is first segmented and labeled with its anatomical name. Various 3D regions of interest (ROIs) inside the vertebral body are then explored for BMDCT measurements at different vertebral levels. The system was validated using 76 pairs of DXA and LDCT scans of the same subject. Average BMDDXA of L1-L4 was used as the reference standard. Statistically significant correlations were obtained between BMDDXA and BMDCT at all vertebral levels (T1 - L2). A Pearson correlation of 0.857 was achieved between BMDDXA and average BMDCT of T9-T11 by using a 3D ROI taking into account both trabecular and cortical bone tissue. These encouraging results demonstrate the feasibility of fully automated quantitative BMD assessment and the potential of opportunistic osteoporosis screening with concurrent lung cancer screening using LDCT.
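
    The reported Pearson analysis, averaging CT-based BMD over T9-T11 and correlating it with DXA-derived BMD, can be sketched as follows. This is not the study code; all numerical values below are hypothetical and scipy supplies the correlation.

        # Sketch: correlate mean CT BMD over three vertebral levels with DXA BMD.
        import numpy as np
        from scipy import stats

        # Rows = subjects; columns = per-vertebra CT BMD estimates (mg/cm^3) for T9, T10, T11.
        bmd_ct_levels = np.array([
            [165.0, 160.0, 158.0],
            [120.0, 118.0, 121.0],
            [ 98.0, 101.0,  97.0],
            [142.0, 139.0, 145.0],
        ])
        bmd_dxa = np.array([1.02, 0.86, 0.74, 0.95])   # g/cm^2, average of L1-L4 on DXA

        bmd_ct_mean = bmd_ct_levels.mean(axis=1)
        r, p = stats.pearsonr(bmd_ct_mean, bmd_dxa)
        print(f"Pearson r = {r:.3f} (p = {p:.3g})")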

  5. Fully automated gynecomastia quantification from low-dose chest CT

    Science.gov (United States)

    Liu, Shuang; Sonnenblick, Emily B.; Azour, Lea; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    Gynecomastia is characterized by the enlargement of male breasts, a common and sometimes distressing condition found in over half of adult men over the age of 44. Although the majority of gynecomastia is physiologic or idiopathic, its occurrence may also be associated with an extensive variety of underlying systemic diseases or drug toxicity. With the recent large-scale implementation of annual lung cancer screening using low-dose chest CT (LDCT), gynecomastia is believed to be a frequent incidental finding on LDCT. A fully automated system for gynecomastia quantification from LDCT is presented in this paper. The whole breast region is first segmented using an anatomy-orientated approach based on the propagation of pectoral muscle fronts in the vertical direction. The subareolar region is then localized, and the fibroglandular tissue within it is measured for the assessment of gynecomastia. The presented system was validated using 454 breast regions from non-contrast LDCT scans of 227 adult men. The ground truth was established by an experienced radiologist by classifying each breast into one of the five categorical scores. The automated measurements have been demonstrated to achieve promising performance for gynecomastia diagnosis, with an AUC of 0.86 for the ROC curve and a statistically significant Spearman correlation (r = 0.70) with the reference categorical scores, which may support early detection as well as the treatment of both gynecomastia and the underlying medical problems, if any, that cause gynecomastia.

  6. Implementation of a fully automated process purge-and-trap gas chromatograph at an environmental remediation site

    International Nuclear Information System (INIS)

    Blair, D.S.; Morrison, D.J.

    1997-01-01

    The AQUASCAN, a commercially available, fully automated purge-and-trap gas chromatograph from Sentex Systems Inc., was implemented and evaluated as an in-field, automated monitoring system of contaminated groundwater at an active DOE remediation site in Pinellas, FL. Though the AQUASCAN is designed as a stand alone process analytical unit, implementation at this site required additional hardware. The hardware included a sample dilution system and a method for delivering standard solution to the gas chromatograph for automated calibration. As a result of the evaluation the system was determined to be a reliable and accurate instrument. The AQUASCAN reported concentration values for methylene chloride, trichloroethylene, and toluene in the Pinellas ground water were within 20% of reference laboratory values

  7. UBO Detector - A cluster-based, fully automated pipeline for extracting white matter hyperintensities.

    Science.gov (United States)

    Jiang, Jiyang; Liu, Tao; Zhu, Wanlin; Koncz, Rebecca; Liu, Hao; Lee, Teresa; Sachdev, Perminder S; Wen, Wei

    2018-07-01

    We present 'UBO Detector', a cluster-based, fully automated pipeline for extracting and calculating variables for regions of white matter hyperintensities (WMH) (available for download at https://cheba.unsw.edu.au/group/neuroimaging-pipeline). It takes T1-weighted and fluid attenuated inversion recovery (FLAIR) scans as input, and SPM12 and FSL functions are utilised for pre-processing. The candidate clusters are then generated by FMRIB's Automated Segmentation Tool (FAST). A supervised machine learning algorithm, k-nearest neighbor (k-NN), is applied to determine whether the candidate clusters are WMH or non-WMH. UBO Detector generates both image and text (volumes and the number of WMH clusters) outputs for whole brain, periventricular, deep, and lobar WMH, as well as WMH in arterial territories. The computation time for each brain is approximately 15 min. We validated the performance of UBO Detector by showing a) high segmentation (similarity index (SI) = 0.848) and volumetric (intraclass correlation coefficient (ICC) = 0.985) agreement between the UBO Detector-derived and manually traced WMH; b) highly correlated (r² > 0.9) and steadily increasing WMH volumes over time; and c) significant associations of periventricular (t = 22.591, p < 0.001) and deep (t = 14.523, p < 0.001) WMH volumes generated by UBO Detector with Fazekas rating scores. With parallel computing enabled in UBO Detector, the processing can take advantage of multi-core CPUs that are commonly available on workstations. In conclusion, UBO Detector is a reliable, efficient and fully automated WMH segmentation pipeline. Copyright © 2018 Elsevier Inc. All rights reserved.
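
    Two steps named above, the similarity index used to compare automated and manual WMH masks and the k-NN labelling of candidate clusters, can be illustrated with a short sketch. It is not the UBO Detector source; the Dice-style similarity index is one common definition of SI, and the cluster features, labels and query below are hypothetical.

        # Illustrative sketch: Dice-style similarity index and a toy k-NN cluster classifier.
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        def similarity_index(a, b):
            """Dice/similarity index between two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        auto = np.array([[1, 1, 0], [0, 1, 0]])
        manual = np.array([[1, 1, 0], [0, 1, 1]])
        print(round(similarity_index(auto, manual), 3))          # 0.857 for this toy pair

        # Toy cluster features: [mean FLAIR intensity (z-score), cluster size (voxels)].
        train_x = [[2.5, 40], [3.1, 120], [0.4, 15], [0.2, 300]]
        train_y = [1, 1, 0, 0]                                   # 1 = WMH, 0 = non-WMH
        knn = KNeighborsClassifier(n_neighbors=3).fit(train_x, train_y)
        print(knn.predict([[2.8, 60]]))                          # -> [1] on this toy data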

  8. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Science.gov (United States)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  9. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    Science.gov (United States)

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Fully automated atlas-based hippocampal volumetry for detection of Alzheimer's disease in a memory clinic setting.

    Science.gov (United States)

    Suppa, Per; Anker, Ulrich; Spies, Lothar; Bopp, Irene; Rüegger-Frey, Brigitte; Klaghofer, Richard; Gocke, Carola; Hampel, Harald; Beck, Sacha; Buchert, Ralph

    2015-01-01

    Hippocampal volume is a promising biomarker to enhance the accuracy of the diagnosis of dementia due to Alzheimer's disease (AD). However, whereas hippocampal volume is well studied in patient samples from clinical trials, its value in clinical routine patient care is still rather unclear. The aim of the present study, therefore, was to evaluate fully automated atlas-based hippocampal volumetry for detection of AD in the setting of a secondary care expert memory clinic for outpatients. One-hundred consecutive patients with memory complaints were clinically evaluated and categorized into three diagnostic groups: AD, intermediate AD, and non-AD. A software tool based on open source software (Statistical Parametric Mapping SPM8) was employed for fully automated tissue segmentation and stereotactical normalization of high-resolution three-dimensional T1-weighted magnetic resonance images. Predefined standard masks were used for computation of grey matter volume of the left and right hippocampus which then was scaled to the patient's total grey matter volume. The right hippocampal volume provided an area under the receiver operating characteristic curve of 84% for detection of AD patients in the whole sample. This indicates that fully automated MR-based hippocampal volumetry fulfills the requirements for a relevant core feasible biomarker for detection of AD in everyday patient care in a secondary care memory clinic for outpatients. The software used in the present study has been made freely available as an SPM8 toolbox. It is robust and fast so that it is easily integrated into routine workflow.
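
    The scaling and ROC evaluation described above can be sketched briefly. This is a minimal illustration, not the SPM8 toolbox itself; the volumes, total grey-matter values and diagnostic labels are made up, and scikit-learn is used only to compute the area under the ROC curve.

        # Minimal sketch: scale hippocampal GM volume by total GM volume, then ROC AUC.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        hippo_gm = np.array([2.1, 2.4, 3.1, 3.4, 2.0, 3.3])        # right hippocampal GM volume (mL)
        total_gm = np.array([540., 600., 620., 640., 520., 610.])  # total GM volume (mL)
        is_ad = np.array([1, 1, 0, 0, 1, 0])                       # 1 = clinically categorized AD

        scaled = hippo_gm / total_gm                               # dimensionless, as in the pipeline
        # Smaller scaled volumes indicate atrophy, so use the negative as the "AD score".
        print(round(roc_auc_score(is_ad, -scaled), 2))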

  11. Development of a fully automated software system for rapid analysis/processing of the falling weight deflectometer data.

    Science.gov (United States)

    2009-02-01

    The Office of Special Investigations at Iowa Department of Transportation (DOT) collects FWD data on regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully-automated software system for ra...

  12. Fully automated gamma spectrometry gauge observing possible radioactive contamination of melting-shop samples

    International Nuclear Information System (INIS)

    Kroos, J.; Westkaemper, G.; Stein, J.

    1999-01-01

    At Salzgitter AG, several monitoring systems have been installed to check the scrap transport by rail and by car. At the moment, the scrap transport by ship is reloaded onto wagons for monitoring afterwards. In the future, a detection system will be mounted onto a crane for a direct check on scrap upon the departure of ship. Furthermore, at Salzgitter AG Central Chemical Laboratory, a fully automated gamma spectrometry gauge is installed in order to observe a possible radioactive contamination of the products. The gamma spectrometer is integrated into the automated OE spectrometry line for testing melting shop samples after performing the OE spectrometry. With this technique the specific activity of selected nuclides and dose rate will be determined. The activity observation is part of the release procedure. The corresponding measurement data are stored in a database for quality management reasons. (author)

  13. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
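
    The dilution-range logic and the worklist-to-CSV transformation described above can be sketched in a few lines. This is not the actual RSPP (which is an Excel and Freedom EVO workflow); the column names, sample identifiers and dilution factors below are hypothetical.

        # Sketch: map dilution factors to the three ranges and write a simple CSV worklist.
        import csv

        def dilution_step(factor):
            """Classify a dilution factor into the 1-10, 11-100 or 101-1000 fold range."""
            if not 1 <= factor <= 1000:
                raise ValueError("dilution factor outside the supported 1-1000x range")
            if factor <= 10:
                return "step1 (1-10x)"
            if factor <= 100:
                return "step2 (11-100x)"
            return "step3 (101-1000x)"

        samples = [("S001", 1, 5), ("S002", 2, 80), ("S003", 3, 400)]   # (id, sequence, dilution factor)

        with open("worklist.csv", "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow(["sample_id", "sequence", "dilution_factor", "dilution_step"])
            for sample_id, seq, factor in samples:
                writer.writerow([sample_id, seq, factor, dilution_step(factor)])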

  14. A new fully automated FTIR system for total column measurements of greenhouse gases

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  15. Fully automated system for Pu measurement by gamma spectrometry of alpha contaminated solid wastes

    International Nuclear Information System (INIS)

    Cresti, P.

    1986-01-01

    A description is given of a fully automated system developed at Comb/Mepis Laboratories, which is based on the detection of specific gamma signatures of Pu isotopes for monitoring Pu content in 15-25 l containers of low-density (0.1 g/cm³) wastes. The methodological approach is discussed; based on experimental data, an evaluation of the achievable performances (detection limit, precision, accuracy, etc.) is also given.

  16. The development of a fully automated radioimmunoassay instrument - micromedic systems concept 4

    International Nuclear Information System (INIS)

    Painter, K.

    1977-01-01

    The fully automatic RIA system Concept 4 by Micromedic is described in detail. The system uses antibody-coated test tubes to take up the samples. It has a maximum capacity of 200 tubes including standards and control tubes. Its advantages are, in particular, high flow rate, reproducibility, and fully automatic testing, i.e. low personnel requirements. Its disadvantages are difficulties in protein assays. (ORU)

  17. Fully automated parallel oligonucleotide synthesizer

    Czech Academy of Sciences Publication Activity Database

    Lebl, M.; Burger, Ch.; Ellman, B.; Heiner, D.; Ibrahim, G.; Jones, A.; Nibbe, M.; Thompson, J.; Mudra, Petr; Pokorný, Vít; Poncar, Pavel; Ženíšek, Karel

    2001-01-01

    Roč. 66, č. 8 (2001), s. 1299-1314 ISSN 0010-0765 Institutional research plan: CEZ:AV0Z4055905 Keywords : automated oligonucleotide synthesizer Subject RIV: CC - Organic Chemistry Impact factor: 0.778, year: 2001

  18. Microscope image based fully automated stomata detection and pore measurement method for grapevines

    Directory of Open Access Journals (Sweden)

    Hiranya Jayakody

    2017-11-01

    Full Text Available Abstract Background Stomatal behavior in grapevines has been identified as a good indicator of the water stress level and overall health of the plant. Microscope images are often used to analyze stomatal behavior in plants. However, most of the current approaches involve manual measurement of stomatal features. The main aim of this research is to develop a fully automated stomata detection and pore measurement method for grapevines, taking microscope images as the input. The proposed approach, which employs machine learning and image processing techniques, can outperform available manual and semi-automatic methods used to identify and estimate stomatal morphological features. Results First, a cascade object detection learning algorithm is developed to correctly identify multiple stomata in a large microscopic image. Once the regions of interest which contain stomata are identified and extracted, a combination of image processing techniques is applied to estimate the pore dimensions of the stomata. The stomata detection approach was compared with an existing fully automated template matching technique and a semi-automatic maximum stable extremal regions approach, with the proposed method clearly surpassing the performance of the existing techniques with a precision of 91.68% and an F1-score of 0.85. Next, the morphological features of the detected stomata were measured. Contrary to existing approaches, the proposed image segmentation and skeletonization method allows us to estimate the pore dimensions even in cases where the stomatal pore boundary is only partially visible in the microscope image. A test conducted using 1267 images of stomata showed that the segmentation and skeletonization approach was able to correctly identify the stoma opening 86.27% of the time. Further comparisons made with manually traced stoma openings indicated that the proposed method is able to estimate stomata morphological features with accuracies of 89.03% for area
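
    A minimal sketch of the pore-measurement step, assuming a single already-detected stoma region and using generic scikit-image routines rather than the authors' pipeline:

```python
# Minimal sketch (not the authors' implementation): estimate pore length and
# area for one detected stoma region using thresholding and skeletonization.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops
from skimage.morphology import skeletonize

def measure_pore(roi_gray):
    """roi_gray: 2D grayscale crop around one stoma (pore darker than guard cells)."""
    binary = roi_gray < threshold_otsu(roi_gray)          # dark pore -> True
    labels = label(binary)
    regions = regionprops(labels)
    pore = max(regions, key=lambda r: r.area)             # keep largest blob
    skeleton = skeletonize(labels == pore.label)          # 1-pixel-wide midline
    return {
        "area_px": pore.area,
        "length_px": int(skeleton.sum()),                 # midline length proxy
        "width_px": pore.minor_axis_length,
    }

roi = np.random.rand(64, 64)          # placeholder image for demonstration
print(measure_pore(roi))
```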

  19. Fully automated synthesis of [18F]fluoro-dihydrotestosterone ([18F]FDHT) using the FlexLab module.

    Science.gov (United States)

    Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M

    2016-08-01

    Imaging of androgen receptor expression in prostate cancer using F-18 FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [18F]FDHT is important. We have fully automated the synthesis of F-18 FDHT using the iPhase FlexLab module using only commercially available components. Total synthesis time was 90 min, radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99% and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, thus making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has now been validated at Austin Health and is currently used for [18F]FDHT studies in patients. We believe that this method can easily be adapted to other modules to further widen the availability of [18F]FDHT. Copyright © 2016 John Wiley & Sons, Ltd.

  20. A two-dimensionally coincident second difference cosmic ray spike removal method for the fully automated processing of Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Turner, Robin F B

    2014-01-01

    Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
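
    The core idea can be sketched as follows; the threshold multiple k and the median replacement are assumptions for illustration, not the published parameters.

```python
# Sketch of the core idea (parameters are assumptions, not the published ones):
# flag points whose negative second differences are large along BOTH axes of a
# 2D (spatiotemporal x spectral) dataset, then replace them by local medians.
import numpy as np

def remove_spikes(data, noise_sd, k=8.0):
    """data: 2D array, rows = spectra acquired over time/space, cols = wavenumbers."""
    d2_spectral = np.zeros_like(data)
    d2_temporal = np.zeros_like(data)
    d2_spectral[:, 1:-1] = data[:, :-2] - 2 * data[:, 1:-1] + data[:, 2:]
    d2_temporal[1:-1, :] = data[:-2, :] - 2 * data[1:-1, :] + data[2:, :]
    # a positive-going spike produces a strongly negative second difference at its apex
    spikes = (d2_spectral < -k * noise_sd) & (d2_temporal < -k * noise_sd)
    cleaned = data.copy()
    rows, cols = np.where(spikes)
    for r, c in zip(rows, cols):
        window = data[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        cleaned[r, c] = np.median(window)   # replace spike by its local neighbourhood
    return cleaned, spikes
```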

  1. Association between fully automated MRI-based volumetry of different brain regions and neuropsychological test performance in patients with amnestic mild cognitive impairment and Alzheimer's disease.

    Science.gov (United States)

    Arlt, Sönke; Buchert, Ralph; Spies, Lothar; Eichenlaub, Martin; Lehmbeck, Jan T; Jahn, Holger

    2013-06-01

    Fully automated magnetic resonance imaging (MRI)-based volumetry may serve as biomarker for the diagnosis in patients with mild cognitive impairment (MCI) or dementia. We aimed at investigating the relation between fully automated MRI-based volumetric measures and neuropsychological test performance in amnestic MCI and patients with mild dementia due to Alzheimer's disease (AD) in a cross-sectional and longitudinal study. In order to assess a possible prognostic value of fully automated MRI-based volumetry for future cognitive performance, the rate of change of neuropsychological test performance over time was also tested for its correlation with fully automated MRI-based volumetry at baseline. In 50 subjects, 18 with amnestic MCI, 21 with mild AD, and 11 controls, neuropsychological testing and T1-weighted MRI were performed at baseline and at a mean follow-up interval of 2.1 ± 0.5 years (n = 19). Fully automated MRI volumetry of the grey matter volume (GMV) was performed using a combined stereotactic normalisation and segmentation approach as provided by SPM8 and a set of pre-defined binary lobe masks. Left and right hippocampus masks were derived from probabilistic cytoarchitectonic maps. Volumes of the inner and outer liquor space were also determined automatically from the MRI. Pearson's test was used for the correlation analyses. Left hippocampal GMV was significantly correlated with performance in memory tasks, and left temporal GMV was related to performance in language tasks. Bilateral frontal, parietal and occipital GMVs were correlated to performance in neuropsychological tests comprising multiple domains. Rate of GMV change in the left hippocampus was correlated with decline of performance in the Boston Naming Test (BNT), Mini-Mental Status Examination, and trail making test B (TMT-B). The decrease of BNT and TMT-A performance over time correlated with the loss of grey matter in multiple brain regions. We conclude that fully automated MRI

  2. A new fully automated FTIR system for total column measurements of greenhouse gases

    Directory of Open Access Journals (Sweden)

    M. C. Geibel

    2010-10-01

    Full Text Available This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics.

    Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control.

    First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months.

    After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  3. Performance of a fully automated program for measurement of left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Douglass, K.H.; Tibbits, P.; Kasecamp, W.; Han, S.T.; Koller, D.; Links, J.M.; Wagner, H.H. Jr.

    1982-01-01

    A fully automated program developed by us for measurement of left ventricular ejection fraction from equilibrium gated blood pool studies was evaluated in 130 additional patients. Both 6-min (130 studies) and 2-min (142 studies in 31 patients) gated blood pool studies were acquired and processed. The program successfully generated ejection fractions in 86% of the studies. These automatically generated ejection fractions were compared with ejection fractions derived from manually drawn regions of interest. When studies were acquired for 6 min with the patient at rest, the correlation between automated and manual ejection fractions was 0.92. When studies were acquired for 2 min, both at rest and during bicycle exercise, the correlation was 0.81. In 25 studies from patients who also underwent contrast ventriculography, the program successfully generated regions of interest in 22 (88%). The correlation between the ejection fraction determined by contrast ventriculography and the automatically generated radionuclide ejection fraction was 0.79. (orig.)
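
    The abstract does not give the formula; for orientation, the standard count-based ejection fraction from background-corrected end-diastolic and end-systolic counts is shown below (not necessarily the exact formula used by the program).

```python
# Standard count-based ejection fraction from a gated blood pool study
# (shown for orientation only; not necessarily the program's exact formula).
def ejection_fraction(ed_counts, es_counts, background_counts):
    """Counts from the LV region of interest at end-diastole and end-systole,
    plus a periventricular background estimate (all per-frame counts)."""
    ed_net = ed_counts - background_counts
    es_net = es_counts - background_counts
    return (ed_net - es_net) / ed_net

print(f"EF = {ejection_fraction(12500, 6200, 2100):.2f}")   # example numbers
```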

  4. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    Science.gov (United States)

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated, pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) underwent three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07, the mean ICC for AA 0.83±0.05 and for VA 0.84±0.07. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Self-consistent hybrid functionals for solids: a fully-automated implementation

    Science.gov (United States)

    Erba, A.

    2017-08-01

    A fully-automated algorithm for the determination of the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density-functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated proportionally to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89, 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
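
    The self-consistent loop can be summarized schematically as below; dielectric_constant_from_cphf stands in for the expensive SCF plus CPHF/KS step, and the toy model at the end is only for demonstration.

```python
# Schematic of the self-consistent exchange-fraction loop (Skone et al. 2014):
# alpha_(n+1) = 1 / eps_inf(alpha_n), iterated to a fixed point. The callable
# dielectric_constant_from_cphf is a placeholder for the SCF + CPHF/KS step.
def self_consistent_alpha(dielectric_constant_from_cphf, alpha0=0.25,
                          tol=1e-3, max_iter=20):
    alpha = alpha0
    for iteration in range(max_iter):
        eps_inf = dielectric_constant_from_cphf(alpha)   # expensive ab initio step
        alpha_new = 1.0 / eps_inf                        # updated exact-exchange fraction
        if abs(alpha_new - alpha) < tol:
            return alpha_new, iteration + 1
        alpha = alpha_new          # in practice, densities are reused as guesses here
    raise RuntimeError("exchange fraction did not converge")

# Toy model of a dielectric response that depends weakly on alpha:
alpha, n_iter = self_consistent_alpha(lambda a: 4.0 + 2.0 * a)
print(f"alpha = {alpha:.3f} after {n_iter} iterations")
```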

  6. Development and evaluation of fully automated demand response in large facilities

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost of service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal--facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to "opt out" or "override" an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing

  7. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

    In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit from this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle

  8. Fully automated chest wall line segmentation in breast MRI by using context information

    Science.gov (United States)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina

    2012-03-01

    Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis, is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, which is prohibitively impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes context information of breast MR imaging and the breast tissue's morphological characteristics to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method is validated by a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlay percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).
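
    A generic dynamic time warping (DTW) distance, of the kind that could be used to compare a chest-wall-line candidate against the representative curve, is sketched below; this is not the authors' exact voting or filtering implementation.

```python
# Generic dynamic time warping (DTW) distance between a chest-wall-line (CWL)
# candidate and a representative curve, sketching how inferior candidates might
# be scored and filtered out; not the authors' exact algorithm.
import numpy as np

def dtw_distance(curve_a, curve_b):
    """curve_a, curve_b: 1D arrays of CWL y-coordinates sampled along x."""
    n, m = len(curve_a), len(curve_b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(curve_a[i - 1] - curve_b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, m]

representative = np.array([40, 41, 43, 46, 50, 55], dtype=float)
candidate = np.array([40, 42, 44, 47, 52, 56], dtype=float)
print(f"DTW distance = {dtw_distance(representative, candidate):.1f}")
```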

  9. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  10. Detection of virus-specific intrathecally synthesised immunoglobulin G with a fully automated enzyme immunoassay system

    Directory of Open Access Journals (Sweden)

    Weissbrich Benedikt

    2007-05-01

    Full Text Available Abstract Background The determination of virus-specific immunoglobulin G (IgG) antibodies in cerebrospinal fluid (CSF) is useful for the diagnosis of virus-associated diseases of the central nervous system (CNS) and for the detection of a polyspecific intrathecal immune response in patients with multiple sclerosis. Quantification of virus-specific IgG in the CSF is frequently performed by calculation of a virus-specific antibody index (AI). Determination of the AI is a demanding and labour-intensive technique and therefore automation is desirable. We evaluated the precision and the diagnostic value of a fully automated enzyme immunoassay for the detection of virus-specific IgG in serum and CSF using the analyser BEP2000 (Dade Behring). Methods The AI for measles, rubella, varicella-zoster, and herpes simplex virus IgG was determined from pairs of serum and CSF samples of patients with viral CNS infections, multiple sclerosis and of control patients. CSF and serum samples were tested simultaneously with reference to a standard curve. Starting dilutions were 1:6 and 1:36 for CSF and 1:1386 and 1:8316 for serum samples. Results The interassay coefficient of variation was below 10% for all parameters tested. There was good agreement between AIs obtained with the BEP2000 and AIs derived from the semi-automated reference method. Conclusion Determination of virus-specific IgG in serum-CSF-pairs for calculation of AI has been successfully automated on the BEP2000. Current limitations of the assay layout imposed by the analyser software should be solved in future versions to offer more convenience in comparison to manual or semi-automated methods.
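
    The antibody index itself is usually computed with the standard quotient scheme; a minimal sketch (the exact on-analyser implementation may differ) is given below.

```python
# Antibody index (AI) as commonly calculated (Reiber quotient scheme); the exact
# implementation on the BEP2000 may differ. Capping by Qlim is included because
# it is standard practice when the total IgG quotient exceeds the blood-CSF
# barrier-related limit.
def antibody_index(spec_csf, spec_serum, total_igg_csf, total_igg_serum, q_lim=None):
    q_spec = spec_csf / spec_serum              # virus-specific IgG quotient
    q_total = total_igg_csf / total_igg_serum   # total IgG quotient
    if q_lim is not None and q_total > q_lim:
        q_total = q_lim                         # cap if intrathecal IgG synthesis is present
    return q_spec / q_total

# Hypothetical numbers; an AI above roughly 1.5 is usually taken to indicate
# intrathecal virus-specific antibody synthesis.
print(f"AI = {antibody_index(2.4, 300.0, 0.03, 10.0):.1f}")
```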

  11. A fully automated system for ultrasonic power measurement and simulation according to IEC 61161:2006

    International Nuclear Information System (INIS)

    Costa-Felix, Rodrigo P B; Alvarenga, Andre V; Hekkenberg, Rob

    2011-01-01

    The worldwide accepted standard for ultrasonic power measurement is IEC 61161, presently in its 2nd edition (2006) but under review. To fulfil its requirements, given that a radiation force balance is to be used as the ultrasonic power detector, a large amount of raw data (mass measurements) must be collected as a function of time to perform all necessary calculations and corrections. Uncertainty determination demands additional calculations on raw and processed data. Although this can be undertaken in an old-fashioned way, using spreadsheets and manual data collection, automation software is often used in metrology to provide a virtually error-free environment for data acquisition and repetitive calculations and corrections. Considering that, a fully automated ultrasonic power measurement system was developed and comprehensively tested. A precision balance with 0.1 mg resolution, model CP224S (Sartorius, Germany), was used as the measuring device, and a calibrated continuous-wave ultrasound check source (Precision Acoustics, UK) was the device under test. A 150 ml container filled with degassed water and containing an absorbing target at the bottom was placed on the balance pan. Besides the automation features, a power measurement simulation routine was implemented, conceived as a teaching tool showing how ultrasonic power emission behaves when measured with a radiation force balance equipped with an absorbing target. The automation software was considered an effective tool for speeding up ultrasonic power measurement while allowing accurate calculation and attractive graphical partial and final results.
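
    For an absorbing target, the standard relates emitted power to the radiation force measured by the balance as P = F·c = m·g·c; a minimal sketch of that conversion (illustrative values, without the standard's further corrections) follows.

```python
# For an absorbing target, ultrasonic power follows from the radiation force as
# P = F * c = m * g * c (IEC 61161), where c is the speed of sound in the
# degassed water. Numbers are illustrative; the standard also prescribes
# corrections (target geometry, buoyancy, streaming, etc.) not shown here.
G = 9.81  # m/s^2

def ultrasonic_power(mass_reading_mg, speed_of_sound_m_s=1482.0):
    """mass_reading_mg: balance excursion caused by the radiation force, in mg."""
    force_n = (mass_reading_mg * 1e-6) * G          # mg -> kg, then F = m*g
    return force_n * speed_of_sound_m_s             # P = F * c, in watts

# Example: a 14 mg excursion corresponds to roughly 0.2 W
print(f"P = {ultrasonic_power(14.0) * 1e3:.0f} mW")
```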

  12. Fully automated left ventricular contour detection for gated radionuclide angiography, (1)

    International Nuclear Information System (INIS)

    Hosoba, Minoru; Wani, Hidenobu; Hiroe, Michiaki; Kusakabe, Kiyoko.

    1984-01-01

    A fully automated, practical method has been developed to detect the left ventricular (LV) contour from gated blood pool images. Ejection fraction and the volume curve can be computed accurately without operator variance. The characteristics of the method are summarized as follows: 1. Optimal design of a filter operating in the Fourier domain improves the signal-to-noise ratio. 2. A new algorithm using the cosine and sine transform images separates the ventricle from the atrium and defines the center of the LV. 3. Contrast is enhanced by an optimized square filter. 4. Radial profiles are generated from the LV center and smoothed by a fourth-order Fourier series approximation; the crossing point with a local threshold value, searched outward from the LV center, is defined as the edge. 5. The LV contour is obtained by connecting all the edge points defined on the radial profiles and fitting them to a Fourier function. (author)
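
    Step 4 can be sketched as follows (not the original code): a radial count profile is smoothed with a fourth-order Fourier series and the edge is taken as the first crossing of a threshold while moving outward from the LV center.

```python
# Sketch of step 4 (not the original implementation): smooth a radial count
# profile with a fourth-order Fourier series and locate the edge as the first
# crossing of a threshold moving outward from the LV center.
import numpy as np

def fourier_smooth(profile, order=4):
    n = len(profile)
    t = np.arange(n)
    # design matrix: 1, cos(k*2*pi*t/n), sin(k*2*pi*t/n) for k = 1..order
    columns = [np.ones(n)]
    for k in range(1, order + 1):
        columns.append(np.cos(2 * np.pi * k * t / n))
        columns.append(np.sin(2 * np.pi * k * t / n))
    design = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(design, profile, rcond=None)
    return design @ coeffs

def edge_index(profile, threshold):
    smoothed = fourier_smooth(profile)
    below = np.nonzero(smoothed < threshold)[0]     # counts fall off at the LV edge
    return int(below[0]) if below.size else None

radial_profile = np.array([98, 97, 96, 94, 92, 88, 82, 70, 52, 34, 22, 15, 12, 11, 10, 10],
                          dtype=float)
print(edge_index(radial_profile, threshold=50.0))
```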

  13. Computer-aided liver volumetry: performance of a fully-automated, prototype post-processing solution for whole-organ and lobar segmentation based on MDCT imaging.

    Science.gov (United States)

    Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T

    2015-06-01

    To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess accuracy of post-processing applications comparing phantom volumes determined via Archimedes' principle with MDCT segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance compared the manual approach with the automated prototype, assessing intraobserver variability, and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL, manually segmented datasets estimated 1628.0 ± 47.8 mL, representing a mean overestimation of 3.0%, automatically segmented datasets estimated 1601.9 ± 0 mL, representing a mean overestimation of 1.3%. Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, automated measurement repetitions resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies compared to the remaining right hepatic lobe segments. Automated whole-liver segmentation showed non-inferiority compared to manual approaches, with improved reproducibility and post-processing duration; automated dual-seed lobar segmentation showed slight tendencies for underestimating the right hepatic lobe

  14. A fully automated FTIR system for remote sensing of greenhouse gases in the tropics

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-07-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network. It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. First results of total column measurements at Jena, Germany show that the instrument works well and can provide the diurnal as well as the seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  15. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    Treiber, Alexander; Ferrini, Daniele; Huebel, Hannes; Zeilinger, Anton; Poppe, Andreas; Loruenser, Thomas; Querasser, Edwin; Matyus, Thomas; Hentschel, Michael

    2009-01-01

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits s⁻¹. The system is compact and portable with a fully automated start-up, and stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved over 2 weeks an automatic key generation with an average key rate of 2000 bits s⁻¹ without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.

  16. ProteinSplit: splitting of multi-domain proteins using prediction of ordered and disordered regions in protein sequences for virtual structural genomics

    International Nuclear Information System (INIS)

    Wyrwicz, Lucjan S; Koczyk, Grzegorz; Rychlewski, Leszek; Plewczynski, Dariusz

    2007-01-01

    The annotation of protein folds within newly sequenced genomes is the main target for semi-automated protein structure prediction (virtual structural genomics). A large number of automated methods have been developed recently with very good results in the case of single-domain proteins. Unfortunately, most of these automated methods often fail to properly predict the distant homology between a given multi-domain protein query and structural templates. Therefore a multi-domain protein should be split into domains in order to overcome this limitation. ProteinSplit is designed to identify protein domain boundaries using a novel algorithm that predicts disordered regions in protein sequences. The software utilizes various sequence characteristics to assess the local propensity of a protein to be disordered or ordered in terms of local structure stability. These disordered parts of a protein are likely to create interdomain spacers. Because of its speed and portability, the method was successfully applied to several genome-wide fold annotation experiments. The user can run an automated analysis of sets of proteins or perform semi-automated multiple user projects (saving the results on the server). Additionally the sequences of predicted domains can be sent to the Bioinfo.PL Protein Structure Prediction Meta-Server for further protein three-dimensional structure and function prediction. The program is freely accessible as a web service at http://lucjan.bioinfo.pl/proteinsplit together with detailed benchmark results on the critical assessment of a fully automated structure prediction (CAFASP) set of sequences. The source code of the local version of protein domain boundary prediction is available upon request from the authors

  17. Fully automated synthesis of 11C-acetate as tumor PET tracer by simple modified solid-phase extraction purification

    International Nuclear Information System (INIS)

    Tang, Xiaolan; Tang, Ganghua; Nie, Dahong

    2013-01-01

    Introduction: Automated synthesis of 11C-acetate (11C-AC) as the most commonly used radioactive fatty acid tracer is performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Methods: Automated synthesis of 11C-AC was implemented by carboxylation reaction of MeMgBr on a polyethylene Teflon loop ring with 11C-CO2, followed by acidic hydrolysis with acid and an SCX cartridge, and purification on SCX, AG11A8 and C18 SPE cartridges using a commercially available 11C-tracer synthesizer. Quality control testing and animal positron emission tomography (PET) imaging were also carried out. Results: A high and reproducible decay-uncorrected radiochemical yield of (41.0±4.6)% (n=10) was obtained from 11C-CO2 within a total synthesis time of about 8 min. The radiochemical purity of 11C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control testing and PET imaging showed that the 11C-AC injection produced by the simple SPE procedure was safe and efficient, and was in agreement with the current Chinese radiopharmaceutical quality control guidelines. Conclusion: The novel, simple, and rapid method is readily adapted to the fully automated synthesis of 11C-AC on several existing commercial synthesis modules. The method can be used routinely to produce 11C-AC for preclinical and clinical studies with PET imaging. - Highlights: • A fully automated synthesis of 11C-acetate by simple modified solid-phase extraction purification has been developed. • Typical non-decay-corrected yields were (41.0±4.6)% (n=10). • Radiochemical purity was determined by radio-HPLC analysis on a C18 column using a gradient program, instead of an expensive organic acid column or anion column. • QC testing (RCP>99%)

  18. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    International Nuclear Information System (INIS)

    Hiraki, Masahiko; Yamada, Yusuke; Chavas, Leonard M. G.; Wakatsuki, Soichi; Matsugaki, Naohiro

    2013-01-01

    A special liquid-nitrogen Dewar with double capacity for the sample-exchange robot has been created at AR-NE3A at the Photon Factory, allowing continuous fully automated data collection. In this work, this new system is described and the stability of its calibration is discussed. Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes. Therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. As a result of investigation, four or more cassettes were used in AR-NE3A alone. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. In order to check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. Compared with the current system, the parameters of the novel system are shown to be stable

  19. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    Energy Technology Data Exchange (ETDEWEB)

    Hiraki, Masahiko, E-mail: masahiko.hiraki@kek.jp; Yamada, Yusuke; Chavas, Leonard M. G. [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Wakatsuki, Soichi [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); SLAC National Accelerator Laboratory, 2575 Sand Hill Road, MS 69, Menlo Park, CA 94025-7015 (United States); Stanford University, Beckman Center B105, Stanford, CA 94305-5126 (United States); Matsugaki, Naohiro [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2013-11-01

    A special liquid-nitrogen Dewar with double capacity for the sample-exchange robot has been created at AR-NE3A at the Photon Factory, allowing continuous fully automated data collection. In this work, this new system is described and the stability of its calibration is discussed. Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes. Therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. As a result of investigation, four or more cassettes were used in AR-NE3A alone. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. In order to check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. Compared with the current system, the parameters of the novel system are shown to be stable.

  20. Performance of a fully automated scatterometer for BRDF and BTDF measurements at visible and infrared wavelengths

    International Nuclear Information System (INIS)

    Anderson, S.; Shepard, D.F.; Pompea, S.M.; Castonguay, R.

    1989-01-01

    The general performance of a fully automated scatterometer shows that the instrument can make rapid, accurate BRDF (bidirectional reflectance distribution function) and BTDF (bidirectional transmittance distribution function) measurements of optical surfaces over a range of approximately ten orders of magnitude in BRDF. These measurements can be made for most surfaces even with the detector at the specular angle, because of beam-attenuation techniques. He-Ne and CO2 lasers are used as sources in conjunction with a reference detector and chopper

  1. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning.

    Science.gov (United States)

    Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang

    2017-11-13

    Prostate cancer (PCa) has been a major cause of death since ancient times, as documented in Egyptian Ptolemaic mummy imaging. PCa detection is critical to personalized medicine and varies considerably under an MRI scan. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were obtained. A deep learning approach with a deep convolutional neural network (DCNN) and a non-deep learning approach with SIFT image features and a bag-of-words (BoW) model, a representative method for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from prostate benign conditions (BCs) patients with prostatitis or prostate benign hyperplasia (BPH). In fully automated detection of PCa patients, deep learning had a statistically higher area under the receiver operating characteristics curve (AUC) than non-deep learning (P = 0.0007); the AUC was 0.70 (95% CI 0.63-0.77) for the non-deep learning method. Our results suggest that deep learning with DCNN is superior to non-deep learning with SIFT image features and a BoW model for fully automated differentiation of PCa patients from prostate BCs patients. Our deep learning method is extensible to image modalities such as MR imaging, CT and PET of other organs.

  2. Photochemical-chemiluminometric determination of aldicarb in a fully automated multicommutation based flow-assembly

    International Nuclear Information System (INIS)

    Palomeque, M.; Garcia Bautista, J.A.; Catala Icardo, M.; Garcia Mateo, J.V.; Martinez Calatayud, J.

    2004-01-01

    A sensitive and fully automated method for determination of aldicarb in technical formulations (Temik) and mineral waters is proposed. The automation of the flow-assembly is based on the multicommutation approach, which uses a set of solenoid valves acting as independent switchers. The operating cycle for obtaining a typical analytical transient signal can be easily programmed by means of home-made software running in the Windows environment. The manifold is provided with a photoreactor consisting of a 150 cm long x 0.8 mm i.d. piece of PTFE tubing coiled around a 20 W low-pressure mercury lamp. The determination of aldicarb is performed on the basis of the iron(III) catalytic mineralization of the pesticide by UV irradiation (150 s), and the chemiluminescent (CL) behavior of the photodegraded pesticide in the presence of potassium permanganate and quinine sulphate as sensitizer. UV irradiation of aldicarb turns the very weakly chemiluminescent pesticide into a strongly chemiluminescent photoproduct. The method is linear over the range 2.2-100.0 μg l⁻¹ of aldicarb; the limit of detection is 0.069 μg l⁻¹; the reproducibility (as the R.S.D. of 20 peaks of a 24 μg l⁻¹ solution) is 3.7% and the sample throughput is 17 h⁻¹.

  3. Fully-automated identification of fish species based on otolith contour: using short-time Fourier transform and discriminant analysis (STFT-DA).

    Science.gov (United States)

    Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder; Chong, Ving Ching

    2016-01-01

    Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully-automated species identification model with an accuracy higher than 80%. The purpose of the current study is to develop a fully-automated model, based on the otolith contours, to identify fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. Short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant Analysis (DA), as a classification technique, was used to train and test the model based on the extracted features. Results. Performance of the model was demonstrated using species from three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% for all cases. In addition, effects of STFT variables on the performance of the identification model were explored in this study. Conclusions. Short-time Fourier transform could determine important features of the otolith outlines. The fully-automated model proposed in this study (STFT-DA) could predict species of an unknown specimen with acceptable identification accuracy. The model codes can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has flexibility to be used for more species and families in future studies.
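
    A minimal sketch of the STFT-DA idea, with assumed window parameters and synthetic contour signals rather than real otolith data:

```python
# Sketch of the STFT-DA idea (window parameters are assumptions): describe each
# otolith by STFT magnitudes of its contour radius signal, then classify with
# linear discriminant analysis.
import numpy as np
from scipy.signal import stft
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def stft_features(radius_signal, nperseg=64):
    """radius_signal: centroid-to-contour distance sampled at equal angles."""
    _, _, z = stft(radius_signal, nperseg=nperseg)
    return np.abs(z).ravel()                 # flatten |STFT| into one feature vector

rng = np.random.default_rng(0)
# Hypothetical training data: 20 synthetic contours per class, 512 samples each
signals = [np.sin(np.linspace(0, 2 * np.pi, 512)) * (1 + 0.1 * c)
           + 0.05 * rng.standard_normal(512)
           for c in (0, 1) for _ in range(20)]
labels = [c for c in (0, 1) for _ in range(20)]

features = np.array([stft_features(s) for s in signals])
classifier = LinearDiscriminantAnalysis().fit(features, labels)
print(classifier.score(features, labels))    # resubstitution accuracy (optimistic)
```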

  4. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. The cardiac function could be evaluated by global and regional parameters of left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of LV in short axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at end-diastolic phase, and LV segmentation propagation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of LV of each slice at ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, with the advantages of the continuity of the boundaries of LV across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of LV based on subjective evaluation.
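
    A single-boundary dynamic programming search on a polar-unwrapped image conveys the idea; the paper's dual DP delineates the endocardial and epicardial boundaries jointly, which this sketch does not reproduce.

```python
# Single-boundary dynamic programming on a polar-unwrapped image as a sketch of
# the idea; the paper's dual DP delineates endo- and epicardium jointly, which
# is not reproduced here. External cost = negative radial edge strength.
import numpy as np

def dp_boundary(polar_gradient, max_jump=2):
    """polar_gradient: 2D array [n_angles, n_radii] of edge strength; returns one
    radius index per angle (closure between first and last angle not enforced)."""
    n_ang, n_rad = polar_gradient.shape
    cost = -polar_gradient                     # strong edges -> low external cost
    acc = np.full((n_ang, n_rad), np.inf)
    back = np.zeros((n_ang, n_rad), dtype=int)
    acc[0] = cost[0]
    for a in range(1, n_ang):
        for r in range(n_rad):
            lo, hi = max(0, r - max_jump), min(n_rad, r + max_jump + 1)
            prev = np.argmin(acc[a - 1, lo:hi]) + lo   # smoothness: limited radial jump
            acc[a, r] = acc[a - 1, prev] + cost[a, r]
            back[a, r] = prev
    boundary = np.empty(n_ang, dtype=int)
    boundary[-1] = int(np.argmin(acc[-1]))
    for a in range(n_ang - 1, 0, -1):          # backtrack the optimal path
        boundary[a - 1] = back[a, boundary[a]]
    return boundary

edges = np.random.rand(36, 40)                 # placeholder edge-strength image
print(dp_boundary(edges)[:10])
```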

  5. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    International Nuclear Information System (INIS)

    Della Gala, Giuseppe; Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M.; Lanconelli, Nico; Petit, Steven F.

    2017-01-01

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to manually generated VMAT plans in the absence of time pressure. For 16 patients with reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organs at risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significant improved PTV coverage (V95% increased by 1.1% ± 1.1%), higher dose conformity (R50 reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, and 3.6 Gy ± 2.8 Gy, respectively).

  7. Fully automated laboratory and field-portable goniometer used for performing accurate and precise multiangular reflectance measurements

    Science.gov (United States)

    Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily

    2017-10-01

    Field-portable goniometers are created for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on the requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments and biconical reflectance factors in laboratory settings. Unlike some goniometers, this system features a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces, without compromising angular accuracy. The system also features a second upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading to measure the opposition effect, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems while not sacrificing consistency or repeatability in laboratory environments.

  8. Automated builder and database of protein/membrane complexes for molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Sunhwan Jo

    2007-09-01

    Full Text Available Molecular dynamics simulations of membrane proteins have provided deeper insights into their functions and interactions with surrounding environments at the atomic level. However, compared to solvation of globular proteins, building a realistic protein/membrane complex is still challenging and requires considerable experience with simulation software. Membrane Builder in the CHARMM-GUI website (http://www.charmm-gui.org) helps users to build such a complex system using a web browser with a graphical user interface. Through a generalized and automated building process including system size determination as well as generation of lipid bilayer, pore water, bulk water, and ions, a realistic membrane system with virtually any kind and shape of membrane protein can be generated in 5 minutes to 2 hours depending on the system size. Default values that were elaborated and tested extensively are given in each step to provide reasonable options and starting points for both non-expert and expert users. The efficacy of Membrane Builder is illustrated by its applications to 12 transmembrane and 3 interfacial membrane proteins, whose fully equilibrated systems with three different types of lipid molecules (DMPC, DPPC, and POPC) and two types of system shapes (rectangular and hexagonal) are freely available on the CHARMM-GUI website. One of the most significant advantages of using the web environment is that, if a problem is found, users can go back and re-generate the whole system again before quitting the browser. Therefore, Membrane Builder provides an intuitive and easy way to build and simulate the biologically important membrane system.

  9. Fully automated deformable registration of breast DCE-MRI and PET/CT

    Science.gov (United States)

    Dmitriev, I. D.; Loo, C. E.; Vogel, W. V.; Pengel, K. E.; Gilhuijs, K. G. A.

    2013-02-01

    Accurate characterization of breast tumors is important for the appropriate selection of therapy and monitoring of the response. For this purpose, breast imaging and tissue biopsy are important aspects. In this study, a fully automated method for deformable registration of DCE-MRI and PET/CT of the breast is presented. The registration is performed using the CT component of the PET/CT and the pre-contrast T1-weighted non-fat suppressed MRI. Comparable patient setup protocols were used during the MRI and PET examinations in order to avoid having to make assumptions about the biomechanical properties of the breast during and after the application of chemotherapy. The registration uses a multi-resolution approach to speed up the process and to minimize the probability of converging to local minima. The validation was performed on 140 breasts (70 patients). Of the total number of registration cases, 94.2% of the breasts were aligned within 4.0 mm accuracy (1 PET voxel). Fused information may be beneficial to obtain representative biopsy samples, which in turn will benefit the treatment of the patient.

  10. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    Science.gov (United States)

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
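
    EC50 values of the kind reported above are typically obtained by fitting a log-logistic (Hill) model to inhibition-versus-concentration data; a minimal sketch with made-up numbers follows.

```python
# Sketch of deriving an EC50 from V. fischeri luminescence inhibition data by
# fitting a two-parameter log-logistic (Hill) model; concentrations and
# inhibition values below are made up for illustration.
import numpy as np
from scipy.optimize import curve_fit

def hill(concentration, ec50, slope):
    """Fraction of luminescence inhibition as a function of concentration."""
    return 1.0 / (1.0 + (ec50 / concentration) ** slope)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])        # e.g. mg/L
inhibition = np.array([0.02, 0.06, 0.18, 0.45, 0.74, 0.91, 0.97])

(ec50, slope), _ = curve_fit(hill, conc, inhibition, p0=(3.0, 1.0),
                             bounds=(0, np.inf))
print(f"EC50 ~ {ec50:.1f} mg/L (Hill slope {slope:.2f})")
```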

  11. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can

  12. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including a piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area from motion-corrected MTT maps and compared these with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
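
    For orientation, the gamma-variate curve fitting mentioned above models the first-pass bolus of the concentration-time curve; the snippet below is a generic illustration with synthetic data and an assumed sampling interval, not the authors' software.

    import numpy as np
    from scipy.optimize import curve_fit

    def gamma_variate(t, k, t0, alpha, beta):
        # C(t) = k * (t - t0)**alpha * exp(-(t - t0)/beta) for t > t0, else 0
        c = np.zeros_like(t, dtype=float)
        m = t > t0
        c[m] = k * (t[m] - t0) ** alpha * np.exp(-(t[m] - t0) / beta)
        return c

    t = np.arange(0.0, 60.0, 1.5)        # seconds; assumed repetition time of 1.5 s
    noisy = gamma_variate(t, 8.0, 10.0, 2.5, 3.0) + np.random.normal(0.0, 0.2, t.size)

    popt, _ = curve_fit(gamma_variate, t, noisy, p0=[5.0, 8.0, 2.0, 2.0])
    print("fitted bolus arrival time t0 = %.1f s" % popt[1])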

  13. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  14. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method

    Science.gov (United States)

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.

    2017-08-01

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organs-at-risk criteria levels evaluated compared with clinical plans. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical plans across the 12 patients tested, and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment

  16. The Protein Maker: an automated system for high-throughput parallel purification

    International Nuclear Information System (INIS)

    Smith, Eric R.; Begley, Darren W.; Anderson, Vanessa; Raymond, Amy C.; Haffner, Taryn E.; Robinson, John I.; Edwards, Thomas E.; Duncan, Natalie; Gerdts, Cory J.; Mixon, Mark B.; Nollert, Peter; Staker, Bart L.; Stewart, Lance J.

    2011-01-01

    The Protein Maker instrument addresses a critical bottleneck in structural genomics by allowing automated purification and buffer testing of multiple protein targets in parallel with a single instrument. Here, the use of this instrument to (i) purify multiple influenza-virus proteins in parallel for crystallization trials and (ii) identify optimal lysis-buffer conditions prior to large-scale protein purification is described. The Protein Maker is an automated purification system developed by Emerald BioSystems for high-throughput parallel purification of proteins and antibodies. This instrument allows multiple load, wash and elution buffers to be used in parallel along independent lines for up to 24 individual samples. To demonstrate its utility, its use in the purification of five recombinant PB2 C-terminal domains from various subtypes of the influenza A virus is described. Three of these constructs crystallized and one diffracted X-rays to sufficient resolution for structure determination and deposition in the Protein Data Bank. Methods for screening lysis buffers for a cytochrome P450 from a pathogenic fungus prior to upscaling expression and purification are also described. The Protein Maker has become a valuable asset within the Seattle Structural Genomics Center for Infectious Disease (SSGCID) and hence is a potentially valuable tool for a variety of high-throughput protein-purification applications

  17. Performance evaluation of vertical feed fully automated TLD badge reader using 0.8 and 0.4 mm teflon embedded CaSO4:Dy dosimeters

    International Nuclear Information System (INIS)

    Ratna, P.; More, Vinay; Kulkarni, M.S.

    2012-01-01

    The personnel monitoring of more than 80,000 radiation workers in India is at present carried out by semi-automated TLD badge reader systems (TLDBR-7B) developed by the Radiation Safety Systems Division, Bhabha Atomic Research Centre. More than 60 such reader systems are in use in all the personnel monitoring centers in the country. The Radiation Safety Systems Division also developed a fully automated TLD badge reader based on a new TLD badge having a built-in machine-readable ID code (in the form of a 16x3 hole pattern). This automated reader is designed with a minimum of changes to the electronics and mechanical hardware of the semi-automatic version (TLDBR-7B), so that such semi-automatic readers can be easily upgraded to the fully automated version by using the new TLD badge with ID code. The reader was capable of reading 50 TLD cards in 90 minutes. Based on feedback from the users, a new model of fully automated TLD badge reader (model VEFFA-10) was designed, which is an improved version of the previously reported fully automated TLD badge reader. This VEFFA-10 PC-based reader incorporates vertical loading of TLD cards having a machine-readable ID code. In this new reader, a vertical rack, which can hold 100 such cards, is mounted on the right side of the reader system. The TLD card falls into the channel by gravity, from where it is taken to the reading position by a rack-and-pinion mechanism. After the readout, the TLD card is dropped into an eject tray. The reader employs a hot N2 gas heating method and the gas flow is controlled by a specially designed digital gas flow meter on the front panel of the reader system. The system design is very compact and simple and the card stuck-up problem is totally eliminated in the reader system. The reader has a number of self-diagnostic features to ensure a high degree of reliability. This paper reports the performance evaluation of the reader using 0.4 mm thick Teflon embedded CaSO4:Dy TLD cards instead of 0.8 mm cards.

  18. Fully automated synthesis of (phospho)peptide arrays in microtiter plate wells provides efficient access to protein tyrosine kinase characterization

    Directory of Open Access Journals (Sweden)

    Goldstein David J

    2005-01-01

    Full Text Available Abstract Background Synthetic peptides have played a useful role in studies of protein kinase substrates and interaction domains. Synthetic peptide arrays and libraries, in particular, have accelerated the process. Several factors have hindered or limited the applicability of various techniques, such as the need for deconvolution of combinatorial libraries, the inability or impracticality of achieving full automation using two-dimensional or pin solid phases, the lack of convenient interfacing with standard analytical platforms, or the difficulty of compartmentalization of a planar surface when contact between assay components needs to be avoided. This paper describes a process for synthesis of peptides and phosphopeptides on microtiter plate wells that overcomes previous limitations and demonstrates utility in determination of the epitope of an autophosphorylation site phospho-motif antibody and utility in substrate utilization assays of the protein tyrosine kinase, p60c-src. Results The overall reproducibility of phospho-peptide synthesis and multiplexed EGF receptor (EGFR) autophosphorylation site (pY1173) antibody ELISA (9H2) was within 5.5 to 8.0%. Mass spectrometric analyses of the released (phospho)peptides showed homogeneous peaks of the expected molecular weights. An overlapping peptide array of the complete EGFR cytoplasmic sequence revealed a high redundancy of 9H2 reactive sites. The eight reactive phosphopeptides were structurally related and, interestingly, the most conserved antibody-reactive peptide motif coincided with a subset of other known EGFR autophosphorylation and SH2 binding motifs and an EGFR optimal substrate motif. Finally, peptides based on known substrate specificities of c-src and related enzymes were synthesized in microtiter plate array format and were phosphorylated by c-Src with the predicted specificities. The level of phosphorylation was proportional to c-Src concentration with sensitivities below 0.1 Units of

  19. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    Science.gov (United States)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
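
    As a rough illustration of the single-frequency Gabor idea (not the authors' algorithm or their normalization procedure), the local response of an SHG image to a filter tuned to the expected sarcomere spacing can be mapped as follows; the spacing, orientations and the final disorder score are assumptions.

    import numpy as np
    from skimage.filters import gabor

    def gabor_response_map(shg_image, sarcomere_period_px=10.0, n_orientations=8):
        # Maximum Gabor magnitude over a set of orientations, per pixel.
        freq = 1.0 / sarcomere_period_px
        best = np.zeros_like(shg_image, dtype=float)
        for theta in np.linspace(0.0, np.pi, n_orientations, endpoint=False):
            real, imag = gabor(shg_image, frequency=freq, theta=theta)
            best = np.maximum(best, np.hypot(real, imag))
        return best

    # usage on a hypothetical image `img`: a simple disorder score in [0, 1]
    # disorder_map = 1.0 - gabor_response_map(img) / gabor_response_map(img).max()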

  20. The worldwide NORM production and a fully automated gamma-ray spectrometer for their characterization

    International Nuclear Information System (INIS)

    Xhixha, G.; Callegari, I.; Guastaldi, E.; De Bianchi, S.; Fiorentini, G.; Universita di Ferrara, Ferrara; Istituto Nazionale di Fisica Nucleare; Kaceli Xhixha, M.

    2013-01-01

    Materials containing radionuclides of natural origin and being subject to regulation because of their radioactivity are known as Naturally Occurring Radioactive Material (NORM). Following the International Atomic Energy Agency, we include in NORM those materials whose activity concentration is modified by human-made processes. We present a brief review of the main categories of non-nuclear industries together with the levels of activity concentration in feed raw materials, products and waste, including mechanisms of radioisotope enrichment. The global management of NORM shows a high level of complexity, mainly due to different degrees of radioactivity enhancement and the huge amount of worldwide waste production. The future tendency of guidelines concerning environmental protection will require both systematic monitoring based on ever-increasing sampling and high-performance gamma-ray spectroscopy. On the grounds of these requirements, a new low-background fully automated high-resolution gamma-ray spectrometer, MCA_Rad, has been developed. The design of the lead and copper shielding achieved a background reduction of two orders of magnitude with respect to laboratory radioactivity. A severe lowering of manpower cost is obtained through a fully automated system, which enables up to 24 samples to be measured without any human attendance. Two coupled HPGe detectors increase the detection efficiency, performing accurate measurements on a small sample volume (180 cm³) with a reduction of the sample transport cost. Details of the instrument calibration method are presented. The MCA_Rad system can measure in less than one hour a typical NORM sample enriched in U and Th with some hundreds of Bq kg⁻¹, with an overall uncertainty of less than 5%. Quality control of this method has been tested. Measurements of three certified reference materials RGK-1, RGU-2 and RGTh-1 containing concentrations of potassium, uranium and thorium comparable to NORM have

  1. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    Science.gov (United States)

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volume-modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with a commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  2. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    Science.gov (United States)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
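
    The three BPE measures defined above follow directly from a relative-enhancement map and the segmentation masks. A minimal sketch, with assumed array names and an assumed 20% enhancement threshold, is:

    import numpy as np

    def bpe_measures(rel_enhancement, fgt_mask, breast_mask, threshold=0.2,
                     voxel_volume_ml=0.001):
        # rel_enhancement: (post - pre) / pre per voxel; masks are boolean arrays.
        enhancing = (rel_enhancement >= threshold) & fgt_mask
        bpe_abs = enhancing.sum() * voxel_volume_ml              # enhancing volume
        bpe_rf = bpe_abs / (fgt_mask.sum() * voxel_volume_ml)    # relative to FGT
        bpe_rb = bpe_abs / (breast_mask.sum() * voxel_volume_ml) # relative to breast
        return bpe_abs, bpe_rf, bpe_rb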

  3. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    Science.gov (United States)

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae, we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    Science.gov (United States)

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  5. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. On the other hand, amyloid PET is extremely useful to
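
    For reference, the predictive values quoted above follow from the usual two-by-two definitions; the counts in this worked example are hypothetical, chosen only to be consistent with the reported visual (11)C-PiB figures (PPV 0.50, NPV 1.00, accuracy 0.68 among 28 patients).

    def predictive_values(tp, fp, tn, fn):
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        accuracy = (tp + tn) / (tp + fp + tn + fn)
        return ppv, npv, accuracy

    # e.g. 9 converters and 19 non-converters; a test that flags all 9 converters
    # plus 9 non-converters gives PPV 0.50, NPV 1.00 and accuracy 19/28 = 0.68
    print(predictive_values(tp=9, fp=9, tn=10, fn=0))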

  6. Parameter evaluation and fully-automated radiosynthesis of [(11)C]harmine for imaging of MAO-A for clinical trials.

    Science.gov (United States)

    Philippe, C; Zeilinger, M; Mitterhauser, M; Dumanic, M; Lanzenberger, R; Hacker, M; Wadsak, W

    2015-03-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [(11)C]harmine for clinical trials. The following parameters have been investigated: amount of base, precursor concentration, solvent, reaction temperature and time. The optimum reaction conditions were determined to be 2-3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80°C reaction temperature and 2 min reaction time. Under these conditions 6.1±1 GBq (51.0±11% based on [(11)C]CH3I, corrected for decay) of [(11)C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully-automated synthesis method can be used as a routine set-up. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    Science.gov (United States)

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick-film resistors and an NTC as heating and temperature-sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
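
    The control principle described above is simple; the sketch below shows only the generic idea (Beta-model conversion of an NTC reading to temperature and an on/off heater decision), with assumed component values and an assumed LAMP set point, and is not the device firmware.

    import math

    def ntc_temperature_c(resistance_ohm, r0=10_000.0, t0_c=25.0, beta=3950.0):
        # Beta-model conversion of NTC resistance to temperature (assumed 10 kOhm NTC).
        t0_k = t0_c + 273.15
        inv_t = 1.0 / t0_k + math.log(resistance_ohm / r0) / beta
        return 1.0 / inv_t - 273.15

    def heater_on(temp_c, setpoint_c=65.0):
        # Naive on/off decision; a real controller would add hysteresis or PID.
        return temp_c < setpoint_c

    temp = ntc_temperature_c(3_000.0)      # example resistance reading
    print(temp, heater_on(temp))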

  8. Fully automated laser ray tracing system to measure changes in the crystalline lens GRIN profile.

    Science.gov (United States)

    Qiu, Chen; Maceo Heilman, Bianca; Kaipio, Jari; Donaldson, Paul; Vaghefi, Ehsan

    2017-11-01

    Measuring the lens gradient refractive index (GRIN) accurately and reliably has proven an extremely challenging technical problem. A fully automated laser ray tracing (LRT) system was built to address this issue. The LRT system captures images of multiple laser projections before and after traversing through an ex vivo lens. These LRT images, combined with accurate measurements of the lens geometry, are used to calculate the lens GRIN profile. Mathematically, this is an ill-conditioned problem; hence, it is essential to apply biologically relevant constraints to produce a feasible solution. The lens GRIN measurements were compared with previously published data. Our GRIN retrieval algorithm produces fast and accurate measurements of the lens GRIN profile. Experiments to study the optics of physiologically perturbed lenses are the future direction of this research.

  9. A fully-automated neural network analysis of AFM force-distance curves for cancer tissue diagnosis

    Science.gov (United States)

    Minelli, Eleonora; Ciasca, Gabriele; Sassun, Tanya Enny; Antonelli, Manila; Palmieri, Valentina; Papi, Massimiliano; Maulucci, Giuseppe; Santoro, Antonio; Giangaspero, Felice; Delfini, Roberto; Campi, Gaetano; De Spirito, Marco

    2017-10-01

    Atomic Force Microscopy (AFM) has the unique capability of probing the nanoscale mechanical properties of biological systems that affect and are affected by the occurrence of many pathologies, including cancer. This capability has triggered growing interest in the translational process of AFM from physics laboratories to clinical practice. A factor still hindering the current use of AFM in diagnostics is related to the complexity of AFM data analysis, which is time-consuming and needs highly specialized personnel with a strong physical and mathematical background. In this work, we demonstrate an operator-independent neural-network approach for the analysis of surgically removed brain cancer tissues. This approach allowed us to distinguish—in a fully automated fashion—cancer from healthy tissues with high accuracy, also highlighting the presence and the location of infiltrating tumor cells.

  10. Evaluation of a Fully Automated Analyzer for Rapid Measurement of Water Vapor Sorption Isotherms for Applications in Soil Science

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per

    2014-01-01

    The characterization and description of important soil processes such as water vapor transport, volatilization of pesticides, and hysteresis require accurate means for measuring the soil water characteristic (SWC) at low water potentials. Until recently, measurement of the SWC at low water potentials was constrained by hydraulic decoupling and long equilibration times when pressure plates or single-point, chilled-mirror instruments were used. A new, fully automated Vapor Sorption Analyzer (VSA) helps to overcome these challenges and allows faster measurement of highly detailed water vapor...

  11. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchases system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  12. A fully automated cell segmentation and morphometric parameter system for quantifying corneal endothelial cell morphology.

    Science.gov (United States)

    Al-Fahdawi, Shumoos; Qahwaji, Rami; Al-Waisy, Alaa S; Ipson, Stanley; Ferdousi, Maryam; Malik, Rayaz A; Brahma, Arun

    2018-07-01

    Corneal endothelial cell abnormalities may be associated with a number of corneal and systemic diseases. Damage to the endothelial cells can significantly affect corneal transparency by altering hydration of the corneal stroma, which can lead to irreversible endothelial cell pathology requiring corneal transplantation. To date, quantitative analysis of endothelial cell abnormalities has been manually performed by ophthalmologists using time-consuming and highly subjective semi-automatic tools, which require operator interaction. We developed and applied a fully-automated and real-time system, termed the Corneal Endothelium Analysis System (CEAS), for the segmentation and computation of endothelial cells in images of the human cornea obtained by in vivo corneal confocal microscopy. First, a Fast Fourier Transform (FFT) band-pass filter is applied to reduce noise and enhance the image quality to make the cells more visible. Secondly, endothelial cell boundaries are detected using watershed transformations and Voronoi tessellations to accurately quantify the morphological parameters of the human corneal endothelial cells. The performance of the automated segmentation system was tested against manually traced ground-truth images based on a database consisting of 40 corneal confocal endothelial cell images in terms of segmentation accuracy and obtained clinical features. In addition, the robustness and efficiency of the proposed CEAS system were compared with manually obtained cell densities using a separate database of 40 images from controls (n = 11), obese subjects (n = 16) and patients with diabetes (n = 13). The Pearson correlation coefficient between automated and manual endothelial cell densities is 0.9 (p < 0.0001), supporting the robustness and efficiency of the CEAS system and the possibility of utilizing it in a real-world clinical setting to enable rapid diagnosis and for patient follow-up, with an execution time of only 6 seconds per image. Copyright © 2018 Elsevier B.V. All rights reserved.
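
    The segmentation chain described above (band-pass filtering followed by marker-based watershed) can be approximated with standard open-source tools; the sketch below uses scikit-image with illustrative parameters and is not the CEAS implementation.

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import difference_of_gaussians
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def segment_endothelial_cells(img, low_sigma=1, high_sigma=10, min_dist=8):
        # Band-pass filter to suppress noise and illumination gradients.
        bandpassed = difference_of_gaussians(img, low_sigma, high_sigma)
        smooth = ndi.gaussian_filter(bandpassed, sigma=2)
        # Cell centres as local maxima, then watershed on the inverted image.
        coords = peak_local_max(smooth, min_distance=min_dist)
        markers = np.zeros(img.shape, dtype=int)
        markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
        labels = watershed(-smooth, markers)
        return labels, len(coords)   # label image and cell count per field of view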

  13. Smartnotebook: A semi-automated approach to protein sequential NMR resonance assignments

    International Nuclear Information System (INIS)

    Slupsky, Carolyn M.; Boyko, Robert F.; Booth, Valerie K.; Sykes, Brian D.

    2003-01-01

    Complete and accurate NMR spectral assignment is a prerequisite for high-throughput automated structure determination of biological macromolecules. However, completely automated assignment procedures generally encounter difficulties for all but the most ideal data sets. Sources of these problems include difficulty in resolving correlations in crowded spectral regions, as well as complications arising from dynamics, such as weak or missing peaks, or atoms exhibiting more than one peak due to exchange phenomena. Smartnotebook is a semi-automated assignment software package designed to combine the best features of the automated and manual approaches. The software finds and displays potential connections between residues, while the spectroscopist makes decisions on which connection is correct, allowing rapid and robust assignment. In addition, smartnotebook helps the user fit chains of connected residues to the primary sequence of the protein by comparing the experimentally determined chemical shifts with expected shifts derived from a chemical shift database, while providing bookkeeping throughout the assignment procedure

  14. Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH

    International Nuclear Information System (INIS)

    Volk, Jochen; Herrmann, Torsten; Wuethrich, Kurt

    2008-01-01

    MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness

  15. Blind testing of routine, fully automated determination of protein structures from NMR data.

    NARCIS (Netherlands)

    Rosato, A.; Aramini, J.M.; Arrowsmith, C.; Bagaria, A.; Baker, D.; Cavalli, A.; Doreleijers, J.; Eletsky, A.; Giachetti, A.; Guerry, P.; Gutmanas, A.; Guntert, P.; He, Y.; Herrmann, T.; Huang, Y.J.; Jaravine, V.; Jonker, H.R.; Kennedy, M.A.; Lange, O.F.; Liu, G.; Malliavin, T.E.; Mani, R.; Mao, B.; Montelione, G.T.; Nilges, M.; Rossi, P.; Schot, G. van der; Schwalbe, H.; Szyperski, T.A.; Vendruscolo, M.; Vernon, R.; Vranken, W.F.; Vries, S.D. de; Vuister, G.W.; Wu, B.; Yang, Y.; Bonvin, A.M.

    2012-01-01

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by

  16. Blind Testing of Routine, Fully Automated Determination of Protein Structures from NMR Data

    NARCIS (Netherlands)

    Rosato, A.; Aramini, J.M.; van der Schot, G.; de Vries, S.J.|info:eu-repo/dai/nl/304837717; Bonvin, A.M.J.J.|info:eu-repo/dai/nl/113691238

    2012-01-01

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by

  17. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    Science.gov (United States)

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operator characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
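
    The reported discrimination is the area under the ROC curve for corrected hippocampal volume as a predictor of conversion. A small synthetic example of that computation (group sizes, volumes and effect size below are invented, not ADNI data) is:

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # corrected hippocampal grey matter volumes (mL); converters smaller on average
    vol_stable = rng.normal(2.6, 0.4, 120)
    vol_converter = rng.normal(2.2, 0.4, 78)

    volumes = np.concatenate([vol_stable, vol_converter])
    labels = np.concatenate([np.zeros(120), np.ones(78)])   # 1 = converter

    # smaller volume should predict conversion, so score with the negated volume
    print("AUC = %.2f" % roc_auc_score(labels, -volumes))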

  18. NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.

    Science.gov (United States)

    Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J

    2018-03-14

    Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate the structure calculation, and analysis of dynamics and interactions of macromolecules. Recent advancement in handling big data, together with an outburst of machine learning techniques, offers an opportunity to tackle the peak picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually-annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at the levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (a platform for protein NMR data analysis), and is available at https://dumpling.bio/. michaljerzywalczak@gmail.com; piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.

  19. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    Science.gov (United States)

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  20. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    Science.gov (United States)

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  1. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    Directory of Open Access Journals (Sweden)

    Markus Niklasson

    2015-01-01

    Full Text Available The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  2. Fully automated drug screening of dried blood spots using online LC-MS/MS analysis

    Directory of Open Access Journals (Sweden)

    Stefan Gaugler

    2018-01-01

    Full Text Available A new and fully automated workflow for the cost-effective drug screening of large populations based on the dried blood spot (DBS) technology was introduced in this study. DBS were prepared by spotting 15 μL of whole blood, previously spiked with alprazolam, amphetamine, cocaine, codeine, diazepam, fentanyl, lysergic acid diethylamide (LSD), 3,4-methylenedioxymethamphetamine (MDMA), methadone, methamphetamine, morphine and oxycodone onto filter paper cards. The dried spots were scanned, spiked with deuterated standards and directly extracted. The extract was transferred online to an analytical LC column and then to the electrospray ionization tandem mass spectrometry system. All drugs were quantified at their cut-off level, and good precision and correlation within the calibration range were obtained. The method was finally applied to DBS samples from two patients with back pain, and codeine and oxycodone could be identified and quantified accurately below the level of misuse of 89.6 ng/mL and 39.6 ng/mL, respectively.

  3. FireProt: web server for automated design of thermostable proteins

    Science.gov (United States)

    Musil, Milos; Stourac, Jan; Brezovsky, Jan; Prokop, Zbynek; Zendulka, Jaroslav; Martinek, Tomas

    2017-01-01

    Abstract There is a continuous interest in increasing the stability of proteins to enhance their usability in numerous biomedical and biotechnological applications. A number of in silico tools for the prediction of the effect of mutations on protein stability have been developed recently. However, only single-point mutations with a small effect on protein stability are typically predicted with the existing tools and have to be followed by laborious protein expression, purification, and characterization. Here, we present FireProt, a web server for the automated design of multiple-point thermostable mutant proteins that combines structural and evolutionary information in its calculation core. FireProt utilizes sixteen tools and three protein engineering strategies for making reliable protein designs. The server is complemented with an interactive, easy-to-use interface that allows users to directly analyze and optionally modify designed thermostable mutants. FireProt is freely available at http://loschmidt.chemi.muni.cz/fireprot. PMID:28449074

  4. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  5. Current status and future prospects of an automated sample exchange system PAM for protein crystallography

    Science.gov (United States)

    Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.

    2013-03-01

    To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) sample exchange robots at the PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.

  6. Automated design evolution of stereochemically randomized protein foldamers

    Science.gov (United States)

    Ranbhor, Ranjit; Kumar, Anil; Patel, Kirti; Ramakrishnan, Vibin; Durani, Susheel

    2018-05-01

    Diversification of chain stereochemistry opens up the possibility of an ‘in principle’ increase in the design space of proteins. This huge increase in the sequence and consequent structural variation is aimed at the generation of smart materials. To diversify protein structure stereochemically, we introduced L- and D-α-amino acids as the design alphabet. With a sequence design algorithm, we explored the usage of specific variables such as chirality and the sequence of this alphabet in independent steps. With molecular dynamics, we folded stereochemically diverse homopolypeptides and evaluated their ‘fitness’ for possible design as protein-like foldamers. We propose a fitness function to select the optimal fold among 1000 structures simulated with an automated repetitive simulated annealing molecular dynamics (AR-SAMD) approach. The highest-scoring poly-leucine folds, with sequence lengths of 24 and 30 amino acids, were later sequence-optimized using a Dead End Elimination cum Monte Carlo based optimization tool. This paper demonstrates a novel approach for the de novo design of protein-like foldamers.
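
    The stochastic optimization machinery named above (simulated annealing and Monte Carlo acceptance) follows the standard Metropolis pattern; the generic loop below is only a schematic of that pattern with placeholder energy and move functions, not the AR-SAMD protocol or the authors' fitness function.

    import math
    import random

    def anneal(initial_state, energy, propose_move, t_start=10.0, t_end=0.01,
               n_steps=10_000):
        # `energy` scores a state (lower is better); `propose_move` perturbs it.
        state, e = initial_state, energy(initial_state)
        best, best_e = state, e
        for step in range(n_steps):
            # geometric cooling schedule from t_start down to t_end
            t = t_start * (t_end / t_start) ** (step / (n_steps - 1))
            cand = propose_move(state)
            e_cand = energy(cand)
            # Metropolis criterion: accept improvements, sometimes accept worse moves
            if e_cand < e or random.random() < math.exp(-(e_cand - e) / t):
                state, e = cand, e_cand
                if e < best_e:
                    best, best_e = state, e
        return best, best_e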

  7. Automating the application of smart materials for protein crystallization

    International Nuclear Information System (INIS)

    Khurshid, Sahir; Govada, Lata; EL-Sharif, Hazim F.; Reddy, Subrayal M.; Chayen, Naomi E.

    2015-01-01

    The first semi-liquid, non-protein nucleating agent for automated protein crystallization trials is described. This ‘smart material’ is demonstrated to induce crystal growth and will provide a simple, cost-effective tool for scientists in academia and industry. The fabrication and validation of the first semi-liquid non-protein nucleating agent to be administered automatically to crystallization trials are reported. This research builds upon prior demonstration of the suitability of molecularly imprinted polymers (MIPs; known as ‘smart materials’) for inducing protein crystal growth. Modified MIPs of altered texture suitable for high-throughput trials are demonstrated to improve crystal quality and to increase the probability of success when screening for suitable crystallization conditions. The application of these materials is simple, time-efficient and will provide a potent tool for structural biologists embarking on crystallization trials.

  8. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol.

    Science.gov (United States)

    Block, Gladys; Azar, Kristen Mj; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-21

    In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. The randomized trial will provide rigorous evidence regarding the efficacy of this Web- and Internet-based program in reducing or

  9. Automated mass correction and data interpretation for protein open-access liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Wagner, Craig D; Hall, John T; White, Wendy L; Miller, Luke A D; Williams, Jon D

    2007-02-01

    Characterization of recombinant protein purification fractions and final products by liquid chromatography-mass spectrometry (LC/MS) is requested more frequently each year. A protein open-access (OA) LC/MS system was developed in our laboratory to meet this demand. This paper compares the system that we originally implemented in our facilities in 2003 to the one now in use, and discusses, in more detail, recent enhancements that have improved its robustness, reliability, and data reporting capabilities. The system utilizes instruments equipped with reversed-phase chromatography and an orthogonal accelerated time-of-flight mass spectrometer fitted with an electrospray source. Sample analysis requests are accomplished using a simple form on a web-enabled laboratory information management system (LIMS). This distributed form is accessible from any intranet-connected company desktop computer. Automated data acquisition and processing are performed using a combination of in-house (OA-Self Service, OA-Monitor, and OA-Analysis Engine) and vendor-supplied programs (AutoLynx and OpenLynx) located on acquisition computers and off-line processing workstations. Analysis results are then reported via the same web-based LIMS. Also presented are solutions to problems not addressed on commercially available, small-molecule OA-LC/MS systems. These include automated transformation of mass-to-charge (m/z) spectra to mass spectra and automated data interpretation that considers minor variants to the protein sequence, such as common post-translational modifications (PTMs). Currently, our protein OA-LC/MS platform runs on five LC/MS instruments located in three separate GlaxoSmithKline R&D sites in the US and UK. To date, more than 8000 protein OA-LC/MS samples have been analyzed. With these user-friendly and highly automated OA systems in place, mass spectrometry plays a key role in assessing the quality of recombinant proteins, either produced at our facilities or bought from external
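
    As context for the automated m/z-to-mass transformation mentioned above, each electrospray peak at m/z with charge z implies a neutral mass M = z·(m/z) − z·m_H, with m_H ≈ 1.00728 Da the proton mass. The sketch below averages the masses implied by a charge-state series; the function names and the simple averaging are illustrative assumptions, not the OA-LC/MS implementation described in the abstract.

```python
# Illustrative charge-state deconvolution for an electrospray protein spectrum.
# Hypothetical helper names; not the OA-LC/MS platform code.
PROTON_MASS = 1.00728  # Da

def neutral_mass(mz: float, z: int) -> float:
    """Neutral mass of an [M + zH]^z+ ion observed at m/z with charge z."""
    return z * (mz - PROTON_MASS)

def deconvolve_series(peaks: list[tuple[float, int]]) -> float:
    """Average the neutral mass implied by each (m/z, charge) pair."""
    masses = [neutral_mass(mz, z) for mz, z in peaks]
    return sum(masses) / len(masses)

if __name__ == "__main__":
    # Hypothetical charge-state envelope of a ~14.3 kDa protein
    series = [(1432.6, 10), (1302.5, 11), (1194.1, 12)]
    print(f"Estimated neutral mass: {deconvolve_series(series):.1f} Da")
```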

  10. Development of a phantom to test fully automated breast density software – A work in progress

    International Nuclear Information System (INIS)

    Waade, G.G.; Hofvind, S.; Thompson, J.D.; Highnam, R.; Hogg, P.

    2017-01-01

    Objectives: Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role in stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Methods: Several different configurations of polyvinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw image data of mammographic images were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm³) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values of real breasts. Results: Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Conclusion: Our results are promising, as all phantoms resulted in valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density and can be compressed to the same level as female breasts. Advances in knowledge: We are the first group to have produced deformable phantoms that are recognized as breasts by Volpara software. - Highlights: • Several phantoms of different configurations were created. • Three methods to assess phantom density were implemented. • All phantoms were identified as breasts by the Volpara software. • Reducing phantom thickness caused a change in phantom density.

  11. Clinical validation of fully automated computation of ejection fraction from gated equilibrium blood-pool scintigrams

    International Nuclear Information System (INIS)

    Reiber, J.H.C.; Lie, S.P.; Simoons, M.L.; Hoek, C.; Gerbrands, J.J.; Wijns, W.; Bakker, W.H.; Kooij, P.P.M.

    1983-01-01

    A fully automated procedure for the computation of left-ventricular ejection fraction (EF) from cardiac-gated Tc-99m blood-pool (GBP) scintigrams with fixed, dual, and variable ROI methods is described. By comparison with EF data from contrast ventriculography in 68 patients, the dual-ROI method (separate end-diastolic and end-systolic contours) was found to be the method of choice; processing time was 2 min. The success score of the dual-ROI procedure was 92%, as assessed from 100 GBP studies. Overall reproducibility of data acquisition and analysis was determined in 12 patients. The mean value and standard deviation of the differences between repeat studies (average time interval 27 min) were 0.8% and 4.3% EF units, respectively (r=0.98). The authors conclude that left-ventricular EF can be computed automatically from GBP scintigrams with minimal operator interaction and good reproducibility; EFs are similar to those from contrast ventriculography.
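
    For orientation, the ejection fraction in such gated blood-pool studies reduces to EF = (ED − ES)/ED computed on background-corrected ROI counts. A minimal sketch follows; the background handling and variable names are assumptions, not the published dual-ROI implementation.

```python
# Minimal EF calculation from gated blood-pool ROI counts (illustrative only).
def ejection_fraction(ed_counts: float, es_counts: float,
                      bg_counts_per_pixel: float,
                      ed_pixels: int, es_pixels: int) -> float:
    """Left-ventricular EF from background-subtracted end-diastolic (ED)
    and end-systolic (ES) ROI counts."""
    ed = ed_counts - bg_counts_per_pixel * ed_pixels
    es = es_counts - bg_counts_per_pixel * es_pixels
    return (ed - es) / ed

# Example: (9000 - 4200) / 9000 ≈ 0.53 after background subtraction
print(ejection_fraction(10000.0, 5000.0, 2.0, 500, 400))
```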

  12. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    International Nuclear Information System (INIS)

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-01

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. This is part of the fully automated robotics systems being developed at the Photon Factory for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside the hutch, and data collection. We have already installed sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to further reduce the time required for sample exchange, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds, with the exception of the time required for pre-cooling and warming up the tongs.

  13. Fully automated data acquisition, processing, and display in equilibrium radioventriculography

    International Nuclear Information System (INIS)

    Bourguignon, M.H.; Douglass, K.H.; Links, J.M.; Wagner, H.N. Jr.; Johns Hopkins Medical Institutions, Baltimore, MD

    1981-01-01

    A fully automated data acquisition, processing, and display procedure was developed for equilibrium radioventriculography. After a standardized acquisition, the study is automatically analyzed to yield both right and left ventricular time-activity curves. The program first creates a series of edge-enhanced images (difference between squared images and scaled original images). A marker point within each ventricle is then identified as that pixel with maximum counts to the patient's right and left of the count center of gravity of a stroke volume image. Regions of interest are selected on each frame as the first contour of local maxima of the two-dimensional second derivative (pseudo-Laplacian) which encloses the appropriate marker point, using a method developed by Goris. After shifting the left ventricular end-systolic region of interest four pixels to the patient's left, a background region of interest is generated as the crescent-shaped area of the shifted region of interest not intersected by the end systolic region. The average counts/pixel in this background region in the end systolic frame of the original study are subtracted from each pixel in all frames of the gated study. Right and left ventricular time-activity curves are then obtained by applying each region of interest to its corresponding background-subtracted frame, and the ejection fraction, end diastolic, end systolic, and stroke counts determined for both ventricles. In fourteen consecutive patients, in addition to the automatic ejection fractions, manually drawn regions of interest were used to obtain ejection fractions for both ventricles. The manual regions of interest were drawn twice, and the average obtained. (orig./TR)
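
    The edge-enhancement step described above (the difference between a squared image and a scaled original) can be sketched in a few lines. The NumPy formulation and the choice of scale factor are assumptions for illustration, not the original program.

```python
import numpy as np

def edge_enhance(frame: np.ndarray, scale: float | None = None) -> np.ndarray:
    """Edge-enhanced image: squared image minus a scaled copy of the original."""
    frame = frame.astype(float)
    if scale is None:
        scale = frame.max()  # assumed scale so both terms are comparable in magnitude
    return frame ** 2 - scale * frame

# Example on a synthetic 64x64 gated frame
rng = np.random.default_rng(0)
frame = rng.poisson(100, size=(64, 64))
enhanced = edge_enhance(frame)
```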

  14. Development of a fully automated adaptive unsharp masking technique in digital chest radiograph

    International Nuclear Information System (INIS)

    Abe, Katsumi; Katsuragawa, Shigehiko; Sasaki, Yasuo

    1991-01-01

    We are developing a fully automated adaptive unsharp masking technique with parameters that vary according to the regional image features of a digital chest radiograph. A chest radiograph includes various regions, such as the lung fields, retrocardiac area and spine, whose texture patterns and optical densities are extremely different. Therefore, it is necessary to enhance the image contrast of each region with its own optimum parameters. First, we investigated optimum weighting factors and mask sizes for the unsharp masking technique in a digital chest radiograph. Then, a chest radiograph is automatically divided into three segments, one for the lung field, one for the retrocardiac area, and one for the spine, by using histogram analysis of pixel values. Finally, high-frequency components of the lung field and retrocardiac area are selectively enhanced with a small mask size and mild weighting factors which were previously determined as optimum parameters. In addition, low-frequency components of the spine are enhanced with a large mask size and adequate weighting factors. The processed image shows excellent depiction of the lung field, retrocardiac area and spine simultaneously with optimum contrast. Our image processing technique may be useful for the diagnosis of chest radiographs. (author)
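
    Unsharp masking in this sense adds a weighted high-frequency component: processed = original + w · (original − blurred), where the mask (blur) size and weight w are the region-dependent parameters the authors tune. The sketch below uses SciPy's uniform_filter as the moving-average mask; the per-region parameter values are illustrative assumptions, not those determined in the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def unsharp_mask(image: np.ndarray, mask_size: int, weight: float) -> np.ndarray:
    """processed = original + weight * (original - local mean over mask_size)."""
    blurred = uniform_filter(image.astype(float), size=mask_size)
    return image + weight * (image - blurred)

def adaptive_unsharp(image: np.ndarray, region_masks: dict) -> np.ndarray:
    """Apply region-specific parameters (hypothetical values, for illustration)."""
    params = {"lung": (9, 1.0), "retrocardiac": (9, 1.5), "spine": (51, 0.5)}
    out = image.astype(float).copy()
    for name, mask in region_masks.items():
        size, w = params[name]
        out[mask] = unsharp_mask(image, size, w)[mask]
    return out
```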

  15. Fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    Energy Technology Data Exchange (ETDEWEB)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-03-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summated to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, also when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed.

  16. CASA: An Efficient Automated Assignment of Protein Mainchain NMR Data Using an Ordered Tree Search Algorithm

    International Nuclear Information System (INIS)

    Wang Jianyong; Wang Tianzhi; Zuiderweg, Erik R. P.; Crippen, Gordon M.

    2005-01-01

    Rapid analysis of protein structure, interaction, and dynamics requires fast and automated assignment of 3D protein backbone triple-resonance NMR spectra. We introduce a new depth-first ordered tree search method for automated assignment, CASA, which uses hand-edited peak-pick lists from a flexible number of triple-resonance experiments. The computer program was tested on 13 artificially simulated peak lists for proteins of up to 723 residues, as well as on experimental data for four proteins. Under reasonable tolerances, it generated assignments that correspond to the ones reported in the literature within a few minutes of CPU time. The program was also tested on the proteins analyzed by other methods, with both simulated and experimental peak lists, and it could generate good assignments in all relevant cases. The robustness was further tested under various conditions.

  17. FULLY AUTOMATED GENERATION OF ACCURATE DIGITAL SURFACE MODELS WITH SUB-METER RESOLUTION FROM SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    J. Wohlfeil

    2012-07-01

    Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high-resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, such as high geometric accuracy, which are essential for ensuring the high quality of the resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem due to the continually increasing demand for such surface models. The tedious work includes partly or fully manual selection of tie and/or ground control points for ensuring the required accuracy of the relative orientation of images for stereo matching. It also includes masking of large water areas that seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can seriously reduce the processing time for stereo matching. In this paper an approach is presented that allows all of these steps to be performed fully automatically. It includes very robust and precise tie point selection, enabling the accurate calculation of the images’ relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView are presented as proof of the robustness and reliability of the proposed method.

  18. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  19. Anesthesiology, automation, and artificial intelligence.

    Science.gov (United States)

    Alexander, John C; Joshi, Girish P

    2018-01-01

    There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized.

  20. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    International Nuclear Information System (INIS)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-01-01

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also

  1. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography.

    Science.gov (United States)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A

    2014-03-01

    Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. The maximum
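
    The MTF-from-edge-profile step mentioned in both versions of this abstract follows the usual chain: edge spread function → derivative (line spread function) → Fourier transform magnitude, normalized at zero frequency. A schematic version is below; the sampling, normalization and synthetic Gaussian-blurred edge are assumptions, not the published QA framework.

```python
import numpy as np
from math import erf

def mtf_from_edge(esf: np.ndarray, pixel_spacing_mm: float):
    """Modulation transfer function from an edge spread function (ESF)."""
    lsf = np.gradient(esf)            # line spread function
    lsf = lsf / lsf.sum()             # unit area
    mtf = np.abs(np.fft.rfft(lsf))    # magnitude of the Fourier transform
    mtf = mtf / mtf[0]                # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_spacing_mm)  # cycles/mm
    return freqs, mtf

# Example: an ideal edge blurred by a Gaussian point spread function
x = np.linspace(-5, 5, 512)
esf = np.array([0.5 * (1 + erf(v / np.sqrt(2))) for v in x])
freqs, mtf = mtf_from_edge(esf, pixel_spacing_mm=0.1)
```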

  2. Fully automated synthesis of ¹¹C-acetate as tumor PET tracer by simple modified solid-phase extraction purification.

    Science.gov (United States)

    Tang, Xiaolan; Tang, Ganghua; Nie, Dahong

    2013-12-01

    Automated synthesis of (11)C-acetate ((11)C-AC), the most commonly used radioactive fatty acid tracer, is performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Automated synthesis of (11)C-AC was implemented by a carboxylation reaction of MeMgBr with (11)C-CO2 on a polyethylene Teflon loop ring, followed by acidic hydrolysis with acid and an SCX cartridge, and purification on SCX, AG11A8 and C18 SPE cartridges using a commercially available (11)C-tracer synthesizer. Quality control testing and animal positron emission tomography (PET) imaging were also carried out. A high and reproducible decay-uncorrected radiochemical yield of (41.0 ± 4.6)% (n=10) was obtained from (11)C-CO2 within a total synthesis time of about 8 min. The radiochemical purity of (11)C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control testing and PET imaging showed that the (11)C-AC injection produced by the simple SPE procedure was safe and efficient, and was in agreement with the current Chinese radiopharmaceutical quality control guidelines. The novel, simple, and rapid method is readily adapted to the fully automated synthesis of (11)C-AC on several existing commercial synthesis modules. The method can be used routinely to produce (11)C-AC for preclinical and clinical studies with PET imaging. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. A fully-automated multiscale kernel graph cuts based particle localization scheme for temporal focusing two-photon microscopy

    Science.gov (United States)

    Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei

    2017-03-01

    The temporal focusing two-photon microscope (TFM) was developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, because of strong, non-negligible noise and diffraction rings surrounding particles, further research is extremely difficult without a precise particle localization technique. In this paper, we developed a fully automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph cuts global and local segmentation algorithm, and a kinematic-estimation-based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.

  4. A novel fully automated molecular diagnostic system (AMDS) for colorectal cancer mutation detection.

    Directory of Open Access Journals (Sweden)

    Shiro Kitano

    BACKGROUND: KRAS, BRAF and PIK3CA mutations are frequently observed in colorectal cancer (CRC). In particular, KRAS mutations are strong predictors of clinical outcomes of EGFR-targeted treatments such as cetuximab and panitumumab in metastatic colorectal cancer (mCRC). For mutation analysis, the current methods are time-consuming and not readily available to all oncologists and pathologists. We have developed a novel, simple, sensitive and fully automated molecular diagnostic system (AMDS) for point-of-care testing (POCT). Here we report the results of a comparison study between AMDS and direct sequencing (DS) in the detection of KRAS, BRAF and PIK3CA somatic mutations. METHODOLOGY/PRINCIPAL FINDINGS: DNA was extracted from a slice of either frozen (n = 89) or formalin-fixed and paraffin-embedded (FFPE) CRC tissue (n = 70), and then used for mutation analysis by AMDS and DS. All mutations (n = 41 among frozen and 27 among FFPE samples) detected by DS were also successfully (100%) detected by the AMDS. However, 8 frozen and 6 FFPE samples scored as wild-type in the DS analysis were shown to be mutants in the AMDS analysis. By cloning-sequencing assays, these discordant samples were confirmed as true mutants. One sample had simultaneous "hot spot" mutations of KRAS and PIK3CA, and a cloning assay confirmed that E542K and E545K were not on the same allele. Genotyping call rates for DS were 100.0% (89/89) and 74.3% (52/70) in frozen and FFPE samples, respectively, for the first attempt, whereas that of AMDS was 100.0% for both sample sets. For automated DNA extraction and mutation detection by AMDS, frozen tissues (n = 41) were successfully processed, and all mutations were detected within 70 minutes. CONCLUSIONS/SIGNIFICANCE: AMDS has superior sensitivity and accuracy over DS, and is much easier to execute than conventional labor-intensive manual mutation analysis. AMDS has great potential as POCT equipment for mutation analysis.

  5. Fully automated intrinsic respiratory and cardiac gating for small animal CT

    Energy Technology Data Exchange (ETDEWEB)

    Kuntz, J; Baeuerle, T; Semmler, W; Bartling, S H [Department of Medical Physics in Radiology, German Cancer Research Center, Heidelberg (Germany); Dinkel, J [Department of Radiology, German Cancer Research Center, Heidelberg (Germany); Zwick, S [Department of Diagnostic Radiology, Medical Physics, Freiburg University (Germany); Grasruck, M [Siemens Healthcare, Forchheim (Germany); Kiessling, F [Chair of Experimental Molecular Imaging, RWTH-Aachen University, Medical Faculty, Aachen (Germany); Gupta, R [Department of Radiology, Massachusetts General Hospital, Boston, MA (United States)], E-mail: j.kuntz@dkfz.de

    2010-04-07

    A fully automated, intrinsic gating algorithm for small animal cone-beam CT is described and evaluated. A parameter representing the organ motion, derived from the raw projection images, is used for both cardiac and respiratory gating. The proposed algorithm makes it possible to reconstruct motion-corrected still images as well as to generate four-dimensional (4D) datasets representing the cardiac and pulmonary anatomy of free-breathing animals without the use of electrocardiogram (ECG) or respiratory sensors. Variation analysis of projections from several rotations is used to place a region of interest (ROI) on the diaphragm. The ROI is cranially extended to include the heart. The centre of mass (COM) variation within this ROI, the filtered frequency response and the local maxima are used to derive a binary motion-gating parameter for phase-sensitive gated reconstruction. This algorithm was implemented on a flat-panel-based cone-beam CT scanner and evaluated using a moving phantom and animal scans (seven rats and eight mice). Volumes were determined using a semiautomatic segmentation. In all cases robust gating signals could be obtained. The maximum volume error in phantom studies was less than 6%. By utilizing extrinsic gating via externally placed cardiac and respiratory sensors, the functional parameters (e.g. cardiac ejection fraction) and image quality were equivalent to this current gold standard. This algorithm obviates the necessity of both gating hardware and user interaction. The simplicity of the proposed algorithm enables adoption in a wide range of small animal cone-beam CT scanners.
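
    A rough sketch of the kind of gating parameter described above: take the center-of-mass trace of an ROI over the projection sequence, band-pass it around the expected cardiac or respiratory frequency, and mark local maxima as gate points. The ROI handling, filter design and peak logic here are simplified assumptions, not the published algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def com_signal(projections: np.ndarray, roi: tuple) -> np.ndarray:
    """Row center of mass inside an ROI for each projection (frames, rows, cols)."""
    r0, r1, c0, c1 = roi
    sub = projections[:, r0:r1, c0:c1].astype(float)
    rows = np.arange(r0, r1)
    return (sub.sum(axis=2) * rows).sum(axis=1) / sub.sum(axis=(1, 2))

def gating_peaks(signal: np.ndarray, fps: float, f_lo: float, f_hi: float):
    """Band-pass the COM trace and return indices of motion maxima (gate points)."""
    b, a = butter(2, [f_lo, f_hi], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal - signal.mean())
    peaks, _ = find_peaks(filtered)
    return peaks, filtered
```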

  6. The AUDANA algorithm for automated protein 3D structure determination from NMR NOE data

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison and Biochemistry Department (United States); Petit, Chad M. [University of Alabama at Birmingham, Department of Biochemistry and Molecular Genetics (United States); Cornilescu, Gabriel; Stark, Jaime L.; Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison and Biochemistry Department (United States)

    2016-06-15

    We introduce AUDANA (Automated Database-Assisted NOE Assignment), an algorithm for determining three-dimensional structures of proteins from NMR data that automates the assignment of 3D-NOE spectra, generates distance constraints, and conducts iterative high temperature molecular dynamics and simulated annealing. The protein sequence, chemical shift assignments, and NOE spectra are the only required inputs. Distance constraints generated automatically from ambiguously assigned NOE peaks are validated during the structure calculation against information from an enlarged version of the freely available PACSY database that incorporates information on protein structures deposited in the Protein Data Bank (PDB). This approach yields robust sets of distance constraints and 3D structures. We evaluated the performance of AUDANA with input data for 14 proteins ranging in size from 6 to 25 kDa that had 27–98 % sequence identity to proteins in the database. In all cases, the automatically calculated 3D structures passed stringent validation tests. Structures were determined with and without database support. In 9/14 cases, database support improved the agreement with manually determined structures in the PDB and in 11/14 cases, database support lowered the r.m.s.d. of the family of 20 structural models.

  7. The AUDANA algorithm for automated protein 3D structure determination from NMR NOE data

    International Nuclear Information System (INIS)

    Lee, Woonghee; Petit, Chad M.; Cornilescu, Gabriel; Stark, Jaime L.; Markley, John L.

    2016-01-01

    We introduce AUDANA (Automated Database-Assisted NOE Assignment), an algorithm for determining three-dimensional structures of proteins from NMR data that automates the assignment of 3D-NOE spectra, generates distance constraints, and conducts iterative high temperature molecular dynamics and simulated annealing. The protein sequence, chemical shift assignments, and NOE spectra are the only required inputs. Distance constraints generated automatically from ambiguously assigned NOE peaks are validated during the structure calculation against information from an enlarged version of the freely available PACSY database that incorporates information on protein structures deposited in the Protein Data Bank (PDB). This approach yields robust sets of distance constraints and 3D structures. We evaluated the performance of AUDANA with input data for 14 proteins ranging in size from 6 to 25 kDa that had 27–98 % sequence identity to proteins in the database. In all cases, the automatically calculated 3D structures passed stringent validation tests. Structures were determined with and without database support. In 9/14 cases, database support improved the agreement with manually determined structures in the PDB and in 11/14 cases, database support lowered the r.m.s.d. of the family of 20 structural models.

  8. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation, a semi-automatic pipette and a fully automated pipette station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  9. Automated 3D-Printed Unibody Immunoarray for Chemiluminescence Detection of Cancer Biomarker Proteins

    Science.gov (United States)

    Tang, C. K.; Vaze, A.; Rusling, J. F.

    2017-01-01

    A low-cost three-dimensional (3D) printed clear plastic microfluidic device was fabricated for fast, low-cost automated protein detection. The unibody device features three reagent reservoirs, an efficient 3D network for passive mixing, and an optically transparent detection chamber housing a glass capture antibody array for measuring chemiluminescence output with a CCD camera. Sandwich-type assays were built onto the glass arrays using a multi-labeled detection antibody-polyHRP (HRP = horseradish peroxidase). Total assay time was ~30 min in a completely automated assay employing a programmable syringe pump, so that the protocol required minimal operator intervention. The device was used for multiplexed detection of the prostate cancer biomarker proteins prostate specific antigen (PSA) and platelet factor 4 (PF-4). Detection limits of 0.5 pg mL⁻¹ were achieved for these proteins in diluted serum with log dynamic ranges of four orders of magnitude. Good accuracy vs. ELISA was validated by analyzing human serum samples. This prototype device holds good promise for further development as a point-of-care cancer diagnostics tool. PMID:28067370

  10. Fully automated one-pot radiosynthesis of O-(2-[¹⁸F]fluoroethyl)-L-tyrosine on the TracerLab FXFN module

    Energy Technology Data Exchange (ETDEWEB)

    Bourdier, Thomas, E-mail: bts@ansto.gov.au [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia); Greguric, Ivan [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia); Roselt, Peter [Centre for Molecular Imaging, Peter MacCallum Cancer Centre, 12 St Andrew' s Place, East Melbourne, VIC, 3002 (Australia); Jackson, Tim; Faragalla, Jane; Katsifis, Andrew [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia)

    2011-07-15

    Introduction: An efficient fully automated method for the radiosynthesis of enantiomerically pure O-(2-[¹⁸F]fluoroethyl)-L-tyrosine ([¹⁸F]FET) using the GE TracerLab FXFN synthesis module via the O-(2-tosyloxyethyl)-N-trityl-L-tyrosine tert-butylester precursor has been developed. Methods: The radiolabelling of [¹⁸F]FET involved a classical [¹⁸F]fluoride nucleophilic substitution performed in acetonitrile using potassium carbonate and Kryptofix 222, followed by acid hydrolysis using 2N hydrochloric acid. Results: [¹⁸F]FET was produced in 35±5% (n=22) non-decay-corrected yield (55±5% decay-corrected), with radiochemical and enantiomeric purity of >99% and a specific activity of >90 GBq/µmol, after 63 min of radiosynthesis including HPLC purification and formulation. Conclusion: The automated radiosynthesis provides high and reproducible yields suitable for routine clinical use.

  11. A fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    International Nuclear Information System (INIS)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-01-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summated to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, also when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed. (orig.)

  12. Parameter evaluation and fully-automated radiosynthesis of [11C]harmine for imaging of MAO-A for clinical trials

    International Nuclear Information System (INIS)

    Philippe, C.; Zeilinger, M.; Mitterhauser, M.; Dumanic, M.; Lanzenberger, R.; Hacker, M.; Wadsak, W.

    2015-01-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [¹¹C]harmine for clinical trials. The following parameters were investigated: amount of base, precursor concentration, solvent, reaction temperature and time. The optimum reaction conditions were determined to be 2–3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80 °C reaction temperature and 2 min reaction time. Under these conditions 6.1±1 GBq (51.0±11% based on [¹¹C]CH₃I, corrected for decay) of [¹¹C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully automated synthesis method can be used as a routine set-up. - Highlights: • Preparation of [¹¹C]harmine on a commercially available synthesizer for routine application. • High reliability: only 4 out of 72 failed syntheses; 5% due to technical problems. • High yields: 6.1±1 GBq overall yield (EOS). • High specific activities: 101.32±28.2 GBq/µmol

  13. Comparison of subjective and fully automated methods for measuring mammographic density.

    Science.gov (United States)

    Moshina, Nataliia; Roman, Marta; Sebuødegård, Sofie; Waade, Gunvor G; Ursin, Giske; Hofvind, Solveig

    2018-02-01

    Background: Breast radiologists of the Norwegian Breast Cancer Screening Program subjectively classified mammographic density using a three-point scale between 1996 and 2012 and changed to the fourth edition of the BI-RADS classification in 2013. In 2015, automated volumetric breast density assessment software was installed at two screening units. Purpose: To compare volumetric breast density measurements from the automated method with two subjective methods: the three-point scale and the BI-RADS density classification. Material and Methods: Information on subjective and automated density assessment was obtained from screening examinations of 3635 women recalled for further assessment due to positive screening mammography between 2007 and 2015. The score of the three-point scale (I = fatty; II = medium dense; III = dense) was available for 2310 women. The BI-RADS density score was provided for 1325 women. Mean volumetric breast density was estimated for each category of the subjective classifications. The automated software assigned volumetric breast density to four categories. The agreement between BI-RADS and volumetric breast density categories was assessed using weighted kappa (κw). Results: Mean volumetric breast density was 4.5%, 7.5%, and 13.4% for categories I, II, and III of the three-point scale, respectively, and 4.4%, 7.5%, 9.9%, and 13.9% for the BI-RADS density categories, respectively. The agreement between BI-RADS and volumetric breast density categories was κw = 0.5 (95% CI = 0.47–0.53). Conclusion: Volumetric breast density increased with increasing density category of the subjective classifications. The agreement between BI-RADS and volumetric breast density categories was moderate.
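
    The weighted kappa used above can be reproduced directly with scikit-learn's cohen_kappa_score once the paired category labels are available; the labels below are made-up values for illustration only.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired density grades (1-4) from BI-RADS and the automated software
birads  = [1, 2, 2, 3, 4, 3, 2, 1, 4, 3]
volpara = [1, 2, 3, 3, 4, 2, 2, 2, 4, 3]

# Linearly weighted kappa penalizes disagreements by their distance on the scale
kw = cohen_kappa_score(birads, volpara, weights="linear")
print(f"weighted kappa = {kw:.2f}")
```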

  14. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
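
    The deconvolution idea can be sketched as a non-negative linear inverse problem: the observed electropherogram is modeled as the true allele signal convolved with a stutter kernel, and allele amounts are recovered by non-negative least squares. The kernel fractions and bin layout below are illustrative assumptions, not the calibration used by the authors.

```python
import numpy as np
from scipy.optimize import nnls

def build_stutter_matrix(n_bins: int, kernel: np.ndarray) -> np.ndarray:
    """Column j holds the stutter pattern produced by a pure allele in bin j
    (kernel[0] at the allele position, kernel[k] at k repeat units shorter)."""
    A = np.zeros((n_bins, n_bins))
    for j in range(n_bins):
        for k, frac in enumerate(kernel):
            if j - k >= 0:
                A[j - k, j] = frac
    return A

# Hypothetical stutter kernel: 70% main band, 20%/8%/2% at -1/-2/-3 repeats
kernel = np.array([0.70, 0.20, 0.08, 0.02])
A = build_stutter_matrix(10, kernel)

true = np.zeros(10)
true[4], true[7] = 70.0, 100.0      # simulated heterozygote with close alleles
observed = A @ true                 # stuttered, overlapping band pattern
recovered, _ = nnls(A, observed)    # deconvolved allele amounts
print(np.round(recovered, 1))       # peaks reappear only at bins 4 and 7
```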

  15. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    International Nuclear Information System (INIS)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M

    2016-01-01

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms across 3 phantom types, using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the best local mean similarity, whose neighboring slices also meet the threshold requirement, is chosen as the classifier’s matched slice (if one exists). The classifier whose matched slice has the best local mean similarity is then chosen as the ensemble’s best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification achieved 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully automated CT phantom QA system consistent with manual QA performance. Further work will include parallel

  16. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    Energy Technology Data Exchange (ETDEWEB)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M [UCLA Radiological Sciences, Los Angeles, CA (United States)

    2016-06-15

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms across 3 phantom types, using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the best local mean similarity, whose neighboring slices also meet the threshold requirement, is chosen as the classifier’s matched slice (if one exists). The classifier whose matched slice has the best local mean similarity is then chosen as the ensemble’s best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification achieved 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully automated CT phantom QA system consistent with manual QA performance. Further work will include parallel
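
    A simplified version of the slice-matching logic described above: score each slice of the input volume against a classifier's template slice with a similarity metric, keep neighborhoods that clear a pre-trained threshold, and return the slice with the best local mean similarity. The metric (normalized cross-correlation), window size and threshold are assumptions for illustration, not the published system.

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two slices."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def best_matching_slice(volume: np.ndarray, template: np.ndarray,
                        threshold: float = 0.6, half_window: int = 2):
    """Index of the slice with the best local mean similarity, or None if no
    neighborhood of slices clears the threshold."""
    sims = np.array([ncc(s, template) for s in volume])
    best_idx, best_score = None, -np.inf
    for i in range(half_window, len(sims) - half_window):
        window = sims[i - half_window:i + half_window + 1]
        if window.min() >= threshold and window.mean() > best_score:
            best_idx, best_score = i, window.mean()
    return best_idx
```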

  17. Quantification of common carotid artery and descending aorta vessel wall thickness from MR vessel wall imaging using a fully automated processing pipeline.

    Science.gov (United States)

    Gao, Shan; van 't Klooster, Ronald; Brandts, Anne; Roes, Stijntje D; Alizadeh Dehnavi, Reza; de Roos, Albert; Westenberg, Jos J M; van der Geest, Rob J

    2017-01-01

    To develop and evaluate a method that can fully automatically identify the vessel wall boundaries and quantify the wall thickness for both the common carotid artery (CCA) and descending aorta (DAO) from axial magnetic resonance (MR) images. 3T MRI data acquired with a T1-weighted gradient-echo black-blood imaging sequence from carotid (39 subjects) and aorta (39 subjects) were used to develop and test the algorithm. The vessel wall segmentation was achieved by fitting 3D cylindrical B-spline surfaces to the boundaries of the lumen and outer wall, respectively. The tube fitting was based on edge detection performed on the signal intensity (SI) profile along the surface normal. To achieve a fully automated process, a Hough transform (HT) was developed to estimate the lumen centerline and radii of the target vessel. Using the outputs of the HT, a tube model for lumen segmentation was initialized and deformed to fit the image data. Finally, the lumen segmentation was dilated to initiate the adaptation procedure of the outer wall tube. The algorithm was validated by determining: 1) its performance against manual tracing; 2) its interscan reproducibility in quantifying vessel wall thickness (VWT); and 3) its capability of detecting VWT differences in hypertensive patients compared with healthy controls. Statistical analyses, including Bland-Altman analysis, t-tests, and sample size calculation, were performed to evaluate the algorithm. The mean distance between the manual and automatically detected lumen/outer wall contours was 0.00 ± 0.23/0.09 ± 0.21 mm for the CCA and 0.12 ± 0.24/0.14 ± 0.35 mm for the DAO. No significant difference was observed between the interscan VWT assessments using automated segmentation for either the CCA (P = 0.19) or the DAO (P = 0.94). Both manual and automated segmentation detected significantly higher carotid (P = 0.016 and P = 0.005) and aortic (P < 0.001 and P = 0.021) wall thickness in the hypertensive patients. A reliable and reproducible pipeline for fully
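
    The Hough-transform initialization of the lumen centre and radius can be sketched with scikit-image's circular Hough transform; the edge detector settings, radius range and single-peak selection are assumptions, not the published pipeline.

```python
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

def estimate_lumen(slice_2d: np.ndarray, radii_mm, pixel_size_mm: float):
    """Estimate the lumen centre (row, col) and radius (mm) on one axial slice."""
    edges = canny(slice_2d, sigma=2.0)
    radii_px = np.round(np.asarray(radii_mm) / pixel_size_mm).astype(int)
    hspaces = hough_circle(edges, radii_px)
    _, cx, cy, radii = hough_circle_peaks(hspaces, radii_px, total_num_peaks=1)
    return (int(cy[0]), int(cx[0])), float(radii[0]) * pixel_size_mm
```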

  18. Automation in structural biology beamlines of the Photon Factory

    International Nuclear Information System (INIS)

    Igarashi, Noriyuki; Hiraki, Masahiko; Matsugaki, Naohiro; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

    The Photon Factory currently operates four synchrotron beamlines for protein crystallography, and two more beamlines are scheduled to be constructed in the coming years. Over recent years these beamlines have been upgraded and equipped with a fully automated beamline control system based on a robotic sample changer. The current system allows remote operation, controlled from the user's area, of sample mounting, centering and data collection for pre-frozen crystals mounted in Hampton-type cryo-loops on the goniometer head. New intuitive graphical user interfaces have been developed to control the complete beamline operation. Furthermore, algorithms for automatic sample centering based on pattern matching and X-ray beam scanning are being developed and combined with newly developed diffraction evaluation programs in order to fully automate data collection. (author)

  19. APSY-NMR for protein backbone assignment in high-throughput structural biology

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, Samit Kumar; Serrano, Pedro; Proudfoot, Andrew; Geralt, Michael [The Scripps Research Institute, Department of Integrative Structural and Computational Biology (United States); Pedrini, Bill [Paul Scherrer Institute (PSI), SwissFEL Project (Switzerland); Herrmann, Torsten [Université de Lyon, Institut des Sciences Analytiques, Centre de RMN à Très Hauts Champs, UMR 5280 CNRS, ENS Lyon, UCB Lyon 1 (France); Wüthrich, Kurt, E-mail: wuthrich@scripps.edu [The Scripps Research Institute, Department of Integrative Structural and Computational Biology (United States)

    2015-01-15

    A standard set of three APSY-NMR experiments has been used in daily practice to obtain polypeptide backbone NMR assignments in globular proteins with sizes up to about 150 residues, which had been identified as targets for structure determination by the Joint Center for Structural Genomics (JCSG) under the auspices of the Protein Structure Initiative (PSI). In a representative sample of 30 proteins, initial fully automated data analysis with the software UNIO-MATCH-2014 yielded complete or partial assignments for over 90% of the residues. For most proteins the APSY data acquisition was completed in less than 30 h. The results of the automated procedure provided a basis for efficient interactive validation and extension to near-completion of the assignments by reference to the same 3D heteronuclear-resolved [¹H,¹H]-NOESY spectra that were subsequently used for the collection of conformational constraints. High-quality structures were obtained for all 30 proteins, using the J-UNIO protocol, which includes extensive automation of NMR structure determination.

  20. EDM-DEDM and protein crystal structure solution.

    Science.gov (United States)

    Caliandro, Rocco; Carrozzini, Benedetta; Cascarano, Giovanni Luca; Giacovazzo, Carmelo; Mazzone, Anna Maria; Siliqi, Dritan

    2009-05-01

    Electron-density modification (EDM) procedures are the classical tool for driving model phases closer to those of the target structure. They are often combined with automated model-building programs to provide a correct protein model. The task is not always performed, mostly because of the large initial phase error. A recently proposed procedure combined EDM with DEDM (difference electron-density modification); the method was applied to the refinement of phases obtained by molecular replacement, ab initio or SAD phasing [Caliandro, Carrozzini, Cascarano, Giacovazzo, Mazzone & Siliqi (2009), Acta Cryst. D65, 249-256] and was more effective in improving phases than EDM alone. In this paper, a novel fully automated protocol for protein structure refinement based on the iterative application of automated model-building programs combined with the additional power derived from the EDM-DEDM algorithm is presented. The cyclic procedure was successfully tested on challenging cases for which all other approaches had failed.

  1. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    … system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house-made bioreactor. This is the first demonstration of using the Dean drag force, generated by a curved microchannel geometry in conjunction with high … flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so … and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells …

  2. PONDEROSA, an automated 3D-NOESY peak picking program, enables automated protein structure determination.

    Science.gov (United States)

    Lee, Woonghee; Kim, Jin Hae; Westler, William M; Markley, John L

    2011-06-15

    PONDEROSA (Peak-picking Of Noe Data Enabled by Restriction of Shift Assignments) accepts input information consisting of a protein sequence, backbone and sidechain NMR resonance assignments, and 3D-NOESY ((13)C-edited and/or (15)N-edited) spectra, and returns assignments of NOESY crosspeaks, distance and angle constraints, and a reliable NMR structure represented by a family of conformers. PONDEROSA incorporates and integrates external software packages (TALOS+, STRIDE and CYANA) to carry out different steps in the structure determination. PONDEROSA implements internal functions that identify and validate NOESY peak assignments and assess the quality of the calculated three-dimensional structure of the protein. The robustness of the analysis results from PONDEROSA's hierarchical processing steps that involve iterative interaction among the internal and external modules. PONDEROSA supports a variety of input formats: SPARKY assignment table (.shifts) and spectrum file formats (.ucsf), XEASY proton file format (.prot), and NMR-STAR format (.star). To demonstrate the utility of PONDEROSA, we used the package to determine 3D structures of two proteins: human ubiquitin and Escherichia coli iron-sulfur scaffold protein variant IscU(D39A). The automatically generated structural constraints and ensembles of conformers were as good as or better than those determined previously by much less automated means. The program, in the form of binary code along with tutorials and reference manuals, is available at http://ponderosa.nmrfam.wisc.edu/.

  3. Automatic selection of reference taxa for protein-protein interaction prediction with phylogenetic profiling

    DEFF Research Database (Denmark)

    Simonsen, Martin; Maetschke, S.R.; Ragan, M.A.

    2012-01-01

    Motivation: Phylogenetic profiling methods can achieve good accuracy in predicting protein–protein interactions, especially in prokaryotes. Recent studies have shown that the choice of reference taxa (RT) is critical for accurate prediction, but with more than 2500 fully sequenced taxa publicly available … Results: We present three novel methods for automating the selection of RT, using machine learning based on known protein–protein interaction networks. One of these methods in particular, Tree-Based Search, yields greatly improved prediction accuracies. We further show that different methods for constituting phylogenetic profiles often require very different RT sets to support high prediction accuracy.
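
    For context, a phylogenetic profile is a presence/absence vector of a protein's orthologs across the chosen reference taxa, and an interaction is predicted when two profiles are similar. A toy sketch follows; the Hamming-style similarity and the taxa listed are arbitrary choices for illustration, not the methods introduced in the paper.

```python
import numpy as np

def profile(ortholog_hits: set, reference_taxa: list) -> np.ndarray:
    """Presence/absence vector of a protein's orthologs across reference taxa."""
    return np.array([1 if taxon in ortholog_hits else 0 for taxon in reference_taxa])

def profile_similarity(p: np.ndarray, q: np.ndarray) -> float:
    """Fraction of reference taxa on which the two profiles agree."""
    return float((p == q).mean())

# Toy example with five hypothetical reference taxa
taxa = ["E.coli", "B.subtilis", "M.tuberculosis", "S.cerevisiae", "H.sapiens"]
prot_a = profile({"E.coli", "B.subtilis", "S.cerevisiae"}, taxa)
prot_b = profile({"E.coli", "B.subtilis", "H.sapiens"}, taxa)
print(profile_similarity(prot_a, prot_b))  # 0.6 -> weak evidence of interaction
```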

  4. Fully automated SPE-based synthesis and purification of 2-[¹⁸F]fluoroethyl-choline for human use

    Energy Technology Data Exchange (ETDEWEB)

    Schmaljohann, Joern [Department of Nuclear Medicine, University of Bonn, Bonn (Germany); Department of Nuclear Medicine, University of Aachen, Aachen (Germany); Schirrmacher, Esther [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Waengler, Bjoern; Waengler, Carmen [Department of Nuclear Medicine, Ludwig-Maximilians University, Munich (Germany); Schirrmacher, Ralf, E-mail: ralf.schirrmacher@mcgill.c [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Guhlke, Stefan, E-mail: stefan.guhlke@ukb.uni-bonn.d [Department of Nuclear Medicine, University of Bonn, Bonn (Germany)

    2011-02-15

    Introduction: 2-[{sup 18}F]Fluoroethyl-choline ([{sup 18}F]FECH) is a promising tracer for the detection of prostate cancer as well as brain tumors with positron emission tomography (PET). [{sup 18}F]FECH is actively transported into mammalian cells, becomes phosphorylated by choline kinase and gets incorporated into the cell membrane after being metabolized to phosphatidylcholine. So far, its synthesis is a two-step procedure involving at least one HPLC purification step. To allow a wider dissemination of this tracer, finding a purification method avoiding HPLC is highly desirable and would result in easier accessibility and more reliable production of [{sup 18}F]FECH. Methods: [{sup 18}F]FECH was synthesized by reaction of 2-bromo-1-[{sup 18}F]fluoroethane ([{sup 18}F]BFE) with dimethylaminoethanol (DMAE) in DMSO. We applied a novel and very reliable work-up procedure for the synthesis of [{sup 18}F]BFE. Based on a combination of three different solid-phase cartridges, the purification of [{sup 18}F]BFE from its precursor 2-bromoethyl-4-nitrobenzenesulfonate (BENos) could be achieved without using HPLC. Following the subsequent reaction of the purified [{sup 18}F]BFE with DMAE, the final product [{sup 18}F]FECH was obtained as a sterile solution by passing the crude reaction mixture through a combination of two CM plus cartridges and a sterile filter. The fully automated synthesis was performed using either a Raytest SynChrom module (Raytest, Germany) or a Scintomics HotboxIII module (Scintomics, Germany). Results: The radiotracer [{sup 18}F]FECH can be synthesized in reliable radiochemical yields (RCY) of 37{+-}5% (Synchrom module) and 33{+-}5% (Hotbox III unit) in less than 1 h using these two fully automated, commercially available synthesis units without HPLC involvement for purification. Detailed quality control of the final injectable [{sup 18}F]FECH solution proved the high radiochemical purity and the absence of Kryptofix2.2.2, DMAE and DMSO used in the

  5. Automated Hydrophobic Interaction Chromatography Column Selection for Use in Protein Purification

    Science.gov (United States)

    Murphy, Patrick J. M.; Stone, Orrin J.; Anderson, Michelle E.

    2011-01-01

    In contrast to other chromatographic methods for purifying proteins (e.g. gel filtration, affinity, and ion exchange), hydrophobic interaction chromatography (HIC) commonly requires experimental determination (referred to as screening or "scouting") in order to select the most suitable chromatographic medium for purifying a given protein 1. The method presented here describes an automated approach to scouting for an optimal HIC media to be used in protein purification. HIC separates proteins and other biomolecules from a crude lysate based on differences in hydrophobicity. Similar to affinity chromatography (AC) and ion exchange chromatography (IEX), HIC is capable of concentrating the protein of interest as it progresses through the chromatographic process. Proteins best suited for purification by HIC include those with hydrophobic surface regions and able to withstand exposure to salt concentrations in excess of 2 M ammonium sulfate ((NH4)2SO4). HIC is often chosen as a purification method for proteins lacking an affinity tag, and thus unsuitable for AC, and when IEX fails to provide adequate purification. Hydrophobic moieties on the protein surface temporarily bind to a nonpolar ligand coupled to an inert, immobile matrix. The interaction between protein and ligand are highly dependent on the salt concentration of the buffer flowing through the chromatography column, with high ionic concentrations strengthening the protein-ligand interaction and making the protein immobile (i.e. bound inside the column) 2. As salt concentrations decrease, the protein-ligand interaction dissipates, the protein again becomes mobile and elutes from the column. Several HIC media are commercially available in pre-packed columns, each containing one of several hydrophobic ligands (e.g. S-butyl, butyl, octyl, and phenyl) cross-linked at varying densities to agarose beads of a specific diameter 3. Automated column scouting allows for an efficient approach for determining which HIC media
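
    The decision step at the end of such a scouting run can be pictured with a small sketch; the media names, metrics and equal weighting below are hypothetical and are not part of the published protocol, which relies on the chromatography workstation itself.

      # Illustrative ranking of HIC scouting runs; column names, metrics and the
      # weighting are assumptions, not part of the published protocol.

      scouting_results = [
          {"medium": "Phenyl HP", "recovery": 0.82, "purity": 0.91},
          {"medium": "Butyl FF",  "recovery": 0.90, "purity": 0.74},
          {"medium": "Octyl FF",  "recovery": 0.65, "purity": 0.88},
      ]

      def score(run, w_recovery=0.5, w_purity=0.5):
          # Equal weighting is an arbitrary choice for illustration.
          return w_recovery * run["recovery"] + w_purity * run["purity"]

      best = max(scouting_results, key=score)
      print(f"Best scouting medium: {best['medium']} (score {score(best):.2f})")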

  6. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  7. A fully automated mass spectrometer for the analysis of organic solids

    International Nuclear Information System (INIS)

    Hillig, H.; Kueper, H.; Riepe, W.

    1979-01-01

    Automation of a mass spectrometer-computer system makes it possible to process up to 30 samples without attention after sample loading. An automatic sample changer introduces the samples successively into the ion source by means of a direct inlet probe. A process control unit determines the operation sequence. Computer programs are available for the hardware support, system supervision and evaluation of the spectrometer signals. The most essential precondition for automation - automatic evaporation of the sample material by electronic control of the total ion current - is confirmed to be satisfactory. The system operates routinely overnight in an industrial laboratory, so that day work can be devoted to difficult analytical problems. The cost of routine analyses is halved. (Auth.)

  8. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3–/4– Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6]3-/4- process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.
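
    For readers unfamiliar with the kinetic model all of these analyses assume, the Butler-Volmer rate constants can be written out directly; the sketch below gives only the rate expressions and is not the full AC voltammetric simulation with mass transport, Ru and Cdl used in the study.

      import math

      F = 96485.332   # Faraday constant, C/mol
      R = 8.314462    # gas constant, J/(mol K)

      def butler_volmer_rates(E, E0, k0, alpha, T=298.15):
          """Heterogeneous electron-transfer rate constants (Butler-Volmer model).
          E, E0 in volts; k0 in cm/s; alpha dimensionless."""
          f = F / (R * T)
          kf = k0 * math.exp(-alpha * f * (E - E0))          # reduction
          kb = k0 * math.exp((1.0 - alpha) * f * (E - E0))   # oxidation
          return kf, kb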

  9. Comparison and clinical utility evaluation of four multiple allergen simultaneous tests including two newly introduced fully automated analyzers

    Directory of Open Access Journals (Sweden)

    John Hoon Rim

    2016-04-01

    Background: We compared the diagnostic performances of two newly introduced fully automated multiple allergen simultaneous test (MAST) analyzers with two conventional MAST assays. Methods: Serum samples from a total of 53 and 104 patients were tested for food panels and inhalant panels, respectively, in four analyzers: AdvanSure AlloScreen (LG Life Science, Korea), AdvanSure Allostation Smart II (LG Life Science, Korea), PROTIA Allergy-Q (ProteomeTech, Korea), and RIDA Allergy Screen (R-Biopharm, Germany). We compared not only the total agreement percentages but also the positive propensities among the four analyzers. Results: Evaluation of AdvanSure Allostation Smart II as an upgraded version of AdvanSure AlloScreen revealed good concordance, with total agreement percentages of 93.0% and 92.2% in the food and inhalant panels, respectively. Comparisons of AdvanSure Allostation Smart II or PROTIA Allergy-Q with RIDA Allergy Screen also showed good concordance, with positive propensities of the two new analyzers for common allergens (Dermatophagoides farinae and Dermatophagoides pteronyssinus). Changes in the cut-off level resulted in varying total agreement percentages among allergens and analyzers, although the current cut-off level of class 2 appeared to be generally suitable. Conclusions: AdvanSure Allostation Smart II and PROTIA Allergy-Q presented favorable agreement performances with RIDA Allergy Screen, although positive propensities were noticed for common allergens. Keywords: Multiple allergen simultaneous test, Automated analyzer

  10. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-01-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  11. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    A method for automated macromolecular main-chain model building is described. An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
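
    The FFT-based template matching mentioned above amounts to scoring every placement of a helix or strand template against the map by cross-correlation, evaluated as a product in Fourier space; the NumPy sketch below illustrates the idea and is not RESOLVE's implementation.

      import numpy as np

      def fft_cross_correlation(density_map, template):
          """Score every placement of `template` in `density_map` by correlation.
          Both arrays must have the same shape (zero-pad the template); peaks in
          the returned map mark likely helix/strand locations."""
          f_map = np.fft.fftn(density_map)
          f_tpl = np.fft.fftn(template)
          # Correlation theorem: corr = IFFT( FFT(map) * conj(FFT(template)) )
          return np.fft.ifftn(f_map * np.conj(f_tpl)).real

      # Usage sketch: rho and helix_template are 3D arrays on the same grid.
      # best = np.unravel_index(np.argmax(fft_cross_correlation(rho, helix_template)),
      #                         rho.shape)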

  12. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    Science.gov (United States)

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. The relevant extraction parameters, such as the dispersive solvent, the proportion of aqueous/organic phase, pH and flow rates, have been carefully evaluated. Caffeine quantification was linear from 2 to 75 mg L(-1), with detection and quantification limits of 0.46 mg L(-1) and 1.54 mg L(-1), respectively. A coefficient of variation (n=8; 5 mg L(-1)) of 2.1% and a sampling rate of 16 h(-1) were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant and decaf coffee samples, with the results validated using high-performance liquid chromatography. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. A fully automated fast analysis system for capillary gas chromatography. Part 1. Automation of system control

    NARCIS (Netherlands)

    Snijders, H.M.J.; Rijks, J.P.E.M.; Bombeeck, A.J.; Rijks, J.A.; Sandra, P.; Lee, M.L.

    1992-01-01

    This paper deals with the design, automation and evaluation of a high-speed capillary gas chromatographic system. A combination of software and hardware was developed for a new cold trap/reinjection device that allows selective solvent elimination and on-column sample enrichment and an

  14. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  15. Automated protein identification by the combination of MALDI MS and MS/MS spectra from different instruments.

    Science.gov (United States)

    Levander, Fredrik; James, Peter

    2005-01-01

    The identification of proteins separated on two-dimensional gels is most commonly performed by trypsin digestion and subsequent matrix-assisted laser desorption ionization (MALDI) with time-of-flight (TOF). Recently, atmospheric pressure (AP) MALDI coupled to an ion trap (IT) has emerged as a convenient method to obtain tandem mass spectra (MS/MS) from samples on MALDI target plates. In the present work, we investigated the feasibility of using the two methodologies in line as a standard method for protein identification. In this setup, the high mass accuracy MALDI-TOF spectra are used to calibrate the peptide precursor masses in the lower mass accuracy AP-MALDI-IT MS/MS spectra. Several software tools were developed to automate the analysis process. Two sets of MALDI samples, consisting of 142 and 421 gel spots, respectively, were analyzed in a highly automated manner. In the first set, the protein identification rate increased from 61% for MALDI-TOF only to 85% for MALDI-TOF combined with AP-MALDI-IT. In the second data set the increase in protein identification rate was from 44% to 58%. AP-MALDI-IT MS/MS spectra were in general less effective than the MALDI-TOF spectra for protein identification, but the combination of the two methods clearly enhanced the confidence in protein identification.

  16. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    Directory of Open Access Journals (Sweden)

    S. J. Andrews

    2015-04-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater–air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP&T) unit coupled to a commercial thermal desorption and gas chromatograph mass spectrometer (TD-GC-MS). The AutoP&T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34–180 °C with Henry's law coefficients of 0.018 and greater (CH2I2; kHcc, dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  17. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    Energy Technology Data Exchange (ETDEWEB)

    Rosato, Antonio [University of Florence, Department of Chemistry and Magnetic Resonance Center (Italy); Vranken, Wim [Vrije Universiteit Brussel, Structural Biology Brussels (Belgium); Fogh, Rasmus H.; Ragan, Timothy J. [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom); Tejero, Roberto [Universidad de Valencia, Departamento de Química Física (Spain); Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H. [University of Georgia, Complex Carbohydrate Research Center and Northeast Structural Genomics Consortium (United States); Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H. [University of Toronto, Department of Medical Biophysics, Cancer Genomics and Proteomics, Ontario Cancer Institute, Northeast Structural Genomics Consortium (Canada); Kennedy, Michael [Miami University, Department of Chemistry and Biochemistry, Northeast Structural Genomics Consortium (United States); Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, Northeast Structural Genomics Consortium, Rutgers (United States); Vuister, Geerten W., E-mail: gv29@le.ac.uk [University of Leicester, Department of Biochemistry, School of Biological Sciences (United Kingdom)

    2015-08-15

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.

  18. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013

    International Nuclear Information System (INIS)

    Rosato, Antonio; Vranken, Wim; Fogh, Rasmus H.; Ragan, Timothy J.; Tejero, Roberto; Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H.; Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H.; Kennedy, Michael; Acton, Thomas B.; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T.; Vuister, Geerten W.

    2015-01-01

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, that were made available in both curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire data set, 71 % of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results with up to 100 % of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90 % of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged

  19. Plant protein annotation in the UniProt Knowledgebase.

    Science.gov (United States)

    Schneider, Michel; Bairoch, Amos; Wu, Cathy H; Apweiler, Rolf

    2005-05-01

    The Swiss-Prot, TrEMBL, Protein Information Resource (PIR), and DNA Data Bank of Japan (DDBJ) protein database activities have united to form the Universal Protein Resource (UniProt) Consortium. UniProt presents three database layers: the UniProt Archive, the UniProt Knowledgebase (UniProtKB), and the UniProt Reference Clusters. The UniProtKB consists of two sections: UniProtKB/Swiss-Prot (fully manually curated entries) and UniProtKB/TrEMBL (automated annotation, classification and extensive cross-references). New releases are published fortnightly. A specific Plant Proteome Annotation Program (http://www.expasy.org/sprot/ppap/) was initiated to cope with the increasing amount of data produced by the complete sequencing of plant genomes. Through UniProt, our aim is to provide the scientific community with a single, centralized, authoritative resource for protein sequences and functional information that will allow the plant community to fully explore and utilize the wealth of information available for both plant and non-plant model organisms.

  20. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    Science.gov (United States)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where, in clinical practice, the observer must identify first the presence and then the location of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. However, it can be difficult for a human observer to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, thus exposing the patient to a lower radiation dose, which, for a progressive disease such as atherosclerosis where multiple scans may be required, is beneficial to their health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of the unwanted regions of the cardiac image slices, such as lungs, ribs, and vertebrae, is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity, and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground truth scores averaged over three expert observers. The results presented here are intended to show the feasibility of, and requirement for, an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
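
    For orientation, the two scores named above can be written down compactly. The sketch below uses the standard 130 HU threshold and Agatston density weights, but lumps all calcified voxels in a slice together, whereas a full implementation applies the weight per connected lesion; it is an illustration, not the authors' code.

      import numpy as np

      def agatston_and_volume(slices_hu, lesion_masks, pixel_area_mm2, slice_thickness_mm):
          """slices_hu: list of 2D HU arrays; lesion_masks: matching boolean arrays
          marking calcified plaque candidates within the coronary arteries."""
          def density_weight(max_hu):
              if max_hu >= 400: return 4
              if max_hu >= 300: return 3
              if max_hu >= 200: return 2
              if max_hu >= 130: return 1
              return 0

          agatston, volume_mm3 = 0.0, 0.0
          for hu, mask in zip(slices_hu, lesion_masks):
              calcified = mask & (hu >= 130)          # standard Agatston threshold
              if not calcified.any():
                  continue
              area_mm2 = calcified.sum() * pixel_area_mm2
              agatston += area_mm2 * density_weight(hu[calcified].max())
              volume_mm3 += calcified.sum() * pixel_area_mm2 * slice_thickness_mm
          return agatston, volume_mm3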

  1. Automated mammographic breast density estimation using a fully convolutional network.

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M

    2018-03-01

    The purpose of this study was to develop a fully automated algorithm for mammographic breast density estimation using deep learning. Our algorithm used a fully convolutional network, which is a deep learning framework for image segmentation, to segment both the breast and the dense fibroglandular areas on mammographic images. Using the segmented breast and dense areas, our algorithm computed the breast percent density (PD), which is the fraction of dense area in a breast. Our dataset included full-field digital screening mammograms of 604 women, which included 1208 mediolateral oblique (MLO) and 1208 craniocaudal (CC) views. We allocated 455, 58, and 91 of the 604 women and their exams into training, testing, and validation datasets, respectively. We established ground truth for the breast and the dense fibroglandular areas via manual segmentation and via segmentation using a simple thresholding based on BI-RADS density assessments by radiologists, respectively. Using the mammograms and ground truth, we fine-tuned a pretrained deep learning network to segment both the breast and the fibroglandular areas. Using the validation dataset, we evaluated the performance of the proposed algorithm against radiologists' BI-RADS density assessments. Specifically, we conducted a correlation analysis between the BI-RADS density assessment of a given breast and its corresponding PD estimate by the proposed algorithm. In addition, we evaluated our algorithm in terms of its ability to classify the BI-RADS density using PD estimates, and its ability to provide consistent PD estimates for the left and the right breast and the MLO and CC views of the same women. To show the effectiveness of our algorithm, we compared its performance against a state-of-the-art algorithm, the Laboratory for Individualized Breast Radiodensity Assessment (LIBRA). The PD estimated by our algorithm correlated well with BI-RADS density ratings by radiologists. Pearson's rho values of
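
    The percent density figure follows directly from the two segmentation masks; a minimal sketch (not the authors' network code) is:

      import numpy as np

      def percent_density(breast_mask, dense_mask):
          """PD = percentage of the segmented breast area that is dense tissue.
          Both inputs are boolean arrays from the segmentation step."""
          breast_area = np.count_nonzero(breast_mask)
          dense_area = np.count_nonzero(dense_mask & breast_mask)
          return 100.0 * dense_area / breast_area if breast_area else 0.0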

  2. A fully robust PARAFAC method for analyzing fluorescence data

    DEFF Research Database (Denmark)

    Engelen, Sanne; Frosch, Stina; Jørgensen, Bo

    2009-01-01

    ... and Rayleigh scatter. Recently, a robust PARAFAC method that circumvents the harmful effects of outlying samples has been developed. Different techniques exist for removing the scatter effects on the final PARAFAC model, and an automated scatter identification tool has recently been constructed. However, there still exists no robust method for handling fluorescence data encountering both outlying EEM landscapes and scatter. In this paper, we present an iterative algorithm where the robust PARAFAC method and the scatter identification tool are alternately performed. A fully automated robust PARAFAC method...

  3. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo [Nishi-Kobe Medical Center (Japan); Yonekura, Yoshiharu [Fukui Medical Univ., Matsuoka (Japan); Konishi, Junji [Kyoto Univ. (Japan). Graduate School of Medicine

    2003-03-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer ({sup 99m}Tc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T{sub 2}-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using {sup 99m}Tc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  4. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    International Nuclear Information System (INIS)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo; Yonekura, Yoshiharu; Konishi, Junji

    2003-01-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer (99mTc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T2-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using 99mTc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  5. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations

  6. Full automation and validation of a flexible ELISA platform for host cell protein and protein A impurity detection in biopharmaceuticals.

    Science.gov (United States)

    Rey, Guillaume; Wendeler, Markus W

    2012-11-01

    Monitoring host cell protein (HCP) and protein A impurities is important to ensure successful development of recombinant antibody drugs. Here, we report the full automation and validation of an ELISA platform on a robotic system that allows the detection of Chinese hamster ovary (CHO) HCPs and residual protein A of in-process control samples and final drug substance. The ELISA setup is designed to serve three main goals: high sample throughput, high quality of results, and sample handling flexibility. The processing of analysis requests, determination of optimal sample dilutions, and calculation of impurity content is performed automatically by a spreadsheet. Up to 48 samples in three unspiked and spiked dilutions each are processed within 24 h. The dilution of each sample is individually prepared based on the drug concentration and the expected impurity content. Adaptable dilution protocols allow the analysis of sample dilutions ranging from 1:2 to 1:2×10(7). The validity of results is assessed by automatic testing for dilutional linearity and spike recovery for each sample. This automated impurity ELISA facilitates multi-project process development, is easily adaptable to other impurity ELISA formats, and increases analytical capacity by combining flexible sample handling with high data quality. Copyright © 2012 Elsevier B.V. All rights reserved.
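
    The two per-sample validity checks mentioned above, dilutional linearity and spike recovery, are simple arithmetic; the sketch below shows one plausible form of the calculation, with the 80-120% acceptance window given purely as an example rather than as the authors' criterion.

      def spike_recovery_percent(measured_spiked, measured_unspiked, spike_added):
          """Recovery of a known impurity spike, in percent."""
          return 100.0 * (measured_spiked - measured_unspiked) / spike_added

      def dilution_corrected(results_by_dilution):
          """results_by_dilution: {dilution_factor: measured impurity level}.
          Returns back-calculated levels; linearity holds when these agree."""
          return {d: v * d for d, v in results_by_dilution.items()}

      # Example acceptance check (window is illustrative only):
      recovery = spike_recovery_percent(measured_spiked=18.4,
                                        measured_unspiked=8.1,
                                        spike_added=10.0)
      sample_passes = 80.0 <= recovery <= 120.0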

  7. Analysis of xanthines in beverages using a fully automated SPE-SPC-DAD hyphenated system

    Energy Technology Data Exchange (ETDEWEB)

    Medvedovici, A. [Bucarest Univ., Bucarest (Romania). Faculty of Chemistry, Dept. of Analytical Chemistry; David, F.; David, V.; Sandra, P. [Research Institute of Chromatography, Kortrijk (Belgium)

    2000-08-01

    Analysis of some xanthines (caffeine, theophylline and theobromine) in beverages has been achieved by a fully automated on-line Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD) system. Three adsorbents have been tested for the SPE procedure: octadecyl-modified silica gel (ODS) and two types of styrene-divinylbenzene copolymer based materials, of which Porapack proved to be the most suitable adsorbent. Optimisation and correlation of both the SPE and SFC operational parameters are also discussed. By this technique, caffeine was determined in ice tea and Coca-Cola at a concentration of 0.15 ppm, theobromine at 1.5 ppb, and theophylline at 0.15 ppb. [Italian] The analysis of some xanthines (caffeine, theophylline and theobromine) was carried out with a fully automated on-line system based on Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD). Three sorbents were evaluated for the SPE procedure: octadecyl silica (ODS) and two styrene-divinylbenzene based polymeric materials, of which the one designated PRP-1 proved to be the most efficient. Both the optimisation and the correlation of the operational parameters for SPE and SFC are discussed. With this technique, caffeine, theobromine and theophylline were determined in iced tea and Coca-Cola at concentrations of 0.15, 1.5 and 0.15 ppm.

  8. Safe interaction between cyclists, pedestrians and automated vehicles : what do we know and what do we need to know?

    NARCIS (Netherlands)

    Vissers, L.; Kint, S. van der; Schagen, I.N.L.G. van & Hagenzieker, M.P.

    2017-01-01

    Automated vehicles are gradually entering our roadway system. Before our roads will be solely used by fully automated vehicles, a long transition period is to be expected in which fully automated vehicles, partly automated vehicles and manually-driven vehicles have to share our roads. The current

  9. LV challenge LKEB contribution : fully automated myocardial contour detection

    NARCIS (Netherlands)

    Wijnhout, J.S.; Hendriksen, D.; Assen, H.C. van; Geest, R.J. van der

    2009-01-01

    In this paper a contour detection method is described and evaluated on the evaluation data sets of the Cardiac MR Left Ventricle Segmentation Challenge as part of MICCAI 2009's 3D Segmentation Challenge for Clinical Applications. The proposed method, using 2D AAM and 3D ASM, performs a fully

  10. Gravity-driven pH adjustment for site-specific protein pKa measurement by solution-state NMR

    Science.gov (United States)

    Li, Wei

    2017-12-01

    To automate pH adjustment in site-specific protein pKa measurement by solution-state NMR, I present a funnel with two caps for the standard 5 mm NMR tube. The novelty of this simple-to-build and inexpensive apparatus is that it allows automatic gravity-driven pH adjustment within the magnet, and consequently results in a fully automated NMR-monitored pH titration without any hardware modification on the NMR spectrometer.
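
    Once such a titration has produced chemical shifts as a function of pH, the site-specific pKa is typically obtained by fitting a single-site Henderson-Hasselbalch curve to each resonance; the sketch below shows that standard fit as a generic illustration, independent of the apparatus described.

      import numpy as np
      from scipy.optimize import curve_fit

      def hh_shift(pH, delta_acid, delta_base, pKa):
          """Single-site Henderson-Hasselbalch model for an NMR chemical shift."""
          frac_base = 1.0 / (1.0 + 10.0 ** (pKa - pH))
          return delta_acid + (delta_base - delta_acid) * frac_base

      def fit_pka(pH_values, shifts_ppm):
          # Initial guesses: end-point shifts and the midpoint of the pH range.
          p0 = [shifts_ppm[0], shifts_ppm[-1], float(np.median(pH_values))]
          popt, _ = curve_fit(hh_shift, np.asarray(pH_values),
                              np.asarray(shifts_ppm), p0=p0)
          return popt  # (delta_acid, delta_base, pKa)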

  11. The Buccaneer software for automated model building. 1. Tracing protein chains.

    Science.gov (United States)

    Cowtan, Kevin

    2006-09-01

    A new technique for the automated tracing of protein chains in experimental electron-density maps is described. The technique relies on the repeated application of an oriented electron-density likelihood target function to identify likely C(alpha) positions. This function is applied both to locate a few promising 'seed' positions in the map and to grow those initial C(alpha) positions into extended chain fragments. Techniques for assembling the chain fragments into an initial chain trace are discussed.

  12. A novel approach to sequence validating protein expression clones with automated decision making

    Directory of Open Access Journals (Sweden)

    Mohr Stephanie E

    2007-06-01

    Background: Whereas the molecular assembly of protein expression clones is readily automated and routinely accomplished in high throughput, sequence verification of these clones is still largely performed manually, an arduous and time-consuming process. The ultimate goal of validation is to determine if a given plasmid clone matches its reference sequence sufficiently to be "acceptable" for use in protein expression experiments. Given the accelerating increase in availability of tens of thousands of unverified clones, there is a strong demand for rapid, efficient and accurate software that automates clone validation. Results: We have developed an Automated Clone Evaluation (ACE) system – the first comprehensive, multi-platform, web-based plasmid sequence verification software package. ACE automates the clone verification process by defining each clone sequence as a list of multidimensional discrepancy objects, each describing a difference between the clone and its expected sequence including the resulting polypeptide consequences. To evaluate clones automatically, this list can be compared against user acceptance criteria that specify the allowable number of discrepancies of each type. This strategy allows users to re-evaluate the same set of clones against different acceptance criteria as needed for use in other experiments. ACE manages the entire sequence validation process including contig management, identifying and annotating discrepancies, determining if discrepancies correspond to polymorphisms and clone finishing. Designed to manage thousands of clones simultaneously, ACE maintains a relational database to store information about clones at various completion stages, project processing parameters and acceptance criteria. In a direct comparison, the automated analysis by ACE took less time and was more accurate than a manual analysis of a 93 gene clone set. Conclusion: ACE was designed to facilitate high throughput clone sequence
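
    The acceptance logic described above, under which a clone passes if its list of discrepancy objects stays within user-defined limits for each discrepancy type, can be pictured with a short sketch; the field names, discrepancy types and thresholds are hypothetical and do not reflect ACE's actual schema.

      # Hypothetical discrepancy records and acceptance criteria, for illustration.
      discrepancies = [
          {"type": "silent", "position": 123},
          {"type": "missense", "position": 310},
          {"type": "silent", "position": 552},
      ]

      acceptance_criteria = {   # maximum allowed count per discrepancy type
          "silent": 5,
          "missense": 0,
          "nonsense": 0,
          "frameshift": 0,
      }

      def clone_is_acceptable(discrepancies, criteria):
          counts = {}
          for d in discrepancies:
              counts[d["type"]] = counts.get(d["type"], 0) + 1
          # Types absent from the criteria are treated as unconstrained in this sketch.
          return all(counts.get(t, 0) <= limit for t, limit in criteria.items())

      print(clone_is_acceptable(discrepancies, acceptance_criteria))  # False: one missense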

  13. Plant Protein Annotation in the UniProt Knowledgebase

    Science.gov (United States)

    Schneider, Michel; Bairoch, Amos; Wu, Cathy H.; Apweiler, Rolf

    2005-01-01

    The Swiss-Prot, TrEMBL, Protein Information Resource (PIR), and DNA Data Bank of Japan (DDBJ) protein database activities have united to form the Universal Protein Resource (UniProt) Consortium. UniProt presents three database layers: the UniProt Archive, the UniProt Knowledgebase (UniProtKB), and the UniProt Reference Clusters. The UniProtKB consists of two sections: UniProtKB/Swiss-Prot (fully manually curated entries) and UniProtKB/TrEMBL (automated annotation, classification and extensive cross-references). New releases are published fortnightly. A specific Plant Proteome Annotation Program (http://www.expasy.org/sprot/ppap/) was initiated to cope with the increasing amount of data produced by the complete sequencing of plant genomes. Through UniProt, our aim is to provide the scientific community with a single, centralized, authoritative resource for protein sequences and functional information that will allow the plant community to fully explore and utilize the wealth of information available for both plant and nonplant model organisms. PMID:15888679

  14. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3–/4– Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.; Simonov, Alexandr N.; Mashkina, Elena A.; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E.; Gavaghan, David J.; Bond, Alan M.

    2013-01-01

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered

  15. A fully automated temperature-dependent resistance measurement setup using van der Pauw method

    Science.gov (United States)

    Pandey, Shivendra Kumar; Manivannan, Anbarasu

    2018-03-01

    The van der Pauw (VDP) method is widely used to identify the resistance of planar homogeneous samples with four contacts placed on their periphery. We have developed a fully automated thin film resistance measurement setup using the VDP method with the capability of precisely measuring a wide range of thin film resistances, from a few mΩ up to 10 GΩ, under controlled temperatures from room temperature up to 600 °C. The setup utilizes a robust, custom-designed switching network board (SNB) for measuring current-voltage characteristics automatically in four different source-measure configurations based on the VDP method. Moreover, the SNB is connected with low-noise shielded coaxial cables that reduce the effects of leakage current and capacitance in the circuit, thereby enhancing the accuracy of measurement. In order to enable precise and accurate resistance measurement of the sample, a wide range of sourcing currents/voltages is pre-determined, with auto-tuning capability for ~12 orders of magnitude of variation in the resistances. Furthermore, the setup has been calibrated with standard samples and employed to investigate temperature-dependent resistance (a few Ω to 10 GΩ) for various chalcogenide-based phase-change thin films (Ge2Sb2Te5, Ag5In5Sb60Te30, and In3SbTe2). This setup is highly useful for measuring the temperature-dependent resistance of a wide range of materials, i.e., metals, semiconductors, and insulators, revealing structural changes with temperature as reflected in the change of resistance, which is useful for numerous applications.
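
    The quantity ultimately extracted from the four source-measure configurations is the sheet resistance given by the van der Pauw relation exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1, which must be solved numerically from the two measured four-point resistances; the sketch below is a generic solver, not the authors' instrument software.

      import math
      from scipy.optimize import brentq

      def sheet_resistance(R_A, R_B):
          """Solve the van der Pauw equation
             exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1
          for the sheet resistance R_s (same units as R_A and R_B)."""
          def f(R_s):
              return (math.exp(-math.pi * R_A / R_s) +
                      math.exp(-math.pi * R_B / R_s) - 1.0)
          upper = math.pi * (R_A + R_B)        # generous bracket for the root
          return brentq(f, 1e-9, upper)

      # For a uniform isotropic film R_A == R_B and R_s = pi*R/ln(2),
      # which the solver reproduces: sheet_resistance(1.0, 1.0) ≈ 4.532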

  16. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Directory of Open Access Journals (Sweden)

    Yu Huang

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.

  17. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Science.gov (United States)

    Huang, Yu; Parra, Lucas C

    2015-01-01

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.

  18. A fully automated and scalable timing probe-based method for time alignment of the LabPET II scanners

    Science.gov (United States)

    Samson, Arnaud; Thibaudeau, Christian; Bouchard, Jonathan; Gaudin, Émilie; Paulin, Caroline; Lecomte, Roger; Fontaine, Réjean

    2018-05-01

    A fully automated time alignment method based on a positron timing probe was developed to correct the channel-to-channel coincidence time dispersion of the LabPET II avalanche photodiode-based positron emission tomography (PET) scanners. The timing probe was designed to directly detect positrons and generate an absolute time reference. The probe-to-channel coincidences are recorded and processed using firmware embedded in the scanner hardware to compute the time differences between detector channels. The time corrections are then applied in real-time to each event in every channel during PET data acquisition to align all coincidence time spectra, thus enhancing the scanner time resolution. When applied to the mouse version of the LabPET II scanner, the calibration of 6 144 channels was performed in less than 15 min and showed a 47% improvement on the overall time resolution of the scanner, decreasing from 7 ns to 3.7 ns full width at half maximum (FWHM).
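
    Conceptually, the correction amounts to estimating each channel's mean time offset against the probe's absolute reference and subtracting it from subsequent events; the sketch below illustrates this idea and is not the embedded firmware described in the article.

      from collections import defaultdict

      def compute_time_offsets(probe_coincidences):
          """probe_coincidences: iterable of (channel_id, t_channel - t_probe) in ns.
          Returns the mean offset per channel, used as the alignment correction."""
          sums, counts = defaultdict(float), defaultdict(int)
          for channel, dt in probe_coincidences:
              sums[channel] += dt
              counts[channel] += 1
          return {ch: sums[ch] / counts[ch] for ch in sums}

      def apply_correction(events, offsets):
          """events: iterable of (channel_id, timestamp_ns).
          Subtract each channel's offset to align all coincidence time spectra."""
          return [(ch, t - offsets.get(ch, 0.0)) for ch, t in events]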

  19. A new TLD badge with machine readable ID for fully automated readout

    International Nuclear Information System (INIS)

    Kannan, S. Ratna P.; Kulkarni, M.S.

    2003-01-01

    The TLD badge currently being used for personnel monitoring of more than 40,000 radiation workers has a few drawbacks, such as the lack of an on-badge machine-readable ID code, delicate two-point clamping of dosimeters on an aluminium card with the chance of dosimeters falling off during handling or readout, and projections on one side that make automation of readout difficult. A new badge has been designed with an 8-digit identification code in the form of an array of holes and with smooth exteriors to enable full automation of readout. The new badge also permits changing of dosimeters when necessary. The new design does not affect the readout time or the dosimetric characteristics. The salient features and the dosimetric characteristics are discussed. (author)

  20. Fully convergent chemical synthesis of ester insulin: determination of the high resolution X-ray structure by racemic protein crystallography.

    Science.gov (United States)

    Avital-Shmilovici, Michal; Mandal, Kalyaneswar; Gates, Zachary P; Phillips, Nelson B; Weiss, Michael A; Kent, Stephen B H

    2013-02-27

    Efficient total synthesis of insulin is important to enable the application of medicinal chemistry to the optimization of the properties of this important protein molecule. Recently we described "ester insulin"--a novel form of insulin in which the function of the 35 residue C-peptide of proinsulin is replaced by a single covalent bond--as a key intermediate for the efficient total synthesis of insulin. Here we describe a fully convergent synthetic route to the ester insulin molecule from three unprotected peptide segments of approximately equal size. The synthetic ester insulin polypeptide chain folded much more rapidly than proinsulin, and at physiological pH. Both the D-protein and L-protein enantiomers of monomeric DKP ester insulin (i.e., [Asp(B10), Lys(B28), Pro(B29)]ester insulin) were prepared by total chemical synthesis. The atomic structure of the synthetic ester insulin molecule was determined by racemic protein X-ray crystallography to a resolution of 1.6 Å. Diffraction quality crystals were readily obtained from the racemic mixture of {D-DKP ester insulin + L-DKP ester insulin}, whereas crystals were not obtained from the L-ester insulin alone even after extensive trials. Both the D-protein and L-protein enantiomers of monomeric DKP ester insulin were assayed for receptor binding and in diabetic rats, before and after conversion by saponification to the corresponding DKP insulin enantiomers. L-DKP ester insulin bound weakly to the insulin receptor, while synthetic L-DKP insulin derived from the L-DKP ester insulin intermediate was fully active in binding to the insulin receptor. The D- and L-DKP ester insulins and D-DKP insulin were inactive in lowering blood glucose in diabetic rats, while synthetic L-DKP insulin was fully active in this biological assay. The structural basis of the lack of biological activity of ester insulin is discussed.

  1. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    Energy Technology Data Exchange (ETDEWEB)

    Della Gala, Giuseppe [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Lanconelli, Nico [Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Petit, Steven F. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Massachusetts General Hospital - Harvard Medical School, Department of Radiation Oncology, Boston, MA (United States)

    2017-05-15

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated, clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to manually generated VMAT plans created in the absence of time pressure. For 16 patients with reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organ-at-risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significantly improved PTV coverage (V95% increased by 1.1% ± 1.1%), higher dose conformity (R50 reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, and 3.6 Gy ± 2.8 Gy, respectively; all p < 0.001). To render the six remaining autoVMAT plans clinically acceptable, a dosimetrist needed less than 10 min of hands-on time for fine-tuning. AutoVMAT plans were also considered equivalent or better than manually optimized VMAT plans. For 6/16 patients, autoVMAT allowed tumor dose escalation of 5-10 Gy. Clinically deliverable, high-quality autoVMAT plans can be generated fully automatically for the vast majority of advanced-stage NSCLC patients. For a subset of patients, autoVMAT allowed for tumor dose escalation. (orig.)

  2. Automated main-chain model building by template matching and iterative fragment extension.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and beta-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and beta-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more C(alpha) positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 A. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
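
    The FFT-based template-matching step described above can be illustrated with a minimal sketch: cross-correlating a template against a (toy) one-dimensional density trace via the convolution theorem. The array sizes, the Gaussian "motif" and the noise model are illustrative assumptions, not taken from RESOLVE.

    ```python
    # Hedged sketch: FFT-based template matching on a toy 1-D density profile.
    # The idea mirrors the helix/strand search: score every placement of the
    # template against the map at once using the convolution theorem.
    import numpy as np

    rng = np.random.default_rng(0)
    density = rng.normal(0.0, 0.1, 512)                         # stand-in density trace
    template = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)   # idealized motif
    density[200:221] += template                                # bury one copy in noise

    # Zero-mean both signals so the score is not dominated by constant offsets.
    t = template - template.mean()
    d = density - density.mean()

    # Cross-correlation via FFT: corr = IFFT( FFT(d) * conj(FFT(t)) )
    n = len(d) + len(t) - 1
    corr = np.fft.irfft(np.fft.rfft(d, n) * np.conj(np.fft.rfft(t, n)), n)

    best = int(np.argmax(corr))                 # index of the best-matching placement
    print("best match starts near index", best) # expect ~200 for this toy example
    ```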

  3. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    Science.gov (United States)

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves fully automated sample preparation, including enzyme treatment, addition of internal standards and solid-phase extraction, followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization, with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and the occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54%, and the lower limit of identification from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often present in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse was seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
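
    The recovery, matrix-effect and process-efficiency figures quoted above are conventionally related by the Matuszewski-style peak-area ratios. The sketch below only illustrates that arithmetic; the peak areas are invented and do not come from the paper.

    ```python
    # Hedged sketch of the usual recovery / matrix-effect / process-efficiency
    # arithmetic in LC-MS-MS method validation (peak areas are invented).
    neat_std_area = 1.00e6          # A: analyte spiked into neat solvent
    post_extraction_area = 0.65e6   # B: analyte spiked into blank extract after SPE
    pre_extraction_area = 0.55e6    # C: analyte spiked into urine before SPE

    matrix_effect = 100.0 * post_extraction_area / neat_std_area      # ME (%)
    recovery = 100.0 * pre_extraction_area / post_extraction_area     # extraction recovery (%)
    process_efficiency = 100.0 * pre_extraction_area / neat_std_area  # overall PE (%)

    print(f"matrix effect       {matrix_effect:5.1f} %")
    print(f"extraction recovery {recovery:5.1f} %")
    print(f"process efficiency  {process_efficiency:5.1f} %")
    ```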

  4. Fully Automated Robust System to Detect Retinal Edema, Central Serous Chorioretinopathy, and Age Related Macular Degeneration from Optical Coherence Tomography Images

    Directory of Open Access Journals (Sweden)

    Samina Khalid

    2017-01-01

    Full Text Available Maculopathy is excessive damage to the macula that leads to blindness. It mostly occurs due to retinal edema (RE), central serous chorioretinopathy (CSCR), or age-related macular degeneration (ARMD). Optical coherence tomography (OCT) imaging is the latest eye-testing technique and can detect these syndromes at early stages. Many researchers have used OCT images to detect retinal abnormalities. However, to the best of our knowledge, no research presenting a fully automated system to detect all of these macular syndromes has been reported. This paper presents the first decision support system to automatically detect RE, CSCR, and ARMD retinal pathologies, as well as healthy retina, from OCT images. The automated disease diagnosis in our proposed system is based on a multilayered support vector machine (SVM) classifier trained on 40 labeled OCT scans (10 healthy, 10 RE, 10 CSCR, and 10 ARMD). After training, the SVM forms an accurate decision about the type of retinal pathology using 9 extracted features. We have tested our proposed system on 2819 OCT scans (1437 healthy, 640 RE, and 742 CSCR) of 502 patients from two different datasets, and it correctly diagnosed 2817/2819 subjects with accuracy, sensitivity, and specificity of 99.92%, 100%, and 99.86%, respectively.
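
    As a minimal sketch of how such a multiclass SVM decision could be organized, the example below trains a scikit-learn classifier on synthetic 9-dimensional feature vectors. The features, class separation and data sizes are assumptions for illustration only, not the authors' actual features or pipeline.

    ```python
    # Hedged sketch: a multiclass SVM over 9-dimensional feature vectors, in the
    # spirit of the OCT classifier described above (data are synthetic).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    classes = ["healthy", "RE", "CSCR", "ARMD"]
    # 10 labelled "scans" per class, each summarized by 9 made-up features.
    X = np.vstack([rng.normal(loc=i, scale=1.0, size=(10, 9)) for i in range(4)])
    y = np.repeat(classes, 10)

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    model.fit(X, y)

    new_scan_features = rng.normal(loc=2, scale=1.0, size=(1, 9))
    print("predicted pathology:", model.predict(new_scan_features)[0])
    ```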

  5. Rapid detection of enterovirus in cerebrospinal fluid by a fully-automated PCR assay is associated with improved management of aseptic meningitis in adult patients.

    Science.gov (United States)

    Giulieri, Stefano G; Chapuis-Taillard, Caroline; Manuel, Oriol; Hugli, Olivier; Pinget, Christophe; Wasserfallen, Jean-Blaise; Sahli, Roland; Jaton, Katia; Marchetti, Oscar; Meylan, Pascal

    2015-01-01

    Enterovirus (EV) is the most frequent cause of aseptic meningitis (AM). Lack of microbiological documentation results in unnecessary antimicrobial therapy and hospitalization. To assess the impact of rapid EV detection in cerebrospinal fluid (CSF) by a fully-automated PCR (GeneXpert EV assay, GXEA) on the management of AM. Observational study in adult patients with AM. Three groups were analyzed according to EV documentation in CSF: group A = no PCR or negative PCR (n=17), group B = positive real-time PCR (n = 20), and group C = positive GXEA (n = 22). Clinical, laboratory and health-care costs data were compared. Clinical characteristics were similar in the 3 groups. Median turn-around time of EV PCR decreased from 60 h (IQR (interquartile range) 44-87) in group B to 5h (IQR 4-11) in group C (p<0.0001). Median duration of antibiotics was 1 (IQR 0-6), 1 (0-1.9), and 0.5 days (single dose) in groups A, B, and C, respectively (p < 0.001). Median length of hospitalization was 4 days (2.5-7.5), 2 (1-3.7), and 0.5 (0.3-0.7), respectively (p < 0.001). Median hospitalization costs were $5458 (2676-6274) in group A, $2796 (2062-5726) in group B, and $921 (765-1230) in group C (p < 0.0001). Rapid EV detection in CSF by a fully-automated PCR improves management of AM by significantly reducing antibiotic use, hospitalization length and costs. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.
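
    One way to picture the programmable separation described above is as an ordered list of dispensing steps that a pipetting robot replays per column. The sketch below is a generic illustration under assumed reagents, volumes and drain times; it is not the COLUMNSPIDER control software.

    ```python
    # Hedged sketch: an open-column separation expressed as replayable dispensing
    # steps for a pipetting robot (reagents, volumes and wait times are invented).
    from dataclasses import dataclass

    @dataclass
    class Step:
        reagent: str
        volume_ml: float
        wait_min: float      # time to let the column drain before the next step

    sr_separation = [
        Step("acid (condition resin)", 2.0, 5.0),
        Step("sample solution",        1.0, 10.0),
        Step("acid (wash matrix)",     4.0, 15.0),
        Step("acid (elute Sr)",        3.0, 12.0),
    ]

    def run(column_id: int, protocol):
        for i, step in enumerate(protocol, 1):
            # a real controller would move the pipetting head here
            print(f"column {column_id}: step {i}: dispense "
                  f"{step.volume_ml:.1f} mL of {step.reagent}, wait {step.wait_min:.0f} min")

    for col in range(1, 4):     # several columns can be processed in parallel
        run(col, sr_separation)
    ```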

  7. Automation on an Open-Access Platform of Alzheimer's Disease Biomarker Immunoassays.

    Science.gov (United States)

    Gille, Benjamin; Dedeene, Lieselot; Stoops, Erik; Demeyer, Leentje; Francois, Cindy; Lefever, Stefanie; De Schaepdryver, Maxim; Brix, Britta; Vandenberghe, Rik; Tournoy, Jos; Vanderstichele, Hugo; Poesen, Koen

    2018-04-01

    The lack of (inter-)laboratory standardization has hampered the application of universal cutoff values for Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers and their transfer to general clinical practice. The automation of the AD biomarker immunoassays is suggested to generate more robust results than using manual testing. Open-access platforms will facilitate the integration of automation for novel biomarkers, allowing the introduction of the protein profiling concept. A feasibility study was performed on an automated open-access platform of the commercial immunoassays for the 42-amino-acid isoform of amyloid-β (Aβ1-42), Aβ1-40, and total tau in CSF. Automated Aβ1-42, Aβ1-40, and tau immunoassays were performed within predefined acceptance criteria for bias and imprecision. Similar accuracy was obtained for ready-to-use calibrators as for reconstituted lyophilized kit calibrators. When compared with the addition of a standard curve in each test run, the use of a master calibrator curve, determined before and applied to each batch analysis as the standard curve, yielded an acceptable overall bias of -2.6% and -0.9% for Aβ1-42 and Aβ1-40, respectively, with an imprecision profile of 6.2% and 8.4%, respectively. Our findings show that transfer of commercial manual immunoassays to fully automated open-access platforms is feasible, as it performs according to universal acceptance criteria.
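
    The bias and imprecision bookkeeping behind such acceptance criteria is simple percentage arithmetic; a minimal sketch follows. The replicate concentrations, assigned target value and the 10% criteria are assumptions for illustration, not values from the study.

    ```python
    # Hedged sketch of bias / imprecision checks against acceptance criteria
    # (replicate concentrations and limits are invented).
    import statistics

    target = 800.0                                    # assumed assigned value, pg/mL
    replicates = [778.0, 812.0, 790.0, 805.0, 769.0, 821.0]

    mean = statistics.mean(replicates)
    bias_pct = 100.0 * (mean - target) / target            # accuracy vs. assigned value
    cv_pct = 100.0 * statistics.stdev(replicates) / mean    # imprecision (CV)

    print(f"bias:        {bias_pct:+.1f} %")
    print(f"imprecision: {cv_pct:.1f} % CV")
    print("within criteria" if abs(bias_pct) <= 10 and cv_pct <= 10 else "out of criteria")
    ```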

  8. Validation of commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    DEFF Research Database (Denmark)

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold

    2014-01-01

    BACKGROUND: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. OBJECTIVE......: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). METHODS: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared...... with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. RESULTS: Total imprecision was

  9. Fully automated laboratory for the assay of plutonium in wastes and recoverable scraps

    International Nuclear Information System (INIS)

    Guiberteau, P.; Michaut, F.; Bergey, C.; Debruyne, T.

    1990-01-01

    To determine the plutonium content of wastes and recoverable scraps in intermediate-size containers (ten liters), an automated laboratory has been developed. Two passive methods of measurement are used. Gamma-ray spectrometry allows plutonium isotopic analysis, americium determination, and plutonium assay in wastes and poor scraps. Calorimetry is used for accurate (± 3%) plutonium determination in rich scraps. Full automation was realized with barcode management and a supply robot to feed the eight assay set-ups. The laboratory operates 24 hours per day, 365 days per year, and has a capacity of 8,000 assays per year

  10. Automated selected reaction monitoring software for accurate label-free protein quantification.

    Science.gov (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  11. Automating tasks in protein structure determination with the clipper python module.

    Science.gov (United States)

    McNicholas, Stuart; Croll, Tristan; Burnley, Tom; Palmer, Colin M; Hoh, Soon Wen; Jenkins, Huw T; Dodson, Eleanor; Cowtan, Kevin; Agirre, Jon

    2018-01-01

    Scripting programming languages provide the fastest means of prototyping complex functionality. Those with a syntax and grammar resembling human language also greatly enhance the maintainability of the produced source code. Furthermore, the combination of a powerful, machine-independent scripting language with binary libraries tailored for each computer architecture allows programs to break free from the tight boundaries of efficiency traditionally associated with scripts. In the present work, we describe how an efficient C++ crystallographic library such as Clipper can be wrapped, adapted and generalized for use in both crystallographic and electron cryo-microscopy applications, scripted with the Python language. We shall also place an emphasis on best practices in automation, illustrating how this can be achieved with this new Python module. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  12. Human-automation collaboration in manufacturing: identifying key implementation factors

    OpenAIRE

    Charalambous, George; Fletcher, Sarah; Webb, Philip

    2013-01-01

    Human-automation collaboration refers to the concept of human operators and intelligent automation working together interactively within the same workspace without conventional physical separation. This concept has commanded significant attention in manufacturing because of the potential applications, such as the installation of large sub-assemblies. However, the key human factors relevant to human-automation collaboration have not yet been fully investigated. To maximise effective implement...

  13. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  14. The MMP inhibitor (R)-2-(N-benzyl-4-(2-[18F]fluoroethoxy)phenylsulphonamido) -N-hydroxy-3-methylbutanamide: Improved precursor synthesis and fully automated radiosynthesis

    International Nuclear Information System (INIS)

    Wagner, Stefan; Faust, Andreas; Breyholz, Hans-Joerg; Schober, Otmar; Schaefers, Michael; Kopka, Klaus

    2011-01-01

    Summary: The CGS 25966 derivative (R)-2-(N-benzyl-4-(2-[18F]fluoroethoxy)phenylsulphonamido)-N-hydroxy-3-methylbutanamide [18F]9 represents a very potent radiolabelled matrix metalloproteinase inhibitor. For first human PET studies it is mandatory to have a fully automated radiosynthesis and a straightforward precursor synthesis available. The realisation of both requirements is reported herein. In particular, the corresponding precursor 8 was obtained in a reliable 7-step synthesis with an overall chemical yield of 2.3%. Furthermore, the target compound [18F]9 was prepared with a radiochemical yield of 14.8 ± 3.9% (not corrected for decay).

  15. PINE-SPARKY.2 for automated NMR-based protein structure research.

    Science.gov (United States)

    Lee, Woonghee; Markley, John L

    2018-05-01

    Nuclear magnetic resonance (NMR) spectroscopy, along with X-ray crystallography and cryoelectron microscopy, is one of the three major tools that enable the determination of atomic-level structural models of biological macromolecules. Of these, NMR has the unique ability to follow important processes in solution, including conformational changes, internal dynamics and protein-ligand interactions. As a means for facilitating the handling and analysis of spectra involved in these types of NMR studies, we have developed PINE-SPARKY.2, a software package that integrates and automates discrete tasks that previously required interaction with separate software packages. The graphical user interface of PINE-SPARKY.2 simplifies chemical shift assignment and verification, automated detection of secondary structural elements, predictions of flexibility and hydrophobic cores, and calculation of three-dimensional structural models. PINE-SPARKY.2 is available in the latest version of NMRFAM-SPARKY from the National Magnetic Resonance Facility at Madison (http://pine.nmrfam.wisc.edu/download_packages.html), the NMRbox Project (https://nmrbox.org) and to subscribers to the SBGrid (https://sbgrid.org). For a detailed description of the program, see http://www.nmrfam.wisc.edu/pine-sparky2.htm. whlee@nmrfam.wisc.edu or markley@nmrfam.wisc.edu. Supplementary data are available at Bioinformatics online.

  16. Assay of mouse-cell clones for retrovirus p30 protein by use of an automated solid-state radioimmunoassay

    International Nuclear Information System (INIS)

    Kennel, S.J.; Tennant, R.W.

    1979-01-01

    A solid-state radioimmunoassay system has been developed that is useful for automated analysis of samples in microtiter plates. Assays for interspecies and type-specific antigenic determinants of the C-type retrovirus protein, p30, have been used to identify clones of cells producing this protein. This method allows testing of at least 1000 clones a day, making it useful for studies of frequencies of virus protein induction, defective virus production, and formation of recombinant viruses

  17. MIEC-SVM: automated pipeline for protein peptide/ligand interaction prediction.

    Science.gov (United States)

    Li, Nan; Ainsworth, Richard I; Wu, Meixin; Ding, Bo; Wang, Wei

    2016-03-15

    MIEC-SVM is a structure-based method for predicting protein recognition specificity. Here, we present an automated MIEC-SVM pipeline providing an integrated and user-friendly workflow for construction and application of MIEC-SVM models. This pipeline can handle standard amino acids and those with post-translational modifications (PTMs) or small molecules. Moreover, multi-threading and support for the Sun Grid Engine (SGE) are implemented to significantly boost computational efficiency. The program is available at http://wanglab.ucsd.edu/MIEC-SVM. Contact: wei-wang@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Mars - robust automatic backbone assignment of proteins

    International Nuclear Information System (INIS)

    Jung, Young-Sang; Zweckstetter, Markus

    2004-01-01

    MARS, a program for robust automatic backbone assignment of 13C/15N-labeled proteins, is presented. MARS does not require tight thresholds for establishing sequential connectivity or detailed adjustment of these thresholds, and it can work with a wide variety of NMR experiments. Using only 13Cα/13Cβ connectivity information, MARS allows automatic, error-free assignment of 96% of the 370-residue maltose-binding protein. MARS can successfully be used when data are missing for a substantial portion of residues or for proteins with very high chemical shift degeneracy, such as partially or fully unfolded proteins. Other sources of information, such as residue-specific information or known assignments from a homologous protein, can be included in the assignment process. MARS exports its results in SPARKY format. This allows visual validation and integration of automated and manual assignment
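
    The Cα/Cβ connectivity matching that such assignment programs build on can be sketched as a tolerance test between intra- and inter-residue shifts of candidate spin systems. The shift values, tolerance and pseudo-residue names below are invented for illustration; this is not the MARS algorithm itself, only the basic linking idea.

    ```python
    # Hedged sketch: candidate sequential links from CA/CB connectivity, as in
    # HNCACB-type data (shift values are invented).
    TOL = 0.2  # ppm tolerance for matching intra- and inter-residue shifts

    # Each pseudo-residue carries its own CA/CB and the CA/CB observed for the
    # preceding residue (CA_prev / CB_prev).
    spin_systems = {
        "PR1": {"CA": 58.1, "CB": 29.9, "CA_prev": 54.3, "CB_prev": 41.0},
        "PR2": {"CA": 54.3, "CB": 41.0, "CA_prev": 62.5, "CB_prev": 32.2},
        "PR3": {"CA": 62.5, "CB": 32.2, "CA_prev": 56.7, "CB_prev": 30.1},
    }

    def links(systems, tol=TOL):
        """Return candidate (i -> j) pairs meaning 'j directly follows i'."""
        pairs = []
        for i, si in systems.items():
            for j, sj in systems.items():
                if i == j:
                    continue
                if (abs(si["CA"] - sj["CA_prev"]) < tol
                        and abs(si["CB"] - sj["CB_prev"]) < tol):
                    pairs.append((i, j))
        return pairs

    print(links(spin_systems))   # expect [('PR2', 'PR1'), ('PR3', 'PR2')]
    ```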

  19. Automated surveillance of healthcare-associated infections : state of the art

    NARCIS (Netherlands)

    Sips, Meander E; Bonten, Marc J M; van Mourik, Maaike S M

    PURPOSE OF REVIEW: This review describes recent advances in the field of automated surveillance of healthcare-associated infections (HAIs), with a focus on data sources and the development of semiautomated or fully automated algorithms. RECENT FINDINGS: The availability of high-quality data in

  20. AuTom: a novel automatic platform for electron tomography reconstruction

    KAUST Repository

    Han, Renmin

    2017-07-26

    We have developed a software package for automatic electron tomography (ET): Automatic Tomography (AuTom). The presented package has the following characteristics: accurate alignment modules for marker-free datasets containing substantial biological structures; fully automatic alignment modules for datasets with fiducial markers; wide coverage of reconstruction methods, including a new iterative method based on compressed-sensing theory that suppresses the “missing wedge” effect; and multi-platform acceleration solutions that support faster iterative algebraic reconstruction. AuTom aims to achieve fully automatic alignment and reconstruction for electron tomography and has already been successful for a variety of datasets. AuTom also offers a user-friendly interface and auxiliary designs for file and workflow management, in which fiducial-marker-based datasets and marker-free datasets are addressed with different subprocesses. With all of these features, AuTom can serve as a convenient and effective tool for processing in electron tomography.

  1. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  2. Medical ADP Systems: Automated Medical Records Hold Promise to Improve Patient Care

    Science.gov (United States)

    1991-01-01

    automated medical records. The report discusses the potential benefits that automation could bring to the quality of patient care and the factors that impede...information systems, but no organization has fully automated one of the most critical types of information, patient medical records. The patient medical record...its review of automated medical records. GAO's objectives in this study were to identify the (1) benefits of automating patient records and (2) factors

  3. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    International Nuclear Information System (INIS)

    Prieto, Elena; Peñuelas, Iván; Martí-Climent, Josep M; Lecumberri, Pablo; Gómez, Marisol; Pagola, Miguel; Bilbao, Izaskun; Ecay, Margarita

    2012-01-01

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator-dependent and time-consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms that are classical in the fields of optical character recognition, tissue engineering, or non-destructive testing of high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on a clinical PET/CT and on a small-animal PET scanner, with three different signal-to-background ratios. Images were segmented with the 12 automatic thresholding algorithms and the results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools. (paper)
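
    The Ridler (isodata) clustering threshold mentioned above has a compact formulation: iterate the threshold as the midpoint of the two class means until it converges. The toy voxel intensities below are assumptions for illustration, not PET data from the study.

    ```python
    # Hedged sketch of the Ridler / isodata automatic threshold.
    import numpy as np

    def ridler_threshold(values, eps=1e-3, max_iter=100):
        t = values.mean()                      # start from the global mean
        for _ in range(max_iter):
            low = values[values <= t]
            high = values[values > t]
            if low.size == 0 or high.size == 0:
                break
            new_t = 0.5 * (low.mean() + high.mean())
            if abs(new_t - t) < eps:
                return new_t
            t = new_t
        return t

    # Toy "PET" intensities: warm background plus a hot sphere.
    rng = np.random.default_rng(2)
    voxels = np.concatenate([rng.normal(1.0, 0.2, 5000),    # background
                             rng.normal(8.0, 1.0, 500)])     # lesion
    print("automatic threshold:", round(ridler_threshold(voxels), 2))
    ```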

  4. Automated measurement of serum thyroxine with the "ARIA II," as compared with competitive protein binding and radioimmunoassay

    International Nuclear Information System (INIS)

    Reese, M.G.; Johnson, L.V.R.

    1978-01-01

    Two conventional serum thyroxine assays, run in separate laboratories, one by competitive protein binding and one by radioimmunoassay, were used to evaluate the automated ARIA II (Becton Dickinson Immunodiagnostics) serum thyroxine assay. Competitive protein binding compared to the ARIA II with 111 clinical serum samples gave a slope of 1.04 and a correlation coefficient of 0.94. The radioimmunoassay comparison to the ARIA II with 53 clinical serum samples gave a slope of 1.05 and a correlation coefficient of 0.92. The ARIA II intra-assay coefficient of variation for 10 replicates of low, medium, and high thyroxine serum samples was 6.2, 6.0, and 2.9%, respectively, with an inter-assay coefficient of variation among 15 different assays of 15.5, 10.1, and 7.9%. The automated ARIA II, with a 2.2-min cycle per sample, gives results that compare well with those obtained by manual methodology.

  5. Robotic and nuclear safety for an automated/teleoperated glove box system

    International Nuclear Information System (INIS)

    Domning, E.E.; McMahon, T.T.; Sievers, R.H.

    1991-09-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system to handle the processing of special nuclear materials (SNM). This work is performed in response to the new goals at the Department of Energy (DOE) for hazardous waste minimization and radiation dose reduction. This fully automated system, called the automated test bed (ATB), consists of an IBM gantry robot and automated processing equipment sealed within a glove box. While the ATB is a cold system, we are designing it as a prototype of the future hot system. We recognized that identification and application of safety requirements early in the design phase will lead to timely installation and approval of the hot system. This paper identifies these safety issues as well as the general safety requirements necessary for the safe operation of the ATB. 4 refs., 2 figs

  6. Automated Analysis of Protein Expression and Gene Amplification within the Same Cells of Paraffin-Embedded Tumour Tissue

    Directory of Open Access Journals (Sweden)

    Timo Gaiser

    2010-01-01

    Full Text Available Background: The simultaneous detection of protein expression and gene copy number changes in patient samples, such as paraffin-embedded tissue sections, is challenging, since the procedures of immunohistochemistry (IHC) and Fluorescence in situ Hybridization (FISH) negatively influence each other, which often results in suboptimal staining. Therefore, we developed a novel automated algorithm based on relocation which allows subsequent detection of protein content and gene copy number changes within the same cell.

  7. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets (13C- and/or 15N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving the structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  8. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    Science.gov (United States)

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses in H&E stains are counted manually in hot spots. Yet, its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which also may enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm² square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm² (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77% of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance, with an adjusted hazard ratio of 5.5 (95% CI, 1.3-24, P = 0.024) as opposed to 1.3 (95% CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated to their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of only one PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm is required for prognostic purposes. Thus, automated
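
    Automated hot-spot selection of the kind described above amounts to finding the fixed-size window with the highest count of positive cells. The sketch below does this with a sliding-window sum over a synthetic cell map; the slide resolution, cell coordinates and window size are invented assumptions, not the study's implementation.

    ```python
    # Hedged sketch: pick the 1-mm^2 window containing the most positive cells
    # (geometry and counts invented).
    import numpy as np
    from scipy.ndimage import uniform_filter

    PIXELS_PER_MM = 100                       # assumed slide resolution
    win = PIXELS_PER_MM                       # 1 mm x 1 mm window, in pixels

    # Map of detected positive cells over a 5 x 5 mm tumour area.
    rng = np.random.default_rng(3)
    cells = np.zeros((5 * PIXELS_PER_MM, 5 * PIXELS_PER_MM))
    ys, xs = rng.integers(0, 500, 200), rng.integers(0, 500, 200)
    cells[ys, xs] = 1                                            # scattered positives
    cells[300:400, 100:200] += rng.random((100, 100)) < 0.01     # a denser pocket

    # Local count in every 1-mm^2 window = local mean * window area.
    local_counts = uniform_filter(cells, size=win, mode="constant") * win * win
    cy, cx = np.unravel_index(np.argmax(local_counts), local_counts.shape)
    print(f"hot-spot centre at pixel ({cy}, {cx}), "
          f"about {local_counts[cy, cx]:.0f} positive cells in 1 mm^2")
    ```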

  9. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Wu Binbin, E-mail: binbin.wu@gunet.georgetown.edu [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); McNutt, Todd [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Zahurak, Marianna [Department of Oncology Biostatistics, Johns Hopkins University, Baltimore, Maryland (United States); Simari, Patricio [Autodesk Research, Toronto, ON (Canada); Pang, Dalong [Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); Taylor, Russell [Department of Computer Science, Johns Hopkins University, Baltimore, Maryland (United States); Sanguineti, Giuseppe [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States)

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  10. Fully Automated Simultaneous Integrated Boosted–Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-01-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)–driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  11. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan NSP-9900 specular microscope. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as the Center Method. Images with low cell count (input cell number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in the ECD of normal controls <40 yrs old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) as compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size, compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.
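
    The Bland-Altman comparison used in studies like this one reduces to a mean difference (bias) and 95% limits of agreement over paired readings. The paired ECD values below are invented purely to show the arithmetic.

    ```python
    # Hedged sketch: Bland-Altman summary for automated vs. manual ECD readings
    # (paired values are invented).
    import numpy as np

    manual_ecd = np.array([2650., 2480., 2710., 2300., 2905., 2550.])
    auto_ecd   = np.array([2705., 2490., 2850., 2320., 3010., 2600.])

    diff = auto_ecd - manual_ecd
    bias = diff.mean()                         # mean difference (automated - manual)
    loa = 1.96 * diff.std(ddof=1)              # 95% limits of agreement half-width

    print(f"bias: {bias:+.1f} cells/mm^2")
    print(f"95% limits of agreement: {bias - loa:+.1f} to {bias + loa:+.1f} cells/mm^2")
    ```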

  12. Deployment of a Fully-Automated Green Fluorescent Protein Imaging System in a High Arctic Autonomous Greenhouse

    Directory of Open Access Journals (Sweden)

    Alain Berinstain

    2013-03-01

    Full Text Available Higher plants are an integral part of strategies for sustained human presence in space. Space-based greenhouses have the potential to provide closed-loop recycling of oxygen, water and food. Plant monitoring systems with the capacity to remotely observe the condition of crops in real-time within these systems would permit operators to take immediate action to ensure optimum system yield and reliability. One such plant health monitoring technique involves the use of reporter genes driving fluorescent proteins as biological sensors of plant stress. In 2006 an initial prototype green fluorescent protein imager system was deployed at the Arthur Clarke Mars Greenhouse located in the Canadian High Arctic. This prototype demonstrated the advantages of this biosensor technology and underscored the challenges in collecting and managing telemetric data from exigent environments. We present here the design and deployment of a second prototype imaging system deployed within and connected to the infrastructure of the Arthur Clarke Mars Greenhouse. This is the first imager to run autonomously for one year in the un-crewed greenhouse, with command and control conducted through the greenhouse satellite control system. Images were saved locally in high resolution and sent telemetrically in low resolution. Imager hardware is described, including the custom-designed LED growth light and fluorescent excitation light boards, filters, data acquisition and control system, and basic sensing and environmental control. Several critical lessons learned related to the hardware of small plant growth payloads are also elaborated.

  13. Deployment of a Fully-Automated Green Fluorescent Protein Imaging System in a High Arctic Autonomous Greenhouse

    Science.gov (United States)

    Abboud, Talal; Bamsey, Matthew; Paul, Anna-Lisa; Graham, Thomas; Braham, Stephen; Noumeir, Rita; Berinstain, Alain; Ferl, Robert

    2013-01-01

    Higher plants are an integral part of strategies for sustained human presence in space. Space-based greenhouses have the potential to provide closed-loop recycling of oxygen, water and food. Plant monitoring systems with the capacity to remotely observe the condition of crops in real-time within these systems would permit operators to take immediate action to ensure optimum system yield and reliability. One such plant health monitoring technique involves the use of reporter genes driving fluorescent proteins as biological sensors of plant stress. In 2006 an initial prototype green fluorescent protein imager system was deployed at the Arthur Clarke Mars Greenhouse located in the Canadian High Arctic. This prototype demonstrated the advantages of this biosensor technology and underscored the challenges in collecting and managing telemetric data from exigent environments. We present here the design and deployment of a second prototype imaging system deployed within and connected to the infrastructure of the Arthur Clarke Mars Greenhouse. This is the first imager to run autonomously for one year in the un-crewed greenhouse, with command and control conducted through the greenhouse satellite control system. Images were saved locally in high resolution and sent telemetrically in low resolution. Imager hardware is described, including the custom-designed LED growth light and fluorescent excitation light boards, filters, data acquisition and control system, and basic sensing and environmental control. Several critical lessons learned related to the hardware of small plant growth payloads are also elaborated. PMID:23486220

  14. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure...... control, UV absorbance measurements and automated data analysis. As little as 15 l of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis...

  15. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Sievers, R.

    1991-01-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM-developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  16. 1st workshop on situational awareness in semi-Automated vehicles

    NARCIS (Netherlands)

    McCall, R.; Baumann, M.; Politis, I.; Borojeni, S.S.; Alvarez, I.; Mirnig, A.; Meschtscherjakov, A.; Tscheligi, M.; Chuang, L.; Terken, J.M.B.

    2016-01-01

    This workshop will focus on the problem of occupant and vehicle situational awareness with respect to automated vehicles when the driver must take over control. It will explore the future of fully automated and mixed traffic situations where vehicles are assumed to be operating at level 3 or above.

  17. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content.

    Directory of Open Access Journals (Sweden)

    Hongzhi Wang

    Full Text Available One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods, based on the color of the endosperm and embryo of seeds, are slow, manual and prone to error. On the other hand, there exists a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system is comprised of a sampler unit that selects a single kernel and feeds it for measurement of NMR and weight, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed.

  18. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content

    Science.gov (United States)

    Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

    One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods, based on the color of the endosperm and embryo of seeds, are slow, manual and prone to error. On the other hand, there exists a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system is comprised of a sampler unit that selects a single kernel and feeds it for measurement of NMR and weight, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427

  19. Fully automated dual-frequency three-pulse-echo 2DIR spectrometer accessing spectral range from 800 to 4000 wavenumbers

    Energy Technology Data Exchange (ETDEWEB)

    Leger, Joel D.; Nyby, Clara M.; Varner, Clyde; Tang, Jianan; Rubtsova, Natalia I.; Yue, Yuankai; Kireev, Victor V.; Burtsev, Viacheslav D.; Qasim, Layla N.; Rubtsov, Igor V., E-mail: irubtsov@tulane.edu [Department of Chemistry, Tulane University, New Orleans, Louisiana 70118 (United States); Rubtsov, Grigory I. [Institute for Nuclear Research of the Russian Academy of Sciences, Moscow 117312 (Russian Federation)

    2014-08-15

    A novel dual-frequency two-dimensional infrared instrument has been designed and built that permits three-pulse heterodyned echo measurements of any cross-peak within a spectral range from 800 to 4000 cm⁻¹ to be performed in a fully automated fashion. The superior sensitivity of the instrument is achieved by a combination of spectral interferometry, phase cycling, and closed-loop phase stabilization accurate to ∼70 as. An anharmonicity smaller than 10⁻⁴ cm⁻¹ was recorded for strong carbonyl stretching modes using 800 laser shot accumulations. The novel design of the phase stabilization scheme permits tuning the polarizations of the mid-infrared (m-IR) pulses, thus supporting measurements of the angles between vibrational transition dipoles. The automatic frequency tuning is achieved by implementing beam-direction stabilization schemes for each m-IR beam, providing better than 50 μrad beam stability, and a novel scheme for setting the phase-matching geometry of the m-IR beams at the sample. The errors in the cross-peak amplitudes associated with imperfect phase-matching conditions and alignment are found to be at the level of 20%. The instrument can be used by non-specialists in ultrafast spectroscopy.

  20. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  1. A wearable device for a fully automated in-hospital staff and patient identification.

    Science.gov (United States)

    Cavalleri, M; Morstabilini, R; Reni, G

    2004-01-01

    In the health care context, devices for automated staff/patient identification provide multiple benefits, including error reduction in drug administration, easier and faster use of the Electronic Health Record, enhanced security and control features when accessing confidential data, etc. Current identification systems (e.g. smartcards, bar codes) are not completely seamless to users and require mechanical operations that are sometimes difficult to perform for impaired subjects. Emerging wireless RFID technologies are encouraging, but still cannot be introduced in health care environments due to their electromagnetic emissions and the need for a large antenna to operate at reasonable distances. The present work describes a prototype of a wearable device for automated staff and patient identification that is small in size and complies with in-hospital electromagnetic requirements. This prototype also implements an anti-counterfeit option. Its experimental application allowed the introduction of some security functions for confidential data management.

  2. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    International Nuclear Information System (INIS)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M; Noo, F

    2016-01-01

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT_wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range
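
    A minimal sketch of how such a batch driver might chain the two automated modules from a configuration file is shown below. The wrapper commands (simulate_dose, recon_wrapper), the config keys and the file naming are illustrative assumptions, not the authors' actual interfaces to the dose-reduction tool or to FreeCT_wFBP.

```python
# Sketch of a batch driver for the dose-reduction / reconstruction pipeline.
# The module commands ("simulate_dose", "recon_wrapper") and config keys are
# hypothetical placeholders, not the authors' actual interfaces.
import itertools
import json
import subprocess
from pathlib import Path

def run_pipeline(config_path: str) -> None:
    cfg = json.loads(Path(config_path).read_text())
    raw = cfg["raw_projection_file"]          # input: raw helical CT projections
    out_dir = Path(cfg["output_dir"])
    out_dir.mkdir(parents=True, exist_ok=True)

    # One job per (dose level, reconstruction method) combination.
    for dose, method in itertools.product(cfg["dose_levels"], cfg["recon_methods"]):
        reduced = out_dir / f"proj_dose{int(dose * 100)}.raw"
        image = out_dir / f"img_dose{int(dose * 100)}_{method}.nii"

        # Step 1: simulate reduced-dose projections (hypothetical wrapper command).
        subprocess.run(["simulate_dose", raw, str(reduced), str(dose)], check=True)

        # Step 2: reconstruct (FBP or iterative), again via a hypothetical wrapper.
        subprocess.run(["recon_wrapper", "--method", method,
                        str(reduced), str(image)], check=True)

if __name__ == "__main__":
    run_pipeline("pipeline_config.json")
```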

  3. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    Energy Technology Data Exchange (ETDEWEB)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M [UCLA School of Medicine, Los Angeles, CA (United States); Noo, F [University of Utah, Salt Lake City, UT (United States)

    2016-06-15

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT-wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range

  4. Rig automation: where it's been and where it's going

    Energy Technology Data Exchange (ETDEWEB)

    Rinaldi, R.

    1982-06-01

    For over 30 years dreamers, tinkerers and engineers have attempted to automate various drilling functions. Now this effort is paying off, and a partially automated rig is no longer a curiosity. Fully automated and computerized rigs are on the way. For the contractor this means higher productivity, but more maintenance and training responsibilities.

  5. Automated de novo phasing and model building of coiled-coil proteins.

    Science.gov (United States)

    Rämisch, Sebastian; Lizatović, Robert; André, Ingemar

    2015-03-01

    Models generated by de novo structure prediction can be very useful starting points for molecular replacement for systems where suitable structural homologues cannot be readily identified. Protein-protein complexes and de novo-designed proteins are examples of systems that can be challenging to phase. In this study, the potential of de novo models of protein complexes for use as starting points for molecular replacement is investigated. The approach is demonstrated using homomeric coiled-coil proteins, which are excellent model systems for oligomeric systems. Despite the stereotypical fold of coiled coils, initial phase estimation can be difficult and many structures have to be solved with experimental phasing. A method was developed for automatic structure determination of homomeric coiled coils from X-ray diffraction data. In a benchmark set of 24 coiled coils, ranging from dimers to pentamers with resolutions down to 2.5 Å, 22 systems were automatically solved, 11 of which had previously been solved by experimental phasing. The generated models contained 71-103% of the residues present in the deposited structures, had the correct sequence and had free R values that deviated on average by 0.01 from those of the respective reference structures. The electron-density maps were of sufficient quality that only minor manual editing was necessary to produce final structures. The method, named CCsolve, combines methods for de novo structure prediction, initial phase estimation and automated model building into one pipeline. CCsolve is robust against errors in the initial models and can readily be modified to make use of alternative crystallographic software. The results demonstrate the feasibility of de novo phasing of protein-protein complexes, an approach that could also be employed for other small systems beyond coiled coils.

  6. Fully-Automated μMRI Morphometric Phenotyping of the Tc1 Mouse Model of Down Syndrome.

    Directory of Open Access Journals (Sweden)

    Nick M Powell

    Full Text Available We describe a fully automated pipeline for the morphometric phenotyping of mouse brains from μMRI data, and show its application to the Tc1 mouse model of Down syndrome, to identify new morphological phenotypes in the brain of this first transchromosomic animal carrying human chromosome 21. We incorporate an accessible approach for simultaneously scanning multiple ex vivo brains, requiring only a 3D-printed brain holder, and novel image processing steps for their separation and orientation. We employ clinically established multi-atlas techniques (superior to single-atlas methods) together with publicly-available atlas databases for automatic skull-stripping and tissue segmentation, providing high-quality, subject-specific tissue maps. We follow these steps with group-wise registration, structural parcellation and both Voxel- and Tensor-Based Morphometry, advantageous for their ability to highlight morphological differences without the laborious delineation of regions of interest. We show the application of freely available open-source software developed for clinical MRI analysis to mouse brain data: NiftySeg for segmentation and NiftyReg for registration, and discuss atlases and parameters suitable for the preclinical paradigm. We used this pipeline to compare 29 Tc1 brains with 26 wild-type littermate controls, imaged ex vivo at 9.4T. We show an unexpected increase in Tc1 total intracranial volume and, controlling for this, local volume and grey matter density reductions in the Tc1 brain compared to the wild-types, most prominently in the cerebellum, in agreement with human DS and previous histological findings.

  7. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  8. A Framework for Fully Automated Performance Testing for Smart Buildings

    DEFF Research Database (Denmark)

    Markoska, Elena; Johansen, Aslak; Lazarova-Molnar, Sanja

    2018-01-01

    A significant proportion of energy consumption by buildings worldwide, estimated at ca. 40%, has yielded a high importance to studying buildings' performance. Performance testing is a means by which buildings can be continuously commissioned to ensure that they operate as designed. Historically, setup of performance tests has been manual and labor-intensive and has required intimate knowledge of buildings' complexity and systems. The emergence of the concept of smart buildings has provided an opportunity to overcome this restriction. In this paper, we propose a framework for automated performance testing of smart buildings that utilizes metadata models. The approach features automatic detection of applicable performance tests using metadata queries and their corresponding instantiation, as well as continuous commissioning based on metadata. The presented approach has been implemented...
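
    As a toy illustration of metadata-driven test detection, the sketch below matches the point types required by a performance test against a building metadata model held as plain Python sets. The point names, tests and data structure are invented for illustration and do not reflect the framework's actual metadata schema.

```python
# Minimal sketch of metadata-driven detection of applicable performance tests,
# assuming a toy metadata model (a dict of point types per piece of equipment).
BUILDING_METADATA = {
    "AHU-1": {"supply_air_temp_sensor", "supply_air_temp_setpoint", "cooling_valve_cmd"},
    "Room-2.03": {"zone_temp_sensor", "co2_sensor"},
}

PERFORMANCE_TESTS = {
    "supply_air_temp_tracking": {"supply_air_temp_sensor", "supply_air_temp_setpoint"},
    "demand_controlled_ventilation": {"co2_sensor", "ventilation_damper_cmd"},
}

def applicable_tests(metadata, tests):
    """Yield (equipment, test) pairs whose required points all exist."""
    for equipment, points in metadata.items():
        for test, required in tests.items():
            if required <= points:          # set inclusion: all required points present
                yield equipment, test

if __name__ == "__main__":
    for equipment, test in applicable_tests(BUILDING_METADATA, PERFORMANCE_TESTS):
        print(f"{test} can be instantiated on {equipment}")
```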

  9. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling when employed in conjunction with liquid capture followed by nanoelectrospray ionization provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm x 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  10. ROBOCAL: An automated NDA [nondestructive analysis] calorimetry and gamma isotopic system

    International Nuclear Information System (INIS)

    Hurd, J.R.; Powell, W.D.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices

  11. The Stanford Automated Mounter: Enabling High-Throughput Protein Crystal Screening at SSRL

    International Nuclear Information System (INIS)

    Smith, C.A.; Cohen, A.E.

    2009-01-01

    The macromolecular crystallography experiment lends itself perfectly to high-throughput technologies. The initial steps including the expression, purification, and crystallization of protein crystals, along with some of the later steps involving data processing and structure determination have all been automated to the point where some of the last remaining bottlenecks in the process have been crystal mounting, crystal screening, and data collection. At the Stanford Synchrotron Radiation Laboratory, a National User Facility that provides extremely brilliant X-ray photon beams for use in materials science, environmental science, and structural biology research, the incorporation of advanced robotics has enabled crystals to be screened in a true high-throughput fashion, thus dramatically accelerating the final steps. Up to 288 frozen crystals can be mounted by the beamline robot (the Stanford Auto-Mounting System) and screened for diffraction quality in a matter of hours without intervention. The best quality crystals can then be remounted for the collection of complete X-ray diffraction data sets. Furthermore, the entire screening and data collection experiment can be controlled from the experimenter's home laboratory by means of advanced software tools that enable network-based control of the highly automated beamlines.

  12. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images.

    Science.gov (United States)

    Serag, Ahmed; Macnaught, Gillian; Denison, Fiona C; Reynolds, Rebecca M; Semple, Scott I; Boardman, James P

    2017-01-01

    Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved 96% of complete brain localization using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest based regression methods and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.
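
    The sketch below illustrates the two ingredients in simplified form: a 3D extension of HOG (gradient orientations binned over azimuth and elevation) and an exhaustive sliding window that keeps the highest-scoring location. The linear scoring weights w stand in for a trained detector; the window size, stride and bin counts are assumptions, not the authors' settings.

```python
# Simplified 3D-HOG descriptor and sliding-window localization sketch.
import numpy as np

def hog3d(volume, n_az=8, n_el=4):
    gz, gy, gx = np.gradient(volume.astype(float))
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    az = np.arctan2(gy, gx)                        # azimuth in [-pi, pi]
    el = np.arctan2(gz, np.sqrt(gx**2 + gy**2))    # elevation in [-pi/2, pi/2]
    az_bin = np.clip(((az + np.pi) / (2 * np.pi) * n_az).astype(int), 0, n_az - 1)
    el_bin = np.clip(((el + np.pi / 2) / np.pi * n_el).astype(int), 0, n_el - 1)
    hist = np.zeros(n_az * n_el)
    np.add.at(hist, az_bin.ravel() * n_el + el_bin.ravel(), mag.ravel())
    return hist / (np.linalg.norm(hist) + 1e-9)    # L2-normalized descriptor

def localize_brain(volume, w, window=(48, 48, 48), stride=16):
    """Score every window with a linear detector w and return the best corner."""
    best_score, best_corner = -np.inf, None
    wz, wy, wx = window
    for z in range(0, volume.shape[0] - wz + 1, stride):
        for y in range(0, volume.shape[1] - wy + 1, stride):
            for x in range(0, volume.shape[2] - wx + 1, stride):
                score = float(w @ hog3d(volume[z:z+wz, y:y+wy, x:x+wx]))
                if score > best_score:
                    best_score, best_corner = score, (z, y, x)
    return best_corner, best_score
```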

  13. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images

    Directory of Open Access Journals (Sweden)

    Ahmed Serag

    2017-01-01

    Full Text Available Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved 96% of complete brain localization using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest based regression methods and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.

  14. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  15. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10⁹ bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied for manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6 - 8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system called FDRS (for Film Digital Radiography System) is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt

  16. Comparison of two theory-based, fully automated telephone interventions designed to maintain dietary change in healthy adults: study protocol of a three-arm randomized controlled trial.

    Science.gov (United States)

    Wright, Julie A; Quintiliani, Lisa M; Turner-McGrievy, Gabrielle M; Migneault, Jeffrey P; Heeren, Timothy; Friedman, Robert H

    2014-11-10

    Health behavior change interventions have focused on obtaining short-term intervention effects; few studies have evaluated mid-term and long-term outcomes, and even fewer have evaluated interventions that are designed to maintain and enhance initial intervention effects. Moreover, behavior theory has not been developed for maintenance or applied to maintenance intervention design to the degree that it has for behavior change initiation. The objective of this paper is to describe a study that compared two theory-based interventions (social cognitive theory [SCT] vs goal systems theory [GST]) designed to maintain previously achieved improvements in fruit and vegetable (F&V) consumption. The interventions used tailored, interactive conversations delivered by a fully automated telephony system (Telephone-Linked Care [TLC]) over a 6-month period. The TLC maintenance intervention based on SCT used a skills-based approach to build self-efficacy. It assessed confidence in and barriers to eating F&V, provided feedback on how to overcome barriers, plan ahead, and set goals. The TLC maintenance intervention based on GST used a cognitive-based approach. Conversations trained participants in goal management to help them integrate their newly acquired dietary behavior into their hierarchical system of goals. Content included goal facilitation, conflict, shielding, and redundancy, and reflection on personal goals and priorities. To evaluate and compare the two approaches, a sample of adults whose F&V consumption was below public health goal levels was recruited from a large urban area to participate in a fully automated telephony intervention (TLC-EAT) for 3-6 months. Participants who increase their daily intake of F&V by ≥1 serving/day will be eligible for the three-arm randomized controlled trial. A sample of 405 participants will be randomized to one of three arms: (1) an assessment-only control, (2) TLC-SCT, and (3) TLC-GST. The maintenance interventions last 6 months. All 405

  17. A facile and rapid automated synthesis of 3'-deoxy-3'-[18F]fluorothymidine

    International Nuclear Information System (INIS)

    Tang Ganghua; Tang Xiaolan; Wen Fuhua; Wang Mingfang; Li Baoyuan

    2010-01-01

    Aim: To develop a simplified and fully automated synthesis procedure for 3'-deoxy-3'-[18F]fluorothymidine ([18F]FLT) using a PET-MF-2V-IT-I synthesis module. Methods: Synthesis of [18F]FLT was performed using the PET-MF-2V-IT-I synthesis module by a one-pot, two-step reaction procedure, including nucleophilic fluorination of 3-N-t-butoxycarbonyl-1-[5'-O-(4,4'-dimethoxy triphenylmethyl)-2'-deoxy-3'-O-(4-nitrobenzenesulfonyl)-β-D-threopentofuranosyl]thymine (15 mg) as the precursor molecule with [18F]fluoride, and subsequent hydrolysis of the protecting group with 1.0 M HCl in the same reaction vessel and purification with SEP PAK cartridges instead of the HPLC system. Results: The automated synthesis of [18F]FLT with SEP PAK purification gave a corrected radiochemical yield of 23.2±2.6% (n=6, uncorrected yield: 16-22%) and radiochemical purity of >97% within the total synthesis time of 35 min. Conclusion: The fully automated one-pot synthesis procedure with SEP PAK purification can be applied to the fully automated synthesis of [18F]FLT using a commercial [18F]FDG synthesis module.
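
    The relation between the corrected and uncorrected yields quoted above follows from simple decay correction over the synthesis time. The sketch below shows the arithmetic, assuming the standard 18F half-life of 109.77 min; the example yield is illustrative.

```python
# Decay correction of an end-of-synthesis (uncorrected) radiochemical yield
# back to the start of synthesis for an 18F-labelled tracer.
T_HALF_F18 = 109.77          # minutes
SYNTHESIS_TIME = 35.0        # minutes (as reported for this [18F]FLT procedure)

def decay_corrected_yield(uncorrected_yield: float, t_min: float = SYNTHESIS_TIME) -> float:
    """Correct an end-of-synthesis yield back to the start of synthesis."""
    return uncorrected_yield * 2 ** (t_min / T_HALF_F18)

if __name__ == "__main__":
    # An uncorrected yield of ~19% corrected over 35 min gives ~23.7%,
    # consistent with the 23.2 +/- 2.6% corrected yield quoted above.
    print(f"{decay_corrected_yield(0.19):.3f}")
```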

  18. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  19. Automated MRI segmentation for individualized modeling of current flow in the human head.

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool not only segments the brain but also provides accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible.

  20. Validation of the fully automated A&D TM-2656 blood pressure monitor according to the British Hypertension Society Protocol.

    Science.gov (United States)

    Zeng, Wei-Fang; Liu, Ming; Kang, Yuan-Yuan; Li, Yan; Wang, Ji-Guang

    2013-08-01

    The present study aimed to evaluate the accuracy of the fully automated oscillometric upper-arm blood pressure monitor TM-2656 according to the British Hypertension Society (BHS) Protocol 1993. We recruited individuals until there were 85 eligible participants and their blood pressure could meet the blood pressure distribution requirements specified by the BHS Protocol. For each individual, we sequentially measured the systolic and diastolic blood pressures using a mercury sphygmomanometer (two observers) and the TM-2656 device (one supervisor). Data analysis was carried out according to the BHS Protocol. The device achieved grade A. The percentage of blood pressure differences within 5, 10, and 15 mmHg was 62, 85, and 96%, respectively, for systolic blood pressure, and 71, 93, and 99%, respectively, for diastolic blood pressure. The average (±SD) of the device-observer differences was -2.1±7.8 mmHg (P<0.0001) and -1.1±5.8 mmHg (P<0.0001) for systolic and diastolic blood pressures, respectively. The A&D upper-arm blood pressure monitor TM-2656 has passed the requirements of the BHS Protocol, and can thus be recommended for blood pressure measurement.
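
    For readers unfamiliar with the BHS grading, the sketch below computes a grade from device-observer differences using the commonly cited cumulative thresholds (grade A: at least 60/85/95% of readings within 5/10/15 mmHg). The thresholds and the simulated data are assumptions to be checked against the protocol itself, not values taken from this study.

```python
# Sketch of the BHS 1993 grading computation applied to device-observer differences.
import numpy as np

BHS_THRESHOLDS = {"A": (60, 85, 95), "B": (50, 75, 90), "C": (40, 65, 85)}

def bhs_grade(differences_mmhg):
    d = np.abs(np.asarray(differences_mmhg, dtype=float))
    pct = tuple(100.0 * np.mean(d <= limit) for limit in (5, 10, 15))
    for grade, mins in BHS_THRESHOLDS.items():
        if all(p >= m for p, m in zip(pct, mins)):
            return grade, pct
    return "D", pct

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated SBP differences with the mean/SD reported above (-2.1 +/- 7.8 mmHg).
    sbp_diff = rng.normal(-2.1, 7.8, size=255)
    print(bhs_grade(sbp_diff))
```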

  1. Maximizing Your Investment in Building Automation System Technology.

    Science.gov (United States)

    Darnell, Charles

    2001-01-01

    Discusses how organizational issues and system standardization can be important factors that determine an institution's ability to fully exploit contemporary building automation systems (BAS). Further presented is management strategy for maximizing BAS investments. (GR)

  2. Automated tetraploid genotype calling by hierarchical clustering

    Science.gov (United States)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
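
    A minimal sketch of dosage calling for a single marker is given below: the one-dimensional allele-intensity ratio is clustered hierarchically and the clusters, ordered by mean ratio, are mapped to dosages 0-4. This is a generic illustration, not the published method, and omits the model fitting and quality checks a real pipeline needs.

```python
# Hierarchical clustering of allele-intensity ratios into tetraploid dosages (0-4).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def call_dosage(theta, max_dosage=4):
    """theta: array of B/(A+B) intensity ratios, one value per sample."""
    theta = np.asarray(theta, dtype=float).reshape(-1, 1)
    tree = linkage(theta, method="ward")
    labels = fcluster(tree, t=max_dosage + 1, criterion="maxclust")
    # Order clusters by mean ratio so the lowest cluster maps to dosage 0 (AAAA)
    # and the highest to dosage 4 (BBBB).
    labels_u = np.unique(labels)
    means = np.array([theta[labels == k].mean() for k in labels_u])
    ranks = means.argsort().argsort()
    dosage_of_label = dict(zip(labels_u, ranks))
    return np.array([dosage_of_label[l] for l in labels])
```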

  3. An automated tensile machine for small specimens heavily neutron irradiated in FFTF/MOTA

    International Nuclear Information System (INIS)

    Kohyama, Akira; Sato, Shinji; Hamada, Kenichi

    1993-01-01

    The objective of this work is to develop a fully automated tensile machine for post-irradiation examination (PIE) of Fast Flux Test Facility (FFTF)/Materials Open Test Assembly (MOTA) irradiated miniature tension specimens. The anticipated merit of the automated tensile machine is to reduce damage to specimens during specimen handling for PIE and to reduce exposure to radioactive specimens. This machine is designed for testing at elevated temperatures, up to 873 K, in a vacuum or in an inert gas environment. Twelve specimen assemblies are placed in the vacuum chamber that can be tested successively in a fully automated manner. A unique automated tensile machine for the PIE of FFTF/MOTA irradiated specimens, the Monbusho Automated Tensile Machine (MATRON) consists of a test frame with controlling units and an automated specimen-loading apparatus. The qualification of the test frame has been completed, and the results have satisfied the machine specifications. The capabilities of producing creep and relaxation data have been demonstrated for Cu, Al, 316SS, and ferritic steels. The specimen holders for the three-point bending test and the small bulge test (small punch test; SP test) were also designed and produced

  4. Automated ultrasonic inspection using PULSDAT

    International Nuclear Information System (INIS)

    Naybour, P.J.

    1992-01-01

    PULSDAT (Portable Ultrasonic Data Acquisition Tool) is a system for recording the data from single probe automated ultrasonic inspections. It is one of a range of instruments and software developed by Nuclear Electric to carry out a wide variety of high quality ultrasonic inspections. These vary from simple semi-automated inspections through to multi-probe, highly automated ones. PULSDAT runs under the control of MIPS software, and collects data which is compatible with the GUIDE data display system. PULSDAT is therefore fully compatible with Nuclear Electric's multi-probe inspection systems and utilises all the reliability and quality assurance of the software. It is a rugged, portable system that can be used in areas of difficult access. The paper discusses the benefits of automated inspection and gives an outline of the main features of PULSDAT. Since April 1990 PULSDAT has been used in several applications within Nuclear Electric and this paper presents two examples: the first is a ferritic set-through nozzle and the second is an austenitic fillet weld. (Author)

  5. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  6. Fully automated processing of buffy-coat-derived pooled platelet concentrates.

    Science.gov (United States)

    Janetzko, Karin; Klüter, Harald; van Waeg, Geert; Eichler, Hermann

    2004-07-01

    The OrbiSac device, which was developed to automate the manufacture of buffy-coat PLT concentrates (BC-PCs), was evaluated. In-vitro characteristics of BC-PC preparations using the OrbiSac device were compared with manually prepared BC-PCs. For standard processing (Std-PC, n = 20), four BC-PCs were pooled using 300 mL of PLT AS (PAS) followed by soft-spin centrifugation and WBC filtration. The OrbiSac preparation (OS-PC, n = 20) was performed by automated pooling of four BC-PCs with 300 mL PAS followed by centrifugation and inline WBC filtration. All PCs were stored at 22 degrees C. Samples were withdrawn on Day 1, 5, and 7 evaluating PLT count, blood gas analysis, glucose, lactate, LDH, beta-thromboglobulin, hypotonic shock response, and CD62p expression. A PLT content of 3.1 +/- 0.4 x 10^11 (OS-PCs) versus 2.7 +/- 0.5 x 10^11 (Std-PCs, p < 0.05) was found. A CV of 19 percent (Std-PC) versus 14 percent (OS-PC) suggests more standardization in the OS group. At Day 7, the Std-PCs versus OS-PCs showed a glucose consumption of 1.03 +/- 0.32 µmol per 10^9 PLT versus 0.75 +/- 0.25 µmol per 10^9 PLT (p < 0.001), and a lactate production of 1.50 +/- 0.86 µmol per 10^9 versus 1.11 +/- 0.61 µmol per 10^9 (p < 0.001). The pH (7.00 +/- 0.19 vs. 7.23 +/- 0.06; p < 0.001), pO2 (45.3 +/- 18 vs. 31.3 +/- 10.4 mmHg; p < 0.01), and HCO3 levels (4.91 +/- 1.49 vs. 7.14 +/- 0.95 mmol/L; p < 0.001) suggest a slightly better aerobic metabolism within the OS group. Only small differences in CD62p expression were observed (37.3 +/- 12.9% Std-PC vs. 44.8 +/- 6.6% OS-PC; p < 0.05). The OrbiSac device allows an improved PLT yield without affecting PLT in-vitro characteristics and may enable an improved consistency in product volume and yield.

  7. Progress report on a fully automatic Gas Tungsten Arc Welding (GTAW) system development

    Energy Technology Data Exchange (ETDEWEB)

    Daumeyer, G.J. III

    1994-12-01

    A plan to develop a fully automatic gas tungsten arc welding (GTAW) system that will utilize a vision-sensing computer (which will provide in-process feedback control) is presently underway. Evaluations of different technological aspects and system design requirements continue. This report summarizes major activities in the plan's successful progress. The technological feasibility of producing the fully automated GTAW system has been proven. The goal of this process development project is to provide a production-ready system within the shortest reasonable time frame.

  8. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  9. Automated meteorological data from commercial aircraft via satellite: Present experience and future implications

    Science.gov (United States)

    Steinberg, R.

    1978-01-01

    A low-cost communications system to provide meteorological data from commercial aircraft, in near real-time, on a fully automated basis has been developed. The complete system including the low profile antenna and all installation hardware weighs 34 kg. The prototype system was installed on a B-747 aircraft and provided meteorological data (wind angle and velocity, temperature, altitude and position as a function of time) on a fully automated basis. The results were exceptional. This concept is expected to have important implications for operational meteorology and airline route forecasting.

  10. The Set-Up and Implementation of Fully Virtualized Lessons with an Automated Workflow Utilizing VMC/Moodle at the Medical University of Graz

    Directory of Open Access Journals (Sweden)

    Herwig Erich Rehatschek

    2011-12-01

    Full Text Available At the start of the winter semester 2010/11, the Medical University of Graz (MUG) successfully introduced a new primary learning management system (LMS), Moodle. Moodle currently serves more than 4,300 students from three studies and holds more than 7,500 unique learning objects. At the beginning of the summer semester 2010, we decided to start a pilot with Moodle and 430 students. For the pilot we migrated the learning content of one module and two optional subjects to Moodle. The evaluation results were extremely promising: more than 92% of the students immediately wanted Moodle, and Moodle met our high expectations in terms of performance and scalability. Within this paper we describe how we defined and set up a scalable and highly available platform for hosting Moodle and extended it with functionality for fully automated virtual lessons. We report our experiences and give valuable pointers for universities and institutions that want to introduce Moodle in the near future.

  11. Automated optical assembly

    Science.gov (United States)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The increased cost savings inherent in the utilization of optical-grade polymers outweighs almost every advantage of using glass for high volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  12. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  13. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against the ICP, considered the gold standard for automated rigid registration. Furthermore, ... to point distance. T-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed rank test). The performance influence of sampling density, sampling quantity, and norms is analyzed using a similar method.
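
    For reference, the gold-standard baseline mentioned above can be written compactly. The sketch below is a generic point-to-point ICP (closest-point matching followed by an SVD-based rigid fit), not the specific implementation evaluated in the paper; the iteration and tolerance settings are arbitrary.

```python
# Generic point-to-point ICP sketch: match closest points, solve the best rigid
# transform by SVD, repeat until the mean matching error stops improving.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (N x 3 each)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s

def icp(source, target, n_iter=50, tol=1e-6):
    src = source.copy()
    tree = cKDTree(target)
    prev_err = np.inf
    for _ in range(n_iter):
        dist, idx = tree.query(src)                 # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return src, err
```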

  14. FULLY AUTOMATED GIS-BASED INDIVIDUAL TREE CROWN DELINEATION BASED ON CURVATURE VALUES FROM A LIDAR DERIVED CANOPY HEIGHT MODEL IN A CONIFEROUS PLANTATION

    Directory of Open Access Journals (Sweden)

    R. J. L. Argamosa

    2016-06-01

    Full Text Available The generation of a high resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crowns by means of a fully-automated method using the CHM’s curvature through its slope. The local maxima are obtained by taking the maximum raster value in a 3 m x 3 m cell. These values are assumed to be tree tops and therefore considered as individual trees. Based on these assumptions, Thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. The absence of field data regarding tree crown dimensions requires accurate visual assessment after the delineated tree crown polygons were superimposed on the hill-shaded CHM.
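
    The tree-top detection step lends itself to a very short implementation. The sketch below finds local maxima of a gridded CHM with a 3 x 3 cell moving window; the 1 m raster resolution and the minimum height threshold are illustrative assumptions, and the curvature-based crown delineation is not reproduced here.

```python
# Tree-top detection on a canopy height model: a cell is a tree top if it equals
# the maximum of its 3 x 3 neighbourhood (approx. 3 m x 3 m at 1 m resolution).
import numpy as np
from scipy.ndimage import maximum_filter

def detect_tree_tops(chm: np.ndarray, window_cells: int = 3, min_height: float = 2.0):
    local_max = maximum_filter(chm, size=window_cells, mode="nearest")
    tops = (chm == local_max) & (chm >= min_height)   # ignore ground / low vegetation
    rows, cols = np.nonzero(tops)
    return list(zip(rows.tolist(), cols.tolist()))
```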

  15. Automated driving and its effects on the safety ecosystem: How do compatibility issues affect the transition period?

    NARCIS (Netherlands)

    van Loon, R.J.; Martens, Marieke Hendrikje

    2015-01-01

    Different components of automated vehicles are being made available commercially as we speak. Much research has been conducted into these components and many of these have been studied with respect to their effects on safety, but the transition period from non-automated driving to fully automated

  16. Automated driving and its effect on the safety ecosystem: how do compatibility issues affect the transition period?

    NARCIS (Netherlands)

    Loon, R.J. van; Martens, M.H.

    2015-01-01

    Different components of automated vehicles are being made available commercially as we speak. Much research has been conducted into these components and many of these have been studied with respect to their effects on safety, but the transition period from non-automated driving to fully automated

  17. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    International Nuclear Information System (INIS)

    Brown, Aaron W.; Simone, Paul S.; York, J.C.; Emmert, Gary L.

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5% with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the concentration ranges were detectable, 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.
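
    Method detection limits of the kind quoted above are conventionally derived from replicate low-level spikes as the one-tailed 99% Student's t value times the replicate standard deviation. The sketch below shows that calculation with made-up replicate concentrations; it is a generic EPA-style MDL computation, not this instrument's calibration procedure.

```python
# EPA-style method detection limit: MDL = t(n-1, 99%, one-tailed) * s(replicates).
import numpy as np
from scipy import stats

def method_detection_limit(replicates):
    r = np.asarray(replicates, dtype=float)
    t = stats.t.ppf(0.99, df=len(r) - 1)      # one-tailed, 99% confidence
    return t * r.std(ddof=1)

if __name__ == "__main__":
    # Seven hypothetical chloroform replicates near 0.05 ug/L:
    reps = [0.048, 0.052, 0.047, 0.055, 0.051, 0.046, 0.053]
    print(f"MDL = {method_detection_limit(reps):.3f} ug/L")
```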

  18. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Aaron W. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Simone, Paul S. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States); York, J.C. [City of Lebanon, TN Water Treatment Plant, 7 Gilmore Hill Rd., Lebanon, TN 37087 (United States); Emmert, Gary L., E-mail: gemmert@memphis.edu [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States)

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L{sup –1}. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L{sup −1}. Mean percent recoveries ranged from 77.1 to 86.5% with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the concentration ranges were detectable, 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.

  19. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Burks, M.B. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Hoop, R.C.; Hoffman, E.P. [Univ. of Pittsburgh School of Medicine, PA (United States)

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.
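
    One of the steps listed above, resolving closely spaced alleles in the presence of stutter, can be caricatured as subtracting a fixed fraction of each parent peak from the peak one repeat unit shorter and then keeping the two tallest corrected peaks. The stutter ratio, the homozygote rule and the peak data below are simplifying assumptions for illustration, not the paper's algorithm.

```python
# Toy stutter deconvolution and allele calling for a CA-repeat marker.
def call_alleles(peak_heights, stutter_ratio=0.15):
    """peak_heights: dict mapping allele size (bp) -> raw peak height."""
    corrected = {}
    for size, height in peak_heights.items():
        parent = peak_heights.get(size + 2, 0.0)   # CA repeat: stutter sits 2 bp below its parent
        corrected[size] = max(height - stutter_ratio * parent, 0.0)
    alleles = sorted(corrected, key=corrected.get, reverse=True)[:2]
    if corrected[alleles[1]] < 0.3 * corrected[alleles[0]]:   # likely homozygote
        return (alleles[0], alleles[0])
    return tuple(sorted(alleles))

if __name__ == "__main__":
    print(call_alleles({150: 120.0, 152: 480.0, 154: 900.0, 156: 510.0}))
```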

  20. A LabVIEW®-based software for the control of the AUTORAD platform. A fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis

    International Nuclear Information System (INIS)

    Barbesi, Donato; Vilas, Victor Vicente; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Heras, Laura Aldave de las

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors providing a flexible and fit for purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW®VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste. (author)

  1. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    Science.gov (United States)

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors providing a flexible and fit for purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.

  2. Adapting the γ-H2AX assay for automated processing in human lymphocytes. 1. Technological aspects.

    Science.gov (United States)

    Turner, Helen C; Brenner, David J; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y Lawrence; Amundson, Sally A; Garty, Guy

    2011-03-01

    The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes.
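
    Because the γ-H2AX yield is linear with dose, dose reconstruction reduces to fitting and inverting a straight line. The sketch below shows that step with invented calibration numbers; it is not the RABIT analysis code.

```python
# Linear dose-response calibration and dose estimation for a gamma-H2AX assay.
import numpy as np

def fit_calibration(doses_gy, yields):
    slope, intercept = np.polyfit(doses_gy, yields, deg=1)   # linear over 0-8 Gy
    return slope, intercept

def estimate_dose(measured_yield, slope, intercept):
    return (measured_yield - intercept) / slope

if __name__ == "__main__":
    doses = np.array([0, 2, 4, 6, 8], dtype=float)
    fluor = np.array([1.0, 3.1, 5.2, 6.9, 9.1])              # arbitrary fluorescence units
    s, b = fit_calibration(doses, fluor)
    print(f"estimated dose: {estimate_dose(4.0, s, b):.2f} Gy")
```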

  3. Automated spoof-detection for fingerprints using optical coherence tomography

    CSIR Research Space (South Africa)

    Darlow, LN

    2016-05-01

    Full Text Available that they are highly separable, resulting in 100% accuracy regarding spoof-detection, with no false rejections of real fingers. This is the first attempt at fully automated spoof-detection using OCT....

  4. Fully printable, strain-engineered electronic wrap for customizable soft electronics.

    Science.gov (United States)

    Byun, Junghwan; Lee, Byeongmoon; Oh, Eunho; Kim, Hyunjong; Kim, Sangwoo; Lee, Seunghwan; Hong, Yongtaek

    2017-03-24

    Rapid growth of stretchable electronics stimulates broad uses in multidisciplinary fields as well as industrial applications. However, existing technologies are unsuitable for implementing versatile applications involving adaptable system design and functions in a cost/time-effective way because of vacuum-conditioned, lithographically-predefined processes. Here, we present a methodology for a fully printable, strain-engineered electronic wrap as a universal strategy which makes it more feasible to implement various stretchable electronic systems with customizable layouts and functions. The key aspects involve inkjet-printed rigid island (PRI)-based stretchable platform technology and corresponding printing-based automated electronic functionalization methodology, the combination of which provides fully printed, customized layouts of stretchable electronic systems with simplified process. Specifically, well-controlled contact line pinning effect of printed polymer solution enables the formation of PRIs with tunable thickness; and surface strain analysis on those PRIs leads to the optimized stability and device-to-island fill factor of strain-engineered electronic wraps. Moreover, core techniques of image-based automated pinpointing, surface-mountable device based electronic functionalizing, and one-step interconnection networking of PRIs enable customized circuit design and adaptable functionalities. To exhibit the universality of our approach, multiple types of practical applications ranging from self-computable digital logics to display and sensor system are demonstrated on skin in a customized form.

  6. Synthesis of tracers using automated radiochemistry and robotics

    International Nuclear Information System (INIS)

    Dannals, R.F.

    1992-07-01

    Synthesis of high specific activity radiotracers labeled with short-lived positron-emitting radionuclides for positron emission tomography (PET) often requires handling large initial quantities of radioactivity. High specific activities are required when preparing tracers for use in PET studies of neuroreceptors. A fully automated approach for tracer synthesis is highly desirable. This proposal involves the development of a system for the Synthesis of Tracers using Automated Radiochemistry and Robotics (STARR) for this purpose. While the long range objective of the proposed research is the development of a totally automated radiochemistry system for the production of major high specific activity 11C-radiotracers for use in PET, the specific short range objectives are the automation of 11C-methyl iodide (11CH3I) production via an integrated approach using both radiochemistry modular labstations and robotics, and the extension of this automated capability to the production of several radiotracers for PET (initially, 11C-methionine, 3-N-[11C-methyl]spiperone, and [11C]-carfentanil).

  7. Fully automated synthesis of the M1 receptor agonist [11C]GSK1034702 for clinical use on an Eckert and Ziegler Modular Lab system

    Energy Technology Data Exchange (ETDEWEB)

    Huiban, Mickael, E-mail: Mickael.x.huiban@gsk.com [GlaxoSmithKline, Clinical Imaging Centre, Imperial College London, Hammersmith Hospital, Du Cane Road, London, W12 0NN (United Kingdom); Pampols-Maso, Sabina; Passchier, Jan [GlaxoSmithKline, Clinical Imaging Centre, Imperial College London, Hammersmith Hospital, Du Cane Road, London, W12 0NN (United Kingdom)

    2011-10-15

    A fully automated and GMP-compatible synthesis has been developed to reliably label the M1 receptor agonist GSK1034702 with carbon-11. Stille reaction of the trimethylstannyl precursor with [11C]methyl iodide afforded [11C]GSK1034702 in an estimated 10 ± 3% decay-corrected yield. This method utilises commercially available modular laboratory equipment and provides high-purity [11C]GSK1034702 in a formulation suitable for human use. Highlights: preparation of [11C]GSK1034702 through a Stille cross-coupling reaction; demonstration that commercially available modules can be applied to the synthesis of non-standard PET tracers; definition of a specification for heavy-metal content in the final dose product; presentation of results from validation of the manufacturing process.

  8. Fully automated reconstruction of three-dimensional vascular tree structures from two orthogonal views using computational algorithms and production rules

    Science.gov (United States)

    Liu, Iching; Sun, Ying

    1992-10-01

    A system for reconstructing 3-D vascular structure from two orthogonally projected images is presented. The formidable problem of matching segments between two views is solved using knowledge of the epipolar constraint and the similarity of segment geometry and connectivity. The knowledge is represented in a rule-based system, which also controls the operation of several computational algorithms for tracking segments in each image, representing 2-D segments with directed graphs, and reconstructing 3-D segments from matching 2-D segment pairs. Uncertain reasoning governs the interaction between segmentation and matching; it also provides a framework for resolving the matching ambiguities in an iterative way. The system was implemented in the C language and the C Language Integrated Production System (CLIPS) expert system shell. Using video images of a tree model, the standard deviation of reconstructed centerlines was estimated to be 0.8 mm (1.7 mm) when the view direction was parallel (perpendicular) to the epipolar plane. Feasibility of clinical use was shown using x-ray angiograms of a human chest phantom. The correspondence of vessel segments between two views was accurate. Computational time for the entire reconstruction process was under 30 s on a workstation. A fully automated system for two-view reconstruction that does not require the a priori knowledge of vascular anatomy is demonstrated.
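
    For readers unfamiliar with the two-view geometry used here: with ideal orthogonal parallel projections, a frontal view supplies (x, y) and a lateral view supplies (z, y), and the shared y coordinate is the epipolar constraint the rule-based matcher exploits. The sketch below illustrates that reconstruction step under this simplifying assumption; it is not the authors' implementation.

        def reconstruct_point(front_xy, lateral_zy, tol=1.0):
            """Combine a frontal-view point (x, y) and a lateral-view point (z, y)
            into a 3D point (x, y, z), assuming ideal orthogonal parallel projections.
            The shared y coordinate is the epipolar constraint; point pairs whose y
            values differ by more than `tol` are rejected as a mismatch."""
            x, y1 = front_xy
            z, y2 = lateral_zy
            if abs(y1 - y2) > tol:
                return None  # violates the epipolar constraint
            return (x, 0.5 * (y1 + y2), z)

        print(reconstruct_point((12.0, 30.2), (7.5, 29.9)))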

  9. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for the processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual (crew) processing and for automated machine processing.

  10. A Chip-Capillary Hybrid Device for Automated Transfer of Sample Pre-Separated by Capillary Isoelectric Focusing to Parallel Capillary Gel Electrophoresis for Two-Dimensional Protein Separation

    Science.gov (United States)

    Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong

    2012-01-01

    In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate – polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584

  11. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1 % ammonia (25 %) in water/0.1 % ammonia (25 %) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96 % of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.

  12. An automated coil winding machine for the SSC dipole magnets

    International Nuclear Information System (INIS)

    Kamiya, S.; Iwase, T.; Inoue, I.; Fukui, I.; Ishida, K.; Kashiwagi, S.; Sato, Y.; Yoshihara, T.; Yamamoto, S.; Johnson, E.; Gibson, C.

    1990-01-01

    The authors have finished the preliminary design of a fully automated coil winding machine that can be used to manufacture the large number of SSC dipole magnets. The machine aims to perform all coil winding operations, including the insertion of coil parts, without human operators and at a high production rate. The machine is composed of five industrial robots. In order to verify the design, the authors built a small winding machine using an industrial robot and successfully wound a 1-meter-long coil using SSC dipole magnet wire. The basic design for the full-length coil and the robot winding technique are described in this paper. A fully automated coil winding machine using standard industrial components would be very useful if duplicate production lines are used. 5 figs., 1 tab

  13. CMS on the GRID: Toward a fully distributed computing architecture

    International Nuclear Information System (INIS)

    Innocente, Vincenzo

    2003-01-01

    The computing systems required to collect, analyse and store the physics data at LHC would need to be distributed and global in scope. CMS is actively involved in several grid-related projects to develop and deploy a fully distributed computing architecture. We present here recent developments of tools for automating job submission and for serving data to remote analysis stations. Plans for further test and deployment of a production grid are also described

  14. Simulation Model of Automated Peat Briquetting Press Drive

    Directory of Open Access Journals (Sweden)

    A. A. Marozka

    2012-01-01

    Full Text Available The paper presents the developed fully functional simulation model of an automated peat briquetting press drive. The given model makes it possible to reduce financial and time costs while developing, designing and operating a double-stamp peat briquetting press drive.

  15. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
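
    The three-step workflow described above (wet-lab experiment, data analysis, next-experiment design) forms a closed loop once the steps are automated and chained. The sketch below illustrates that loop with hypothetical stand-in functions for the agents and a toy precipitation-yield model; it is not the platform's actual code.

        # Minimal closed-loop sketch of the experiment -> analysis -> design cycle the
        # record describes. The agents and the yield model are hypothetical stand-ins.

        def run_experiment(condition):
            # stand-in for the liquid-handling robot: returns a simulated yield
            return 1.0 - (condition - 0.35) ** 2

        def analyse(results):
            # stand-in analysis agent: pick the best condition observed so far
            return max(results, key=results.get)

        def design_next(best, step=0.05):
            # stand-in design agent: propose new conditions around the current best
            return [max(0.0, best - step), best, min(1.0, best + step)]

        conditions, results = [0.2, 0.5, 0.8], {}
        for _ in range(3):                      # three automated design rounds
            for c in conditions:
                results[c] = run_experiment(c)  # (1) wet-lab experiment
            best = analyse(results)             # (2) data analysis
            conditions = design_next(best)      # (3) next experiment design
        print(best)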

  16. Automated evaluation of ultrasonic indications. State of the art -development trends. Pt. 1

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  17. Automated pipe handling systems for new and retrofit applications in shallow drilling markets

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, P.; Fikowski, L.M. [Blackbird Well Servicing Inc., Calgary, AB (Canada)

    2003-07-01

    This presentation discussed the importance of the human interface as the main element in the development of automated mechanical systems on drilling rigs. Improvements in drilling rig designs are meant to improve manpower efficiencies and performance. The goal for Blackbird Well Servicing is to design automated and integrated processes that can be controlled manually at any point during an operation. Although some drilling operations can be fully automated and fully integrated, certain steps in the process are intentionally left open-ended for human intervention. It was concluded that consistency of performance is the most significant feature of integrated systems and that all drilling contractors should strive for smooth, steady performance rather than brute labour. Speed and efficiency increase with consistent performance. Reliability results in better performance, thereby lowering operating costs and bringing more work to drilling contractors.

  18. Anchoring protein crystals to mounting loops with hydrogel using inkjet technology.

    Science.gov (United States)

    Shinoda, Akira; Tanaka, Yoshikazu; Yao, Min; Tanaka, Isao

    2014-11-01

    X-ray crystallography is an important technique for structure-based drug discovery, mainly because it is the only technique that can reveal whether a ligand binds to the target protein as well as where and how it binds. However, ligand screening by X-ray crystallography involves a crystal-soaking experiment, which is usually performed manually. Thus, the throughput is not satisfactory for screening large numbers of candidate ligands. In this study, a technique to anchor protein crystals to mounting loops by using gel and inkjet technology has been developed; the method allows soaking of the mounted crystals in ligand-containing solution. This new technique may assist in the design of a fully automated drug-screening pipeline.

  19. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  20. Automation of cDNA Synthesis and Labelling Improves Reproducibility

    Directory of Open Access Journals (Sweden)

    Daniel Klevebring

    2009-01-01

    Full Text Available Background. Several technologies, such as in-depth sequencing and microarrays, enable large-scale interrogation of genomes and transcriptomes. In this study, we assess reproducibility and throughput by moving all laboratory procedures to a robotic workstation, capable of handling superparamagnetic beads. Here, we describe a fully automated procedure for cDNA synthesis and labelling for microarrays, where the purification steps prior to and after labelling are based on precipitation of DNA on carboxylic acid-coated paramagnetic beads. Results. The fully automated procedure allows samples arrayed on a microtiter plate to be processed in parallel without manual intervention, ensuring high reproducibility. We compare our results to a manual sample preparation procedure and, in addition, use a comprehensive reference dataset to show that the protocol described performs better than similar manual procedures. Conclusions. We demonstrate, in an automated gene expression microarray experiment, a reduced variance between replicates, resulting in an increase in the statistical power to detect differentially expressed genes, thus allowing smaller differences between samples to be identified. This protocol can with minor modifications be used to create cDNA libraries for other applications such as in-depth analysis using next-generation sequencing technologies.

  1. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    Science.gov (United States)

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets. © 2014 Society for Laboratory Automation and Screening.

  2. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  3. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    such networking systems are modelled in the process calculus LySa. On top of this programming-language-based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  4. Evaluation of an automated struvite reactor to recover phosphorus ...

    African Journals Online (AJOL)

    2015-04-03

    A reactor was developed that can run fully automated and recover up to 93% ... This technique will work best when the concentration of ...

  5. Automated meteorological data from commercial aircraft via satellite - Present experience and future implications

    Science.gov (United States)

    Steinberg, R.

    1978-01-01

    The National Aeronautics and Space Administration has developed a low-cost communications system to provide meteorological data from commercial aircraft, in near real-time, on a fully automated basis. The complete system including the low profile antenna and all installation hardware weighs 34 kg. The prototype system has been installed on a Pan American B-747 aircraft and has been providing meteorological data (wind angle and velocity, temperature, altitude and position as a function of time) on a fully automated basis for the past several months. The results have been exceptional. This concept is expected to have important implications for operational meteorology and airline route forecasting.

  6. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  7. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduce a systematic approach for planning the access network infrastructure. GIS data and a set of algorithms are employed to make the planning process more automatic. The method explains ... The method, however, does not fully automate the planning but makes the planning process significantly faster. The results and discussion are presented and a conclusion is given at the end.
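
    The abstract does not name the algorithms used; a common building block in automated FTTH planning is routing fibre along a road graph extracted from GIS data with a shortest-path search. The sketch below assumes such a graph (toy junctions and trench lengths) and is purely illustrative, not the authors' method.

        import heapq

        def dijkstra(graph, source, target):
            """Shortest path on a weighted directed graph given as
            {node: [(neighbour, weight), ...]}; weights are e.g. trench lengths in m."""
            dist, prev, queue = {source: 0.0}, {}, [(0.0, source)]
            while queue:
                d, u = heapq.heappop(queue)
                if u == target:
                    break
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in graph.get(u, []):
                    nd = d + w
                    if nd < dist.get(v, float("inf")):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(queue, (nd, v))
            path, node = [target], target
            while node != source:
                node = prev[node]
                path.append(node)
            return path[::-1], dist[target]

        # Toy road graph: central office "CO", junctions "A"/"B", home "H1"
        roads = {"CO": [("A", 120), ("B", 90)], "A": [("H1", 40)],
                 "B": [("A", 30), ("H1", 100)]}
        print(dijkstra(roads, "CO", "H1"))   # cheapest trench route CO -> H1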

  8. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications

    DEFF Research Database (Denmark)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik

    2016-01-01

    was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reverting the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD......, a complete analytical workflow of purification, separation, and analysis of sample could be achieved within only 5.5 min. With the developed system large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool...

  9. Fully Automated Segmentation of Fluid/Cyst Regions in Optical Coherence Tomography Images With Diabetic Macular Edema Using Neutrosophic Sets and Graph Algorithms.

    Science.gov (United States)

    Rashno, Abdolreza; Koozekanani, Dara D; Drayna, Paul M; Nazari, Behzad; Sadri, Saeed; Rabbani, Hossein; Parhi, Keshab K

    2018-05-01

    This paper presents a fully automated algorithm to segment fluid-associated (fluid-filled) and cyst regions in optical coherence tomography (OCT) retina images of subjects with diabetic macular edema. The OCT image is segmented using a novel neutrosophic transformation and a graph-based shortest path method. In the neutrosophic domain, an image is transformed into three sets: T (true), I (indeterminate), which represents noise, and F (false). This paper makes four key contributions. First, a new method is introduced to compute the indeterminacy set I, and a new correction operation is introduced to compute the set T in the neutrosophic domain. Second, a graph shortest-path method is applied in the neutrosophic domain to segment the inner limiting membrane and the retinal pigment epithelium as regions of interest (ROI), and the outer plexiform layer and inner segment myeloid as middle layers, using a novel definition of the edge weights. Third, a new cost function for cluster-based fluid/cyst segmentation in the ROI is presented, which also includes a novel approach for estimating the number of clusters in an automated manner. Fourth, the final fluid regions are obtained by ignoring very small regions and the regions between the middle layers. The proposed method is evaluated using two publicly available datasets (Duke and Optima) and a third, local dataset from the UMN clinic, which is available online. The proposed algorithm outperforms the previously proposed Duke algorithm by 8% with respect to the Dice coefficient and by 5% with respect to precision on the Duke dataset, while achieving about the same sensitivity. Also, the proposed algorithm outperforms a prior method for the Optima dataset by 6%, 22%, and 23% with respect to the Dice coefficient, sensitivity, and precision, respectively. Finally, the proposed algorithm achieves sensitivities of 67.3%, 88.8%, and 76.7% for the Duke, Optima, and University of Minnesota (UMN) datasets, respectively.
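
    To make the graph shortest-path step concrete: a layer boundary can be traced across a cost image by finding the minimum-cost left-to-right path. The simplified dynamic-programming sketch below uses hypothetical pixel costs as edge weights and plus/minus-one-row connectivity; the paper's actual edge-weight definition and neutrosophic costs are more elaborate.

        import numpy as np

        def boundary_by_shortest_path(cost):
            """Trace one layer boundary across a 2D cost image (rows x cols) by dynamic
            programming: in each column the path may move up or down by at most one row,
            and the total cost along the path is minimised. Simplified stand-in for the
            graph shortest-path step, with edge weights taken as the pixel costs."""
            rows, cols = cost.shape
            acc = cost.copy()
            back = np.zeros((rows, cols), dtype=int)
            for c in range(1, cols):
                for r in range(rows):
                    lo, hi = max(0, r - 1), min(rows, r + 2)
                    prev = acc[lo:hi, c - 1]
                    back[r, c] = lo + int(np.argmin(prev))
                    acc[r, c] += prev.min()
            path = [int(np.argmin(acc[:, -1]))]
            for c in range(cols - 1, 0, -1):
                path.append(back[path[-1], c])
            return path[::-1]                  # row index of the boundary in each column

        cost = np.random.rand(20, 30)          # e.g. an inverted gradient image
        print(boundary_by_shortest_path(cost)[:5])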

  10. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1988-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3 x 10^9 bits per 14x17 inch film. This is equivalent to 2200 computer floppy disks. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser scans a 14x17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system called FDRS (for film digital radiography system) is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the majority of image content needs. (Author). 4 refs.; 21 figs

  11. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  12. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    Science.gov (United States)

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.

  13. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging.

    Science.gov (United States)

    Patel, Tapan P; Man, Karen; Firestein, Bonnie L; Meaney, David F

    2015-03-30

    Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s to 1000+ neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking. Here we introduce FluoroSNNAP, Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single-cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule associated protein tau and wild-type tau. We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. Copyright © 2015. Published by Elsevier B.V.
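
    The template-based transient detection mentioned above can be illustrated with a minimal sketch: slide a canonical calcium-transient template along a dF/F trace and flag windows whose correlation with the template is high. The template shape and threshold below are assumptions for illustration, not FluoroSNNAP's parameters.

        import numpy as np

        def detect_transients(trace, template, threshold=0.85):
            """Mark window start indices whose Pearson correlation with the
            transient template exceeds `threshold` (illustrative value)."""
            n = len(template)
            t = (template - template.mean()) / template.std()
            onsets = []
            for i in range(len(trace) - n):
                w = trace[i:i + n]
                if w.std() == 0:
                    continue                                      # skip flat windows
                r = np.dot((w - w.mean()) / w.std(), t) / n       # Pearson correlation
                if r > threshold:
                    onsets.append(i)
            return onsets

        # Example: fast-rise, slow-decay template and a synthetic trace with one event
        template = np.exp(-np.arange(20) / 6.0) - np.exp(-np.arange(20) / 1.5)
        trace = np.zeros(200)
        trace[100:120] += template
        print(detect_transients(trace, template))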

  14. Multiagent-Based Flexible Automation of Microproduction Systems Including Mobile Transport Robots

    OpenAIRE

    Voos, Holger; Wangmanaopituk, Suparchoek

    2013-01-01

    In microproduction, i.e. in the production and assembly of micro-scale components and products, fully automated systems hardly exist so far. Besides the requirements of handling small parts with extreme precision, small batch sizes of highly customized products are among the main challenges. Therefore, economic microproduction requires very flexible production systems with a high level of automation. This contribution proposes a new concept of such a system that provides two main innova...

  15. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system

    Energy Technology Data Exchange (ETDEWEB)

    Ikeya, Teppei [Goethe University Frankfurt am Main, Institute of Biophysical Chemistry, Center for Biomolecular Magnetic Resonance (Germany); Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune [Tokyo Metropolitan University, Graduate School of Science (Japan)], E-mail: kainosho@nmr.chem.metro-u.ac.jp; Guentert, Peter [Goethe University Frankfurt am Main, Institute of Biophysical Chemistry, Center for Biomolecular Magnetic Resonance (Germany)], E-mail: guentert@em.uni-frankfurt.de

    2009-08-15

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional 'through-bond' spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.

  16. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system

    International Nuclear Information System (INIS)

    Ikeya, Teppei; Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune; Guentert, Peter

    2009-01-01

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional 'through-bond' spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.

  17. Automated NMR structure determination of stereo-array isotope labeled ubiquitin from minimal sets of spectra using the SAIL-FLYA system.

    Science.gov (United States)

    Ikeya, Teppei; Takeda, Mitsuhiro; Yoshida, Hitoshi; Terauchi, Tsutomu; Jee, Jun-Goo; Kainosho, Masatsune; Güntert, Peter

    2009-08-01

    Stereo-array isotope labeling (SAIL) has been combined with the fully automated NMR structure determination algorithm FLYA to determine the three-dimensional structure of the protein ubiquitin from different sets of input NMR spectra. SAIL provides a complete stereo- and regio-specific pattern of stable isotopes that results in sharper resonance lines and reduced signal overlap, without information loss. Here we show that as a result of the superior quality of the SAIL NMR spectra, reliable, fully automated analyses of the NMR spectra and structure calculations are possible using fewer input spectra than with conventional uniformly 13C/15N-labeled proteins. FLYA calculations with SAIL ubiquitin, using a single three-dimensional "through-bond" spectrum (and 2D HSQC spectra) in addition to the 13C-edited and 15N-edited NOESY spectra for conformational restraints, yielded structures with an accuracy of 0.83-1.15 Å for the backbone RMSD to the conventionally determined solution structure of SAIL ubiquitin. NMR structures can thus be determined almost exclusively from the NOESY spectra that yield the conformational restraints, without the need to record many spectra only for determining intermediate, auxiliary data of the chemical shift assignments. The FLYA calculations for this report resulted in 252 ubiquitin structure bundles, obtained with different input data but identical structure calculation and refinement methods. These structures cover the entire range from highly accurate structures to seriously, but not trivially, wrong structures, and thus constitute a valuable database for the substantiation of structure validation methods.
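
    The accuracy figures above are backbone RMSD values to a reference structure, computed after optimal superposition. A minimal sketch of that metric (Kabsch superposition followed by RMSD) is given below with random stand-in coordinates; it is not the FLYA implementation.

        import numpy as np

        def backbone_rmsd(coords_a, coords_b):
            """RMSD between two sets of matched backbone coordinates (N x 3 arrays)
            after optimal superposition via the Kabsch algorithm."""
            a = coords_a - coords_a.mean(axis=0)
            b = coords_b - coords_b.mean(axis=0)
            u, s, vt = np.linalg.svd(a.T @ b)
            d = np.sign(np.linalg.det(vt.T @ u.T))        # avoid an improper rotation
            rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
            diff = a @ rot.T - b
            return np.sqrt((diff ** 2).sum() / len(a))

        a = np.random.rand(76, 3) * 10                    # stand-in backbone coordinates
        b = a + np.random.normal(scale=0.5, size=a.shape) # perturbed copy as "reference"
        print(backbone_rmsd(a, b))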

  18. Sifting through genomes with iterative-sequence clustering produces a large, phylogenetically diverse protein-family resource.

    Science.gov (United States)

    Sharpton, Thomas J; Jospin, Guillaume; Wu, Dongying; Langille, Morgan G I; Pollard, Katherine S; Eisen, Jonathan A

    2012-10-13

    New computational resources are needed to manage the increasing volume of biological data from genome sequencing projects. One fundamental challenge is the ability to maintain a complete and current catalog of protein diversity. We developed a new approach for the identification of protein families that focuses on the rapid discovery of homologous protein sequences. We implemented fully automated and high-throughput procedures to de novo cluster proteins into families based upon global alignment similarity. Our approach employs an iterative clustering strategy in which homologs of known families are sifted out of the search for new families. The resulting reduction in computational complexity enables us to rapidly identify novel protein families found in new genomes and to perform efficient, automated updates that keep pace with genome sequencing. We refer to protein families identified through this approach as "Sifting Families," or SFams. Our analysis of ~10.5 million protein sequences from 2,928 genomes identified 436,360 SFams, many of which are not represented in other protein family databases. We validated the quality of SFam clustering through statistical as well as network topology-based analyses. We describe the rapid identification of SFams and demonstrate how they can be used to annotate genomes and metagenomes. The SFam database catalogs protein-family quality metrics, multiple sequence alignments, hidden Markov models, and phylogenetic trees. Our source code and database are publicly available and will be subject to frequent updates (http://edhar.genomecenter.ucdavis.edu/sifting_families/).
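
    The iterative sifting strategy can be sketched in a few lines: sequences that hit an existing family are assigned to it and removed, and only the remainder is clustered de novo into new families. The similarity function, threshold, and greedy clustering below are placeholder assumptions, not the SFam pipeline's actual alignment tools.

        # Sketch of the "sifting" idea described by the record; all parameters are illustrative.

        def similarity(seq_a, seq_b):
            # placeholder for a global alignment score in [0, 1]
            matches = sum(x == y for x, y in zip(seq_a, seq_b))
            return matches / max(len(seq_a), len(seq_b))

        def sift_and_cluster(new_seqs, families, threshold=0.7):
            remainder = []
            for seq in new_seqs:                       # 1) sift out homologs of known families
                hit = next((fam for fam, members in families.items()
                            if any(similarity(seq, m) >= threshold for m in members)), None)
                if hit is not None:
                    families[hit].append(seq)
                else:
                    remainder.append(seq)
            for seq in remainder:                      # 2) cluster the rest de novo
                placed = False
                for fam_id, members in families.items():
                    if fam_id.startswith("new") and similarity(seq, members[0]) >= threshold:
                        members.append(seq)
                        placed = True
                        break
                if not placed:
                    families[f"new{len(families)}"] = [seq]
            return families

        families = {"fam1": ["MKTAYIAKQR"], "fam2": ["MSLLTEVETY"]}
        print(sift_and_cluster(["MKTAYIAKQK", "GGGGGGGGGG"], families))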

  19. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  20. Exploring the Use of a Test Automation Framework

    Science.gov (United States)

    Cervantes, Alex

    2009-01-01

    It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems occur in the implementation phase of a development project, they normally cause the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.

  1. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

    Fragments recovered from the report scan: a cited reference, "Diagnostics Based on the Principles of Artificial Intelligence", 1984 International Test Conference, 1 Oct 1984; a glossary excerpt (AFSATCOM, Air Force Satellite Communication; AI, Artificial Intelligence; ASIC, Application Specific ...); and a body excerpt stating that Built-In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be ...

  2. Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance

    Science.gov (United States)

    Sethumadhavan, A.

    2009-01-01

    The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poor performance. Thus, the costs associated with automation failure are greater when automation is applied to higher order stages of information processing. Results have practical implications for automation design and development of SA training programs.

  3. Automated visual fruit detection for harvest estimation and robotic harvesting

    OpenAIRE

    Puttemans, Steven; Vanbrabant, Yasmin; Tits, Laurent; Goedemé, Toon

    2016-01-01

    Fully automated detection and localisation of fruit in orchards is a key component in creating automated robotic harvesting systems, a dream of many farmers around the world to cope with large production and personnel costs. In recent years a lot of research on this topic has been performed, using basic computer vision techniques, like colour-based segmentation, as a suggested solution. When not using standard RGB cameras, research tends to resort to other sensors, like hyperspectral or 3D. ...
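
    As an illustration of the colour-based segmentation baseline the record refers to (not the authors' detector), the sketch below thresholds a red hue range in HSV and keeps large blobs as candidate fruit; the hue limits and minimum area are assumed values.

        import cv2
        import numpy as np

        def detect_red_fruit(bgr, min_area=200):
            """Return bounding boxes of large red blobs in a BGR image (toy baseline)."""
            hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))        # low red hues
            mask |= cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))    # hue wrap-around
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)      # OpenCV 4 signature
            return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

        # Synthetic test image: green background with one red disc
        img = np.zeros((200, 200, 3), dtype=np.uint8)
        img[:, :] = (0, 128, 0)                              # green background (BGR)
        cv2.circle(img, (100, 100), 30, (0, 0, 255), -1)     # red disc
        print(detect_red_fruit(img))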

  4. Flow induced dispersion analysis rapidly quantifies proteins in human plasma samples

    DEFF Research Database (Denmark)

    Poulsen, Nicklas N; Andersen, Nina Z; Østergaard, Jesper

    2015-01-01

    Rapid and sensitive quantification of protein-based biomarkers and drugs is a substantial challenge in diagnostics and biopharmaceutical drug development. Current technologies, such as ELISA, are characterized by being slow (hours), requiring relatively large amounts of sample and being subject ... to cumbersome and expensive assay development. In this work a new approach for quantification based on changes in diffusivity is presented. The apparent diffusivity of an indicator molecule interacting with the protein of interest is determined by Taylor Dispersion Analysis (TDA) in a hydrodynamic flow system ... in a blood plasma matrix), fully automated, and being subject to a simple assay development. FIDA is demonstrated for quantification of the protein Human Serum Albumin (HSA) in human plasma as well as for quantification of an antibody against HSA. The sensitivity of the FIDA assay depends on the indicator
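
    For context on the readout: under standard Taylor dispersion assumptions, the apparent diffusion coefficient follows from the peak appearance time and the temporal variance of the taylorgram as D = r_c^2 * t_R / (24 * sigma^2). The capillary radius and peak parameters below are illustrative values, not those of the FIDA assay.

        def taylor_diffusivity(radius_m, t_r_s, sigma_s):
            """Apparent diffusion coefficient (m^2/s) from a Gaussian taylorgram:
            capillary inner radius, peak appearance time, and temporal std. dev."""
            return radius_m ** 2 * t_r_s / (24.0 * sigma_s ** 2)

        r_c = 37.5e-6          # capillary inner radius (m), hypothetical
        t_r = 180.0            # peak appearance time (s)
        sigma = 6.0            # temporal standard deviation of the peak (s)
        print(taylor_diffusivity(r_c, t_r, sigma))   # broader peak -> smaller D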

  5. Preface to the special section on human factors and automation in vehicles: designing highly automated vehicles with the driver in mind.

    Science.gov (United States)

    Merat, Natasha; Lee, John D

    2012-10-01

    This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.

  6. LC-MS/MS Peptide Mapping with Automated Data Processing for Routine Profiling of N-Glycans in Immunoglobulins

    Science.gov (United States)

    Shah, Bhavana; Jiang, Xinzhao Grace; Chen, Louise; Zhang, Zhongqi

    2014-06-01

    Protein N-Glycan analysis is traditionally performed by high pH anion exchange chromatography (HPAEC), reversed phase liquid chromatography (RPLC), or hydrophilic interaction liquid chromatography (HILIC) on fluorescence-labeled glycans enzymatically released from the glycoprotein. These methods require time-consuming sample preparations and do not provide site-specific glycosylation information. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) peptide mapping is frequently used for protein structural characterization and, as a bonus, can potentially provide glycan profile on each individual glycosylation site. In this work, a recently developed glycopeptide fragmentation model was used for automated identification, based on their MS/MS, of N-glycopeptides from proteolytic digestion of monoclonal antibodies (mAbs). Experimental conditions were optimized to achieve accurate profiling of glycoforms. Glycan profiles obtained from LC-MS/MS peptide mapping were compared with those obtained from HPAEC, RPLC, and HILIC analyses of released glycans for several mAb molecules. Accuracy, reproducibility, and linearity of the LC-MS/MS peptide mapping method for glycan profiling were evaluated. The LC-MS/MS peptide mapping method with fully automated data analysis requires less sample preparation, provides site-specific information, and may serve as an alternative method for routine profiling of N-glycans on immunoglobulins as well as other glycoproteins with simple N-glycans.
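
    A small part of the glycopeptide assignment described above is simple mass bookkeeping: a candidate glycopeptide mass is the peptide mass plus the summed monosaccharide residue masses, accepted when it matches the observed mass within a ppm tolerance. The peptide mass and tolerance below are hypothetical (the residue masses are standard values), and real identification additionally relies on the MS/MS fragmentation model described in the record.

        # Monoisotopic monosaccharide residue masses (Da), standard values
        RESIDUE_MASS = {"HexNAc": 203.07937, "Hex": 162.05282, "Fuc": 146.05791}

        def glycopeptide_mass(peptide_mass, composition):
            """Peptide mass plus the summed glycan residue masses."""
            return peptide_mass + sum(RESIDUE_MASS[m] * n for m, n in composition.items())

        def ppm_error(observed, theoretical):
            return (observed - theoretical) / theoretical * 1e6

        peptide = 1188.5047                           # hypothetical peptide mass (Da)
        g0f = {"HexNAc": 4, "Hex": 3, "Fuc": 1}       # core-fucosylated G0F composition
        theo = glycopeptide_mass(peptide, g0f)
        print(theo, ppm_error(theo + 0.01, theo))     # about 4 ppm mass error in this example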

  7. Automated identification of protein-ligand interaction features using Inductive Logic Programming: a hexose binding case study.

    Science.gov (United States)

    A Santos, Jose C; Nassif, Houssam; Page, David; Muggleton, Stephen H; E Sternberg, Michael J

    2012-07-11

    There is a need for automated methods to learn general features of the interactions of a ligand class with its diverse set of protein receptors. An appropriate machine learning approach is Inductive Logic Programming (ILP), which automatically generates comprehensible rules in addition to prediction. The development of ILP systems which can learn rules of the complexity required for studies on protein structure remains a challenge. In this work we use a new ILP system, ProGolem, and demonstrate its performance on learning features of hexose-protein interactions. The rules induced by ProGolem detect interactions mediated by aromatics and by planar-polar residues, in addition to less common features such as the aromatic sandwich. The rules also reveal a previously unreported dependency for residues cys and leu. They also specify interactions involving aromatic and hydrogen bonding residues. This paper shows that Inductive Logic Programming implemented in ProGolem can derive rules giving structural features of protein/ligand interactions. Several of these rules are consistent with descriptions in the literature. In addition to confirming literature results, ProGolem's model has a 10-fold cross-validated predictive accuracy that is superior, at the 95% confidence level, to another ILP system previously used to study protein/hexose interactions and is comparable with state-of-the-art statistical learners.

  8. Automated Motion Estimation for 2D Cine DENSE MRI

    Science.gov (United States)

    Gilliam, Andrew D.; Epstein, Frederick H.

    2013-01-01

    Cine displacement encoding with stimulated echoes (DENSE) is a magnetic resonance (MR) method that directly encodes tissue displacement into MR phase images. This technique has successfully interrogated many forms of tissue motion, but is most commonly used to evaluate cardiac mechanics. Currently, motion analysis from cine DENSE images requires manually delineated anatomical structures. An automated analysis would improve measurement throughput, simplify data interpretation, and potentially access important physiological information during the MR exam. In this article, we present the first fully automated solution for the estimation of tissue motion and strain from 2D cine DENSE data. Results using both simulated and human cardiac cine DENSE data indicate good agreement between the automated algorithm and the standard semi-manual analysis method. PMID:22575669

  9. Operating procedure automation to enhance safety of nuclear power plants

    International Nuclear Information System (INIS)

    Husseiny, A.A.; Sabri, Z.A.; Adams, S.K.; Rodriguez, R.J.; Packer, D.; Holmes, J.W.

    1989-01-01

    Use of logic statements and computer assistance are explored as means for automation and improvement of the design of operating procedures, including those employed in abnormal and emergency situations. Operating procedures for downpower and loss of forced circulation are used for demonstration. Human-factors analysis is performed on generic emergency operating procedures for three strategies of control: manual, semi-automatic and automatic, using standard emergency operating procedures. Such preliminary analysis shows that automation of procedures is feasible provided that fault-tolerant software and hardware become available for design of the controllers. Recommendations are provided for tests to substantiate the promise of enhancement of plant safety. Adequate design of operating procedures through automation may alleviate several major operational problems of nuclear power plants. Also, automation of procedures is necessary for partial or overall automatic control of plants. Fully automatic operations are needed for space applications, while supervised automation of land-based and offshore plants may become the thrust of a new generation of nuclear power plants. (orig.)

  10. Analysis of an Automated Vehicle Routing Problem in Logistics considering Path Interruption

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-01-01

    Full Text Available The application of automated vehicles in logistics can efficiently reduce the cost of logistics and reduce the potential risks in the last mile. Considering the path restrictions in the initial stage of applying automated vehicles to logistics, the conventional model for a vehicle routing problem (VRP) is modified. Thus, the automated vehicle routing problem with time windows (AVRPTW) model considering path interruption is established. Additionally, an improved particle swarm optimisation (PSO) algorithm is designed to solve this problem. Finally, a case study is undertaken to test the validity of the model and the algorithm. Four automated vehicles are designated to execute all delivery tasks required by 25 stores, and the capacities of all of the automated vehicles are almost fully utilised. Extending such research to the real problems arising in this initial deployment period is of considerable significance for the promotion of automated vehicles in last-mile logistics.
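
    A rough illustration of a random-key particle swarm approach to a toy capacitated delivery problem is sketched below in Python. It is not the paper's AVRPTW model (no time windows or path-interruption constraints are represented), and all instance data, parameter values and function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (hypothetical data): depot at the origin, 25 stores, unit demands.
n_stores, capacity = 25, 7
coords = rng.uniform(-10, 10, size=(n_stores, 2))
depot = np.zeros(2)

def decode(keys):
    """Random-key decoding: sort keys to get a visiting order, then split the
    order into routes whenever the vehicle capacity would be exceeded."""
    order = np.argsort(keys)
    routes, route = [], []
    for store in order:
        if len(route) == capacity:
            routes.append(route)
            route = []
        route.append(int(store))
    if route:
        routes.append(route)
    return routes

def cost(keys):
    """Total travel distance of all routes, each starting and ending at the depot."""
    total = 0.0
    for route in decode(keys):
        pts = np.vstack([depot, coords[route], depot])
        total += np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()
    return total

# Plain global-best PSO over the continuous key space.
n_particles, n_iter, w, c1, c2 = 40, 300, 0.7, 1.5, 1.5
x = rng.uniform(0, 1, (n_particles, n_stores))
v = np.zeros_like(x)
pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_particles, n_stores))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    costs = np.array([cost(p) for p in x])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best total distance:", round(float(pbest_cost.min()), 2))
print("routes:", decode(gbest))
```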

  11. Vibration-based Energy Harvesting Systems Characterization Using Automated Electronic Equipment

    Directory of Open Access Journals (Sweden)

    Ioannis KOSMADAKIS

    2015-04-01

    Full Text Available A measurement bench has been developed to fully automate the procedure for the characterization of a vibration-based energy scavenging system. The measurement system is capable of monitoring all important characteristics of a vibration harvesting system (input and output voltage, current, and other parameters, frequency and acceleration values, etc.). It is composed of a PC, typical digital measuring instruments (oscilloscope, waveform generator, etc.), certain sensors and actuators, along with a microcontroller-based automation module. The automation of the procedure and the manipulation of the acquired data are performed by LabVIEW software. Typical measurements of a system consisting of a vibrating source, a vibration transducer and an active rectifier are presented.

  12. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  13. Automated, High Temperature Furnace for Glovebox Operation

    International Nuclear Information System (INIS)

    Neikirk, K.

    2001-01-01

    The U.S. Department of Energy will immobilize excess plutonium in the proposed Plutonium Immobilization Plant (PIP) at the Savannah River Site (SRS) as part of a two track approach for the disposition of weapons usable plutonium. As such, the Department of Energy is funding a development and testing effort for the PIP. This effort is being performed jointly by Lawrence Livermore National Laboratory (LLNL), Westinghouse Savannah River Company (WSRC), Pacific Northwest National Laboratory (PNNL), and Argonne National Laboratory (ANL). The Plutonium Immobilization process involves the disposition of excess plutonium by incorporation into ceramic pucks. As part of the immobilization process, furnaces are needed for sintering the ceramic pucks. The furnace being developed for puck sintering is an automated, bottom-loaded furnace with an insulating package and resistance heating elements located within a nuclear glovebox. Other furnaces considered for the application include retort furnaces and pusher furnaces. This paper, in part, will discuss the furnace technologies considered and the furnace technology selected to support reliable puck sintering in a glovebox environment. Due to the radiation levels and contamination associated with the plutonium material, the sintering process will be fully automated and contained within nuclear material gloveboxes. As such, the furnace currently under development incorporates water and air cooling to minimize heat load to the glovebox. This paper will describe the furnace equipment and systems needed to employ a fully automated puck sintering process within nuclear gloveboxes as part of the Plutonium Immobilization Plant.

  14. A Framework for the Automation of Air Defence Systems

    NARCIS (Netherlands)

    Choenni, R.S.; Leijnse, C.

    The need for more efficiency in military organizations is growing. It is expected that a significant increase in efficiency can be obtained by an integration of communication and information technology. This integration may result in (sub)systems that are fully automated, i.e., systems that are

  15. Advances toward fully automated in vivo assessment of oral epithelial dysplasia by nuclear endomicroscopy-A pilot study.

    Science.gov (United States)

    Liese, Jan; Winter, Karsten; Glass, Änne; Bertolini, Julia; Kämmerer, Peer Wolfgang; Frerich, Bernhard; Schiefke, Ingolf; Remmerbach, Torsten W

    2017-11-01

    Uncertainties in detection of oral epithelial dysplasia (OED) frequently result from sampling error, especially in inflammatory oral lesions. Endomicroscopy allows non-invasive, "en face" imaging of the upper oral epithelium, but parameters of OED are unknown. Mucosal nuclei were imaged in 34 toluidine blue-stained oral lesions with a commercial endomicroscope. Histopathological diagnosis placed four biopsies in the "dys-/neoplastic," 23 in the "inflammatory," and seven in the "others" disease group. The strength of different assessment strategies (nuclear scoring, nuclear count, and automated nuclear analysis) was measured by the area under the ROC curve (AUC) for identifying the histopathological "dys-/neoplastic" group. Nuclear objects from automated image analysis were visually corrected. The best-performing parameters of nuclear-to-image ratios were the count of large nuclei (AUC=0.986) and the 6-nearest neighborhood relation (AUC=0.896), and the best parameters of nuclear polymorphism were the count of atypical nuclei (AUC=0.996) and the compactness of nuclei (AUC=0.922). Excluding low-grade OED, nuclear scoring and count reached 100% sensitivity and 98% specificity for detection of dys-/neoplastic lesions. In automated analysis, combining parameters enhanced diagnostic strength. Sensitivity of 100% and specificity of 87% were seen for distances of 6-nearest neighbors and aspect ratios even in uncorrected objects. Correction improved measures of nuclear polymorphism only. The hue of the background color was a stronger predictor of the dys-/neoplastic group than nuclear density (AUC=0.779 vs 0.687), indicating that the macroscopic aspect is biased. Nuclear-to-image ratios are applicable for automated optical in vivo diagnostics for oral potentially malignant disorders. Nuclear endomicroscopy may promote non-invasive, early detection of dys-/neoplastic lesions by reducing sampling error. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Auditory interfaces in automated driving: an international survey

    Directory of Open Access Journals (Sweden)

    Pavlo Bazilinskyy

    2015-08-01

    Full Text Available This study investigated people’s opinions on auditory interfaces in contemporary cars and their willingness to be exposed to auditory feedback in automated driving. We used an Internet-based survey to collect 1,205 responses from 91 countries. The respondents stated their attitudes towards two existing auditory driver assistance systems, a parking assistant (PA) and a forward collision warning system (FCWS), as well as towards a futuristic augmented sound system (FS) proposed for fully automated driving. The respondents were positive towards the PA and FCWS, and rated the willingness to have automated versions of these systems as 3.87 and 3.77, respectively (on a scale from 1 = disagree strongly to 5 = agree strongly). The respondents tolerated the FS (the mean willingness to use it was 3.00 on the same scale). The results showed that among the available response options, the female voice was the most preferred feedback type for takeover requests in highly automated driving, regardless of whether the respondents’ country was English speaking or not. The present results could be useful for designers of automated vehicles and other stakeholders.

  17. Automated design of degenerate codon libraries.

    Science.gov (United States)

    Mena, Marco A; Daugherty, Patrick S

    2005-12-01

    Degenerate codon libraries are frequently used in protein engineering and evolution studies but are often limited to targeting a small number of positions to adequately limit the search space. To mitigate this, codon degeneracy can be limited using heuristics or previous knowledge of the targeted positions. To automate design of libraries given a set of amino acid sequences, an algorithm (LibDesign) was developed that generates a set of possible degenerate codon libraries, their resulting size, and their score relative to a user-defined scoring function. A gene library of a specified size can then be constructed that is representative of the given amino acid distribution or that includes specific sequences or combinations thereof. LibDesign provides a new tool for automated design of high-quality protein libraries that more effectively harness existing sequence-structure information derived from multiple sequence alignment or computational protein design data.
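
    To illustrate the kind of bookkeeping such a tool performs, the Python sketch below expands an IUPAC degenerate codon into its concrete codons and encoded amino acids and applies a toy scoring rule. It is not the LibDesign algorithm; the scoring function and all names are invented for illustration only.

```python
from itertools import product

# IUPAC degenerate nucleotide codes.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "AG", "Y": "CT",
         "S": "CG", "W": "AT", "K": "GT", "M": "AC", "B": "CGT",
         "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

# Standard genetic code, with bases ordered T, C, A, G at each position.
BASES = "TCAG"
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {b1 + b2 + b3: AA[16 * i + 4 * j + k]
               for i, b1 in enumerate(BASES)
               for j, b2 in enumerate(BASES)
               for k, b3 in enumerate(BASES)}

def expand(degenerate_codon):
    """All concrete codons encoded by a degenerate codon such as 'NNK'."""
    return ["".join(c) for c in product(*(IUPAC[b] for b in degenerate_codon))]

def amino_acids(degenerate_codon):
    """Multiset of amino acids (and stops '*') encoded by a degenerate codon."""
    return [CODON_TABLE[c] for c in expand(degenerate_codon)]

def score(degenerate_codon, wanted):
    """Toy score: fraction of wanted residues covered, penalized by stop codons.
    (LibDesign uses a user-defined scoring function; this is only illustrative.)"""
    aas = amino_acids(degenerate_codon)
    coverage = len(set(aas) & set(wanted)) / len(set(wanted))
    stop_penalty = aas.count("*") / len(aas)
    return coverage - stop_penalty, len(aas)

print(sorted(set(amino_acids("NNK"))))   # all 20 amino acids plus the amber stop '*'
print(score("NNK", wanted="ACDEFG"))     # (coverage - penalty, number of codons)
```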

  18. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  19. Automated classification of immunostaining patterns in breast tissue from the human protein atlas.

    Science.gov (United States)

    Swamidoss, Issac Niwas; Kårsnäs, Andreas; Uhlmann, Virginie; Ponnusamy, Palanisamy; Kampf, Caroline; Simonsson, Martin; Wählby, Carolina; Strand, Robin

    2013-01-01

    The Human Protein Atlas (HPA) is an effort to map the location of all human proteins (http://www.proteinatlas.org/). It contains a large number of histological images of sections from human tissue. Tissue micro arrays (TMA) are imaged by a slide scanning microscope, and each image represents a thin slice of a tissue core with a dark brown antibody specific stain and a blue counter stain. When generating antibodies for protein profiling of the human proteome, an important step in the quality control is to compare staining patterns of different antibodies directed towards the same protein. This comparison is an ultimate control that the antibody recognizes the right protein. In this paper, we propose and evaluate different approaches for classifying sub-cellular antibody staining patterns in breast tissue samples. The proposed methods include the computation of various features including gray level co-occurrence matrix (GLCM) features, complex wavelet co-occurrence matrix (CWCM) features, and weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM)-inspired features. The extracted features are used into two different multivariate classifiers (support vector machine (SVM) and linear discriminant analysis (LDA) classifier). Before extracting features, we use color deconvolution to separate different tissue components, such as the brownly stained positive regions and the blue cellular regions, in the immuno-stained TMA images of breast tissue. We present classification results based on combinations of feature measurements. The proposed complex wavelet features and the WND-CHARM features have accuracy similar to that of a human expert. Both human experts and the proposed automated methods have difficulties discriminating between nuclear and cytoplasmic staining patterns. This is to a large extent due to mixed staining of nucleus and cytoplasm. Methods for quantification of staining patterns in histopathology have many
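
    A minimal sketch of this style of pipeline is shown below, assuming scikit-image and scikit-learn are available: colour deconvolution (rgb2hed) as a stand-in for the paper's stain separation, GLCM texture features from the DAB channel, and an SVM with cross-validation. The images and labels here are synthetic placeholders, and the feature set is far smaller than the one evaluated in the paper.

```python
import numpy as np
from skimage import color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def glcm_features(rgb_image):
    """Separate the DAB (antibody) stain with colour deconvolution and
    compute a few GLCM texture features from it."""
    hed = color.rgb2hed(rgb_image)            # haematoxylin / eosin / DAB channels
    dab = hed[:, :, 2]
    dab = (dab - dab.min()) / (np.ptp(dab) + 1e-9)
    glcm = graycomatrix(img_as_ubyte(dab), distances=[1, 3],
                        angles=[0, np.pi / 2], levels=256,
                        symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Stand-ins for TMA core images and their staining-pattern labels
# (e.g. 0 = nuclear, 1 = cytoplasmic); real data would be read from files.
rng = np.random.default_rng(0)
images = [rng.uniform(0, 1, (128, 128, 3)) for _ in range(40)]
labels = np.array([0, 1] * 20)

X = np.vstack([glcm_features(img) for img in images])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```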

  20. Automated classification of immunostaining patterns in breast tissue from the human protein Atlas

    Directory of Open Access Journals (Sweden)

    Issac Niwas Swamidoss

    2013-01-01

    Full Text Available Background: The Human Protein Atlas (HPA) is an effort to map the location of all human proteins (http://www.proteinatlas.org/). It contains a large number of histological images of sections from human tissue. Tissue micro arrays (TMA) are imaged by a slide scanning microscope, and each image represents a thin slice of a tissue core with a dark brown antibody specific stain and a blue counter stain. When generating antibodies for protein profiling of the human proteome, an important step in the quality control is to compare staining patterns of different antibodies directed towards the same protein. This comparison is an ultimate control that the antibody recognizes the right protein. In this paper, we propose and evaluate different approaches for classifying sub-cellular antibody staining patterns in breast tissue samples. Materials and Methods: The proposed methods include the computation of various features including gray level co-occurrence matrix (GLCM) features, complex wavelet co-occurrence matrix (CWCM) features, and weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM)-inspired features. The extracted features are used in two different multivariate classifiers (support vector machine (SVM) and linear discriminant analysis (LDA) classifier). Before extracting features, we use color deconvolution to separate different tissue components, such as the brownly stained positive regions and the blue cellular regions, in the immuno-stained TMA images of breast tissue. Results: We present classification results based on combinations of feature measurements. The proposed complex wavelet features and the WND-CHARM features have accuracy similar to that of a human expert. Conclusions: Both human experts and the proposed automated methods have difficulties discriminating between nuclear and cytoplasmic staining patterns. This is to a large extent due to mixed staining of nucleus and cytoplasm. Methods for

  1. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  2. Accurate, fully-automated NMR spectral profiling for metabolomics.

    Directory of Open Access Journals (Sweden)

    Siamak Ravanbakhsh

    Full Text Available Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra involving > 50 compounds, show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of
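
    BAYESIL itself performs inference in a probabilistic graphical model; the sketch below is only a much simpler stand-in that fits a synthetic mixture spectrum as a non-negative combination of reference compound spectra using SciPy's NNLS. All spectra, peak positions and concentrations are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# Synthetic 1D spectra: each reference compound is a sum of Lorentzian peaks.
ppm = np.linspace(0, 10, 2000)

def lorentzian(center, width=0.02):
    return 1.0 / (1.0 + ((ppm - center) / width) ** 2)

library = {
    "glucose": lorentzian(3.2) + lorentzian(3.4) + lorentzian(5.2),
    "lactate": lorentzian(1.3) + lorentzian(4.1),
    "alanine": lorentzian(1.5) + lorentzian(3.8),
}
names = list(library)
A = np.column_stack([library[n] for n in names])   # reference spectra matrix

# Simulated biofluid spectrum: known concentrations plus noise.
true_conc = np.array([2.0, 0.5, 1.2])
mixture = A @ true_conc + rng.normal(0, 0.02, ppm.size)

# Non-negative least squares: concentrations cannot be negative.
conc, residual = nnls(A, mixture)
for name, c_true, c_fit in zip(names, true_conc, conc):
    print(f"{name:8s} true={c_true:.2f} fitted={c_fit:.2f}")
```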

  3. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    Science.gov (United States)

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a significant economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were by the use of the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase of specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
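
    The reporting rule described above (suppressing vanB calls with a threshold cycle above 35) reduces to a few lines of logic; the function and result encoding below are hypothetical and only illustrate the cutoff idea.

```python
def interpret_vre(ct_vana=None, ct_vanb=None, vanb_cutoff=35.0):
    """Interpret real-time PCR results for VRE screening.

    A detected target is reported by its threshold cycle (Ct); None means no
    amplification. vanB calls above the cutoff Ct are treated as negative to
    improve specificity, as described in the abstract.
    """
    result = []
    if ct_vana is not None:
        result.append("vanA positive (Ct %.1f)" % ct_vana)
    if ct_vanb is not None:
        if ct_vanb <= vanb_cutoff:
            result.append("vanB positive (Ct %.1f)" % ct_vanb)
        else:
            result.append("vanB not reported (Ct %.1f above cutoff)" % ct_vanb)
    return result or ["VRE not detected"]

print(interpret_vre(ct_vana=28.4))   # clear vanA carrier
print(interpret_vre(ct_vanb=37.2))   # late vanB signal, suppressed by the cutoff
```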

  4. 4D experiments measured with APSY for automated backbone resonance assignments of large proteins

    International Nuclear Information System (INIS)

    Krähenbühl, Barbara; Boudet, Julien; Wider, Gerhard

    2013-01-01

    Detailed structural and functional characterization of proteins by solution NMR requires sequence-specific resonance assignment. We present a set of transverse relaxation optimization (TROSY) based four-dimensional automated projection spectroscopy (APSY) experiments which are designed for resonance assignments of proteins with a size up to 40 kDa, namely HNCACO, HNCOCA, HNCACB and HN(CO)CACB. These higher-dimensional experiments include several sensitivity-optimizing features such as multiple quantum parallel evolution in a ‘just-in-time’ manner, aliased off-resonance evolution, evolution-time optimized APSY acquisition, selective water-handling and TROSY. The experiments were acquired within the concept of APSY, but they can also be used within the framework of sparsely sampled experiments. The multidimensional peak lists derived with APSY provided chemical shifts with an approximately 20 times higher precision than conventional methods usually do, and allowed the assignment of 90 % of the backbone resonances of the perdeuterated primase-polymerase ORF904, which contains 331 amino acid residues and has a molecular weight of 38.4 kDa.

  5. Left Ventricle: Fully Automated Segmentation Based on Spatiotemporal Continuity and Myocardium Information in Cine Cardiac Magnetic Resonance Imaging (LV-FAST

    Directory of Open Access Journals (Sweden)

    Lijia Wang

    2015-01-01

    Full Text Available CMR quantification of LV chamber volumes typically requires manual definition of the basal-most LV slice, which adds processing time and user-dependence. This study developed an LV segmentation method that is fully automated based on the spatiotemporal continuity of the LV (LV-FAST). An iteratively decreasing threshold region-growing approach was used first from the midventricle to the apex, until the LV area and shape became discontinuous, and then from the midventricle to the base, until less than 50% of the myocardium circumference was observable. Region growth was constrained by LV spatiotemporal continuity to improve the robustness of the apical and basal segmentations. The LV-FAST method was compared with manual tracing on cardiac cine MRI data of 45 consecutive patients. Of the 45 patients, LV-FAST and manual selection identified the same apical slices at both ED and ES and the same basal slices at both ED and ES in 38, 38, 38, and 41 cases, respectively, and their measurements agreed within -1.6±8.7 mL, -1.4±7.8 mL, and 1.0±5.8% for EDV, ESV, and EF, respectively. LV-FAST allowed the LV volume-time course to be measured within 3 seconds on a standard desktop computer, which is fast and accurate for processing cine volumetric cardiac MRI data, and enables quantification of the LV filling course over the cardiac cycle.
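
    One ingredient of the method, region growing with an iteratively decreasing threshold, is sketched below on a synthetic short-axis slice using SciPy connected-component labelling. The stopping rule here is a crude stand-in for the LV-FAST area/shape-discontinuity and spatiotemporal-continuity checks, and all data and thresholds are invented.

```python
import numpy as np
from scipy import ndimage

def grow_lv(slice_img, seed, thresholds):
    """Return the last connected region containing `seed` whose area did not
    jump abruptly as the intensity threshold was lowered (a rough stand-in
    for the area/shape discontinuity check described above)."""
    prev_mask, prev_area = None, None
    for t in thresholds:                      # thresholds given in decreasing order
        labels, _ = ndimage.label(slice_img >= t)
        if labels[seed] == 0:
            continue                          # seed not yet above threshold
        mask = labels == labels[seed]
        area = mask.sum()
        if prev_area is not None and area > 2.0 * prev_area:
            return prev_mask                  # region leaked into surrounding tissue
        prev_mask, prev_area = mask, area
    return prev_mask

# Synthetic short-axis slice: bright disc (blood pool) inside a dimmer ring (myocardium).
yy, xx = np.mgrid[:128, :128]
r = np.hypot(yy - 64, xx - 64)
img = np.where(r < 15, 1.0, np.where(r < 22, 0.5, 0.1))
img += np.random.default_rng(2).normal(0, 0.03, img.shape)

lv_mask = grow_lv(img, seed=(64, 64), thresholds=np.linspace(0.9, 0.3, 13))
print("LV blood-pool area (pixels):", int(lv_mask.sum()))
```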

  6. Experimental optimization of a direct injection homogeneous charge compression ignition gasoline engine using split injections with fully automated microgenetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Canakci, M. [Kocaeli Univ., Izmit (Turkey); Reitz, R.D. [Wisconsin Univ., Dept. of Mechanical Engineering, Madison, WI (United States)

    2003-03-01

    Homogeneous charge compression ignition (HCCI) is receiving attention as a new low-emission engine concept. Little is known about the optimal operating conditions for this engine operation mode. Combustion under homogeneous, low equivalence ratio conditions results in modest temperature combustion products, containing very low concentrations of NO{sub x} and particulate matter (PM) as well as providing high thermal efficiency. However, this combustion mode can produce higher HC and CO emissions than those of conventional engines. An electronically controlled Caterpillar single-cylinder oil test engine (SCOTE), originally designed for heavy-duty diesel applications, was converted to an HCCI direct injection (DI) gasoline engine. The engine features an electronically controlled low-pressure direct injection gasoline (DI-G) injector with a 60 deg spray angle that is capable of multiple injections. The use of double injection was explored for emission control and the engine was optimized using fully automated experiments and a microgenetic algorithm optimization code. The variables changed during the optimization include the intake air temperature, start of injection timing and the split injection parameters (per cent mass of fuel in each injection, dwell between the pulses). The engine performance and emissions were determined at 700 r/min with a constant fuel flowrate at 10 MPa fuel injection pressure. The results show that significant emissions reductions are possible with the use of optimal injection strategies. (Author)
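
    The optimization loop can be pictured as below: a skeletal micro-genetic algorithm (tiny population, elitism, restart when the population converges) driving a placeholder merit function instead of automated engine experiments. The variable ranges, the merit function and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Decision variables (hypothetical ranges): intake temperature [K], start of
# injection [deg BTDC], fraction of fuel in the first pulse, dwell [deg CA].
LO = np.array([350.0, 20.0, 0.2, 5.0])
HI = np.array([450.0, 80.0, 0.8, 40.0])

def merit(x):
    """Placeholder for the experimental response (emissions, fuel consumption);
    in the real system each evaluation is an automated engine test."""
    target = np.array([400.0, 55.0, 0.6, 15.0])
    return -np.sum(((x - target) / (HI - LO)) ** 2)

def micro_ga(pop_size=5, generations=200):
    pop = rng.uniform(LO, HI, (pop_size, LO.size))
    best, best_f = None, -np.inf
    for _ in range(generations):
        f = np.array([merit(ind) for ind in pop])
        if f.max() > best_f:
            best, best_f = pop[f.argmax()].copy(), f.max()
        # Elitism plus tournament selection and uniform crossover.
        new = [best.copy()]
        while len(new) < pop_size:
            i, j = rng.choice(pop_size, 2, replace=False)
            a = pop[i] if f[i] > f[j] else pop[j]
            k, l = rng.choice(pop_size, 2, replace=False)
            b = pop[k] if f[k] > f[l] else pop[l]
            mask = rng.random(LO.size) < 0.5
            new.append(np.where(mask, a, b))
        pop = np.array(new)
        # Restart with fresh random individuals once the population has converged.
        if np.ptp(pop, axis=0).max() < 1e-3 * (HI - LO).max():
            pop[1:] = rng.uniform(LO, HI, (pop_size - 1, LO.size))
    return best, best_f

best, best_f = micro_ga()
print("best operating point:", np.round(best, 2), "merit:", round(float(best_f), 4))
```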

  7. A FULLY AUTOMATED PIPELINE FOR CLASSIFICATION TASKS WITH AN APPLICATION TO REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    K. Suzuki

    2016-06-01

    Full Text Available Deep learning has recently been in the spotlight owing to its victories at major competitions, which has pushed 'shallow' machine learning methods (relatively simple, handy algorithms commonly used by industrial engineers) into the background despite their advantages, such as the small amount of training time and data they require. Taking a practical point of view, we utilized shallow learning algorithms to construct a learning pipeline that lets operators apply machine learning without special knowledge, an expensive computation environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting, and the selection of the most suitable classifier with optimized hyperparameters. The configuration employs particle swarm optimization, a well-known metaheuristic algorithm that is generally fast and yields fine optimization, which enables us not only to optimize (hyper)parameters but also to determine appropriate features and an appropriate classifier for the problem; these choices have conventionally been made a priori from domain knowledge or handled with naive procedures such as grid search. Through experiments with the MNIST and CIFAR-10 datasets, common computer-vision datasets for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance stays robust almost without modification even on a remote sensing object recognition problem, which in turn indicates that our approach is likely to contribute to general classification problems.

  8. Comparisons of fully automated syphilis tests with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Choi, Seung Jun; Park, Yongjung; Lee, Eun Young; Kim, Sinyoung; Kim, Hyon-Suk

    2013-06-01

    Serologic tests are widely used for the diagnosis of syphilis. However, conventional methods require well-trained technicians to produce reliable results. We compared automated nontreponemal and treponemal tests with conventional methods. The HiSens Auto Rapid Plasma Reagin (AutoRPR) and Treponema Pallidum particle agglutination (AutoTPPA) tests, which utilize latex turbidimetric immunoassay, were assessed. A total of 504 sera were assayed by AutoRPR, AutoTPPA, conventional VDRL and FTA-ABS. Among them, 250 samples were also tested by conventional TPPA. The concordance rate between the results of VDRL and AutoRPR was 67.5%, and 164 discrepant cases were all VDRL reactive but AutoRPR negative. In the 164 cases, 133 showed FTA-ABS reactivity. Medical records of 106 among the 133 cases were reviewed, and 82 among 106 specimens were found to be collected from patients already treated for syphilis. The concordance rate between the results of AutoTPPA and FTA-ABS was 97.8%. The results of conventional TPPA and AutoTPPA for 250 samples were concordant in 241 cases (96.4%). AutoRPR showed higher specificity than that of VDRL, while VDRL demonstrated higher sensitivity than that of AutoRPR regardless of whether the patients had been already treated for syphilis or not. Both FTA-ABS and AutoTPPA showed high sensitivities and specificities greater than 98.0%. Automated RPR and TPPA tests could be alternatives to conventional syphilis tests, and AutoRPR would be particularly suitable in treatment monitoring, since results by AutoRPR in cases after treatment became negative more rapidly than by VDRL. Copyright © 2013. Published by Elsevier Inc.
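
    The agreement statistics quoted above (concordance, sensitivity, specificity) come from simple 2x2 bookkeeping, sketched below with hypothetical paired results rather than the study data.

```python
def diagnostic_agreement(results):
    """`results` is a list of (new_test, reference) booleans (True = reactive)."""
    tp = sum(n and r for n, r in results)
    tn = sum((not n) and (not r) for n, r in results)
    fp = sum(n and (not r) for n, r in results)
    fn = sum((not n) and r for n, r in results)
    return {
        "concordance": (tp + tn) / len(results),
        "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
        "specificity": tn / (tn + fp) if tn + fp else float("nan"),
    }

# Hypothetical paired automated vs. conventional calls.
pairs = ([(True, True)] * 120 + [(False, False)] * 300
         + [(False, True)] * 60 + [(True, False)] * 4)
print(diagnostic_agreement(pairs))
```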

  9. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for all electrical substations protection and control systems. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a Station LAN. This solution, was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a LAN station that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.

  10. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    Science.gov (United States)

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper discovers that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  11. Improving reticle defect disposition via fully automated lithography simulation

    Science.gov (United States)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image. This need is addressed by the Automated Defect Analysis System (ADAS) defect simulation system [1]. Up until now, using ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we created a scanner parameter database that is automatically identified from the mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold, waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in-fab reticle qualification.

  12. Confocal nanoscanning, bead picking (CONA): PickoScreen microscopes for automated and quantitative screening of one-bead one-compound libraries.

    Science.gov (United States)

    Hintersteiner, Martin; Buehler, Christof; Uhl, Volker; Schmied, Mario; Müller, Jürgen; Kottig, Karsten; Auer, Manfred

    2009-01-01

    Solid phase combinatorial chemistry provides fast and cost-effective access to large bead-based libraries with compound numbers easily exceeding tens of thousands of compounds. Incubating one-bead one-compound library beads with fluorescently labeled target proteins and identifying and isolating the beads which contain a bound target protein potentially represents one of the most powerful generic primary high throughput screening formats. On-bead screening (OBS) based on this detection principle can be carried out with limited automation. Often hit bead detection, i.e. recognizing beads with a fluorescently labeled protein bound to the compound on the bead, relies on eye inspection under a wide-field microscope. Using low resolution detection techniques, the identification of hit beads and their ranking is limited by a low fluorescence signal intensity and varying levels of the library beads' autofluorescence. To exploit the full potential of an OBS process, reliable methods for both automated quantitative detection of hit beads and their subsequent isolation are needed. In a joint collaborative effort with Evotec Technologies (now Perkin-Elmer Cellular Technologies Germany GmbH), we have built two confocal bead scanner and picker platforms, PS02 and a high-speed variant PS04, dedicated to automated high resolution OBS. The PS0X instruments combine fully automated confocal large area scanning of a bead monolayer at the bottom of standard MTP plates with semiautomated isolation of individual hit beads via hydraulic-driven picker capillaries. The quantification of fluorescence intensities with high spatial resolution in the equatorial plane of each bead allows for a reliable discrimination between entirely bright autofluorescent beads and real hit beads which exhibit an increased fluorescence signal at the outer few micrometers of the bead. The achieved screening speed of up to 200,000 beads assayed in less than 7 h and the picking time of approximately 1 bead

  13. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Full Text Available Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  14. Evaluation of an automated microplate technique in the Galileo system for ABO and Rh(D) blood grouping.

    Science.gov (United States)

    Xu, Weiyi; Wan, Feng; Lou, Yufeng; Jin, Jiali; Mao, Weilin

    2014-01-01

    A number of automated devices for pretransfusion testing have recently become available. This study evaluated the Immucor Galileo System, a fully automated device based on the microplate hemagglutination technique, for ABO/Rh (D) determinations. Routine ABO/Rh typing tests were performed on 13,045 samples using the Immucor automated instruments. The manual tube method was used to resolve ABO forward and reverse grouping discrepancies. D-negative test results were investigated and confirmed manually by the indirect antiglobulin test (IAT). The system rejected 70 tests for sample inadequacy. 87 samples were read as "No-type-determined" due to forward and reverse grouping discrepancies. 25 tests gave these results because of sample hemolysis. After further tests, we found 34 tests were caused by weakened RBC antibodies, 5 tests were attributable to weak A and/or B antigens, 4 tests were due to mixed-field reactions, and 8 tests had high-titer cold agglutinins which react only at temperatures below 34 degrees C. In the remaining 11 cases, irregular RBC antibodies were identified in 9 samples (seven anti-M and two anti-P) and two subgroups were identified in 2 samples (one A1 and one A2) by a reference laboratory. As for D typing, 2 weak D+ samples missed by the automated system gave negative results, but weak-positive reactions were observed in the IAT. The Immucor Galileo System is reliable and suited for ABO and D blood grouping, although several factors may cause discrepancies in ABO/D typing when using a fully automated system. It is suggested that standardization of sample collection may improve the performance of the fully automated system.
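
    A toy version of the forward/reverse grouping consistency check that produces a "No-type-determined" flag is sketched below; the reaction encoding and function names are invented and ignore weak reactions, Rh(D) typing and instrument specifics.

```python
# Expected reverse-grouping reactions (patient serum vs. A1 and B test cells)
# for each forward group. True = agglutination.
EXPECTED_REVERSE = {"A": (False, True), "B": (True, False),
                    "AB": (False, False), "O": (True, True)}

def forward_group(anti_a, anti_b):
    """Forward grouping from reactions of patient red cells with anti-A / anti-B."""
    return {(True, True): "AB", (True, False): "A",
            (False, True): "B", (False, False): "O"}[(anti_a, anti_b)]

def abo_type(anti_a, anti_b, a1_cells, b_cells):
    fwd = forward_group(anti_a, anti_b)
    if EXPECTED_REVERSE[fwd] != (a1_cells, b_cells):
        return "No-type-determined"   # discrepancy -> manual tube follow-up
    return fwd

print(abo_type(anti_a=True, anti_b=False, a1_cells=False, b_cells=True))   # 'A'
print(abo_type(anti_a=True, anti_b=False, a1_cells=False, b_cells=False))  # flagged
```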

  15. Automated identification of protein-ligand interaction features using Inductive Logic Programming: a hexose binding case study

    Directory of Open Access Journals (Sweden)

    A Santos Jose C

    2012-07-01

    Full Text Available Abstract Background There is a need for automated methods to learn general features of the interactions of a ligand class with its diverse set of protein receptors. An appropriate machine learning approach is Inductive Logic Programming (ILP), which automatically generates comprehensible rules in addition to prediction. The development of ILP systems which can learn rules of the complexity required for studies on protein structure remains a challenge. In this work we use a new ILP system, ProGolem, and demonstrate its performance on learning features of hexose-protein interactions. Results The rules induced by ProGolem detect interactions mediated by aromatics and by planar-polar residues, in addition to less common features such as the aromatic sandwich. The rules also reveal a previously unreported dependency for residues cys and leu. They also specify interactions involving aromatic and hydrogen bonding residues. This paper shows that Inductive Logic Programming implemented in ProGolem can derive rules giving structural features of protein/ligand interactions. Several of these rules are consistent with descriptions in the literature. Conclusions In addition to confirming literature results, ProGolem’s model has a 10-fold cross-validated predictive accuracy that is superior, at the 95% confidence level, to another ILP system previously used to study protein/hexose interactions and is comparable with state-of-the-art statistical learners.

  16. Sifting through genomes with iterative-sequence clustering produces a large, phylogenetically diverse protein-family resource

    Directory of Open Access Journals (Sweden)

    Sharpton Thomas J

    2012-10-01

    Full Text Available Abstract Background New computational resources are needed to manage the increasing volume of biological data from genome sequencing projects. One fundamental challenge is the ability to maintain a complete and current catalog of protein diversity. We developed a new approach for the identification of protein families that focuses on the rapid discovery of homologous protein sequences. Results We implemented fully automated and high-throughput procedures to de novo cluster proteins into families based upon global alignment similarity. Our approach employs an iterative clustering strategy in which homologs of known families are sifted out of the search for new families. The resulting reduction in computational complexity enables us to rapidly identify novel protein families found in new genomes and to perform efficient, automated updates that keep pace with genome sequencing. We refer to protein families identified through this approach as “Sifting Families,” or SFams. Our analysis of ~10.5 million protein sequences from 2,928 genomes identified 436,360 SFams, many of which are not represented in other protein family databases. We validated the quality of SFam clustering through statistical as well as network topology–based analyses. Conclusions We describe the rapid identification of SFams and demonstrate how they can be used to annotate genomes and metagenomes. The SFam database catalogs protein-family quality metrics, multiple sequence alignments, hidden Markov models, and phylogenetic trees. Our source code and database are publicly available and will be subject to frequent updates (http://edhar.genomecenter.ucdavis.edu/sifting_families/).
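
    The sifting idea, remove sequences that already match known families and then cluster the remainder, can be caricatured in a few lines. The similarity function below is only a stand-in (the real pipeline uses global sequence alignments and HMMs), and the sequences and threshold are invented.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Stand-in for a global alignment score (the real pipeline uses proper
    sequence alignment, not difflib)."""
    return SequenceMatcher(None, a, b).ratio()

def sift_and_cluster(sequences, known_family_reps, threshold=0.7):
    # Step 1: sift out homologs of already-known families.
    novel = [s for s in sequences
             if all(similarity(s, rep) < threshold for rep in known_family_reps)]
    # Step 2: greedy clustering of the remaining sequences into new families.
    clusters = []
    for seq in novel:
        for cluster in clusters:
            if similarity(seq, cluster[0]) >= threshold:
                cluster.append(seq)
                break
        else:
            clusters.append([seq])
    return clusters

known = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"]
seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEV",            # homolog of a known family
        "MSLNNAVWTQEKLHQAFAGLG", "MSLNNAVWTQEKLHQAFAGLA",  # new family of two
        "MADEEKLPPGWEKRMSRSSGR"]                        # singleton family
print(sift_and_cluster(seqs, known))
```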

  17. Automated DNA electrophoresis, hybridization and detection

    International Nuclear Information System (INIS)

    Zapolski, E.J.; Gersten, D.M.; Golab, T.J.; Ledley, R.S.

    1986-01-01

    A fully automated, computer-controlled system for nucleic acid hybridization analysis has been devised and constructed. In practice, DNA is digested with restriction endonuclease enzyme(s) and loaded into the system by pipette; 32P-labelled nucleic acid probe(s) is loaded into the nine hybridization chambers. Instructions for all the steps in the automated process are specified by answering questions that appear on the computer screen at the start of the experiment. Subsequent steps are performed automatically. The system performs horizontal electrophoresis in agarose gel, fixes the fragments to a solid phase matrix, denatures, neutralizes, prehybridizes, hybridizes, washes, dries and detects the radioactivity according to the specifications given by the operator. The results, printed out at the end, give the positions on the matrix to which radioactivity remains hybridized following stringent washing.

  18. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    Padgett, H.C.; Schmidt, D.G.; Bida, G.T.; Wieland, B.W.; Pekrul, E.; Kingsbury, W.G.

    1991-01-01

    With the recent emergence of positron emission tomography (PET) as a viable clinical tool, there is a need for a convenient, cost-effective source of radiotracers labeled with the positron emitters carbon-11, nitrogen-13, oxygen-15, and fluorine-18. These short-lived radioisotopes are accelerator produced and thus require a cyclotron and radiochemistry processing instrumentation that can be operated in a clinical environment by competent technicians. The basic goal is to ensure safety and reliability while setting new standards for economy and ease of operation. The Siemens Radioisotope Delivery System (RDS 112) is a fully automated system dedicated to the production and delivery of positron-emitter labeled precursors and radiochemicals required to support a clinical PET imaging program. Thus, the entire RDS can be thought of as an automated radiochemical processing apparatus.

  19. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework

    International Nuclear Information System (INIS)

    Kesner, Adam L; Schleyer, Paul J; Büther, Florian; Walter, Martin A; Schäfers, Klaus P; Koo, Phillip J

    2014-01-01

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet PET remains for practical considerations a modality vulnerable to motion-induced image degradation. Respiratory motion control is not employed in routine clinical operations. In this article, we take an opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form an underpinning for what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big picture challenges and goals. The online version of this article (doi:10.1186/2197-7364-1-8) contains supplementary material, which is available to authorized users.

  20. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  1. Ten years of R&D and full automation in molecular diagnosis.

    Science.gov (United States)

    Greub, Gilbert; Sahli, Roland; Brouillet, René; Jaton, Katia

    2016-01-01

    A 10-year experience of our automated molecular diagnostic platform that carries out 91 different real-time PCR assays is described. Progress and future perspectives in molecular diagnostic microbiology are reviewed: why automation is important; how our platform was implemented; how homemade PCRs were developed; the advantages/disadvantages of homemade PCRs, including the critical aspects of troubleshooting and the need to further reduce the turnaround time for specific samples, at least for defined clinical settings such as emergencies. The future of molecular diagnosis depends on automation, and in a novel perspective, it is time now to fully acknowledge the true contribution of molecular diagnostics and to reconsider the indications for PCR, by also using these tests as first-line assays.

  2. Automated image analysis of cyclin D1 protein expression in invasive lobular breast carcinoma provides independent prognostic information.

    Science.gov (United States)

    Tobin, Nicholas P; Lundgren, Katja L; Conway, Catherine; Anagnostaki, Lola; Costello, Sean; Landberg, Göran

    2012-11-01

    The emergence of automated image analysis algorithms has aided the enumeration, quantification, and immunohistochemical analyses of tumor cells in both whole section and tissue microarray samples. To date, the focus of such algorithms in the breast cancer setting has been on traditional markers in the common invasive ductal carcinoma subtype. Here, we aimed to optimize and validate an automated analysis of the cell cycle regulator cyclin D1 in a large collection of invasive lobular carcinoma and relate its expression to clinicopathologic data. The image analysis algorithm was trained to optimally match manual scoring of cyclin D1 protein expression in a subset of invasive lobular carcinoma tissue microarray cores. The algorithm was capable of distinguishing cyclin D1-positive cells and illustrated high correlation with traditional manual scoring (κ=0.63). It was then applied to our entire cohort of 483 patients, with subsequent statistical comparisons to clinical data. We found no correlation between cyclin D1 expression and tumor size, grade, and lymph node status. However, overexpression of the protein was associated with reduced recurrence-free survival (P=.029), as was positive nodal status, in invasive lobular carcinoma. Finally, high cyclin D1 expression was associated with an increased hazard ratio in multivariate analysis (hazard ratio, 1.75; 95% confidence interval, 1.05-2.89). In conclusion, we describe an image analysis algorithm capable of reliably analyzing cyclin D1 staining in invasive lobular carcinoma and have linked overexpression of the protein to increased recurrence risk. Our findings support the use of cyclin D1 as a clinically informative biomarker for invasive lobular breast cancer. Copyright © 2012 Elsevier Inc. All rights reserved.
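
    Agreement between manual and automated scoring of the kind reported above (κ=0.63) is typically computed as Cohen's kappa; a minimal example with hypothetical three-tier scores and scikit-learn follows.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical three-tier cyclin D1 scores (0 = low, 1 = moderate, 2 = high)
# for the same tissue-microarray cores, scored manually and by image analysis.
manual = [0, 1, 2, 2, 1, 0, 2, 1, 0, 2, 1, 1, 0, 2, 2]
automated = [0, 1, 2, 1, 1, 0, 2, 2, 0, 2, 1, 1, 0, 2, 1]

print("Cohen's kappa:", round(cohen_kappa_score(manual, automated), 2))
```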

  3. Validation of a Fully Automated HER2 Staining Kit in Breast Cancer

    Directory of Open Access Journals (Sweden)

    Cathy B. Moelans

    2010-01-01

    Full Text Available Background: Testing for HER2 amplification and/or overexpression is currently routine practice to guide Herceptin therapy in invasive breast cancer. At present, HER2 status is most commonly assessed by immunohistochemistry (IHC). Standardization of HER2 IHC assays is of utmost clinical and economical importance. At present, HER2 IHC is most commonly performed with the HercepTest, which contains a polyclonal antibody and applies a manual staining procedure. Analytical variability in HER2 IHC testing could be diminished by a fully automatic staining system with a monoclonal antibody.

  4. Simple heuristics: A bridge between manual core design and automated optimization methods

    International Nuclear Information System (INIS)

    White, J.R.; Delmolino, P.M.

    1993-01-01

    The primary function of RESCUE is to serve as an aid in the analysis and identification of feasible loading patterns for LWR reload cores. The unique feature of RESCUE is that its physics model is based on some recent advances in generalized perturbation theory (GPT) methods. The high order GPT techniques offer the accuracy, computational efficiency, and flexibility needed for the implementation of a full range of capabilities within a set of compatible interactive (manual and semi-automated) and automated design tools. The basic design philosophy and current features within RESCUE are reviewed, and the new semi-automated capability is highlighted. The online advisor facility appears quite promising and it provides a natural bridge between the traditional trial-and-error manual process and the recent progress towards fully automated optimization sequences. (orig.)

  5. Optimization of the Automated Spray Layer-by-Layer Technique for Thin Film Deposition

    Science.gov (United States)

    2010-06-01

    air-pumped spray-paint cans to fully automated systems using high-pressure gas. This work uses the automated spray system previously ... spray solutions were delivered by ultra-high-purity nitrogen gas (AirGas) regulated to 25 psi, except when examining air pressure effects. The PAH solution ... polyelectrolyte solution feed tube; the resulting Venturi effect causes the liquid solution to be drawn up into the airbrush nozzle, where it is

  6. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.

  7. Fully Automated Detection of Corticospinal Tract Damage in Chronic Stroke Patients

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2014-01-01

    Full Text Available Structural integrity of the corticospinal tract (CST) after stroke is closely linked to the degree of motor impairment. However, current methods for measuring fractional anisotropy (FA) of the CST based on regions of interest (ROI) are time-consuming and open to bias. Here, we used tract-based spatial statistics (TBSS) together with a CST template derived from healthy volunteers to quantify the structural integrity of the CST automatically. Two groups of patients after ischemic stroke were enrolled: group 1 (10 patients, 7 men, Fugl-Meyer assessment (FMA) scores ≤ 50) and group 2 (12 patients, 12 men, FMA scores = 100). FAipsi, FAcontra, and FAratio of the CST were compared between the two groups. Relative to group 2, FA was decreased in group 1 in the ipsilesional CST (P<0.01), as was the FAratio (P<0.01). There was no significant difference between the two subgroups in the contralesional CST (P=0.23). Compared with the contralesional CST, FA of the ipsilesional CST was decreased in group 1 (P<0.01). These results suggest that the automated method used in our study could provide a surrogate biomarker for quantifying CST damage after stroke, which would facilitate implementation in clinical practice.

  8. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-01-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues
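
    The radial-density measure described above can be illustrated with a minimal sketch (not the authors' code): given a binary nuclear mask and a binary mask of detected bright features, the feature density is binned by normalized distance from the nuclear perimeter (0) to the nuclear center (1).

    ```python
    import numpy as np
    from scipy import ndimage

    def radial_feature_density(nucleus_mask, feature_mask, n_bins=10):
        """Normalized density of bright features vs. distance from the nuclear perimeter."""
        dist = ndimage.distance_transform_edt(nucleus_mask)   # 0 at the perimeter
        dist_norm = dist / dist.max()                          # 1 at the deepest point
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        idx = np.clip(np.digitize(dist_norm, bins) - 1, 0, n_bins - 1)
        density = np.zeros(n_bins)
        for b in range(n_bins):
            shell = (idx == b) & nucleus_mask
            area = shell.sum()
            density[b] = feature_mask[shell].sum() / area if area else 0.0
        total = density.sum()
        return density / total if total else density
    ```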

  9. Evaluation of an automated struvite reactor to recover phosphorus ...

    African Journals Online (AJOL)

    In the present study we attempted to develop a reactor system to recover phosphorus by struvite precipitation, and which can be installed anywhere in the field without access to a laboratory. A reactor was developed that can run fully automated and recover up to 93% of total phosphorus (total P). Turbidity and conductivity ...

  10. A new software routine that automates the fitting of protein X-ray crystallographic electron-density maps.

    Science.gov (United States)

    Levitt, D G

    2001-07-01

    The classical approach to building the amino-acid residues into the initial electron-density map requires days to weeks of a skilled investigator's time. Automating this procedure should not only save time, but has the potential to provide a more accurate starting model for input to refinement programs. The new software routine MAID builds the protein structure into the electron-density map in a series of sequential steps. The first step is the fitting of the secondary alpha-helix and beta-sheet structures. These 'fits' are then used to determine the local amino-acid sequence assignment. These assigned fits are then extended through the loop regions and fused with the neighboring sheet or helix. The program was tested on the unaveraged 2.5 Å selenomethionine multiple-wavelength anomalous dispersion (SMAD) electron-density map that was originally used to solve the structure of the 291-residue protein human heart short-chain L-3-hydroxyacyl-CoA dehydrogenase (SHAD). Inputting just the map density and the amino-acid sequence, MAID fitted 80% of the residues with an r.m.s.d. error of 0.43 Å for the main-chain atoms and 1.0 Å for all atoms without any user intervention. When tested on a higher quality 1.9 Å SMAD map, MAID correctly fitted 100% (418) of the residues. A major advantage of the MAID fitting procedure is that it maintains ideal bond lengths and angles and constrains phi/psi angles to the appropriate Ramachandran regions. Recycling the output of this new routine through a partial structure-refinement program may have the potential to completely automate the fitting of electron-density maps.

  11. DEWS (DEep White matter hyperintensity Segmentation framework): A fully automated pipeline for detecting small deep white matter hyperintensities in migraineurs.

    Science.gov (United States)

    Park, Bo-Yong; Lee, Mi Ji; Lee, Seung-Hak; Cha, Jihoon; Chung, Chin-Sang; Kim, Sung Tae; Park, Hyunjin

    2018-01-01

    Migraineurs show an increased load of white matter hyperintensities (WMHs) and more rapid deep WMH progression. Previous methods for WMH segmentation have limited efficacy to detect small deep WMHs. We developed a new fully automated detection pipeline, DEWS (DEep White matter hyperintensity Segmentation framework), for small and superficially-located deep WMHs. A total of 148 non-elderly subjects with migraine were included in this study. The pipeline consists of three components: 1) white matter (WM) extraction, 2) WMH detection, and 3) false positive reduction. In WM extraction, we adjusted the WM mask to re-assign misclassified WMHs back to WM using many sequential low-level image processing steps. In WMH detection, the potential WMH clusters were detected using an intensity based threshold and region growing approach. For false positive reduction, the detected WMH clusters were classified into final WMHs and non-WMHs using the random forest (RF) classifier. Size, texture, and multi-scale deep features were used to train the RF classifier. DEWS successfully detected small deep WMHs with a high positive predictive value (PPV) of 0.98 and true positive rate (TPR) of 0.70 in the training and test sets. Similar performance of PPV (0.96) and TPR (0.68) was attained in the validation set. DEWS showed a superior performance in comparison with other methods. Our proposed pipeline is freely available online to help the research community in quantifying deep WMHs in non-elderly adults.
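
    As a rough sketch of the detection-plus-classification structure described above (not the DEWS implementation itself; the file names and threshold are illustrative), candidate clusters can be found by intensity thresholding and connected-component labelling, then filtered with a random forest trained on per-cluster features.

    ```python
    import numpy as np
    from scipy import ndimage
    from sklearn.ensemble import RandomForestClassifier

    # Candidate detection: intensity threshold on a WM-masked FLAIR volume, then
    # connected-component labelling (standing in for the threshold + region-growing step).
    flair = np.load("flair_wm_masked.npy")                 # hypothetical preprocessed volume
    wm_values = flair[flair > 0]
    threshold = wm_values.mean() + 2.5 * wm_values.std()   # illustrative cut-off
    candidate_labels, n_candidates = ndimage.label(flair > threshold)

    # False-positive reduction: a random forest trained on per-cluster features
    # (size, texture and deep features in the paper; the feature files here are placeholders).
    X_train = np.load("cluster_features_train.npy")
    y_train = np.load("cluster_labels_train.npy")          # 1 = true WMH, 0 = false positive
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    # rf.predict(features_of_candidate_clusters) then keeps only clusters classified as WMH.
    ```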

  12. Fully automatic multi-language translation with a catalogue of phrases – successful employment for the Swiss avalanche bulletin

    NARCIS (Netherlands)

    Winkler, K.; Kuhn, T.

    2016-01-01

    The Swiss avalanche bulletin is produced twice a day in four languages. Due to the lack of time available for manual translation, a fully automated translation system is employed, based on a catalogue of predefined phrases and predetermined rules of how these phrases can be combined to produce

  13. Bayesian ISOLA: new tool for automated centroid moment tensor inversion

    Science.gov (United States)

    Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John

    2017-04-01

    Focal mechanisms are important for understanding the seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components affected by instrumental disturbances or poor signal-to-noise ratios are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later on many automatically plotted figures. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained, but also the posterior probability density function. (v) A space-time grid search, effectively combined with least-squares inversion of the moment tensor components, speeds up the inversion and yields more accurate results than stochastic methods. The method has been tested on synthetic and observed data, and by comparison with manually processed moment tensors of all events with M≥3 in the Swiss catalogue over 16 years, using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data. The software package, programmed in Python, has been designed to be as versatile as possible in
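
    A minimal sketch of the core numerical step, assuming precomputed Green's-function matrices for each space-time grid node and a noise covariance matrix estimated from pre-event noise (all names are illustrative, not from the ISOLA code):

    ```python
    import numpy as np

    def invert_mt(G, d, Cinv):
        """Weighted least-squares moment-tensor inversion for one space-time grid node."""
        m = np.linalg.solve(G.T @ Cinv @ G, G.T @ Cinv @ d)
        resid = d - G @ m
        vr = 1.0 - (resid @ Cinv @ resid) / (d @ Cinv @ d)   # variance reduction
        return m, vr

    def grid_search_cmt(G_grid, d_obs, C_noise):
        """Pick the trial centroid position/time with the highest variance reduction."""
        Cinv = np.linalg.inv(C_noise)
        best_vr, best_mt, best_node = -np.inf, None, None
        for k, Gk in enumerate(G_grid):
            m, vr = invert_mt(Gk, d_obs, Cinv)
            if vr > best_vr:
                best_vr, best_mt, best_node = vr, m, k
        return best_mt, best_vr, best_node
    ```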

  14. Automated imaging system for single molecules

    Science.gov (United States)

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
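
    The correction for uneven fluorescence illumination mentioned above is commonly implemented as a flat-field (shading) correction; the following is a generic sketch of that idea, not the patented system's actual routine:

    ```python
    import numpy as np

    def flat_field_correct(raw, flat, dark):
        """Generic flat-field correction: remove the dark offset and divide by the
        illumination profile measured from a blank (flat) field."""
        gain = flat.astype(float) - dark.astype(float)
        gain = np.clip(gain, 1e-6, None)          # avoid division by zero
        corrected = (raw.astype(float) - dark) / gain
        return corrected * gain.mean()            # restore the original intensity scale
    ```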

  15. Automated Cable Preparation for Robotized Stator Cable Winding

    Directory of Open Access Journals (Sweden)

    Erik Hultman

    2017-04-01

    Full Text Available A method for robotized cable winding of the Uppsala University Wave Energy Converter generator stator has previously been presented and validated. The purpose of this study is to present and validate further developments to the method: automated stand-alone equipment for the preparation of the winding cables. The cable preparation consists of three parts: feeding the cable from a drum, forming the cable end and cutting the cable. Forming and cutting the cable was previously done manually and only small cable drums could be handled. Therefore the robot cell needed to be stopped frequently. The new equipment was tested in an experimental robot stator cable winding setup. Through the experiments, the equipment was validated to be able to perform fully automated and robust cable preparation. Suggestions are also given on how to further develop the equipment with regards to performance, robustness and quality. Hence, this work represents another important step towards demonstrating completely automated robotized stator cable winding.

  16. Enantioselective determination of methylphenidate and ritalinic acid in whole blood from forensic cases using automated solid-phase extraction and liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Thomsen, Ragnar; B. Rasmussen, Henrik; Linnet, Kristian

    2012-01-01

    A chiral liquid chromatography tandem mass spectrometry (LC–MS-MS) method was developed and validated for quantifying methylphenidate and its major metabolite ritalinic acid in blood from forensic cases. Blood samples were prepared in a fully automated system by protein precipitation followed...... methylphenidate was not determined to be related to the cause of death, the femoral blood concentration of d-methylphenidate ranged from 5 to 58 ng/g, and from undetected to 48 ng/g for l-methylphenidate (median d/l-ratio 5.9). Ritalinic acid was present at concentrations 10–20 times higher with roughly equal...

  17. Understanding reliance on automation: effects of error type, error distribution, age and experience

    Science.gov (United States)

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over-relying on it during non-alarm states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  18. Technology demonstration of space intravehicular automation and robotics

    Science.gov (United States)

    Morris, A. Terry; Barker, L. Keith

    1994-01-01

    Automation and robotic technologies are being developed and capabilities demonstrated which would increase the productivity of microgravity science and materials processing in the space station laboratory module, especially when the crew is not present. The Automation Technology Branch at NASA Langley has been working in the area of intravehicular automation and robotics (IVAR) to provide a user-friendly development facility, to determine customer requirements for automated laboratory systems, and to improve the quality and efficiency of commercial production and scientific experimentation in space. This paper will describe the IVAR facility and present the results of a demonstration using a simulated protein crystal growth experiment inside a full-scale mockup of the space station laboratory module using a unique seven-degree-of-freedom robot.

  19. Automation, parallelism, and robotics for proteomics.

    Science.gov (United States)

    Alterovitz, Gil; Liu, Jonathan; Chow, Jijun; Ramoni, Marco F

    2006-07-01

    The speed of the human genome project (Lander, E. S., Linton, L. M., Birren, B., Nusbaum, C. et al., Nature 2001, 409, 860-921) was made possible, in part, by developments in automation of sequencing technologies. Before these technologies, sequencing was a laborious, expensive, and personnel-intensive task. Similarly, automation and robotics are changing the field of proteomics today. Proteomics is defined as the effort to understand and characterize proteins in the categories of structure, function and interaction (Englbrecht, C. C., Facius, A., Comb. Chem. High Throughput Screen. 2005, 8, 705-715). As such, this field nicely lends itself to automation technologies since these methods often require large economies of scale in order to achieve cost and time-saving benefits. This article describes some of the technologies and methods being applied in proteomics in order to facilitate automation within the field as well as in linking proteomics-based information with other related research areas.

  20. Conflict Resolution Automation and Pilot Situation Awareness

    Science.gov (United States)

    Dao, Arik-Quang V.; Brandt, Summer L.; Bacon, Paige; Kraut, Josh; Nguyen, Jimmy; Minakata, Katsumi; Raza, Hamzah; Rozovski, David; Johnson, Walter W.

    2010-01-01

    This study compared pilot situation awareness across three traffic management concepts. The concepts varied in terms of the allocation of traffic avoidance responsibility between the pilot on the flight deck, the air traffic controllers, and a conflict resolution automation system. In Concept 1, the flight deck was equipped with conflict resolution tools that enabled pilots to fully handle the responsibility of weather avoidance and maintaining separation between ownship and surrounding traffic. In Concept 2, pilots were not responsible for traffic separation, but were provided tools for weather and traffic avoidance. In Concept 3, flight deck tools allowed pilots to deviate for weather, but conflict detection tools were disabled. In this concept, pilots depended on ground-based automation for conflict detection and resolution. Situation awareness of the pilots was measured using online probes. Results showed that individual situation awareness was highest in Concept 1, where the pilots were most engaged, and lowest in Concept 3, where automation was heavily used. These findings suggest that for conflict resolution tasks, situation awareness is improved when pilots remain in the decision-making loop.

  1. Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.

    Science.gov (United States)

    Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo

    2015-11-17

    Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand indefinitely in the undifferentiated state and be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we designed a new fully automated cell culture system for human iPS cell maintenance. Using this automated culture system, hiPS cells maintained their undifferentiated state for 60 days. Automatically prepared hiPS cells retained the potential to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.

  2. Development of Hardware and Software for Automated Ultrasonic Testing

    International Nuclear Information System (INIS)

    Choi, Sung Nam; Lee, Hee Jong; Yang, Seung Ok

    2012-01-01

    Nondestructive testing (NDT) during the construction and operation of NPPs plays an important role in confirming the integrity of the plants. In particular, automated ultrasonic testing (AUT) is one of the primary nondestructive examination methods for in-service inspection of welds in major NPP components. AUT is a reliable technique because its data are saved and can be reviewed by other examiners. Korea Hydro and Nuclear Power-Central Research Institute (KHNP-CRI) has developed an AUT system based on a high-speed pulser-receiver. In combination with the designed software and hardware architecture, this new system permits user configurations for a wide range of user-specific applications through fully automated inspections using compact portable systems with up to eight channels. This paper gives an overview of the hardware (H/W) and software (S/W) of the AUT system used to inspect welds in NPPs.

  3. Closed-Loop Real-Time Imaging Enables Fully Automated Cell-Targeted Patch-Clamp Neural Recording In Vivo.

    Science.gov (United States)

    Suk, Ho-Jun; van Welie, Ingrid; Kodandaramaiah, Suhasa B; Allen, Brian; Forest, Craig R; Boyden, Edward S

    2017-08-30

    Targeted patch-clamp recording is a powerful method for characterizing visually identified cells in intact neural circuits, but it requires skill to perform. We previously developed an algorithm that automates "blind" patching in vivo, but full automation of visually guided, targeted in vivo patching has not been demonstrated, with currently available approaches requiring human intervention to compensate for cell movement as a patch pipette approaches a targeted neuron. Here we present a closed-loop real-time imaging strategy that automatically compensates for cell movement by tracking cell position and adjusting pipette motion while approaching a target. We demonstrate our system's ability to adaptively patch, under continuous two-photon imaging and real-time analysis, fluorophore-expressing neurons of multiple types in the living mouse cortex, without human intervention, with yields comparable to skilled human experimenters. Our "imagepatching" robot is easy to implement and will help enable scalable characterization of identified cell types in intact neural circuits. Copyright © 2017 Elsevier Inc. All rights reserved.
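
    A minimal sketch of the closed-loop idea, assuming the target cell's drift between consecutive two-photon frames is estimated by image cross-correlation and fed back into the pipette target coordinates (a generic stand-in, not the authors' tracking algorithm):

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def centroid_shift(ref, cur):
        """Estimate (dy, dx) drift of a fluorescent cell between two frames via
        cross-correlation of mean-subtracted images."""
        ref0 = ref - ref.mean()
        cur0 = cur - cur.mean()
        xc = fftconvolve(cur0, ref0[::-1, ::-1], mode="same")
        dy, dx = np.unravel_index(np.argmax(xc), xc.shape)
        return dy - ref.shape[0] // 2, dx - ref.shape[1] // 2

    # Closed loop (pseudo-hardware): each new frame, re-aim the pipette target.
    # target_xy += gain * np.array(centroid_shift(previous_frame, current_frame))
    ```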

  4. Evaluation of a fully automated treponemal test and comparison with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Park, Yongjung; Park, Younhee; Joo, Shin Young; Park, Myoung Hee; Kim, Hyon-Suk

    2011-11-01

    We evaluated the analytical performance of an automated treponemal test and compared this test with the Venereal Disease Research Laboratory test (VDRL) and fluorescent treponemal antibody absorption test (FTA-ABS). Precision performance of the Architect Syphilis TP assay (TP; Abbott Japan, Tokyo, Japan) was assessed, and 150 serum samples were assayed with the TP before and after heat inactivation to estimate the effect of heat inactivation. A total of 616 specimens were tested with the FTA-ABS and TP, and 400 were examined with the VDRL. The TP showed good precision performance, with total imprecision of less than a 10% coefficient of variation. An excellent linear relationship between results before and after heat inactivation was observed (R² = 0.9961). The FTA-ABS and TP agreed well with a κ coefficient of 0.981. The concordance rate between the FTA-ABS and TP was the highest (99.0%), followed by the rates between FTA-ABS and VDRL (85.0%) and between TP and VDRL (83.8%). The automated TP assay may be adequate for screening for syphilis in a large volume of samples and can be an alternative to FTA-ABS.

  5. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng/mL. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng/mL. The proposed

  6. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng/mL. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng/mL. The proposed method opens a new avenue

  7. Harnessing Vehicle Automation for Public Mobility -- An Overview of Ongoing Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Young, Stanley E.

    2015-11-05

    This presentation takes a look at the efforts to harness automated vehicle technology for public transport. The European CityMobil2 is the leading demonstration project in which automated shuttles were, or are planned to be, demonstrated in several cities and regions. The presentation provides a brief overview of the demonstrations at Oristano, Italy (July 2014), La Rochelle, France (Dec 2014), Lausanne, Switzerland (Apr 2015), Vantaa, Finland (July 2015), and Trikala, Greece (Sept 2015). In addition to technology exposition, the objectives included generating a legal framework for operation in each location and gauging the reaction of the public to unmanned shuttles, both of which were successfully achieved. Several such demonstrations are planned throughout the world, including efforts in North America in conjunction with the GoMentum Station in California. These early demonstrations with low-speed automated shuttles provide a glimpse of what is possible with a fully automated fleet of driverless vehicles providing a public transit service.

  8. Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.

    Science.gov (United States)

    Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee

    2013-09-01

    Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.

  9. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    International Nuclear Information System (INIS)

    Jochimsen, Thies H.; Zeisig, Vilia; Schulz, Jessica; Werner, Peter; Patt, Marianne; Patt, Jörg; Dreyer, Antje Y.; Boltze, Johannes; Barthel, Henryk; Sabri, Osama; Sattler, Bernhard

    2016-01-01

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with O15-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5- and 5-min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than in the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22 % higher than CBF obtained by using the AIF yielded by blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the
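
    As a simplified illustration of the partial-volume/PSF correction described above (the study used a full geometric deconvolution; here a single recovery coefficient derived from the PSF-blurred arterial mask stands in for it, and all names are illustrative):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def image_derived_input_function(pet_frames, artery_mask, psf_sigma_vox):
        """IDIF with a simple recovery-coefficient correction: divide the measured
        mean arterial activity by the fraction of true signal the PSF leaves
        inside the segmented arterial mask."""
        blurred_mask = gaussian_filter(artery_mask.astype(float), psf_sigma_vox)
        recovery = blurred_mask[artery_mask].mean()        # fraction of signal retained
        return np.array([frame[artery_mask].mean() / recovery for frame in pet_frames])
    ```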

  10. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    Science.gov (United States)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. Each step in the measurement process-lysis, nucleic acid extraction, purification, and hybridization to an array-is assessed through comparison of the results obtained using the instrument with

  11. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    Energy Technology Data Exchange (ETDEWEB)

    Jochimsen, Thies H.; Zeisig, Vilia [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Schulz, Jessica [Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstr. 1a, Leipzig, D-04103 (Germany); Werner, Peter; Patt, Marianne; Patt, Jörg [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Dreyer, Antje Y. [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Boltze, Johannes [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Fraunhofer Research Institution of Marine Biotechnology and Institute for Medical and Marine Biotechnology, University of Lübeck, Lübeck (Germany); Barthel, Henryk; Sabri, Osama; Sattler, Bernhard [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany)

    2016-02-13

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with O15-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5- and 5-min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than in the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22 % higher than CBF obtained by using the AIF yielded by blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the

  12. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out, in an integrated and fully automated way: i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis; ii) a prediction of the reconstructed quantities according to some models; and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author) [fr

  13. Automated high-throughput protein purification using an ÄKTApurifier and a CETAC autosampler.

    Science.gov (United States)

    Yoo, Daniel; Provchy, Justin; Park, Cynthia; Schulz, Craig; Walker, Kenneth

    2014-05-30

    As the pace of drug discovery accelerates there is an increased focus on screening larger numbers of protein therapeutic candidates to identify those that are functionally superior and to assess manufacturability earlier in the process. Although there have been advances toward high throughput (HT) cloning and expression, protein purification is still an area where improvements can be made to conventional techniques. Current methodologies for purification often involve a tradeoff between HT automation or capacity and quality. We present an ÄKTA combined with an autosampler, the ÄKTA-AS, which has the capability of purifying up to 240 samples in two chromatographic dimensions without the need for user intervention. The ÄKTA-AS has been shown to be reliable with sample volumes between 0.5 mL and 100 mL, and the innovative use of a uniquely configured loading valve ensures reliability by efficiently removing air from the system as well as preventing sample cross contamination. Incorporation of a sample pump flush minimizes sample loss and enables recoveries ranging from the low tens of micrograms to milligram quantities of protein. In addition, when used in an affinity capture-buffer exchange format the final samples are formulated in a buffer compatible with most assays without requirement of additional downstream processing. The system is designed to capture samples in 96-well microplate format allowing for seamless integration of downstream HT analytic processes such as microfluidic or HPLC analysis. Most notably, there is minimal operator intervention to operate this system, thereby increasing efficiency, sample consistency and reducing the risk of human error. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    International Nuclear Information System (INIS)

    El-Alaily, T.M.; El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M.; Assar, S.T.

    2015-01-01

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured with two calibrated commercial magnetometers (Lake Shore 7410; LDJ Electronics Inc., Troy, MI). Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our new lab-built VSM design proved successful and reliable.

  15. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured with two calibrated commercial magnetometers (Lake Shore 7410; LDJ Electronics Inc., Troy, MI). Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our new lab-built VSM design proved successful and reliable.

  16. Towards Automated Binding Affinity Prediction Using an Iterative Linear Interaction Energy Approach

    Directory of Open Access Journals (Sweden)

    C. Ruben Vosmeer

    2014-01-01

    Full Text Available Binding affinity prediction of potential drugs to target and off-target proteins is an essential asset in drug development. These predictions require the calculation of binding free energies. In such calculations, it is a major challenge to properly account for both the dynamic nature of the protein and the possible variety of ligand-binding orientations, while keeping computational costs tractable. Recently, an iterative Linear Interaction Energy (LIE) approach was introduced, in which results from multiple simulations of a protein-ligand complex are combined into a single binding free energy using a Boltzmann weighting-based scheme. This method was shown to reach experimental accuracy for flexible proteins while retaining the computational efficiency of the general LIE approach. Here, we show that the iterative LIE approach can be used to predict binding affinities in an automated way. A workflow was designed using preselected protein conformations, automated ligand docking and clustering, and a (semi-)automated molecular dynamics simulation setup. We show that using this workflow, binding affinities of aryloxypropanolamines to the malleable Cytochrome P450 2D6 enzyme can be predicted without a priori knowledge of dominant protein-ligand conformations. In addition, we provide an outlook for an approach to assess the quality of the LIE predictions, based on simulation outcomes only.
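
    For orientation, the per-simulation LIE estimate and a Boltzmann-weighted combination over simulations i can be sketched as follows (α, β, γ are the usual empirical LIE parameters; Δ⟨V⟩ denotes bound-minus-free differences in average ligand-surrounding interaction energies; the cited iterative scheme may differ in detail):

    ```latex
    \Delta G_{\mathrm{bind}}^{(i)} = \alpha\,\Delta\langle V^{\mathrm{vdW}}\rangle_{i}
        + \beta\,\Delta\langle V^{\mathrm{el}}\rangle_{i} + \gamma ,
    \qquad
    \Delta G_{\mathrm{bind}} = \sum_{i} W_{i}\,\Delta G_{\mathrm{bind}}^{(i)} ,
    \qquad
    W_{i} = \frac{e^{-\Delta G_{\mathrm{bind}}^{(i)}/k_{\mathrm{B}}T}}
                 {\sum_{j} e^{-\Delta G_{\mathrm{bind}}^{(j)}/k_{\mathrm{B}}T}}
    ```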

  17. A Fully Automated Web-Based Program Improves Lifestyle Habits and HbA1c in Patients With Type 2 Diabetes and Abdominal Obesity: Randomized Trial of Patient E-Coaching Nutritional Support (The ANODE Study).

    Science.gov (United States)

    Hansel, Boris; Giral, Philippe; Gambotti, Laetitia; Lafourcade, Alexandre; Peres, Gilbert; Filipecki, Claude; Kadouch, Diana; Hartemann, Agnes; Oppert, Jean-Michel; Bruckert, Eric; Marre, Michel; Bruneel, Arnaud; Duchene, Emilie; Roussel, Ronan

    2017-11-08

    The prevalence of abdominal obesity and type 2 diabetes mellitus (T2DM) is a public health challenge. New solutions need to be developed to help patients implement lifestyle changes. The objective of the study was to evaluate a fully automated Web-based intervention designed to help users improve their dietary habits and increase their physical activity. The Accompagnement Nutritionnel de l'Obésité et du Diabète par E-coaching (ANODE) study was a 16-week, 1:1 parallel-arm, open-label randomized clinical trial. Patients with T2DM and abdominal obesity (n=120, aged 18-75 years) were recruited. Patients in the intervention arm (n=60) had access to a fully automated program (ANODE) to improve their lifestyle. Patients were asked to log on at least once per week. Human contact was limited to hotline support in cases of technical issues. The dietetic tool provided personalized menus and a shopping list for the day or the week. Stepwise physical activity was prescribed. The control arm (n=60) received general nutritional advice. The primary outcome was the change in the dietary score (Diet Quality Index-International; DQI-I) between baseline and the end of the study. Secondary endpoints included changes in body weight, waist circumference, hemoglobin A1c (HbA1c) and measured maximum oxygen consumption (VO2 max). The mean age of the participants was 57 years (standard deviation [SD] 9), mean body mass index was 33 kg/m² (SD 4), mean HbA1c was 7.2% (SD 1.1), and 66.7% (80/120) of participants were women. Using an intention-to-treat analysis, the DQI-I score (baseline 54.0, SD 5.7 in the ANODE arm; 52.8, SD 6.2 in the control arm; P=.28) increased significantly in the ANODE arm compared to the control arm (+4.55, SD 5.91 vs -1.68, SD 5.18 between arms); changes in several secondary endpoints also improved significantly in the intervention arm. Among patients with T2DM and abdominal obesity, the use of a fully automated Web-based program resulted in a significant improvement in dietary habits and favorable clinical and

  18. Pilot opinions on high level flight deck automation issues: Toward the development of a design philosophy

    Science.gov (United States)

    Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.

    1995-01-01

    There has been much concern in recent years about the rapid increase in automation on commercial flight decks. The survey was composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, planning as well as response tasks, and high workload situations. There is an irony and a challenge in the implications of these findings. On the one hand pilots would like new automation to be simple and reliable, but they need it to support the most complex part of the job--managing and planning tasks in high workload situations.

  19. PARAssign-paramagnetic NMR assignments of protein nuclei on the basis of pseudocontact shifts

    Energy Technology Data Exchange (ETDEWEB)

    Skinner, Simon P., E-mail: skinnersp@chem.leidenuniv.nl [Leiden University, Gorlaeus Laboratories, Leiden Institute of Chemistry (Netherlands); Moshev, Mois, E-mail: mois@monomon.me [Leiden University, Leiden Institute of Advanced Computer Science (Netherlands); Hass, Mathias A. S., E-mail: hassmas@chem.leidenuniv.nl; Ubbink, Marcellus, E-mail: m.ubbink@chem.leidenuniv.nl [Leiden University, Gorlaeus Laboratories, Leiden Institute of Chemistry (Netherlands)

    2013-04-15

    The use of paramagnetic NMR data for the refinement of structures of proteins and protein complexes is widespread. However, the power of paramagnetism for protein assignment has not yet been fully exploited. PARAssign is software that uses pseudocontact shift data derived from several paramagnetic centers attached to the protein to obtain amide and methyl assignments. The ability of PARAssign to perform assignment when the positions of the paramagnetic centers are known and unknown is demonstrated. PARAssign has been tested using synthetic data for methyl assignment of a 47 kDa protein, and using both synthetic and experimental data for amide assignment of a 14 kDa protein. The complex fitting space involved in such an assignment procedure necessitates that good starting conditions are found, both regarding placement and strength of paramagnetic centers. These starting conditions are obtained through automated tensor placement and user-defined tensor parameters. The results presented herein demonstrate that PARAssign is able to successfully perform resonance assignment in large systems with a high degree of reliability. This software provides a method for obtaining the assignments of large systems, which may previously have been unassignable, by using 2D NMR spectral data and a known protein structure.
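
    The pseudocontact shifts that PARAssign matches against a known structure follow the standard expression, in which r, θ and φ are the polar coordinates of a nucleus in the frame of the paramagnetic susceptibility tensor and Δχ_ax, Δχ_rh its axial and rhombic anisotropies:

    ```latex
    \Delta\delta^{\mathrm{PCS}} = \frac{1}{12\pi r^{3}}
    \left[ \Delta\chi_{\mathrm{ax}}\,\bigl(3\cos^{2}\theta - 1\bigr)
         + \tfrac{3}{2}\,\Delta\chi_{\mathrm{rh}}\,\sin^{2}\theta\,\cos 2\varphi \right]
    ```

    Because the shift depends only on the nuclear position relative to the tensor, predicted shifts for a known structure can be compared with observed shifts to score candidate assignments.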

  20. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Mahnken, A.H.; Kohnen, M.; Steinberg, S.; Wein, B.B.; Guenther, R.W.

    2001-01-01

    Development of software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after filtering of the digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, which were sorted by image quality ranging from one (best) to three (worst). There were 10 images for each quality level. Results: Image recognition strongly depended on image quality. In group one, 52 and in group two, 51 out of 60 vertebral bodies including the sacrum were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; therefore, standardized image quality and enlargement of the training data set are required. (orig.) [de
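
    The shape constraint at the heart of active shape models, on which the described software builds, is usually written as a mean landmark shape plus a limited combination of the leading modes of shape variation:

    ```latex
    \mathbf{x} \approx \bar{\mathbf{x}} + \mathbf{P}\,\mathbf{b},
    \qquad |b_{i}| \lesssim 3\sqrt{\lambda_{i}}
    ```

    where the mean shape and the eigenvectors in P come from the landmark covariance of the training radiographs, and constraining the shape parameters b keeps the fitted contour a plausible lumbar spine outline even when local image evidence is weak.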

  1. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has been long believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  2. Automated side-chain model building and sequence assignment by template matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2002-01-01

    A method for automated macromolecular side-chain model building and for aligning the sequence to the map is described. An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer
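    To illustrate the alignment step, the following sketch scores every possible placement of a short main-chain segment within a protein sequence from an assumed per-position amino-acid probability matrix and converts the scores into posterior probabilities under a uniform prior. It is an illustrative toy, not the RESOLVE implementation, and all inputs are invented.

```python
# Illustrative sketch only (not the RESOLVE code); the probability matrix and
# sequence below are made up for the example.
import math

AA = "ACDEFGHIKLMNPQRSTVWY"  # column order assumed for the 20 amino acids

def alignment_posteriors(prob_matrix, sequence):
    """prob_matrix[i][k]: estimated P(amino acid AA[k] at segment position i)."""
    seg_len = len(prob_matrix)
    log_scores = []
    for offset in range(len(sequence) - seg_len + 1):
        s = sum(math.log(prob_matrix[i][AA.index(sequence[offset + i])] + 1e-12)
                for i in range(seg_len))
        log_scores.append(s)
    m = max(log_scores)
    weights = [math.exp(s - m) for s in log_scores]   # uniform prior over offsets
    total = sum(weights)
    return [w / total for w in weights]               # posterior for each offset

# Toy 2-residue segment whose density strongly suggests Gly then Trp:
probs = [[0.8 if a == "G" else 0.2 / 19 for a in AA],
         [0.8 if a == "W" else 0.2 / 19 for a in AA]]
print(alignment_posteriors(probs, "AGWKLGA"))  # highest posterior at offset 1
```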

  3. Validation of a fully automated solid‐phase extraction and ultra‐high‐performance liquid chromatography–tandem mass spectrometry method for quantification of 30 pharmaceuticals and metabolites in post‐mortem blood and brain samples

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Nedahl, Michael; Johansen, Sys Stybe

    2018-01-01

    In this study, we present the validation of an analytical method capable of quantifying 30 commonly encountered pharmaceuticals and metabolites in whole blood and brain tissue from forensic cases. Solid-phase extraction was performed by a fully automated robotic system, thereby minimising manual labour and human error while increasing sample throughput, robustness, and traceability. The method was validated in blood in terms of selectivity, linear range, matrix effect, extraction recovery, process efficiency, carry-over, stability, precision, and accuracy. Deuterated analogues of each analyte […]/kg. Thus, the linear range covered both therapeutic and toxic levels. The method showed acceptable accuracy and precision, with accuracies ranging from 80 to 118% and precision below 19% for the majority of the analytes. Linear range, matrix effect, extraction recovery, process efficiency, precision […]

  4. UPLC-MS/MS based diagnostics for epithelial ovarian cancer using fully sialylated C4-binding protein.

    Science.gov (United States)

    Tanabe, Kazuhiro; Matsuo, Koji; Miyazawa, Masaki; Hayashi, Masaru; Ikeda, Masae; Shida, Masako; Hirasawa, Takeshi; Sho, Ryuichiro; Mikami, Mikio

    2018-05-01

    Serum levels of fully sialylated C4-binding protein (FS-C4BP) are remarkably elevated in patients with epithelial ovarian cancer (EOC) and can be used as a marker to distinguish ovarian clear cell carcinoma from endometrioma. This study aimed to develop a stable, robust and reliable liquid chromatography-hybrid mass spectrometry (UPLC-MS/MS) based diagnostic method that would generalize FS-C4BP as a clinical EOC biomarker. Glycopeptides derived from 20 μL of trypsin-digested serum glycoprotein were analyzed via UPLC equipped with an electrospray ionization time-of-flight mass spectrometer. This UPLC-MS/MS-based diagnostic method was optimized for FS-C4BP and validated using sera from 119 patients with EOC and 127 women without cancer. A1958 (a C4BP peptide with two fully sialylated biantennary glycans) was selected as a marker of FS-C4BP because its level in serum was the highest among FS-C4BP family members. Preparation and UPLC-MS/MS were optimized for A1958, and performance and robustness were significantly improved relative to our previous method. Receiver operating characteristic analysis of the FS-C4BP index, defined as the ratio between A1958 and A1813 (a C4BP peptide with two partially sialylated biantennary glycans), gave an area under the curve of 85%. A combination of the FS-C4BP index and carbohydrate antigen-125 levels further enhanced the sensitivity and specificity. © 2017 The Authors. Biomedical Chromatography published by John Wiley & Sons Ltd.

  5. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    Science.gov (United States)

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression capturing global and local information to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.
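    The accuracy criterion quoted above (Dice similarity index >0.75) can be made concrete with a small sketch; the binary masks below are invented for the example and this is not the authors' code.

```python
# Dice similarity index between a predicted and a reference region (toy data).
import numpy as np

def dice_index(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity: 2*|A intersect B| / (|A| + |B|) for binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total else 1.0

# Two overlapping rectangular regions on a 100x100 image (invented example).
pred = np.zeros((100, 100), dtype=bool); pred[20:60, 30:70] = True
ref  = np.zeros((100, 100), dtype=bool); ref[25:65, 35:75] = True
print(round(dice_index(pred, ref), 3))   # 0.766; values > 0.75 count as accurate
```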

  6. Level of Automation and Failure Frequency Effects on Simulated Lunar Lander Performance

    Science.gov (United States)

    Marquez, Jessica J.; Ramirez, Margarita

    2014-01-01

    A human-in-the-loop experiment was conducted at the NASA Ames Research Center Vertical Motion Simulator, where instrument-rated pilots completed a simulated terminal descent phase of a lunar landing. Ten pilots participated in a 2 x 2 mixed-design experiment, with level of automation as the within-subjects factor and failure frequency as the between-subjects factor. The two evaluated levels of automation were high (fully automated landing) and low (manually controlled landing). During test trials, participants were exposed to either a high number of failures (75% failure frequency) or a low number of failures (25% failure frequency). In order to investigate the pilots' sensitivity to changes in levels of automation and failure frequency, the dependent measure selected for this experiment was accuracy of failure diagnosis, from which D Prime and Decision Criterion were derived. For each of the dependent measures, no significant difference was found for level of automation, and no significant interaction was detected between level of automation and failure frequency. A significant effect was identified for failure frequency, suggesting that failure frequency affects pilots' sensitivity to failure detection and diagnosis. Participants were more likely to correctly identify and diagnose failures if they experienced the higher failure frequency, regardless of level of automation.
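    For readers unfamiliar with the derived measures, the minimal sketch below shows how D Prime (sensitivity) and Decision Criterion (response bias) are conventionally computed from hit and false-alarm rates under the equal-variance signal detection model; the rates are invented and this is not the study's analysis code.

```python
# Standard signal-detection-theory measures from hit and false-alarm rates.
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse of the standard normal CDF

def d_prime_and_criterion(hit_rate: float, false_alarm_rate: float):
    d_prime = z(hit_rate) - z(false_alarm_rate)              # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))   # response bias
    return d_prime, criterion

# e.g. 85% correctly diagnosed failures, 20% false alarms (invented numbers)
print(d_prime_and_criterion(0.85, 0.20))
```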

  7. Completion of autobuilt protein models using a database of protein fragments

    International Nuclear Information System (INIS)

    Cowtan, Kevin

    2012-01-01

    Two developments in the process of automated protein model building in the Buccaneer software are described: the use of a database of protein fragments in improving the model completeness and the assembly of disconnected chain fragments into complete molecules. Two developments in the process of automated protein model building in the Buccaneer software are presented. A general-purpose library for protein fragments of arbitrary size is described, with a highly optimized search method allowing the use of a larger database than in previous work. The problem of assembling an autobuilt model into complete chains is discussed. This involves the assembly of disconnected chain fragments into complete molecules and the use of the database of protein fragments in improving the model completeness. Assembly of fragments into molecules is a standard step in existing model-building software, but the methods have not received detailed discussion in the literature

  8. Space station automation: the role of robotics and artificial intelligence (Invited Paper)

    Science.gov (United States)

    Park, W. T.; Firschein, O.

    1985-12-01

    Automation of the space station is necessary to make more effective use of the crew, to carry out repairs that are impractical or dangerous, and to monitor and control the many space station subsystems. Intelligent robotics and expert systems play a strong role in automation, and both disciplines are highly dependent on a common artificial intelligence (AI) technology base. The AI technology base provides the reasoning and planning capabilities needed in robotic tasks, such as perception of the environment and planning a path to a goal, and in expert systems tasks, such as control of subsystems and maintenance of equipment. This paper describes automation concepts for the space station, the specific robotic and expert systems required to attain this automation, and the research and development required. It also presents an evolutionary development plan that leads to fully automatic mobile robots for servicing satellites. Finally, we indicate the sequence of demonstrations and the research and development needed to confirm the automation capabilities. We emphasize that advanced robotics requires AI, and that to advance, AI needs the "real-world" problems provided by robotics.

  9. Development of a framework of human-centered automation for the nuclear industry

    International Nuclear Information System (INIS)

    Nelson, W.R.; Haney, L.N.

    1993-01-01

    Introduction of automated systems into control rooms for advanced reactor designs is often justified on the basis of increased efficiency and reliability, without a detailed assessment of how the new technologies will influence the role of the operator. Such a "technology-centered" approach carries with it the risk that entirely new mechanisms for human error will be introduced, resulting in some unpleasant surprises when the plant goes into operation. The aviation industry has experienced some of these surprises since the introduction of automated systems into the cockpits of advanced technology aircraft. Pilot errors have actually been induced by automated systems, especially when the pilot doesn't fully understand what the automated systems are doing during all modes of operation. In order to structure the research program for investigating these problems, the National Aeronautics and Space Administration (NASA) has developed a framework for human-centered automation. This framework is described in the NASA document Human-Centered Aircraft Automation Philosophy by Charles Billings. It is the thesis of this paper that a corresponding framework of human-centered automation should be developed for the nuclear industry. Such a framework would serve to guide the design and regulation of automated systems for advanced reactor designs, and would help prevent some of the problems that have arisen in other applications that have followed a "technology-centered" approach.

  10. Automated system for crack detection using infrared thermograph

    International Nuclear Information System (INIS)

    Starman, Stanislav

    2009-01-01

    The objective of this study was the development of an automated system for crack detection on square steel bars used in the automotive industry for axle and shaft construction. The automated system for thermographic crack detection uses brief pulsed eddy currents to heat the steel components under inspection. Cracks, if present, will disturb the current flow and so generate changes in the temperature profile in the crack area. These changes of temperature are visualized using an infrared camera. The image acquired by the infrared camera is evaluated through an image processing system. The advantages afforded by the system are its inspection time, its excellent flaw detection sensitivity and its ability to detect hidden, subsurface cracks. The automated system consists of four IR cameras (one for each side of the steel bar, so that all four sides are evaluated at once), a coil, a high-frequency generator and a control station with computers. The system is part of an inspection line in which surface and subsurface cracks are searched for. If a crack is present, the flawed area is automatically marked, and crack-free components are then deposited apart from the defective blocks. The system is fully automated and can evaluate four-metre blocks within 20 seconds, which is the real reason for using it in real industrial applications. (author)

  11. Quantification of Eosinophilic Granule Protein Deposition in Biopsies of Inflammatory Skin Diseases by Automated Image Analysis of Highly Sensitive Immunostaining

    Directory of Open Access Journals (Sweden)

    Peter Kiehl

    1999-01-01

    Eosinophilic granulocytes are major effector cells in inflammation. Extracellular deposition of toxic eosinophilic granule proteins (EGPs), but not the presence of intact eosinophils, is crucial for their functional effect in situ. As even recent morphometric approaches to quantifying the involvement of eosinophils in inflammation have been based only on cell counting, we developed a new method for the cell-independent quantification of EGPs by image analysis of immunostaining. Highly sensitive, automated immunohistochemistry was performed on paraffin sections of inflammatory skin diseases with 4 different primary antibodies against EGPs. Image analysis of immunostaining was performed by colour translation, linear combination and automated thresholding. Using strictly standardized protocols, the assay was proven to be specific and accurate concerning segmentation in 8916 fields of 520 sections, well reproducible in repeated measurements and reliable over a 16-week observation period. The method may be valuable for the cell-independent segmentation of immunostaining in other applications as well.

  12. An automated framework for NMR resonance assignment through simultaneous slice picking and spin system forming

    KAUST Repository

    Abbas, Ahmed

    2014-04-19

    Despite significant advances in automated nuclear magnetic resonance-based protein structure determination, the high numbers of false positives and false negatives among the peaks selected by fully automated methods remain a problem. These false positives and negatives impair the performance of resonance assignment methods. One of the main reasons for this problem is that the computational research community often considers peak picking and resonance assignment to be two separate problems, whereas spectroscopists use expert knowledge to pick peaks and assign their resonances at the same time. We propose a novel framework that simultaneously conducts slice picking and spin system forming, an essential step in resonance assignment. Our framework then employs a genetic algorithm, directed by both connectivity information and amino acid typing information from the spin systems, to assign the spin systems to residues. The inputs to our framework can be as few as two commonly used spectra, i.e., CBCA(CO)NH and HNCACB. Unlike existing peak picking and resonance assignment methods, which treat peaks as the basic units, our method is based on 'slices', which are one-dimensional vectors in three-dimensional spectra that correspond to certain (N, H) values. Experimental results on both benchmark simulated data sets and four real protein data sets demonstrate that our method significantly outperforms the state-of-the-art methods while using fewer spectra. Our method is freely available at http://sfb.kaust.edu.sa/Pages/Software.aspx. © 2014 Springer Science+Business Media.

  13. Towards a fully automatic and robust DIMM (DIMMA)

    International Nuclear Information System (INIS)

    Varela, A M; Muñoz-Tuñón, C; Del Olmo-García, A M; Rodríguez, L F; Delgado, J M; Castro-Almazán, J A

    2015-01-01

    Quantitative seeing measurements have been provided at the Canarian Observatories since 1990 by differential image motion monitors (DIMMs). Image quality needs to be studied in long-term (routine) measurements. This is important, for instance, in deciding on the siting of large telescopes or in the development of adaptive optics programmes, not to mention the development and design of new instruments. On the other hand, continuous real-time monitoring is essential in the day-to-day operation of telescopes. These routine measurements have to be carried out by standard, easy-to-operate and cross-calibrated instruments that are required to be operational with minimum intervention over many years. The DIMMA (Automatic Differential Image Motion Monitor) is the next step: a fully automated seeing monitor that is capable of providing data without manual operation and in remote locations. Currently, the IAC has two DIMMs working at the Roque de los Muchachos Observatory (ORM) and the Teide Observatory (OT). They are robotic and require an operator to start and initialize the program, focus the telescope, change the star when needed and turn off at the end of the night, all of which is done remotely. With a view to automation, we have designed a code for monitoring image quality (avoiding spurious data) and a program for autofocus, which is presented here. The data quality control protocol is also given. (paper)

  14. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    […] of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time […] infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in high resolution and accuracy […] infusion analyses of crude extracts to find the relationship between species from several species of terverticillate Penicillium, and also that the ions responsible for the segregation can be identified. Furthermore, the process of detecting unique species and unique metabolites can be automated […]

  15. Automated real-time detection of tonic-clonic seizures using a wearable EMG device

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Conradsen, Isa; Henning, Oliver

    2018-01-01

    OBJECTIVE: To determine the accuracy of automated detection of generalized tonic-clonic seizures (GTCS) using a wearable surface EMG device. METHODS: We prospectively tested the technical performance and diagnostic accuracy of real-time seizure detection using a wearable surface EMG device. […] The seizure detection algorithm and the cutoff values were prespecified. A total of 71 patients referred for long-term video-EEG monitoring on suspicion of GTCS were recruited in 3 centers. Seizure detection was real-time and fully automated. The reference standard was the evaluation of video-EEG recordings […]

  16. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Virtual screening is an effective tool for lead identification in drug discovery. However, there are limited numbers of crystal structures available compared to the number of biological sequences, which makes structure-based drug discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening, followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. The judicious combination of ligands binding different receptors can be used to inhibit selected biological pathways in a disease. This tool also allows the user to systemically investigate network-dependent effects of a drug or drug candidate.

  17. Review: Behavioral signs of estrus and the potential of fully automated systems for detection of estrus in dairy cattle.

    Science.gov (United States)

    Reith, S; Hoy, S

    2018-02-01

    Efficient detection of estrus is a permanent challenge for successful reproductive performance in dairy cattle. In this context, comprehensive knowledge of estrus-related behaviors is fundamental to achieve optimal estrus detection rates. This review was designed to identify the characteristics of behavioral estrus as a necessary basis for developing strategies and technologies to improve the reproductive management on dairy farms. The focus is on secondary symptoms of estrus (mounting, activity, aggressive and agonistic behaviors) which seem more indicative than standing behavior. The consequences of management, housing conditions and cow- and environmental-related factors impacting expression and detection of estrus as well as their relative importance are described in order to increase efficiency and accuracy of estrus detection. As traditional estrus detection via visual observation is time-consuming and ineffective, there has been a considerable advancement of detection aids during the last 10 years. By now, a number of fully automated technologies including pressure sensing systems, activity meters, video cameras, recordings of vocalization as well as measurements of body temperature and milk progesterone concentration are available. These systems differ in many aspects regarding sustainability and efficiency as keys to their adoption for farm use. As being most practical for estrus detection a high priority - according to the current research - is given to the detection based on sensor-supported activity monitoring, especially accelerometer systems. Due to differences in individual intensity and duration of estrus multivariate analysis can support herd managers in determining the onset of estrus. Actually, there is increasing interest in investigating the potential of combining data of activity monitoring and information of several other methods, which may lead to the best results concerning sensitivity and specificity of detection. Future improvements will

  18. The 2DX robot: a membrane protein 2D crystallization Swiss Army knife.

    Science.gov (United States)

    Iacovache, Ioan; Biasini, Marco; Kowal, Julia; Kukulski, Wanda; Chami, Mohamed; van der Goot, F Gisou; Engel, Andreas; Rémigy, Hervé-W

    2010-03-01

    Among the state-of-the-art techniques that provide experimental information at atomic scale for membrane proteins, electron crystallography, atomic force microscopy and solid state NMR make use of two-dimensional crystals. We present a cyclodextrin-driven method for detergent removal implemented in a fully automated robot. The kinetics of the reconstitution processes is precisely controlled, because the detergent complexation by cyclodextrin is of stoichiometric nature. The method requires smaller volumes and lower protein concentrations than established 2D crystallization methods, making it possible to explore more conditions with the same amount of protein. The method yielded highly ordered 2D crystals diffracting to high resolution from the pore-forming toxin Aeromonas hydrophila aerolysin (2.9 Å), the plant aquaporin SoPIP2;1 (3.1 Å) and the human aquaporin-8 (hAQP8; 3.3 Å). This new method outperforms traditional 2D crystallization approaches in terms of accuracy, flexibility, throughput, and allows the usage of detergents having low critical micelle concentration (CMC), which stabilize the structure of membrane proteins in solution. © 2009 Elsevier Inc. All rights reserved.

  19. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    California utilities have been exploring the use of critical peak prices (CPP) to help reduce needle peaks in customer end-use loads. CPP is a form of price-responsive demand response (DR). Recent experience has shown that customers have limited knowledge of how to operate their facilities in order to reduce their electricity costs under CPP (Quantum 2004). While the lack of knowledge about how to develop and implement DR control strategies is a barrier to participation in DR programs like CPP, another barrier is the lack of automation of DR systems. During 2003 and 2004, the PIER Demand Response Research Center (DRRC) conducted a series of tests of fully automated electric demand response (Auto-DR) at 18 facilities. Overall, the average of the site-specific average coincident demand reductions was 8% from a variety of building types and facilities. Many electricity customers have suggested that automation will help them institutionalize their electric demand savings and improve their overall response and DR repeatability. This report focuses on and discusses the specific results of the Automated Critical Peak Pricing (Auto-CPP, a specific type of Auto-DR) tests that took place during 2005, which build on the automated demand response (Auto-DR) research conducted through PIER and the DRRC in 2003 and 2004. The long-term goal of this project is to understand the technical opportunities of automating demand response and to remove technical and market impediments to large-scale implementation of automated demand response (Auto-DR) in buildings and industry. A second goal of this research is to understand and identify best practices for DR strategies and opportunities. The specific objectives of the Automated Critical Peak Pricing test were as follows: (1) Demonstrate how an automated notification system for critical peak pricing can be used in large commercial facilities for demand response (DR). (2) Evaluate effectiveness of such a system. (3) Determine how customers

  20. Human-Automation Cooperation for Separation Assurance in Future NextGen Environments

    Science.gov (United States)

    Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas

    2014-01-01

    A 2012 human-in-the-loop air traffic control simulation investigated a gradual paradigm shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors and, during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time frames ranged from near current-day operations to nearly fully automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment for twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results show that sector operations were not always safe; separation violations did in fact occur. This paper will describe the simulation conducted in detail, as well as discuss important results and their implications.

  1. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    Science.gov (United States)

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. The new automated lumen measurements showed better agreement with manual lumen area tracings than those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated a smaller systematic error (mean difference: 0.13 vs. -1.02 mm²) and lower random variability (standard deviation of differences: 2.21 vs. 4.02 mm²) than the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.
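    The agreement statistics reported above (correlation with manual tracings, mean difference as systematic error, and standard deviation of differences as random variability) can be reproduced with a few lines of code; the lumen areas below are invented for illustration.

```python
# Agreement between automated and manual lumen-area measurements (toy data).
import numpy as np

manual    = np.array([8.2, 5.1, 12.4, 6.7, 9.9])   # manual lumen areas, mm^2
automated = np.array([8.4, 5.0, 12.9, 6.5, 10.3])  # automated lumen areas, mm^2

diff = automated - manual
print("correlation:", np.corrcoef(manual, automated)[0, 1])
print("mean difference (mm^2):", diff.mean())        # systematic error
print("SD of differences (mm^2):", diff.std(ddof=1)) # random variability
```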

  2. High-throughput mouse genotyping using robotics automation.

    Science.gov (United States)

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.

  3. Strep-Tagged Protein Purification.

    Science.gov (United States)

    Maertens, Barbara; Spriestersbach, Anne; Kubicek, Jan; Schäfer, Frank

    2015-01-01

    The Strep-tag system can be used to purify recombinant proteins from any expression system. Here, protocols for lysis and affinity purification of Strep-tagged proteins from E. coli, baculovirus-infected insect cells, and transfected mammalian cells are given. Depending on the amount of Strep-tagged protein in the lysate, a protocol for batch binding and subsequent washing and eluting by gravity flow can be used. Agarose-based matrices with the coupled Strep-Tactin ligand are the resins of choice, with a binding capacity of up to 9 mg ml⁻¹. For purification of lower amounts of Strep-tagged proteins, the use of Strep-Tactin magnetic beads is suitable. In addition, Strep-tagged protein purification can also be automated using prepacked columns for FPLC or other liquid-handling chromatography instrumentation, but automated purification is not discussed in this protocol. The protocols described here can be regarded as an update of the Strep-Tag Protein Handbook (Qiagen, 2009). © 2015 Elsevier Inc. All rights reserved.

  4. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  5. The possibility of a fully automated procedure for radiosynthesis of fluorine-18-labeled fluoromisonidazole using a simplified single, neutral alumina column purification procedure

    International Nuclear Information System (INIS)

    Nandy, Saikat; Rajan, M.G.R.; Korde, A.; Krishnamurthy, N.V.

    2010-01-01

    A novel fully automated radiosynthesis procedure for [18F]fluoromisonidazole using a simple alumina cartridge-column for purification instead of the conventionally used semi-preparative HPLC was developed. [18F]FMISO was prepared via a one-pot, two-step synthesis procedure using a modified Nuclear Interface synthesis module: the precursor molecule 1-(2'-nitro-1'-imidazolyl)-2-O-tetrahydropyranyl-3-O-toluenesulphonylpropanediol (NITTP) was fluorinated with no-carrier-added [18F]fluoride, followed by hydrolysis of the protecting group with 1 M HCl. Purification was carried out using a single neutral alumina cartridge-column instead of semi-preparative HPLC. The maximum overall radiochemical yield obtained was 37.49±1.68% with 10 mg NITTP (n=3, without any decay correction), and the total synthesis time was 40±1 min. The radiochemical purity was greater than 95%, and the product was devoid of other chemical impurities, including residual aluminum and acetonitrile. The biodistribution study in a fibrosarcoma tumor model showed maximum uptake in the tumor 2 h post-injection. Finally, PET/CT imaging studies in a normal healthy rabbit showed clear uptake in the organs involved in the metabolism of MISO. No bone uptake was observed, excluding the presence of free [18F]fluoride. The reported method can be easily adapted to any commercial FDG synthesis module.
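    Because the yield above is reported without decay correction, the sketch below shows how a decay-corrected value could be estimated from the 40 min synthesis time, assuming the standard fluorine-18 half-life of roughly 109.8 min; this calculation is illustrative and not part of the reported work.

```python
# Illustrative decay correction of a non-decay-corrected radiochemical yield.
import math

HALF_LIFE_F18_MIN = 109.8  # approximate fluorine-18 half-life (assumption, not from the abstract)

def decay_corrected_yield(yield_ndc_percent: float, synthesis_time_min: float) -> float:
    decay_factor = math.exp(-math.log(2) * synthesis_time_min / HALF_LIFE_F18_MIN)
    return yield_ndc_percent / decay_factor

print(round(decay_corrected_yield(37.49, 40.0), 1))  # roughly 48.3 %
```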

  6. An automated approach to network features of protein structure ensembles

    Science.gov (United States)

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-01-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from the PDB and from simulations establishes a need for an efficient, standalone program that assembles network concepts and parameters under one hood in an automated manner. Herein, we discuss the development and application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulation/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighting scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings dynamical/chemical knowledge into the network representation. Also, the results are mapped onto a graphical display of the structure, allowing easy access to network analysis for the general biological community. The potential of PSN-Ensemble for examining structural ensembles is exemplified using MD trajectories of an ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single-static structures of the active/inactive states of the β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html. PMID:23934896
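    The core network idea can be sketched with a toy residue-interaction graph; the residue pairs and weights below are invented, the correlation-to-distance mapping is an assumption, and this is not the PSN-Ensemble code.

```python
# Toy protein structure network: nodes are residues, edge weights stand in for
# a pairwise cross-correlation or interaction-energy-based weighting scheme.
import networkx as nx

contacts = [(1, 5, 0.82), (1, 12, 0.40), (5, 12, 0.63), (12, 30, 0.71)]  # invented

G = nx.Graph()
for i, j, w in contacts:
    # crude correlation-to-distance mapping so that stronger coupling = shorter path
    G.add_edge(i, j, weight=w, distance=1.0 - w)

print(nx.degree_centrality(G))                          # topological organization
print(nx.betweenness_centrality(G, weight="distance"))  # communication hubs
print(nx.shortest_path(G, 1, 30, weight="distance"))    # a long-range pathway
```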

  7. Roles for text mining in protein function prediction.

    Science.gov (United States)

    Verspoor, Karin M

    2014-01-01

    The Human Genome Project has provided science with a hugely valuable resource: the blueprints for life; the specification of all of the genes that make up a human. While the genes have all been identified and deciphered, it is proteins that are the workhorses of the human body: they are essential to virtually all cell functions and are the primary mechanism through which biological function is carried out. Hence in order to fully understand what happens at a molecular level in biological organisms, and eventually to enable development of treatments for diseases where some aspect of a biological system goes awry, we must understand the functions of proteins. However, experimental characterization of protein function cannot scale to the vast amount of DNA sequence data now available. Computational protein function prediction has therefore emerged as a problem at the forefront of modern biology (Radivojac et al., Nat Methods 10(13):221-227, 2013).Within the varied approaches to computational protein function prediction that have been explored, there are several that make use of biomedical literature mining. These methods take advantage of information in the published literature to associate specific proteins with specific protein functions. In this chapter, we introduce two main strategies for doing this: association of function terms, represented as Gene Ontology terms (Ashburner et al., Nat Genet 25(1):25-29, 2000), to proteins based on information in published articles, and a paradigm called LEAP-FS (Literature-Enhanced Automated Prediction of Functional Sites) in which literature mining is used to validate the predictions of an orthogonal computational protein function prediction method.

  8. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    Science.gov (United States)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology or biological function and on human disease and aging processes. Measurements of gene expression will help us to understand the adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of measuring, in situ, the microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying the RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions.

  9. Coping with the psychological impact of automated systems

    International Nuclear Information System (INIS)

    Suzuki, K.; Nogami, T.; Inoue, T.; Mitsumori, K.; Taguchi, T.

    1991-01-01

    Japanese surveys and experiments have found that operators sometimes find it difficult to anticipate automatic processes, which in turn limits their ability to keep up with those processes. One of the factors which makes anticipation difficult is the lack of flexible communication between operators and computers - communication which is easier among human operators. At present the only way of dealing with this psychological effect is to ensure that trainees fully master the characteristics of the automated processes. (author)

  10. Fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging: Toward robust and reproducible metabolite measurements in human brain.

    Science.gov (United States)

    Bian, Wei; Li, Yan; Crane, Jason C; Nelson, Sarah J

    2018-02-01

    To implement a fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging (MRSI). The PRESS-selected volume and outer-volume suppression bands were predefined on the MNI152 standard template image. The template image was aligned to the subject's T1-weighted image during a scan, and the resulting transformation was then applied to the predefined prescription. To evaluate the method, H-1 MRSI data were obtained in repeat scan sessions from 20 healthy volunteers. In each session, datasets were acquired twice without repositioning. The overlap ratio of the prescribed volume in the two sessions was calculated, and the reproducibility of inter- and intrasession metabolite peak height and area ratios was measured by the coefficient of variation (CoV). Intra- and intersession CoVs were compared by a paired t-test. The average overlap ratio of the automatically prescribed selection volumes between the two sessions was 97.8%. The average voxel-based intersession CoVs were less than 0.124 and 0.163 for peak height and area ratios, respectively. A paired t-test showed no significant difference between the intra- and intersession CoVs. The proposed method provides a time-efficient way to prescribe 3D PRESS MRSI with reproducible imaging positioning and metabolite measurements. Magn Reson Med 79:636-642, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
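    The reproducibility metric used above, the coefficient of variation (CoV) of a metabolite ratio across repeated measurements, is straightforward to compute; the sketch below uses invented numbers and a hypothetical NAA/Cr ratio purely for illustration.

```python
# Coefficient of variation of a repeated metabolite-ratio measurement (toy data).
import numpy as np

def coefficient_of_variation(measurements) -> float:
    x = np.asarray(measurements, dtype=float)
    return x.std(ddof=1) / x.mean()

intersession_ratio = [1.42, 1.35]   # e.g. a hypothetical NAA/Cr peak-height ratio, sessions 1 and 2
print(round(coefficient_of_variation(intersession_ratio), 3))
```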

  11. Automated Radioanalytical Chemistry: Applications For The Laboratory And Industrial Process Monitoring

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Farawila, Anne F.; Grate, Jay W.

    2009-01-01

    The identification and quantification of targeted α- and β-emitting radionuclides via destructive analysis in complex radioactive liquid matrices is highly challenging. Analyses are typically accomplished at on- or off-site laboratories through laborious sample preparation steps and extensive chemical separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, alpha energy spectroscopy, mass spectrometry). Analytical results may take days or weeks to report. When an industrial-scale plant requires periodic or continuous monitoring of radionuclides as an indication of the composition of its feed stream, diversion of safeguarded nuclides, or of plant operational conditions (for example), radiochemical measurements should be rapid, but not at the expense of precision and accuracy. Scientists at Pacific Northwest National Laboratory have developed and characterized a host of automated radioanalytical systems designed to perform reproducible and rapid radioanalytical processes. Platforms have been assembled for (1) automation and acceleration of sample analysis in the laboratory and (2) automated monitors for monitoring industrial scale nuclear processes on-line with near-real time results. These methods have been applied to the analysis of environmental-level actinides and fission products to high-level nuclear process fluids. Systems have been designed to integrate a number of discrete sample handling steps, including sample pretreatment (e.g., digestion and valence state adjustment) and chemical separations. The systems have either utilized on-line analyte detection or have collected the purified analyte fractions for off-line measurement applications. One PNNL system of particular note is a fully automated prototype on-line radioanalytical system designed for the Waste Treatment Plant at Hanford, WA, USA. This system demonstrated nearly continuous destructive analysis of the soft β-emitting radionuclide 99Tc in nuclear

  12. Automated evaluation of one-loop scattering amplitudes

    International Nuclear Information System (INIS)

    Deurzen, Hans van

    2015-01-01

    In this dissertation the developments toward fully automated evaluation of one-loop scattering amplitudes will be presented, as implemented in the GoSam framework. The code Xsamurai, part of GoSam, is described, which implements the integrand reduction algorithm including an extension to higher-rank capability. GoSam was used to compute several Higgs boson production channels at NLO QCD. An interface between GoSam and a Monte Carlo program was constructed, which enables computing any process at NLO precision needed in the LHC era.

  13. Varying Levels of Automation on UAS Operator Responses to Traffic Resolution Advisories in Civil Airspace

    Science.gov (United States)

    Kenny, Caitlin; Fern, Lisa

    2012-01-01

    Continuing demand for the use of Unmanned Aircraft Systems (UAS) has put increasing pressure on operations in civil airspace. The need to fly UAS in the National Airspace System (NAS) in order to perform missions vital to national security and defense, emergency management, and science is increasing at a rapid pace. In order to ensure safe operations in the NAS, operators of unmanned aircraft, like those of manned aircraft, may be required to maintain separation assurance and avoid loss of separation with other aircraft while performing their mission tasks. This experiment investigated the effects of varying levels of automation on UAS operator performance and workload while responding to conflict resolution instructions provided by the Tactical Collision Avoidance System II (TCAS II) during a UAS mission in high-density airspace. The purpose of this study was not to investigate the safety of using TCAS II on UAS, but rather to examine the effect of automation on the ability of operators to respond to traffic collision alerts. Six licensed pilots were recruited to act as UAS operators for this study. Operators were instructed to follow a specified mission flight path, while maintaining radio contact with Air Traffic Control and responding to TCAS II resolution advisories. Operators flew four 45-minute experimental missions with four different levels of automation: Manual, Knobs, Management by Exception, and Fully Automated. All missions included TCAS II Resolution Advisories (RAs) that required operator attention and rerouting. Operator compliance and reaction time to RAs were measured, and post-run NASA-TLX ratings were collected to measure workload. Results showed significantly higher compliance rates, faster responses to TCAS II alerts, and fewer preemptive operator actions when higher levels of automation were implemented. Physical and Temporal ratings of workload were significantly higher in the Manual condition than in the Management by Exception and

  14. Linear Approach for Synchronous State Stability in Fully Connected PLL Networks

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Synchronization is an essential feature for the use of digital systems in telecommunication networks, integrated circuits, and manufacturing automation. Formerly, master-slave (MS) architectures, with precise master clock generators sending signals to phase-locked loops (PLLs) working as slave oscillators, were considered the best solution. Nowadays, the development of wireless networks with dynamical connectivity and the increase in the size and operating frequency of integrated circuits suggest that the distribution of clock signals could be more efficient if distributed solutions with fully connected oscillators are used. Here, fully connected networks with second-order PLLs as nodes are considered. Previous work studied how the synchronous-state frequency of this type of network depends on the node parameters and delays, and derived an expression for the long-term frequency (Piqueira, 2006). Here, by taking the first term of the Taylor series expansion of the dynamical system description, it is shown that for a generic network with N nodes the synchronous state is locally asymptotically stable.
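    The linearization argument can be sketched generically: write the phase-difference dynamics, take the first (linear) term of the Taylor expansion around the synchronous state, and check that all Jacobian eigenvalues have negative real parts. The model and parameter values below are assumptions for illustration, not the paper's exact equations.

```python
# Generic local-stability check via linearization around the synchronous state.
import numpy as np

b, mu = 1.0, 0.5   # assumed damping and loop-gain parameters (illustrative only)

def f(x):
    """x = [phase difference, its rate]; the synchronous state is x* = [0, 0]."""
    delta, rate = x
    return np.array([rate, -b * rate - mu * np.sin(delta)])

def numerical_jacobian(func, x_star, eps=1e-6):
    n = len(x_star)
    J = np.zeros((n, n))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        J[:, j] = (func(x_star + dx) - func(x_star - dx)) / (2 * eps)
    return J

J = numerical_jacobian(f, np.zeros(2))        # first term of the Taylor expansion
eigs = np.linalg.eigvals(J)
print(eigs)
print("locally asymptotically stable:", np.all(eigs.real < 0))
```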

  15. Automation in trace-element chemistry - Development of a fully automated on-line preconcentration device for trace analysis of heavy metals with atomic spectroscopy

    International Nuclear Information System (INIS)

    Michaelis, M.R.A.

    1990-01-01

    The scope of this work was the development of an automated system for trace-element preconcentration to be used with, and integrated into, analytical atomic spectroscopic methods such as flame atomic absorption spectrometry (FAAS), graphite furnace atomic absorption spectrometry (GFAAS) or atomic emission spectroscopy with inductively coupled plasma (ICP-AES). Based on the newly developed cellulose-based chelating cation exchangers ethylenediamine-triacetic acid cellulose (EDTrA-cellulose) and sulfonated-oxine cellulose, a flexible, computer-controlled instrument for automation of preconcentration and/or matrix separation of heavy metals is described. The most important properties of these materials are fast exchange kinetics, good selectivity against alkali and alkaline-earth elements, good flow characteristics, and good stability of the material and the chelating functions against the changes in pH value of the reagents required in the process. The combination of the preconcentration device with FAAS and with ICP-AES using a simultaneous spectrometer is described for on-line determinations of Zn, Cd, Pb, Ni, Fe, Co, Mn, V, Cu, La, U and Th. Signal enhancement factors of 70 are achieved from preconcentration of 10 ml and on-line determination with FAAS, owing to signal quantification in peak-height mode. For GFAAS and for sequential ICP, methods for off-line preconcentration are given. The optimization and adaptation of the interface to the different characteristics of the analytical instrumentation is emphasized. For evaluation and future developments with respect to the determination and/or preconcentration of anionic species such as As, Se and Sb, instrument modifications are proposed and development software is described. (Author)

  16. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Automated image-based 3D reconstruction methods are increasingly common in 3D modeling applications. Fully automated solutions give the impression that from a sample of randomly acquired images we can derive quite impressive visual 3D models. Although the level of automation is reaching very high standards, image quality is a fundamental prerequisite for producing successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The evaluation shows how effective image pre-processing, applied to the entire image dataset, can improve the automated orientation procedure and dense 3D point-cloud reconstruction, even for poorly textured scenes.
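    The four pre-processing stages named above can be sketched with standard OpenCV building blocks; the specific operators, parameters and file path below are assumptions for illustration and may differ from the algorithms actually selected in the article.

```python
# Hypothetical pre-processing sketch: enhancement, denoising, gray conversion, enrichment.
import cv2
import numpy as np

def preprocess(path: str) -> np.ndarray:
    img = cv2.imread(path)                     # BGR image; path is a placeholder
    assert img is not None, "image not found"
    # 1. color/contrast enhancement via CLAHE on the luminance channel
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    l = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(l)
    img = cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)
    # 2. image denoising
    img = cv2.fastNlMeansDenoisingColored(img, None, 10, 10, 7, 21)
    # 3. color-to-gray conversion
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # 4. content enrichment: simple unsharp masking to strengthen local texture
    blur = cv2.GaussianBlur(gray, (0, 0), 3)
    return cv2.addWeighted(gray, 1.5, blur, -0.5, 0)
```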

  17. Automated synthesis and verification of configurable DRAM blocks for ASIC's

    Science.gov (United States)

    Pakkurti, M.; Eldin, A. G.; Kwatra, S. C.; Jamali, M.

    1993-01-01

    A highly flexible embedded DRAM compiler is developed which can generate DRAM blocks in the range of 256 bits to 256 Kbits. The compiler is capable of automatically verifying the functionality of the generated DRAM modules. The fully automated verification capability is a key feature that ensures the reliability of the generated blocks. The compiler's architecture, algorithms, verification techniques and the implementation methodology are presented.

  18. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  19. Atmospheric ozone measurement with an inexpensive and fully automated porous tube collector-colorimeter.

    Science.gov (United States)

    Li, Jianzhong; Li, Qingyang; Dyke, Jason V; Dasgupta, Purnendu K

    2008-01-15

    The bleaching action of ozone on indigo and related compounds is well known. We describe sensitive automated instrumentation for measuring ambient ozone. Air is sampled around a porous polypropylene tube filled with a solution of indigotrisulfonate. Light transmission through the tube is measured. Light transmission increases as O3 diffuses through the membrane and bleaches the indigo. Evaporation of the solution, a function of the relative humidity (RH) and the air temperature, can, however, cause major errors. We solve this problem by adding an O3-inert dye that absorbs at a different wavelength. Here we provide a new algorithm for this correction and show that this very inexpensive instrument package (controlled by a BASIC Stamp microcontroller with an on-board data logger, total parts cost US$300) provides data highly comparable to commercial ozone monitors over an extended period. The instrument displays an LOD of 1.2 ppbv and a linear span up to 300 ppbv for a sampling time of 1 min. For a sampling time of 5 min, the respective values are 0.24 ppbv and 100 ppbv O3.
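    The correction idea can be sketched as follows: because the reference dye is inert to O3, any apparent increase in its absorbance reflects solvent evaporation and can be divided out of the indigo signal. This is an assumed reconstruction of the principle, not the published algorithm, and the absorbance values are invented.

```python
# Assumed evaporation correction using the O3-inert reference dye (illustrative only).
def evaporation_corrected_indigo(A_indigo: float, A_ref: float, A_ref_initial: float) -> float:
    concentration_factor = A_ref / A_ref_initial   # > 1 as the solution evaporates
    return A_indigo / concentration_factor

# Example: reference-dye absorbance rose from 0.50 to 0.55 (about 10% evaporation)
print(round(evaporation_corrected_indigo(0.80, 0.55, 0.50), 3))  # 0.727
```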

  20. Automated side-chain model building and sequence assignment by template matching.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.
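
    The sequence-assignment step can be pictured with the short sketch below: given a matrix of per-position amino-acid probabilities derived from the density templates, every placement of the segment in the protein sequence is scored, a posterior over placements is formed with a uniform prior, and a match is kept only if one placement dominates. This is a simplified illustration of the idea, not the RESOLVE implementation.

```python
# Simplified Bayesian alignment of a main-chain segment to the protein sequence.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {a: i for i, a in enumerate(AA)}

def align_segment(prob: np.ndarray, sequence: str, min_confidence: float = 0.95):
    """prob: (L, 20) matrix, prob[i, j] = P(residue i of the segment is AA[j]).
    Returns (offset, posterior) of the best placement, or None if ambiguous."""
    L = prob.shape[0]
    log_p = np.log(np.clip(prob, 1e-6, None))
    offsets = range(len(sequence) - L + 1)
    # log-likelihood of the segment for each placement in the sequence
    scores = np.array([
        sum(log_p[i, AA_INDEX[sequence[o + i]]] for i in range(L))
        for o in offsets
    ])
    # posterior over placements, assuming a uniform prior
    post = np.exp(scores - scores.max())
    post /= post.sum()
    best = int(post.argmax())
    return (best, float(post[best])) if post[best] >= min_confidence else None
```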

  1. Transitioning Resolution Responsibility between the Controller and Automation Team in Simulated NextGen Separation Assurance

    Science.gov (United States)

    Cabrall, C.; Gomez, A.; Homola, J.; Hunt, S.; Martin, L.; Merccer, J.; Prevott, T.

    2013-01-01

    As part of an ongoing research effort on separation assurance and functional allocation in NextGen, a controller-in-the-loop study with ground-based automation was conducted at NASA Ames' Airspace Operations Laboratory in August 2012 to investigate the potential impact of introducing self-separating aircraft in progressively advanced NextGen timeframes. From this larger study, the current exploratory analysis of controller-automation interaction styles focuses on the last and most far-term time frame. Measurements were recorded that firstly verified the continued operational validity of this iteration of the ground-based functional allocation automation concept in forecast traffic densities up to 2x that of current-day high-altitude en-route sectors. Additionally, with greater levels of fully automated conflict detection and resolution as well as the introduction of intervention functionality, objective and subjective analyses showed a range of passive to active controller-automation interaction styles between the participants. Not only did the controllers work with the automation to meet their safety and capacity goals in the simulated future NextGen timeframe, they did so in different ways and with different attitudes of trust/use of the automation. Taken as a whole, the results showed that the prototyped controller-automation functional allocation framework was very flexible and successful overall.

  2. Continuous Automated Model EvaluatiOn (CAMEO) complementing the critical assessment of structure prediction in CASP12.

    Science.gov (United States)

    Haas, Jürgen; Barbato, Alessandro; Behringer, Dario; Studer, Gabriel; Roth, Steven; Bertoni, Martino; Mostaguir, Khaled; Gumienny, Rafal; Schwede, Torsten

    2018-03-01

    Every second year, the community experiment "Critical Assessment of Techniques for Structure Prediction" (CASP) conducts an independent blind assessment of structure prediction methods, providing a framework for comparing the performance of different approaches and discussing the latest developments in the field. Yet, developers of automated computational modeling methods clearly benefit from more frequent evaluations based on larger sets of data. The "Continuous Automated Model EvaluatiOn (CAMEO)" platform complements the CASP experiment by conducting fully automated blind prediction assessments based on the weekly pre-release of sequences of those structures that are going to be published in the next release of the Protein Data Bank (PDB). CAMEO publishes weekly benchmarking results based on models collected during a 4-day prediction window, on average assessing ca. 100 targets during a time frame of 5 weeks. CAMEO benchmarking data is generated consistently for all participating methods at the same point in time, enabling developers to benchmark and cross-validate their method's performance, and directly refer to the benchmarking results in publications. In order to facilitate server development and promote shorter release cycles, CAMEO sends weekly emails with submission statistics and low-performance warnings. Many participants of CASP have successfully employed CAMEO when preparing their methods for upcoming community experiments. CAMEO offers a variety of scores to allow benchmarking diverse aspects of structure prediction methods. By introducing new scoring schemes, CAMEO facilitates new development in areas of active research, for example, modeling quaternary structure, complexes, or ligand binding sites. © 2017 Wiley Periodicals, Inc.

  3. Virtual commissioning of automated micro-optical assembly

    Science.gov (United States)

    Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian

    2015-02-01

    In this contribution, we present a novel approach to enable virtual commissioning for process developers in micro-optical assembly. Our approach aims at supporting micro-optics experts to effectively develop assisted or fully automated assembly solutions without detailed prior experience in programming while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved for simply switching the agents to command the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners: laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and the positioning of optical components for beam shaping.

  4. Is automated kinetic measurement superior to end-point for advanced oxidation protein product?

    Science.gov (United States)

    Oguz, Osman; Inal, Berrin Bercik; Emre, Turker; Ozcan, Oguzhan; Altunoglu, Esma; Oguz, Gokce; Topkaya, Cigdem; Guvenen, Guvenc

    2014-01-01

    Advanced oxidation protein product (AOPP) was first described as an oxidative protein marker in chronic uremic patients and measured with a semi-automatic end-point method. Subsequently, the kinetic method was introduced for the AOPP assay. We aimed to compare these two methods by adapting them to a chemistry analyzer and to investigate the correlation between AOPP and fibrinogen, the key molecule responsible for human plasma AOPP reactivity, microalbumin, and HbA1c in patients with type II diabetes mellitus (DM II). The effects of EDTA and citrate-anticoagulated tubes on these two methods were incorporated into the study. This study included 93 DM II patients (36 women, 57 men) with HbA1c levels ≥ 7%, who were admitted to the diabetes and nephrology clinics. The samples were collected in EDTA and in citrate-anticoagulated tubes. Both methods were adapted to a chemistry analyzer and the samples were studied in parallel. In both types of samples, we found a moderate correlation between the kinetic and the end-point methods (r = 0.611 for citrate-anticoagulated, r = 0.636 for EDTA-anticoagulated, p = 0.0001 for both). We found a moderate correlation between fibrinogen-AOPP and microalbumin-AOPP levels only in the kinetic method (r = 0.644 and 0.520 for citrate-anticoagulated; r = 0.581 and 0.490 for EDTA-anticoagulated, p = 0.0001). We conclude that adaptation of the end-point method to automation is more difficult and it has higher between-run CV%, while application of the kinetic method is easier and it may be used in oxidative stress studies.
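
    The two read-out modes being compared can be pictured as follows: an end-point result is a single absorbance taken at a fixed time, while a kinetic result is the initial rate fitted from several early readings; correlating the two across samples is then a single call to a correlation routine. Time points, units and the reading window in this sketch are invented for illustration.

```python
# Toy illustration of end-point vs. kinetic read-outs from one reaction trace.
import numpy as np

def endpoint_result(times_s, absorbances, t_read=120.0):
    """Absorbance at (or nearest to) the fixed reading time."""
    return absorbances[np.argmin(np.abs(np.asarray(times_s) - t_read))]

def kinetic_result(times_s, absorbances, window_s=60.0):
    """Initial reaction rate (dA/min) from a linear fit over the first window."""
    t = np.asarray(times_s)
    a = np.asarray(absorbances)
    m = t <= window_s
    slope_per_s = np.polyfit(t[m], a[m], 1)[0]
    return slope_per_s * 60.0

# Per-sample results from both modes would then be compared across patients as:
#   r, p = scipy.stats.pearsonr(kinetic_values, endpoint_values)
```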

  5. An Open-Source Automated Peptide Synthesizer Based on Arduino and Python.

    Science.gov (United States)

    Gali, Hariprasad

    2017-10-01

    The development of the first open-source automated peptide synthesizer, PepSy, using an Arduino UNO and readily available components is reported. PepSy was primarily designed to synthesize small peptides on a relatively small scale (<100 µmol). Scripts to operate PepSy in fully automatic or manual mode were written in Python. The fully automatic script includes functions to carry out resin swelling, resin washing, single coupling, double coupling, Fmoc deprotection, ivDde deprotection, on-resin oxidation, end capping, and amino acid/reagent line cleaning. Several small peptides and peptide conjugates were successfully synthesized on PepSy with reasonably good yields and purity depending on the complexity of the peptide.
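
    A sketch of how such a Python script can drive an Arduino-based synthesizer over a serial link is given below. The command strings, port name, volumes and timings are hypothetical; the real PepSy firmware defines its own protocol, and the published scripts should be consulted for the actual functions.

```python
# Illustrative serial control of an Arduino-based synthesizer (commands hypothetical).
import time
import serial  # pyserial

def send(ser, command, settle=0.2):
    ser.write((command + "\n").encode())
    time.sleep(settle)                          # give the firmware time to act

def fmoc_deprotection(ser, cycles=2, piperidine_valve="V4", wait_s=180):
    """Deliver deprotection reagent, incubate, then wash, repeated 'cycles' times."""
    for _ in range(cycles):
        send(ser, f"{piperidine_valve} ON")     # open deprotection reagent valve
        send(ser, "PUMP 2.0")                   # deliver 2.0 mL (hypothetical units)
        send(ser, f"{piperidine_valve} OFF")
        time.sleep(wait_s)                      # deprotection incubation
        send(ser, "DRAIN")                      # empty the reaction vessel
        send(ser, "WASH DMF 3")                 # three DMF washes

with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as ser:
    time.sleep(2)                               # Arduino resets on connection
    fmoc_deprotection(ser)
```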

  6. A Fully Automated Classification for Mapping the Annual Cropland Extent

    Science.gov (United States)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Indeed, accurate and reliable information on cropland and the location of major crop types is required to make future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid region, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Space-borne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agriculture monitoring and climate modelling, cropland maps serve as a mask to isolate agricultural land for (i) time-series analysis for crop condition monitoring and (ii) investigating how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite efforts, cropland is generally one of the classes with the poorest accuracy, which limits its use for agricultural applications. This research aims at improving cropland delineation from the local to the regional and global scales, as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data are extracted from available baseline land cover maps. The method delivers cropland maps with a high accuracy over contrasted agro-systems in Ukraine, Argentina, China and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. Besides, it was found that the cropland class is associated with a low uncertainty. The temporal features are illustrated in the sketch below.
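
    A minimal sketch of the overall strategy follows: a few temporal features are derived from a per-pixel vegetation-index time series and a classifier is trained on labels sampled from an existing baseline land-cover map rather than on in-situ data. The five features shown are generic stand-ins for the crop-specific features designed in the study, and the random-forest classifier is an assumption.

```python
# Generic temporal features + classifier trained from a baseline land-cover map.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def temporal_features(ndvi):                 # ndvi: (n_pixels, n_dates)
    return np.column_stack([
        ndvi.max(axis=1),                    # peak greenness
        ndvi.min(axis=1),                    # background level
        ndvi.max(axis=1) - ndvi.min(axis=1), # amplitude of the seasonal cycle
        ndvi.argmax(axis=1),                 # timing of the peak
        np.trapz(ndvi, axis=1),              # integrated greenness over the season
    ])

def train_cropland_classifier(ndvi_stack, baseline_labels, n_samples=50_000):
    """baseline_labels: 1 = cropland, 0 = non-cropland, taken from an existing map."""
    X = temporal_features(ndvi_stack)
    rng = np.random.default_rng(0)
    idx = rng.choice(len(X), size=min(n_samples, len(X)), replace=False)
    clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)
    clf.fit(X[idx], baseline_labels[idx])
    return clf
```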

  7. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  8. A use of Ramachandran potentials in protein solution structure determinations

    International Nuclear Information System (INIS)

    Bertini, Ivano; Cavallaro, Gabriele; Luchinat, Claudio; Poli, Irene

    2003-01-01

    A strategy is developed to use database-derived φ-ψ constraints during simulated annealing procedures for protein solution structure determination in order to improve the Ramachandran plot statistics, while maintaining the agreement with the experimental constraints as the sole criterion for the selection of the family. The procedure, fully automated, consists of two consecutive simulated annealing runs. In the first run, the database-derived φ-ψ constraints are enforced for all amino acids (except prolines and glycines). A family of structures is then selected on the grounds of the lowest violations of the experimental constraints only, and the φ-ψ values for each residue are examined. In the second and final run, the database-derived φ-ψ constraints are enforced only for those residues which in the first run have ended in one and the same favored φ-ψ region. For residues which are either spread over different favored regions or concentrated in disallowed regions, the constraints are not enforced. The final family is then selected, after the second run, again only based on the agreement with the experimental constraints. This automated approach was implemented in DYANA and was tested on as many as 12 proteins, including some containing paramagnetic metals, whose structures had been previously solved in our laboratory. The quality of the structures, and of the Ramachandran plot statistics in particular, was notably improved while preserving the agreement with the experimental constraints.
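
    The rule applied between the two simulated-annealing runs can be sketched as follows: a residue keeps its database-derived φ-ψ constraint in the second run only if all conformers of the first-run family place it in one and the same favoured Ramachandran region. The coarse region boundaries below are illustrative stand-ins for the database definition actually used.

```python
# Decide which residues keep their phi/psi constraints in the second run.
def favoured_region(phi, psi):
    """Very coarse labelling of the favoured Ramachandran regions (illustrative)."""
    if -180 <= phi < 0 and (90 <= psi <= 180 or -180 <= psi < -150):
        return "beta"
    if -160 <= phi < -20 and -80 <= psi < 40:
        return "alpha"
    if 20 <= phi < 100 and -20 <= psi < 90:
        return "alpha_L"
    return "disallowed"

def residues_to_constrain(family_phi_psi, skip=("PRO", "GLY")):
    """family_phi_psi: {resid: [(resname, phi, psi), ...]} over the first-run family.
    Returns the residues whose constraints are re-enforced in the second run."""
    keep = []
    for resid, conformers in family_phi_psi.items():
        if conformers[0][0] in skip:
            continue
        regions = {favoured_region(phi, psi) for _, phi, psi in conformers}
        if len(regions) == 1 and "disallowed" not in regions:
            keep.append(resid)
    return keep
```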

  9. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    Full Text Available The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.

  10. Automated calibration system for a high-precision measurement of neutrino mixing angle θ13 with the Daya Bay antineutrino detectors

    International Nuclear Information System (INIS)

    Liu, J.; Cai, B.; Carr, R.; Dwyer, D.A.; Gu, W.Q.; Li, G.S.; Qian, X.; McKeown, R.D.; Tsang, R.H.M.; Wang, W.; Wu, F.F.; Zhang, C.

    2014-01-01

    We describe the automated calibration system for the antineutrino detectors in the Daya Bay Neutrino Experiment. This system consists of 24 identical units instrumented on 8 identical 20-ton liquid scintillator detectors. Each unit is a fully automated robotic system capable of deploying an LED and various radioactive sources into the detector along given vertical axes. Selected results from performance studies of the calibration system are reported

  11. The comprehensive native interactome of a fully functional tagged prion protein.

    Directory of Open Access Journals (Sweden)

    Dorothea Rutishauser

    Full Text Available The enumeration of the interaction partners of the cellular prion protein, PrP(C), may help clarify its elusive molecular function. Here we added a carboxy-proximal myc epitope tag to PrP(C). When expressed in transgenic mice, PrP(myc) carried a GPI anchor, was targeted to lipid rafts, and was glycosylated similarly to PrP(C). PrP(myc) antagonized the toxicity of truncated PrP, restored prion infectibility of PrP(C)-deficient mice, and was physically incorporated into PrP(Sc) aggregates, indicating that it possessed all functional characteristics of genuine PrP(C). We then immunopurified myc epitope-containing protein complexes from PrP(myc) transgenic mouse brains. Gentle differential elution with epitope-mimetic decapeptides, or a scrambled version thereof, yielded 96 specifically released proteins. Quantitative mass spectrometry with isotope-coded tags identified seven proteins which co-eluted equimolarly with PrP(C) and may represent components of a multiprotein complex. Selected PrP(C) interactors were validated using independent methods. Several of these proteins appear to exert functions in axomyelinic maintenance.

  12. The Environmental Control and Life Support System (ECLSS) advanced automation project

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, Ray

    1990-01-01

    The objective of the environmental control and life support system (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.

  13. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    Full Text Available This article discusses the current capabilities of automated processing of image data, using the example of the PhotoScan software by Agisoft. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or more often on UAVs) is used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos are captured) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in the local coordinate system or, using initial exterior orientation and measured control points, can provide image georeference in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. Image matching algorithms are also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM and a photorealistic solid model of an object. All aforementioned processing steps are implemented in a single program, in contrast to standard commercial software which divides the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps at predetermined control parameters. The paper presents practical results of fully automatic generation of orthomosaics both for images obtained by a metric Vexell camera and for a block of images acquired by a non-metric UAV system.
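
    The kind of batch script the article refers to can be as short as the sketch below, which runs the whole chain from image matching to orthomosaic headless through PhotoScan's Python API. The call and constant names follow the PhotoScan Pro 1.x API and may differ between versions (the successor product, Agisoft Metashape, renames several of them); paths and quality settings are illustrative.

```python
# Headless processing chain using the PhotoScan Pro 1.x Python API (names version-dependent).
import glob
import PhotoScan

doc = PhotoScan.Document()
chunk = doc.addChunk()
chunk.addPhotos(glob.glob("/data/uav_block/*.JPG"))     # hypothetical image folder

chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy)      # tie-point matching (incl. self-calibration)
chunk.alignCameras()                                    # bundle block adjustment
chunk.buildDenseCloud(quality=PhotoScan.MediumQuality)  # dense point cloud
chunk.buildDem(source=PhotoScan.DenseCloudData)         # DSM from the dense cloud
chunk.buildOrthomosaic(surface=PhotoScan.ElevationData) # final orthomosaic

doc.save("/data/uav_block/project.psx")
```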

  14. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  15. Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cheung, Iris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Li, Becky Z [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion on residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include the costs to the building or facility manager as well as to the utility or third-party program manager.

  16. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    A new approach has been developed for the semiconductor industry; however, as with most new technologies, its applicability extends to many other areas as well, including environmental, pharmaceutical, clinical and industrial chemical processing. This instrumental system represents a fundamentally new approach. Sample preparation has been integrated as a key system element to enable automation of the instrument system. It has long been believed that an automated, fully integrated system was not feasible if a powerful MS system were included. This application demonstrates one of the first fully automated and integrated sample preparation and mass spectrometric instrumental analysis systems applied to practical use. The system is also a broad and ambitious mass-based analyzer, capable not only of elemental analysis but also of direct speciated analysis. The complete analytical suite covering inorganic, organic, organo-metallic and speciated analytes is being applied for critical contamination control of semiconductor processes. As with new paradigm technologies, it will now extend from its current use into other applications needing real-time, fully automated multi-component analysis. Refs. 4 (author)

  17. JPLEX: Java Simplex Implementation with Branch-and-Bound Search for Automated Test Assembly

    Science.gov (United States)

    Park, Ryoungsun; Kim, Jiseon; Dodd, Barbara G.; Chung, Hyewon

    2011-01-01

    JPLEX, short for Java simPLEX, is an automated test assembly (ATA) program. It is a mixed integer linear programming (MILP) solver written in Java. It reads in a configuration file, solves the minimization problem, and produces an output file for postprocessing. It implements the simplex algorithm to create a fully relaxed solution and…
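
    The combination of simplex and branch-and-bound can be illustrated with the toy sketch below: the LP relaxation is solved with a simplex-based solver and the search branches on fractional variables until an all-integer item selection is found. scipy's linprog stands in for the Java simplex implementation, and the tiny "test assembly" instance (minimise total length subject to a minimum information target) is invented for illustration.

```python
# Toy branch-and-bound around an LP relaxation, illustrating the MILP/ATA idea.
import numpy as np
from scipy.optimize import linprog

def branch_and_bound(c, A_ub, b_ub, bounds, best=(np.inf, None)):
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    if not res.success or res.fun >= best[0]:
        return best                              # infeasible or dominated branch
    frac = [i for i, v in enumerate(res.x) if abs(v - round(v)) > 1e-6]
    if not frac:
        return (res.fun, res.x)                  # all-integer solution found
    i = frac[0]                                  # branch on the first fractional variable
    lo, hi = bounds[i]
    floor_v = np.floor(res.x[i])
    for new in ((lo, floor_v), (floor_v + 1, hi)):
        if new[0] > new[1]:
            continue                             # empty branch
        b2 = list(bounds)
        b2[i] = new
        best = branch_and_bound(c, A_ub, b_ub, b2, best)
    return best

# 5 candidate items: minimise total length, require information >= 3.0, pick 0/1 of each
length = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
info = np.array([0.9, 1.2, 0.7, 1.5, 0.4])
obj, x = branch_and_bound(length, A_ub=[-info], b_ub=[-3.0], bounds=[(0, 1)] * 5)
print("items selected:", np.round(x).astype(int), "test length:", obj)
```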

  18. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling

    Directory of Open Access Journals (Sweden)

    Sergi Valverde

    2015-01-01

    Full Text Available Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w Multiple Sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of the effect of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the % of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. First of all, images were processed using the LST and SLS toolkits with different pipeline combinations that differed in either automated or manual lesion segmentation, and lesion filling or masking out lesions. Then, images processed following each of the pipelines were segmented into gray matter (GM) and white matter (WM) using SPM8, and compared with the same images where expert lesion annotations were filled before segmentation. Our results showed that fully automated lesion segmentation and filling pipelines significantly reduced the % of error in GM and WM volume on images of MS patients, and performed similarly to the images where expert lesion annotations were masked before segmentation. In all the pipelines, the amount of misclassified lesion voxels was the main cause of the observed error in GM and WM volume. However, the % of error was significantly lower when automatically estimated lesions were filled and not masked before segmentation. These results are relevant and suggest that the LST and SLS toolboxes allow accurate brain tissue volume measurements without any kind of manual intervention, which can be convenient not only in terms of time and economic costs, but also to avoid the inherent intra/inter-rater variability between manual annotations.
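
    The evaluation metric itself is simple; the sketch below shows the signed % volume error of an automated pipeline against the segmentation obtained when expert lesion annotations are filled before running SPM. The numbers are invented and the inputs are assumed to be tissue volumes already computed from the label maps.

```python
# Signed % volume error of an automated pipeline vs. the expert-filled reference.
def percent_volume_error(vol_pipeline: float, vol_reference: float) -> float:
    """Signed % difference of a tissue volume w.r.t. the expert-filled reference."""
    return 100.0 * (vol_pipeline - vol_reference) / vol_reference

# e.g. GM and WM volumes (in mL) from one subject, automated vs. expert-filled
gm_err = percent_volume_error(612.4, 618.9)   # invented numbers
wm_err = percent_volume_error(478.1, 474.3)
```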

  19. Automation of 3D cell culture using chemically defined hydrogels.

    Science.gov (United States)

    Rimann, Markus; Angres, Brigitte; Patocchi-Tenzer, Isabel; Braum, Susanne; Graf-Hausner, Ursula

    2014-04-01

    Drug development relies on high-throughput screening involving cell-based assays. Most of the assays are still based on cells grown in monolayer rather than in three-dimensional (3D) formats, although cells behave more in vivo-like in 3D. To exemplify the adoption of 3D techniques in drug development, this project investigated the automation of a hydrogel-based 3D cell culture system using a liquid-handling robot. The hydrogel technology used offers high flexibility of gel design due to a modular composition of a polymer network and bioactive components. The cell inert degradation of the gel at the end of the culture period guaranteed the harmless isolation of live cells for further downstream processing. Human colon carcinoma cells HCT-116 were encapsulated and grown in these dextran-based hydrogels, thereby forming 3D multicellular spheroids. Viability and DNA content of the cells were shown to be similar in automated and manually produced hydrogels. Furthermore, cell treatment with toxic Taxol concentrations (100 nM) had the same effect on HCT-116 cell viability in manually and automated hydrogel preparations. Finally, a fully automated dose-response curve with the reference compound Taxol showed the potential of this hydrogel-based 3D cell culture system in advanced drug development.

  20. Automated quantitative assessment of proteins' biological function in protein knowledge bases.

    Science.gov (United States)

    Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter

    2008-01-01

    Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In lieu of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report about a computer-assisted method, which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows the comparison of proteins with respect to their sequence yet highlights the comprehension of functional data. pfs analysis may be also applied for quality control of individual entries or for database management in order to rank entry listings.

  1. Automated Quantitative Assessment of Proteins' Biological Function in Protein Knowledge Bases

    Directory of Open Access Journals (Sweden)

    Gabriele Mayr

    2008-01-01

    Full Text Available Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In lieu of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report about a computer-assisted method, which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows the comparison of proteins with respect to their sequence yet highlights the comprehension of functional data. pfs analysis may be also applied for quality control of individual entries or for database management in order to rank entry listings.

  2. Logistics support economy and efficiency through consolidation and automation

    Science.gov (United States)

    Savage, G. R.; Fontana, C. J.; Custer, J. D.

    1985-01-01

    An integrated logistics support system, which would provide routine access to space and be cost-competitive as an operational space transportation system, was planned and implemented to support the NSTS program launch-on-time goal of 95 percent. A decision was made to centralize the Shuttle logistics functions in a modern facility that would provide office and training space and an efficient warehouse area. In this warehouse, the emphasis is on automation of the storage and retrieval function, while utilizing state-of-the-art warehousing and inventory management technology. This consolidation, together with the automation capabilities being provided, will allow for more effective utilization of personnel and improved responsiveness. In addition, this facility will be the prime support for the fully integrated logistics support of the operations era NSTS and reduce the program's management, procurement, transportation, and supply costs in the operations era.

  3. PFP: Automated prediction of gene ontology functional annotations with confidence scores using protein sequence data.

    Science.gov (United States)

    Hawkins, Troy; Chitale, Meghana; Luban, Stanislav; Kihara, Daisuke

    2009-02-15

    Protein function prediction is a central problem in bioinformatics, increasing in importance recently due to the rapid accumulation of biological data awaiting interpretation. Sequence data represents the bulk of this new stock and is the obvious target for consideration as input, as newly sequenced organisms often lack any other type of biological characterization. We have previously introduced PFP (Protein Function Prediction) as our sequence-based predictor of Gene Ontology (GO) functional terms. PFP interprets the results of a PSI-BLAST search by extracting and scoring individual functional attributes, searching a wide range of E-value sequence matches, and utilizing conventional data mining techniques to fill in missing information. We have shown it to be effective in predicting both specific and low-resolution functional attributes when sufficient data is unavailable. Here we describe (1) significant improvements to the PFP infrastructure, including the addition of prediction significance and confidence scores, (2) a thorough benchmark of performance and comparisons to other related prediction methods, and (3) applications of PFP predictions to genome-scale data. We applied PFP predictions to uncharacterized protein sequences from 15 organisms. Among these sequences, 60-90% could be annotated with a GO molecular function term at high confidence (≥80%). We also applied our predictions to the protein-protein interaction network of the Malaria plasmodium (Plasmodium falciparum). High confidence GO biological process predictions (≥90%) from PFP increased the number of fully enriched interactions in this dataset from 23% of interactions to 94%. Our benchmark comparison shows significant performance improvement of PFP relative to GOtcha, InterProScan, and PSI-BLAST predictions. This is consistent with the performance of PFP as the overall best predictor in both the AFP-SIG '05 and CASP7 function (FN) assessments. PFP is available as a web service at http
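
    The core scoring idea can be sketched as follows: every GO term annotated to a PSI-BLAST hit accumulates a score that grows with the hit's significance, and terms are ranked by their total. The weighting constant and data layout below are simplified stand-ins for the published scheme, which additionally propagates scores through term-to-term associations and attaches confidence estimates.

```python
# Simplified GO-term scoring from PSI-BLAST hits (not the published PFP formula).
import math
from collections import defaultdict

def score_go_terms(hits, max_evalue=10.0):
    """hits: iterable of (evalue, [GO terms annotated to the hit sequence]).
    Returns {GO term: score}; higher = more strongly supported."""
    scores = defaultdict(float)
    for evalue, go_terms in hits:
        if evalue > max_evalue:
            continue
        # strong (small E-value) hits contribute more; weak hits contribute nothing
        weight = max(0.0, -math.log10(max(evalue, 1e-180)))
        for term in go_terms:
            scores[term] += weight
    return dict(scores)

ranked = sorted(score_go_terms([
    (1e-40, ["GO:0004672", "GO:0005524"]),        # invented example hits
    (2e-5,  ["GO:0005524"]),
]).items(), key=lambda kv: -kv[1])
```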

  4. Using microwave Doppler radar in automated manufacturing applications

    Science.gov (United States)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help

  5. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    Science.gov (United States)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P 0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  6. Reduced dimensionality (3,2)D NMR experiments and their automated analysis: implications to high-throughput structural studies on proteins.

    Science.gov (United States)

    Reddy, Jithender G; Kumar, Dinesh; Hosur, Ramakrishna V

    2015-02-01

    Protein NMR spectroscopy has expanded dramatically over the last decade into a powerful tool for the study of their structure, dynamics, and interactions. The primary requirement for all such investigations is sequence-specific resonance assignment. The demand now is to obtain this information as rapidly as possible and in all types of protein systems, stable/unstable, soluble/insoluble, small/big, structured/unstructured, and so on. In this context, we introduce here two reduced dimensionality experiments – (3,2)D-hNCOcanH and (3,2)D-hNcoCAnH – which enhance the previously described 2D NMR-based assignment methods quite significantly. Both the experiments can be recorded in just about 2-3 h each and hence would be of immense value for high-throughput structural proteomics and drug discovery research. The applicability of the method has been demonstrated using alpha-helical bovine apo calbindin-D9k P43M mutant (75 aa) protein. Automated assignment of this data using AUTOBA has been presented, which enhances the utility of these experiments. The backbone resonance assignments so derived are utilized to estimate secondary structures and the backbone fold using Web-based algorithms. Taken together, we believe that the method and the protocol proposed here can be used for routine high-throughput structural studies of proteins. Copyright © 2014 John Wiley & Sons, Ltd.

  7. Evaluation of a capillary zone electrophoresis system versus a conventional agarose gel system for routine serum protein separation and monoclonal component typing.

    Science.gov (United States)

    Roudiere, L; Boularan, A M; Bonardet, A; Vallat, C; Cristol, J P; Dupuy, A M

    2006-01-01

    Capillary zone electrophoresis of serum proteins is increasingly gaining impact in clinical laboratories. During 2003, we compared the fully automated capillary electrophoresis (CE) system from Beckman (Paragon CZE 2000) with the agarose gel electrophoresis method from Sebia (Hydrasis-Hyris, AGE). This new study focused on the evaluation of analytical performance and a comparison including 115 fresh routine samples (group A) and a series of 97 frozen pathologic sera with suspicion of monoclonal protein (group B). Coefficients of variation (CVs %) for the five classical protein fractions were consistently low. Of the frozen pathologic serum samples (group B), there were 90 in which we detected a monoclonal protein by immunofixation (IF) (immunosubtraction (IS) was not used). AGE and Paragon 2000 failed to detect 7 and 12 monoclonal proteins, respectively, leading to a concordance of 92% for AGE and 87% for Paragon 2000 for identifying electrophoretic abnormalities in this group. Beta-globulin abnormalities and M paraprotein were well detected with Paragon 2000. Only 81% (21 vs 26) of the gammopathies were immunotyped with IS by two readers blinded to the IF immunotype. The Paragon 2000 is a reliable alternative to conventional agarose gel electrophoresis, combining the advantages of full automation (rapidity, ease of use and cost) with high analytical performance. Qualified interpretation of results requires an adaptation period, which could further improve concordance between the methods. Recently, this CE system has been improved by the manufacturer (Beckman) concerning the migration buffer and detection of beta-globulin abnormalities.

  8. Automated calibration system for a high-precision measurement of neutrino mixing angle θ{sub 13} with the Daya Bay antineutrino detectors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, J., E-mail: jianglai.liu@sjtu.edu.cn [Department of Physics, Shanghai Jiao Tong University, Shanghai (China); Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Cai, B.; Carr, R. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Dwyer, D.A. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Gu, W.Q.; Li, G.S. [Department of Physics, Shanghai Jiao Tong University, Shanghai (China); Qian, X. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Brookhaven National Laboratory, Upton, NY (United States); McKeown, R.D. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Department of Physics, College of William and Mary, Williamsburg, VA (United States); Tsang, R.H.M. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Wang, W. [Department of Physics, College of William and Mary, Williamsburg, VA (United States); Wu, F.F. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Zhang, C. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Brookhaven National Laboratory, Upton, NY (United States)

    2014-06-01

    We describe the automated calibration system for the antineutrino detectors in the Daya Bay Neutrino Experiment. This system consists of 24 identical units instrumented on 8 identical 20-ton liquid scintillator detectors. Each unit is a fully automated robotic system capable of deploying an LED and various radioactive sources into the detector along given vertical axes. Selected results from performance studies of the calibration system are reported.

  9. Synthesis of chlorophyll-binding proteins in a fully-segregated ∆ycf54 strain of the cyanobacterium Synechocystis PCC 6803

    Directory of Open Access Journals (Sweden)

    Sarah eHollingshead

    2016-03-01

    Full Text Available In the chlorophyll (Chl) biosynthesis pathway the formation of protochlorophyllide is catalyzed by Mg-protoporphyrin IX methyl ester (MgPME) cyclase. The Ycf54 protein was recently shown to form a complex with another component of the oxidative cyclase, Sll1214 (CycI), and partial inactivation of the ycf54 gene leads to Chl deficiency in cyanobacteria and plants. The exact function of Ycf54 is not known, however, and further progress depends on construction and characterisation of a mutant cyanobacterial strain with a fully inactivated ycf54 gene. Here, we report the complete deletion of the ycf54 gene in the cyanobacterium Synechocystis 6803; the resulting ycf54 strain accumulates huge concentrations of the cyclase substrate MgPME together with another pigment, which we identified using nuclear magnetic resonance as 3-formyl MgPME. The detection of a small amount (~13%) of Chl in the ycf54 mutant provides clear evidence that the Ycf54 protein is important, but not essential, for activity of the oxidative cyclase. The greatly reduced formation of protochlorophyllide in the ycf54 strain provided an opportunity to use 35S protein labelling combined with 2D electrophoresis to examine the synthesis of all known Chl-binding protein complexes under drastically restricted de novo Chl biosynthesis. We show that although the ycf54 strain synthesizes very limited amounts of photosystem I and the CP47 and CP43 subunits of photosystem II (PSII), the synthesis of PSII D1 and D2 subunits and their assembly into the reaction centre (RCII) assembly intermediate were not affected. Furthermore, the levels of other Chl complexes such as cytochrome b6f and the HliD–Chl synthase remained comparable to wild-type. These data demonstrate that the requirement for de novo Chl molecules differs completely for each Chl-binding protein. Chl traffic and recycling in the cyanobacterial cell as well as the function of Ycf54 are discussed.

  10. Automated Protein Structure Modeling with SWISS-MODEL Workspace and the Protein Model Portal

    OpenAIRE

    Bordoli, Lorenza; Schwede, Torsten

    2012-01-01

    Comparative protein structure modeling is a computational approach to build three-dimensional structural models for proteins using experimental structures of related protein family members as templates. Regular blind assessments of modeling accuracy have demonstrated that comparative protein structure modeling is currently the most reliable technique to model protein structures. Homology models are often sufficiently accurate to substitute for experimental structures in a wide variety of appl...

  11. Automated Single Cell Data Decontamination Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Tennessen, Kristin [Lawrence Berkeley National Lab. (LBNL), Walnut Creek, CA (United States). Dept. of Energy Joint Genome Inst.; Pati, Amrita [Lawrence Berkeley National Lab. (LBNL), Walnut Creek, CA (United States). Dept. of Energy Joint Genome Inst.

    2014-03-21

    Recent technological advancements in single-cell genomics have encouraged the classification and functional assessment of microorganisms from a wide span of the biosphere's phylogeny.1,2 Environmental processes of interest to the DOE, such as bioremediation and carbon cycling, can be elucidated through the genomic lens of these unculturable microbes. However, contamination can occur at various stages of the single-cell sequencing process. Contaminated data can lead to wasted time and effort on meaningless analyses, inaccurate or erroneous conclusions, and pollution of public databases. A fully automated decontamination tool is necessary to prevent these instances and increase the throughput of the single-cell sequencing process.

  12. Automation of one-loop QCD corrections

    CERN Document Server

    Hirschi, Valentin; Frixione, Stefano; Garzelli, Maria Vittoria; Maltoni, Fabio; Pittau, Roberto

    2011-01-01

    We present the complete automation of the computation of one-loop QCD corrections, including UV renormalization, to an arbitrary scattering process in the Standard Model. This is achieved by embedding the OPP integrand reduction technique, as implemented in CutTools, into the MadGraph framework. By interfacing the tool so constructed, which we dub MadLoop, with MadFKS, the fully automatic computation of any infrared-safe observable at the next-to-leading order in QCD is attained. We demonstrate the flexibility and the reach of our method by calculating the production rates for a variety of processes at the 7 TeV LHC.

  13. Office automation: The administrative window into the integrated DBMS

    Science.gov (United States)

    Brock, G. H.

    1985-01-01

    In parallel to the evolution of Management Information Systems from simple data files to complex data bases, stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements. Most large DBMS development organizations possess three- to five-year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system manned by a team of facilitators seeking opportunities to serve end-users could go a long way toward defining a DBMS that serves management. This paper will briefly discuss the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the Manager's Management Information System.

  14. Automated cell type discovery and classification through knowledge transfer

    Science.gov (United States)

    Lee, Hao-Chih; Kosoy, Roman; Becker, Christine E.

    2017-01-01

    Abstract Motivation: Recent advances in mass cytometry allow simultaneous measurements of up to 50 markers at single-cell resolution. However, the high dimensionality of mass cytometry data introduces computational challenges for automated data analysis and hinders translation of new biological understanding into clinical applications. Previous studies have applied machine learning to facilitate processing of mass cytometry data. However, manual inspection is still inevitable and is becoming the barrier to reliable large-scale analysis. Results: We present a new algorithm called Automated Cell-type Discovery and Classification (ACDC) that fully automates the classification of canonical cell populations and highlights novel cell types in mass cytometry data. Evaluations on real-world data show ACDC provides accurate and reliable estimations compared to manual gating results. Additionally, ACDC automatically classifies previously ambiguous cell types to facilitate discovery. Our findings suggest that ACDC substantially improves both reliability and interpretability of results obtained from high-dimensional mass cytometry profiling data. Availability and Implementation: A Python package (Python 3) and analysis scripts for reproducing the results are available at https://bitbucket.org/dudleylab/acdc. Contact: brian.kidd@mssm.edu or joel.dudley@mssm.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158442

  15. A robotics platform for automated batch fabrication of high density, microfluidics-based DNA microarrays, with applications to single cell, multiplex assays of secreted proteins

    Science.gov (United States)

    Ahmad, Habib; Sutherland, Alex; Shin, Young Shik; Hwang, Kiwook; Qin, Lidong; Krom, Russell-John; Heath, James R.

    2011-09-01

    Microfluidics flow-patterning has been utilized for the construction of chip-scale miniaturized DNA and protein barcode arrays. Such arrays have been used for specific clinical and fundamental investigations in which many proteins are assayed from single cells or other small sample sizes. However, flow-patterned arrays are hand-prepared, and so are impractical for broad applications. We describe an integrated robotics/microfluidics platform for the automated preparation of such arrays, and we apply it to the batch fabrication of up to eighteen chips of flow-patterned DNA barcodes. The resulting substrates are comparable in quality with hand-made arrays and exhibit excellent substrate-to-substrate consistency. We demonstrate the utility and reproducibility of robotics-patterned barcodes by utilizing two flow-patterned chips for highly parallel assays of a panel of secreted proteins from single macrophage cells.

  17. Automated measurement of cell motility and proliferation

    Directory of Open Access Journals (Sweden)

    Goff Julie

    2005-04-01

    Abstract Background Time-lapse microscopic imaging provides a powerful approach for following changes in cell phenotype over time. Visible responses of whole cells can yield insight into functional changes that underlie physiological processes in health and disease. For example, features of cell motility accompany molecular changes that are central to the immune response, to carcinogenesis and metastasis, to wound healing and tissue regeneration, and to the myriad developmental processes that generate an organism. Previously reported image processing methods for motility analysis required custom viewing devices and manual interactions that may introduce bias, slow throughput, and constrain the scope of experiments in terms of the number of treatment variables, time period of observation, replication and statistical options. Here we describe a fully automated system in which images are acquired 24/7 from 384-well plates and are automatically processed to yield high-content motility and morphological data. Results We have applied this technology to study the effects of different extracellular matrix compounds on human osteoblast-like cell lines to explore functional changes that may underlie processes involved in bone formation and maintenance. We show dose-response and kinetic data for induction of increased motility by laminin and collagen type I without significant effects on growth rate. Differential motility response was evident within 4 hours of plating cells; long-term responses differed depending upon cell type and surface coating. Average velocities were increased by approximately 0.1 µm/min by ten-fold increases in laminin coating concentration in some cases. Comparison with manual tracking demonstrated the accuracy of the automated method and highlighted the comparative imprecision of human tracking for analysis of cell motility data. Quality statistics are reported that associate with stage noise, interference by non
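
    One representative quantity produced by such automated tracking, the average velocity of a cell over an observation window, can be computed from tracked centroid positions as in the minimal sketch below. The frame interval, pixel calibration and toy track are hypothetical and are not taken from the study.

import numpy as np

def average_velocity(track, frame_interval_min, px_to_um=1.0):
    """Mean speed of one cell track in µm/min.

    track: (n_frames, 2) sequence of centroid (x, y) positions in pixels.
    frame_interval_min: minutes between consecutive frames (hypothetical value below).
    px_to_um: microscope calibration, microns per pixel (hypothetical).
    """
    track = np.asarray(track, dtype=float)
    steps = np.diff(track, axis=0)                               # displacement per frame, pixels
    step_lengths = np.hypot(steps[:, 0], steps[:, 1]) * px_to_um  # path length per frame, µm
    return step_lengths.sum() / (frame_interval_min * len(step_lengths))

# Toy track: a cell drifting ~1.4 px per 15-min frame at 0.65 µm/px calibration.
track = [(10, 10), (11, 11), (12, 12), (13, 13)]
print(f"{average_velocity(track, frame_interval_min=15, px_to_um=0.65):.3f} um/min")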

  18. i3Drefine software for protein 3D structure refinement and its assessment in CASP10.

    Science.gov (United States)

    Bhattacharya, Debswapna; Cheng, Jianlin

    2013-01-01

    Protein structure refinement refers to the process of improving the quality of protein structures during structure modeling to bring them closer to their native states. Structure refinement has been drawing increasing attention in the community-wide Critical Assessment of techniques for Protein Structure prediction (CASP) experiments since its introduction in the 8th CASP experiment. During the 9th and the recently concluded 10th CASP experiments, a consistent growth in the number of refinement targets and participating groups has been witnessed. Yet protein structure refinement remains a largely unsolved problem, with the majority of participating groups in the CASP refinement category failing to consistently improve the quality of the structures issued for refinement. To address this need, we developed a completely automated and computationally efficient protein 3D structure refinement method, i3Drefine, based on an iterative and highly convergent energy minimization algorithm with a powerful all-atom composite physics- and knowledge-based force field and a hydrogen bonding (HB) network optimization technique. In the recent community-wide blind experiment, CASP10, i3Drefine (as 'MULTICOM-CONSTRUCT') was ranked as the best method in the server section as per the official assessment of the CASP10 experiment. Here we provide the community with free access to the i3Drefine software and systematically analyse the performance of i3Drefine in strict blind mode on the refinement targets issued in the CASP10 refinement category, and compare it with other state-of-the-art refinement methods participating in CASP10. Our analysis demonstrates that i3Drefine is the only fully automated server participating in CASP10 that exhibited consistent improvement over the initial structures in both global and local structural quality metrics. An executable version of i3Drefine is freely available at http://protein.rnet.missouri.edu/i3drefine/. PMID:23894517
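
    The overall strategy described above, iterative minimization of a composite energy until the improvement per cycle becomes negligible while restraining the model to its starting coordinates, can be sketched as follows. The energy terms, weights and step rule are placeholders for illustration only; they are not the i3Drefine force field or its HB network optimization.

import numpy as np

def composite_energy(coords, start, bonds, w_bond=1.0, w_restraint=0.1, ideal_len=3.8):
    """Toy composite energy: harmonic pseudo-bonds between consecutive CA atoms
    plus a restraint that keeps the model near the starting structure."""
    energy = 0.0
    for i, j in bonds:
        d = np.linalg.norm(coords[i] - coords[j])
        energy += w_bond * (d - ideal_len) ** 2
    energy += w_restraint * np.sum((coords - start) ** 2)
    return energy

def refine(start, bonds, max_steps=500, lr=1e-2, tol=1e-9, eps=1e-5):
    """Iterative minimization with a forward-difference gradient; stops when the
    energy improvement per step drops below tol (a simple convergence criterion)."""
    coords = start.astype(float).copy()
    prev = composite_energy(coords, start, bonds)
    for _ in range(max_steps):
        grad = np.zeros_like(coords)
        for idx in np.ndindex(*coords.shape):
            trial = coords.copy()
            trial[idx] += eps
            grad[idx] = (composite_energy(trial, start, bonds) - prev) / eps
        coords = coords - lr * grad
        current = composite_energy(coords, start, bonds)
        if prev - current < tol:
            break
        prev = current
    return coords

# Toy "model": three CA atoms with slightly distorted pseudo-bond geometry.
start = np.array([[0.0, 0.0, 0.0], [4.2, 0.0, 0.0], [8.1, 0.5, 0.0]])
refined = refine(start, bonds=[(0, 1), (1, 2)])
print(np.linalg.norm(refined[1] - refined[0]))  # moves toward the ideal 3.8 Angstrom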

  20. Cosima B2B - Sales Automation for E-Procurement

    OpenAIRE

    Kießling, Werner (Prof. Dr.); Fischer, Stefan; Döring, Sven

    2006-01-01

    E-procurement is one of the fastest growing application areas for e-commerce. Although B2B transaction costs have recently been reduced by establishing XML-based standards for electronic product catalogs and data interchange, B2B sales costs remain high due to the amount of human interaction required. For the first time we present a fully automated electronic sales agent for e-procurement portals. The key technologies for this breakthrough are based on preferences modeled as strict partial orders, ena...
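
    The preference machinery alluded to here, preferences modeled as strict partial orders, can be illustrated with a minimal sketch: a Pareto composition of two attribute preferences and a "best matches only" selection that returns the undominated offers. The attributes and catalog are hypothetical and do not reflect the Cosima B2B implementation.

from dataclasses import dataclass

@dataclass(frozen=True)
class Offer:
    supplier: str
    price: float        # lower is better
    delivery_days: int  # lower is better

def dominates(a: Offer, b: Offer) -> bool:
    """Strict partial order: a is better than b iff it is at least as good on
    both attributes and strictly better on at least one (Pareto preference)."""
    return (a.price <= b.price and a.delivery_days <= b.delivery_days
            and (a.price < b.price or a.delivery_days < b.delivery_days))

def best_matches_only(offers):
    """Return the maximal (undominated) offers under the partial order."""
    return [o for o in offers if not any(dominates(p, o) for p in offers)]

catalog = [
    Offer("A", 100.0, 5),
    Offer("B", 120.0, 2),
    Offer("C", 110.0, 6),  # dominated by A on both price and delivery time
]
print(best_matches_only(catalog))  # A and B remain; C is filtered out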