WorldWideScience

Sample records for automated high-throughput cultivations

  1. Robo-Lector – a novel platform for automated high-throughput cultivations in microtiter plates with high information content

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-08-01

    Full Text Available Abstract Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' allowed cultures with different growth kinetics in a microtiter plate to be induced automatically at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only ± 7%. The third method 'biomass-specific replication' enabled equal initial biomass concentrations to be generated in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to respective wells of the main
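
    A minimal sketch of the decision logic behind the 'biomass-specific induction' method described above: each well is induced once its online biomass signal crosses a common threshold. The threshold, volume and dispensing callback are hypothetical illustrations, not the published Robo-Lector implementation.

```python
# Hedged sketch: induce each well at the same biomass level.
# Threshold, volume and the dispense_iptg callback are assumptions.
BIOMASS_THRESHOLD = 10.0   # scattered-light units (illustrative)
IPTG_VOLUME_UL = 5.0       # inducer volume added per well (illustrative)

def check_and_induce(biomass_readings, induced_wells, dispense_iptg):
    """biomass_readings: dict well_id -> latest online biomass signal.
    induced_wells: set of wells already induced.
    dispense_iptg: callback instructing the liquid-handling robot."""
    for well, signal in biomass_readings.items():
        if well not in induced_wells and signal >= BIOMASS_THRESHOLD:
            dispense_iptg(well, IPTG_VOLUME_UL)  # induce this well only
            induced_wells.add(well)
    return induced_wells
```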

  2. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary widely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  3. Automated growth rate determination in high-throughput microbioreactor systems.

    Science.gov (United States)

    Hemmerich, Johannes; Wiechert, Wolfgang; Oldiges, Marco

    2017-11-25

    The calculation of growth rates provides a basic metric for biological fitness and is a standard task when using microbioreactors (MBRs) in microbial phenotyping. MBRs easily produce large amounts of data from parallelized high-throughput cultivations with online monitoring of biomass formation at high temporal resolution. The resulting high-density data need to be processed efficiently to accelerate experimental throughput. A MATLAB code is presented that detects the exponential growth phase from multiple microbial cultivations in an iterative procedure based on several criteria, according to the model of exponential growth. Test data were obtained with Corynebacterium glutamicum, showing a single exponential growth phase, and Escherichia coli, exhibiting diauxic growth with an exponential phase followed by retarded growth. The procedure reproducibly detects the correct biomass data subset for growth rate calculation. The procedure was applied to a data set derived from growth phenotyping of a library of genome-reduced C. glutamicum strains, and the results agree with previously reported results for which manual effort was needed to pre-process the data. Thus, the automated and standardized method enables a fair comparison of strain mutants for biological fitness evaluation. The code is easily parallelized and greatly facilitates experimental throughput in biological fitness testing from strain screenings conducted with MBR systems.
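
    The authors provide MATLAB code; the following is a hedged Python sketch of the underlying idea only: on a log scale exponential growth is linear, so the growth rate μ is the slope of ln(biomass) versus time over the subset of points still consistent with a straight line. The R² threshold and trimming rule are illustrative assumptions, not the published criteria.

```python
import numpy as np

def growth_rate(t, x, r2_min=0.99):
    """Estimate mu from time points t and biomass signal x by iteratively
    shrinking the window until ln(x) vs t is sufficiently linear."""
    t = np.asarray(t, float)
    y = np.log(np.asarray(x, float))
    start, end = 0, len(t)
    while end - start > 3:
        slope, intercept = np.polyfit(t[start:end], y[start:end], 1)
        fit = slope * t[start:end] + intercept
        ss_res = np.sum((y[start:end] - fit) ** 2)
        ss_tot = np.sum((y[start:end] - y[start:end].mean()) ** 2)
        if ss_tot > 0 and 1.0 - ss_res / ss_tot >= r2_min:
            return slope, (start, end)          # mu and the data subset used
        # drop a boundary point on the side of the worst residual (assumed rule)
        worst = np.argmax(np.abs(y[start:end] - fit))
        if worst < (end - start) / 2:
            start += 1
        else:
            end -= 1
    return float("nan"), (start, end)
```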

  4. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on the lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, establishing it as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system built from network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
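
    In the simplest reading of the abstract, quantifying rheotaxis from the detected fish orientations amounts to counting how many fish point against the flow. The sketch below is an assumed simplification of that step; the angle convention and tolerance are hypothetical, not the published analysis.

```python
import numpy as np

def rheotaxis_fraction(fish_headings_deg, flow_direction_deg, tolerance_deg=45.0):
    """Fraction of detected fish oriented head-to-current, i.e. whose heading
    points against the flow direction within a tolerance (angles in degrees)."""
    upstream = (flow_direction_deg + 180.0) % 360.0
    diff = (np.asarray(fish_headings_deg, float) - upstream + 180.0) % 360.0 - 180.0
    return float(np.mean(np.abs(diff) <= tolerance_deg))
```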

  5. High-Throughput Automation in Chemical Process Development.

    Science.gov (United States)

    Selekman, Joshua A; Qiu, Jun; Tran, Kristy; Stevens, Jason; Rosso, Victor; Simmons, Eric; Xiao, Yi; Janey, Jacob

    2017-06-07

    High-throughput (HT) techniques built upon laboratory automation technology and coupled to statistical experimental design and parallel experimentation have enabled the acceleration of chemical process development across multiple industries. HT technologies are applied to interrogate wide, often multidimensional experimental spaces to inform the design and optimization of any number of unit operations that chemical engineers use in process development. In this review, we outline the evolution of HT technology and provide a comprehensive overview of how HT automation is used throughout different industries, with a particular focus on chemical and pharmaceutical process development. In addition, we highlight common strategies for incorporating HT automation into routine development activities to maximize its impact in various academic and industrial settings.

  6. A fully automated high-throughput training system for rodents.

    Directory of Open Access Journals (Sweden)

    Rajesh Poddar

    Full Text Available Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal's home cage, our system dramatically reduces the effort involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting, this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors.

  7. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system need not be the one with the highest product titers, but the one resulting in superior overall process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were subsequently validated. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  8. Automated high-throughput behavioral analyses in zebrafish larvae.

    Science.gov (United States)

    Richendrfer, Holly; Créton, Robbert

    2013-07-04

    We have created a novel high-throughput imaging system for the analysis of behavior in 7-day-old zebrafish larvae in multi-lane plates. This system measures spontaneous behaviors and the response to an aversive stimulus, which is shown to the larvae via a PowerPoint presentation. The recorded images are analyzed with an ImageJ macro, which automatically splits the color channels, subtracts the background, and applies a threshold to identify the placement of individual larvae in the lanes. We can then import the coordinates into an Excel sheet to quantify swim speed, preference for the edge or side of the lane, resting behavior, thigmotaxis, distance between larvae, and avoidance behavior. Subtle changes in behavior are easily detected using our system, making it useful for behavioral analyses after exposure to environmental toxicants or pharmaceuticals.
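
    From the per-frame coordinates exported by such a macro, simple behavioral metrics can be computed downstream. The sketch below shows two of the quantities named in the abstract (swim speed and an edge-preference/thigmotaxis proxy) under assumed units and lane geometry; it is not the authors' Excel workflow.

```python
import numpy as np

def behavior_metrics(xy, frame_interval_s, lane_width):
    """xy: (n_frames, 2) coordinates of one larva in the same length units as
    lane_width. Returns mean swim speed and the fraction of time spent within
    20% of a long lane edge (a simple thigmotaxis proxy; layout is assumed)."""
    xy = np.asarray(xy, float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    speed = step.mean() / frame_interval_s
    dist_to_edge = np.minimum(xy[:, 1], lane_width - xy[:, 1])
    edge_fraction = float(np.mean(dist_to_edge < 0.2 * lane_width))
    return speed, edge_fraction
```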

  9. An industrial engineering approach to laboratory automation for high throughput screening

    Science.gov (United States)

    Menke, Karl C.

    2000-01-01

    Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation. PMID:18924701

  10. An industrial engineering approach to laboratory automation for high throughput screening

    OpenAIRE

    Menke, Karl C.

    2000-01-01

    Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation.

  11. High Throughput Light Absorber Discovery, Part 1: An Algorithm for Automated Tauc Analysis.

    Science.gov (United States)

    Suram, Santosh K; Newhouse, Paul F; Gregoire, John M

    2016-11-14

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high-throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high-throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. The applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
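
    In Tauc analysis, (αhν)^n is plotted against photon energy hν (n = 2 for direct-allowed, n = 1/2 for indirect-allowed transitions) and the linear rise is extrapolated to zero; its intercept with the energy axis estimates the band gap. The sketch below uses a crude steepest-sliding-window heuristic in place of the paper's expert-mimicking criteria, so treat it as an assumption-laden illustration.

```python
import numpy as np

def tauc_band_gap(energy_ev, alpha, n=2.0, window=15):
    """Estimate a band gap (eV) from a spectrum: y = (alpha * E)^n vs photon energy E.
    Linear-region selection here is a simplified placeholder heuristic."""
    e = np.asarray(energy_ev, float)
    y = (np.asarray(alpha, float) * e) ** n
    best = None
    for i in range(len(e) - window):
        slope, intercept = np.polyfit(e[i:i + window], y[i:i + window], 1)
        if slope > 0 and (best is None or slope > best[0]):
            best = (slope, intercept)
    if best is None:
        return float("nan")
    slope, intercept = best
    return -intercept / slope   # energy-axis intercept of the extrapolated line
```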

  12. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    Science.gov (United States)

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher-throughput devices for small-scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable large sets of samples to be measured reliably within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid-handling robot to provide reproducible high-throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 to ±2 % on average. Implementation on an automated liquid-handling platform allows up to 48 samples to be measured within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure of biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established
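
    The gravimetric calibration step amounts to regressing fluorescence readings against lipid contents determined by the extractive reference protocol and then applying that fit to new samples. A minimal sketch, assuming paired reference measurements are available; variable names are illustrative, not the published workflow.

```python
import numpy as np

def gravimetric_calibration(nile_red_fluorescence, gravimetric_lipid):
    """Fit a linear calibration of Nile red fluorescence (per biovolume) against
    gravimetrically determined lipid content and return a converter function."""
    slope, intercept = np.polyfit(nile_red_fluorescence, gravimetric_lipid, 1)
    return lambda f: slope * np.asarray(f, float) + intercept

# usage sketch: to_lipid = gravimetric_calibration(f_ref, lipid_ref); lipid = to_lipid(f_new)
```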

  13. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    Science.gov (United States)

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. The respective facilities mainly apply non-invasive, imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high-throughput measurements of complex physiological parameters, such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis, remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high-throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high-throughput-amenable protocols recording the PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source, which caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high-throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high-throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll
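
    The PSII operating efficiency referred to above is conventionally computed from light-adapted chlorophyll fluorescence as ΦPSII = (Fm′ − F′)/Fm′, with Fm′ the maximum fluorescence under a saturating pulse and F′ the steady-state fluorescence in actinic light. A trivial helper, shown only to make the formula concrete (not code from the paper):

```python
def phi_psii(f_m_prime, f_prime):
    """PSII operating efficiency from light-adapted fluorescence:
    Phi_PSII = (Fm' - F') / Fm'."""
    return (f_m_prime - f_prime) / f_m_prime
```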

  14. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high......High-throughput screening of genome wide siRNA- or compound libraries is currently applied for drug target and drug discovery. Commonly, these approaches deal with sample numbers ranging from 100,000 to several millions. Efforts to decrease costs and to increase information gained include......-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  15. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacterial...... of cell nuclei were automatically quantified using a spot detection-tool. The spot detection-output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method....

  16. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fenglei [Iowa State Univ., Ames, IA (United States)

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy to use, automated, high throughput system for large scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with the high throughput screening system. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated that large scale protein crystallization screening can be performed in a high throughput manner at low cost and with easy operation. The overall system integrates liquid dispensing, crystallization and detection and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants at nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme is capable of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high throughput and non-destructive manner. The entire system, from liquid dispensing and crystallization to crystal detection, is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for a given amount of protein. In addition

  17. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    multiplexing readouts, but this has a natural limitation. High-content screening via image acquisition and analysis allows multiplexing of few parameters, but is connected to substantial time consumption and complex logistics. We report on integration of Reverse Phase Protein Arrays (RPPA)-based readouts...... into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high...

  18. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter
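
    As a hedged illustration of maximum-likelihood parameter tuning for one such transformation, the sketch below fits the cofactor of an arcsinh transform by maximizing a normal log-likelihood of the transformed values, including the change-of-variables (Jacobian) term. The single-Gaussian model and search bounds are simplifying assumptions and not the criteria used in the paper, which works within the flowCore framework in R.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def arcsinh_loglik(b, x):
    """Profile log-likelihood of y = arcsinh(x / b) under a normal model for y,
    with the change-of-variables term log|dy/dx| = -0.5*log(x^2 + b^2)."""
    y = np.arcsinh(x / b)
    n = len(x)
    return -0.5 * n * np.log(y.var()) - 0.5 * np.sum(np.log(x ** 2 + b ** 2))

def best_cofactor(x, lo=1.0, hi=1000.0):
    """Cofactor b maximizing the likelihood above (bounds are assumptions)."""
    x = np.asarray(x, float)
    res = minimize_scalar(lambda b: -arcsinh_loglik(b, x),
                          bounds=(lo, hi), method="bounded")
    return res.x
```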

  19. An automated system for high-throughput single cell-based breeding

    Science.gov (United States)

    Yoshimoto, Nobuo; Kida, Akiko; Jie, Xu; Kurokawa, Masaya; Iijima, Masumi; Niimi, Tomoaki; Maturana, Andrés D.; Nikaido, Itoshi; Ueda, Hiroki R.; Tatematsu, Kenji; Tanizawa, Katsuyuki; Kondo, Akihiko; Fujii, Ikuo; Kuroda, Shun'ichi

    2013-01-01

    When selecting the most appropriate cells from the huge numbers in a cell library for practical use in regenerative medicine and production of various biopharmaceuticals, the cell heterogeneity often found in an isogenic cell population limits the refinement of clonal cell culture. Here, we demonstrated high-throughput screening of the most suitable cells in a cell library by an automated, undisruptive single-cell analysis and isolation system, followed by expansion of the isolated single cells. This system enabled establishment of the most suitable cells, such as embryonic stem cells with the highest expression of the pluripotency marker Rex1 and hybridomas with the highest antibody secretion, which could not be achieved by conventional high-throughput cell screening systems (e.g., a fluorescence-activated cell sorter). This single cell-based breeding system may be a powerful tool to analyze stochastic fluctuations and delineate their molecular mechanisms. PMID:23378922

  20. The development and application of high throughput cultivation technology in bioprocess development.

    Science.gov (United States)

    Long, Quan; Liu, Xiuxia; Yang, Yankun; Li, Lu; Harvey, Linda; McNeil, Brian; Bai, Zhonghu

    2014-12-20

    This review focuses on recent progress in the technology of high throughput (HTP) cultivation and its increasing application in quality by design (QbD)-driven bioprocess development. Several practical HTP strategies aimed at shortening process development (PD) timelines from DNA to large-scale processes involving commercially available HTP technology platforms, including microtiter plate (MTP) culture, micro-scale bioreactors, and parallel fermentation systems, are critically reviewed in detail. This discussion focuses upon the relative strengths and weaknesses or limitations of each of these platforms in this context. Emerging prototypes of micro-bioreactors reported recently, such as milliliter (mL) scale stirred tank bioreactors and microfluidics-integrated micro-scale bioreactors, and their potential for practical application in QbD-driven HTP process development are also critically appraised. The overall aim of such technology is to rapidly gain process insights, and since the analytical technology deployed in HTP systems is critically important to the achievement of this aim, this rapidly developing area is discussed. Finally, general future trends are critically reviewed. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate and more rapid. The present study focuses on establishing a high throughput automated online system for proteolytic digestion and desalting of proteins from biological samples quantitatively and qualitatively in a reproducible manner. The present study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates reproducibility on real samples. Proteins were identified using the SEQUEST database search engine and the data were quantified using the IDEALQ software. The present study shows that the online system, capable of handling samples at high throughput in 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear, more linear increase of peptide quantities with increasing concentration compared to the off-line method. Hence, we suggest that inclusion of this online system in a proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  2. Automated analysis of NF-κB nuclear translocation kinetics in high-throughput screening.

    Directory of Open Access Journals (Sweden)

    Zi Di

    Full Text Available Nuclear entry and exit of the NF-κB family of dimeric transcription factors plays an essential role in regulating cellular responses to inflammatory stress. The dynamics of this nuclear translocation can vary significantly within a cell population and may dramatically change, e.g., upon drug exposure. Furthermore, there is significant heterogeneity in individual cell response upon stress signaling. In order to systematically determine factors that define NF-κB translocation dynamics, high-throughput screens that enable the analysis of dynamic NF-κB responses in individual cells in real time are essential. Thus far, only NF-κB downstream signaling responses of whole cell populations at the transcriptional level have been analyzed in high-throughput mode. In this study, we developed a fully automated image analysis method to determine the time course of NF-κB translocation in individual cells, suitable for high-throughput screenings in the context of compound screening and functional genomics. Two novel segmentation methods were used for defining the individual nuclear and cytoplasmic regions: watershed masked clustering (WMC) and best-fit ellipse of Voronoi cell (BEVC). The dynamic NF-κB oscillatory response at the single cell and population level was coupled to automated extraction of 26 analogue translocation parameters including number of peaks, time to reach each peak, and amplitude of each peak. Our automated image analysis method was validated through a series of statistical tests demonstrating the computational efficiency and accuracy of our algorithm in quantifying NF-κB translocation dynamics. Both pharmacological inhibition of NF-κB and short interfering RNAs targeting the inhibitor of NF-κB, IκBα, demonstrated the ability of our method to identify compounds and genetic players that interfere with the nuclear transition of NF-κB.
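
    A handful of the translocation parameters listed above (peak count, peak times, peak amplitudes) can be read off a single cell's nuclear/cytoplasmic intensity ratio trace with a standard peak detector. The sketch below is a generic illustration using scipy; the prominence criterion is an assumption, not the published parameter definitions.

```python
import numpy as np
from scipy.signal import find_peaks

def translocation_parameters(time, nuc_cyto_ratio):
    """Peak-based descriptors of one cell's NF-kB nuclear/cytoplasmic ratio trace."""
    ratio = np.asarray(nuc_cyto_ratio, float)
    t = np.asarray(time, float)
    peaks, _ = find_peaks(ratio, prominence=0.1 * (ratio.max() - ratio.min()))
    return {
        "n_peaks": int(len(peaks)),
        "time_to_peak": t[peaks],
        "amplitude": ratio[peaks] - ratio.min(),
    }
```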

  3. The Protein Maker: an automated system for high-throughput parallel purification

    International Nuclear Information System (INIS)

    Smith, Eric R.; Begley, Darren W.; Anderson, Vanessa; Raymond, Amy C.; Haffner, Taryn E.; Robinson, John I.; Edwards, Thomas E.; Duncan, Natalie; Gerdts, Cory J.; Mixon, Mark B.; Nollert, Peter; Staker, Bart L.; Stewart, Lance J.

    2011-01-01

    The Protein Maker instrument addresses a critical bottleneck in structural genomics by allowing automated purification and buffer testing of multiple protein targets in parallel with a single instrument. Here, the use of this instrument to (i) purify multiple influenza-virus proteins in parallel for crystallization trials and (ii) identify optimal lysis-buffer conditions prior to large-scale protein purification is described. The Protein Maker is an automated purification system developed by Emerald BioSystems for high-throughput parallel purification of proteins and antibodies. This instrument allows multiple load, wash and elution buffers to be used in parallel along independent lines for up to 24 individual samples. To demonstrate its utility, its use in the purification of five recombinant PB2 C-terminal domains from various subtypes of the influenza A virus is described. Three of these constructs crystallized and one diffracted X-rays to sufficient resolution for structure determination and deposition in the Protein Data Bank. Methods for screening lysis buffers for a cytochrome P450 from a pathogenic fungus prior to upscaling expression and purification are also described. The Protein Maker has become a valuable asset within the Seattle Structural Genomics Center for Infectious Disease (SSGCID) and hence is a potentially valuable tool for a variety of high-throughput protein-purification applications

  4. Semi-automated library preparation for high-throughput DNA sequencing platforms.

    Science.gov (United States)

    Farias-Hesson, Eveline; Erikson, Jonathan; Atkins, Alexander; Shen, Peidong; Davis, Ronald W; Scharfe, Curt; Pourmand, Nader

    2010-01-01

    Next-generation sequencing platforms are powerful technologies, providing gigabases of genetic information in a single run. An important prerequisite for high-throughput DNA sequencing is the development of robust and cost-effective preprocessing protocols for DNA sample library construction. Here we report the development of a semi-automated sample preparation protocol to produce adaptor-ligated fragment libraries. Using a liquid-handling robot in conjunction with Carboxy Terminated Magnetic Beads, we labeled each library sample with a unique 6 bp DNA barcode, which allowed multiplex sample processing and sequencing of 32 libraries in a single run using Applied Biosystems' SOLiD sequencer. We applied our semi-automated pipeline to targeted medical resequencing of nuclear candidate genes in individuals affected by mitochondrial disorders. This novel method is capable of preparing as many as 32 DNA libraries in 2.01 days (8-hour workdays) for emulsion PCR/high throughput DNA sequencing, increasing sample preparation production 8-fold.

  5. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-08-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple

  6. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  7. Automated degenerate PCR primer design for high-throughput sequencing improves efficiency of viral sequencing

    Directory of Open Access Journals (Sweden)

    Li Kelvin

    2012-11-01

    Full Text Available Abstract Background In a high-throughput environment, to PCR amplify and sequence a large set of viral isolates from populations that are potentially heterogeneous and continuously evolving, the use of degenerate PCR primers is an important strategy. Degenerate primers allow for the PCR amplification of a wider range of viral isolates with only one set of pre-mixed primers, thus increasing amplification success rates and minimizing the necessity for genome finishing activities. Selecting the large set of degenerate PCR primers necessary to tile across an entire viral genome while maximizing their success rate is best performed computationally. Results We have developed a fully automated degenerate PCR primer design system that plays a key role in the J. Craig Venter Institute’s (JCVI) high-throughput viral sequencing pipeline. A consensus viral genome, or a set of consensus segment sequences in the case of a segmented virus, is specified using IUPAC ambiguity codes in the consensus template sequence to represent the allelic diversity of the target population. PCR primer pairs are then selected computationally to produce a minimal amplicon set capable of tiling across the full length of the specified target region. As part of the tiling process, primer pairs are computationally screened to meet the criteria for successful PCR with one of two described amplification protocols. The actual sequencing success rates for designed primers for measles virus, mumps virus, human parainfluenza virus 1 and 3, human respiratory syncytial virus A and B and human metapneumovirus are described, where >90% of designed primer pairs were able to consistently amplify >75% of the isolates. Conclusions Augmenting our previously developed and published JCVI Primer Design Pipeline, we achieved similarly high sequencing success rates with only minor software modifications. The recommended methodology for the construction of the consensus
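
    The IUPAC ambiguity codes mentioned above map each degenerate base to the set of nucleotides it represents, and the degeneracy of a primer is the product of those set sizes. A small self-contained helper to make that concrete (generic, not part of the JCVI pipeline):

```python
from itertools import product

# IUPAC nucleotide ambiguity codes
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
         "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

def degeneracy(primer):
    """Number of distinct oligos represented by a degenerate primer."""
    d = 1
    for base in primer.upper():
        d *= len(IUPAC[base])
    return d

def expand(primer):
    """Enumerate every concrete sequence encoded by a degenerate primer."""
    return ["".join(p) for p in product(*(IUPAC[b] for b in primer.upper()))]

# example: degeneracy("ATGRYN") == 1 * 1 * 1 * 2 * 2 * 4 == 16
```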

  8. High-throughput phenotyping of plant resistance to aphids by automated video tracking.

    Science.gov (United States)

    Kloth, Karen J; Ten Broeke, Cindy Jm; Thoen, Manus Pm; Hanhart-van den Brink, Marianne; Wiegers, Gerrie L; Krips, Olga E; Noldus, Lucas Pjj; Dicke, Marcel; Jongsma, Maarten A

    2015-01-01

    Piercing-sucking insects are major vectors of plant viruses causing significant yield losses in crops. Functional genomics of plant resistance to these insects would greatly benefit from the availability of high-throughput, quantitative phenotyping methods. We have developed an automated video tracking platform that quantifies aphid feeding behaviour on leaf discs to assess the level of plant resistance. Through the analysis of aphid movement, the start and duration of plant penetrations by aphids were estimated. As a case study, video tracking confirmed the near-complete resistance of lettuce cultivar 'Corbana' against Nasonovia ribisnigri (Mosely), biotype Nr:0, and revealed quantitative resistance in Arabidopsis accession Co-2 against Myzus persicae (Sulzer). The video tracking platform was benchmarked against Electrical Penetration Graph (EPG) recordings and aphid population development assays. The use of leaf discs instead of intact plants reduced the intensity of the resistance effect in video tracking, but sufficiently replicated experiments led to conclusions similar to those from EPG recordings and aphid population assays. One video tracking platform could screen 100 samples in parallel. Automated video tracking can be used to screen large plant populations for resistance to aphids and other piercing-sucking insects.

  9. Automation in the high-throughput selection of random combinatorial libraries--different approaches for select applications.

    Science.gov (United States)

    Glökler, Jörn; Schütze, Tatjana; Konthur, Zoltán

    2010-04-08

    Automation in combination with high throughput screening methods has revolutionised molecular biology in the last two decades. Today, many combinatorial libraries as well as several systems for automation are available. Depending on scope, budget and time, a different combination of library and experimental handling might be most effective. In this review we discuss several concepts of combinatorial libraries and provide information as to what to expect from these depending on the given context.

  10. High-Throughput Light Sheet Microscopy for the Automated Live Imaging of Larval Zebrafish

    Science.gov (United States)

    Baker, Ryan; Logan, Savannah; Dudley, Christopher; Parthasarathy, Raghuveer

    The zebrafish is a model organism with a variety of useful properties; it is small and optically transparent, it reproduces quickly, it is a vertebrate, and there are a large variety of transgenic animals available. Because of these properties, the zebrafish is well suited to study using a variety of optical technologies including light sheet fluorescence microscopy (LSFM), which provides high-resolution three-dimensional imaging over large fields of view. Research progress, however, is often not limited by optical techniques but instead by the number of samples one can examine over the course of an experiment, which in the case of light sheet imaging has so far been severely limited. Here we present an integrated fluidic circuit and microscope which provides rapid, automated imaging of zebrafish using several imaging modes, including LSFM, Hyperspectral Imaging, and Differential Interference Contrast Microscopy. Using this system, we show that we can increase our imaging throughput by a factor of 10 compared to previous techniques. We also show preliminary results visualizing zebrafish immune response, which is sensitive to gut microbiota composition, and which shows a strong variability between individuals that highlights the utility of high throughput imaging. National Science Foundation, Award No. DBI-1427957.

  11. Automated high-throughput quantification of mitotic spindle positioning from DIC movies of Caenorhabditis embryos.

    Directory of Open Access Journals (Sweden)

    David Cluet

    Full Text Available The mitotic spindle is a microtubule-based structure that elongates to accurately segregate chromosomes during anaphase. Its position within the cell also dictates the future cell cleavage plan, thereby determining daughter cell orientation within a tissue or cell fate adoption for polarized cells. The mitotic spindle therefore ensures both proper cell division and developmental precision. Consequently, spindle dynamics is the subject of intensive research. Among the different cellular models that have been explored, the one-cell stage C. elegans embryo has been an essential and powerful system to dissect the molecular and biophysical basis of spindle elongation and positioning. Indeed, in this large and transparent cell, spindle poles (or centrosomes) can easily be detected from simple DIC microscopy by the human eye. To perform quantitative and high-throughput analysis of spindle motion, we developed ACT, a computer program for Automated Centrosome Tracking from DIC movies of C. elegans embryos. We therefore offer an alternative to the image acquisition and processing of transgenic lines expressing fluorescent spindle markers. Consequently, experiments on large sets of cells can be performed with a simple setup using inexpensive microscopes. Moreover, analysis of any mutant or wild-type background is accessible because laborious rounds of crosses with transgenic lines become unnecessary. Last, our program allows spindle detection in other nematode species, which offer the same quality of DIC images but for which transgenesis techniques are not available. Thus, our program also opens the way towards a quantitative evolutionary approach to spindle dynamics. Overall, our computer program is a macro for the image- and movie-processing platform ImageJ. It is user-friendly and freely available under an open-source licence. ACT allows batch-wise analysis of large sets of mitosis events. Within 2 minutes, a single movie is processed

  12. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

    Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save lives. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for the medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed to demonstrate high-throughput microfluidic micro-incubation, supporting the logistics of transporting samples in miniaturized incubators from the accident site to analytical laboratories. Efforts have been made both to develop concepts and advanced systems for higher sample-processing throughput and to implement better, more efficient logistics leading to the performance of lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation and inclusion of multi-parametric biomarker approaches will provide a first-generation high-throughput platform system for effective medical management, particularly during radiation mass-casualty events.

  13. High-throughput isolation of giant viruses in liquid medium using automated flow cytometry and fluorescence staining.

    Directory of Open Access Journals (Sweden)

    Jacques Yaacoub Bou Khalil

    2016-01-01

    Full Text Available The isolation of giant viruses using amoeba co-culture is tedious and fastidious. Recently, the procedure was successfully associated with a method that detects amoebal lysis on agar plates. However, the procedure remains time-consuming and is limited to protozoa growing on agar. We present here advances in the isolation of giant viruses. A high-throughput automated method based on flow cytometry and fluorescent staining was used to detect the presence of giant viruses in liquid medium. Development was carried out with the Acanthamoeba polyphaga strain widely used in past and current co-culture experiments. The proof of concept was validated with virus suspensions: artificially contaminated samples, but also environmental samples from which viruses had previously been isolated. After validating the technique, and fortuitously isolating a new Mimivirus, we automated the technique on 96-well plates and tested it on clinical and environmental samples using other protozoa. This allowed us to detect more than ten strains of previously known species of giant viruses and 7 new strains of a new virus lineage. This automated high-throughput method demonstrated significant time savings and higher sensitivity than older techniques. It thus creates the means to isolate giant viruses at high speed.

  14. Upscaling and automation of electrophysiology: toward high throughput screening in ion channel drug discovery

    DEFF Research Database (Denmark)

    Asmild, Margit; Oswald, Nicholas; Krzywkowski, Karen M

    2003-01-01

    by developing two lines of automated patch clamp products, a traditional pipette-based system called Apatchi-1, and a silicon chip-based system QPatch. The degree of automation spans from semi-automation (Apatchi-1) where a trained technician interacts with the system in a limited way, to a complete automation...

  15. Automated cleaning and pre-processing of immunoglobulin gene sequences from high-throughput sequencing

    Directory of Open Access Journals (Sweden)

    Miri eMichaeli

    2012-12-01

    Full Text Available High throughput sequencing (HTS) yields tens of thousands to millions of sequences that require a large amount of pre-processing work to clean various artifacts. Such cleaning cannot be performed manually. Existing programs are not suitable for immunoglobulin (Ig) genes, which are variable and often highly mutated. This paper describes Ig-HTS-Cleaner (Ig High Throughput Sequencing Cleaner), a program containing a simple cleaning procedure that successfully deals with pre-processing of Ig sequences derived from HTS, and Ig-Indel-Identifier (Ig Insertion – Deletion Identifier), a program for identifying legitimate and artifact insertions and/or deletions (indels). Our programs were designed for analyzing Ig gene sequences obtained by 454 sequencing, but they are applicable to all types of sequences and sequencing platforms. Ig-HTS-Cleaner and Ig-Indel-Identifier have been implemented in Java and saved as executable JAR files, supported on Linux and MS Windows. No special requirements are needed in order to run the programs, except for correctly constructing the input files as explained in the text. The programs' performance has been tested and validated on real and simulated data sets.

  16. High throughput detection of Coxiella burnetii by real-time PCR with internal control system and automated DNA preparation

    Directory of Open Access Journals (Sweden)

    Kramme Stefanie

    2008-05-01

    Full Text Available Abstract Background Coxiella burnetii is the causative agent of Q-fever, a widespread zoonosis. Due to its high environmental stability and infectivity it is regarded as a category B biological weapon agent. In domestic animals infection remains either asymptomatic or presents as infertility or abortion. Clinical presentation in humans can range from mild flu-like illness to acute pneumonia and hepatitis. Endocarditis represents the most common form of chronic Q-fever. In humans serology is the gold standard for diagnosis but is inadequate for early case detection. In order to serve as a diagnostic tool in a potential biological weapon attack or in local epidemics we developed a real-time 5'-nuclease-based PCR assay with an internal control system. To facilitate high throughput, an automated extraction procedure was evaluated. Results To determine the minimum number of copies that are detectable with 95% probability, probit analysis was used. The limit of detection in blood was 2,881 copies/ml [95% CI, 2,188–4,745 copies/ml] with a manual extraction procedure and 4,235 copies/ml [95% CI, 3,143–7,428 copies/ml] with a fully automated extraction procedure, respectively. To demonstrate clinical application, a total of 72 specimens of animal origin were compared with respect to manual and automated extraction. A strong correlation between both methods was observed, rendering both methods suitable. Testing of 247 follow-up specimens of animal origin from a local Q-fever epidemic showed real-time PCR to be more sensitive than conventional PCR. Conclusion A sensitive and thoroughly evaluated real-time PCR was established. Its high-throughput mode may offer a useful approach to rapidly screen samples in local outbreaks for other organisms relevant to humans or animals. Compared to a conventional PCR assay, the sensitivity of real-time PCR was higher when testing samples from a local Q-fever outbreak.
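
    The 95% limit of detection quoted above comes from probit analysis: detection outcomes (positive/negative) at serial dilutions are fitted with a probit regression against log concentration, and the concentration at which the fitted detection probability reaches 0.95 is read off. A hedged sketch using statsmodels, without the confidence intervals reported in the paper; the input arrays are hypothetical replicate-level data.

```python
import numpy as np
import statsmodels.api as sm

def lod95(copies_per_ml, detected):
    """copies_per_ml: tested concentrations (one entry per PCR replicate);
    detected: 0/1 outcome per replicate. Returns the concentration with an
    estimated 95% detection probability."""
    x = np.log10(np.asarray(copies_per_ml, float))
    X = sm.add_constant(x)
    probit = sm.families.Binomial(link=sm.families.links.Probit())
    fit = sm.GLM(np.asarray(detected, float), X, family=probit).fit()
    b0, b1 = fit.params
    # solve Phi(b0 + b1 * log10(c)) = 0.95, i.e. b0 + b1 * log10(c) = 1.6449
    return 10 ** ((1.6449 - b0) / b1)
```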

  17. PLAN: a web platform for automating high-throughput BLAST searches and for managing and mining results

    Directory of Open Access Journals (Sweden)

    Zhao Xuechun

    2007-02-01

    Full Text Available Abstract Background BLAST searches are widely used for sequence alignment. The search results are commonly adopted for various functional and comparative genomics tasks such as annotating unknown sequences, investigating gene models and comparing two sequence sets. Advances in sequencing technologies pose challenges for high-throughput analysis of large-scale sequence data. A number of programs and hardware solutions exist for efficient BLAST searching, but there is a lack of generic software solutions for mining and personalized management of the results. Systematically reviewing the results and identifying information of interest remains tedious and time-consuming. Results Personal BLAST Navigator (PLAN) is a versatile web platform that helps users to carry out various personalized pre- and post-BLAST tasks, including: (1) query and target sequence database management, (2) automated high-throughput BLAST searching, (3) indexing and searching of results, (4) filtering results online, (5) managing results of personal interest in favorite categories, (6) automated sequence annotation (such as NCBI NR and ontology-based annotation). PLAN integrates, by default, the Decypher hardware-based BLAST solution provided by Active Motif Inc. with a greatly improved efficiency over conventional BLAST software. BLAST results are visualized by spreadsheets and graphs and are full-text searchable. BLAST results and sequence annotations can be exported, in part or in full, in various formats including Microsoft Excel and FASTA. Sequences and BLAST results are organized in projects, the data publication levels of which are controlled by the registered project owners. In addition, all analytical functions are provided to public users without registration. Conclusion PLAN has proved a valuable addition to the community for automated high-throughput BLAST searches, and, more importantly, for knowledge discovery, management and sharing based on sequence alignment results

  18. PLAN: a web platform for automating high-throughput BLAST searches and for managing and mining results.

    Science.gov (United States)

    He, Ji; Dai, Xinbin; Zhao, Xuechun

    2007-02-09

    BLAST searches are widely used for sequence alignment. The search results are commonly adopted for various functional and comparative genomics tasks such as annotating unknown sequences, investigating gene models and comparing two sequence sets. Advances in sequencing technologies pose challenges for high-throughput analysis of large-scale sequence data. A number of programs and hardware solutions exist for efficient BLAST searching, but there is a lack of generic software solutions for mining and personalized management of the results. Systematically reviewing the results and identifying information of interest remains tedious and time-consuming. Personal BLAST Navigator (PLAN) is a versatile web platform that helps users to carry out various personalized pre- and post-BLAST tasks, including: (1) query and target sequence database management, (2) automated high-throughput BLAST searching, (3) indexing and searching of results, (4) filtering results online, (5) managing results of personal interest in favorite categories, (6) automated sequence annotation (such as NCBI NR and ontology-based annotation). PLAN integrates, by default, the Decypher hardware-based BLAST solution provided by Active Motif Inc. with a greatly improved efficiency over conventional BLAST software. BLAST results are visualized by spreadsheets and graphs and are full-text searchable. BLAST results and sequence annotations can be exported, in part or in full, in various formats including Microsoft Excel and FASTA. Sequences and BLAST results are organized in projects, the data publication levels of which are controlled by the registered project owners. In addition, all analytical functions are provided to public users without registration. PLAN has proved a valuable addition to the community for automated high-throughput BLAST searches, and, more importantly, for knowledge discovery, management and sharing based on sequence alignment results. The PLAN web interface is platform

  19. An automated high-throughput system for phenotypic screening of chemical libraries on C. elegans and parasitic nematodes.

    Science.gov (United States)

    Partridge, Frederick A; Brown, Anwen E; Buckingham, Steven D; Willis, Nicky J; Wynne, Graham M; Forman, Ruth; Else, Kathryn J; Morrison, Alison A; Matthews, Jacqueline B; Russell, Angela J; Lomas, David A; Sattelle, David B

    2017-12-02

    Parasitic nematodes infect hundreds of millions of people and farmed livestock. Further, plant parasitic nematodes result in major crop damage. The pipeline of therapeutic compounds is limited, and parasite resistance to the existing anthelmintic compounds is a global threat. We have developed an INVertebrate Automated Phenotyping Platform (INVAPP) for high-throughput, plate-based chemical screening, and an algorithm (Paragon) which allows screening for compounds that have an effect on motility and development of parasitic worms. We have validated its utility by determining the efficacy of a panel of known anthelmintics against model and parasitic nematodes: Caenorhabditis elegans, Haemonchus contortus, Teladorsagia circumcincta, and Trichuris muris. We then applied the system to screen the Pathogen Box chemical library in a blinded fashion and identified compounds already known to have anthelmintic or anti-parasitic activity, including tolfenpyrad, auranofin, and mebendazole; and 14 compounds previously undescribed as anthelmintics, including benzoxaborole and isoxazole chemotypes. This system offers an effective, high-throughput approach for the discovery of novel anthelmintics. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. RNA structure framework: automated transcriptome-wide reconstruction of RNA secondary structures from high-throughput structure probing data.

    Science.gov (United States)

    Incarnato, Danny; Neri, Francesco; Anselmi, Francesca; Oliviero, Salvatore

    2016-02-01

    The rapidly increasing number of discovered non-coding RNAs makes the understanding of their structure a key feature toward a deeper comprehension of gene expression regulation. Various enzymatic- and chemically-based approaches have been recently developed to allow whole-genome studies of RNA secondary structures. Several methods have been recently presented that allow high-throughput RNA structure probing (CIRS-seq, Structure-seq, SHAPE-seq, PARS, etc.) and unbiased structural inference of residues within RNAs in their native conformation. Here we present an analysis toolkit, named RNA Structure Framework (RSF), which allows fast and fully-automated analysis of high-throughput structure probing data, from data pre-processing to whole-transcriptome RNA structure inference. RSF is written in Perl and is freely available under the GPLv3 license from http://rsf.hugef-research.org. Contact: salvatore.oliviero@hugef-torino.org. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Database-Centric Method for Automated High-Throughput Deconvolution and Analysis of Kinetic Antibody Screening Data.

    Science.gov (United States)

    Nobrega, R Paul; Brown, Michael; Williams, Cody; Sumner, Chris; Estep, Patricia; Caffry, Isabelle; Yu, Yao; Lynaugh, Heather; Burnina, Irina; Lilov, Asparouh; Desroches, Jordan; Bukowski, John; Sun, Tingwan; Belk, Jonathan P; Johnson, Kirt; Xu, Yingda

    2017-10-01

    The state-of-the-art industrial drug discovery approach is the empirical interrogation of a library of drug candidates against a target molecule. The advantage of high-throughput kinetic measurements over equilibrium assessments is the ability to measure each of the kinetic components of binding affinity. Although high-throughput capabilities have improved with advances in instrument hardware, three bottlenecks in data processing remain: (1) intrinsic molecular properties that lead to poor biophysical quality in vitro are not accounted for in commercially available analysis models, (2) processing data through a user interface is time-consuming and not amenable to parallelized data collection, and (3) a commercial solution that includes historical kinetic data in the analysis of kinetic competition data does not exist. Herein, we describe a generally applicable method for the automated analysis, storage, and retrieval of kinetic binding data. This analysis can deconvolve poor quality data on-the-fly and store and organize historical data in a queryable format for use in future analyses. Such database-centric strategies afford greater insight into the molecular mechanisms of kinetic competition, allowing for the rapid identification of allosteric effectors and the presentation of kinetic competition data in absolute terms of percent bound to antigen on the biosensor.
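
The record above argues for a database-centric store of kinetic binding parameters that can be queried in later analyses. The following minimal sketch (not the authors' implementation) illustrates that idea with SQLite: fitted association/dissociation rate constants are stored and the equilibrium KD is derived on query. The schema, table, and value names are assumptions for illustration only.

```python
# Minimal sketch of a database-centric store for kinetic binding data (illustrative
# schema and names, not the published system): ka/kd fits go into SQLite and can be
# queried later, e.g. to compute the equilibrium dissociation constant KD = kd / ka.
import sqlite3

conn = sqlite3.connect("kinetics.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS kinetics (
        antibody   TEXT,
        antigen    TEXT,
        ka_per_Ms  REAL,   -- association rate constant, 1/(M*s)
        kd_per_s   REAL,   -- dissociation rate constant, 1/s
        run_date   TEXT
    )
""")

def add_measurement(antibody, antigen, ka, kd, run_date):
    conn.execute("INSERT INTO kinetics VALUES (?, ?, ?, ?, ?)",
                 (antibody, antigen, ka, kd, run_date))
    conn.commit()

def equilibrium_kd(antibody, antigen):
    """Return KD (kd/ka, in M) from the most recent stored measurement, or None."""
    row = conn.execute(
        "SELECT ka_per_Ms, kd_per_s FROM kinetics "
        "WHERE antibody = ? AND antigen = ? ORDER BY run_date DESC LIMIT 1",
        (antibody, antigen)).fetchone()
    if row is None:
        return None
    ka, kd = row
    return kd / ka

# Example usage with placeholder values.
add_measurement("mAb-001", "antigen-X", ka=2.5e5, kd=1.0e-3, run_date="2017-01-15")
print(equilibrium_kd("mAb-001", "antigen-X"))  # -> 4e-09 M
```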

  2. Automation: a key technology to safe and reliable spent nuclear fuel handling in high throughput plants

    International Nuclear Information System (INIS)

    Blanc, E.; Berge, F.

    1999-01-01

    La Hague's 30-year experience with spent nuclear fuel handling represents more than 48,000 assemblies handled in wet and dry environments. The front-end facilities of the UP2-800 and UP3 reprocessing plants are dedicated to spent fuel handling, e.g. fuel unloading, interim storage, dispatch and measurement. The operations, including maintenance, are largely automated and are performed remotely from central control rooms. The use of automation at La Hague is aimed at reducing personnel exposure, increasing the purposeful utilization of equipment, increasing the reliability of operations and thus the safety of the facilities, and improving fuel accountability. The automation of the plants was designed to maintain high availability and flexibility of the facilities. Today, the La Hague reprocessing plants have successfully reached their design capacity and handle fuel from utilities all over the world with a wide range of types and burnup. Future developments include a decision support system for operators. (author)

  3. High-throughput automated system for statistical biosensing employing microcantilevers arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Chen, Ching H.; Hwu, En T.

    2011-01-01

    In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to monitor simultaneously the change of resonance frequency (dynamic mode), of deflection (static mode), and of surface roughness of hundreds of cantilevers in ...

  4. Process automation toward ultra-high-throughput screening of combinatorial one-bead-one-compound (OBOC) peptide libraries.

    Science.gov (United States)

    Cha, Junhoe; Lim, Jaehong; Zheng, Yiran; Tan, Sylvia; Ang, Yi Li; Oon, Jessica; Ang, Mei Wei; Ling, Jingjing; Bode, Marcus; Lee, Su Seong

    2012-06-01

    With the aim of developing peptide-based protein capture agents that can replace antibodies for in vitro diagnosis, an ultra-high-throughput screening strategy has been investigated by automating the labor-intensive, time-consuming processes of peptide library construction, sorting of positive beads, and peptide sequencing through analysis of tandem mass spectrometry data. Although instruments for automation, such as peptide synthesizers and automatic bead sorters, have been used by some groups, the overall process has not been well optimized to minimize time, cost, and effort, as well as to maximize product quality and performance. Herein we suggest and explore several solutions to the existing problems with the automation of the key processes. The overall process optimization has been done successfully in orchestration with technologies such as rapid cleavage of peptides from beads and semiautomatic peptide sequencing that we have developed previously. This optimization allowed one-round screening, from peptide library construction to peptide sequencing, to be completed within 4 to 5 days. We also successfully identified a 6-mer ligand for carcinoembryonic antigen-cell adhesion molecule 5 (CEACAM 5) through three rounds of screening, including one-round screening of a focused library.

  5. High-throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium.

    Science.gov (United States)

    Abubakar, Mustapha; Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh-Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang-Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A E M; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; Wm Martens, John; Hm van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia-Closas, Montserrat

    2016-07-01

    Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer-assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37-0.87) and study (kappa range = 0.39-0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000-4,500 cells: kappa = 0.78) than those with lower counts (50-500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre- and post-analytical quality control procedures are
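
The agreement statistics quoted above (kappa between automated and CAV calls, AUC for discriminatory accuracy) can be reproduced with standard tools. The sketch below is illustrative only: it uses scikit-learn on placeholder arrays, not the consortium's pipeline or data.

```python
# Illustrative sketch: quantify agreement between automated and computer-assisted
# visual (CAV) Ki67 calls with Cohen's kappa, and discriminatory accuracy of the
# continuous automated score with ROC AUC. Arrays are placeholders, not study data.
from sklearn.metrics import cohen_kappa_score, roc_auc_score

# Dichotomized calls (1 = Ki67-high, 0 = Ki67-low) for the same set of cores.
cav_calls = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
auto_calls = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

# Continuous automated score (e.g., % positive nuclei) for the same cores.
auto_score = [42.0, 3.5, 30.2, 8.1, 5.0, 2.2, 55.7, 12.3, 27.9, 18.4]

kappa = cohen_kappa_score(cav_calls, auto_calls)
auc = roc_auc_score(cav_calls, auto_score)  # CAV call used as reference label
print(f"kappa = {kappa:.2f}, AUC = {auc:.2f}")
```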

  6. Automated Analysis of Barley Organs Using 3D Laser Scanning: An Approach for High Throughput Phenotyping

    Directory of Open Access Journals (Sweden)

    Stefan Paulus

    2014-07-01

    Full Text Available Due to the rise of laser scanning, the 3D geometry of plant architecture is easy to acquire. Nevertheless, an automated interpretation and, finally, the segmentation into functional groups are still difficult to achieve. Two barley plants were scanned in a time course, and the organs were separated by applying a histogram-based classification algorithm. The leaf organs were represented by meshing algorithms, while the stem organs were parameterized by a least-squares cylinder approximation. We introduced surface feature histograms with an accuracy of 96% for the separation of the barley organs, leaf and stem. This enables growth monitoring in a time course for barley plants. Its reliability was demonstrated by a comparison with manually fitted parameters with a correlation R² = 0.99 for the leaf area and R² = 0.98 for the cumulated stem height. A proof of concept has been given for its applicability for the detection of water stress in barley, where the extension growth of an irrigated and a non-irrigated plant has been monitored.
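
The validation step described above reduces to reporting R² between automatically extracted and manually fitted organ parameters. A minimal sketch of that computation follows; the leaf-area values are hypothetical placeholders, not the study's measurements.

```python
# Minimal sketch of the validation idea: correlate automatically extracted organ
# parameters (e.g., leaf area from the mesh) with manually fitted reference values
# and report the coefficient of determination R^2. Values below are placeholders.
import numpy as np

def r_squared(reference, predicted):
    """R^2 of predicted vs. reference values (1 - SS_res / SS_tot)."""
    reference = np.asarray(reference, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((reference - predicted) ** 2)
    ss_tot = np.sum((reference - reference.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

manual_leaf_area = [12.1, 15.4, 18.9, 22.3, 25.0]   # cm^2, hypothetical
scanned_leaf_area = [12.4, 15.1, 19.2, 22.0, 25.3]  # cm^2, hypothetical

print(f"R^2 = {r_squared(manual_leaf_area, scanned_leaf_area):.3f}")
```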

  7. High Throughput Petrochronology and Sedimentary Provenance Analysis by Automated Phase Mapping and LAICPMS

    Science.gov (United States)

    Vermeesch, Pieter; Rittner, Martin; Petrou, Ethan; Omma, Jenny; Mattinson, Chris; Garzanti, Eduardo

    2017-11-01

    The first step in most geochronological studies is to extract datable minerals from the host rock, which is time-consuming, removes textural context, and increases the chance for sample cross contamination. Here we present a new method to rapidly perform in situ analyses by coupling a fast scanning electron microscope (SEM) with Energy Dispersive X-ray Spectrometer (EDS) to a Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LAICPMS) instrument. Given a polished hand specimen, a petrographic thin section, or a grain mount, Automated Phase Mapping (APM) by SEM/EDS produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic analysis. We apply the APM + LAICPMS method to three igneous, metamorphic, and sedimentary case studies. In the first case study, a polished slab of granite from Guernsey was scanned for zircon, producing a 609 ± 8 Ma weighted mean age. The second case study investigates a paragneiss from an ultra-high-pressure terrane in the north Qaidam terrane (Qinghai, China). One hundred seven small (25 µm) metamorphic zircons were analyzed by LAICPMS to confirm a 419 ± 4 Ma age of peak metamorphism. The third and final case study uses APM + LAICPMS to generate a large provenance data set and trace the provenance of 25 modern sediments from Angola, documenting longshore drift of Orange River sediments over a distance of 1,500 km. These examples demonstrate that APM + LAICPMS is an efficient and cost-effective way to improve the quantity and quality of geochronological data.
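
Weighted mean ages of the kind quoted above (e.g. 609 ± 8 Ma) are conventionally obtained by inverse-variance weighting of the single-grain ages. The sketch below shows that standard calculation; the grain ages and uncertainties are placeholder values, not data from the case studies.

```python
# Sketch of the standard inverse-variance weighted mean used to combine single-grain
# ages into a weighted mean age with its standard error. Placeholder values only.
import numpy as np

def weighted_mean_age(ages_ma, sigmas_ma):
    """Inverse-variance weighted mean and its 1-sigma standard error (both in Ma)."""
    ages = np.asarray(ages_ma, dtype=float)
    sig = np.asarray(sigmas_ma, dtype=float)
    w = 1.0 / sig**2
    mean = np.sum(w * ages) / np.sum(w)
    err = np.sqrt(1.0 / np.sum(w))
    return mean, err

ages = [607.0, 611.5, 609.8, 606.2, 612.1]   # Ma, hypothetical single-grain ages
sigmas = [4.0, 5.0, 4.5, 6.0, 5.5]           # Ma, 1-sigma uncertainties

mean, err = weighted_mean_age(ages, sigmas)
print(f"weighted mean age = {mean:.1f} +/- {err:.1f} Ma (1 sigma)")
```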

  8. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Daniel J. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Tfaily, Malak M. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Moore, Ronald J. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; LaMarche, Brian L. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Zheng, Xueyun [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Fillmore, Thomas L. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Chu, Rosalie K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Weitz, Karl K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Monroe, Matthew E. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Kelly, Ryan T. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Smith, Richard D. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Baker, Erin S. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States

    2017-12-13

    To better understand disease conditions and environmental perturbations, multi-omic studies (i.e. proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.

  9. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements.

    Science.gov (United States)

    Orton, Daniel J; Tfaily, Malak M; Moore, Ronald J; LaMarche, Brian L; Zheng, Xueyun; Fillmore, Thomas L; Chu, Rosalie K; Weitz, Karl K; Monroe, Matthew E; Kelly, Ryan T; Smith, Richard D; Baker, Erin S

    2018-01-02

    To better understand disease conditions and environmental perturbations, multiomic studies combining proteomic, lipidomic, and metabolomic analyses are vastly increasing in popularity. In a multiomic study, a single sample is typically extracted in multiple ways, and various analyses are performed using different instruments, most often based upon mass spectrometry (MS). Thus, one sample becomes many measurements, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injections. While some FIA systems have been created to address these challenges, many have limitations such as costly consumables, low pressure capabilities, limited pressure monitoring, and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at a range of flow rates (∼50 nL/min to 500 μL/min) to accommodate both low- and high-flow MS ionization sources. This system also functions at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system, and results showed a highly robust and reproducible platform capable of providing consistent performance over many days without carryover, as long as washing buffers specific to each molecular analysis were utilized.

  10. Automated high-throughput dense matrix protein folding screen using a liquid handling robot combined with microfluidic capillary electrophoresis.

    Science.gov (United States)

    An, Philip; Winters, Dwight; Walker, Kenneth W

    2016-04-01

    Modern molecular genetics technology has made it possible to swiftly sequence, clone and mass-produce recombinant DNA for the purpose of expressing heterologous genes of interest; however, recombinant protein production systems have struggled to keep pace. Mammalian expression systems are typically favored for their ability to produce and secrete proteins in their native state, but bacterial systems benefit from rapid cell line development and robust growth. The primary drawback of prokaryotic expression systems is that recombinant proteins are generally not secreted at high levels or correctly folded, and are often insoluble, necessitating post-expression protein folding to obtain the active product. In order to harness the advantages of prokaryotic expression, high-throughput methods for executing protein folding screens and the subsequent analytics to identify lead conditions are required. Both of these tasks can be accomplished using a Biomek 3000 liquid handling robot to prepare the folding screen and to subsequently prepare the reactions for assessment using Caliper microfluidic capillary electrophoresis. By augmenting a protein folding screen with automation, the primary disadvantage of Escherichia coli expression has been mitigated, namely the labor-intensive identification of the required protein folding conditions. Furthermore, a rigorous, quantitative method for identifying an optimal protein folding buffer aids in the rapid development of an optimal production process. Copyright © 2015 Elsevier Inc. All rights reserved.
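
A dense-matrix folding screen is essentially the Cartesian product of a few buffer factors mapped onto plate positions. The sketch below illustrates that layout step programmatically; the factor names and levels are hypothetical examples, not the published screen, and the output is only a generic worklist that a liquid handler script could consume.

```python
# Illustrative sketch (not the published screen): lay out a dense-matrix protein
# folding screen as the Cartesian product of buffer factors and map each condition
# to a 96-well plate position. Factor levels are hypothetical examples.
import itertools
import string

ph_levels = [7.0, 7.5, 8.0, 8.5]                  # buffer pH
arginine_mM = [0, 100, 250]                       # L-arginine additive
redox_ratios = ["1:1 GSH:GSSG", "5:1 GSH:GSSG"]   # redox couple
urea_M = [0, 1, 2, 4]                             # residual denaturant

conditions = list(itertools.product(ph_levels, arginine_mM, redox_ratios, urea_M))
assert len(conditions) <= 96, "screen must fit on one 96-well plate"

wells = [f"{row}{col}" for row in string.ascii_uppercase[:8] for col in range(1, 13)]

worklist = []
for well, (ph, arg, redox, urea) in zip(wells, conditions):
    worklist.append({"well": well, "pH": ph, "arginine_mM": arg,
                     "redox": redox, "urea_M": urea})

# Each row of the worklist could be translated into pipetting steps for the robot.
print(len(worklist), "conditions;", worklist[0])
```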

  11. Automated integrative high-throughput phenotyping of plant shoots: a case study of the cold-tolerance of pea (Pisum sativum L.).

    Science.gov (United States)

    Humplík, Jan F; Lazár, Dušan; Fürst, Tomáš; Husičková, Alexandra; Hýbl, Miroslav; Spíchal, Lukáš

    2015-01-01

    Recently emerging approaches to high-throughput plant phenotyping have demonstrated their importance as tools for unravelling the complex questions of plant growth, development and response to the environment, in both basic and applied science. High-throughput methods have also been used to study plant responses to various types of biotic and abiotic stresses (drought, heat, salinity, nutrient starvation, UV light) but only rarely to cold tolerance. We present here an experimental procedure of integrative high-throughput in-house phenotyping of plant shoots employing automated simultaneous analyses of shoot biomass and photosystem II efficiency to study the cold tolerance of pea (Pisum sativum L.). For this purpose, we developed new software for automatic RGB image analysis, evaluated various parameters of chlorophyll fluorescence obtained from kinetic chlorophyll fluorescence imaging, and performed an experiment in which the growth and photosynthetic activity of two different pea cultivars were followed during cold acclimation. The data obtained from the automated RGB imaging were validated through correlation of pixel-based shoot area with measurement of the shoot fresh weight. Further, data obtained from automated chlorophyll fluorescence imaging analysis were compared with chlorophyll fluorescence parameters measured by a non-imaging chlorophyll fluorometer. In both cases, high correlation was obtained, confirming the reliability of the procedure described. This study of the response of two pea cultivars to cold stress confirmed that our procedure may have important applications, not only for selection of cold-sensitive/tolerant varieties of pea, but also for studies of plant cold-response strategies in general. The approach provides a very broad tool for the morphological and physiological selection of parameters which correspond to shoot growth and the efficiency of photosystem II, and is thus applicable in studies of various plant species and crops.
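
The pixel-based shoot area mentioned above is typically obtained by segmenting green shoot pixels in the RGB images. The sketch below shows one common way to do this (an excess-green index with a global threshold); it is not the authors' software, and the tiny demo array and threshold are illustrative only.

```python
# Minimal sketch of RGB shoot segmentation (not the authors' software): compute an
# excess-green index (ExG = 2G - R - B) per pixel, threshold it, and use the count
# of shoot pixels as a proxy for projected shoot area.
import numpy as np

def shoot_pixel_area(rgb_image, exg_threshold=20):
    """Return the number of pixels classified as green shoot in an RGB uint8 image."""
    img = rgb_image.astype(np.int32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    exg = 2 * g - r - b                 # excess-green index
    shoot_mask = exg > exg_threshold    # simple global threshold
    return int(shoot_mask.sum())

# Hypothetical 2x2 image: one green pixel, one soil-coloured, two background pixels.
demo = np.array([[[30, 120, 40], [120, 100, 90]],
                 [[200, 200, 200], [10, 10, 10]]], dtype=np.uint8)
print(shoot_pixel_area(demo))  # -> 1
```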

  12. Application of high-throughput sequencing to measure the performance of commonly used selective cultivation methods for the foodborne pathogen Campylobacter.

    Science.gov (United States)

    Oakley, Brian B; Morales, Cesar A; Line, J Eric; Seal, Bruce S; Hiett, Kelli L

    2012-02-01

    Campylobacter is an important foodborne human pathogen, which has traditionally been studied using a variety of selective cultivation methods. Here we use next-generation sequencing to ask the following: (i) how selective are commonly used Campylobacter cultivation methods relative to the initial sample and (ii) how do the specificity and sensitivity of these methods compare with one another? To answer these questions, we used 16S rRNA tagged-pyrosequencing to sequence directly from a pooled fecal sample representing a c. 16,000 bird poultry flock and compared these data to exhaustive sequencing of colonies formed after plating. We compared five commonly used media [Cefex, Cape Town, modified cefoperazone charcoal deoxycholate agar (mCCDA), Campy-Line agar (CLA), and Campy-CVA agar (CVA)], two incubation atmospheres (10% CO2, 5% O2, 85% N2 and 10% CO2, 10% H2, 80% N2), and two incubation temperatures (37 and 42 °C). Analysis of 404,104 total sequence reads, including 19,472 total fecal reads, revealed that Campylobacter represented only a small proportion of the fecal community and that recovery differed by media type. Incubation atmosphere had little effect on recovery, but a significant difference in media specificity (more non-Campylobacter OTUs; P = 0.028) was found at 42 vs. 37 °C. The most common non-Campylobacter sequence type was Proteus, which ranged from 0.04% of sequences (mCCDA) to 10.8% (Cape Town). High-throughput sequencing provides a novel and powerful approach to measure the performance of selective media, which remain widely used for research and regulatory purposes. Published 2011. This article is a U.S. Government work and is in the public domain in the USA.
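
The media comparison above boils down to per-medium summaries of how many reads belong to the target genus and how many off-target genera appear. The sketch below illustrates that bookkeeping; the read counts are placeholders, not the study's data.

```python
# Sketch of a simple per-medium specificity summary (placeholder counts, not study
# data): given read counts by genus for each medium, report the fraction of
# Campylobacter reads and the non-target genera recovered.
reads_by_medium = {
    "Cefex":     {"Campylobacter": 950, "Proteus": 4, "Escherichia": 10},
    "Cape Town": {"Campylobacter": 800, "Proteus": 97, "Enterococcus": 25},
    "mCCDA":     {"Campylobacter": 990, "Proteus": 1},
}

for medium, counts in reads_by_medium.items():
    total = sum(counts.values())
    campy_frac = counts.get("Campylobacter", 0) / total
    off_target = [g for g in counts if g != "Campylobacter"]
    print(f"{medium}: {campy_frac:.1%} Campylobacter reads, "
          f"{len(off_target)} non-target genera ({', '.join(off_target)})")
```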

  13. A mobile, high-throughput semi-automated system for testing cognition in large non-primate animal models of Huntington disease.

    Science.gov (United States)

    McBride, Sebastian D; Perentos, Nicholas; Morton, A Jennifer

    2016-05-30

    Due to cost and ethical concerns, models of neurodegenerative disorders such as Huntington disease (HD) are currently being developed in farm animals, as an alternative to non-human primates. Developing reliable methods of testing cognitive function is essential to determining the usefulness of such models. Nevertheless, cognitive testing of farm animal species presents a unique set of challenges. The primary aims of this study were to develop and validate a mobile operant system suitable for high-throughput cognitive testing of sheep. We designed a semi-automated testing system with the capability of presenting stimuli (visual, auditory) and reward at six spatial locations. Fourteen normal sheep were used to validate the system using a two-choice visual discrimination task (2CVDT). Four stages of training devised to acclimatise animals to the system are also presented. All sheep progressed rapidly through the training stages, over eight sessions. All sheep learned the 2CVDT and performed at least one reversal stage. The mean number of trials the sheep took to reach criterion was 13.9 ± 1.5 for the first acquisition learning and 19.1 ± 1.8 for the reversal learning. This is the first mobile semi-automated operant system developed for testing cognitive function in sheep. We have designed and validated an automated operant behavioural testing system suitable for high-throughput cognitive testing in sheep and other medium-sized quadrupeds, such as pigs and dogs. Sheep performance in the two-choice visual discrimination task was very similar to that reported for non-human primates and strongly supports the use of farm animals as pre-clinical models for the study of neurodegenerative diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. High-throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium (BCAC) : Large-scale scoring of Ki67 in breast cancer TMAs

    NARCIS (Netherlands)

    M. Abubakar (Mustapha); W.J. Howat (Will); F. Daley (Frances); L. Zabaglo (Lila); L.A. McDuffus (Leigh-Anne); F. Blows (Fiona); P. Coulson (Penny); H.A. Raza (Ali); J. Benitez (Javier); R.L. Milne (Roger); H. Brenner (Hermann); C. Stegmaier (Christa); A. Mannermaa (Arto); J. Chang-Claude (Jenny); A. Rudolph (Anja); P. Sinn (Peter); F.J. Couch (Fergus); P. Devilee (Peter); J.D. Figueroa (Jonine); M.E. Sherman (Mark); J. Lissowska (Jolanta); S.M. Hewitt (Stephen); D. Eccles (Diana); M.J. Hooning (Maartje); A. Hollestelle (Antoinette); J.W.M. Martens (John); C.H.M. van Deurzen (Carolien); K. Investigators (Kconfab); M.K. Bolla (Manjeet); Q. Wang (Qin); M. Jones (Michael); M. Schoemaker (Minouk); A. Broeks (Annegien); F.E. van Leeuwen (Flora); L.J. van 't Veer (Laura); A.J. Swerdlow (Anthony ); N. Orr (Nick); M. Dowsett (Mitch); D.F. Easton (Douglas); M. Schmidt (Marjanka); P.D.P. Pharoah (Paul); M. García-Closas (Montserrat)

    2016-01-01

    textabstractAutomated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in

  15. Automated vector selection of SIVQ and parallel computing integration MATLAB™: Innovations supporting large-scale and high-throughput image analysis studies

    Directory of Open Access Journals (Sweden)

    Jerome Cheng

    2011-01-01

    Full Text Available Introduction: Spatially invariant vector quantization (SIVQ) is a texture and color-based image matching algorithm that queries the image space through the use of ring vectors. In prior studies, the selection of one or more optimal vectors for a particular feature of interest required a manual process, with the user initially stochastically selecting candidate vectors and subsequently testing them upon other regions of the image to verify the vector's sensitivity and specificity properties (typically by reviewing a resultant heat map). In carrying out the prior efforts, the SIVQ algorithm was noted to exhibit highly scalable computational properties, where each region of analysis can take place independently of others, making a compelling case for the exploration of its deployment on high-throughput computing platforms, with the hypothesis that such an exercise will result in performance gains that scale linearly with increasing processor count. Methods: An automated process was developed for the selection of optimal ring vectors to serve as the predicate matching operator in defining histopathological features of interest. Briefly, candidate vectors were generated from every possible coordinate origin within a user-defined vector selection area (VSA) and subsequently compared against user-identified positive and negative "ground truth" regions on the same image. Each vector from the VSA was assessed for its goodness-of-fit to both the positive and negative areas via the use of the receiver operating characteristic (ROC) transfer function, with each assessment resulting in an associated area-under-the-curve (AUC) figure of merit. Results: Use of the above-mentioned automated vector selection process was demonstrated in two use cases: first, to identify malignant colonic epithelium, and second, to identify soft tissue sarcoma. For both examples, a very satisfactory optimized vector was identified, as defined by the AUC metric. Finally, as an
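
The automated vector selection described above amounts to ranking candidate vectors by the ROC AUC of their match scores against labelled positive and negative ground-truth regions. The sketch below illustrates only that ranking step; it is not the SIVQ implementation, `candidate_scores` stands in for the real matching computation, and the scores and labels are placeholders.

```python
# Illustrative sketch of AUC-driven candidate ranking (not the SIVQ code): each
# candidate ring vector yields a match score per ground-truth pixel; candidates are
# ranked by ROC AUC against the positive/negative labels. Placeholder values only.
from sklearn.metrics import roc_auc_score

# Ground-truth labels for sampled pixels (1 = feature of interest, 0 = background).
labels = [1, 1, 1, 0, 0, 0, 1, 0]

# Match scores produced by three hypothetical candidate ring vectors.
candidate_scores = {
    "vector_A": [0.91, 0.85, 0.78, 0.30, 0.22, 0.41, 0.88, 0.35],
    "vector_B": [0.60, 0.55, 0.70, 0.52, 0.49, 0.61, 0.58, 0.50],
    "vector_C": [0.80, 0.75, 0.95, 0.20, 0.15, 0.25, 0.90, 0.45],
}

ranked = sorted(((roc_auc_score(labels, scores), name)
                 for name, scores in candidate_scores.items()), reverse=True)
for auc, name in ranked:
    print(f"{name}: AUC = {auc:.2f}")

best_auc, best_vector = ranked[0]
print("selected:", best_vector)
```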

  16. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digest and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol had performance equal to or better than a manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility – High throughput sample evaluation and automation

    International Nuclear Information System (INIS)

    Theveneau, P; Baker, R; Barrett, R; Beteva, A; Bowler, M W; Carpentier, P; Caserotto, H; Sanctis, D de; Dobias, F; Flot, D; Guijarro, M; Giraud, T; Lentini, M; Leonard, G A; Mattenet, M; McSweeney, S M; Morawe, C; Nurizzo, D; McCarthy, A A; Nanao, M

    2013-01-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This 'first generation' of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small-angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  18. Yeast Replicator: A High-Throughput Multiplexed Microfluidics Platform for Automated Measurements of Single-Cell Aging

    Directory of Open Access Journals (Sweden)

    Ping Liu

    2015-10-01

    Full Text Available The yeast Saccharomyces cerevisiae is a model organism for replicative aging studies; however, conventional lifespan measurement platforms have several limitations. Here, we present a microfluidics platform that facilitates simultaneous lifespan and gene expression measurements of aging yeast cells. Our multiplexed high-throughput platform offers the capability to perform independent lifespan experiments using different yeast strains or growth media. Using this platform in minimal media environments containing glucose, we measured the full lifespan of individual yeast cells in wild-type and canonical gene deletion backgrounds. Compared to glucose, in galactose we observed a 16.8% decrease in replicative lifespan accompanied by an ∼2-fold increase in single-cell oxidative stress levels reported by PSOD1-mCherry. Using PGAL1-YFP to measure the activity of the bistable galactose network, we saw that OFF and ON cells are similar in their lifespan. Our work shows that aging cells are committed to a single phenotypic state throughout their lifespan.

  19. Automated high throughput whole slide imaging using area sensors, flash light illumination and solid state light engine.

    Science.gov (United States)

    Varga, Viktor Sebestyén; Molnár, Béla; Virág, Tibor

    2012-01-01

    Whole Slide Imagers or digital slide scanners have developed very rapidly in the last 8 years and have gone through three generations. Third-generation instruments, which have the stability and throughput to be used for routine clinical work, have just reached the market. We describe in this article the technical background and reasoning behind engineering decisions we made during the development of 3DHISTECH's 3rd generation combined brightfield and fluorescent scanner. The Panoramic 250 FLASH utilizes Plan-Apochromat 20x and 40x objectives, a 2-megapixel 3CCD camera for brightfield and a monochrome scientific CMOS camera for fluorescent scanning. A solid-state light engine is used for fluorescent illumination and a strobe light for brightfield illumination. The system can scan 1 cm2, including focusing, at 45x resolution in 1 minute. It can scan a well-stained 1 cm2 fluorescent slide (DAPI, FITC, TRITC) in 11 minutes. It can load and scan 250 slides in walk-away mode. Using the latest camera technology and electronics, a state-of-the-art computer and standard microscope optical components, high-throughput, high-quality whole slide imaging is feasible and is sufficient for most routine diagnostic work. Extended depth of field and Z-stack scanning are possible with the use of area-scan technology.

  20. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    Science.gov (United States)

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present the calculation of high-precision Arrhenius plots with Qgui, obtained by running a large number of EVB simulations, to extract the thermodynamic activation enthalpy and entropy. Copyright © 2015 Elsevier Inc. All rights reserved.
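
Activation enthalpy and entropy of the kind mentioned above are commonly extracted by a linear fit of the Eyring form ln(k/T) = ln(kB/h) + ΔS‡/R − ΔH‡/(RT) over a set of temperatures. The sketch below shows that fit; it is not Qgui code, and the rate constants are placeholder values.

```python
# Sketch of extracting activation enthalpy and entropy from computed rate constants
# (not Qgui code). A linear fit of ln(k/T) vs 1/T (Eyring form) gives dH from the
# slope and dS from the intercept. Rate constants below are placeholders.
import numpy as np

R = 8.314462618            # gas constant, J/(mol*K)
KB_OVER_H = 2.083661912e10 # kB/h in 1/(K*s)

def eyring_fit(temps_K, rate_constants):
    """Return (dH in kJ/mol, dS in J/(mol*K)) from an Eyring fit."""
    T = np.asarray(temps_K, dtype=float)
    k = np.asarray(rate_constants, dtype=float)
    y = np.log(k / T)
    slope, intercept = np.polyfit(1.0 / T, y, 1)
    dH = -slope * R / 1000.0                   # kJ/mol
    dS = (intercept - np.log(KB_OVER_H)) * R   # J/(mol*K)
    return dH, dS

temps = [280.0, 290.0, 300.0, 310.0, 320.0]    # K, hypothetical
rates = [0.8, 2.1, 5.0, 11.5, 24.0]            # 1/s, hypothetical

dH, dS = eyring_fit(temps, rates)
print(f"dH = {dH:.1f} kJ/mol, dS = {dS:.1f} J/(mol*K)")
```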

  1. Strain Library Imaging Protocol for high-throughput, automated single-cell microscopy of large bacterial collections arrayed on multiwell plates.

    Science.gov (United States)

    Shi, Handuo; Colavin, Alexandre; Lee, Timothy K; Huang, Kerwyn Casey

    2017-02-01

    Single-cell microscopy is a powerful tool for studying gene functions using strain libraries, but it suffers from throughput limitations. Here we describe the Strain Library Imaging Protocol (SLIP), which is a high-throughput, automated microscopy workflow for large strain collections that requires minimal user involvement. SLIP involves transferring arrayed bacterial cultures from multiwell plates onto large agar pads using inexpensive replicator pins and automatically imaging the resulting single cells. The acquired images are subsequently reviewed and analyzed by custom MATLAB scripts that segment single-cell contours and extract quantitative metrics. SLIP yields rich data sets on cell morphology and gene expression that illustrate the function of certain genes and the connections among strains in a library. For a library arrayed on 96-well plates, image acquisition can be completed within 4 min per plate.
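
SLIP's image analysis is done with custom MATLAB scripts; as a hedged Python analogue of the generic segment-and-measure step it describes, the sketch below thresholds an image, labels connected components as candidate cells, and extracts simple morphology metrics with scikit-image. The image path and size filter are hypothetical.

```python
# Python analogue of a generic single-cell segment-and-measure step (SLIP itself uses
# custom MATLAB scripts): threshold a frame, label connected components as candidate
# cells, and extract per-cell morphology metrics. The image path is hypothetical.
import numpy as np
from skimage import io, filters, measure

def measure_cells(image_path, min_area_px=50):
    img = io.imread(image_path, as_gray=True)
    thresh = filters.threshold_otsu(img)
    mask = img > thresh                      # invert the comparison if cells are dark
    labels = measure.label(mask)
    cells = []
    for region in measure.regionprops(labels, intensity_image=img):
        if region.area < min_area_px:        # discard small debris
            continue
        cells.append({
            "area_px": int(region.area),
            "major_axis_px": float(region.major_axis_length),
            "minor_axis_px": float(region.minor_axis_length),
            "mean_intensity": float(region.mean_intensity),
        })
    return cells

if __name__ == "__main__":
    for cell in measure_cells("plate01_wellA1_frame001.tif"):
        print(cell)
```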

  2. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications

    DEFF Research Database (Denmark)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik

    2016-01-01

    ... was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reverting the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD ...), and a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool...

  3. Automation methodologies and large-scale validation for GW: Towards high-throughput GW calculations

    Science.gov (United States)

    van Setten, M. J.; Giantomassi, M.; Gonze, X.; Rignanese, G.-M.; Hautier, G.

    2017-10-01

    The search for new materials based on computational screening relies on methods that accurately predict, in an automatic manner, total energy, atomic-scale geometries, and other fundamental characteristics of materials. Many technologically important material properties directly stem from the electronic structure of a material, but the usual workhorse for total energies, namely density-functional theory, is plagued by fundamental shortcomings and errors from approximate exchange-correlation functionals in its prediction of the electronic structure. In contrast, the GW method is currently the state-of-the-art ab initio approach for accurate electronic structure. It is mostly used to perturbatively correct density-functional theory results, but is, however, computationally demanding and also requires expert knowledge to give accurate results. Accordingly, it is not presently used in high-throughput screening: fully automatized algorithms for setting up the calculations and determining convergence are lacking. In this paper, we develop such a method and, as a first application, use it to validate the accuracy of G0W0 using the PBE starting point and the Godby-Needs plasmon-pole model (G0W0-GN@PBE) on a set of about 80 solids. The results of the automatic convergence study utilized provide valuable insights. Indeed, we find correlations between computational parameters that can be used to further improve the automatization of GW calculations. Moreover, we find that the correlation between the PBE and the G0W0-GN@PBE gaps is much stronger than that between the GW and experimental gaps. However, the G0W0-GN@PBE gaps still describe the experimental gaps more accurately than a linear model based on the PBE gaps. With this paper, we hence show that GW can be made automatic and is more accurate than using an empirical correction of the PBE gap, but that, for accurate predictive results for a broad class of materials, an improved starting point or some
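
The comparison described above, between an empirical linear correction of PBE gaps and the GW gaps themselves, can be sketched as a simple least-squares fit and error comparison. The gap values below are hypothetical placeholders, not the paper's validation set.

```python
# Sketch of comparing an empirical linear correction of PBE gaps against GW gaps
# (placeholder gap values, not the paper's data): fit exp_gap ~ a*pbe_gap + b, then
# compare mean absolute errors of PBE, the linear model, and GW against experiment.
import numpy as np

pbe_gap = np.array([0.6, 1.1, 2.4, 3.1, 5.0])   # eV, hypothetical
gw_gap = np.array([1.1, 1.9, 3.4, 4.3, 6.6])    # eV, hypothetical
exp_gap = np.array([1.2, 2.0, 3.3, 4.4, 6.8])   # eV, hypothetical

slope, intercept = np.polyfit(pbe_gap, exp_gap, 1)   # empirical correction of PBE
corrected_pbe = slope * pbe_gap + intercept
corr_pbe_gw = np.corrcoef(pbe_gap, gw_gap)[0, 1]     # PBE vs GW gap correlation

def mae(pred, ref):
    return float(np.mean(np.abs(pred - ref)))

print(f"PBE-GW gap correlation: r = {corr_pbe_gw:.3f}")
print("MAE vs experiment: PBE {:.2f} eV, linear model {:.2f} eV, GW {:.2f} eV".format(
    mae(pbe_gap, exp_gap), mae(corrected_pbe, exp_gap), mae(gw_gap, exp_gap)))
```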

  4. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Science.gov (United States)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-11-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called "Robofurnace." Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  5. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  6. High-throughput screening of cellulase F mutants from multiplexed plasmid sets using an automated plate assay on a functional proteomic robotic workcell

    Directory of Open Access Journals (Sweden)

    Qureshi Nasib

    2006-05-01

    Full Text Available Abstract Background The field of plasmid-based functional proteomics requires the rapid assay of proteins expressed from plasmid libraries. Automation is essential since large sets of mutant open reading frames are being cloned for evaluation. To date no integrated automated platform is available to carry out the entire process including production of plasmid libraries, expression of cloned genes, and functional testing of expressed proteins. Results We used a functional proteomic assay in a multiplexed setting on an integrated plasmid-based robotic workcell for high-throughput screening of mutants of cellulase F, an endoglucanase from the anaerobic fungus Orpinomyces PC-2. This allowed us to identify plasmids containing optimized clones expressing mutants with improved activity at lower pH. A plasmid library of mutagenized clones of the celF gene with targeted variations in the last four codons was constructed by site-directed PCR mutagenesis and transformed into Escherichia coli. A robotic picker integrated into the workcell was used to inoculate medium in a 96-well deep well plate, combining the transformants into a multiplexed set in each well, and the plate was incubated on the workcell. Plasmids were prepared from the multiplexed culture on the liquid handler component of the workcell and used for in vitro transcription/translation. The multiplexed expressed recombinant proteins were screened for improved activity and stability in an azo-carboxymethylcellulose plate assay. The multiplexed wells containing mutants with improved activity were identified and linked back to the corresponding multiplexed cultures stored in glycerol. Spread plates were prepared from the glycerol stocks and the workcell was used to pick single colonies from the spread plates, prepare plasmid, produce recombinant protein, and assay for activity. The screening assay and subsequent deconvolution of the multiplexed wells resulted in identification of improved Cel

  7. High-throughput de novo screening of receptor agonists with an automated single-cell analysis and isolation system.

    Science.gov (United States)

    Yoshimoto, Nobuo; Tatematsu, Kenji; Iijima, Masumi; Niimi, Tomoaki; Maturana, Andrés D; Fujii, Ikuo; Kondo, Akihiko; Tanizawa, Katsuyuki; Kuroda, Shun'ichi

    2014-02-28

    Reconstitution of signaling pathways involving single mammalian transmembrane receptors has not been accomplished in yeast cells. In this study, intact EGF receptor (EGFR) and a cell wall-anchored form of EGF were co-expressed on the yeast cell surface, which led to autophosphorylation of the EGFR in an EGF-dependent autocrine manner. After changing from EGF to a conformationally constrained peptide library, cells were fluorescently labeled with an anti-phospho-EGFR antibody. Each cell was subjected to an automated single-cell analysis and isolation system that analyzed the fluorescent intensity of each cell and automatically retrieved the cells with the highest fluorescence. From a peptide library of ~3.2 × 10^6 members, we isolated six novel peptides with agonistic activity toward the EGFR in human squamous carcinoma A431 cells. The combination of yeast cells expressing mammalian receptors, a cell wall-anchored peptide library, and an automated single-cell analysis and isolation system might facilitate a rational approach for de novo drug screening.

  8. Automated High-Throughput Genotyping for Study of Global Epidemiology of Mycobacterium tuberculosis Based on Mycobacterial Interspersed Repetitive Units

    Science.gov (United States)

    Supply, Philip; Lesjean, Sarah; Savine, Evgueni; Kremer, Kristin; van Soolingen, Dick; Locht, Camille

    2001-01-01

    Large-scale genotyping of Mycobacterium tuberculosis is especially challenging, as the current typing methods are labor-intensive and the results are difficult to compare among laboratories. Here, automated typing based on variable-number tandem repeats (VNTRs) of genetic elements named mycobacterial interspersed repetitive units (MIRUs) in 12 mammalian minisatellite-like loci of M. tuberculosis is presented. This system combines analysis of multiplex PCRs on a fluorescence-based DNA analyzer with computerized automation of the genotyping. Analysis of a blinded reference set of 90 strains from 38 countries (K. Kremer et al., J. Clin. Microbiol. 37:2607–2618, 1999) demonstrated that it is 100% reproducible, sensitive, and specific for M. tuberculosis complex isolates, a performance that has not been achieved by any other typing method tested in the same conditions. MIRU-VNTRs can be used for analysis of the global genetic diversity of M. tuberculosis complex strains at different levels of evolutionary divergence. To fully exploit the portability of this typing system, a website was set up for the analysis of M. tuberculosis MIRU-VNTR genotypes via the Internet. This opens the way for global epidemiological surveillance of tuberculosis and should lead to novel insights into the evolutionary and population genetics of this major pathogen. PMID:11574573
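
The portability of MIRU-VNTR typing described above comes from representing each isolate as a vector of repeat counts at the 12 loci, which makes downstream comparison straightforward. The sketch below is a generic illustration of that representation; the isolate names and repeat counts are hypothetical, not from the reference strain set.

```python
# Sketch of downstream handling of 12-locus MIRU-VNTR genotypes (hypothetical
# genotypes, not reference-set data): each isolate is a vector of repeat counts, and
# isolates are compared by the number of loci at which the counts differ.
def miru_distance(genotype_a, genotype_b):
    """Number of MIRU loci with differing repeat counts between two isolates."""
    assert len(genotype_a) == len(genotype_b) == 12
    return sum(a != b for a, b in zip(genotype_a, genotype_b))

# Repeat counts at the 12 MIRU loci for three hypothetical isolates.
isolates = {
    "isolate_1": [2, 3, 5, 3, 3, 2, 4, 3, 3, 4, 2, 3],
    "isolate_2": [2, 3, 5, 3, 3, 2, 4, 3, 3, 4, 2, 2],
    "isolate_3": [2, 4, 6, 3, 2, 2, 5, 3, 3, 3, 2, 3],
}

names = list(isolates)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        d = miru_distance(isolates[a], isolates[b])
        print(f"{a} vs {b}: {d} differing loci")
```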

  9. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  10. Inertial Microfluidic Cell Stretcher (iMCS): Fully Automated, High-Throughput, and Near Real-Time Cell Mechanotyping.

    Science.gov (United States)

    Deng, Yanxiang; Davis, Steven P; Yang, Fan; Paulsen, Kevin S; Kumar, Maneesh; Sinnott DeVaux, Rebecca; Wang, Xianhui; Conklin, Douglas S; Oberai, Assad; Herschkowitz, Jason I; Chung, Aram J

    2017-07-01

    Mechanical biomarkers associated with cytoskeletal structures have been reported as powerful label-free cell state identifiers. In order to measure cell mechanical properties, traditional biophysical (e.g., atomic force microscopy, micropipette aspiration, optical stretchers) and microfluidic approaches were mainly employed; however, they critically suffer from low throughput, low sensitivity, and/or time-consuming and labor-intensive processes, which prevents these techniques from being practically used for cell biology research applications. Here, a novel inertial microfluidic cell stretcher (iMCS) capable of characterizing single-cell deformability across large populations in near real-time is presented. The platform inertially controls cell positions in microchannels and deforms cells upon collision at a T-junction with large strain. The cell elongation motions are recorded, and deformability information for thousands of cells is visualized in near real-time, similar to traditional flow cytometry. With full automation, the entire cell mechanotyping process runs without any human intervention, realizing user-friendly and robust operation. Through iMCS, distinct cell stiffness changes in breast cancer progression and epithelial-mesenchymal transition are reported, and the use of the platform for rapid cancer drug discovery is shown as well. The platform returns large populations of single-cell quantitative mechanical properties (e.g., shear modulus) on-the-fly with high statistical significance, enabling actual use in clinical and biophysical studies. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Enzyme engineering: A synthetic biology approach for more effective library generation and automated high-throughput screening

    Science.gov (United States)

    Ebert, Maximilian C. C. J. C.; Mugford, Paul F.; Pelletier, Joelle N.

    2017-01-01

    The Golden Gate strategy entails the use of type IIS restriction enzymes, which cut outside of their recognition sequence. It enables unrestricted design of unique DNA fragments that can be readily and seamlessly recombined. Successfully employed in other synthetic biology applications, we demonstrate its advantageous use to engineer a biocatalyst. Hot-spots for mutations were individuated in three distinct regions of Candida antarctica lipase A (Cal-A), the biocatalyst chosen as a target to demonstrate the versatility of this recombination method. The three corresponding gene segments were subjected to the most appropriate method of mutagenesis (targeted or random). Their straightforward reassembly allowed combining products of different mutagenesis methods in a single round for rapid production of a series of diverse libraries, thus facilitating directed evolution. Screening to improve discrimination of short-chain versus long-chain fatty acid substrates was aided by development of a general, automated method for visual discrimination of the hydrolysis of varied substrates by whole cells. PMID:28178357

  12. Enzyme engineering: A synthetic biology approach for more effective library generation and automated high-throughput screening.

    Directory of Open Access Journals (Sweden)

    Daniela Quaglia

    Full Text Available The Golden Gate strategy entails the use of type IIS restriction enzymes, which cut outside of their recognition sequence. It enables unrestricted design of unique DNA fragments that can be readily and seamlessly recombined. Successfully employed in other synthetic biology applications, we demonstrate its advantageous use to engineer a biocatalyst. Hot-spots for mutations were individuated in three distinct regions of Candida antarctica lipase A (Cal-A), the biocatalyst chosen as a target to demonstrate the versatility of this recombination method. The three corresponding gene segments were subjected to the most appropriate method of mutagenesis (targeted or random). Their straightforward reassembly allowed combining products of different mutagenesis methods in a single round for rapid production of a series of diverse libraries, thus facilitating directed evolution. Screening to improve discrimination of short-chain versus long-chain fatty acid substrates was aided by development of a general, automated method for visual discrimination of the hydrolysis of varied substrates by whole cells.

  13. Automation and integration of polymerase chain reaction with capillary electrophoresis for high throughput genotyping and disease diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, N.

    1999-02-12

    Genotyping detects specific loci in the human genome. These loci provide important information for forensic testing, construction of genetic linkage maps, diagnosis of gene-related diseases and pharmacogenetic research. Genotyping is becoming more and more popular now that these loci can be easily amplified by the polymerase chain reaction (PCR). Capillary electrophoresis has unique advantages for DNA analysis due to its fast heat dissipation and ease of automation. Four projects are described in which genotyping is performed by capillary electrophoresis, each emphasizing different aspects. First, the author demonstrates a principle for determining the genotype with a capillary electrophoresis system; VNTR polymorphism in the human D1S80 locus was studied. Second, the separation of four short tandem repeat (STR) loci vWF, THO1, TPOX and CSF1PO (CTTv) using poly(ethylene oxide) (PEO) was studied to achieve high resolution and prevent rehybridization of the DNA fragments. Separation under denaturing and non-denaturing conditions and at elevated temperature was discussed. Third, a 250 µm i.d., 365 µm o.d. fused silica capillary was used as the microreactor for PCR. Fourth, direct PCR from blood was studied to simplify sample preparation for genotyping to a minimum.

  14. Semi-automated high-throughput fluorescent intercalator displacement-based discovery of cytotoxic DNA binding agents from a large compound library.

    Science.gov (United States)

    Glass, Lateca S; Bapat, Aditi; Kelley, Mark R; Georgiadis, Millie M; Long, Eric C

    2010-03-01

    High-throughput fluorescent intercalator displacement (HT-FID) was adapted to the semi-automated screening of a commercial compound library containing 60,000 molecules, resulting in the discovery of cytotoxic DNA-targeted agents. Although commercial libraries are routinely screened in drug discovery efforts, the DNA binding potential of the compounds they contain has largely been overlooked. HT-FID led to the rapid identification of a number of compounds whose DNA binding properties were validated through demonstration of concentration-dependent DNA binding and increased thermal melting of A/T- or G/C-rich DNA sequences. Selected compounds were assayed further for inhibition of cell proliferation in glioblastoma cells. Seven distinct compounds emerged from this screening procedure that represent structures not previously known to be capable of targeting DNA and causing cell death. These agents may represent structures worthy of further modification to optimally explore their potential as cytotoxic anti-cancer agents. In addition, the general screening strategy described may have broader impact on the rapid discovery of DNA-targeted agents with biological activity. Copyright 2010 Elsevier Ltd. All rights reserved.

  15. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA]; Coleman, Matthew A [Oakland, CA]; Segelke, Brent W [San Ramon, CA]

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods, and for detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.

  16. A novel library-independent approach based on high-throughput cultivation in Bioscreen and fingerprinting by FTIR spectroscopy for microbial source tracking in food industry.

    Science.gov (United States)

    Shapaval, V; Møretrø, T; Wold Åsli, A; Suso, H P; Schmitt, J; Lillehaug, D; Kohler, A

    2017-05-01

    Microbiological source tracking (MST) for the food industry is a rapidly growing area of research and technology development. In this paper, a new library-independent approach for MST is presented, based on high-throughput liquid microcultivation and FTIR spectroscopy. In this approach, FTIR spectra obtained from micro-organisms isolated along the production line and from the product are compared to each other. We tested and evaluated the new source tracking approach by simulating a source tracking situation. In this simulation study, a selection of 20 spoilage mould strains from a total of six genera (Alternaria, Aspergillus, Mucor, Paecilomyces, Peyronellaea and Phoma) was used. The simulation showed that 80-100% of the sources could be correctly identified at the genus/species level. When performing source tracking simulations, the FTIR identification diverged for a Phoma glomerata strain in the reference collection; when the strain was re-identified by sequencing, it turned out to be Peyronellaea arachidicola. The results demonstrate that the proposed approach is a versatile tool for identifying sources of microbial contamination, with high potential for routine control in the food industry owing to its low cost and analysis time. Source tracking of fungal contamination in the food industry is an important aspect of food safety. Currently, all available methods are time consuming and require the use of a reference library that may limit the accuracy of the identification. In this study, we report, for the first time, a library-independent FTIR spectroscopic approach for MST of fungal contamination along the food production line. It combines high-throughput microcultivation and FTIR spectroscopy and is specific at the genus and species level; such an approach is therefore of great importance for food safety control in the food industry. © 2016 The Society for Applied Microbiology.
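    To make the library-independent comparison concrete, the sketch below ranks production-line isolates by the similarity of their FTIR spectra to the spectrum of the isolate recovered from the product. The data layout, sample names and use of plain Pearson correlation are assumptions for illustration; a real FTIR workflow would add preprocessing (e.g., baseline correction or derivatives) and a validated similarity metric.

        # Assumed data layout: spectra are absorbance arrays on a common wavenumber grid.
        import numpy as np

        def rank_candidate_sources(product_spectrum, line_spectra):
            """Rank isolates from the production line by correlation with the product isolate."""
            scores = {name: np.corrcoef(product_spectrum, spec)[0, 1]
                      for name, spec in line_spectra.items()}
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        rng = np.random.default_rng(0)
        product = rng.random(500)
        line_spectra = {"drain_A": product + 0.01 * rng.random(500),   # near-identical strain
                        "conveyor_B": rng.random(500)}                 # unrelated strain
        print(rank_candidate_sources(product, line_spectra)[0][0])     # -> drain_A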

  17. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  18. Integrated automation for continuous high-throughput synthetic chromosome assembly and transformation to identify improved yeast strains for industrial production of peptide sweetener brazzein

    Science.gov (United States)

    Production and recycling of recombinant sweetener peptides in industrial biorefineries involves the evaluation of large numbers of genes and proteins. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly synthesize, clone, and express heterologous gene ope...

  19. MODELING OF AUTOMATION PROCESSES CONCERNING CROP CULTIVATION BY AVIATION

    Directory of Open Access Journals (Sweden)

    V. I. Ryabkov

    2010-01-01

    Full Text Available The paper considers modeling of automation processes concerning crop cultivation by aviation. Processes that take place in three interconnected environments (human, technical, and mobile airborne objects) are described by a model based on set theory. A stochastic network theory of queueing systems is proposed for describing the real-time human-machine system.

  20. An Automated Method for High-Throughput Screening of Arabidopsis Rosette Growth in Multi-Well Plates and Its Validation in Stress Conditions

    Czech Academy of Sciences Publication Activity Database

    De Diego, N.; Fürst, T.; Humplík, Jan; Ugena, L.; Podlešáková, K.; Spíchal, L.

    2017-01-01

    Vol. 8, OCT 4 (2017), article no. 1702. ISSN 1664-462X R&D Projects: GA MŠk(CZ) LO1204 Institutional support: RVO:61389030 Keywords: salt stress * chlorophyll fluorescence * salinity tolerance * plant-responses * cold-tolerance * water-deficit * thaliana * selection * platform * reveals * high-throughput screening assay * Arabidopsis * multi-well plates * rosette growth * stress conditions Subject RIV: EB - Genetics; Molecular Biology OECD field: Plant sciences, botany Impact factor: 4.298, year: 2016

  1. High-throughput method of dioxin analysis in aqueous samples using consecutive solid phase extraction steps with the new C18 Ultraflow™ pressurized liquid extraction and automated clean-up.

    Science.gov (United States)

    Youn, Yeu-Young; Park, Deok Hie; Lee, Yeon Hwa; Lim, Young Hee; Cho, Hye Sung

    2015-01-01

    A high-throughput analytical method has been developed for the determination of seventeen 2,3,7,8-substituted congeners of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in aqueous samples. A recently introduced octadecyl (C18) disk for semi-automated solid-phase extraction of PCDD/Fs in water samples with a high level of particulate material has been tested for the analysis of dioxins. This type of C18 disk was specially designed for the analysis of hexane extractable material (HEM) but had never previously been reported for use in PCDD/F analysis. It allows a higher filtration flow, thereby reducing analysis time. The solid-phase extraction technique converts samples from liquid to solid, so that pressurized liquid extraction (PLE) can be used in the pre-treatment. In order to achieve efficient purification, extracts from the PLE are purified using an automated Power-prep system with disposable silica, alumina, and carbon columns. Quantitative analyses of PCDD/Fs were performed by GC-HRMS using multi-ion detection (MID) mode. The method was successfully applied to the analysis of water samples from the wastewater treatment system of a vinyl chloride monomer plant. The entire procedure is in agreement with EPA Method 1613 recommendations regarding blank control, MDLs (method detection limits), accuracy, and precision. The high-throughput method not only meets the requirements of international standards, but also shortens the required analysis time from 2 weeks to 3 days. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure...... control, UV absorbance measurements and automated data analysis. As little as 15 µl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis...... cycle can be performed in less than 3 min. Bovine serum albumin was used as a model protein to characterize the mixing efficiency and sample consumption of the system. The N2 fragment of an adaptor protein (p120-RasGAP) was used to demonstrate how the device can be used to survey the structural space

  3. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any...... commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes......-dimensional data in terms of the indirect Fourier transform using the objective Bayesian approach to obtain the pair-distance distribution function, PDDF, and is thereby a free and open-source alternative to existing PDDF estimation software. Apart from the TIFF input format, the program also accepts ASCII...
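    The central data-reduction step this record refers to (two-dimensional detector image to a one-dimensional I(q) curve) can be sketched as an azimuthal average. The geometry parameters and binning below are illustrative assumptions; BioXTAS RAW itself additionally handles masking, normalization, calibration and error propagation.

        # Minimal sketch of 2D -> 1D SAXS reduction by azimuthal (radial) averaging.
        import numpy as np

        def radial_average(image, center_xy, pixel_size_m, det_dist_m, wavelength_A, nbins=200):
            """center_xy = (column, row) of the beam center on the detector image."""
            ny, nx = image.shape
            y, x = np.indices((ny, nx))
            r_m = np.hypot(x - center_xy[0], y - center_xy[1]) * pixel_size_m
            theta = 0.5 * np.arctan(r_m / det_dist_m)           # scattering half-angle
            q = 4.0 * np.pi * np.sin(theta) / wavelength_A      # momentum transfer, 1/Angstrom
            bins = np.linspace(q.min(), q.max(), nbins + 1)
            idx = np.clip(np.digitize(q.ravel(), bins) - 1, 0, nbins - 1)
            intensity = np.bincount(idx, weights=image.ravel(), minlength=nbins)
            counts = np.bincount(idx, minlength=nbins)
            q_centers = 0.5 * (bins[:-1] + bins[1:])
            return q_centers, intensity / np.maximum(counts, 1)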

  4. Fully automated high-throughput chromatin immunoprecipitation for ChIP-seq: identifying ChIP-quality p300 monoclonal antibodies.

    Science.gov (United States)

    Gasper, William C; Marinov, Georgi K; Pauli-Behn, Florencia; Scott, Max T; Newberry, Kimberly; DeSalvo, Gilberto; Ou, Susan; Myers, Richard M; Vielmetter, Jost; Wold, Barbara J

    2014-06-12

    Chromatin immunoprecipitation coupled with DNA sequencing (ChIP-seq) is the major contemporary method for mapping in vivo protein-DNA interactions in the genome. It identifies sites of transcription factor, cofactor and RNA polymerase occupancy, as well as the distribution of histone marks. Consortia such as the ENCyclopedia Of DNA Elements (ENCODE) have produced large datasets using manual protocols. However, future measurements of hundreds of additional factors in many cell types and physiological states call for higher throughput and consistency afforded by automation. Such automation advances, when provided by multiuser facilities, could also improve the quality and efficiency of individual small-scale projects. The immunoprecipitation process has become rate-limiting, and is a source of substantial variability when performed manually. Here we report a fully automated robotic ChIP (R-ChIP) pipeline that allows up to 96 reactions. A second bottleneck is the dearth of renewable ChIP-validated immune reagents, which do not yet exist for most mammalian transcription factors. We used R-ChIP to screen new mouse monoclonal antibodies raised against p300, a histone acetylase, well-known as a marker of active enhancers, for which ChIP-competent monoclonal reagents have been lacking. We identified, validated for ChIP-seq, and made publicly available a monoclonal reagent called ENCITp300-1.

  5. High-throughput and automated diagnosis of antimicrobial resistance using a cost-effective cellphone-based micro-plate reader

    Science.gov (United States)

    Feng, Steve; Tseng, Derek; di Carlo, Dino; Garner, Omai B.; Ozcan, Aydogan

    2016-12-01

    Routine antimicrobial susceptibility testing (AST) can prevent deaths due to bacteria and reduce the spread of multi-drug resistance, but cannot be regularly performed in resource-limited settings due to technological challenges, high costs, and lack of trained professionals. We demonstrate an automated and cost-effective cellphone-based 96-well microtiter plate (MTP) reader, capable of performing AST without the need for trained diagnosticians. Our system includes a 3D-printed smartphone attachment that holds and illuminates the MTP using a light-emitting diode array. An inexpensive optical fiber array enables the capture of the transmitted light of each well through the smartphone camera. A custom-designed application sends the captured image to a server to automatically determine well turbidity, with results returned to the smartphone in ~1 minute. We tested this mobile reader using MTPs prepared with 17 antibiotics targeting Gram-negative bacteria on clinical isolates of Klebsiella pneumoniae containing highly resistant antimicrobial profiles. Using 78 patient isolate test plates, we demonstrated that our mobile reader meets the FDA-defined AST criteria, with a well-turbidity detection accuracy of 98.21%, minimum inhibitory concentration accuracy of 95.12%, and a drug-susceptibility interpretation accuracy of 99.23%, with no very major errors. This mobile reader could eliminate the need for trained diagnosticians to perform AST, reduce the cost barrier for routine testing, and assist in spatio-temporal tracking of bacterial resistance.
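    The decision logic downstream of the image capture can be sketched as follows: each well's transmitted-light reading is turned into a growth/no-growth call, and the minimum inhibitory concentration (MIC) along one antibiotic dilution row is the lowest concentration showing no growth. The threshold and readings are hypothetical; the actual system uses a calibrated server-side analysis validated against FDA criteria.

        # Hypothetical sketch: growth calls from normalized transmitted light, then the MIC.
        import numpy as np

        def mic_from_row(intensities, concentrations, growth_threshold=0.65):
            """intensities: transmitted light normalized to a blank well (lower = more turbid);
            concentrations must be sorted in increasing order."""
            growth = np.asarray(intensities) < growth_threshold
            for conc, grew in zip(concentrations, growth):
                if not grew:
                    return conc        # lowest concentration that suppresses growth
            return None                # growth at all tested concentrations (resistant)

        row_intensity = [0.30, 0.35, 0.40, 0.80, 0.85, 0.90]   # made-up readings
        row_conc_ugml = [0.25, 0.5, 1, 2, 4, 8]                # two-fold dilution series
        print(mic_from_row(row_intensity, row_conc_ugml))      # -> 2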

  6. Combinatorial approaches for high-throughput characterization of mechanical properties

    Directory of Open Access Journals (Sweden)

    Xiaokun Zhang

    2017-09-01

    Full Text Available Since the first successful story was reported in the mid-1990s, combinatorial materials science has attracted more and more attention in the materials community. In the past two decades, a great amount of effort has been made to develop combinatorial high-throughput approaches for materials research. However, few high-throughput mechanical characterization methods and tools have been reported. To date, a number of micro-scale mechanical characterization tools have been developed, which provide a basis for combinatorial high-throughput mechanical characterization. Many existing micro-mechanical testing apparatuses can be readily modified for high-throughput characterization. For example, automated scanning nanoindentation is used for measuring the hardness and elastic modulus of diffusion-multiple alloy samples, and cantilever beam arrays are used to characterize in parallel the thermo-mechanical behavior of thin films with wide composition gradients. The interpretation of micro-mechanical testing data from thin films and micro-scale samples is most critical and challenging, as the mechanical properties of their bulk counterparts cannot be intuitively extrapolated due to the well-known size and microstructure dependence. Nevertheless, high-throughput mechanical characterization data from combinatorial micro-scale samples still reflect the dependence of the mechanical properties on composition and microstructure, which facilitates the understanding of intrinsic materials behavior and the fast screening of bulk mechanical properties. After promising compositions and microstructures are pinned down, bulk samples can be prepared to measure the accurate properties and verify the combinatorial high-throughput characterization results. By developing combinatorial high-throughput mechanical characterization methods and tools, in combination with high-throughput synthesis, the structural materials research would be promoted by

  7. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousands of germination tests, several times a day by a single person.
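    Module III of such a pipeline (curve fitting of cumulative germination data) can be sketched by fitting a sigmoidal model to germination counts over time and reading out maximum germination and t50. The Hill-type model, starting values and data below are illustrative assumptions (SciPy is assumed available), not the GERMINATOR implementation itself.

        # Minimal sketch: fit cumulative germination (%) vs. time and extract Gmax and t50.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(t, gmax, t50, b):
            """Cumulative germination (%) at time t (h); b controls steepness/uniformity."""
            return gmax * t**b / (t50**b + t**b)

        t = np.array([12, 24, 36, 48, 60, 72, 96], dtype=float)   # hours after sowing
        g = np.array([0, 5, 35, 70, 88, 93, 95], dtype=float)     # % germinated (made up)
        (gmax, t50, b), _ = curve_fit(hill, t, g, p0=[95, 40, 5], bounds=(0, [100, 200, 50]))
        print(f"Gmax = {gmax:.1f}%, t50 = {t50:.1f} h")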

  8. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification...

  9. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)]

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; in order to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not have to have optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules; thus, no modifications of either the substrate or the product molecules are necessary. This technique has the potential to be used in screening of local distribution variations of specific bio-molecules in a tissue or in screening of multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives information about the reaction rate. The same microarray can be used many times; thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
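    The CCD-based kinetics described at the end of this record reduce to fitting the early, approximately linear part of each spot's intensity trace; the slope is the initial reaction rate. The sketch below shows that reduction on synthetic data and is not taken from the dissertation.

        # Minimal sketch: initial-rate estimation per enzyme spot from a CCD intensity trace.
        import numpy as np

        def initial_rates(time_s, traces, n_early=5):
            """traces: one row of background-corrected intensities per spot."""
            rates = []
            for trace in np.atleast_2d(traces):
                slope, _intercept = np.polyfit(time_s[:n_early], trace[:n_early], 1)
                rates.append(slope)            # intensity units per second
            return np.array(rates)

        t = np.arange(0, 100, 10, dtype=float)                          # seconds
        spot = 2.0 * t + np.random.default_rng(1).normal(0, 3, t.size)  # synthetic trace
        print(initial_rates(t, spot)[0])                                # close to 2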

  10. Fluorescent Approaches to High Throughput Crystallography

    Science.gov (United States)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    We have shown that by covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high-throughput crystallizations. The covalently attached probe concentrates in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white-light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, as only the protein or protein structures show up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. We are now testing the use of high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider

  11. Connecting Earth observation to high-throughput biodiversity data

    DEFF Research Database (Denmark)

    Bush, Alex; Sollmann, Rahel; Wilting, Andreas

    2017-01-01

    Understandably, given the fast pace of biodiversity loss, there is much interest in using Earth observation technology to track biodiversity, ecosystem functions and ecosystem services. However, because most biodiversity is invisible to Earth observation, indicators based on Earth observation could...... be misleading and reduce the effectiveness of nature conservation and even unintentionally decrease conservation effort. We describe an approach that combines automated recording devices, high-throughput DNA sequencing and modern ecological modelling to extract much more of the information available in Earth...

  12. Mapper: high throughput maskless lithography

    Science.gov (United States)

    Kuiper, V.; Kampherbeek, B. J.; Wieland, M. J.; de Boer, G.; ten Berge, G. F.; Boers, J.; Jager, R.; van de Peut, T.; Peijster, J. J. M.; Slot, E.; Steenbrink, S. W. H. K.; Teepen, T. F.; van Veen, A. H. V.

    2009-01-01

    Maskless electron beam lithography, or electron beam direct write, has been around for a long time in the semiconductor industry and was pioneered from the mid-1960s onwards. This technique has been used for mask writing applications as well as device engineering and, in some cases, chip manufacturing. However, because of its relatively low throughput compared to optical lithography, electron beam lithography has never been the mainstream lithography technology. To extend optical lithography, double patterning (as a bridging technology) and EUV lithography are currently being explored. Irrespective of the technical viability of both approaches, one thing seems clear: they will be expensive [1]. MAPPER Lithography is developing a maskless lithography technology based on massively parallel electron-beam writing with high-speed optical data transport for switching the electron beams. In this way optical columns can be made with a throughput of 10-20 wafers per hour, and by clustering several of these columns together high throughputs can be realized in a small footprint. This enables a highly cost-competitive alternative to double patterning and EUV alternatives. In 2007 MAPPER reached its Proof of Lithography milestone by exposing, in its Demonstrator, 45 nm half-pitch structures with 110 electron beams in parallel, with all beams individually switched on and off [2]. In 2008 MAPPER took the next step in its development by building several tools. A new platform has been designed and built which contains a 300 mm wafer stage, a wafer handler and an electron beam column with 110 parallel electron beams. This manuscript describes the first patterning results with this 300 mm platform.

  13. High-throughput Crystallography for Structural Genomics

    Science.gov (United States)

    Joachimiak, Andrzej

    2009-01-01

    Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress, and over 53,000 protein structures have now been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal, and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation has improved and accelerated many processes. These advancements provide a robust foundation for structural molecular biology and assure a strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact. PMID:19765976

  14. On the Cultivation of Automation Majors' Research Innovation Ability Based on Scientific Research Projects

    Science.gov (United States)

    Wang, Lipeng; Li, Mingqiu

    2012-01-01

    Currently, cultivating high-quality engineering technicians with the ability to innovate in scientific research, an important academic ability for them, has become a fundamental goal of engineering education. This paper mainly explores the development of comprehensive and design-oriented experiments in automation based on scientific…

  15. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known...... as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent...... advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  16. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review...... we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  17. Orchestrating high-throughput genomic analysis with Bioconductor

    Science.gov (United States)

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  18. Seasonal cultivated and fallow cropland mapping using MODIS-based automated cropland classification algorithm

    Science.gov (United States)

    Wu, Zhuoting; Thenkabail, Prasad S.; Mueller, Rick; Zakzeski, Audra; Melton, Forrest; Johnson, Lee; Rosevelt, Carolyn; Dwyer, John; Jones, Jeanine; Verdin, James P.

    2014-01-01

    Increasing drought occurrences and growing populations demand accurate, routine, and consistent cultivated and fallow cropland products to enable water and food security analysis. The overarching goal of this research was to develop and test an automated cropland classification algorithm (ACCA) that provides accurate, consistent, and repeatable information on seasonal cultivated as well as seasonal fallow cropland extents and areas based on Moderate Resolution Imaging Spectroradiometer (MODIS) remote sensing data. The seasonal ACCA development process involves writing a series of iterative decision-tree codes to separate cultivated and fallow croplands from noncroplands, aiming to accurately mirror reliable reference data sources. A pixel-by-pixel accuracy assessment against the U.S. Department of Agriculture (USDA) cropland data showed, on average, a producer's accuracy of 93% and a user's accuracy of 85% across all months. Further, ACCA-derived cropland maps agreed well with the USDA Farm Service Agency crop acreage-reported data for both cultivated and fallow croplands, with R-square values over 0.7, and with field surveys with an accuracy of ≥95% for cultivated croplands and ≥76% for fallow croplands. Our results demonstrate the ability of ACCA to generate cropland products, such as cultivated and fallow cropland extents and areas, accurately, automatically, and repeatedly throughout the growing season.
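    The "iterative decision-tree codes" mentioned above amount to rule-based classification of per-pixel time-series statistics. The sketch below applies hypothetical NDVI thresholds to label cultivated, fallow and noncropland pixels; the published ACCA rules and thresholds differ and were tuned against the reference data described in the record.

        # Illustrative rule-based (decision-tree style) classifier on per-pixel NDVI statistics.
        import numpy as np

        def classify_pixels(ndvi_max, ndvi_mean, cropland_mask):
            """Return 0 = noncropland, 1 = cultivated cropland, 2 = fallow cropland."""
            out = np.zeros_like(ndvi_max, dtype=np.uint8)
            cultivated = cropland_mask & (ndvi_max > 0.6)              # strong green-up in season
            fallow = cropland_mask & ~cultivated & (ndvi_mean < 0.35)  # cropland left unplanted
            out[cultivated] = 1
            out[fallow] = 2
            return out

        ndvi_max = np.array([0.80, 0.30, 0.70, 0.25])
        ndvi_mean = np.array([0.50, 0.20, 0.45, 0.30])
        mask = np.array([True, True, False, True])                 # known cropland extent
        print(classify_pixels(ndvi_max, ndvi_mean, mask))          # [1 2 0 2]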

  19. High-throughput analysis of subtelomeric chromosome rearrangements by use of array-based comparative genomic hybridization.

    NARCIS (Netherlands)

    Veltman, J.; Schoenmakers, E.F.P.M.; Eussen, B.H.; Janssen, I.M.; Merkx, G.F.M.; Cleef, B. van; Ravenswaaij-Arts, C.M.A. van; Brunner, H.G.; Smeets, D.F.C.M.; Geurts van Kessel, A.H.M.

    2002-01-01

    Telomeric chromosome rearrangements may cause mental retardation, congenital anomalies, and miscarriages. Automated detection of subtle deletions or duplications involving telomeres is essential for high-throughput diagnosis, but impossible when conventional cytogenetic methods are used. Array-based

  20. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...

  1. High-throughput computing in the sciences.

    Science.gov (United States)

    Morgan, Mark; Grimshaw, Andrew

    2009-01-01

    While it is true that the modern computer is many orders of magnitude faster than that of yesteryear, this tremendous growth in CPU clock rates is now over. Unfortunately, however, the growth in demand for computational power has not abated; whereas researchers a decade ago could simply wait for computers to get faster, today the only solution to the growing need for more powerful computational resources lies in the exploitation of parallelism. Software parallelization falls generally into two broad categories--"true parallel" and high-throughput computing. This chapter focuses on the latter of these two types of parallelism. With high-throughput computing, users can run many copies of their software at the same time across many different computers. This technique for achieving parallelism is powerful in its ability to provide high degrees of parallelism, yet simple in its conceptual implementation. This chapter covers various patterns of high-throughput computing usage and the skills and techniques necessary to take full advantage of them. By utilizing numerous examples and sample codes and scripts, we hope to provide the reader not only with a deeper understanding of the principles behind high-throughput computing, but also with a set of tools and references that will prove invaluable as she explores software parallelism with her own software applications and research.
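    The high-throughput pattern the chapter describes, many independent copies of the same program with one run per input, looks like the sketch below. A single-machine process pool stands in here for the multi-machine batch or grid schedulers the chapter actually covers; the workload is a placeholder.

        # Minimal sketch of high-throughput (embarrassingly parallel) execution in Python.
        from concurrent.futures import ProcessPoolExecutor

        def run_one(param):
            # placeholder for one independent unit of work (e.g., one input file or parameter set)
            return param, sum(i * i for i in range(param))

        if __name__ == "__main__":
            params = [10_000, 20_000, 30_000, 40_000]
            with ProcessPoolExecutor() as pool:          # one worker process per available core
                for param, result in pool.map(run_one, params):
                    print(param, result)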

  2. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants is becoming more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands to tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use for samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy of the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  3. High Throughput In Situ XAFS Screening of Catalysts

    International Nuclear Information System (INIS)

    Tsapatsaris, Nikolaos; Beesley, Angela M.; Weiher, Norbert; Tatton, Helen; Schroeder, Sven L. M.; Dent, Andy J.; Mosselmans, Frederick J. W.; Tromp, Moniek; Russu, Sergio; Evans, John; Harvey, Ian; Hayama, Shu

    2007-01-01

    We outline and demonstrate the feasibility of high-throughput (HT) in situ XAFS for synchrotron radiation studies. An XAS data acquisition and control system for the analysis of dynamic materials libraries under control of temperature and gaseous environments has been developed. The system is compatible with the 96-well industry standard and coupled to multi-stream quadrupole mass spectrometry (QMS) analysis of reactor effluents. An automated analytical workflow generates data quickly compared to traditional individual spectrum acquisition and analyses them in quasi-real time using an HT data analysis tool based on IFEFFIT. The system was used for the automated characterization of a library of 91 catalyst precursors containing ternary combinations of Cu, Pt, and Au on γ-Al2O3, and for the in situ characterization of Au catalysts supported on Al2O3 and TiO2.

  4. A high-throughput liquid chromatography/tandem mass spectrometry method for simultaneous quantification of a hydrophobic drug candidate and its hydrophilic metabolite in human urine with a fully automated liquid/liquid extraction.

    Science.gov (United States)

    Wang, Perry G; Zhang, Jun; Gage, Eric M; Schmidt, Jeffrey M; Rodila, Ramona C; Ji, Qin C; El-Shourbagy, Tawakol A

    2006-01-01

    ABT-869 (A-741439) is an investigational new drug candidate under development by Abbott Laboratories. ABT-869 is hydrophobic, but is oxidized in the body to A-849529, a hydrophilic metabolite that includes both carboxyl and amino groups. The poor solubility of ABT-869 in aqueous matrix makes simultaneous analysis of both ABT-869 and its metabolite within the same extraction and injection extremely difficult in human urine. In this paper, a high-performance liquid chromatography/tandem mass spectrometry (HPLC/MS/MS) method has been developed and validated for high-speed simultaneous quantitation of the hydrophobic ABT-869 and its hydrophilic metabolite, A-849529, in human urine. The deuterated internal standards, A-741439D(4) and A-849529D(4), were used in this method. The disparate properties of the two analytes were accommodated by treating samples with acetonitrile, adjusting pH with an extraction buffer, and optimizing the extraction solvent and mobile phase composition. For a 100 microL urine sample volume, the lower limit of quantitation was approximately 1 ng/mL for both ABT-869 and A-849529. The calibration curve was linear from 1.09 to 595.13 ng/mL for ABT-869, and 1.10 to 600.48 ng/mL for A-849529 (r2 > 0.9975 for both analytes). Because the method employs simultaneous quantification, high throughput is achieved despite the presence of both a hydrophobic analyte and its hydrophilic metabolite in human urine. Copyright 2006 John Wiley & Sons, Ltd.
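    The quantitation step behind figures such as the linear range and r2 above is a linear calibration of analyte/internal-standard peak-area ratio against concentration, followed by back-calculation of unknowns. The numbers in the sketch are invented and are not the validated assay parameters.

        # Generic sketch of LC-MS/MS quantitation via a linear calibration curve.
        import numpy as np

        cal_conc = np.array([1.09, 5.0, 25.0, 100.0, 300.0, 595.13])   # ng/mL (standards)
        cal_ratio = np.array([0.011, 0.052, 0.26, 1.02, 3.05, 6.00])   # analyte/IS area ratio (made up)

        slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)          # weighted fits are also common
        r2 = np.corrcoef(cal_conc, cal_ratio)[0, 1] ** 2
        print(f"r2 = {r2:.4f}")

        unknown_ratio = 0.48
        print("back-calculated concentration (ng/mL):", (unknown_ratio - intercept) / slope)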

  5. A high throughput spectral image microscopy system

    Science.gov (United States)

    Gesley, M.; Puri, R.

    2018-01-01

    A high-throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To overcome the rate limits of flow cytometry and the use of fluorophore tags, the system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require the use of exogenous stain to render cells with submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.

  6. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing on the order of 10^3-10^4 discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  7. High throughput inclusion body sizing: Nano particle tracking analysis.

    Science.gov (United States)

    Reichelt, Wieland N; Kaineder, Andreas; Brillmann, Markus; Neutsch, Lukas; Taschauer, Alexander; Lohninger, Hans; Herwig, Christoph

    2017-06-01

    The expression of pharmaceutically relevant proteins in Escherichia coli frequently triggers inclusion body (IB) formation caused by protein aggregation. In the scientific literature, substantial effort has been devoted to the quantification of IB size. However, the particle-based methods used up to this point to analyze the physical properties of representative numbers of IBs lack sensitivity and/or orthogonal verification. Using high-pressure freezing and automated freeze substitution for transmission electron microscopy (TEM), the cytosolic inclusion body structure was preserved within the cells. TEM imaging in combination with manual grey-scale image segmentation allowed quantification of the relative area covered by the inclusion body within the cytosol. As a high-throughput method, nanoparticle tracking analysis (NTA) enables one to derive the diameter of inclusion bodies in cell homogenate based on a measurement of their Brownian motion. The NTA analysis of fixated (glutaraldehyde) and non-fixated IBs suggests that high-pressure homogenization annihilates the native physiological shape of IBs. Nevertheless, the ratio of particle counts of non-fixated and fixated samples could potentially serve as a factor for particle stickiness. In this contribution, we establish image segmentation of TEM pictures as an orthogonal method to size biological particles in the cytosol of cells. More importantly, NTA has been established as a particle-based, fast and high-throughput method (1000-3000 particles), thus constituting a much more accurate and representative analysis than currently available methods. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
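    NTA converts the measured Brownian motion into a hydrodynamic diameter through the Stokes-Einstein relation. The sketch below assumes a per-particle mean squared displacement from 2-D tracking is already available and uses illustrative values for frame rate, temperature and viscosity.

        # Minimal sketch: hydrodynamic diameter from 2-D Brownian tracks via Stokes-Einstein.
        import numpy as np

        KB = 1.380649e-23   # Boltzmann constant, J/K

        def hydrodynamic_diameter_nm(msd_m2_per_step, frame_interval_s,
                                     temp_K=296.15, viscosity_Pa_s=0.95e-3):
            D = msd_m2_per_step / (4.0 * frame_interval_s)           # 2-D tracking: MSD = 4*D*t
            d_m = KB * temp_K / (3.0 * np.pi * viscosity_Pa_s * D)   # Stokes-Einstein relation
            return d_m * 1e9

        # A ~400 nm particle in water at 23 degC diffuses at roughly 1.1e-12 m^2/s:
        msd_per_frame = 4 * 1.14e-12 * (1 / 30.0)                    # 30 frames per second
        print(round(hydrodynamic_diameter_nm(msd_per_frame, 1 / 30.0)))   # ~400 nm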

  8. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  9. Machine Learning for High-Throughput Stress Phenotyping in Plants.

    Science.gov (United States)

    Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh Kumar; Sarkar, Soumik

    2016-02-01

    Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen

    2017-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis...... to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links sample information from the library to experimental results from the assay plates. All results can be exported to the R statistical environment or piped into HiTSeekR (http://hitseekr.compbio.sdu.dk) for comprehensive follow-up analyses. In summary, SAVANAH supports the HTS community in managing

  11. Surrogate-assisted feature extraction for high-throughput phenotyping.

    Science.gov (United States)

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision and natural language processing counts, acting as noisy surrogates to the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature extraction for phenotyping procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were remarkably higher than those from the other two, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the needed number of gold-standard labels. SAFE also potentially identifies important features missed by automated feature extraction for phenotyping or experts. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please
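    The surrogate idea can be caricatured in a few lines: threshold the main ICD and NLP counts into silver-standard labels, then keep the candidate features that a sparse (L1-penalized) model finds predictive of those labels. Everything below (thresholds, simulated data, the use of scikit-learn) is an assumption for illustration and is much cruder than the published SAFE procedure.

        # Simplified, hypothetical sketch of surrogate-assisted feature selection.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def surrogate_feature_selection(X, icd_counts, nlp_counts, feature_names, C=0.1):
            silver = ((icd_counts >= 3) & (nlp_counts >= 3)).astype(int)  # crude silver labels
            model = LogisticRegression(penalty="l1", solver="liblinear", C=C)
            model.fit(np.log1p(X), silver)
            return [feature_names[i] for i in np.flatnonzero(model.coef_[0])]

        rng = np.random.default_rng(0)
        n, p = 500, 20
        X = rng.poisson(1.0, size=(n, p))                 # candidate concept counts
        icd = X[:, 0] + rng.poisson(0.5, n)               # pretend feature 0 tracks the ICD code
        nlp = X[:, 0] + rng.poisson(0.5, n)
        print(surrogate_feature_selection(X, icd, nlp, [f"feat_{i}" for i in range(p)]))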

  12. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  13. A high-throughput, fully automated liquid/liquid extraction liquid chromatography/mass spectrometry method for the quantitation of a new investigational drug ABT-869 and its metabolite A-849529 in human plasma samples.

    Science.gov (United States)

    Rodila, Ramona C; Kim, Joseph C; Ji, Qin C; El-Shourbagy, Tawakol A

    2006-01-01

    ABT-869 is a novel ATP-competitive inhibitor of all the vascular endothelial growth factor (VEGF) and platelet-derived growth factor (PDGF) receptor tyrosine kinases (RTKs). It is one of the oncology drugs in development at Abbott Laboratories and has great potential for enhanced anti-tumor efficacy as well as activity in a broad range of human cancers. We report here an accurate, precise and rugged liquid chromatography/mass spectrometry (LC/MS/MS) assay for the quantitative measurement of ABT-869 and its acid metabolite A-849529. A fully automated 96-well liquid/liquid extraction method was achieved utilizing a Hamilton liquid handler. The only manual intervention required prior to LC/MS/MS injection is to transfer the 96-well plate to a drying rack to dry the extracts and then transfer the plate back to the Hamilton for robotic reconstitution. The linear dynamic ranges were from 1.1 to 598.8 ng/mL for ABT-869 and from 1.1 to 605.8 ng/mL for A-849529. The coefficient of determination (r2) for all analytes was greater than 0.9995. For the drug ABT-869, the intra-assay coefficient of variation (CV) was between 0.4% and 3.7% and the inter-assay CV was between 0.9% and 2.8%. The inter-assay mean accuracy, expressed as percent of theoretical, was between 96.8% and 102.2%. For the metabolite A-849529, the intra-assay CV was between 0.5% and 5.1% and the inter-assay CV was between 0.8% and 4.9%. The inter-assay mean accuracy, expressed as percent of theoretical, was between 96.9% and 100.6%. Copyright 2006 John Wiley & Sons, Ltd.

  14. Automated solid-phase extraction-liquid chromatography-tandem mass spectrometry analysis of 6-acetylmorphine in human urine specimens: application for a high-throughput urine analysis laboratory.

    Science.gov (United States)

    Robandt, P V; Bui, H M; Scancella, J M; Klette, K L

    2010-10-01

    An automated solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS) method using the Spark Holland Symbiosis Pharma SPE-LC coupled to a Waters Quattro Micro MS-MS was developed for the analysis of 6-acetylmorphine (6-AM) in human urine specimens. The method was linear (R² = 0.9983) to 100 ng/mL, with no carryover at 200 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision calculated as percent coefficient of variation (%CV) and evaluated by analyzing five specimens at 10 ng/mL over nine batches (n = 45) was 3.6%. Intrarun precision evaluated from 0 to 100 ng/mL ranged from 1.0 to 4.4%CV. Other opioids (codeine, morphine, oxycodone, oxymorphone, hydromorphone, hydrocodone, and norcodeine) did not interfere in the detection, quantification, or chromatography of 6-AM or the deuterated internal standard. The quantified values for 41 authentic human urine specimens previously found to contain 6-AM by a validated gas chromatography (GC)-MS method were compared to those obtained by the SPE-LC-MS-MS method. The SPE-LC-MS-MS procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. The time required for extraction and analysis was reduced by approximately 50% when compared to a validated 6-AM procedure using manual SPE and GC-MS analysis.
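    The precision figures quoted above are percent coefficients of variation across replicate determinations; the arithmetic is shown below with made-up replicate values.

        # Minimal sketch: percent coefficient of variation (%CV) for replicate measurements.
        import numpy as np

        def percent_cv(values):
            v = np.asarray(values, dtype=float)
            return 100.0 * v.std(ddof=1) / v.mean()       # sample standard deviation / mean

        replicates_ng_ml = [9.8, 10.3, 10.1, 9.6, 10.4]   # hypothetical 10 ng/mL controls
        print(f"{percent_cv(replicates_ng_ml):.1f} %CV")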

  15. High throughput assays for analyzing transcription factors.

    Science.gov (United States)

    Li, Xianqiang; Jiang, Xin; Yaoi, Takuro

    2006-06-01

    Transcription factors are a group of proteins that modulate the expression of genes involved in many biological processes, such as cell growth and differentiation. Alterations in transcription factor function are associated with many human diseases, and therefore these proteins are attractive potential drug targets. A key issue in the development of such therapeutics is the generation of effective tools that can be used for high throughput discovery of the critical transcription factors involved in human diseases, and the measurement of their activities in a variety of disease or compound-treated samples. Here, a number of innovative arrays and 96-well format assays for profiling and measuring the activities of transcription factors will be discussed.

  16. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  17. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing, focusing on oft encountered problems in data processing, such as quality assurance, mapping, normalization, visualization, and interpretation. Presented in the second part are scientific endeavors representing solutions to problems of two sub-genres of next generation sequencing. For the first flavor, RNA-sequencing...

  18. High-throughput methods for electron crystallography.

    Science.gov (United States)

    Stokes, David L; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas

    2013-01-01

    Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing a native lipid environment for these proteins. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, electron microscopy can be used to collect both images and diffraction patterns, and the corresponding data can be combined to produce a three-dimensional reconstruction, which under favorable conditions can extend to atomic resolution. Like X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on titration of cyclodextrin as a chelating agent for detergent; a specialized pipetting robot has been designed not only to add cyclodextrin in a systematic way, but to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described.

  19. High-throughput protein crystallography and drug discovery.

    Science.gov (United States)

    Tickle, Ian; Sharff, Andrew; Vinkovic, Mladen; Yon, Jeff; Jhoti, Harren

    2004-10-20

    Single crystal X-ray diffraction is the technique of choice for studying the interactions of small organic molecules with proteins by determining their three-dimensional structures; however, the requirement for highly purified protein and the lack of process automation have traditionally limited its use in this field. Despite these shortcomings, the use of crystal structures of therapeutically relevant drug targets in pharmaceutical research has increased significantly over the last decade. The application of structure-based drug design has resulted in several marketed drugs and is now an established discipline in most pharmaceutical companies. Furthermore, the recently published full genome sequences of Homo sapiens and a number of micro-organisms have provided a plethora of new potential drug targets that could be utilised in structure-based drug design programs. In order to take maximum advantage of this explosion of information, techniques have been developed to automate and speed up the various procedures required to obtain protein crystals of suitable quality, to collect and process the raw X-ray diffraction data into usable structural information, and to use three-dimensional protein structure as a basis for drug discovery and lead optimisation. This tutorial review covers the various technologies involved in the process pipeline for high-throughput protein crystallography as it is currently being applied to drug discovery. It is aimed at synthetic and computational chemists, as well as structural biologists, in both academia and industry, who are interested in structure-based drug design.

  20. NCBI GEO: archive for high-throughput functional genomic data

    OpenAIRE

    Barrett, Tanya; Troup, Dennis B.; Wilhite, Stephen E.; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F.; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A.; Phillippy, Katherine H.; Sherman, Patti M.; Muertter, Rolf N.; Edgar, Ron

    2008-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, m...

  1. Automated intelligent rotor tine cultivation and punch planting to improve the selectivity of mechanical intra-row weed control

    DEFF Research Database (Denmark)

    Rasmussen, Jesper; Griepentrog, Hans W.; Nielsen, Jon

    2012-01-01

    There is much research on technical aspects related to sensor and mapping techniques, which enable so-called intelligent cultivators to target the intra-row spaces within crop rows. This study investigates (i) an expected advantage of an intelligent rotor tine cultivator (the cycloid hoe) in terms of crop-weed selectivity and (ii) an expected synergistic effect between punch planting and post-emergence weed harrowing, in terms of improved crop-weed selectivity. Selectivity is defined as the relationship between weed density decline and associated crop density decline 1 week after cultivation. Punch ... in sugar beet and carrot crops showed no synergistic effects between plant establishment procedures and selectivity of post-emergence weed harrowing. Even if punch planting and automated intelligent rotor tine cultivation were not combined, the results indicated that there was no reason to believe...

  2. High-Throughput Screening in Primary Neurons

    Science.gov (United States)

    Sharma, Punita; Ando, D. Michael; Daub, Aaron; Kaye, Julia A.; Finkbeiner, Steven

    2013-01-01

    Despite years of incremental progress in our understanding of diseases such as Alzheimer's disease (AD), Parkinson's disease (PD), Huntington's disease (HD), and amyotrophic lateral sclerosis (ALS), there are still no disease-modifying therapeutics. The discrepancy between the number of lead compounds and approved drugs may partially be a result of the methods used to generate the leads and highlights the need for new technology to obtain more detailed and physiologically relevant information on cellular processes in normal and diseased states. Our high-throughput screening (HTS) system in a primary neuron model can help address this unmet need. HTS allows scientists to assay thousands of conditions in a short period of time, which can reveal completely new aspects of biology and identify potential therapeutics in the span of a few months, where conventional methods could take years or fail altogether. HTS in primary neurons combines the advantages of HTS with the biological relevance of intact, fully differentiated neurons, which can capture the critical cellular events or homeostatic states that make neurons uniquely susceptible to disease-associated proteins. We detail methodologies of our primary neuron HTS assay workflow from sample preparation to data reporting. We also discuss our adaptation of our HTS system into high-content screening (HCS), a type of HTS that uses multichannel fluorescence images to capture biological events in situ, and is uniquely suited to study dynamical processes in living cells. PMID:22341232

  3. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.
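
    One of the global structural parameters routinely extracted from such SAXS data is the radius of gyration. A minimal sketch of a Guinier fit is given below; this is not the pipeline's actual code, and the iterative q·Rg cutoff used here is a commonly assumed convention:

```python
import numpy as np

def guinier_radius(q, intensity, qrg_max=1.3):
    """Estimate the radius of gyration Rg from the low-q Guinier region,
    ln I(q) = ln I(0) - (q*Rg)^2 / 3, iterating so that q*Rg <= qrg_max.
    Returns (Rg, I(0)); Rg comes out as NaN for aggregated samples
    (positive Guinier slope)."""
    q, intensity = np.asarray(q, float), np.asarray(intensity, float)
    mask = q < q.max()          # start from (almost) the full range
    rg, i0 = None, None
    for _ in range(10):
        slope, icept = np.polyfit(q[mask] ** 2, np.log(intensity[mask]), 1)
        rg, i0 = np.sqrt(-3.0 * slope), np.exp(icept)
        new_mask = q * rg <= qrg_max
        if new_mask.sum() < 5 or np.array_equal(new_mask, mask):
            break
        mask = new_mask
    return rg, i0
```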

  4. High throughput light absorber discovery, Part 2: Establishing structure–band gap energy relationships

    International Nuclear Information System (INIS)

    Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan; Van Campen, Douglas G.; Mehta, Apurva; Gregoire, John M.

    2016-01-01

    Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
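
    A minimal sketch of the Tauc-analysis step for a direct-allowed transition, in which (αhν)² is fitted linearly versus photon energy and extrapolated to zero; the fitting window is a user-supplied assumption and absorbance is used as a stand-in for the absorption coefficient:

```python
import numpy as np

def tauc_band_gap(energy_ev, absorbance, window=(2.8, 3.2)):
    """Estimate a direct band gap from a Tauc plot: (alpha*h*nu)^2 vs h*nu.
    A straight line is fitted over a chosen linear window and extrapolated
    to zero to give Eg (in eV)."""
    e = np.asarray(energy_ev, float)
    tauc = (np.asarray(absorbance, float) * e) ** 2
    sel = (e >= window[0]) & (e <= window[1])
    slope, intercept = np.polyfit(e[sel], tauc[sel], 1)
    return -intercept / slope  # x-intercept of the fitted line

# Hypothetical spectrum: absorption onset near 2.7 eV (illustration only)
e = np.linspace(2.0, 3.5, 150)
a = np.clip(e - 2.7, 0, None)
print(tauc_band_gap(e, a))
```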

  5. Microfluidic platforms for high-throughput mammalian cell printing, transfection, and dosage-dependent studies

    OpenAIRE

    Woodruff, Kristina Pan

    2017-01-01

    With the advent of high-throughput and genome-wide screening initiatives, there is a need for improved methods for cell-based assays. Current approaches require expensive equipment, rely on large-scale culturing formats not suited for small or rare sample types, or involve tedious manual handling. Microfluidic systems could provide a solution to these limitations, since these assays are accessible, miniaturized, and automated. When coupled with high-content analysis, microfluidics has the pot...

  6. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    Science.gov (United States)

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

    To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measurement of multiple samples and conditions; thus the current automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required compared with a single-sample automated cell counter. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
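
    A minimal sketch of how AO/PI counts are typically converted into concentration and viability; the imaged volume, dilution factor and counts below are hypothetical values for illustration, not Celigo-specific parameters:

```python
def ao_pi_counts_to_metrics(ao_count, pi_count, imaged_volume_ul, dilution_factor=1.0):
    """Convert AO (all nucleated cells) and PI (dead cells) counts from an
    imaged well area into concentration (cells/mL) and viability (%)."""
    total_per_ml = ao_count / imaged_volume_ul * 1000.0 * dilution_factor
    dead_per_ml = pi_count / imaged_volume_ul * 1000.0 * dilution_factor
    viability = 100.0 * (1.0 - pi_count / ao_count) if ao_count else 0.0
    return total_per_ml, dead_per_ml, viability

# Example: 1,240 AO+ and 95 PI+ cells counted in 0.5 uL of imaged volume
print(ao_pi_counts_to_metrics(1240, 95, imaged_volume_ul=0.5))
```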

  7. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen) genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. This approach first selects sequences containing the SNPs present in all alleles of the locus of interest and then calculates the number of base changes necessary to convert a candidate probe sequence to the closest subsequence within the set of sequences that are likely to be present in the sample (including the remainder of the human genome), in order to identify those candidate probes that are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, it is possible that preliminary steps such as PCR amplification are no longer necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest resolution typing can be achieved for the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
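
    A minimal sketch of the core specificity criterion described above: scoring each candidate probe by the smallest number of base changes separating it from any same-length off-target subsequence. This brute-force version is illustrative only; the mismatch threshold and sequence sets are assumptions, and a genome-scale implementation would rely on indexed search rather than exhaustive scanning:

```python
def min_mismatches_to_background(probe, background_seqs):
    """Smallest Hamming distance between the probe and any same-length
    window in the background sequences (e.g., other alleles plus genome)."""
    k = len(probe)
    best = k
    for seq in background_seqs:
        for i in range(len(seq) - k + 1):
            d = sum(a != b for a, b in zip(probe, seq[i:i + k]))
            best = min(best, d)
            if best == 0:
                return 0
    return best

def ultraspecific_probes(candidates, background_seqs, min_distance=3):
    """Keep candidate probes whose closest off-target match still requires
    at least `min_distance` base changes."""
    return [p for p in candidates
            if min_mismatches_to_background(p, background_seqs) >= min_distance]

# Toy example (illustration only)
print(ultraspecific_probes(["ACGTACGT", "TTTTTTTT"],
                           ["CCCCACGTACGACCCC", "GGGGTTTTTTTTGGGG"],
                           min_distance=2))
```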

  8. MAPPER: high-throughput maskless lithography

    Science.gov (United States)

    Wieland, M. J.; de Boer, G.; ten Berge, G. F.; Jager, R.; van de Peut, T.; Peijster, J. J. M.; Slot, E.; Steenbrink, S. W. H. K.; Teepen, T. F.; van Veen, A. H. V.; Kampherbeek, B. J.

    2009-03-01

    Maskless electron beam lithography, or electron beam direct write, has been around for a long time in the semiconductor industry and was pioneered from the mid-1960s onwards. This technique has been used for mask writing applications as well as device engineering and in some cases chip manufacturing. However, because of its relatively low throughput compared to optical lithography, electron beam lithography has never been the mainstream lithography technology. To extend optical lithography, double patterning (as a bridging technology) and EUV lithography are currently being explored. Irrespective of the technical viability of both approaches, one thing seems clear: they will be expensive [1]. MAPPER Lithography is developing a maskless lithography technology based on massively-parallel electron-beam writing with high speed optical data transport for switching the electron beams. In this way optical columns can be made with a throughput of 10-20 wafers per hour. By clustering several of these columns together, high throughputs can be realized in a small footprint. This enables a highly cost-competitive alternative to double patterning and EUV. In 2007 MAPPER reached its Proof of Lithography milestone by exposing, in its Demonstrator, 45 nm half pitch structures with 110 electron beams in parallel, where all the beams were individually switched on and off [2]. In 2008 MAPPER took the next step in its development by building several tools. The objective of building these tools is to enable semiconductor companies to verify tool performance in their own environment. To this end, the tools will have a 300 mm wafer stage in addition to a 110-beam optics column. First exposures at 45 nm half pitch resolution have been performed and analyzed. On the same wafer it is observed that all beams print and, based on analysis of 11 beams, the CD for the different patterns is within 2.2 nm of target and the CD uniformity for the different patterns is better...

  9. Validation of a high-throughput fermentation system based on online monitoring of biomass and fluorescence in continuously shaken microtiter plates

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-06-01

    ...-based cultivation systems. In particular, applications with a strong demand for high throughput, such as clone and media screening and systems biology, can benefit from its simple handling, the high quantitative information content and its capacity for automation.

  10. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e., reverse toxicokinetics or RTK) is used to convert high throughput in vitro toxicity screening (HTS) da...

  11. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  12. High Throughput Multispectral Image Processing with Applications in Food Science.

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility and low-cost information extraction, and to enable faster quality assessment without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples.
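
    A minimal sketch of the unsupervised segmentation idea, fitting a Gaussian Mixture Model to the per-pixel spectra of a multispectral cube; scikit-learn is assumed here purely for illustration, and the cited method's band-selection scheme is not reproduced:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_segment(cube, n_components=2, bands=None):
    """Segment a multispectral image cube (H x W x B) by fitting a Gaussian
    Mixture Model to the per-pixel spectra and returning a label image.
    `bands` optionally restricts the fit to a subset of spectral bands."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    if bands is not None:
        X = X[:, bands]
    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=0).fit(X)
    return gmm.predict(X).reshape(h, w)

# Example with a random cube; real use would load calibrated reflectance data
labels = gmm_segment(np.random.rand(64, 64, 18), n_components=2)
print(labels.shape)
```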

  13. High Throughput Multispectral Image Processing with Applications in Food Science.

    Directory of Open Access Journals (Sweden)

    Panagiotis Tsakanikas

    Full Text Available Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility and low-cost information extraction, and to enable faster quality assessment without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples.

  14. High Throughput T Epitope Mapping and Vaccine Development

    Directory of Open Access Journals (Sweden)

    Giuseppina Li Pira

    2010-01-01

    Full Text Available Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost.

  15. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Barth

    2008-02-06

    AVA Solar has developed a very low cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct and test a production prototype system, fabricate PV modules and test the module performance. The module manufacturing costs in the original proposal were estimated at $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated and installed. Using this automated, high throughput system, high efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained 2 rounds of private equity funding, expanded to 50 people and initiated the development of a large scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry leading cost, which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40-$60 Billion by 2012. Currently, a crystalline silicon raw material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  16. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    Science.gov (United States)

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. High-throughput single-cell manipulation in brain tissue.

    Directory of Open Access Journals (Sweden)

    Joseph D Steinmeyer

    Full Text Available The complexity of neurons and neuronal circuits in brain tissue requires the genetic manipulation, labeling, and tracking of single cells. However, current methods for manipulating cells in brain tissue are limited to either bulk techniques, lacking single-cell accuracy, or manual methods that provide single-cell accuracy but at significantly lower throughputs and repeatability. Here, we demonstrate high-throughput, efficient, reliable, and combinatorial delivery of multiple genetic vectors and reagents into targeted cells within the same tissue sample with single-cell accuracy. Our system automatically loads nanoliter-scale volumes of reagents into a micropipette from multiwell plates, targets and transfects single cells in brain tissues using a robust electroporation technique, and finally preps the micropipette by automated cleaning for repeating the transfection cycle. We demonstrate multi-colored labeling of adjacent cells, both in organotypic and acute slices, and transfection of plasmids encoding different protein isoforms into neurons within the same brain tissue for analysis of their effects on linear dendritic spine density. Our platform could also be used to rapidly deliver, both ex vivo and in vivo, a variety of genetic vectors, including optogenetic and cell-type specific agents, as well as fast-acting reagents such as labeling dyes, calcium sensors, and voltage sensors to manipulate and track neuronal circuit activity at single-cell resolution.

  18. Large scale library generation for high throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Erik Borgström

    Full Text Available BACKGROUND: Large efforts have recently been made to automate the sample preparation protocols for massively parallel sequencing in order to match the increasing instrument throughput. Still, the size selection through agarose gel electrophoresis separation is a labor-intensive bottleneck of these protocols. METHODOLOGY/PRINCIPAL FINDINGS: In this study a method for automatic library preparation and size selection on a liquid handling robot is presented. The method utilizes selective precipitation of certain sizes of DNA molecules on to paramagnetic beads for cleanup and selection after standard enzymatic reactions. CONCLUSIONS/SIGNIFICANCE: The method is used to generate libraries for de novo and re-sequencing on the Illumina HiSeq 2000 instrument with a throughput of 12 samples per instrument in approximately 4 hours. The resulting output data show quality scores and pass filter rates comparable to manually prepared samples. The sample size distribution can be adjusted for each application, and is suitable for all high throughput DNA processing protocols seeking to control size intervals.

  19. High-throughput DNA extraction of forensic adhesive tapes.

    Science.gov (United States)

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has since its introduction in the early 2000s become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  20. Curating and Preparing High-Throughput Screening Data for Quantitative Structure-Activity Relationship Modeling.

    Science.gov (United States)

    Kim, Marlene T; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2016-01-01

    Publicly available bioassay data often contains errors. Curating massive bioassay data, especially high-throughput screening (HTS) data, for Quantitative Structure-Activity Relationship (QSAR) modeling requires the assistance of automated data curation tools. Using automated data curation tools is beneficial to users, especially ones without prior computer skills, because many platforms have been developed and optimized based on standardized requirements. As a result, the users do not need to extensively configure the curation tool prior to the application procedure. In this chapter, a freely available automatic tool to curate and prepare HTS data for QSAR modeling purposes will be described.
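
    A minimal sketch of two basic curation steps of the kind such tools perform: removing duplicate records and discarding compounds with conflicting activity calls. The column names 'smiles' and 'active' are hypothetical, and a real workflow would also standardize chemical structures before deduplication:

```python
import pandas as pd

def curate_hts(df):
    """Basic HTS curation sketch: normalize identifiers, drop exact duplicate
    records, and discard compounds with conflicting active/inactive calls.
    Assumes hypothetical columns 'smiles' and 'active' (0/1)."""
    df = df.copy()
    df["smiles"] = df["smiles"].str.strip()
    df = df.drop_duplicates()
    # For each structure, keep it only if all replicate calls agree
    consistent = df.groupby("smiles")["active"].nunique() == 1
    keep = consistent[consistent].index
    return df[df["smiles"].isin(keep)].drop_duplicates("smiles")

# Toy example (illustration only)
data = pd.DataFrame({"smiles": ["CCO", "CCO", "c1ccccc1", "c1ccccc1"],
                     "active": [1, 1, 0, 1]})
print(curate_hts(data))
```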

  1. Protocol: A high-throughput DNA extraction system suitable for conifers

    Directory of Open Access Journals (Sweden)

    Rajora Om P

    2008-08-01

    Full Text Available Abstract Background High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. Results We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve a change to the buffer system and improving the protocol so that it almost doubles the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP) and another for high-throughput (HTP) DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce and a pine species. Conclusion A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica-spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost effective solution for processing large numbers of conifer samples.

  2. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for studies of biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, where 25 or more runs can be conducted per day. The cost and effort required to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, has attracted new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  3. Micropillar arrays as a high-throughput screening platform for therapeutics in multiple sclerosis.

    Science.gov (United States)

    Mei, Feng; Fancy, Stephen P J; Shen, Yun-An A; Niu, Jianqin; Zhao, Chao; Presley, Bryan; Miao, Edna; Lee, Seonok; Mayoral, Sonia R; Redmond, Stephanie A; Etxeberria, Ainhoa; Xiao, Lan; Franklin, Robin J M; Green, Ari; Hauser, Stephen L; Chan, Jonah R

    2014-08-01

    Functional screening for compounds that promote remyelination represents a major hurdle in the development of rational therapeutics for multiple sclerosis. Screening for remyelination is problematic, as myelination requires the presence of axons. Standard methods do not resolve cell-autonomous effects and are not suited for high-throughput formats. Here we describe a binary indicant for myelination using micropillar arrays (BIMA). Engineered with conical dimensions, micropillars permit resolution of the extent and length of membrane wrapping from a single two-dimensional image. Confocal imaging acquired from the base to the tip of the pillars allows for detection of concentric wrapping observed as 'rings' of myelin. The platform is formatted in 96-well plates, amenable to semiautomated random acquisition and automated detection and quantification. Upon screening 1,000 bioactive molecules, we identified a cluster of antimuscarinic compounds that enhance oligodendrocyte differentiation and remyelination. Our findings demonstrate a new high-throughput screening platform for potential regenerative therapeutics in multiple sclerosis.

  4. Life in the fast lane: high-throughput chemistry for lead generation and optimisation.

    Science.gov (United States)

    Hunter, D

    2001-01-01

    The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.

  5. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Tanel Pärnamaa

    2017-05-01

    Full Text Available High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy.
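
    A minimal sketch of a convolutional classifier for single-cell image crops in PyTorch; it is deliberately much shallower than the 11-layer network described above, and the input channels, crop size and number of localization classes are illustrative assumptions rather than the authors' configuration:

```python
import torch
import torch.nn as nn

class LocNet(nn.Module):
    """A small convolutional classifier for single-cell image crops
    (illustrative only; far shallower than the network in the cited work)."""
    def __init__(self, n_classes=12, in_channels=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pooling to a 128-vector
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# One training step on a dummy batch of 64x64 two-channel (GFP + marker) crops
model, loss_fn = LocNet(), nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(8, 2, 64, 64), torch.randint(0, 12, (8,))
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
```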

  6. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning.

    Science.gov (United States)

    Pärnamaa, Tanel; Parts, Leopold

    2017-05-05

    High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy. Copyright © 2017 Parnamaa and Parts.

  7. High Throughput Hall Thruster for Small Spacecraft, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek is developing a high throughput nominal 100-W Hall Effect Thruster. This device is well sized for spacecraft ranging in size from several tens of kilograms to...

  8. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek Co. Inc. proposes to develop a high throughput, nominal 100 W Hall Effect Thruster (HET). This HET will be sized for small spacecraft (< 180 kg), including...

  9. High throughput growth and characterization of thin film materials

    Science.gov (United States)

    Mao, Samuel S.

    2013-09-01

    It usually takes more than 10 years for a new material to go from initial research to its first commercial application. Therefore, accelerating the pace of discovery of new materials is critical to tackling challenges in areas ranging from clean energy to national security. As discovery of new materials has not kept pace with the product design cycles in many sectors of industry, there is a pressing need to develop and utilize high throughput screening and discovery technologies for the growth and characterization of new materials. This article presents two distinctive types of high throughput thin film material growth approaches, along with a number of high throughput characterization techniques, established in the author's group. These approaches include a second-generation "discrete" combinatorial semiconductor discovery technology that enables the creation of arrays of individually separated thin film semiconductor materials of different compositions, and a "continuous" high throughput thin film material screening technology that enables the realization of ternary alloy libraries with continuously varying elemental ratios.

  10. Materiomics - High-Throughput Screening of Biomaterial Properties

    NARCIS (Netherlands)

    de Boer, Jan; van Blitterswijk, Clemens

    2013-01-01

    This complete, yet concise, guide introduces you to the rapidly developing field of high throughput screening of biomaterials: materiomics. Bringing together the key concepts and methodologies used to determine biomaterial properties, you will understand the adaptation and application of materiomics

  11. HAPIscreen, a method for high-throughput aptamer identification

    Directory of Open Access Journals (Sweden)

    Evadé Laetitia

    2011-06-01

    Full Text Available Abstract Background Aptamers are oligonucleotides displaying specific binding properties for a predetermined target. They are selected from libraries of randomly synthesized candidates through an in vitro selection process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment), alternating selection and amplification steps. SELEX is followed by cloning and sequencing of the enriched pool of oligonucleotides to enable comparison of the selected sequences. The most represented candidates are then synthesized and their binding properties are individually evaluated, thus leading to the identification of aptamers. These post-selection steps are time consuming and introduce a bias at the expense of poorly amplified binders that might be of high affinity and are consequently underrepresented. A method that would circumvent these limitations would be highly valuable. Results We describe a novel homogeneous solution-based method for screening large populations of oligonucleotide candidates generated from SELEX. This approach, based on the AlphaScreen® technology, is carried out exclusively on the basis of the binding properties of the selected candidates, without the need to perform a priori sequencing. It therefore enables the functional identification of high affinity aptamers. We validated the HAPIscreen (High throughput APtamer Identification screen) methodology using aptamers targeted to RNA hairpins, previously identified in our laboratory. We then screened pools of candidates issued from SELEX rounds in a 384-well microplate format and identified new RNA aptamers to pre-microRNAs. Conclusions HAPIscreen, an AlphaScreen®-based methodology for the identification of aptamers, is faster and less biased than current procedures based on sequence comparison of selected oligonucleotides and sampling the binding properties of a few individuals. Moreover, this methodology allows for screening a larger number of candidates. Used here for selecting anti...

  12. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping.

    Science.gov (United States)

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-06-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment containing 33 maize (Zea mays 'Fernandez') plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides a rich set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. © 2014 American Society of Plant Biologists. All Rights Reserved.

  13. A high throughput array microscope for the mechanical characterization of biomaterials

    Science.gov (United States)

    Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard

    2015-02-01

    In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue and whose expression has been implicated in cancer progression.
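
    A minimal sketch of passive-microbead microrheology for a Newtonian fluid, estimating apparent viscosity from tracked bead positions via the mean squared displacement and the Stokes-Einstein relation; the trajectory format, lag choice and temperature are assumptions for illustration, not the instrument's actual analysis pipeline:

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant, J/K

def msd(track, lag):
    """Mean squared displacement (m^2) of one 2D track at a given lag (frames)."""
    d = track[lag:] - track[:-lag]
    return np.mean(np.sum(d ** 2, axis=1))

def apparent_viscosity(tracks, dt, bead_radius, temperature=298.0, lag=1):
    """Apparent viscosity (Pa*s) of a Newtonian fluid from 2D bead tracks
    (each an (N, 2) array of positions in meters, frame interval dt in s),
    using MSD = 4*D*t and the Stokes-Einstein relation D = kT / (6*pi*eta*a)."""
    msds = [msd(np.asarray(t, float), lag) for t in tracks]
    D = np.mean(msds) / (4.0 * lag * dt)   # diffusion coefficient, m^2/s
    return KB * temperature / (6.0 * np.pi * D * bead_radius)

# Hypothetical example: one simulated track of a 1-um-diameter bead in water-like fluid
rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(0, 3e-7, size=(500, 2)), axis=0)
print(apparent_viscosity([track], dt=0.02, bead_radius=0.5e-6))
```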

  14. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    Science.gov (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  15. High throughput imaging and analysis for biological interpretation of agricultural plants and environmental interaction

    Science.gov (United States)

    Hong, Hyundae; Benac, Jasenka; Riggsbee, Daniel; Koutsky, Keith

    2014-03-01

    High throughput (HT) phenotyping of crops is essential to increase yield in environments deteriorated by climate change. The controlled environment of a greenhouse offers an ideal platform to study the genotype to phenotype linkages for crop screening. Advanced imaging technologies are used to study plants' responses to resource limitations such as water and nutrient deficiency. Advanced imaging technologies coupled with automation make HT phenotyping in the greenhouse not only feasible, but practical. Monsanto has a state of the art automated greenhouse (AGH) facility. Handling of the soil, pots, water and nutrients is completely automated. Images of the plants are acquired by multiple hyperspectral and broadband cameras. The hyperspectral cameras cover wavelengths from visible light through short wave infra-red (SWIR). In-house developed software analyzes the images to measure plant morphological and biochemical properties. We measure phenotypic metrics such as plant area, height, and width as well as biomass. Hyperspectral imaging allows us to measure biochemical metrics such as chlorophyll, anthocyanin, and foliar water content. The last 4 years of AGH operations on crops such as corn, soybean, and cotton have demonstrated successful application of imaging and analysis technologies for high throughput plant phenotyping. Using HT phenotyping, scientists have shown strong correlations with environmental conditions, such as water and nutrient deficits, as well as the ability to tease apart distinct differences in the genetic backgrounds of crops.

  16. High-throughput telomere length quantification by FISH and its application to human population studies.

    Science.gov (United States)

    Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A

    2007-03-27

    A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated HT quantitative telomere FISH platform, HT quantitative FISH (Q-FISH), which allows the quantification of telomere length as well as percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.
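
    A minimal sketch of how per-telomere fluorescence intensities can be converted into length estimates and a percentage of short telomeres, assuming a hypothetical intensity-to-kb calibration factor and short-telomere cutoff (not the authors' exact procedure):

```python
import numpy as np

def telomere_metrics(intensities, kb_per_unit, short_cutoff_kb=3.0):
    """Convert per-telomere spot intensities to lengths (kb) using a
    calibration factor (e.g., derived from cell lines of known telomere
    length), then report the mean length and the fraction of short telomeres."""
    lengths = np.asarray(intensities, float) * kb_per_unit
    pct_short = 100.0 * np.mean(lengths < short_cutoff_kb)
    return lengths.mean(), pct_short

# Hypothetical spot intensities from one nucleus (arbitrary units)
print(telomere_metrics([120, 340, 95, 410, 60, 220], kb_per_unit=0.02))
```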

  17. Zebrafish: A marvel of high-throughput biology for 21st century toxicology.

    Science.gov (United States)

    Bugel, Sean M; Tanguay, Robert L; Planchart, Antonio

    2014-09-07

    The evolutionary conservation of genomic, biochemical and developmental features between zebrafish and humans is gradually coming into focus with the end result that the zebrafish embryo model has emerged as a powerful tool for uncovering the effects of environmental exposures on a multitude of biological processes with direct relevance to human health. In this review, we highlight advances in automation, high-throughput (HT) screening, and analysis that leverage the power of the zebrafish embryo model for unparalleled advances in our understanding of how chemicals in our environment affect our health and wellbeing.

  18. High-throughput screening for novel anti-infectives using a C. elegans pathogenesis model.

    Science.gov (United States)

    Conery, Annie L; Larkins-Ford, Jonah; Ausubel, Frederick M; Kirienko, Natalia V

    2014-03-14

    In recent history, the nematode Caenorhabditis elegans has provided a compelling platform for the discovery of novel antimicrobial drugs. In this protocol, we present an automated, high-throughput C. elegans pathogenesis assay, which can be used to screen for anti-infective compounds that prevent nematodes from dying due to Pseudomonas aeruginosa. New antibiotics identified from such screens would be promising candidates for treatment of human infections, and also can be used as probe compounds to identify novel targets in microbial pathogenesis or host immunity. Copyright © 2014 John Wiley & Sons, Inc.

  19. High-throughput search for caloric materials: the CaloriCool approach

    Science.gov (United States)

    Zarkevich, N. A.; Johnson, D. D.; Pecharsky, V. K.

    2018-01-01

    The high-throughput search paradigm adopted by the newly established caloric materials consortium—CaloriCool®—with the goal of substantially accelerating the discovery and design of novel caloric materials is briefly discussed. We begin by describing material selection criteria based on known properties, which are then followed by fast heuristic estimates and ab initio calculations, all of which have been implemented in a set of automated computational tools and measurements. We also demonstrate how theoretical and computational methods serve as a guide for experimental efforts by considering a representative example from the field of magnetocaloric materials.

  20. A novel high throughput method to investigate polymer dissolution.

    Science.gov (United States)

    Zhang, Ying; Mallapragada, Surya K; Narasimhan, Balaji

    2010-02-16

    The dissolution behavior of polystyrene (PS) in biodiesel was studied by developing a novel high throughput approach based on Fourier-transform infrared (FTIR) microscopy. A multiwell device for high throughput dissolution testing was fabricated using a photolithographic rapid prototyping method. The dissolution of PS films in each well was tracked by following the characteristic IR band of PS and the effect of PS molecular weight and temperature on the dissolution rate was simultaneously investigated. The results were validated with conventional gravimetric methods. The high throughput method can be extended to evaluate the dissolution profiles of a large number of samples, or to simultaneously investigate the effect of variables such as polydispersity, crystallinity, and mixed solvents. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
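
    The dissolution rate in such an assay is typically inferred from how the polymer's characteristic IR band decays over time. The sketch below fits a straight line to band-intensity readings per well; the time points, intensities and well names are hypothetical illustrations, not data from the study.

```python
# Sketch: per-well dissolution rate from the time course of a characteristic
# IR band intensity. All numbers below are hypothetical illustrations.
import numpy as np

def dissolution_rate(times_min, band_intensity):
    """Slope of band intensity vs time (a.u./min); more negative = faster dissolution."""
    slope, _intercept = np.polyfit(times_min, band_intensity, 1)
    return slope

times = np.array([0, 5, 10, 15, 20, 30], dtype=float)            # min
wells = {
    "PS_50kDa_30C":  np.array([1.00, 0.91, 0.83, 0.74, 0.66, 0.51]),
    "PS_200kDa_30C": np.array([1.00, 0.96, 0.93, 0.89, 0.86, 0.80]),
}
for well, trace in wells.items():
    print(f"{well}: {dissolution_rate(times, trace):.4f} a.u./min")
```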

  1. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought

    Directory of Open Access Journals (Sweden)

    Riccardo Ludovisi

    2017-09-01

    Full Text Available Poplars are fast-growing, high-yielding forest tree species, whose cultivation as second-generation biofuel crops is of increasing interest and can efficiently meet emission reduction goals. Yet, breeding elite poplar trees for drought resistance remains a major challenge. Worldwide breeding programs are largely focused on intra/interspecific hybridization, whereby Populus nigra L. is a fundamental parental pool. While high-throughput genotyping has resulted in unprecedented capabilities to rapidly decode complex genetic architecture of plant stress resistance, linking genomics to phenomics is hindered by technically challenging phenotyping. Relying on unmanned aerial vehicle (UAV)-based remote sensing and imaging techniques, high-throughput field phenotyping (HTFP) aims at enabling highly precise and efficient, non-destructive screening of genotype performance in large populations. To efficiently support forest-tree breeding programs, ground-truthing observations should be complemented with standardized HTFP. In this study, we develop a high-resolution (leaf level) HTFP approach to investigate the response to drought of a full-sib F2 partially inbred population (termed here ‘POP6'), whose F1 was obtained from an intraspecific P. nigra controlled cross between genotypes with highly divergent phenotypes. We assessed the effects of two water treatments (well-watered and moderate drought) on a population of 4603 trees (503 genotypes) hosted in two adjacent experimental plots (1.67 ha) by conducting low-elevation (25 m) flights with an aerial drone and capturing 7836 thermal infrared (TIR) images. TIR images were undistorted, georeferenced, and orthorectified to obtain radiometric mosaics. Canopy temperature (Tc) was extracted using two independent semi-automated segmentation techniques, eCognition- and Matlab-based, to avoid the mixed-pixel problem. Overall, results showed that the UAV platform-based thermal imaging enables to effectively assess genotype

  2. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought.

    Science.gov (United States)

    Ludovisi, Riccardo; Tauro, Flavia; Salvati, Riccardo; Khoury, Sacha; Mugnozza Scarascia, Giuseppe; Harfouche, Antoine

    2017-01-01

    Poplars are fast-growing, high-yielding forest tree species, whose cultivation as second-generation biofuel crops is of increasing interest and can efficiently meet emission reduction goals. Yet, breeding elite poplar trees for drought resistance remains a major challenge. Worldwide breeding programs are largely focused on intra/interspecific hybridization, whereby Populus nigra L. is a fundamental parental pool. While high-throughput genotyping has resulted in unprecedented capabilities to rapidly decode complex genetic architecture of plant stress resistance, linking genomics to phenomics is hindered by technically challenging phenotyping. Relying on unmanned aerial vehicle (UAV)-based remote sensing and imaging techniques, high-throughput field phenotyping (HTFP) aims at enabling highly precise and efficient, non-destructive screening of genotype performance in large populations. To efficiently support forest-tree breeding programs, ground-truthing observations should be complemented with standardized HTFP. In this study, we develop a high-resolution (leaf level) HTFP approach to investigate the response to drought of a full-sib F2 partially inbred population (termed here 'POP6'), whose F1 was obtained from an intraspecific P. nigra controlled cross between genotypes with highly divergent phenotypes. We assessed the effects of two water treatments (well-watered and moderate drought) on a population of 4603 trees (503 genotypes) hosted in two adjacent experimental plots (1.67 ha) by conducting low-elevation (25 m) flights with an aerial drone and capturing 7836 thermal infrared (TIR) images. TIR images were undistorted, georeferenced, and orthorectified to obtain radiometric mosaics. Canopy temperature (Tc) was extracted using two independent semi-automated segmentation techniques, eCognition- and Matlab-based, to avoid the mixed-pixel problem. Overall, results showed that the UAV platform-based thermal imaging enables to effectively assess genotype
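
    As a rough illustration of the mixed-pixel problem mentioned above, the sketch below extracts a canopy temperature (Tc) from a thermal tile by keeping only pixels that are clearly cooler than the soil background. The percentile cut-off and synthetic tile are assumed simplifications; they do not reproduce the eCognition- or Matlab-based segmentation used in the study.

```python
# Sketch: naive canopy-temperature (Tc) extraction from a thermal mosaic tile.
# The percentile cut-off is an assumption for illustration; the study used
# dedicated eCognition- and Matlab-based segmentation instead.
import numpy as np

def canopy_temperature(tile_degC, canopy_percentile=25):
    """Mean of the coolest pixels in a single-tree tile, as a crude Tc proxy."""
    cutoff = np.percentile(tile_degC, canopy_percentile)
    canopy_pixels = tile_degC[tile_degC <= cutoff]   # canopy is cooler than soil
    return float(canopy_pixels.mean())

# Synthetic 40x40 tile: warm soil (~35 degC) with a cooler crown (~28 degC) in the centre
tile = np.full((40, 40), 35.0) + np.random.normal(0, 0.5, (40, 40))
yy, xx = np.mgrid[0:40, 0:40]
tile[(yy - 20) ** 2 + (xx - 20) ** 2 < 10 ** 2] = 28.0
print(f"Tc ~ {canopy_temperature(tile):.1f} degC")
```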

  3. Screening and synthesis: high throughput technologies applied to parasitology.

    Science.gov (United States)

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  4. Workflow for High Throughput Screening of Gas Sensing Materials

    Directory of Open Access Journals (Sweden)

    Ulrich Simon

    2006-04-01

    Full Text Available The workflow of a high throughput screening setup for the rapid identification of new and improved sensor materials is presented. The polyol method was applied to prepare nanoparticular metal oxides as base materials, which were functionalised by surface doping. Using multi-electrode substrates and high throughput impedance spectroscopy (HT-IS) a wide range of materials could be screened in a short time. Applying HT-IS in search of new selective gas sensing materials, a NO2-tolerant NO sensing material with reduced sensitivities towards other test gases was identified based on iridium doped zinc oxide. Analogous behaviour was observed for iridium doped indium oxide.

  5. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.
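
    Agglutination strength in such droplet images is commonly scored by how much of the bead signal ends up in large clusters. The sketch below illustrates this idea on a thresholded image using connected-component labelling; the threshold, minimum cluster size and synthetic image are assumptions, not the authors' analysis code.

```python
# Sketch: agglutination score = fraction of bead pixels in large clusters.
# Threshold and minimum cluster size are illustrative assumptions.
import numpy as np
from scipy import ndimage

def agglutination_score(image, threshold=0.5, min_cluster_px=20):
    beads = image > threshold                      # bead signal mask
    labels, n = ndimage.label(beads)               # connected components
    if n == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]        # pixels per cluster (skip background)
    clustered = sizes[sizes >= min_cluster_px].sum()
    return float(clustered) / float(sizes.sum())

# Synthetic droplet image: scattered single beads plus one large aggregate
rng = np.random.default_rng(0)
img = np.zeros((200, 200))
img[tuple(rng.integers(0, 200, size=(2, 300)))] = 1.0   # isolated beads
img[90:110, 90:110] = 1.0                                # aggregate
print(f"agglutination score: {agglutination_score(img):.2f}")
```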

  6. Gold nanoparticle-mediated (GNOME) laser perforation: a new method for a high-throughput analysis of gap junction intercellular coupling.

    Science.gov (United States)

    Begandt, Daniela; Bader, Almke; Antonopoulos, Georgios C; Schomaker, Markus; Kalies, Stefan; Meyer, Heiko; Ripken, Tammo; Ngezahayo, Anaclet

    2015-10-01

    The present report evaluates the advantages of using the gold nanoparticle-mediated laser perforation (GNOME LP) technique as a computer-controlled cell optoperforation to introduce Lucifer yellow (LY) into cells in order to analyze the gap junction coupling in cell monolayers. To permeabilize GM-7373 endothelial cells grown in a 24-well plate with GNOME LP, a laser beam of 88 μm in diameter was applied in the presence of gold nanoparticles and LY. After 10 min to allow dye uptake and diffusion through gap junctions, we observed a LY-positive cell band of 179 ± 8 μm width. The presence of the gap junction channel blocker carbenoxolone during the optoperforation reduced the LY-positive band to 95 ± 6 μm. Additionally, a forskolin-related enhancement of gap junction coupling, recently found using the scrape loading technique, was also observed using GNOME LP. Further, automated cell imaging and subsequent semi-automatic quantification of the images using a Java-based ImageJ plugin were performed in a high-throughput sequence. Moreover, the GNOME LP was used on cells such as RBE4 rat brain endothelial cells, which cannot be mechanically scraped, as well as on three-dimensionally cultivated cells, opening the possibility to implement the GNOME LP technique for analysis of gap junction coupling in tissues. We conclude that the GNOME LP technique allows a high-throughput automated analysis of gap junction coupling in cells. Moreover, this non-invasive technique could be used on monolayers that do not support mechanical scraping, as well as on cells in tissue, allowing an in vivo/ex vivo analysis of gap junction coupling.
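
    The width of the LY-positive cell band can be estimated from a fluorescence intensity profile taken perpendicular to the perforated line. The sketch below measures the span above half-maximal intensity; the synthetic profile, pixel size and half-max criterion are assumptions for illustration and are not the ImageJ plugin described in the record.

```python
# Sketch: width of a dye-positive band from a 1-D fluorescence profile taken
# perpendicular to the optoperforated line. The half-max criterion and the
# Gaussian test profile are illustrative assumptions.
import numpy as np

def band_width_um(profile, pixel_size_um):
    background = np.median(profile)
    signal = profile - background
    half_max = signal.max() / 2.0
    above = np.flatnonzero(signal >= half_max)
    if above.size == 0:
        return 0.0
    return float((above[-1] - above[0] + 1) * pixel_size_um)

# Synthetic profile: Gaussian-like dye spread on a flat background
x = np.arange(512)
profile = 100 + 900 * np.exp(-((x - 256) / 60.0) ** 2)
print(f"LY-positive band width ~ {band_width_um(profile, pixel_size_um=1.3):.0f} um")
```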

  7. Applications of High-Throughput Nucleotide Sequencing (PhD)

    DEFF Research Database (Denmark)

    Waage, Johannes

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...

  8. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  9. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  10. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses

  11. High-throughput screening, predictive modeling and computational embryology

    Science.gov (United States)

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  12. High-throughput screening, predictive modeling and computational embryology - Abstract

    Science.gov (United States)

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  13. One step further towards real high-throughput functional genomics

    NARCIS (Netherlands)

    Oude Elferink, Ronald

    2003-01-01

    In a recent paper by Michiels et al. an important step was made towards genuine high throughput functional genomics. The authors produced an arrayed adenoviral library containing > 120000 cDNAs isolated from human placenta. This library can be used for arrayed transduction of cell lines in

  14. A high-throughput method for assessing chemical toxicity using a Caenorhabditis elegans reproduction assay

    International Nuclear Information System (INIS)

    Boyd, Windy A.; McBride, Sandra J.; Rice, Julie R.; Snyder, Daniel W.; Freedman, Jonathan H.

    2010-01-01

    The National Research Council has outlined the need for non-mammalian toxicological models to test the potential health effects of a large number of chemicals while also reducing the use of traditional animal models. The nematode Caenorhabditis elegans is an attractive alternative model because of its well-characterized and evolutionarily conserved biology, low cost, and ability to be used in high-throughput screening. A high-throughput method is described for quantifying the reproductive capacity of C. elegans exposed to chemicals for 48 h from the last larval stage (L4) to adulthood using a COPAS Biosort. Initially, the effects of exposure conditions that could influence reproduction were defined. Concentrations of DMSO vehicle ≤ 1% did not affect reproduction. Previous studies indicated that C. elegans may be influenced by exposure to low pH conditions. At pHs greater than 4.5, C. elegans reproduction was not affected; however, below this pH there was a significant decrease in the number of offspring. Cadmium chloride was chosen as a model toxicant to verify that automated measurements were comparable to those of traditional observational studies. EC50 values for cadmium for automated measurements (176-192 μM) were comparable to those previously reported for a 72-h exposure using manual counting (151 μM). The toxicity of seven test toxicants on C. elegans reproduction was highly correlated with rodent lethality, suggesting that this assay may be useful in predicting the potential toxicity of chemicals in other organisms.
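
    EC50 values like those reported above are usually obtained by fitting a sigmoidal dose-response model to the offspring counts. The sketch below fits a four-parameter log-logistic (Hill) curve with scipy; the concentrations and counts are made-up illustrations, not the COPAS Biosort data.

```python
# Sketch: EC50 from a dose-response fit of offspring counts.
# Concentrations and counts below are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, bottom, ec50, slope):
    """Four-parameter log-logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** slope)

conc_uM = np.array([0.1, 3, 10, 30, 100, 300, 1000], dtype=float)
offspring = np.array([210, 205, 190, 160, 110, 45, 12], dtype=float)

p0 = [offspring.max(), offspring.min(), 100.0, 1.0]       # initial guesses
popt, _pcov = curve_fit(hill, conc_uM, offspring, p0=p0, maxfev=10000)
top, bottom, ec50, slope = popt
print(f"EC50 ~ {ec50:.0f} uM (Hill slope {slope:.2f})")
```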

  15. Fast high-throughput screening of temoporfin-loaded liposomal formulations prepared by ethanol injection method.

    Science.gov (United States)

    Yang, Kewei; Delaney, Joseph T; Schubert, Ulrich S; Fahr, Alfred

    2012-03-01

    A new strategy for fast, convenient high-throughput screening of liposomal formulations was developed, utilizing the automation of the so-called ethanol-injection method. This strategy was illustrated by the preparation and screening of the liposomal formulation library of a potent second-generation photosensitizer, temoporfin. Numerous liposomal formulations were efficiently prepared using a pipetting robot, followed by automated size characterization, using a dynamic light scattering plate reader. Incorporation efficiency of temoporfin and zeta potential were also determined in selected cases. To optimize the formulation, different parameters were investigated, including lipid types, lipid concentration in injected ethanol, ratio of ethanol to aqueous solution, ratio of drug to lipid, and the addition of functional phospholipid. Step by step, small liposomes were prepared with high incorporation efficiency. Finally, an optimized formulation was obtained for each lipid under the following conditions: 36.4 mg·mL⁻¹ lipid, 13.1 mg·mL⁻¹ mPEG(2000)-DSPE, and 1:4 ethanol:buffer ratio. These liposomes were unilamellar spheres, with a diameter of approximately 50 nm, and were very stable for over 20 weeks. The results show this approach to be promising for fast high-throughput screening of liposomal formulations.

  16. High throughput imaging of blood smears using white light diffraction phase microscopy

    Science.gov (United States)

    Majeed, Hassaan; Kandel, Mikhail E.; Bhaduri, Basanta; Han, Kevin; Luo, Zelun; Tangella, Krishnarao; Popescu, Gabriel

    2015-03-01

    While automated blood cell counters have made great progress in detecting abnormalities in blood, the lack of specificity for a particular disease, limited information on single cell morphology and intrinsic uncertainty due to high throughput in these instruments often necessitate detailed inspection in the form of a peripheral blood smear. Such tests are relatively time-consuming and frequently rely on medical professionals tally-counting specific cell types. These assays rely on the contrast generated by chemical stains, with the signal intensity strongly related to staining and preparation techniques, frustrating machine learning algorithms that require consistent quantities to denote the features in question. Instead, we opt to use quantitative phase imaging, understanding that the resulting image is entirely due to the structure (intrinsic contrast) rather than the complex interplay of stain and sample. We present here our first steps to automate peripheral blood smear scanning, in particular a method to generate the quantitative phase image of an entire blood smear at high throughput using white light diffraction phase microscopy (wDPM), a single-shot, common-path interferometric imaging technique.

  17. MIPHENO: data normalization for high throughput metabolite analysis

    Directory of Open Access Journals (Sweden)

    Bell Shannon M

    2012-01-01

    Full Text Available Abstract Background High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course of months and years, often without the controls needed to compare directly across the dataset. Few methods are available to facilitate comparisons of high throughput metabolic data generated in batches where explicit in-group controls for normalization are lacking. Results Here we describe MIPHENO (Mutant Identification by Probabilistic High throughput-Enabled Normalization), an approach for post-hoc normalization of quantitative first-pass screening data in the absence of explicit in-group controls. This approach includes a quality control step and facilitates cross-experiment comparisons that decrease the false non-discovery rates, while maintaining the high accuracy needed to limit false positives in first-pass screening. Results from simulation show an improvement in both accuracy and false non-discovery rate over a range of population parameters (p < 2.2 × 10⁻¹⁶) and a modest but significant (p < 2.2 × 10⁻¹⁶) improvement in area under the receiver operator characteristic curve of 0.955 for MIPHENO vs 0.923 for a group-based statistic (z-score). Analysis of the high throughput phenotypic data from the Arabidopsis Chloroplast 2010 Project (http://www.plastid.msu.edu/) showed a ~4-fold increase in the ability to detect previously described or expected phenotypes over the group-based statistic. Conclusions Results demonstrate MIPHENO offers substantial benefit in improving the ability to detect putative mutant phenotypes from post-hoc analysis of large data sets. Additionally, it facilitates data interpretation and permits cross-dataset comparison where group-based controls are missing. MIPHENO is applicable to a wide range of high throughput screenings and the code is
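
    For context, the group-based statistic that MIPHENO is benchmarked against is an ordinary per-batch z-score. A minimal sketch of that baseline is shown below (MIPHENO itself is not reproduced here); the column names, example values and hit cut-off are assumptions.

```python
# Sketch: the per-batch z-score baseline that MIPHENO is compared against.
# Column names and values are hypothetical; this is not the MIPHENO algorithm.
import pandas as pd

data = pd.DataFrame({
    "batch":  ["b1", "b1", "b1", "b2", "b2", "b2"],
    "sample": ["wt", "m1", "m2", "wt", "m3", "m4"],
    "metabolite_signal": [1.00, 1.80, 0.95, 0.60, 1.30, 0.58],
})

grouped = data.groupby("batch")["metabolite_signal"]
data["zscore"] = (data["metabolite_signal"] - grouped.transform("mean")) / grouped.transform("std")

# Flag putative mutant phenotypes as |z| above an (assumed) cut-off of 1.0
data["hit"] = data["zscore"].abs() > 1.0
print(data)
```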

  18. A robust high throughput platform to generate functional recombinant monoclonal antibodies using rabbit B cells from peripheral blood.

    Directory of Open Access Journals (Sweden)

    Stefan Seeber

    Full Text Available We have developed a robust platform to generate and functionally characterize rabbit-derived antibodies using B cells from peripheral blood. The rapid high throughput procedure generates a diverse set of antibodies, yet requires only a few animals to be immunized without the need to sacrifice them. The workflow includes (i) the identification and isolation of single B cells from rabbit blood expressing IgG antibodies, (ii) an elaborate short-term B-cell cultivation to produce sufficient monoclonal antigen specific IgG for comprehensive phenotype screens, (iii) the isolation of VH and VL coding regions via PCR from B-cell clones producing antigen specific and functional antibodies followed by the sequence determination, and (iv) the recombinant expression and purification of IgG antibodies. The fully integrated and to a large degree automated platform (demonstrated in this paper using IL1RL1-immunized rabbits) yielded clonal and very diverse IL1RL1-specific and functional IL1RL1-inhibiting rabbit antibodies. These functional IgGs from individual animals were obtained within a short time after immunization and could be identified already during primary screening, thus substantially lowering the workload for the subsequent B-cell PCR workflow. Early availability of sequence information permits one to select early-on function- and sequence-diverse antibodies for further characterization. In summary, this powerful technology platform has proven to be an efficient and robust method for the rapid generation of antigen specific and functional monoclonal rabbit antibodies without sacrificing the immunized animal.

  19. High Throughput Cryogenic And Room Temperature Testing Of Focal Plane Components

    Science.gov (United States)

    Voynick, Stanley

    1988-04-01

    To increase production efficiency in the manufacture of infrared focal plane components, test techniques were refined to enhance testing throughput and accuracy. The result is an integrated package of high performance hardware and software tools which performs well in high throughput production environments. The test system is also very versatile. It has been used for readout (multiplexer) device characterization, room temperature automated wafer probing, and focal plane array (FPA) testing. Tests have been performed using electrical and radiometric optical stimulus. An integrated, convenient software package was developed and is used to acquire, reduce, analyze, display, and archive test data. The test software supports fully automated operation for the production environment, as well as menu-driven operation for R&D, characterization and setup purposes. Trade-offs between handling techniques in cryogenic production testing were investigated. "Batch processing" is preferred over "continuous flow", primarily due to considerations of contamination of the cryogenic environment.

  20. High-Throughput Platform for Synthesis of Melamine-Formaldehyde Microcapsules.

    Science.gov (United States)

    Çakir, Seda; Bauters, Erwin; Rivero, Guadalupe; Parasote, Tom; Paul, Johan; Du Prez, Filip E

    2017-07-10

    The synthesis of microcapsules via in situ polymerization is a labor-intensive and time-consuming process, where many composition and process factors affect the microcapsule formation and its morphology. Herein, we report a novel combinatorial technique for the preparation of melamine-formaldehyde microcapsules, using a custom-made and automated high-throughput platform (HTP). After performing validation experiments to ensure the accuracy and reproducibility of the novel platform, a design-of-experiments study was performed. The influence of different encapsulation parameters was investigated, such as the effect of the surfactant, surfactant type, surfactant concentration and core/shell ratio. As a result, this HTP platform is suitable for the synthesis of different types of microcapsules in an automated and controlled way, allowing the screening of different reaction parameters in a shorter time compared to manual synthetic techniques.

  1. High-throughput screen for novel antimicrobials using a whole animal infection model.

    Science.gov (United States)

    Moy, Terence I; Conery, Annie L; Larkins-Ford, Jonah; Wu, Gang; Mazitschek, Ralph; Casadei, Gabriele; Lewis, Kim; Carpenter, Anne E; Ausubel, Frederick M

    2009-07-17

    The nematode Caenorhabditis elegans is a unique whole animal model system for identifying small molecules with in vivo anti-infective properties. C. elegans can be infected with a broad range of human pathogens, including Enterococcus faecalis, an important human nosocomial pathogen. Here, we describe an automated, high-throughput screen of 37,200 compounds and natural product extracts for those that enhance survival of C. elegans infected with E. faecalis. Using a robot to dispense live, infected animals into 384-well plates and automated microscopy and image analysis, we identified 28 compounds and extracts not previously reported to have antimicrobial properties, including six structural classes that cure infected C. elegans animals but do not affect the growth of the pathogen in vitro, thus acting by a mechanism of action distinct from antibiotics currently in clinical use.

  2. LSGermOPA, a custom OPA of 384 EST-derived SNPs for high-throughput lettuce (Lactuca sativa L.) germplasm fingerprinting

    Science.gov (United States)

    We assessed the genetic diversity and population structure among 148 cultivated lettuce (Lactuca sativa L.) accessions using the high-throughput GoldenGate assay and 384 EST (Expressed Sequence Tag)-derived SNP (single nucleotide polymorphism) markers. A custom OPA (Oligo Pool All), LSGermOPA was fo...

  3. AELAS: Automatic ELAStic property derivations via high-throughput first-principles computation

    Science.gov (United States)

    Zhang, S. H.; Zhang, R. F.

    2017-11-01

    The elastic properties are fundamental and important for crystalline materials as they relate to other mechanical properties, various thermodynamic qualities as well as some critical physical properties. However, a complete set of experimentally determined elastic properties is only available for a small subset of known materials, and an automatic scheme for the derivation of elastic properties that is adapted to high-throughput computation is in high demand. In this paper, we present the AELAS code, an automated program for calculating second-order elastic constants of both two-dimensional and three-dimensional single crystal materials with any symmetry, which is designed mainly for high-throughput first-principles computation. Other derivations of general elastic properties such as Young's, bulk and shear moduli as well as Poisson's ratio of polycrystal materials, Pugh ratio, Cauchy pressure, elastic anisotropy and elastic stability criterion, are also implemented in this code. The implementation of the code has been critically validated by a large number of evaluations and tests on a broad class of materials including two-dimensional and three-dimensional materials, proving its efficiency and capability for high-throughput screening of specific materials with targeted mechanical properties. Program Files doi: http://dx.doi.org/10.17632/f8fwg4j9tw.1. Licensing provisions: BSD 3-Clause. Programming language: Fortran. Nature of problem: To automate the calculations of second-order elastic constants and the derivations of other elastic properties for two-dimensional and three-dimensional materials with any symmetry via high-throughput first-principles computation. Solution method: The space-group number is first determined by the SPGLIB code [1] and the structure is then redefined to a unit cell in IEEE format [2]. Secondly, based on the determined space group number, a set of distortion modes is automatically specified and the distorted structure files are generated
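
    The core of such second-order elastic constant derivations is a quadratic fit of total energy versus applied strain. The sketch below shows that step in generic form for a single distortion mode; the energies, cell volume and eV/Å³-to-GPa conversion are illustrative, and this is not the AELAS Fortran implementation itself.

```python
# Sketch: one effective second-order elastic constant from an energy-strain
# curve, E(d) ~ E0 + 0.5*V*C*d^2. Energies and volume are hypothetical numbers;
# this is a generic illustration, not AELAS internals.
import numpy as np

EV_PER_A3_TO_GPA = 160.21766208   # unit conversion factor

def elastic_constant_gpa(strains, energies_eV, volume_A3):
    """Fit E(d) = a2*d^2 + a1*d + a0 and return C = 2*a2/V in GPa."""
    a2, _a1, _a0 = np.polyfit(strains, energies_eV, 2)
    return 2.0 * a2 / volume_A3 * EV_PER_A3_TO_GPA

strains = np.linspace(-0.02, 0.02, 9)                    # applied distortion amplitudes
energies = -250.0 + 0.5 * 2.8 * 45.0 * strains ** 2      # synthetic DFT-like energies (eV)
print(f"C ~ {elastic_constant_gpa(strains, energies, volume_A3=45.0):.0f} GPa")
```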

  4. High-throughput development of amphiphile self-assembly materials: fast-tracking synthesis, characterization, formulation, application, and understanding.

    Science.gov (United States)

    Mulet, Xavier; Conn, Charlotte E; Fong, Celesta; Kennedy, Danielle F; Moghaddam, Minoo J; Drummond, Calum J

    2013-07-16

    application. High-throughput data analysis is crucial at all stages to keep pace with data collection. In this Account, we describe high-throughput advances in the field of amphiphile self-assembly, focusing on nanostructured lyotropic liquid crystalline materials, which form when amphiphiles are added to a polar solvent. We outline recent progress in the automated preparation of amphiphile molecules and their nanostructured self-assembly systems both in the bulk phase and in dispersed colloidal particulate systems. Once prepared, we can structurally characterize these systems by establishing phase behavior in a high-throughput manner with both laboratory (infrared and light polarization microscopy) and synchrotron facilities (small-angle X-ray scattering). Additionally, we provide three case studies to demonstrate how chemists can use high-throughput approaches to evaluate the functional performance of amphiphile self-assembly materials. The high-throughput methodology for the set-up and characterization of large matrix in meso membrane protein crystallization trials can illustrate an application of bulk phase self-assembling amphiphiles. For dispersed colloidal systems, two nanomedicine examples highlight advances in high-throughput preparation, characterization, and evaluation: drug delivery and magnetic resonance imaging agents.

  5. High-throughput combinatorial chemical bath deposition: The case of doping Cu (In, Ga) Se film with antimony

    Science.gov (United States)

    Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong

    2018-01-01

    Conventional methods for designing and preparing thin films based on wet processes remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is ideally suited to preparing combinatorial libraries of materials with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To test this system, a Cu(In, Ga)Se (CIGS) thin film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example to systematically evaluate the influence of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can be used to systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties by accurately monitoring the doping content and material composition. Based on the characterization results, an Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. Beyond the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin film material systems.

  6. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content and branching degrees, could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Typically, extreme branching or linearity were detected less than normal starch structures. The method offers...

  7. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

    Full Text Available Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats and CRISPR-associated (Cas9-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss of function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  8. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone in current drug discovery is the high-throughput screening assay, which allows examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with sufficiently high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of high throughput philosophy are emerging and predicted to make a number of ion...

  9. Trade-Off Analysis in High-Throughput Materials Exploration.

    Science.gov (United States)

    Volety, Kalpana K; Huyberechts, Guido P J

    2017-03-13

    This Research Article presents a strategy to identify the optimum compositions in metal alloys with certain desired properties in a high-throughput screening environment, using a multiobjective optimization approach. In addition to the identification of the optimum compositions in a primary screening, the strategy also allows pointing to regions in the compositional space where further exploration in a secondary screening could be carried out. The strategy for the primary screening is a combination of two multiobjective optimization approaches, namely Pareto optimality and desirability functions. The experimental data used in the present study have been collected from over 200 different compositions belonging to four different alloy systems. The metal alloys (comprising Fe, Ti, Al, Nb, Hf, Zr) are synthesized and screened using high-throughput technologies. The advantages of such an approach over the limitations of traditional, comparatively simpler approaches such as ranking and calculating figures of merit are discussed.
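
    The two multiobjective ingredients named above, Pareto optimality and desirability functions, can be illustrated compactly. In the sketch below all objectives are assumed to be maximized, and the linear desirability scaling and example property matrix are assumptions rather than the article's actual alloy data.

```python
# Sketch: Pareto-front identification plus a geometric-mean desirability rank.
# Objectives are assumed "larger is better"; the property matrix is made up.
import numpy as np

def pareto_mask(props):
    """True for rows not dominated by any other row (all objectives maximized)."""
    n = props.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        dominated = np.all(props >= props[i], axis=1) & np.any(props > props[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

def desirability(props):
    """Geometric mean of per-objective linear scalings onto [0, 1]."""
    lo, hi = props.min(axis=0), props.max(axis=0)
    d = (props - lo) / np.where(hi > lo, hi - lo, 1.0)
    return np.prod(d, axis=1) ** (1.0 / props.shape[1])

# Rows: candidate compositions; columns: e.g. hardness, ductility proxy, corrosion score
props = np.array([[4.1, 0.20, 0.8],
                  [3.8, 0.35, 0.7],
                  [4.0, 0.34, 0.9],
                  [2.9, 0.15, 0.4]])
print("Pareto-optimal rows:", np.flatnonzero(pareto_mask(props)))
print("desirability       :", np.round(desirability(props), 2))
```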

  10. A high-throughput label-free nanoparticle analyser.

    Science.gov (United States)

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  11. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  12. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  13. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  14. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples... be correlated to the presence of the species that are regarded as "strong" and "weak" floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used...

  15. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

    Affordable, long-wave infrared hyperspectral imaging calls for use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager in a spectral range of 7-14 μm with a field of view of 20°×10°. The imager employs a push-broom method realized by a scanning mirror. High throughput and a demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and a simple assembly.

  16. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  17. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  18. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  19. Web-based visual analysis for high-throughput genomics.

    Science.gov (United States)

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput

  20. A framework for accelerated phototrophic bioprocess development: integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design.

    Science.gov (United States)

    Morschett, Holger; Freier, Lars; Rohde, Jannis; Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco

    2017-01-01

    Even though microalgae-derived biodiesel has regained interest within the last decade, industrial production is still challenging for economic reasons. Besides reactor design, as well as value chain and strain engineering, laborious and slow early-stage parameter optimization represents a major drawback. The present study introduces a framework for the accelerated development of phototrophic bioprocesses. A state-of-the-art micro-photobioreactor supported by a liquid-handling robot for automated medium preparation and product quantification was used. To take full advantage of the technology's experimental capacity, Kriging-assisted experimental design was integrated to enable highly efficient execution of screening applications. The resulting platform was used for medium optimization of a lipid production process using Chlorella vulgaris toward maximum volumetric productivity. Within only four experimental rounds, lipid production was increased approximately threefold to 212 ± 11 mg L⁻¹ d⁻¹. Besides nitrogen availability as a key parameter, magnesium, calcium and various trace elements were shown to be of crucial importance. Here, synergistic multi-parameter interactions as revealed by the experimental design introduced significant further optimization potential. The integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design proved to be a fruitful tool for the accelerated development of phototrophic bioprocesses. By means of the proposed technology, the targeted optimization task was conducted in a very timely and material-efficient manner.
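
    A minimal sketch of the Kriging-assisted design step, fitting a Gaussian-process surrogate to the results obtained so far and ranking untested medium compositions by expected improvement, is given below. The kernel choice, medium components and data values are assumptions and do not reproduce the authors' actual workflow or software.

```python
# Sketch: Kriging (Gaussian-process) surrogate + expected improvement (EI)
# to propose the next medium composition. Data and kernel are assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Tested media: columns = nitrogen, magnesium, calcium (normalized 0..1);
# response = volumetric lipid productivity (mg/L/d); all values hypothetical.
X = np.array([[0.2, 0.5, 0.5], [0.8, 0.5, 0.5], [0.5, 0.2, 0.8], [0.5, 0.8, 0.2]])
y = np.array([70.0, 120.0, 95.0, 105.0])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

def expected_improvement(candidates, gp, y_best, xi=0.01):
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

candidates = np.random.default_rng(1).uniform(0, 1, size=(500, 3))
ei = expected_improvement(candidates, gp, y.max())
print("next composition to test:", np.round(candidates[np.argmax(ei)], 2))
```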

  1. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

    Full Text Available Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.
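
    The split-and-gather pattern that Condor-COPASI applies to model-analysis tasks can be illustrated locally in a few lines. In the sketch below, Python's concurrent.futures stands in for a Condor pool and a toy parameter scan replaces a real COPASI model, so none of the names correspond to the Condor-COPASI API.

```python
# Sketch: split a parameter scan into chunks, run them in parallel, then
# gather the results. A local process pool stands in for a Condor pool;
# simulate_chunk is a toy placeholder, not a COPASI call.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def simulate_chunk(param_values):
    """Placeholder 'simulation': evaluate a toy steady-state response."""
    return [(p, 1.0 / (1.0 + np.exp(-5.0 * (p - 0.5)))) for p in param_values]

def run_scan(param_values, n_chunks=4):
    chunks = np.array_split(np.asarray(param_values), n_chunks)   # split ...
    with ProcessPoolExecutor(max_workers=n_chunks) as pool:
        partial_results = pool.map(simulate_chunk, chunks)        # submit in parallel
    return [item for part in partial_results for item in part]    # ... and gather

if __name__ == "__main__":
    results = run_scan(np.linspace(0.0, 1.0, 100))
    print(f"{len(results)} parameter points evaluated")
```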

  2. Development of scalable high throughput fermentation approaches for physiological characterisation of yeast and filamentous fungi

    DEFF Research Database (Denmark)

    Knudsen, Peter Boldsen

    ...scale-up to economically viable industrial processes. Accurate quantitative assessment of cellular performance is required for the evaluation of the overall suitability of a microorganism as an industrial cell factory, ensuring that not only product, but also process parameters are optimised. With the increasing number of strains generated through genetic engineering programmes, the traditionally applied methods for strain characterisation, which are typically labour intensive and time consuming, have become somewhat limited due to throughput capacity. Unfortunately, most high throughput methods only provide low levels of information compared to larger scale cultivations, explaining why these systems have not been broadly implemented. The overall aim of the thesis was, therefore, to shift this paradigm towards higher throughput systems for assessment of cellular performance with a higher level of information. This was pursued...

  3. Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.

    Science.gov (United States)

    Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin

    2016-02-01

    High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.

  4. High-Throughput Peptide Epitope Mapping Using Carbon Nanotube Field-Effect Transistors

    Directory of Open Access Journals (Sweden)

    Steingrimur Stefansson

    2013-01-01

    Full Text Available Label-free and real-time detection technologies can dramatically reduce the time and cost of pharmaceutical testing and development. However, to reach their full promise, these technologies need to be adaptable to high-throughput automation. To demonstrate the potential of single-walled carbon nanotube field-effect transistors (SWCNT-FETs) for high-throughput peptide-based assays, we have designed circuits arranged in an 8 × 12 (96-well) format that are accessible to standard multichannel pipettors. We performed epitope mapping of two HIV-1 gp160 antibodies using an overlapping gp160 15-mer peptide library coated onto nonfunctionalized SWCNTs. The 15-mer peptides did not require a linker to adhere to the non-functionalized SWCNTs, and binding data was obtained in real time for all 96 circuits. Despite some sequence differences in the HIV strains used to generate these antibodies and the overlapping peptide library, respectively, our results using these antibodies are in good agreement with known data, indicating that peptides immobilized onto SWCNT are accessible and that linear epitope mapping can be performed in minutes using SWCNT-FET.

  5. A multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans

    Science.gov (United States)

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-01-01

    The booming nanotech industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At the organism level, our automated system analyzes hundreds of individual animals for body length, locomotion speed, and lifespan. To demonstrate the utility of our system, we applied this technology to test the toxicity of 20 nanomaterials under four concentrations. Only fullerene nanoparticles (nC60), fullerol, TiO2, and CeO2 showed little or no toxicity. Various degrees of toxicity were detected from different forms of carbon nanotubes, graphene, carbon black, Ag, and fumed SiO2 nanoparticles. Aminofullerene and UV irradiated nC60 also showed small but significant toxicity. We further investigated the effects of nanomaterial size, shape, surface chemistry, and exposure conditions on toxicity. Our data are publicly available at the open-access nanotoxicity database www.QuantWorm.org/nano. PMID:25611253

  6. MassCode Liquid Arrays as a Tool for Multiplexed High-Throughput Genetic Profiling

    Science.gov (United States)

    Richmond, Gregory S.; Khine, Htet; Zhou, Tina T.; Ryan, Daniel E.; Brand, Tony; McBride, Mary T.; Killeen, Kevin

    2011-01-01

    Multiplexed detection assays that analyze a modest number of nucleic acid targets over large sample sets are emerging as the preferred testing approach in such applications as routine pathogen typing, outbreak monitoring, and diagnostics. However, very few DNA testing platforms have proven to offer a solution for mid-plexed analysis that is high-throughput, sensitive, and with a low cost per test. In this work, an enhanced genotyping method based on MassCode technology was devised and integrated as part of a high-throughput mid-plexing analytical system that facilitates robust qualitative differential detection of DNA targets. Samples are first analyzed using MassCode PCR (MC-PCR) performed with an array of primer sets encoded with unique mass tags. Lambda exonuclease and an array of MassCode probes are then contacted with MC-PCR products for further interrogation and target sequences are specifically identified. Primer and probe hybridizations occur in homogeneous solution, a clear advantage over micro- or nanoparticle suspension arrays. The two cognate tags coupled to resultant MassCode hybrids are detected in an automated process using a benchtop single quadrupole mass spectrometer. The prospective value of using MassCode probe arrays for multiplexed bioanalysis was demonstrated after developing a 14plex proof of concept assay designed to subtype a select panel of Salmonella enterica serogroups and serovars. This MassCode system is very flexible and test panels can be customized to include more, less, or different markers. PMID:21544191

  7. MassCode liquid arrays as a tool for multiplexed high-throughput genetic profiling.

    Directory of Open Access Journals (Sweden)

    Gregory S Richmond

    Full Text Available Multiplexed detection assays that analyze a modest number of nucleic acid targets over large sample sets are emerging as the preferred testing approach in such applications as routine pathogen typing, outbreak monitoring, and diagnostics. However, very few DNA testing platforms have proven to offer a solution for mid-plexed analysis that is high-throughput, sensitive, and with a low cost per test. In this work, an enhanced genotyping method based on MassCode technology was devised and integrated as part of a high-throughput mid-plexing analytical system that facilitates robust qualitative differential detection of DNA targets. Samples are first analyzed using MassCode PCR (MC-PCR performed with an array of primer sets encoded with unique mass tags. Lambda exonuclease and an array of MassCode probes are then contacted with MC-PCR products for further interrogation and target sequences are specifically identified. Primer and probe hybridizations occur in homogeneous solution, a clear advantage over micro- or nanoparticle suspension arrays. The two cognate tags coupled to resultant MassCode hybrids are detected in an automated process using a benchtop single quadrupole mass spectrometer. The prospective value of using MassCode probe arrays for multiplexed bioanalysis was demonstrated after developing a 14plex proof of concept assay designed to subtype a select panel of Salmonella enterica serogroups and serovars. This MassCode system is very flexible and test panels can be customized to include more, less, or different markers.

  8. High-throughput volumetric reconstruction for 3D wheat plant architecture studies

    Directory of Open Access Journals (Sweden)

    Wei Fang

    2016-09-01

    Full Text Available For many tiller crops, the plant architecture (PA), including the plant fresh weight, plant height, number of tillers, tiller angle and stem diameter, significantly affects the grain yield. In this study, we propose a method based on volumetric reconstruction for high-throughput three-dimensional (3D) wheat PA studies. The proposed methodology involves plant volumetric reconstruction from multiple images, plant model processing and phenotypic parameter estimation and analysis. This study was performed on 80 Triticum aestivum plants, and the results were analyzed. Comparing the automated measurements with manual measurements, the mean absolute percentage error (MAPE) in the plant height and the plant fresh weight was 2.71% (1.08 cm) with an average plant height of 40.07 cm and 10.06% (1.41 g) with an average plant fresh weight of 14.06 g, respectively. The root mean square error (RMSE) was 1.37 cm and 1.79 g for the plant height and plant fresh weight, respectively. The correlation coefficients were 0.95 and 0.96 for the plant height and plant fresh weight, respectively. Additionally, the proposed methodology, including plant reconstruction, model processing and trait extraction, required only approximately 20 s on average per plant using parallel computing on a graphics processing unit (GPU), demonstrating that the methodology would be valuable for a high-throughput phenotyping platform.
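
    The error metrics quoted above (MAPE, RMSE and the correlation coefficient) are standard formulas for comparing automated with manual phenotyping. A minimal sketch of how such a comparison could be computed is given below; it is not the authors' code, and the example height readings are hypothetical placeholders.

```python
# A minimal sketch (not the authors' code) of the agreement metrics quoted
# above: MAPE, RMSE and the Pearson correlation between automated and manual
# measurements. The example height readings are hypothetical.
import numpy as np

def agreement_metrics(automated, manual):
    automated = np.asarray(automated, dtype=float)
    manual = np.asarray(manual, dtype=float)
    mape = np.mean(np.abs(automated - manual) / manual) * 100.0   # percent
    rmse = np.sqrt(np.mean((automated - manual) ** 2))
    r = np.corrcoef(automated, manual)[0, 1]
    return mape, rmse, r

auto_height = [39.5, 41.2, 38.8, 40.9]     # cm, hypothetical automated values
manual_height = [40.1, 40.6, 39.5, 40.2]   # cm, hypothetical manual values
print(agreement_metrics(auto_height, manual_height))
```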

  9. Receptor-based high-throughput screening and identification of estrogens in dietary supplements using bioaffinity liquid-chromatography ion mobility mass spectrometry

    NARCIS (Netherlands)

    Aqai, P.; Gómez Blesa, N.; Major, H.; Pedotti, P.; Varani, L.; Ferrero, V.E.V.; Haasnoot, W.; Nielen, M.W.F.

    2013-01-01

    A high-throughput bioaffinity liquid chromatography-mass spectrometry (BioMS) approach was developed and applied for the screening and identification of recombinant human estrogen receptor a (ERa) ligands in dietary supplements. For screening, a semi-automated mass spectrometric ligand binding assay

  10. Three-dimensional reconstruction and measurements of zebrafish larvae from high-throughput axial-view in vivo imaging.

    Science.gov (United States)

    Guo, Yuanhao; Veneman, Wouter J; Spaink, Herman P; Verbeek, Fons J

    2017-05-01

    High-throughput imaging is applied to provide observations for accurate statements on phenomena in biology, and has been applied successfully in the domain of cells, i.e. cytomics. In the domain of whole organisms, several hurdles must be overcome to ensure that imaging can be accomplished with sufficient throughput and reproducibility. For vertebrate biology, zebrafish is a popular model system for high-throughput applications. The development of the Vertebrate Automated Screening Technology (VAST BioImager), a microscope-mounted system, enables high-throughput screening of zebrafish. The VAST BioImager contains a capillary that holds a zebrafish for imaging. Through rotation of the capillary, multiple axial views of a specimen can be acquired. The VAST BioImager is used with fluorescence and/or confocal microscopes. Quantitation of a specific signal derived from a label in one fluorescent channel requires insight into the zebrafish volume so that quantitation can be normalized to volume units. However, a specimen volume cannot be straightforwardly derived from the setup of the VAST BioImager. We present a high-throughput axial-view imaging architecture based on the VAST BioImager and propose profile-based 3D reconstruction to produce 3D volumetric representations of zebrafish larvae from the axial views. Volume and surface area can then be derived from the 3D reconstruction to obtain shape characteristics in high-throughput measurements. In addition, we develop a calibration and a validation of our methodology. Our measurements show that, with a limited number of views, accurate measurements of volume and surface area for zebrafish larvae can be obtained. We have applied the proposed method to a range of developmental stages in zebrafish and produced metrical references for the volume and surface area of each stage.
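
    The profile-based reconstruction itself is not detailed in this record, so the sketch below only illustrates the general idea of estimating a volume from axial silhouettes under a simplifying assumption: each slice along the larva's axis is treated as an ellipse whose semi-axes come from the silhouette widths in two roughly orthogonal views. The masks and pixel size are hypothetical.

```python
# A minimal sketch under a simplifying assumption (elliptical cross-sections),
# not the published profile-based reconstruction: estimate a larva's volume
# from two roughly orthogonal silhouette masks with the body axis along the
# image columns. Masks and pixel size are hypothetical.
import numpy as np

def volume_from_silhouettes(mask_side, mask_top, pixel_size_um):
    """Boolean masks (rows x columns) from two orthogonal axial views."""
    height = mask_side.sum(axis=0) * pixel_size_um   # apparent thickness per slice
    width = mask_top.sum(axis=0) * pixel_size_um     # apparent width per slice
    slice_area = np.pi * (height / 2.0) * (width / 2.0)   # ellipse area per slice
    return float(np.sum(slice_area) * pixel_size_um)      # integrate along the axis

# Toy example: a 20-pixel-long, roughly cylindrical object
side = np.zeros((10, 20), dtype=bool); side[3:7, :] = True
top = np.zeros((10, 20), dtype=bool); top[2:8, :] = True
print(volume_from_silhouettes(side, top, pixel_size_um=5.0), "um^3")
```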

  11. The FlyCatwalk: a high-throughput feature-based sorting system for artificial selection in Drosophila.

    Science.gov (United States)

    Medici, Vasco; Vonesch, Sibylle Chantal; Fry, Steven N; Hafen, Ernst

    2015-01-02

    Experimental evolution is a powerful tool for investigating complex traits. Artificial selection can be applied for a specific trait and the resulting phenotypically divergent populations pool-sequenced to identify alleles that occur at substantially different frequencies in the extreme populations. To maximize the proportion of loci that are causal to the phenotype among all enriched loci, population size and number of replicates need to be high. These requirements have, in fact, limited evolution studies in higher organisms, where the time investment required for phenotyping is often prohibitive for large-scale studies. Animal size is a highly multigenic trait that remains poorly understood, and an experimental evolution approach may thus aid in gaining new insights into the genetic basis of this trait. To this end, we developed the FlyCatwalk, a fully automated, high-throughput system to sort live fruit flies (Drosophila melanogaster) based on morphometric traits. With the FlyCatwalk, we can detect gender and quantify body and wing morphology parameters at a fourfold higher throughput compared with manual processing. The phenotyping results acquired using the FlyCatwalk correlate well with those obtained using the standard manual procedure. We demonstrate that an automated, high-throughput, feature-based sorting system is able to avoid previous limitations in population size and replicate numbers. Our approach can likewise be applied for a variety of traits and experimental settings that require high-throughput phenotyping. Copyright © 2015 Medici et al.

  12. Beamline AR-NW12A: high-throughput beamline for macromolecular crystallography at the Photon Factory.

    Science.gov (United States)

    Chavas, L M G; Matsugaki, N; Yamada, Y; Hiraki, M; Igarashi, N; Suzuki, M; Wakatsuki, S

    2012-05-01

    AR-NW12A is an in-vacuum undulator beamline optimized for high-throughput macromolecular crystallography experiments as one of the five macromolecular crystallography (MX) beamlines at the Photon Factory. This report provides details of the beamline design, covering its optical specifications, hardware set-up, control software, and the latest developments for MX experiments. The experimental environment presents state-of-the-art instrumentation for high-throughput projects with a high-precision goniometer with an adaptable goniometer head, and a UV-light sample visualization system. Combined with an efficient automounting robot modified from the SSRL SAM system, a remote control system enables fully automated and remote-access X-ray diffraction experiments.

  13. HTP-OligoDesigner: An Online Primer Design Tool for High-Throughput Gene Cloning and Site-Directed Mutagenesis.

    Science.gov (United States)

    Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor

    2016-01-01

    Following burgeoning genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain a large number of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-generating step, particularly when working with hundreds of targets, the automation of primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis and a Tm calculator for quick queries.
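
    HTP-OligoDesigner's own algorithms are not described in this record; the sketch below only illustrates the kind of quick melting-temperature estimate such a tool might expose, using the textbook Wallace rule for short oligonucleotides and a GC-content approximation for longer ones. The example primer is hypothetical.

```python
# A hedged sketch of the kind of quick Tm estimate a primer-design tool might
# expose; this is the textbook Wallace rule / GC% approximation, not
# HTP-OligoDesigner's actual algorithm. The primer below is hypothetical.
def melting_temperature(primer):
    primer = primer.upper()
    gc = primer.count("G") + primer.count("C")
    at = primer.count("A") + primer.count("T")
    if len(primer) < 14:
        return 2 * at + 4 * gc                         # Wallace rule (short oligos)
    return 64.9 + 41.0 * (gc - 16.4) / len(primer)     # GC%-based estimate

print(melting_temperature("ATGCGTACGTTAGCAGC"), "degrees C")
```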

  14. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12 in wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  15. Precise, High-throughput Analysis of Bacterial Growth.

    Science.gov (United States)

    Kurokawa, Masaomi; Ying, Bei-Wen

    2017-09-19

    Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
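
    The two growth parameters described above can be extracted from a blank-corrected OD time series; a minimal sketch (not the published protocol's calculation sheet) is shown below, taking the maximal growth rate as the steepest sliding-window slope of ln(OD) and the population density as the maximal OD reached. The time course is simulated.

```python
# A minimal sketch, not the published protocol: estimate the maximal growth
# rate (steepest sliding-window slope of ln(OD)) and the saturated population
# density (maximal OD reached) from blank-corrected OD600 readings.
import numpy as np

def growth_parameters(time_h, od, window=5):
    time_h = np.asarray(time_h, dtype=float)
    log_od = np.log(np.asarray(od, dtype=float))
    rates = [np.polyfit(time_h[i:i + window], log_od[i:i + window], 1)[0]
             for i in range(len(log_od) - window + 1)]
    return max(rates), float(np.max(od))   # (per-hour max growth rate, max OD)

# Hypothetical logistic time course sampled every 30 min for 12 h
t = np.arange(0, 12, 0.5)
od = 0.01 * np.exp(0.6 * t) / (1 + (0.01 / 1.2) * (np.exp(0.6 * t) - 1))
print(growth_parameters(t, od))
```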

  16. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large number of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  17. High-throughput imaging-based nephrotoxicity prediction for xenobiotics with diverse chemical structures.

    Science.gov (United States)

    Su, Ran; Xiong, Sijing; Zink, Daniele; Loo, Lit-Hsin

    2016-11-01

    The kidney is a major target for xenobiotics, which include drugs, industrial chemicals, environmental toxicants and other compounds. Accurate methods for screening large numbers of potentially nephrotoxic xenobiotics with diverse chemical structures are currently not available. Here, we describe an approach for nephrotoxicity prediction that combines high-throughput imaging of cultured human renal proximal tubular cells (PTCs), quantitative phenotypic profiling, and machine learning methods. We automatically quantified 129 image-based phenotypic features, and identified chromatin and cytoskeletal features that can predict the human in vivo PTC toxicity of 44 reference compounds with ~82 % (primary PTCs) or 89 % (immortalized PTCs) test balanced accuracies. Surprisingly, our results also revealed that a DNA damage response is commonly induced by different PTC toxicants that have diverse chemical structures and injury mechanisms. Together, our results show that human nephrotoxicity can be predicted with high efficiency and accuracy by combining cell-based and computational methods that are suitable for automation.
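
    As an illustration of the prediction step, the hedged sketch below trains a generic classifier on image-derived features and scores it with balanced accuracy, the metric quoted above; it is not the authors' pipeline, and the feature matrix and labels are random placeholders.

```python
# A hedged sketch, not the authors' pipeline: train a generic classifier on
# image-derived phenotypic features and report balanced accuracy, the metric
# quoted above. The 44 x 129 feature matrix and toxicity labels are random
# placeholders, not real data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(44, 129))       # 44 reference compounds x 129 features
y = rng.integers(0, 2, size=44)      # 1 = nephrotoxic in vivo, 0 = non-toxic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("balanced accuracy:", balanced_accuracy_score(y_te, clf.predict(X_te)))
```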

  18. Library Design-Facilitated High-Throughput Sequencing of Synthetic Peptide Libraries.

    Science.gov (United States)

    Vinogradov, Alexander A; Gates, Zachary P; Zhang, Chi; Quartararo, Anthony J; Halloran, Kathryn H; Pentelute, Bradley L

    2017-11-13

    A methodology to achieve high-throughput de novo sequencing of synthetic peptide mixtures is reported. The approach leverages shotgun nanoliquid chromatography coupled with tandem mass spectrometry-based de novo sequencing of library mixtures (up to 2000 peptides) as well as automated data analysis protocols to filter away incorrect assignments, noise, and synthetic side-products. For increasing the confidence in the sequencing results, mass spectrometry-friendly library designs were developed that enabled unambiguous decoding of up to 600 peptide sequences per hour while maintaining greater than 85% sequence identification rates in most cases. The reliability of the reported decoding strategy was additionally confirmed by matching fragmentation spectra for select authentic peptides identified from library sequencing samples. The methods reported here are directly applicable to screening techniques that yield mixtures of active compounds, including particle sorting of one-bead one-compound libraries and affinity enrichment of synthetic library mixtures performed in solution.

  19. High-Throughput Spheroid Screens Using Volume, Resazurin Reduction, and Acid Phosphatase Activity.

    Science.gov (United States)

    Ivanov, Delyan P; Grabowska, Anna M; Garnett, Martin C

    2017-01-01

    Mainstream adoption of physiologically relevant three-dimensional models has been slow in the last 50 years due to long, manual protocols with poor reproducibility, high price, and closed commercial platforms. This chapter describes high-throughput, low-cost, open methods for spheroid viability assessment which use readily available reagents and open-source software to analyze spheroid volume, metabolism, and enzymatic activity. We provide two ImageJ macros for automated spheroid size determination, for both single images and images in stacks. We also share an Excel template spreadsheet allowing users to rapidly process spheroid size data, analyze plate uniformity (such as edge effects and systematic seeding errors), detect outliers, and calculate dose-response. The methods would be useful to researchers in preclinical and translational research planning to move away from simplistic monolayer studies and explore 3D spheroid screens for drug safety and efficacy without substantial investment in money or time.
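
    As an illustration of two of the calculations mentioned above, the sketch below converts a measured spheroid diameter to a volume (assuming a spherical geometry) and fits a four-parameter logistic dose-response curve to a viability readout; it is not the chapter's ImageJ or Excel implementation, and all numbers are hypothetical.

```python
# A minimal sketch, not the chapter's ImageJ/Excel implementation: convert a
# measured spheroid diameter to a volume (assuming spherical geometry) and fit
# a four-parameter logistic dose-response curve to a viability readout.
# All numbers below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def spheroid_volume(diameter_um):
    return (np.pi / 6.0) * np.asarray(diameter_um, dtype=float) ** 3

def four_pl(log_conc, bottom, top, log_ic50, hill):
    return bottom + (top - bottom) / (1.0 + 10 ** (hill * (log_conc - log_ic50)))

log_conc = np.log10([0.01, 0.1, 1, 10, 100])          # uM, hypothetical doses
viability = np.array([98.0, 95.0, 70.0, 30.0, 8.0])   # % of untreated control
params, _ = curve_fit(four_pl, log_conc, viability, p0=[5.0, 100.0, 0.0, 1.0])

print("volumes (um^3):", spheroid_volume([400, 380, 350]))
print("fitted IC50 (uM):", 10 ** params[2])
```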

  20. High Throughput WAN Data Transfer with Hadoop-based Storage

    International Nuclear Information System (INIS)

    Amin, A; Thomas, M; Bockelman, B; Letts, J; Martin, T; Pi, H; Sfiligoi, I; Wüerthwein, F; Levshina, T

    2011-01-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  1. Application of high-throughput DNA sequencing in phytopathology.

    Science.gov (United States)

    Studholme, David J; Glover, Rachel H; Boonham, Neil

    2011-01-01

    The new sequencing technologies are already making a big impact in academic research on medically important microbes and may soon revolutionize diagnostics, epidemiology, and infection control. Plant pathology also stands to gain from exploiting these opportunities. This manuscript reviews some applications of these high-throughput sequencing methods that are relevant to phytopathology, with emphasis on the associated computational and bioinformatics challenges and their solutions. Second-generation sequencing technologies have recently been exploited in genomics of both prokaryotic and eukaryotic plant pathogens. They are also proving to be useful in diagnostics, especially with respect to viruses. Copyright © 2011 by Annual Reviews. All rights reserved.

  2. SSFinder: High Throughput CRISPR-Cas Target Sites Prediction Tool

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Upadhyay

    2014-01-01

    Full Text Available The clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein (Cas) system facilitates targeted genome editing in organisms. Despite the high demand for this system, finding a reliable tool for the determination of specific target sites in large genomic data has remained challenging. Here, we report SSFinder, a Python script that performs high-throughput detection of specific target sites in large nucleotide datasets. SSFinder is a user-friendly tool, compatible with Windows, Mac OS, and Linux operating systems, and freely available online.
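
    SSFinder's exact implementation and output format are not described in this record; the sketch below only illustrates the underlying search, locating 20-nt protospacers followed by an NGG PAM (SpCas9) on the forward strand of a hypothetical sequence.

```python
# A minimal sketch of the underlying search, not SSFinder's implementation or
# output format: locate 20-nt protospacers followed by an NGG PAM (SpCas9) on
# the forward strand. The demo sequence is hypothetical.
import re

def find_cas9_sites(sequence):
    """Yield (start, protospacer, PAM) for every N20-NGG match, allowing overlaps."""
    sequence = sequence.upper()
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", sequence):
        yield m.start(), m.group(1), m.group(2)

demo = "ATGCTGACCTTGGACTACGATCGTACGTAGCTAGGCTAGCTAGGAT"
for start, protospacer, pam in find_cas9_sites(demo):
    print(start, protospacer, pam)
```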

  3. Quack: A quality assurance tool for high throughput sequence data.

    Science.gov (United States)

    Thrash, Adam; Arick, Mark; Peterson, Daniel G

    2018-05-01

    The quality of data generated by high-throughput DNA sequencing tools must be rapidly assessed in order to determine how useful the data may be in making biological discoveries; higher quality data leads to more confident results and conclusions. Due to the ever-increasing size of data sets and the importance of rapid quality assessment, tools that analyze sequencing data should quickly produce easily interpretable graphics. Quack addresses these issues by generating information-dense visualizations from FASTQ files at a speed far surpassing other publicly available quality assurance tools in a manner independent of sequencing technology. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
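
    Quack's internals are not described here; as an illustration of one statistic that such quality-assurance tools report, the sketch below computes the mean Phred quality of each read in a FASTQ file, assuming the standard offset-33 encoding. The file name is a placeholder.

```python
# A minimal sketch of one statistic such QA tools report, not Quack itself:
# the mean Phred quality of each read in a FASTQ file, assuming the standard
# Sanger/Illumina 1.8+ encoding (ASCII offset 33). The file name is a placeholder.
def mean_read_qualities(fastq_path):
    means = []
    with open(fastq_path) as handle:
        for i, line in enumerate(handle):
            if i % 4 == 3:                     # every fourth line holds the qualities
                quals = [ord(c) - 33 for c in line.strip()]
                if quals:
                    means.append(sum(quals) / len(quals))
    return means

# Example (hypothetical input file): print(mean_read_qualities("reads.fastq"))
```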

  4. High Throughput WAN Data Transfer with Hadoop-based Storage

    Science.gov (United States)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  5. High throughput method for determination of caffeine in coffee drinks

    OpenAIRE

    Mihalčíková, Lýdia

    2016-01-01

    Charles University in Prague Faculty of Pharmacy in Hradec Králové Department of Analytical Chemistry Candidate: Lýdia Mihalčíková Supervisor: Warunya Boonjob, Ph.D. Consultant: Doc. PharmDr. Hana Sklenářová, Ph.D. Work title: High throughput method for determination of caffeine in coffee drinks. Caffeine is a xanthine alkaloid that acts as a stimulant of the heart and central nervous system. Quantification of caffeine in coffee drinks is important for showing how much caffeine is in each cup whi...

  6. REVIEW: Optical logic elements for high-throughput optical processors

    Science.gov (United States)

    Fedorov, V. B.

    1990-12-01

    An analysis is made of the current state and problems as well as prospects of the development of optical logic elements and threshold light amplifiers for high-throughput computing. An analysis is made of the specific case of a variant of an optical processor capable of 10^13-10^14 arithmetic operations per second under conditions of pipelined processing of two-dimensional arrays of multidigit binary operands. The basic requirements which must be satisfied by parameters and characteristics of optical logic elements in such a processor are identified.

  7. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  8. Bifrost: Stream processing framework for high-throughput applications

    Science.gov (United States)

    Barsdell, Ben; Price, Daniel; Cranmer, Miles; Garsden, Hugh; Dowell, Jayce

    2017-11-01

    Bifrost is a stream processing framework that eases the development of high-throughput processing CPU/GPU pipelines. It is designed for digital signal processing (DSP) applications within radio astronomy. Bifrost uses a flexible ring buffer implementation that allows different signal processing blocks to be connected to form a pipeline. Each block may be assigned to a CPU core, and the ring buffers are used to transport data to and from blocks. Processing blocks may be run on either the CPU or GPU, and the ring buffer will take care of memory copies between the CPU and GPU spaces.

  9. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  10. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
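
    The report's algorithm is not specified here; the sketch below illustrates one simple similarity-based filter consistent with the description above, retaining an incoming point only if its Euclidean distance to everything already kept exceeds a threshold. The stream is simulated.

```python
# A minimal sketch consistent with the description above, not the report's
# algorithm: keep an incoming observation only if its Euclidean distance to
# every point already retained exceeds a threshold. The stream is simulated.
import numpy as np

def adaptive_sample(stream, min_distance):
    kept = []
    for x in stream:
        x = np.asarray(x, dtype=float)
        if not kept or min(np.linalg.norm(x - k) for k in kept) >= min_distance:
            kept.append(x)
    return kept

rng = np.random.default_rng(1)
incoming = rng.normal(size=(1000, 3))          # hypothetical fast data stream
print(len(adaptive_sample(incoming, min_distance=1.5)), "of 1000 points retained")
```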

  11. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation.

    Science.gov (United States)

    Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid

    2013-08-09

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening.

  12. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation

    Science.gov (United States)

    2013-01-01

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening. PMID:23938087

  13. High-throughput full-automatic synchrotron-based tomographic microscopy

    International Nuclear Information System (INIS)

    Mader, Kevin; Marone, Federica; Hintermueller, Christoph; Mikuljan, Gordan; Isenegger, Andreas; Stampanoni, Marco

    2011-01-01

    At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the Swiss Light Source with an energy range of 8-45 keV and voxel size from 0.37 μm to 7.4 μm, full tomographic datasets are typically acquired in 5 to 10 min. To exploit the speed of the system and enable high-throughput studies to be performed in a fully automatic manner, a package of automation tools has been developed. The samples are automatically exchanged, aligned, moved to the correct region of interest, and scanned. This task is accomplished through the coordination of Python scripts, a robot-based sample-exchange system, sample positioning motors and a CCD camera. The tools are suited for any samples that can be mounted on a standard SEM stub, and require no specific environmental conditions. Up to 60 samples can be analyzed at a time without user intervention. The throughput of the system is dependent on resolution, energy and sample size, but rates of four samples per hour have been achieved with 0.74 μm voxel size at 17.5 keV. The maximum intervention-free scanning time is theoretically unlimited, and in practice experiments have been running unattended as long as 53 h (the average beam time allocation at TOMCAT is 48 h per user). The system is the first fully automated high-throughput tomography station: mounting samples, finding regions of interest, scanning and reconstructing can be performed without user intervention. The system also includes many features which accelerate and simplify the process of tomographic microscopy.

  14. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole organism studies of Caenorhabditis elegans; mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all of these phenotypes and may be limited by expense and limited throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  15. Comparative analysis of different transformed Saccharomyces cerevisiae strains based on high-throughput Fourier transform infrared spectroscopy.

    Science.gov (United States)

    Sampaio, Pedro N Sousa; Calado, Cecília R Cruz

    2017-10-20

    This study applies Fourier transform mid-infrared (FT-MIR) spectroscopy, combined with high-throughput technology, to study the biochemical fingerprints of different Saccharomyces cerevisiae strains transformed with the same expression system along similar bioreactor cultivations. The phenotype, as well as the cellular metabolism and recombinant cyprosin biosynthesis, were determined. The differences observed were confirmed by a conventional cyprosin activity protocol, and the metabolic evolution was analyzed by high-performance liquid chromatography. Spectral analysis based on chemometric tools, such as principal component analysis, is a useful methodology for characterizing phenotypes as well as specific metabolic states along the cultivations, according to the clusters created. Spectral band ratios also proved a useful tool to evaluate the metabolic and biochemical differences between both expression systems, providing an additional parameter for the biomolecular comparison. Therefore, high-throughput FT-MIR spectroscopy associated with multivariate data analysis represents a valuable strategy for extracting specific biomolecular information along the cultivation and providing a complete bioprocess analysis, since it detects slight molecular changes, which will be useful for screening and process optimization in the biotechnological or pharmaceutical industries. Copyright © 2017 Elsevier B.V. All rights reserved.
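
    As an illustration of the chemometric step described above, the hedged sketch below projects simulated spectra from two hypothetical strains onto their first two principal components, the kind of analysis in which distinct phenotypes or metabolic states appear as separate clusters; it is not the study's actual workflow or data.

```python
# A hedged sketch of the chemometric step, not the study's workflow or data:
# project simulated spectra from two hypothetical strains onto their first two
# principal components, where distinct phenotypes should form separate clusters.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
wavenumbers = np.linspace(4000, 600, 900)
strain_a = rng.normal(0, 0.01, (10, 900)) + np.exp(-((wavenumbers - 1650) / 80) ** 2)
strain_b = rng.normal(0, 0.01, (10, 900)) + np.exp(-((wavenumbers - 1550) / 80) ** 2)
spectra = np.vstack([strain_a, strain_b])      # 20 simulated FT-MIR spectra

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
print("strain A centre:", scores[:10].mean(axis=0))
print("strain B centre:", scores[10:].mean(axis=0))
```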

  16. High-throughput methods for characterizing the mechanical properties of coatings

    Science.gov (United States)

    Siripirom, Chavanin

    The characterization of mechanical properties in a combinatorial and high-throughput workflow has been a bottleneck that reduced the speed of the materials development process. High-throughput characterization of the mechanical properties was applied in this research in order to reduce the amount of sample handling and to accelerate the output. A puncture tester was designed and built to evaluate the toughness of materials using an innovative template design coupled with automation. The test is in the form of a circular free-film indentation. A single template contains 12 samples which are tested in a rapid serial approach. Next, the operational principles of a novel parallel dynamic mechanical-thermal analysis instrument were analyzed in detail for potential sources of errors. The test uses a model of a circular bilayer fixed-edge plate deformation. A total of 96 samples can be analyzed simultaneously which provides a tremendous increase in efficiency compared with a conventional dynamic test. The modulus values determined by the system had considerable variation. The errors were observed and improvements to the system were made. A finite element analysis was used to analyze the accuracy given by the closed-form solution with respect to testing geometries, such as thicknesses of the samples. A good control of the thickness of the sample was proven to be crucial to the accuracy and precision of the output. Then, the attempt to correlate the high-throughput experiments and conventional coating testing methods was made. Automated nanoindentation in dynamic mode was found to provide information on the near-surface modulus and could potentially correlate with the pendulum hardness test using the loss tangent component. Lastly, surface characterization of stratified siloxane-polyurethane coatings was carried out with X-ray photoelectron spectroscopy, Rutherford backscattering spectroscopy, transmission electron microscopy, and nanoindentation. The siloxane component

  17. Fusion genes and their discovery using high throughput sequencing.

    Science.gov (United States)

    Annala, M J; Parker, B C; Zhang, W; Nykter, M

    2013-11-01

    Fusion genes are hybrid genes that combine parts of two or more original genes. They can form as a result of chromosomal rearrangements or abnormal transcription, and have been shown to act as drivers of malignant transformation and progression in many human cancers. The biological significance of fusion genes together with their specificity to cancer cells has made them into excellent targets for molecular therapy. Fusion genes are also used as diagnostic and prognostic markers to confirm cancer diagnosis and monitor response to molecular therapies. High-throughput sequencing has enabled the systematic discovery of fusion genes in a wide variety of cancer types. In this review, we describe the history of fusion genes in cancer and the ways in which fusion genes form and affect cellular function. We also describe computational methodologies for detecting fusion genes from high-throughput sequencing experiments, and the most common sources of error that lead to false discovery of fusion genes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. High-Throughput Mechanobiology Screening Platform Using Micro- and Nanotopography.

    Science.gov (United States)

    Hu, Junqiang; Gondarenko, Alexander A; Dang, Alex P; Bashour, Keenan T; O'Connor, Roddy S; Lee, Sunwoo; Liapis, Anastasia; Ghassemi, Saba; Milone, Michael C; Sheetz, Michael P; Dustin, Michael L; Kam, Lance C; Hone, James C

    2016-04-13

    We herein demonstrate the first 96-well plate platform to screen effects of micro- and nanotopographies on cell growth and proliferation. Existing high-throughput platforms test a limited number of factors and are not fully compatible with multiple types of testing and assays. This platform is compatible with high-throughput liquid handling, high-resolution imaging, and all multiwell plate-based instrumentation. We use the platform to screen for topographies and drug-topography combinations that have short- and long-term effects on T cell activation and proliferation. We coated nanofabricated "trench-grid" surfaces with anti-CD3 and anti-CD28 antibodies to activate T cells and assayed for interleukin 2 (IL-2) cytokine production. IL-2 secretion was enhanced at 200 nm trench width and >2.3 μm grating pitch; however, the secretion was suppressed at 100 nm width and on grid trenches, and this effect was further amplified with the addition of blebbistatin to reduce contractility. The 200 nm grid pattern was found to triple the number of T cells in long-term expansion, a result with direct clinical applicability in adoptive immunotherapy.

  19. High-throughput screening with micro-x-ray fluorescence

    International Nuclear Information System (INIS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-01-01

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity

  20. Advances, practice, and clinical perspectives in high-throughput sequencing.

    Science.gov (United States)

    Park, S-J; Saito-Adachi, M; Komiyama, Y; Nakai, K

    2016-07-01

    Remarkable advances in high-throughput sequencing technologies have fundamentally changed our understanding of the genetic and epigenetic molecular bases underlying human health and diseases. As these technologies continue to revolutionize molecular biology leading to fresh perspectives, it is imperative to thoroughly consider the enormous excitement surrounding the technologies by highlighting the characteristics of platforms and their global trends as well as potential benefits and limitations. To date, with a variety of platforms, the technologies provide an impressive range of applications, including sequencing of whole genomes and transcriptomes, identifying of genome modifications, and profiling of protein interactions. Because these applications produce a flood of data, simultaneous development of bioinformatics tools is required to efficiently deal with the big data and to comprehensively analyze them. This review covers the major achievements and performances of the high-throughput sequencing and further summarizes the characteristics of their applications along with introducing applicable bioinformatics tools. Moreover, a step-by-step procedure for a practical transcriptome analysis is described employing an analytical pipeline. Clinical perspectives with special consideration to human oral health and diseases are also covered. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
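
    FociQuant itself is built around ImageJ and is not reproduced here; the sketch below only illustrates the generic quantitation idea, thresholding a synthetic fluorescence image, labelling connected foci and reporting each focus's integrated intensity.

```python
# A minimal sketch of generic foci quantitation (FociQuant itself is an
# ImageJ-based tool): threshold a synthetic fluorescence image, label connected
# foci and report the integrated intensity of each.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.normal(100.0, 5.0, (64, 64))       # background
image[20:24, 30:34] += 400.0                   # one bright focus
image[45:48, 10:13] += 250.0                   # a dimmer focus

mask = image > image.mean() + 5.0 * image.std()
labels, n_foci = ndimage.label(mask)
intensities = ndimage.sum(image, labels, index=np.arange(1, n_foci + 1))
print(n_foci, "foci, integrated intensities:", intensities)
```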

  2. High-throughput technology for novel SO2 oxidation catalysts

    Science.gov (United States)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F.

    2011-10-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations.

  3. High-throughput technology for novel SO2 oxidation catalysts

    Directory of Open Access Journals (Sweden)

    Jonas Loskyll, Klaus Stoewe and Wilhelm F Maier

    2011-01-01

    Full Text Available We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations.

  4. Streamlining plant sample preparation: the use of high-throughput robotics to process echinacea samples for biomarker profiling by MALDI-TOF mass spectrometry.

    Science.gov (United States)

    Greene, Leasa A; Isaac, Issa; Gray, Dean E; Schwartz, Sarah A

    2007-09-01

    Several species in the genus Echinacea are beneficial herbs popularly used for many ailments. The most popular Echinacea species for cultivation, wild collection, and herbal products include E. purpurea (L.) Moench, E. pallida (Nutt.) Nutt., and E. angustifolia (DC). Product adulteration is a key concern for the natural products industry, where botanical misidentification and introduction of other botanical and nonbotanical contaminants exist throughout the formulation and production process. Therefore, rapid and cost-effective methods that can be used to monitor these materials for complex product purity and consistency are of benefit to consumers and producers. The objective of this continuing research was to develop automated, high-throughput processing methods that, teamed with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) analysis, differentiate Echinacea species by their mass profiles. Small molecules, peptide, and proteins from aerial parts (leaf/stem/flowers), seeds, and roots from E. purpurea and E. angustifolia; seeds and roots from E. pallida; and off-the-shelf Echinacea supplements were extracted and analyzed by MS using methods developed on the ProPrep liquid handling system (Genomic Solutions). Analysis of these samples highlighted key MS signal patterns from both small molecules and proteins that characterized the individual Echinacea materials analyzed. Based on analysis of pure Echinacea samples, off-the-shelf products containing Echinacea could then be evaluated in a streamlined process. Corresponding analysis of dietary supplements was used to monitor for product composition, including Echinacea species and plant materials used. These results highlight the potential for streamlined, automated approaches for agricultural species differentiation and botanical product evaluation.

  5. A method for high throughput bioelectrochemical research based on small scale microbial electrolysis cells

    KAUST Repository

    Call, Douglas F.

    2011-07-01

    There is great interest in studying exoelectrogenic microorganisms, but existing methods can require expensive electrochemical equipment and specialized reactors. We developed a simple system for conducting high throughput bioelectrochemical research using multiple inexpensive microbial electrolysis cells (MECs) built with commercially available materials and operated using a single power source. MECs were small crimp top serum bottles (5 mL) with a graphite plate anode (92 m²/m³) and a cathode of stainless steel (SS) mesh (86 m²/m³), graphite plate, SS wire, or platinum wire. The highest volumetric current density (240 A/m³, applied potential of 0.7 V) was obtained using a SS mesh cathode and a wastewater inoculum (acetate electron donor). Parallel operated MECs (single power source) did not lead to differences in performance compared to non-parallel operated MECs, which can allow for high throughput reactor operation (>1000 reactors) using a single power supply. The utility of this method for cultivating exoelectrogenic microorganisms was demonstrated through comparison of buffer effects on pure (Geobacter sulfurreducens and Geobacter metallireducens) and mixed cultures. Mixed cultures produced current densities equal to or higher than pure cultures in the different media, and current densities for all cultures were higher using a 50 mM phosphate buffer than a 30 mM bicarbonate buffer. Only the mixed culture was capable of sustained current generation with a 200 mM phosphate buffer. These results demonstrate the usefulness of this inexpensive method for conducting in-depth examinations of pure and mixed exoelectrogenic cultures. © 2011 Elsevier B.V.

  6. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high-throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed for AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  7. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis without the limitations of sample space and reagents used. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
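    The core calculation behind standard-curve relative quantification is a log-linear fit of Cq against input amount for each assay, followed by normalization to a reference gene. The sketch below illustrates that calculation only; it is not DAG Expression code, and all Cq values and dilution points are hypothetical.

    ```python
    # Sketch of standard-curve relative quantification with reference-gene normalization.
    # This is an illustration of the concept, not the DAG Expression implementation;
    # the dilution series and Cq values are hypothetical.
    import numpy as np

    def fit_standard_curve(log10_dilution, cq):
        """Fit Cq = slope * log10(quantity) + intercept for one assay."""
        slope, intercept = np.polyfit(log10_dilution, cq, 1)
        return slope, intercept

    def relative_quantity(cq, slope, intercept):
        """Interpolate a relative quantity from a sample Cq using the standard curve."""
        return 10 ** ((cq - intercept) / slope)

    # Hypothetical 4-point, 10-fold dilution series (log10 relative input vs. Cq).
    dilutions = np.array([0.0, -1.0, -2.0, -3.0])
    cq_target_std = np.array([20.1, 23.4, 26.8, 30.1])
    cq_ref_std = np.array([18.0, 21.3, 24.7, 28.0])

    s_t, i_t = fit_standard_curve(dilutions, cq_target_std)
    s_r, i_r = fit_standard_curve(dilutions, cq_ref_std)

    # One sample: normalize the target quantity by the reference-gene quantity.
    q_target = relative_quantity(24.5, s_t, i_t)
    q_ref = relative_quantity(22.0, s_r, i_r)
    print(f"normalized expression: {q_target / q_ref:.2f}")
    ```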

  8. Development and implementation of a high-throughput compound screening assay for targeting disrupted ER calcium homeostasis in Alzheimer's disease.

    Directory of Open Access Journals (Sweden)

    Kamran Honarnejad

    Full Text Available Disrupted intracellular calcium homeostasis is believed to occur early in the cascade of events leading to Alzheimer's disease (AD) pathology. Particularly familial AD mutations linked to Presenilins result in exaggerated agonist-evoked calcium release from endoplasmic reticulum (ER). Here we report the development of a fully automated high-throughput calcium imaging assay utilizing a genetically-encoded FRET-based calcium indicator at single cell resolution for compound screening. The established high-throughput screening assay offers several advantages over conventional high-throughput calcium imaging technologies. We employed this assay for drug discovery in AD by screening compound libraries consisting of over 20,000 small molecules followed by structure-activity-relationship analysis. This led to the identification of Bepridil, a calcium channel antagonist drug in addition to four further lead structures capable of normalizing the potentiated FAD-PS1-induced calcium release from ER. Interestingly, it has recently been reported that Bepridil can reduce Aβ production by lowering BACE1 activity. Indeed, we also detected lowered Aβ, increased sAPPα and decreased sAPPβ fragment levels upon Bepridil treatment. The latter findings suggest that Bepridil may provide a multifactorial therapeutic modality for AD by simultaneously addressing multiple aspects of the disease.
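    At the level of a single cell, a FRET-based calcium readout reduces to an acceptor/donor ratio trace and the evoked change over baseline. The sketch below shows that arithmetic on hypothetical intensity traces; it is not the published assay's analysis pipeline, and background correction and cell segmentation are assumed to have been done already.

    ```python
    # Illustrative per-cell FRET ratio analysis for a genetically encoded calcium
    # indicator (not the published screen's code). Intensities are hypothetical and
    # already background-corrected; the agonist is assumed to be added after frame 5.
    import numpy as np

    def fret_ratio(acceptor, donor):
        """Frame-by-frame acceptor/donor ratio; rises with cytosolic calcium for many sensors."""
        return np.asarray(acceptor, float) / np.asarray(donor, float)

    def evoked_amplitude(ratio, baseline_frames=5):
        """Peak change over baseline (delta R / R0) after agonist addition."""
        r0 = ratio[:baseline_frames].mean()
        return (ratio.max() - r0) / r0

    # Hypothetical single-cell traces (arbitrary units).
    acceptor = np.array([100, 101, 99, 100, 100, 140, 180, 170, 150, 130])
    donor    = np.array([200, 199, 201, 200, 200, 180, 160, 165, 175, 185])
    r = fret_ratio(acceptor, donor)
    print(f"delta R / R0 = {evoked_amplitude(r):.2f}")
    ```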

  9. Development and Implementation of a High-Throughput Compound Screening Assay for Targeting Disrupted ER Calcium Homeostasis in Alzheimer's Disease

    Science.gov (United States)

    Honarnejad, Kamran; Daschner, Alexander; Giese, Armin; Zall, Andrea; Schmidt, Boris; Szybinska, Aleksandra; Kuznicki, Jacek; Herms, Jochen

    2013-01-01

    Disrupted intracellular calcium homeostasis is believed to occur early in the cascade of events leading to Alzheimer's disease (AD) pathology. Particularly familial AD mutations linked to Presenilins result in exaggerated agonist-evoked calcium release from endoplasmic reticulum (ER). Here we report the development of a fully automated high-throughput calcium imaging assay utilizing a genetically-encoded FRET-based calcium indicator at single cell resolution for compound screening. The established high-throughput screening assay offers several advantages over conventional high-throughput calcium imaging technologies. We employed this assay for drug discovery in AD by screening compound libraries consisting of over 20,000 small molecules followed by structure-activity-relationship analysis. This led to the identification of Bepridil, a calcium channel antagonist drug in addition to four further lead structures capable of normalizing the potentiated FAD-PS1-induced calcium release from ER. Interestingly, it has recently been reported that Bepridil can reduce Aβ production by lowering BACE1 activity. Indeed, we also detected lowered Aβ, increased sAPPα and decreased sAPPβ fragment levels upon Bepridil treatment. The latter findings suggest that Bepridil may provide a multifactorial therapeutic modality for AD by simultaneously addressing multiple aspects of the disease. PMID:24260442

  10. Ethoscopes: An open platform for high-throughput ethomics.

    Science.gov (United States)

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J; French, Alice S; Jamasb, Arian R; Gilestro, Giorgio F

    2017-10-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  11. The Principals and Practice of Distributed High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity was recognized and formalized in the 1970s. For more than three decades these principles of Distributed Computing have guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor is based on novel distributed computing technologies and is driven by the evolving needs of High Throughput scientific applications. We will review the principles that underpin our work, the distributed computing frameworks and technologies we developed, and the lessons we learned from delivering effective and dependable software tools in an ever-changing landscape of computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds. About the speaker Miron Livny received a B.Sc. degree in Physics and Mat...

  12. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  13. An improved high throughput sequencing method for studying oomycete communities

    DEFF Research Database (Denmark)

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-01-01

    Culture-independent studies using next generation sequencing have revolutionized microbial ecology, however, oomycete ecology in soils is severely lagging behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomycete...... agricultural fields in Denmark, and 11 samples from carrot tissue with symptoms of Pythium infection. Sequence data from the Pythium and Phytophthora mock communities showed that our strategy successfully detected all included species. Taxonomic assignments of OTUs from 26 soil samples showed that 95...... the usefulness of the method not only in soil DNA but also in a plant DNA background. In conclusion, we demonstrate a successful approach for pyrosequencing of oomycete communities using ITS1 as the barcode sequence with well-known primers for oomycete DNA amplification....

  14. High-Throughput Screening Using Fourier-Transform Infrared Imaging

    Directory of Open Access Journals (Sweden)

    Erdem Sasmaz

    2015-06-01

    Full Text Available Efficient parallel screening of combinatorial libraries is one of the most challenging aspects of the high-throughput (HT) heterogeneous catalysis workflow. Today, a number of methods have been used in HT catalyst studies, including various optical, mass-spectrometry, and gas-chromatography techniques. Of these, rapid-scanning Fourier-transform infrared (FTIR) imaging is one of the fastest and most versatile screening techniques. Here, the new design of the 16-channel HT reactor is presented and test results for its accuracy and reproducibility are shown. The performance of the system was evaluated through the oxidation of CO over commercial Pd/Al2O3 and cobalt oxide nanoparticles synthesized with different reducer-reductant molar ratios, surfactant types, metal and surfactant concentrations, synthesis temperatures, and ramp rates.

  15. Applications of High-Throughput Nucleotide Sequencing (PhD)

    DEFF Research Database (Denmark)

    Waage, Johannes

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...... equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing......, focusing on oft encountered problems in data processing, such as quality assurance, mapping, normalization, visualization, and interpretation. Presented in the second part are scientific endeavors representing solutions to problems of two sub-genres of next generation sequencing. For the first flavor, RNA-sequencing...

  16. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  17. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
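    For a single cylindrical construct, the basic readout of such a compression test is an apparent equilibrium modulus, i.e. stress over strain. The sketch below shows that textbook calculation with hypothetical geometry and force readings; it is not the HTMS device's own calibration or analysis routine.

    ```python
    # Converting a force-displacement reading from one well into an apparent unconfined
    # compressive modulus. This is textbook mechanics, not the HTMS device's analysis;
    # sample geometry and readings are hypothetical.
    import math

    def apparent_modulus(force_n, displacement_m, thickness_m, diameter_m):
        """Equilibrium modulus E = stress / strain for a cylindrical construct."""
        area = math.pi * (diameter_m / 2) ** 2
        stress = force_n / area                  # Pa
        strain = displacement_m / thickness_m    # dimensionless
        return stress / strain

    # Hypothetical 4 mm diameter, 2 mm thick construct compressed by 0.2 mm at 0.05 N.
    E = apparent_modulus(force_n=0.05, displacement_m=0.2e-3,
                         thickness_m=2.0e-3, diameter_m=4.0e-3)
    print(f"apparent modulus: {E/1000:.1f} kPa")
    ```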

  18. High throughput phenotyping for aphid resistance in large plant collections

    Directory of Open Access Journals (Sweden)

    Chen Xi

    2012-08-01

    Full Text Available Abstract Background Phloem-feeding insects are among the most devastating pests worldwide. They not only cause damage by feeding from the phloem, thereby depleting the plant of photo-assimilates, but also by vectoring viruses. Until now, the main way to prevent such problems is the frequent use of insecticides. Applying resistant varieties would be a more environmentally friendly and sustainable solution. For this, resistant sources need to be identified first. Up to now there were no methods suitable for high throughput phenotyping of plant germplasm to identify sources of resistance towards phloem-feeding insects. Results In this paper we present a high throughput screening system to identify plants with an increased resistance against aphids. Its versatility is demonstrated using an Arabidopsis thaliana activation tag mutant line collection. This system consists of the green peach aphid Myzus persicae (Sulzer) and the circulative virus Turnip yellows virus (TuYV). In an initial screening, with one plant representing one mutant line, 13 virus-free mutant lines were identified by ELISA. Using seeds produced from these lines, the putative candidates were re-evaluated and characterized, resulting in nine lines with increased resistance towards the aphid. Conclusions This M. persicae-TuYV screening system is an efficient, reliable and quick procedure to identify among thousands of mutated lines those resistant to aphids. In our study, nine mutant lines with increased resistance against the aphid were selected among 5160 mutant lines in just 5 months by one person. The system can be extended to other phloem-feeding insects and circulative viruses to identify insect resistant sources from several collections, including for example genebanks and artificially prepared mutant collections.

  19. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are delivered forward the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative, reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the samples' motion inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain, and with intrinsically useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently of the magnification factor. The throughput of the imaging system is dramatically increased as STDH provides unlimited FoV and refocusable imaging of samples inside the liquid volume with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged and tracked in 3D, with high throughput, red blood cells moving inside the channel volume under non-ideal flow conditions.

  20. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
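    The trait-level parallelism described here can be illustrated with ordinary multiprocessing: each worker fits a toy marker-effect model for one trait, standing in for the per-trait jobs a scheduler such as HTCondor would dispatch across a cluster. The model, data and ridge penalty below are simulated placeholders, not the paper's models.

    ```python
    # Minimal sketch of trait-level parallelism for genomic prediction. On a real HTC
    # cluster a scheduler would dispatch these jobs; here Python's multiprocessing
    # stands in so the idea is runnable on one machine. All data are simulated.
    import numpy as np
    from multiprocessing import Pool

    def fit_one_trait(args):
        """Ridge-regression style marker-effect estimate for a single trait (toy model)."""
        genotypes, phenotypes, lam = args
        n_markers = genotypes.shape[1]
        # (X'X + lambda*I)^-1 X'y  -- a simplistic stand-in for a full genomic model
        xtx = genotypes.T @ genotypes + lam * np.eye(n_markers)
        return np.linalg.solve(xtx, genotypes.T @ phenotypes)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.integers(0, 3, size=(200, 500)).astype(float)   # 200 animals, 500 markers
        traits = [X @ rng.normal(size=500) + rng.normal(size=200) for _ in range(4)]

        with Pool(processes=4) as pool:                          # one worker per trait
            all_effects = pool.map(fit_one_trait, [(X, y, 10.0) for y in traits])
        print([e.shape for e in all_effects])
    ```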

  1. Efficient production of a gene mutant cell line through integrating TALENs and high-throughput cell cloning.

    Science.gov (United States)

    Sun, Changhong; Fan, Yu; Li, Juan; Wang, Gancheng; Zhang, Hanshuo; Xi, Jianzhong Jeff

    2015-02-01

    Transcription activator-like effectors (TALEs) are becoming powerful DNA-targeting tools in a variety of mammalian cells and model organisms. However, generating a stable cell line with specific gene mutations in a simple and rapid manner remains a challenging task. Here, we report a new method to efficiently produce monoclonal cells using integrated TALE nuclease technology and a series of high-throughput cell cloning approaches. Following this method, we obtained three mTOR mutant 293T cell lines within 2 months, which included one homozygous mutant line. © 2014 Society for Laboratory Automation and Screening.

  2. A high-throughput method for the detection of homoeologous gene deletions in hexaploid wheat

    Directory of Open Access Journals (Sweden)

    Li Zhongyi

    2010-11-01

    4500 M2 mutant wheat lines generated by heavy ion irradiation, we detected multiple mutants with deletions of each TaPFT1 homoeologue, and confirmed these deletions using a CAPS method. We have subsequently designed, optimized, and applied this method for the screening of homoeologous deletions of three additional wheat genes putatively involved in plant disease resistance. Conclusions We have developed a method for automated, high-throughput screening to identify deletions of individual homoeologues of a wheat gene. This method is also potentially applicable to other polyploidy plants.

  3. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    Directory of Open Access Journals (Sweden)

    Prasanth VP

    2006-08-01

    Full Text Available Abstract Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles; thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is being presently used for the capture of high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping

  4. Toward biotechnology in space: High-throughput instruments for in situ biological research beyond Earth.

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kianoosh; Pohorille, Andrew

    2017-11-15

    Space biotechnology is a nascent field aimed at applying tools of modern biology to advance our goals in space exploration. These advances rely on our ability to exploit in situ high throughput techniques for amplifying and sequencing DNA, and for measuring levels of RNA transcripts, proteins and metabolites in a cell. These techniques, collectively known as "omics" techniques, have already revolutionized terrestrial biology. A number of on-going efforts are aimed at developing instruments to carry out "omics" research in space, in particular on board the International Space Station and small satellites. For space applications these instruments require substantial and creative reengineering that includes automation, miniaturization and ensuring that the device is resistant to conditions in space and works independently of the direction of the gravity vector. Different paths taken to meet these requirements for different "omics" instruments are the subjects of this review. The advantages and disadvantages of these instruments and technological solutions and their level of readiness for deployment in space are discussed. Considering that effects of space environments on terrestrial organisms appear to be global, it is argued that high throughput instruments are essential to advance (1) biomedical and physiological studies to control and reduce space-related stressors on living systems, (2) application of biology to life support and in situ resource utilization, (3) planetary protection, and (4) basic research about the limits on life in space. It is also argued that carrying out measurements in situ provides considerable advantages over the traditional space biology paradigm that relies on post-flight data analysis. Published by Elsevier Inc.

  5. Introduction of a high throughput SPM for defect inspection and process control

    Science.gov (United States)

    Sadeghian, H.; Koster, N. B.; van den Dool, T. C.

    2013-04-01

    The main driver for Semiconductor and Bio-MEMS industries is decreasing the feature size, moving from the current state-of-the-art at 22 nm towards the 10 nm node. Consequently smaller defects and particles become problematic due to size and number, thus inspecting and characterizing them is very challenging. Existing industrial metrology and inspection methods cannot fulfil the requirements for these smaller features. Scanning Probe Microscopy (SPM) has the distinct advantage of being able to discern the atomic structure of the substrate. It can image the 3D topography, but also a variety of material, mechanical and chemical properties. Therefore SPM has been suggested as one of the technologies that can fulfil the future requirements in terms of resolution and accuracy, while being capable of resolving 3D features. However, the throughput of current state-of-the-art SPMs is extremely low compared to high-volume manufacturing requirements. This paper presents the development of an architecture [1] for a fully automated high throughput SPM, which can meet the requirements of future process metrology and inspection for 450 mm wafers. The targeted specifications of the concept are 1) inspecting more than 20 sites per wafer, 2) each site with dimensions of about 10 × 10 μm² (scalable to 100 × 100 μm²) and 3) with a throughput of more than 7 wafers per hour, or 70 wafers per hour with a coarse/fine scanning approach. The progress of the high throughput SPM development is discussed and the baseline design of the critical sub-modules and the research issues are presented.

  6. High-throughput viscosity measurement using capillary electrophoresis instrumentation and its application to protein formulation.

    Science.gov (United States)

    Allmendinger, Andrea; Dieu, Le-Ha; Fischer, Stefan; Mueller, Robert; Mahler, Hanns-Christian; Huwyler, Jörg

    2014-10-01

    Viscosity characterization of protein formulations is of utmost importance for the development of subcutaneously administered formulations. However, viscosity determinations are time-consuming and require large sample volumes in the range of hundreds of microliters to a few milliliters, depending on the method used. In this article, an automated, high-throughput method is described to determine dynamic viscosity of Newtonian fluids using standard capillary electrophoresis (CE) equipment. CE is an analytical method routinely used for the separation and characterization of proteins. In our set-up, the capillary is filled with the test sample, and a constant pressure is applied. A small aliquot of riboflavin is subsequently loaded into the capillary and used as a dye to monitor movement of protein samples. Migration time of the riboflavin peak moving through the filled capillary is converted to the viscosity by applying the Hagen-Poiseuille law. The instrument is operated without using an electrical field. Repeatability, robustness, linearity, and reproducibility were demonstrated for different capillary lots and instruments, as well as for different capillary lengths and diameters. Accuracy was verified by comparing the viscosity data obtained by CE instrumentation with those obtained by plate/cone rheometry. The suitability of the method for protein formulations was demonstrated, and limitations were discussed. Typical viscosities in the range of 5-40 mPa·s were reliably measured with this method. Advantages of the CE instrumentation-based method included short measurement times (1-15 min), small sample volumes (few microliters) for a capillary with a diameter of 50 μm and a length of 20.5 cm as well as potential to be suitable for high-throughput measurements. Copyright © 2014 Elsevier B.V. All rights reserved.
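    For laminar pressure-driven flow, the Hagen-Poiseuille relation links the marker migration time to the sample viscosity. The sketch below applies that relation under the simplifying assumptions that the applied pressure acts over the total capillary length while the riboflavin plug travels the distance to the detector; the pressure, detector position and migration time used are hypothetical, not the published calibration.

    ```python
    # Converting a marker migration time into dynamic viscosity via Hagen-Poiseuille flow.
    # A minimal sketch: it assumes laminar, pressure-driven flow in which the riboflavin
    # plug travels the distance L_detect while pressure acts over L_total. Numbers are
    # illustrative, not the published calibration.

    def viscosity_pa_s(delta_p_pa, radius_m, migration_time_s, l_total_m, l_detect_m):
        """eta = dP * r^2 * t / (8 * L_total * L_detect) for Poiseuille flow in a capillary."""
        return delta_p_pa * radius_m**2 * migration_time_s / (8.0 * l_total_m * l_detect_m)

    # 50 um i.d. (25 um radius), 20.5 cm capillary, hypothetical 5 kPa drive pressure,
    # hypothetical 600 s migration time to a detector placed at 12 cm.
    eta = viscosity_pa_s(delta_p_pa=5_000, radius_m=25e-6,
                         migration_time_s=600, l_total_m=0.205, l_detect_m=0.12)
    print(f"viscosity ~ {eta*1000:.1f} mPa*s")
    ```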

  7. Heterogeneity in macrophage phagocytosis of Staphylococcus aureus strains: high-throughput scanning cytometry-based analysis.

    Directory of Open Access Journals (Sweden)

    Glen M DeLoid

    2009-07-01

    Full Text Available Alveolar macrophages (AMs) can phagocytose unopsonized pathogens such as S. aureus via innate immune receptors, such as scavenger receptors (SRs). Cytoskeletal events and signaling pathways involved in phagocytosis of unopsonized bacteria likely govern the fate of ingested pathogens, but are poorly characterized. We have developed a high-throughput scanning cytometry-based assay to quantify phagocytosis of S. aureus by adherent human blood-derived AM-like macrophages in a 96-well microplate format. Differential fluorescent labeling of internalized vs. bound bacteria or beads allowed automated image analysis of collapsed confocal stack images acquired by scanning cytometry, and quantification of total particles bound and percent of particles internalized. We compared the effects of the classic SR blocker polyinosinic acid, the cytoskeletal inhibitors cytochalasin D and nocodazole, and the signaling inhibitors staurosporine, Gö 6976, JNK Inhibitor I and KN-93, on phagocytosis of a panel of live unopsonized S. aureus strains (Wood, Seattle 1945 (ATCC 25923), and RN6390), as well as a commercial killed Wood strain, heat-killed Wood strain and latex beads. Our results revealed failure of the SR inhibitor polyinosinic acid to block binding of any live S. aureus strains, suggesting that SR-mediated uptake of a commercial killed fluorescent bacterial particle does not accurately model interaction with viable bacteria. We also observed heterogeneity in the effects of cytoskeletal and signaling inhibitors on internalization of different S. aureus strains. The data suggest that uptake of unopsonized live S. aureus by human macrophages is not mediated by SRs, and that the cellular mechanical and signaling processes that mediate S. aureus phagocytosis vary. The findings also demonstrate the potential utility of high-throughput scanning cytometry techniques to study phagocytosis of S. aureus and other organisms in greater detail.
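    Once internalized and surface-bound particles carry different labels, the per-cell readouts reduce to counting. The toy sketch below shows that bookkeeping on hypothetical per-cell counts; it is not the published image-analysis pipeline.

    ```python
    # Differential labeling lets the image analysis separate internalized from
    # surface-bound bacteria; the per-well summary is then simple counting.
    # A toy sketch with hypothetical per-cell counts, not the published pipeline.
    cells = [
        {"internalized": 4, "bound": 1},
        {"internalized": 2, "bound": 3},
        {"internalized": 6, "bound": 0},
    ]

    total_particles = sum(c["internalized"] + c["bound"] for c in cells)
    total_internalized = sum(c["internalized"] for c in cells)

    particles_per_cell = total_particles / len(cells)             # particles associated per cell
    percent_internalized = 100.0 * total_internalized / total_particles
    print(f"particles/cell = {particles_per_cell:.1f}, internalized = {percent_internalized:.0f}%")
    ```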

  8. A family of E. coli expression vectors for laboratory scale and high throughput soluble protein production

    Directory of Open Access Journals (Sweden)

    Bottomley Stephen P

    2006-03-01

    Full Text Available Abstract Background In the past few years, both automated and manual high-throughput protein expression and purification has become an accessible means to rapidly screen and produce soluble proteins for structural and functional studies. However, many of the commercial vectors encoding different solubility tags require different cloning and purification steps for each vector, considerably slowing down expression screening. We have developed a set of E. coli expression vectors with different solubility tags that allow for parallel cloning from a single PCR product and can be purified using the same protocol. Results The set of E. coli expression vectors encode for either a hexa-histidine tag or the three most commonly used solubility tags (GST, MBP, NusA), all with an N-terminal hexa-histidine sequence. The result is two-fold: the His-tag facilitates purification by immobilised metal affinity chromatography, whilst the fusion domains act primarily as solubility aids during expression, in addition to providing an optional purification step. We have also incorporated a TEV recognition sequence following the solubility tag domain, which allows for highly specific cleavage (using TEV protease) of the fusion protein to yield native protein. These vectors are also designed for ligation-independent cloning and they possess a high-level expressing T7 promoter, which is suitable for auto-induction. To validate our vector system, we have cloned four different genes and also one gene into all four vectors and used small-scale expression and purification techniques. We demonstrate that the vectors are capable of high levels of expression and that efficient screening of new proteins can be readily achieved at the laboratory level. Conclusion The result is a set of four rationally designed vectors, which can be used for streamlined cloning, expression and purification of target proteins in the laboratory and have the potential for being adaptable to a high-throughput

  9. Fabrication of a hybrid plastic-silicon microfluidic device for high-throughput genotyping

    Science.gov (United States)

    Chartier, Isabelle; Sudor, J.; Fouillet, Yves; Sarrut, N.; Bory, C.; Gruss, A.

    2003-01-01

    The lab-on-a-chip approach has been increasingly present in biological research over the last ten years, high-throughput analyses being one of the most promising applications. The work presented here consisted of developing an automated genotyping system based on a continuous flow analysis which integrates all the steps of the genotyping process (PCR, purification and sequencing). The genotyping device consists of a disposable hybrid silicon-plastic microfluidic chip, equipped with a permanent external heating/cooling system, syringe-pump-based injection systems and on-line fluorescence detection. High throughput is obtained by performing the reaction in a continuous flow (1 reaction every 6 min per channel) and in parallel (48 channels). We present here the technical solutions developed to fabricate the hybrid silicon-plastic microfluidic device. It includes a polycarbonate substrate having 48 parallel grooves sealed by film lamination techniques to create the channels. Two different solutions for the sealing of the channels are compared in relation to their biocompatibility, fluidic behavior and fabrication process yield. The surface roughness of the channels is the key point of this step. Silicon fluidic chips are used for thermo-cycled reactions. A specific bonding technique has been developed to bond the silicon chips onto the plastic part, which ensures alignment and a hermetic fluidic connection. Surface coatings are studied to enhance the PCR biocompatibility and the fluidic behavior of the two-phase liquid flow. We then demonstrate continuous operation of the component over more than 20 hours and validate the PCR protocol on microliter samples in a continuous flow reaction.

  10. Model-based high-throughput design of ion exchange protein chromatography.

    Science.gov (United States)

    Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo

    2016-08-12

    This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine an operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate that it has the ability to strongly accelerate the early phases of process

  11. Integrating high-throughput pyrosequencing and quantitative real-time PCR to analyze complex microbial communities.

    Science.gov (United States)

    Zhang, Husen; Parameswaran, Prathap; Badalamenti, Jonathan; Rittmann, Bruce E; Krajmalnik-Brown, Rosa

    2011-01-01

    New high-throughput technologies continue to emerge for studying complex microbial communities. In particular, massively parallel pyrosequencing enables very high numbers of sequences, providing a more complete view of community structures and a more accurate inference of the functions than has been possible just a few years ago. In parallel, quantitative real-time PCR (QPCR) allows quantitative monitoring of specific community members over time, space, or different environmental conditions. In this review, we discuss the principles of these two methods and their complementary applications in studying microbial ecology in bioenvironmental systems. We explain parallel sequencing of amplicon libraries and using bar codes to differentiate multiple samples in a pyrosequencing run. We also describe best procedures and chemistries for QPCR amplifications and address advantages of applying automation to increase accuracy. We provide three examples in which we used pyrosequencing and QPCR together to define and quantify members of microbial communities: in the human large intestine, in a methanogenic digester whose sludge was made more bioavailable by a high-voltage pretreatment, and on the biofilm anode of a microbial electrolytic cell. We highlight our key findings in these systems and how both methods were used in concert to achieve those findings. Finally, we supply detailed methods for generating PCR amplicon libraries for pyrosequencing, pyrosequencing data analysis, QPCR methodology, instrumentation, and automation.
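    Bar-code demultiplexing of a multiplexed amplicon run is conceptually a prefix lookup. The sketch below shows the idea with hypothetical bar codes and reads; real pipelines also handle primer trimming, bar-code mismatches and quality filtering.

    ```python
    # Sketch of bar-code demultiplexing for a multiplexed amplicon sequencing run.
    # Bar codes, sample names and reads are hypothetical placeholders.
    from collections import defaultdict

    barcodes = {"ACGT": "gut_sample_1", "TGCA": "digester_sludge", "GATC": "anode_biofilm"}

    reads = [
        "ACGTTTGGGCCAAAT",
        "TGCAGGCCTTAAGGC",
        "NNNNAAGGTTCCGGA",   # unassignable read
    ]

    by_sample = defaultdict(list)
    for read in reads:
        sample = barcodes.get(read[:4], "unassigned")   # match the 4-nt bar-code prefix
        by_sample[sample].append(read[4:])              # strip the bar code before analysis

    for sample, sample_reads in by_sample.items():
        print(sample, len(sample_reads))
    ```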

  12. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    Science.gov (United States)

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools for current biology. To obtain useful insights from the NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in sequencing fluidics. We developed the software SUGAR (subtile-based GUI-assisted refiner), which can handle ultra-high-throughput data through a user-friendly graphical user interface (GUI) with interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during the sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function based on sequence read quality (Phred) scores was applied to public whole human genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR would reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. Therefore, the software will be especially useful for controlling the quality of variant calls for low-population cells, e.g., cancers, in a sample with technical errors of sequencing procedures.
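    The kind of Phred-score-based cleaning that SUGAR automates can be reduced to a per-read average quality filter. The toy sketch below shows that filter on hypothetical reads (Phred+33 encoding assumed); it is not the tool's code.

    ```python
    # Toy sketch of read filtering by mean Phred quality (not SUGAR itself).
    # Quality strings are hypothetical and assumed to be Phred+33 encoded.
    def mean_phred(quality_string, offset=33):
        """Average Phred score of one read from its ASCII-encoded quality string."""
        return sum(ord(ch) - offset for ch in quality_string) / len(quality_string)

    reads = [
        ("read1", "ACGTACGTAC", "IIIIIIIIII"),   # 'I' = Q40
        ("read2", "ACGTACGTAC", "##########"),   # '#' = Q2, typical of a failed region
    ]

    kept = [(name, seq) for name, seq, qual in reads if mean_phred(qual) >= 20]
    print([name for name, _ in kept])            # -> ['read1']
    ```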

  13. High Throughput Screen for Novel Antimicrobials using a Whole Animal Infection Model

    Science.gov (United States)

    Moy, Terence I.; Conery, Annie L.; Larkins-Ford, Jonah; Wu, Gang; Mazitschek, Ralph; Casadei, Gabriele; Lewis, Kim; Carpenter, Anne E.; Ausubel, Frederick M.

    2009-01-01

    The nematode Caenorhabditis elegans is a unique whole animal model system for identifying small molecules with in vivo anti-infective properties. C. elegans can be infected with a broad range of human pathogens, including Enterococcus faecalis, an important human nosocomial pathogen with a mortality rate of up to 37% that is increasingly acquiring resistance to antibiotics. Here, we describe an automated, high throughput screen of 37,200 compounds and natural product extracts for those that enhance survival of C. elegans infected with E. faecalis. The screen uses a robot to accurately dispense live, infected animals into 384-well plates, and automated microscopy and image analysis to generate quantitative, high content data. We identified 28 compounds and extracts that were not previously reported to have antimicrobial properties, including 6 structural classes that cure infected C. elegans animals but do not affect the growth of the pathogen in vitro, thus acting by a mechanism of action distinct from antibiotics currently in clinical use. Our versatile and robust screening system can be easily adapted for other whole animal assays to probe a broad range of biological processes. PMID:19572548

  14. Multiple microfermentor battery: a versatile tool for use with automated parallel cultures of microorganisms producing recombinant proteins and for optimization of cultivation protocols.

    Science.gov (United States)

    Frachon, Emmanuel; Bondet, Vincent; Munier-Lehmann, Hélène; Bellalou, Jacques

    2006-08-01

    A multiple microfermentor battery was designed for high-throughput recombinant protein production in Escherichia coli. This novel system comprises eight aerated glass reactors with a working volume of 80 ml and a moving external optical sensor for measuring optical densities at 600 nm (OD600) ranging from 0.05 to 100 online. Each reactor can be fitted with miniature probes to monitor temperature, dissolved oxygen (DO), and pH. Independent temperature regulation for each vessel is obtained with heating/cooling Peltier devices. Data from pH, DO, and turbidity sensors are collected on a FieldPoint (National Instruments) I/O interface and are processed and recorded by a LabVIEW program on a personal computer, which enables feedback control of the culture parameters. A high-density medium formulation was designed, which enabled us to grow E. coli to OD600 up to 100 in batch cultures with oxygen-enriched aeration. Accordingly, the biomass and the amount of recombinant protein produced in a 70-ml culture were at least equivalent to the biomass and the amount of recombinant protein obtained in a Fernbach flask with 1 liter of conventional medium. Thus, the microfermentor battery appears to be well suited for automated parallel cultures and process optimization, such as that needed for structural genomics projects.
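    The LabVIEW layer described above closes feedback loops around each vessel's sensors and actuators. The sketch below shows the general shape of such a loop as a simple proportional temperature controller driving a Peltier element against a toy plant model; the gains, setpoint and plant constants are hypothetical and chosen only to make the loop runnable.

    ```python
    # Conceptual sketch of a per-vessel feedback loop (proportional temperature control
    # of a Peltier element). Gains, setpoint and the toy plant model are hypothetical;
    # this is not the system's LabVIEW control code.
    def proportional_controller(setpoint, measurement, gain=2.0, output_limits=(-1.0, 1.0)):
        """Return a bounded actuator command (-1 = full cooling, +1 = full heating)."""
        command = gain * (setpoint - measurement)
        return max(output_limits[0], min(output_limits[1], command))

    temperature = 25.0          # degrees C, starting value
    setpoint = 37.0
    for step in range(30):
        command = proportional_controller(setpoint, temperature)
        # Toy plant: temperature responds to the command and loses heat to the room.
        temperature += 0.8 * command - 0.02 * (temperature - 22.0)

    print(f"temperature after 30 steps: {temperature:.1f} C")
    ```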

  15. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized

  16. SNP-PHAGE – High throughput SNP discovery pipeline

    Directory of Open Access Journals (Sweden)

    Cregan Perry B

    2006-10-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) as defined here are single base sequence changes or short insertion/deletions between or within individuals of a given species. As a result of their abundance and the availability of high throughput analysis technologies, SNP markers have begun to replace other traditional markers such as restriction fragment length polymorphisms (RFLPs), amplified fragment length polymorphisms (AFLPs) and simple sequence repeats (SSRs) or microsatellite markers for fine mapping and association studies in several species. For SNP discovery from chromatogram data, several bioinformatics programs have to be combined to generate an analysis pipeline. Results have to be stored in a relational database to facilitate interrogation through queries or to generate data for further analyses such as determination of linkage disequilibrium and identification of common haplotypes. Although these tasks are routinely performed by several groups, an integrated open source SNP discovery pipeline that can be easily adapted by new groups interested in SNP marker development is currently unavailable. Results We developed SNP-PHAGE (SNP discovery Pipeline with additional features for identification of common haplotypes within a sequence tagged site (Haplotype Analysis) and GenBank (dbSNP) submissions). This tool was applied for analyzing sequence traces from diverse soybean genotypes to discover over 10,000 SNPs. This package was developed on a UNIX/Linux platform, written in Perl and uses a MySQL database. Scripts to generate a user-friendly web interface are also provided with common queries for preliminary data analysis. A machine learning tool developed by this group for increasing the efficiency of SNP discovery is integrated as a part of this package as an optional feature. The SNP-PHAGE package is being made available open source at http://bfgl.anri.barc.usda.gov/ML/snp-phage/. Conclusion SNP-PHAGE provides a bioinformatics

  17. BOOGIE: Predicting Blood Groups from High Throughput Sequencing Data.

    Science.gov (United States)

    Giollo, Manuel; Minervini, Giovanni; Scalzotto, Marta; Leonardi, Emanuela; Ferrari, Carlo; Tosatto, Silvio C E

    2015-01-01

    Over the last decade, we have witnessed an incredible growth in the amount of available genotype data due to high throughput sequencing (HTS) techniques. This information may be used to predict phenotypes of medical relevance, and pave the way towards personalized medicine. Blood phenotypes (e.g. ABO and Rh) are a purely genetic trait that has been extensively studied for decades, with currently over thirty known blood groups. Given the public availability of blood group data, it is of interest to predict these phenotypes from HTS data which may translate into more accurate blood typing in clinical practice. Here we propose BOOGIE, a fast predictor for the inference of blood groups from single nucleotide variant (SNV) databases. We focus on the prediction of thirty blood groups ranging from the well known ABO and Rh, to the less studied Junior or Diego. BOOGIE correctly predicted the blood group with 94% accuracy for the Personal Genome Project whole genome profiles where good quality SNV annotation was available. Additionally, our tool produces a high quality haplotype phase, which is of interest in the context of ethnicity-specific polymorphisms or traits. The versatility and simplicity of the analysis make it easily interpretable and allow easy extension of the protocol towards other phenotypes. BOOGIE can be downloaded from URL http://protein.bio.unipd.it/download/.

  18. Quantifying Nanoparticle Internalization Using a High Throughput Internalization Assay.

    Science.gov (United States)

    Mann, Sarah K; Czuba, Ewa; Selby, Laura I; Such, Georgina K; Johnston, Angus P R

    2016-10-01

    The internalization of nanoparticles into cells is critical for effective nanoparticle mediated drug delivery. To investigate the kinetics and mechanism of internalization of nanoparticles into cells we have developed a DNA molecular sensor, termed the Specific Hybridization Internalization Probe - SHIP. Self-assembling polymeric 'pHlexi' nanoparticles were functionalized with a Fluorescent Internalization Probe (FIP) and the interactions with two different cell lines (3T3 and CEM cells) were studied. The kinetics of internalization were quantified, and chemical inhibitors of energy-dependent endocytosis (sodium azide), dynamin-dependent endocytosis (Dyngo-4a) and macropinocytosis (5-(N-ethyl-N-isopropyl) amiloride (EIPA)) were used to study the mechanism of internalization. Nanoparticle internalization kinetics were significantly faster in 3T3 cells than CEM cells. We have shown that ~90% of the nanoparticles associated with 3T3 cells were internalized, compared to only 20% of the nanoparticles associated with CEM cells. Nanoparticle uptake was via a dynamin-dependent pathway, and the nanoparticles were trafficked to lysosomal compartments once internalized. SHIP is able to distinguish nanoparticles that are associated with the outer cell membrane from nanoparticles that are internalized. This study demonstrates the assay can be used to probe the kinetics of nanoparticle internalization and the mechanisms by which the nanoparticles are taken up by cells. This information is fundamental for engineering more effective nanoparticle delivery systems. The SHIP assay is a simple and high-throughput technique that could have wide application in therapeutic delivery research.

  19. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystem's SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
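    Filtering variant calls on coverage, allele fraction and quality, as described above, is essentially a set of thresholds applied per call. The sketch below applies such thresholds to hypothetical call records; it is not Alpheus itself, and the field names and cutoffs are illustrative.

    ```python
    # Toy sketch of threshold-based variant-call filtering (not Alpheus code).
    # Call records, field names and cutoffs are hypothetical.
    calls = [
        {"pos": 1201, "type": "SNP",   "coverage": 42, "alt_fraction": 0.48, "mean_qual": 35},
        {"pos": 3377, "type": "indel", "coverage": 6,  "alt_fraction": 0.20, "mean_qual": 18},
        {"pos": 9020, "type": "SNP",   "coverage": 55, "alt_fraction": 0.95, "mean_qual": 38},
    ]

    def passes(call, min_cov=10, min_alt_fraction=0.3, min_qual=20):
        """Keep calls with enough reads, a plausible allele fraction and decent base quality."""
        return (call["coverage"] >= min_cov
                and call["alt_fraction"] >= min_alt_fraction
                and call["mean_qual"] >= min_qual)

    filtered = [c for c in calls if passes(c)]
    print([c["pos"] for c in filtered])   # -> [1201, 9020]
    ```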

  20. High-Throughput DNA sequencing of ancient wood.

    Science.gov (United States)

    Wagner, Stefanie; Lagane, Frédéric; Seguin-Orlando, Andaine; Schubert, Mikkel; Leroy, Thibault; Guichoux, Erwan; Chancerel, Emilie; Bech-Hebelstrup, Inger; Bernard, Vincent; Billard, Cyrille; Billaud, Yves; Bolliger, Matthias; Croutsch, Christophe; Čufar, Katarina; Eynaud, Frédérique; Heussner, Karl Uwe; Köninger, Joachim; Langenegger, Fabien; Leroy, Frédéric; Lima, Christine; Martinelli, Nicoletta; Momber, Garry; Billamboz, André; Nelle, Oliver; Palomo, Antoni; Piqué, Raquel; Ramstein, Marianne; Schweichel, Roswitha; Stäuble, Harald; Tegel, Willy; Terradas, Xavier; Verdin, Florence; Plomion, Christophe; Kremer, Antoine; Orlando, Ludovic

    2018-03-01

    Reconstructing the colonization and demographic dynamics that gave rise to extant forests is essential for forecasting forest responses to environmental change. Classical approaches to mapping how populations of trees changed through space and time rely largely on pollen distribution patterns, and only a limited number of studies have exploited DNA molecules preserved in wooden archaeological and subfossil tree remains. Here, we advance such analyses by applying high-throughput DNA sequencing (HTS) to archaeological and subfossil wood for the first time, using a comprehensive sample of 167 European white oak waterlogged remains spanning a large temporal (from 550 to 9,800 years) and geographical range across Europe. The successful characterization of the endogenous DNA and exogenous microbial DNA of 140 (~83%) samples helped to identify environmental conditions favouring long-term DNA preservation in wood remains, and began to unveil the first trends in the DNA decay process in wood material. Additionally, the maternally inherited chloroplast haplotypes of 21 samples from three periods of human-induced forest use (Neolithic, Bronze Age and Middle Ages) were found to be consistent with those of modern populations growing in the same geographic areas. Our work paves the way for further studies aiming to use ancient DNA preserved in wood to reconstruct the micro-evolutionary response of trees to climate change and human forest management. © 2018 John Wiley & Sons Ltd.

  1. Probabilistic Methods for Processing High-Throughput Sequencing Signals

    DEFF Research Database (Denmark)

    Sørensen, Lasse Maretty

    High-throughput sequencing has the potential to answer many of the big questions in biology and medicine. It can be used to determine the ancestry of species, to chart complex ecosystems and to understand and diagnose disease. However, going from raw sequencing data to biological or medical insight... By estimating the genotypes on a set of candidate variants obtained from both a standard mapping-based approach and de novo assemblies, we are able to find considerably more structural variation than previous studies... for reconstructing transcript sequences from RNA sequencing data. The method is based on a novel sparse prior distribution over transcript abundances and is markedly more accurate than existing approaches. The second chapter describes a new method for calling genotypes from a fixed set of candidate variants... The method queries the reads using a graph representation of the variants and hereby mitigates the reference bias that characterises standard genotyping methods. In the last chapter, we apply this method to call the genotypes of 50 deeply sequenced parent-offspring trios from the GenomeDenmark project...

  2. Quantifying selection in high-throughput Immunoglobulin sequencing data sets.

    Science.gov (United States)

    Yaari, Gur; Uduman, Mohamed; Kleinstein, Steven H

    2012-09-01

    High-throughput immunoglobulin sequencing promises new insights into the somatic hypermutation and antigen-driven selection processes that underlie B-cell affinity maturation and adaptive immunity. The ability to estimate positive and negative selection from these sequence data has broad applications, not only for understanding the immune response to pathogens, but also for determining the role of somatic hypermutation in autoimmunity and B-cell cancers. Here, we develop a statistical framework for Bayesian estimation of Antigen-driven SELectIoN (BASELINe) based on the analysis of somatic mutation patterns. Our approach represents a fundamental advance over previous methods by shifting the problem from one of simply detecting selection to one of quantifying selection. Along with providing a more intuitive means to assess and visualize selection, our approach allows, for the first time, comparative analysis between groups of sequences derived from different germline V(D)J segments. Application of this approach to next-generation sequencing data demonstrates different selection pressures for memory cells of different isotypes. This framework can easily be adapted to analyze other types of DNA mutation patterns resulting from a mutator that displays hot/cold-spots, substitution preference or other intrinsic biases.
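
    The shift from detecting to quantifying selection can be pictured with a toy calculation: compare the observed number of replacement (amino-acid-changing) mutations against the number expected under no selection. The neutral expectation and the counts below are invented for illustration, and this is not the BASELINe implementation itself.

        # Toy selection score: binomial tail probability of seeing at least
        # `obs_replacement` replacement mutations out of `total_mutations`,
        # given an assumed neutral replacement fraction.
        from math import comb

        def binomial_tail(k, n, p):
            """P(X >= k) for X ~ Binomial(n, p)."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        neutral_replacement_frac = 0.75    # illustrative assumption
        total_mutations = 40
        obs_replacement = 36

        p_value = binomial_tail(obs_replacement, total_mutations, neutral_replacement_frac)
        print(f"P(>= {obs_replacement} replacements | neutral) = {p_value:.3f}")
        # A small tail probability points towards positive (antigen-driven) selection.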

  3. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and the total volume of data continue to increase. These data are then processed by researchers belonging to large scientific collaborations, the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with petabyte-scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking, to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
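
    A back-of-the-envelope sketch of the dimensioning argument follows; every number in it (slot count, per-job I/O rates, safety factor, per-server bandwidth) is an illustrative assumption rather than a figure from the paper.

        # Rough estimate of the aggregate storage bandwidth needed so that CPU
        # slots are not starved; all inputs are illustrative assumptions.
        cpu_slots = 4000                  # concurrent batch jobs
        read_rate_per_job_mb_s = 5.0      # average input rate a job consumes
        write_rate_per_job_mb_s = 1.0     # average output rate a job produces
        headroom = 1.5                    # safety factor for bursts and reprocessing

        required_bw_gb_s = cpu_slots * (read_rate_per_job_mb_s +
                                        write_rate_per_job_mb_s) * headroom / 1024
        print(f"Aggregate disk bandwidth needed: ~{required_bw_gb_s:.1f} GB/s")

        # If each disk server sustains ~1 GB/s, this sets a lower bound on the
        # number of servers and on the network capacity between storage and cluster.
        servers = -(-required_bw_gb_s // 1.0)    # ceiling division
        print(f"Minimum disk servers at 1 GB/s each: {int(servers)}")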

  4. Achieving High Throughput for Data Transfer over ATM Networks

    Science.gov (United States)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
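
    The idea of trading reliability for throughput while pacing the sender can be sketched in a few lines; the host, port, chunk size and target rate below are placeholder assumptions, and the snippet is a generic rate-paced UDP sender, not the protocol developed in the paper.

        # Minimal rate-paced UDP sender sketch (assumes a receiver is listening).
        import socket
        import time

        def send_paced(data: bytes, host="127.0.0.1", port=9000,
                       chunk_size=8192, target_mbit_s=100.0):
            """Send `data` in UDP datagrams, pacing to an approximate target rate."""
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            seconds_per_chunk = (chunk_size * 8) / (target_mbit_s * 1e6)
            for offset in range(0, len(data), chunk_size):
                sock.sendto(data[offset:offset + chunk_size], (host, port))
                time.sleep(seconds_per_chunk)    # crude pacing to keep losses bounded
            sock.close()

        if __name__ == "__main__":
            send_paced(b"x" * 1_000_000)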

  5. High-Throughput Discovery of Aptamers for Sandwich Assays.

    Science.gov (United States)

    Csordas, Andrew T; Jørgensen, Anna; Wang, Jinpeng; Gruber, Emily; Gong, Qiang; Bagley, Elizabeth R; Nakamoto, Margaret A; Eisenstein, Michael; Soh, H Tom

    2016-11-15

    Sandwich assays are among the most powerful tools in molecular detection. These assays use "pairs" of affinity reagents so that the detection signal is generated only when both reagents bind simultaneously to different sites on the target molecule, enabling highly sensitive and specific measurements in complex samples. Thus, the capability to efficiently screen affinity reagent pairs at a high throughput is critical. In this work, we describe an experimental strategy for screening "aptamer pairs" at a throughput of 10^6 aptamer pairs per hour, which is many orders of magnitude higher than the current state of the art. The key step in our process is the conversion of solution-phase aptamers into "aptamer particles" such that we can directly measure the simultaneous binding of multiple aptamers to a target protein based on fluorescence signals and sort individual particles harboring aptamer pairs via the fluorescence-activated cell-sorter instrument. As proof of principle, we successfully isolated a high-quality DNA aptamer pair for plasminogen activator inhibitor 1 (PAI-1). Within only two rounds of screening, we discovered DNA aptamer pairs with low-nanomolar sensitivity in dilute serum and excellent specificity with minimal off-target binding even to closely related proteins such as PAI-2.

  6. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires a non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow distributed applications to be built with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...

  7. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances on previous high-throughput screening methods for biomass recalcitrance have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have reduced plate-to-plate variation in control biomass recalcitrance and improved confidence in sugar-release differences between samples. With smaller errors, plant researchers can be more confident that low-recalcitrance candidates can be identified. Significant changes in the plate reactor, control biomass preparation, pretreatment conditions and enzyme have significantly reduced sample and control replicate variability. Reactor plate metallurgy strongly affects sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to the specific biomass type, i.e. woody vs. herbaceous. Desalting of enzyme preparations to remove low-molecular-weight stabilizers could further improve conversion levels, likely through the impact of water activity on enzyme structure and substrate interactions, but was not attempted here owing to the need to continually desalt and validate precise enzyme concentration and activity.

  8. High Throughput Heuristics for Prioritizing Human Exposure to ...

    Science.gov (United States)

    The risk posed to human health by any of the thousands of untested anthropogenic chemicals in our environment is a function of both the potential hazard presented by the chemical and the possibility of being exposed. Without the capacity to make quantitative, albeit uncertain, forecasts of exposure, the putative risk of adverse health effects from a chemical cannot be evaluated. We used Bayesian methodology to infer ranges of exposure intakes that are consistent with biomarkers of chemical exposure identified in urine samples from the U.S. population by the National Health and Nutrition Examination Survey (NHANES). We perform linear regression on inferred exposure for demographic subsets of NHANES demarcated by age, gender, and weight, using high-throughput chemical descriptors gleaned from databases and chemical-structure-based calculators. We find that five of these descriptors are capable of explaining roughly 50% of the variability across chemicals for all the demographic groups examined, including children aged 6-11. For the thousands of chemicals with no other source of information, this approach allows rapid and efficient prediction of average exposure intake of environmental chemicals. The methods described by this manuscript provide a highly improved methodology for high-throughput screening of human exposure to environmental chemicals. The manuscript includes a ranking of 7785 environmental chemicals with respect to potential human exposure, including most of the Tox21 in vit
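
    The regression step can be illustrated with ordinary least squares on a handful of made-up chemical descriptors; the descriptor columns, intake values and resulting coefficients below are invented for illustration and are not the NHANES-derived quantities used in the study.

        # Sketch: regress inferred exposure on chemical descriptors (numpy only).
        import numpy as np

        # rows: chemicals; columns: hypothetical descriptors (e.g. production
        # volume, use-category flags); all values are invented.
        X = np.array([[6.1, 1, 0],
                      [4.3, 0, 1],
                      [5.0, 1, 1],
                      [3.2, 0, 0],
                      [5.8, 1, 0]], dtype=float)
        y = np.array([-5.2, -7.1, -6.0, -8.3, -5.5])   # log10 inferred intake

        X1 = np.column_stack([np.ones(len(X)), X])      # add intercept
        coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
        pred = X1 @ coef
        r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
        print("coefficients:", np.round(coef, 3), " R^2:", round(r2, 3))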

  9. A micromethod for high throughput RNA extraction in forest trees

    Directory of Open Access Journals (Sweden)

    GREGOIRE LE PROVOST

    2007-01-01

    Full Text Available A large quantity of high-quality RNA is often required for the analysis of gene expression. However, RNA extraction from samples taken from woody plants is generally complex, and represents the main limitation to studying gene expression, particularly in refractory species like conifers. Standard RNA extraction protocols are available, but they are highly time consuming and not adapted to large-scale extraction. Here we present a high-throughput RNA extraction protocol. This protocol was adapted to a micro-scale by modifying the classical cetyltrimethylammonium bromide (CTAB) protocol developed for pine: (i) quantity of material used (100-200 mg of sample), (ii) disruption of samples in microtubes using a mechanical tissue disrupter, and (iii) the use of SSTE buffer. One hundred samples of woody plant tissues/organs can be easily treated in two working days. An average of 15 µg of high-quality RNA per sample was obtained. The RNA extracted is suitable for applications such as real-time reverse transcription polymerase chain reaction, cDNA library construction or synthesis of complex targets for microarray analysis.

  10. Microfluidic system for high throughput characterisation of echogenic particles.

    Science.gov (United States)

    Rademeyer, Paul; Carugo, Dario; Lee, Jeong Yu; Stride, Eleanor

    2015-01-21

    Echogenic particles, such as microbubbles and volatile liquid micro/nanodroplets, have shown considerable potential in a variety of clinical diagnostic and therapeutic applications. The accurate prediction of their response to ultrasound excitation is, however, extremely challenging, and this has hindered the optimisation of techniques such as quantitative ultrasound imaging and targeted drug delivery. Existing characterisation techniques, such as ultra-high-speed microscopy, provide important insights but suffer from a number of limitations; most significantly, the difficulty of obtaining large data sets suitable for statistical analysis, and the need to physically constrain the particles, thereby altering their dynamics. Here a microfluidic system is presented that overcomes these challenges to enable the measurement of single echogenic particle responses to ultrasound excitation. A co-axial flow-focusing device is used to direct a continuous stream of unconstrained particles through the combined focal region of an ultrasound transducer and a laser. Both the optical and acoustic scatter from individual particles are then simultaneously recorded. Calibration of the device and example results for different types of echogenic particle are presented, demonstrating a high throughput of up to 20 particles per second and the ability to resolve changes in particle radius down to 0.1 μm with an uncertainty of less than 3%.

  11. High Throughput Interrogation of Behavioral Transitions in C. elegans

    Science.gov (United States)

    Liu, Mochi; Shaevitz, Joshua; Leifer, Andrew

    We present a high-throughput method to probe transformations from neural activity to behavior in Caenorhabditis elegans, to better understand how organisms change behavioral states. We optogenetically deliver white-noise stimuli to targeted sensory neurons or interneurons while simultaneously recording the movement of a population of worms. Using all the postural movement data collected, we computationally classify stereotyped behaviors in C. elegans by clustering based on the spectral properties of the instantaneous posture (Berman et al., 2014). Transitions between these behavioral clusters indicate discrete behavioral changes. To study the neural correlates dictating these transitions, we perform model-driven experiments and employ Linear-Nonlinear-Poisson cascades that take the white-noise stimulus as input. The parameters of these models are fitted by reverse correlation from our measurements. The parameterized models of behavioral transitions predict the worm's response to novel stimuli and reveal the internal computations the animal makes before carrying out behavioral decisions. Preliminary results are shown that describe the transformation between neural activity in mechanosensory neurons and reversal behavior.
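
    The reverse-correlation step can be sketched on synthetic data: the linear filter of a Linear-Nonlinear-Poisson cascade is estimated by averaging the white-noise stimulus that precedes each behavioral event. The filter shape, nonlinearity and bin counts below are invented for illustration and are not taken from the study.

        # Event-triggered average (reverse correlation) on synthetic LNP data.
        import numpy as np

        rng = np.random.default_rng(0)
        T, lag = 20000, 50
        stimulus = rng.standard_normal(T)               # white-noise input
        true_filter = np.exp(-np.arange(lag) / 10.0)    # synthetic ground-truth filter

        drive = np.convolve(stimulus, true_filter)[:T]
        rate = np.exp(0.3 * drive - 2.0)                # static nonlinearity
        events = rng.poisson(rate)                      # behavioral events per time bin

        sta = np.zeros(lag)                             # stimulus history, most recent first
        for t in range(lag, T):
            if events[t]:
                sta += events[t] * stimulus[t - lag + 1:t + 1][::-1]
        sta /= events[lag:].sum()
        print("correlation with true filter:",
              round(np.corrcoef(sta, true_filter)[0, 1], 2))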

  12. The JCSG high-throughput structural biology pipeline.

    Science.gov (United States)

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets, from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  13. High-Throughput Printing Process for Flexible Electronics

    Science.gov (United States)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  14. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers

    Directory of Open Access Journals (Sweden)

    Yunhai Yi

    2017-11-01

    Full Text Available The widespread existence of antimicrobial peptides (AMPs) with comprehensive biological activities has been reported in various animals, consistent with the important role of AMPs as the first line of the host defense system. However, no big-data-based analysis of AMPs from any fish species has been available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic, with less time out of water, while the latter is primarily terrestrial, with extended periods of time on land. Of these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically shared between BP and PM; and 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with strong inhibitory activity against Micrococcus luteus. In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for the ongoing development of marine drugs.

  15. Adaptation to high throughput batch chromatography enhances multivariate screening.

    Science.gov (United States)

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
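
    Treating every well as an independent experiment amounts to laying a full-factorial design over the plate. The sketch below maps combinations of load, wash and elution conditions onto 96 wells; the factor names and levels are illustrative assumptions, not the conditions screened in the paper.

        # Map a full-factorial set of conditions onto a 96-well plate.
        from itertools import product
        from string import ascii_uppercase

        loads    = [1, 2, 4, 8]                                    # mg protein / mL resin (assumed)
        washes   = ["pH 6.0", "pH 7.0"]
        elutions = [f"{c} mM salt" for c in range(50, 650, 50)]    # 12 assumed levels

        conditions = list(product(loads, washes, elutions))        # 4 x 2 x 12 = 96
        wells = [f"{row}{col}" for row in ascii_uppercase[:8] for col in range(1, 13)]

        plate_map = dict(zip(wells, conditions))
        print(len(plate_map), "wells assigned;", "A1 ->", plate_map["A1"])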

  16. Assessing the utility and limitations of high throughput virtual screening

    Directory of Open Access Journals (Sweden)

    Paul Daniel Phillips

    2016-05-01

    Full Text Available Due to low cost, speed, and unmatched ability to explore large numbers of compounds, high throughput virtual screening and molecular docking engines have become widely utilized by computational scientists. It is generally accepted that docking engines, such as AutoDock, produce reliable qualitative results for ligand-macromolecular receptor binding, and molecular docking results are commonly reported in literature in the absence of complementary wet lab experimental data. In this investigation, three variants of the sixteen amino acid peptide, α-conotoxin MII, were docked to a homology model of the α3β2-nicotinic acetylcholine receptor. DockoMatic version 2.0 was used to perform a virtual screen of each peptide ligand to the receptor for ten docking trials consisting of 100 AutoDock cycles per trial. The results were analyzed for both variation in the calculated binding energy obtained from AutoDock, and the orientation of bound peptide within the receptor. The results show that, while no clear correlation exists between consistent ligand binding pose and the calculated binding energy, AutoDock is able to determine a consistent positioning of bound peptide in the majority of trials when at least ten trials were evaluated.

  17. Viscoelasticity as a Biomarker for High-Throughput Flow Cytometry

    Science.gov (United States)

    Sawetzki, Tobias; Eggleton, Charles D.; Desai, Sanjay A.; Marr, David W.M.

    2013-01-01

    The mechanical properties of living cells are a label-free biophysical marker of cell viability and health; however, their use has been greatly limited by low measurement throughput. Although examining individual cells at high rates is now commonplace with fluorescence activated cell sorters, development of comparable techniques that nondestructively probe cell mechanics remains challenging. A fundamental hurdle is the signal response time. Where light scattering and fluorescence signatures are virtually instantaneous, the cell stress relaxation, typically occurring on the order of seconds, limits the potential speed of elastic property measurement. To overcome this intrinsic barrier to rapid analysis, we show here that cell viscoelastic properties measured at frequencies far higher than those associated with cell relaxation can be used as a means of identifying significant differences in cell phenotype. In these studies, we explore changes in erythrocyte mechanical properties caused by infection with Plasmodium falciparum and find that the elastic response alone fails to detect malaria at high frequencies. At timescales associated with rapid assays, however, we observe that the inelastic response shows significant changes and can be used as a reliable indicator of infection, establishing the dynamic viscoelasticity as a basis for nondestructive mechanical analogs of current high-throughput cell classification methods. PMID:24268140

  18. High-throughput literature mining to support read-across ...

    Science.gov (United States)

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing biological similarity. Many research efforts are underway such as structuring mechanistic data in adverse outcome pathways and investigating the utility of high throughput (HT)/high content (HC) screening data. A largely untapped resource for read-across to date is the biomedical literature. This information has the potential to support read-across by facilitating the identification of valid source analogues with similar biological and toxicological profiles as well as providing the mechanistic understanding for any prediction made. A key challenge in using biomedical literature is to convert and translate its unstructured form into a computable format that can be linked to chemical structure. We developed a novel text-mining strategy to represent literature information for read across. Keywords were used to organize literature into toxicity signatures at the chemical level. These signatures were integrated with HT in vitro data and curated chemical structures. A rule-based algorithm assessed the strength of the literature relationship, providing a mechanism to rank and visualize the signature as literature ToxPIs (LitToxPIs). LitToxPIs were developed for over 6,000 chemicals for a varie

  19. High Throughput Profiling of Molecular Shapes in Crystals

    Science.gov (United States)

    Spackman, Peter R.; Thomas, Sajesh P.; Jayatilaka, Dylan

    2016-02-01

    Molecular shape is important in both crystallisation and supramolecular assembly, yet its role is not completely understood. We present a computationally efficient scheme to describe and classify the molecular shapes in crystals. The method involves a rotation-invariant description of Hirshfeld surfaces in terms of spherical harmonic functions. Hirshfeld surfaces represent the boundaries of a molecule in the crystalline environment, and are widely used to visualise and interpret crystalline interactions. The spherical harmonic descriptions of molecular shapes are compared and classified by means of principal component analysis and cluster analysis. When applied to a series of metals, the method results in a clear classification based on their lattice type. When applied to around 300 crystal structures comprising series of substituted benzenes, naphthalenes and phenylbenzamides, it shows the capacity to classify structures based on chemical scaffolds, chemical isosterism, and conformational similarity. The computational efficiency of the method is demonstrated with an application to over 14 thousand crystal structures. High-throughput screening of molecular shapes and interaction surfaces in the Cambridge Structural Database (CSD) using this method has direct applications in drug discovery, supramolecular chemistry and materials design.
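
    The classification step (shape descriptors reduced by principal component analysis, then clustered) can be sketched as follows, assuming scikit-learn is available; the random descriptor matrix stands in for rotation-invariant spherical-harmonic coefficients and is purely illustrative.

        # PCA + k-means on synthetic shape descriptors.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # 300 "crystal structures", 36 shape coefficients each, drawn from
        # three synthetic shape families.
        centers = rng.normal(size=(3, 36))
        descriptors = np.vstack([c + 0.3 * rng.normal(size=(100, 36)) for c in centers])

        scores = PCA(n_components=2).fit_transform(descriptors)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
        print("cluster sizes:", np.bincount(labels))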

  20. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Full Text Available Image-based high-throughput plant phenotyping in the greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring, which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of the macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and the micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress, with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract a reflectance spectrum from each plant, and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy (R² = 0.93 and RPD (ratio of performance to deviation) = 3.8). All macronutrients were also quantified satisfactorily (R² from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best, followed by P, K, and S. The micronutrient group showed lower prediction accuracy (R² from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient group. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R² < 0.3 and RPD < 1.2). This study suggested
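
    A minimal sketch of the modelling step, assuming scikit-learn is available: partial least squares regression maps reflectance spectra to a chemical property, and prediction quality is summarised by R² and RPD. The spectra and property values below are synthetic, not the maize/soybean data.

        # PLS regression on synthetic "spectra" with R^2 and RPD reporting.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        n_plants, n_bands = 120, 200
        spectra = rng.normal(size=(n_plants, n_bands))
        true_coef = rng.normal(size=n_bands) * (rng.random(n_bands) < 0.05)
        property_vals = spectra @ true_coef + 0.5 * rng.normal(size=n_plants)

        X_tr, X_te, y_tr, y_te = train_test_split(spectra, property_vals,
                                                  test_size=0.3, random_state=0)
        pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
        pred = pls.predict(X_te).ravel()

        r2 = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
        rpd = y_te.std() / np.sqrt(np.mean((y_te - pred) ** 2))
        print(f"R^2 = {r2:.2f}, RPD = {rpd:.2f}")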

  1. High-throughput massively parallel sequencing for fetal aneuploidy detection from maternal plasma.

    Directory of Open Access Journals (Sweden)

    Taylor J Jensen

    Full Text Available Circulating cell-free (ccf) fetal DNA comprises 3-20% of all the cell-free DNA present in maternal plasma. Numerous research and clinical studies have described the analysis of ccf DNA using next-generation sequencing for the detection of fetal aneuploidies with high sensitivity and specificity. We sought to extend the utility of this approach by assessing semi-automated library preparation, higher sample multiplexing during sequencing, and improved bioinformatic tools to enable a higher-throughput, more efficient assay while maintaining or improving clinical performance. Whole blood (10 mL) was collected from pregnant female donors and plasma separated using centrifugation. Ccf DNA was extracted using column-based methods. Libraries were prepared using an optimized semi-automated library preparation method and sequenced on an Illumina HiSeq2000 sequencer in a 12-plex format. Z-scores were calculated for affected chromosomes using a robust method after normalization and genomic segment filtering. Classification was based upon a standard normal transformed cutoff value of z = 3 for chromosome 21 and z = 3.95 for chromosomes 18 and 13. Two parallel assay development studies using a total of more than 1900 ccf DNA samples were performed to evaluate the technical feasibility of automating library preparation and increasing the sample multiplexing level. These processes were subsequently combined, and a study of 1587 samples was completed to verify the stability of the process-optimized assay. Finally, an unblinded clinical evaluation of 1269 euploid and aneuploid samples utilizing this high-throughput assay coupled to improved bioinformatic procedures was performed. We were able to correctly detect all aneuploid cases with extremely low false positive rates of 0.09%, <0.01%, and 0.08% for trisomies 21, 18, and 13, respectively. These data suggest that the developed laboratory methods in concert with improved bioinformatic approaches enable higher sample
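
    The classification rule quoted above (z = 3 for chromosome 21, z = 3.95 for chromosomes 18 and 13) reduces to a simple z-score test once a euploid reference distribution of chromosome read fractions is available. The reference mean, standard deviation and test fraction below are illustrative numbers only.

        # z-score classification of a chromosome read fraction (illustrative values).
        def classify_chromosome(test_fraction, ref_mean, ref_sd, cutoff):
            z = (test_fraction - ref_mean) / ref_sd
            return z, ("aneuploidy detected" if z > cutoff else "no call")

        # cutoffs from the text: chr21 uses 3.0, chr18 and chr13 use 3.95
        z21, call21 = classify_chromosome(test_fraction=0.0139,
                                          ref_mean=0.0132, ref_sd=0.0001, cutoff=3.0)
        print(f"chr21: z = {z21:.1f} -> {call21}")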

  2. Image-Based Single Cell Profiling: High-Throughput Processing of Mother Machine Experiments.

    Science.gov (United States)

    Sachs, Christian Carsten; Grünberger, Alexander; Helfrich, Stefan; Probst, Christopher; Wiechert, Wolfgang; Kohlheyer, Dietrich; Nöh, Katharina

    Microfluidic lab-on-chip technology combined with live-cell imaging has enabled the observation of single cells in their spatio-temporal context. The mother machine (MM) cultivation system is particularly attractive for the long-term investigation of rod-shaped bacteria, since it facilitates continuous cultivation and observation of individual cells over many generations in a highly parallelized manner. To date, the lack of fully automated image analysis software limits the practical applicability of the MM as a phenotypic screening tool. We present an image analysis pipeline for the automated processing of MM time-lapse image stacks. The pipeline supports all analysis steps, i.e., image registration, orientation correction, channel/cell detection, cell tracking, and result visualization. Tailored algorithms account for the specialized MM layout to enable robust automated analysis. Image data generated in a two-day growth study (≈90 GB) are analyzed in ≈30 min, with negligible differences in growth rate between automated and manual evaluation. The proposed methods are implemented in the software molyso (MOther machine AnaLYsis SOftware), which provides a new profiling tool for the unbiased analysis of hitherto inaccessible large-scale MM image stacks. molyso is ready-to-use, open-source (BSD-licensed) software for the unsupervised analysis of MM time-lapse image stacks; its source code and user manual are available at https://github.com/modsim/molyso.

  3. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. Typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of the biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image-derived phenotypic traits. Several image-based biomass studies treat plant biomass simply as a linear function of the projected plant area in images. However, we modeled plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model and can explain most of the observed variance during image-derived biomass estimation. Moreover, only a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
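
    The generalized model amounts to a multiple linear regression of digital volume on projected area, compactness and age. The sketch below fits such a model to synthetic data; the coefficients and value ranges are invented for illustration and are not the paper's estimates.

        # Fit volume ~ area + compactness + age by least squares (synthetic data).
        import numpy as np

        rng = np.random.default_rng(3)
        n = 50
        area        = rng.uniform(50, 500, n)     # projected plant area (arbitrary units)
        compactness = rng.uniform(0.2, 0.9, n)    # area / convex-hull area
        age         = rng.uniform(10, 40, n)      # days after sowing

        volume = 1.8 * area + 120 * compactness + 4.0 * age + rng.normal(0, 20, n)

        X = np.column_stack([np.ones(n), area, compactness, age])
        coef, *_ = np.linalg.lstsq(X, volume, rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((volume - pred) ** 2) / np.sum((volume - volume.mean()) ** 2)
        print("fitted coefficients:", np.round(coef, 2), " R^2:", round(r2, 2))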

  4. High throughput micro-well generation of hepatocyte micro-aggregates for tissue engineering.

    Science.gov (United States)

    Gevaert, Elien; Dollé, Laurent; Billiet, Thomas; Dubruel, Peter; van Grunsven, Leo; van Apeldoorn, Aart; Cornelissen, Ria

    2014-01-01

    The main challenge in hepatic tissue engineering is the fast dedifferentiation of primary hepatocytes in vitro. One successful approach to maintain hepatocyte phenotype on the longer term is the cultivation of cells as aggregates. This paper demonstrates the use of an agarose micro-well chip for the high throughput generation of hepatocyte aggregates, uniform in size. In our study we observed that aggregation of hepatocytes had a beneficial effect on the expression of certain hepatocyte specific markers. Moreover we observed that the beneficial effect was dependent on the aggregate dimensions, indicating that aggregate parameters should be carefully considered. In a second part of the study, the selected aggregates were immobilized by encapsulation in methacrylamide-modified gelatin. Phenotype evaluations revealed that a stable hepatocyte phenotype could be maintained during 21 days when encapsulated in the hydrogel. In conclusion we have demonstrated the beneficial use of micro-well chips for hepatocyte aggregation and the size-dependent effects on hepatocyte phenotype. We also pointed out that methacrylamide-modified gelatin is suitable for the encapsulation of these aggregates.

  5. High throughput micro-well generation of hepatocyte micro-aggregates for tissue engineering.

    Directory of Open Access Journals (Sweden)

    Elien Gevaert

    Full Text Available The main challenge in hepatic tissue engineering is the fast dedifferentiation of primary hepatocytes in vitro. One successful approach to maintain hepatocyte phenotype on the longer term is the cultivation of cells as aggregates. This paper demonstrates the use of an agarose micro-well chip for the high throughput generation of hepatocyte aggregates, uniform in size. In our study we observed that aggregation of hepatocytes had a beneficial effect on the expression of certain hepatocyte specific markers. Moreover we observed that the beneficial effect was dependent on the aggregate dimensions, indicating that aggregate parameters should be carefully considered. In a second part of the study, the selected aggregates were immobilized by encapsulation in methacrylamide-modified gelatin. Phenotype evaluations revealed that a stable hepatocyte phenotype could be maintained during 21 days when encapsulated in the hydrogel. In conclusion we have demonstrated the beneficial use of micro-well chips for hepatocyte aggregation and the size-dependent effects on hepatocyte phenotype. We also pointed out that methacrylamide-modified gelatin is suitable for the encapsulation of these aggregates.

  6. A pre-validation trial - testing genotoxicity of several chemicals using standard, medium- and high-throughput comet formats.

    Directory of Open Access Journals (Sweden)

    Kristine Bjerve Gutzkow

    2015-06-01

    Results obtained with the three systems (standard, medium- and high-throughput) were essentially the same. The 96-minigel format was analysed with the fully automated scoring system IMSTAR, and comparable results were achieved with the semi-automated scoring system from Perceptives. The known genotoxic chemicals MNU, B(a)P, 4-NQO and cyclophosphamide showed little consistent sign of genotoxicity at concentrations causing limited cytotoxicity. D-mannitol and Triton X-100 were, as expected, non-genotoxic (though Triton X-100, at high concentrations, caused DNA breaks as an apparent secondary effect of cytotoxicity). Etoposide and bleomycin gave a significant increase in DNA strand breaks at borderline cytotoxic concentrations. The limitation of the assay in detecting damaged bases induced by known genotoxins may be overcome by incorporating a DNA repair enzyme, such as formamidopyrimidine-DNA glycosylase (FPG), to convert damaged bases into breaks, as shown by Azqueta et al., Mutagenesis 28(3): 271-277, 2013.

  7. Multiscale cartilage biomechanics: technical challenges in realizing a high-throughput modelling and simulation workflow.

    Science.gov (United States)

    Erdemir, Ahmet; Bennetts, Craig; Davis, Sean; Reddy, Akhil; Sibole, Scott

    2015-04-06

    interpretation of the results. This study aims to summarize various strategies to address the technical challenges of post-processing-based simulations of cartilage and chondrocyte mechanics with the ultimate goal of establishing the foundations of a high-throughput multiscale analysis framework. At the joint-tissue scale, rapid development of regional models of articular contact is possible by automating the process of generating parametric representations of cartilage boundaries and depth-dependent zonal delineation with associated constitutive relationships. At the tissue-cell scale, models descriptive of multicellular and fibrillar architecture of cartilage zones can also be generated in an automated fashion. Through post-processing, scripts can extract biphasic mechanical metrics at a desired point in the cartilage to assign loading and boundary conditions to models at the lower spatial scale of cells. Cell deformation metrics can be extracted from simulation results to provide a simplified description of individual chondrocyte responses. Simulations at the tissue-cell scale can be parallelized owing to the loosely coupled nature of the feed-forward approach. Verification studies illustrated the necessity of a second-order data passing scheme between scales and evaluated the role that the microscale representative volume size plays in appropriately predicting the mechanical response of the chondrocytes. The tools summarized in this study collectively provide a framework for high-throughput exploration of cartilage biomechanics, which includes minimally supervised model generation, and prediction of multiscale biomechanical metrics across a range of spatial scales, from joint regions and cartilage zones, down to that of the chondrocytes.

  8. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  9. Standardized Method for High-throughput Sterilization of Arabidopsis Seeds.

    Science.gov (United States)

    Lindsey, Benson E; Rivero, Luz; Calhoun, Chistopher S; Grotewold, Erich; Brkljacic, Jelena

    2017-10-17

    Arabidopsis thaliana (Arabidopsis) seedlings often need to be grown on sterile media. This requires prior seed sterilization to prevent the growth of microbial contaminants present on the seed surface. Currently, Arabidopsis seeds are sterilized using two distinct sterilization techniques in conditions that differ slightly between labs and have not been standardized, often resulting in only partially effective sterilization or in excessive seed mortality. Most of these methods are also not easily scalable to a large number of seed lines of diverse genotypes. As technologies for high-throughput analysis of Arabidopsis continue to proliferate, standardized techniques for sterilizing large numbers of seeds of different genotypes are becoming essential for conducting these types of experiments. The response of a number of Arabidopsis lines to two different sterilization techniques was evaluated based on seed germination rate and the level of seed contamination with microbes and other pathogens. The treatments included different concentrations of sterilizing agents and times of exposure, combined to determine optimal conditions for Arabidopsis seed sterilization. Optimized protocols have been developed for two different sterilization methods: bleach (liquid-phase) and chlorine (Cl2) gas (vapor-phase), both resulting in high seed germination rates and minimal microbial contamination. The utility of these protocols was illustrated through the testing of both wild type and mutant seeds with a range of germination potentials. Our results show that seeds can be effectively sterilized using either method without excessive seed mortality, although detrimental effects of sterilization were observed for seeds with lower than optimal germination potential. In addition, an equation was developed to enable researchers to apply the standardized chlorine gas sterilization conditions to airtight containers of different sizes. The protocols described here allow easy, efficient, and

  10. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    Energy Technology Data Exchange (ETDEWEB)

    Ni, Jing [Iowa State Univ., Ames, IA (United States)

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores the surface enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman band, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, a simultaneous detection of a dual-analyte immunoassay was demonstrated. Using this scheme for quantitative analysis was also studied and preliminary dose-response curves from an immunoassay of a

  11. Mining Chemical Activity Status from High-Throughput Screening Assays

    KAUST Repository

    Soufan, Othman

    2015-12-14

    High-throughput screening (HTS) experiments provide a valuable resource that reports the biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we carried out a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can also be found on Figshare.
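
    The class-imbalance problem and the oversampling idea can be sketched generically: synthesise extra "active" samples by interpolating between existing actives and their nearest active neighbours (a SMOTE-like scheme). This is a generic illustration on synthetic data, not the specific modification used in DRAMOTE.

        # Generic minority-oversampling sketch for imbalanced HTS data.
        import numpy as np

        def oversample_minority(X_min, n_new, k=3, seed=0):
            """Interpolate n_new synthetic points between minority samples and
            their k nearest minority neighbours."""
            rng = np.random.default_rng(seed)
            d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
            np.fill_diagonal(d, np.inf)
            neighbours = np.argsort(d, axis=1)[:, :k]
            synthetic = []
            for _ in range(n_new):
                i = rng.integers(len(X_min))
                j = rng.choice(neighbours[i])
                lam = rng.random()
                synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
            return np.array(synthetic)

        rng = np.random.default_rng(1)
        actives = rng.normal(size=(20, 5))            # scarce active compounds (synthetic)
        extra = oversample_minority(actives, n_new=80)
        print("actives after oversampling:", len(actives) + len(extra))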

  12. Prevention of data duplication for high throughput sequencing repositories

    Science.gov (United States)

    Gabdank, Idan; Chan, Esther T; Davidson, Jean M; Hilton, Jason A; Davis, Carrie A; Baymuradov, Ulugbek K; Narayanan, Aditi; Onate, Kathrina C; Graham, Keenan; Miyasato, Stuart R; Dreszer, Timothy R; Strattan, J Seth; Jolanki, Otto; Tanaka, Forrest Y; Hitz, Benjamin C

    2018-01-01

    Abstract Prevention of unintended duplication is one of the ongoing challenges many databases have to address. When working with high-throughput sequencing data, the complexity of that challenge increases with the complexity of the definition of a duplicate. In a computational data model, a data object represents a real entity such as a reagent or a biosample, much as a card represents a book in a paper library catalog. Duplicated data objects not only waste storage, they can mislead users into assuming the model represents more than the single entity. Even if it is clear that two objects represent a single entity, data duplication opens the door to potential inconsistencies between the objects, since the content of the duplicated objects can be updated independently, allowing the metadata associated with the objects to diverge. This is analogous to a paper library catalog that mistakenly contains two cards for a single copy of a book: if the cards simultaneously list two different individuals as the current borrower, it is difficult to determine which of the two actually has the book. Unfortunately, in a large database with multiple submitters, unintended duplication is to be expected. In this article, we present three principal guidelines the Encyclopedia of DNA Elements (ENCODE) Portal follows in order to prevent unintended duplication of both actual files and data objects: definition of identifiable data objects (I), object uniqueness validation (II) and a de-duplication mechanism (III). In addition to explaining our modus operandi, we elaborate on the methods used for identification of sequencing data files. A comparison of the approach taken by the ENCODE Portal with other widely used biological data repositories is provided. Database URL: https://www.encodeproject.org/
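
    Guideline III (a de-duplication mechanism for actual files) is commonly implemented by checking a content checksum before accepting a submission. The sketch below is a generic illustration of that idea, not the ENCODE Portal's code; the file path and checksum in the usage comment are hypothetical.

        # Detect duplicate sequencing files by content checksum.
        import hashlib

        def file_md5(path, chunk_size=2 ** 20):
            """MD5 of a file's content, streamed in chunks."""
            h = hashlib.md5()
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        def is_duplicate(path, known_checksums):
            """True if the file's checksum is already registered in the repository."""
            return file_md5(path) in known_checksums

        # usage (hypothetical path and checksum set):
        # known = {"d41d8cd98f00b204e9800998ecf8427e"}
        # print(is_duplicate("reads_R1.fastq.gz", known))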

  13. Validation of high-throughput real time polymerase chain reaction assays for simultaneous detection of invasive citrus pathogens.

    Science.gov (United States)

    Saponari, Maria; Loconsole, Giuliana; Liao, Hui-Hong; Jiang, Bo; Savino, Vito; Yokomi, Raymond K

    2013-11-01

    A number of important citrus pathogens are spread by graft propagation, arthropod vector transmission, and the inadvertent import and dissemination of infected plants. For these reasons, citrus disease management and clean stock programs require pathogen detection systems that are economical and sensitive in order to maintain a healthy industry. To this end, multiplex quantitative real-time PCR (qPCR) assays were developed, allowing high-throughput and simultaneous detection of some major invasive citrus pathogens. Automated high-throughput extraction with several bead-based commercial extraction kits was tested and compared with tissue-print and manual extraction for obtaining nucleic acids from healthy and pathogen-infected citrus trees from greenhouse in planta collections and the field. Total nucleic acids were used as templates for pathogen detection. Multiplex reverse transcription-qPCR (RT-qPCR) assays were developed for simultaneous detection of six targets, including a virus, two viroids, a bacterium associated with huanglongbing and a citrus RNA internal control. Specifically, two one-step TaqMan-based multiplex RT-qPCR assays were developed and tested with target templates to determine sensitivity and detection efficiency. The first assay included primers and probes for 'Candidatus Liberibacter asiaticus' (CLas) and for Citrus tristeza virus (CTV) broad-spectrum detection and genotype differentiation (VT- and T3-like genotypes). The second assay contained primers and probes for Hop stunt viroid (HSVd), Citrus exocortis viroid (CEVd) and the mitochondrial NADH dehydrogenase (nad5) mRNA as an internal citrus host control. Primers and TaqMan probes for the viroids were designed in this work, whereas those for the other pathogens were from reports of others. Based on quantitation cycle values, automated high-throughput extraction of samples proved to be as suitable as manual extraction. The multiplex RT-qPCR assays detected both RNA and DNA pathogens in the same dilution series

  14. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing ...

  15. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO Simmons. National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  16. Tackling calibration problems of spectroscopic analysis in high-throughput experimentation

    NARCIS (Netherlands)

    Cruz, Susana C.; Rothenberg, Gadi; Westerhuis, Johan A.; Smilde, Age K.

    2005-01-01

    High-throughput experimentation and screening methods are changing work flows and creating new possibilities in biochemistry, organometallic chemistry, and catalysis. However, many high-throughput systems rely on off-line chromatography methods that shift the bottleneck to the analysis stage.

  17. PhenStat: A Tool Kit for Standardized Analysis of High Throughput Phenotypic Data.

    Directory of Open Access Journals (Sweden)

    Natalja Kurbatova

    Full Text Available The lack of reproducibility of animal phenotyping experiments is a growing concern among the biomedical community. One contributing factor is the inadequate description of statistical analysis methods that prevents researchers from replicating results even when the original data are provided. Here we present PhenStat--a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations. The methods have been developed for high throughput phenotyping pipelines implemented across various experimental designs with an emphasis on managing temporal variation. PhenStat is targeted to two user groups: small-scale users who wish to interact with and test data from large resources and large-scale users who require an automated statistical analysis pipeline. The software provides guidance to the user for selecting appropriate analysis methods based on the dataset and is designed to allow for additions and modifications as needed. The package was tested on mouse and rat data and is used by the International Mouse Phenotyping Consortium (IMPC). By providing raw data and the version of PhenStat used, resources like the IMPC give users the ability to replicate and explore results within their own computing environment.

  18. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    International Nuclear Information System (INIS)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan; Durmus, Naside Gozde

    2011-01-01

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  19. Unlocking the Potential of High-Throughput Drug Combination Assays Using Acoustic Dispensing.

    Science.gov (United States)

    Chan, Grace Ka Yan; Wilson, Stacy; Schmidt, Stephen; Moffat, John G

    2016-02-01

    Assessment of synergistic effects of drug combinations in vitro is a critical part of anticancer drug research. However, the complexities of dosing and analyzing two drugs over the appropriate range of doses have generally led to compromises in experimental design that restrict the quality and robustness of the data. In particular, the use of a single dose response of combined drugs, rather than a full two-way matrix of varying doses, has predominated in higher-throughput studies. Acoustic dispensing unlocks the potential of high-throughput dose matrix analysis. We have developed acoustic dispensing protocols that enable compound synergy assays in a 384-well format. This experimental design is considerably more efficient and flexible with respect to time, reagent usage, and labware than is achievable using traditional serial-dilution approaches. Data analysis tools integrated in Genedata Screener were used to efficiently deconvolute the combination compound mapping scheme and calculate compound potency and synergy metrics. We have applied this workflow to evaluate interactions among drugs targeting different nodes of the mitogen-activated protein kinase pathway in a panel of cancer cell lines. © 2015 Society for Laboratory Automation and Screening.
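
    The record states that potency and synergy metrics were calculated in Genedata Screener; the exact model is not reproduced here. As a generic illustration of a dose-matrix synergy calculation, the sketch below computes the Bliss-independence excess over a full combination matrix. The 3 x 3 inhibition values are hypothetical.

```python
import numpy as np

def bliss_excess(inhibition_a, inhibition_b, inhibition_combo):
    """Bliss excess for a full dose matrix of fractional inhibitions (0-1).

    inhibition_a: shape (n_a,) single-agent inhibition of drug A
    inhibition_b: shape (n_b,) single-agent inhibition of drug B
    inhibition_combo: shape (n_a, n_b) measured inhibition of the combinations
    Positive values indicate a greater-than-additive (synergistic) effect.
    """
    expected = (inhibition_a[:, None] + inhibition_b[None, :]
                - inhibition_a[:, None] * inhibition_b[None, :])
    return inhibition_combo - expected

# Hypothetical 3 x 3 dose matrix
a = np.array([0.10, 0.30, 0.55])
b = np.array([0.05, 0.25, 0.50])
combo = np.array([[0.20, 0.40, 0.62],
                  [0.45, 0.60, 0.80],
                  [0.65, 0.78, 0.92]])
print(bliss_excess(a, b, combo).round(2))
```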

  20. StructureMapper: a high-throughput algorithm for analyzing protein sequence locations in structural data.

    Science.gov (United States)

    Nurminen, Anssi; Hytönen, Vesa P; Valencia, Alfonso

    2018-02-14

    StructureMapper is a high-throughput algorithm for automated mapping of protein primary amino acid sequence locations to existing three-dimensional protein structures. The algorithm is intended for facilitating easy and efficient utilization of structural information in protein characterization and proteomics. StructureMapper provides an analysis of the identified structural locations that includes surface accessibility, flexibility, protein-protein interfacing, intrinsic disorder prediction, secondary structure assignment, biological assembly information, and sequence identity percentages, among other metrics. We have showcased the use of the algorithm by estimating the coverage of structural information of the human proteome, identifying critical interface residues in DNA polymerase γ, structurally profiling protease cleavage sites and post-translational modification sites, and by identifying putative, novel phosphoswitches. The StructureMapper algorithm is available as an online service and standalone implementation at http://structuremapper.uta.fi. vesa.hytonen@uta.fi. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  1. Human Adenine Nucleotide Translocase (ANT) Modulators Identified by High-Throughput Screening of Transgenic Yeast.

    Science.gov (United States)

    Zhang, Yujian; Tian, Defeng; Matsuyama, Hironori; Hamazaki, Takashi; Shiratsuchi, Takayuki; Terada, Naohiro; Hook, Derek J; Walters, Michael A; Georg, Gunda I; Hawkinson, Jon E

    2016-04-01

    Transport of ADP and ATP across mitochondria is one of the primary points of regulation to maintain cellular energy homeostasis. This process is mainly mediated by adenine nucleotide translocase (ANT) located on the mitochondrial inner membrane. There are four human ANT isoforms, each having a unique tissue-specific expression pattern and biological function, highlighting their potential as drug targets for diverse clinical indications, including male contraception and cancer. In this study, we present a novel yeast-based high-throughput screening (HTS) strategy to identify compounds inhibiting the function of ANT. Yeast strains generated by deletion of endogenous proteins with ANT activity followed by insertion of individual human ANT isoforms are sensitive to cell-permeable ANT inhibitors, which reduce proliferation. Screening hits identified in the yeast proliferation assay were characterized in ADP/ATP exchange assays employing recombinant ANT isoforms expressed in isolated yeast mitochondria and Lactococcus lactis as well as by oxygen consumption rate in mammalian cells. Using this approach, closantel and CD437 were identified as broad-spectrum ANT inhibitors, whereas leelamine was found to be a modulator of ANT function. This yeast "knock-out/knock-in" screening strategy is applicable to a broad range of essential molecular targets that are required for yeast survival. © 2016 Society for Laboratory Automation and Screening.

  2. Probing biolabels for high throughput biosensing via synchrotron radiation SEIRA technique

    Energy Technology Data Exchange (ETDEWEB)

    Hornemann, Andrea, E-mail: andrea.hornemann@ptb.de; Hoehl, Arne, E-mail: arne.hoehl@ptb.de; Ulm, Gerhard, E-mail: gerhard.ulm@ptb.de; Beckhoff, Burkhard, E-mail: burkhard.beckhoff@ptb.de [Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin (Germany); Eichert, Diane, E-mail: diane.eichert@elettra.eu [Elettra-Sincrotrone Trieste S.C.p.A., Strada Statale 14, Area Science Park, 34149 Trieste (Italy); Flemig, Sabine, E-mail: sabine.flemig@bam.de [BAM Bundesanstalt für Materialforschung und –prüfung, Richard-Willstätter-Str.10, 12489 Berlin (Germany)

    2016-07-27

    Bio-diagnostic assays of high complexity rely on nanoscaled assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement and multiplex-capable Surface-Enhanced InfraRed Absorption (SEIRA) assay formats are well suited to these purposes. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates chemisorbed onto low-cost, biocompatible gold nano-aggregate substrates, has been explored for their use in assay platforms. Dried films were analyzed by synchrotron radiation based FTIR/SEIRA spectro-microscopy and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were brought out to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable and multiplexed high-end screening not only in biodiagnostics but also in in vitro biochemical imaging.
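
    As a generic illustration of the automated statistical analysis step (PCA on hyperspectral data), the sketch below projects a matrix of spectra onto its first principal components with scikit-learn. The random matrix merely stands in for measured SEIRA spectra, and the preprocessing choices are assumptions rather than the authors' exact pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical hyperspectral dataset: rows are spectra, columns are wavenumber channels
rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 900))  # stand-in for measured SEIRA spectra

# Standardize each channel, then project onto the first three principal components
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(spectra))
print(scores.shape)  # (120, 3) -> coordinates used to discriminate biolabel fingerprints
```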

  3. SNP high-throughput screening in grapevine using the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Velasco Riccardo

    2008-01-01

    Full Text Available Abstract Background Until recently, only a small number of low- and mid-throughput methods have been used for single nucleotide polymorphism (SNP) discovery and genotyping in grapevine (Vitis vinifera L.). However, following completion of the sequence of the highly heterozygous genome of Pinot Noir, it has been possible to identify millions of electronic SNPs (eSNPs), thus providing a valuable source for high-throughput genotyping methods. Results Herein we report the first application of the SNPlex™ genotyping system in grapevine, aiming at the anchoring of a eukaryotic genome. This approach combines robust SNP detection with automated assay readout and data analysis. 813 candidate eSNPs were developed from non-repetitive contigs of the assembled genome of Pinot Noir and tested in 90 progeny of a Syrah × Pinot Noir cross. 563 new SNP-based markers were obtained and mapped. The efficiency rate of 69% was enhanced to 80% when multiple displacement amplification (MDA) methods were used for preparation of genomic DNA for the SNPlex assay. Conclusion Unlike other SNP genotyping methods used to investigate thousands of SNPs in a few genotypes, or a few SNPs in around a thousand genotypes, the SNPlex genotyping system represents a good compromise to investigate several hundred SNPs in a hundred or more samples simultaneously. Therefore, the use of the SNPlex assay, coupled with whole genome amplification (WGA), is a good solution for future applications in well-equipped laboratories.

  4. A high-throughput sample preparation method for cellular proteomics using 96-well filter plates.

    Science.gov (United States)

    Switzar, Linda; van Angeren, Jordy; Pinkse, Martijn; Kool, Jeroen; Niessen, Wilfried M A

    2013-10-01

    A high-throughput sample preparation protocol based on the use of 96-well molecular weight cutoff (MWCO) filter plates was developed for shotgun proteomics of cell lysates. All sample preparation steps, including cell lysis, buffer exchange, protein denaturation, reduction, alkylation and proteolytic digestion, are performed in a 96-well plate format, making the platform extremely well suited for processing large numbers of samples and directly compatible with functional assays for cellular proteomics. In addition, the use of a single plate for all sample preparation steps following cell lysis reduces potential sample losses and allows for automation. The MWCO filter also enables sample concentration, thereby increasing the overall sensitivity, and implementation of washing steps involving organic solvents, for example, to remove cell membrane constituents. The optimized protocol allowed for higher throughput with improved sensitivity in terms of the number of identified cellular proteins when compared to an established protocol employing gel-filtration columns. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. A high-throughput method for the quantitative analysis of auxins.

    Science.gov (United States)

    Barkawi, Lana S; Tam, Yuen-Yee; Tillman, Julie A; Normanly, Jennifer; Cohen, Jerry D

    2010-09-01

    Auxin measurements in plants are critical to understanding both auxin signaling and metabolic homeostasis. The most abundant natural auxin is indole-3-acetic acid (IAA). This protocol is for the precise, high-throughput determination of free IAA in plant tissue by isotope dilution analysis using gas chromatography-mass spectrometry (GC-MS). The steps described are as follows: harvesting of plant material; amino and polymethylmethacrylate solid-phase purification followed by derivatization with diazomethane (either manual or robotic); GC-MS analysis; and data analysis. [¹³C₆]IAA is the standard used. The amount of tissue required is relatively small (25 mg of fresh weight) and one can process more than 500 samples per week using an automated system. To extract eight samples, this procedure takes ∼3 h, whether performed manually or robotically. For processing more than eight samples, robotic extraction becomes substantially more time efficient, saving at least 0.5 h per additional batch of eight samples.
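
    The quantification rests on isotope dilution with a [13C6]IAA internal standard. The sketch below shows the basic back-calculation from the ratio of unlabeled to labeled peak areas; it ignores the isotopic cross-contribution corrections applied in practice, and all numbers are hypothetical.

```python
def isotope_dilution_amount(area_unlabeled, area_labeled, standard_ng, tissue_mg):
    """Estimate free IAA content by isotope dilution.

    Assumes the labeled internal standard and the endogenous analyte respond
    identically in GC-MS and that spectral cross-contribution is negligible
    (real workflows correct for isotopic overlap). Returns ng per g fresh weight.
    """
    analyte_ng = standard_ng * (area_unlabeled / area_labeled)
    return analyte_ng / (tissue_mg / 1000.0)

# Hypothetical example: 5 ng [13C6]IAA spiked into 25 mg fresh tissue
print(isotope_dilution_amount(area_unlabeled=3.2e5, area_labeled=8.0e5,
                              standard_ng=5.0, tissue_mg=25.0))  # ng/g fresh weight
```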

  6. Scaling up high throughput field phenotyping of corn and soy research plots using ground rovers

    Science.gov (United States)

    Peshlov, Boyan; Nakarmi, Akash; Baldwin, Steven; Essner, Scott; French, Jasenka

    2017-05-01

    Crop improvement programs require large and meticulous selection processes that effectively and accurately collect and analyze data to generate quality plant products as efficiently as possible and to develop superior cropping and/or crop improvement methods. Typically, data collection for such testing is performed by field teams using hand-held instruments or manually-controlled devices. Although steps are taken to reduce error, the data collected in such a manner can be unreliable due to human error and fatigue, which reduces the ability to make accurate selection decisions. Monsanto engineering teams have developed a high-clearance mobile platform (Rover) as a step towards high throughput and high accuracy phenotyping at an industrial scale. The rovers are equipped with GPS navigation, multiple cameras and sensors and on-board computers to acquire data and compute plant vigor metrics per plot. The supporting IT systems enable automatic path planning, plot identification, image and point cloud data QA/QC and near real-time analysis where results are streamed to enterprise databases for additional statistical analysis and product advancement decisions. Since the rover program was launched in North America in 2013, the number of research plots we can analyze in a growing season has expanded dramatically. This work describes some of the successes and challenges in scaling up the rover platform for automated phenotyping to enable science at scale.

  7. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit.

    Science.gov (United States)

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-04-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. GROMACS is open source and free software available from http://www.gromacs.org. Supplementary data are available at Bioinformatics online.

  8. DOVIS: an implementation for high-throughput virtual screening using AutoDock

    Directory of Open Access Journals (Sweden)

    Wallqvist Anders

    2008-02-01

    Full Text Available Abstract Background Molecular-docking-based virtual screening is an important tool in drug discovery that is used to significantly reduce the number of possible chemical compounds to be investigated. In addition to the selection of a sound docking strategy with appropriate scoring functions, another technical challenge is to screen millions of compounds in silico in a reasonable time. To meet this challenge, it is necessary to use high performance computing (HPC) platforms and techniques. However, the development of an integrated HPC system that makes efficient use of its elements is not trivial. Results We have developed an application termed DOVIS that uses AutoDock (version 3) as the docking engine and runs in parallel on a Linux cluster. DOVIS can efficiently dock large numbers (millions) of small molecules (ligands) to a receptor, screening 500 to 1,000 compounds per processor per day. Furthermore, in DOVIS, the docking session is fully integrated and automated in that the inputs are specified via a graphical user interface, the calculations are fully integrated with a Linux cluster queuing system for parallel processing, and the results can be visualized and queried. Conclusion DOVIS removes most of the complexities and organizational problems associated with large-scale high-throughput virtual screening, and provides a convenient and efficient solution for AutoDock users to use this software on a Linux cluster platform.

  9. Filling gaps in bacterial amino acid biosynthesis pathways with high-throughput genetics.

    Directory of Open Access Journals (Sweden)

    Morgan N Price

    2018-01-01

    Full Text Available For many bacteria with sequenced genomes, we do not understand how they synthesize some amino acids. This makes it challenging to reconstruct their metabolism, and has led to speculation that bacteria might be cross-feeding amino acids. We studied heterotrophic bacteria from 10 different genera that grow without added amino acids even though an automated tool predicts that the bacteria have gaps in their amino acid synthesis pathways. Across these bacteria, there were 11 gaps in their amino acid biosynthesis pathways that we could not fill using current knowledge. Using genome-wide mutant fitness data, we identified novel enzymes that fill 9 of the 11 gaps and hence explain the biosynthesis of methionine, threonine, serine, or histidine by bacteria from six genera. We also found that the sulfate-reducing bacterium Desulfovibrio vulgaris synthesizes homocysteine (which is a precursor to methionine) by using DUF39, NIL/ferredoxin, and COG2122 proteins, and that homoserine is not an intermediate in this pathway. Our results suggest that most free-living bacteria can likely make all 20 amino acids and illustrate how high-throughput genetics can uncover previously unknown amino acid biosynthesis genes.

  10. Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.

    Science.gov (United States)

    Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil

    2015-07-17

    In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.

  11. High-throughput protein crystallization on the World Community Grid and the GPU

    International Nuclear Information System (INIS)

    Kotseruba, Yulia; Cumbaa, Christian A; Jurisica, Igor

    2012-01-01

    We have developed CPU and GPU versions of an automated image analysis and classification system for protein crystallization trial images from the Hauptman Woodward Institute's High-Throughput Screening lab. The analysis step computes 12,375 numerical features per image. Using these features, we have trained a classifier that distinguishes 11 different crystallization outcomes, recognizing 80% of all crystals, 94% of clear drops, and 94% of precipitates. The computing requirements for this analysis system are large. The complete HWI archive of 120 million images is being processed by donated CPU cycles on World Community Grid, with a GPU phase launching in early 2012. The main computational burden of the analysis is the measurement of textural (GLCM) features within the image at multiple neighbourhoods, distances, and at multiple greyscale intensity resolutions. CPU runtime averages 4,092 seconds (single threaded) on an Intel Xeon, but only 65 seconds on an NVIDIA Tesla C2050. We report on the process of adapting the C++ code to OpenCL, optimized for multiple platforms.
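
    The main computational burden described is grey-level co-occurrence matrix (GLCM) texture analysis. The sketch below computes GLCM features at several distances and orientations with scikit-image (recent releases spell the functions graycomatrix/graycoprops; older releases use greycomatrix/greycoprops); the image patch and parameter choices are illustrative only, not the study's settings.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Hypothetical 8-bit crystallization-drop image patch
rng = np.random.default_rng(1)
patch = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

# Reduce to a coarser grey-level resolution to keep the co-occurrence matrix small
levels = 32
quantized = (patch // (256 // levels)).astype(np.uint8)

# Co-occurrence matrices at several distances and orientations
glcm = graycomatrix(quantized,
                    distances=[1, 2, 4],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=levels, symmetric=True, normed=True)

# Scalar texture descriptors per (distance, angle) pair
features = {prop: graycoprops(glcm, prop).ravel()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print({name: values.shape for name, values in features.items()})
```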

  12. High-Throughput Quantification of Bacterial-Cell Interactions Using Virtual Colony Counts

    Directory of Open Access Journals (Sweden)

    Stefanie Hoffmann

    2018-02-01

    Full Text Available The quantification of bacteria in cell culture infection models is of paramount importance for the characterization of host-pathogen interactions and the pathogenicity factors involved. The standard method for enumerating bacteria in these assays is plating of a dilution series on solid agar and counting of the resulting colony forming units (CFU). In contrast, the virtual colony count (VCC) method is a high-throughput-compatible alternative with minimized manual input. Based on the recording of quantitative growth kinetics, VCC relates the time to reach a given absorbance threshold to the initial cell count using a series of calibration curves. Here, we adapted the VCC method using the model organism Salmonella enterica sv. Typhimurium (S. Typhimurium) in combination with established cell culture-based infection models. For HeLa infections, a direct side-by-side comparison showed a good correlation of VCC with CFU counting after plating. For MDCK cells and RAW macrophages we found that VCC reproduced the expected phenotypes of different S. Typhimurium mutants. Furthermore, we demonstrated the use of VCC to test the inhibition of Salmonella invasion by the probiotic E. coli strain Nissle 1917. Taken together, VCC provides a flexible, label-free, automation-compatible methodology to quantify bacteria in in vitro infection assays.
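
    VCC rests on the roughly linear relation between the time a culture needs to reach a fixed absorbance threshold and the log of its initial cell count. The sketch below fits such a calibration line and inverts it for an unknown well; the dilution series and times are hypothetical, and the published method uses a series of calibration curves rather than this single fit.

```python
import numpy as np

def fit_vcc_calibration(log10_cfu, threshold_times_h):
    """Fit the linear relation between time-to-threshold and log10 initial CFU."""
    slope, intercept = np.polyfit(log10_cfu, threshold_times_h, 1)
    return slope, intercept

def virtual_colony_count(threshold_time_h, slope, intercept):
    """Convert a measured time-to-threshold back into an initial CFU estimate."""
    return 10 ** ((threshold_time_h - intercept) / slope)

# Hypothetical calibration: 10-fold dilutions of a plate-counted culture
log10_cfu = np.array([6, 5, 4, 3, 2], dtype=float)
times_h = np.array([3.1, 4.2, 5.3, 6.4, 7.5])  # time to reach the absorbance threshold

slope, intercept = fit_vcc_calibration(log10_cfu, times_h)
print(round(virtual_colony_count(5.8, slope, intercept)))  # CFU estimate for an unknown well
```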

  13. Corifungin, a New Drug Lead against Naegleria, Identified from a High-Throughput Screen

    Science.gov (United States)

    Debnath, Anjan; Tunac, Josefino B.; Galindo-Gómez, Silvia; Silva-Olivares, Angélica; Shibayama, Mineko

    2012-01-01

    Primary amebic meningoencephalitis (PAM) is a rapidly fatal infection caused by the free-living ameba Naegleria fowleri. The drug of choice in treating PAM is the antifungal antibiotic amphotericin B, but its use is associated with severe adverse effects. Moreover, few patients treated with amphotericin B have survived PAM. Therefore, fast-acting and efficient drugs are urgently needed for the treatment of PAM. To facilitate drug screening for this pathogen, an automated, high-throughput screening methodology was developed and validated for the closely related species Naegleria gruberi. Five kinase inhibitors and an NF-kappaB inhibitor were hits identified in primary screens of three compound libraries. Most importantly for a preclinical drug discovery pipeline, we identified corifungin, a water-soluble polyene macrolide with a higher activity against Naegleria than that of amphotericin B. Transmission electron microscopy of N. fowleri trophozoites incubated with different concentrations of corifungin showed disruption of cytoplasmic and plasma membranes and alterations in mitochondria, followed by complete lysis of amebae. In vivo efficacy of corifungin in a mouse model of PAM was confirmed by an absence of detectable amebae in the brain and 100% survival of mice for 17 days postinfection for a single daily intraperitoneal dose of 9 mg/kg of body weight given for 10 days. The same dose of amphotericin B did not reduce ameba growth, and mouse survival was compromised. Based on these results, the U.S. FDA has approved orphan drug status for corifungin for the treatment of PAM. PMID:22869574

  14. MyoScreen, a High-Throughput Phenotypic Screening Platform Enabling Muscle Drug Discovery.

    Science.gov (United States)

    Young, Joanne; Margaron, Yoran; Fernandes, Mathieu; Duchemin-Pelletier, Eve; Michaud, Joris; Flaender, Mélanie; Lorintiu, Oana; Degot, Sébastien; Poydenot, Pauline

    2018-03-01

    Despite the need for more effective drug treatments to address muscle atrophy and disease, physiologically accurate in vitro screening models and higher information content preclinical assays that aid in the discovery and development of novel therapies are lacking. To this end, MyoScreen was developed: a robust and versatile high-throughput high-content screening (HT/HCS) platform that integrates a physiologically and pharmacologically relevant micropatterned human primary skeletal muscle model with a panel of pertinent phenotypic and functional assays. MyoScreen myotubes form aligned, striated myofibers, and they show nerve-independent accumulation of acetylcholine receptors (AChRs), excitation-contraction coupling (ECC) properties characteristic of adult skeletal muscle and contraction in response to chemical stimulation. Reproducibility and sensitivity of the fully automated MyoScreen platform are highlighted in assays that quantitatively measure myogenesis, hypertrophy and atrophy, AChR clusterization, and intracellular calcium release dynamics, as well as integrating contractility data. A primary screen of 2560 compounds to identify stimulators of myofiber regeneration and repair, followed by further biological characterization of two hits, validates MyoScreen for the discovery and testing of novel therapeutics. MyoScreen is an improvement of current in vitro muscle models, enabling a more predictive screening strategy for preclinical selection of the most efficacious new chemical entities earlier in the discovery pipeline process.

  15. High-throughput single-molecule force spectroscopy for membrane proteins

    Energy Technology Data Exchange (ETDEWEB)

    Bosshart, Patrick D; Casagrande, Fabio; Frederix, Patrick L T M; Engel, Andreas; Fotiadis, Dimitrios [M E Mueller Institute for Structural Biology, Biozentrum of the University of Basel, CH-4056 Basel (Switzerland); Ratera, Merce; Palacin, Manuel [Institute for Research in Biomedicine, Barcelona Science Park, Department of Biochemistry and Molecular Biology, Faculty of Biology, University of Barcelona and Centro de Investigacion Biomedica en Red de Enfermedades Raras, E-08028 Barcelona (Spain); Bippes, Christian A; Mueller, Daniel J [BioTechnology Center, Technical University, Tatzberg 47, D-01307 Dresden (Germany)], E-mail: andreas.engel@unibas.ch, E-mail: dimitrios.fotiadis@mci.unibe.ch

    2008-09-24

    Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ∼400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ∼200 (AdiC) and ∼400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications.

  16. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

    Historically, pathologists have performed manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and the demand for increased precision of manual evaluation grow, the pathologist's assessment will increasingly include automated analyses (i.e., "digital pathology") that raise the accuracy, efficiency, and speed of diagnosis and hypothesis testing and serve as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting, with the expanding use of digital image analysis set to broaden pathology roles in research and drug development and to create new career opportunities for pathologists. © 2016 by The Author(s) 2016.

  17. Dissecting spatiotemporal biomass accumulation in barley under different water regimes using high-throughput image analysis.

    Science.gov (United States)

    Neumann, Kerstin; Klukas, Christian; Friedel, Swetlana; Rischbeck, Pablo; Chen, Dijun; Entzian, Alexander; Stein, Nils; Graner, Andreas; Kilian, Benjamin

    2015-10-01

    Phenotyping large numbers of genotypes still represents the rate-limiting step in many plant genetic experiments and in breeding. To address this issue, novel automated phenotyping technologies have been developed. We investigated, for a core set of barley cultivars, whether high-throughput image analysis can help to dissect vegetative biomass accumulation in response to two different watering regimes under semi-controlled greenhouse conditions. We found that experiments, treatments, genotypes and genotype by environment interaction (G × E) can be characterized at any time point by certain digital traits. Biomass accumulation under control and stress conditions was highly heritable. Growth model-derived maximum vegetative biomass (Kmax), inflection point (I) and regrowth rate (k) were identified as promising candidate traits for genome-wide association studies. Drought stress symptoms can be visualized, dissected and modelled. The highly heritable regrowth rate in particular, which had the biggest influence on biomass accumulation under the stress treatment, seems promising for future studies to improve drought tolerance in different crop species. A proof of concept study revealed potential correlations between digital traits obtained from pot experiments under greenhouse conditions and agronomic traits from field experiments. Overall, non-invasive, imaging-based phenotyping platforms under greenhouse conditions offer excellent possibilities for trait discovery, trait development and industrial applications. © 2015 John Wiley & Sons Ltd.
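
    Growth-model-derived parameters such as Kmax and the inflection point are typically obtained by fitting a sigmoidal curve to the time series of digital biomass. The sketch below fits a three-parameter logistic with SciPy; the model form and the simulated readings are assumptions for illustration, not the exact model used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k_max, rate, t_inflection):
    """Three-parameter logistic growth: biomass approaches k_max over time."""
    return k_max / (1.0 + np.exp(-rate * (t - t_inflection)))

# Hypothetical digital-biomass readings (arbitrary units) for one plant, every 2 days
days = np.arange(0, 40, 2, dtype=float)
biomass = logistic(days, 1500, 0.25, 20) + np.random.default_rng(2).normal(0, 30, days.size)

params, _ = curve_fit(logistic, days, biomass, p0=(biomass.max(), 0.1, days.mean()))
k_max, rate, t_inflection = params
print(f"Kmax={k_max:.0f}, rate={rate:.2f}/day, inflection at day {t_inflection:.1f}")
```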

  18. A Data Analysis Pipeline Accounting for Artifacts in Tox21 Quantitative High-Throughput Screening Assays.

    Science.gov (United States)

    Hsieh, Jui-Hua; Sedykh, Alexander; Huang, Ruili; Xia, Menghang; Tice, Raymond R

    2015-08-01

    A main goal of the U.S. Tox21 program is to profile a 10K-compound library for activity against a panel of stress-related and nuclear receptor signaling pathway assays using a quantitative high-throughput screening (qHTS) approach. However, assay artifacts, including nonreproducible signals and assay interference (e.g., autofluorescence), complicate compound activity interpretation. To address these issues, we have developed a data analysis pipeline that includes an updated signal noise-filtering/curation protocol and an assay interference flagging system. To better characterize various types of signals, we adopted a weighted version of the area under the curve (wAUC) to quantify the amount of activity across the tested concentration range in combination with the assay-dependent point-of-departure (POD) concentration. Based on the 32 Tox21 qHTS assays analyzed, we demonstrate that signal profiling using wAUC affords the best reproducibility (Pearson's r = 0.91) in comparison with the POD alone (0.82) or the AC50 (i.e., the half-maximal activity concentration; 0.81). Among the activity artifacts characterized, cytotoxicity is the major confounding factor; on average, about 8% of Tox21 compounds are affected, whereas autofluorescence affects less than 0.5%. To facilitate data evaluation, we implemented two graphical user interface applications, allowing users to rapidly evaluate the in vitro activity of Tox21 compounds. © 2015 Society for Laboratory Automation and Screening.
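
    The published pipeline uses a weighted AUC (wAUC) together with an assay-dependent POD; the exact weighting scheme is not reproduced here. The sketch below only illustrates the underlying idea on a single concentration-response curve: locate a point of departure above an assumed noise band and integrate the response over the tested log-concentration range. All values are hypothetical.

```python
import numpy as np

def point_of_departure(log_conc, response, noise_threshold):
    """First tested concentration whose response exceeds the assay noise band."""
    above = np.where(np.abs(response) > noise_threshold)[0]
    return log_conc[above[0]] if above.size else None

def response_auc(log_conc, response):
    """Trapezoid-rule area under the response curve over the tested range."""
    widths = np.diff(log_conc)
    heights = (response[1:] + response[:-1]) / 2.0
    return float(np.sum(widths * heights))

# Hypothetical 8-point concentration-response (percent activity vs log10 molar)
log_conc = np.linspace(-9, -4, 8)
response = np.array([1, 2, 3, 8, 25, 55, 80, 88], dtype=float)

print("POD (log10 M):", point_of_departure(log_conc, response, noise_threshold=5))
print("AUC:", round(response_auc(log_conc, response), 1))
```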

  19. A High-Throughput Method for Measuring Drug Residence Time Using the Transcreener ADP Assay.

    Science.gov (United States)

    Kumar, Meera; Lowery, Robert G

    2017-08-01

    Analysis of drug-target residence times during drug development can result in improved efficacy, increased therapeutic window, and reduced side effects. Residence time can be estimated as the reciprocal of the dissociation rate (koff) of an inhibitor from its target. The traditional methods for measuring koff require synthesis of labeled ligands or low-throughput label-free methods. To provide an alternative that is better suited to an automated high-throughput screening (HTS) environment, we adapted a classic "jump dilution" catalytic assay method for determination of koff values for kinase inhibitor drugs. We used the Transcreener ADP2 Kinase assay as a universal, homogenous method to monitor the recovery of kinase activity as the drugs dissociated from preformed inhibitor-kinase complexes. We measured residence times for several drugs that bind the epidermal growth factor receptor (EGFR), ABL1, and Aurora kinases and found that the rank ordering of inhibitor koff values correlated with literature values determined using ligand binding assays. Moreover, very similar results were obtained using the Transcreener assay with fluorescence polarization (FP), fluorescence intensity (FI), and time-resolved Förster resonance energy transfer (TR-FRET) detection modes. This HTS-compatible, generic assay method should facilitate the use of residence time as a parameter for compound prioritization and optimization early in kinase drug discovery programs.
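
    Residence time is reported as the reciprocal of koff, which is estimated from the recovery of kinase activity after jump dilution. The sketch below fits a single-exponential recovery, assuming rebinding is negligible after the large dilution; the time-course values are hypothetical, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, k_off):
    """Fractional recovery of kinase activity after jump dilution.

    Assumes rebinding is negligible after the large dilution, so the fraction
    of free (active) enzyme follows 1 - exp(-k_off * t).
    """
    return 1.0 - np.exp(-k_off * t)

# Hypothetical activity time course (fraction of the uninhibited control)
t_min = np.array([0, 10, 20, 40, 60, 90, 120], dtype=float)
fraction_active = np.array([0.02, 0.18, 0.33, 0.55, 0.69, 0.83, 0.90])

(k_off,), _ = curve_fit(recovery, t_min, fraction_active, p0=(0.01,))
print(f"k_off = {k_off:.3f} per min, residence time = {1.0 / k_off:.0f} min")
```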

  20. Quantifying co-cultured cell phenotypes in high-throughput using pixel-based classification.

    Science.gov (United States)

    Logan, David J; Shan, Jing; Bhatia, Sangeeta N; Carpenter, Anne E

    2016-03-01

    Biologists increasingly use co-culture systems in which two or more cell types are grown in cell culture together in order to better model cells' native microenvironments. Co-cultures are often required for cell survival or proliferation, or to maintain physiological functioning in vitro. Having two cell types co-exist in culture, however, poses several challenges, including difficulties distinguishing the two populations during analysis using automated image analysis algorithms. We previously analyzed co-cultured primary human hepatocytes and mouse fibroblasts in a high-throughput image-based chemical screen, using a combination of segmentation, measurement, and subsequent machine learning to score each cell as hepatocyte or fibroblast. While this approach was successful in counting hepatocytes for primary screening, segmentation of the fibroblast nuclei was less accurate. Here, we present an improved approach that more accurately identifies both cell types. Pixel-based machine learning (using the software ilastik) is used to seed segmentation of each cell type individually (using the software CellProfiler). This streamlined and accurate workflow can be carried out using freely available and open source software. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. High-throughput microfluidics and ultrafast optics for in vivo compound/genetic discoveries

    Science.gov (United States)

    Rohde, Christopher B.; Gilleland, Cody; Samara, Chrysanthi; Yanik, M. Fatih

    2010-02-01

    Therapeutic treatment of spinal cord injuries, brain trauma, stroke, and neurodegenerative diseases will greatly benefit from the discovery of compounds that enhance neuronal regeneration following injury. We previously demonstrated the use of femtosecond laser microsurgery to induce precise and reproducible neural injury in C. elegans, and have developed microfluidic on-chip technologies that allow automated and rapid manipulation, orientation, and non-invasive immobilization of animals for sub-cellular resolution two-photon imaging and femtosecond-laser nanosurgery. These technologies include microfluidic whole-animal sorters, as well as integrated chips containing multiple addressable incubation chambers for exposure of individual animals to compounds and sub-cellular time-lapse imaging of hundreds of animals on a single chip. Our technologies can be used for a variety of highly sophisticated in vivo high-throughput compound and genetic screens, and we performed the first in vivo screen in C. elegans for compounds enhancing neuronal regrowth following femtosecond microsurgery. The compounds identified interact with a wide variety of cellular targets, such as cytoskeletal components, vesicle trafficking, and protein kinases that enhance neuronal regeneration.

  2. High-throughput single-molecule force spectroscopy for membrane proteins

    International Nuclear Information System (INIS)

    Bosshart, Patrick D; Casagrande, Fabio; Frederix, Patrick L T M; Engel, Andreas; Fotiadis, Dimitrios; Ratera, Merce; Palacin, Manuel; Bippes, Christian A; Mueller, Daniel J

    2008-01-01

    Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ∼400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ∼200 (AdiC) and ∼400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications

  3. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    Energy Technology Data Exchange (ETDEWEB)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan [Department of Medicine, Demirci Bio-Acoustic-MEMS in Medicine (BAMM) Laboratory, Center for Biomedical Engineering, Brigham and Women's Hospital, Harvard Medical School, Boston, MA (United States); Durmus, Naside Gozde, E-mail: udemirci@rics.bwh.harvard.edu [School of Engineering and Division of Biology and Medicine, Brown University, Providence, RI (United States)

    2011-09-15

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  4. High Throughput Method to Quantify Anterior-Posterior Polarity of T-Cells and Epithelial Cells

    Directory of Open Access Journals (Sweden)

    Susan J. Marriott

    2011-11-01

    Full Text Available The virologic synapse (VS), which is formed between a virus-infected and an uninfected cell, plays a central role in the transmission of certain viruses, such as HIV and HTLV-1. During VS formation, HTLV-1-infected T-cells polarize cellular and viral proteins toward the uninfected T-cell. This polarization resembles anterior-posterior cell polarity induced by immunological synapse (IS) formation, which is more extensively characterized than VS formation and occurs when a T-cell interacts with an antigen-presenting cell. One measure of cell polarity induced by either IS or VS formation is the repositioning of the microtubule organizing center (MTOC) relative to the contact point with the interacting cell. Here we describe an automated, high-throughput system to score repositioning of the MTOC and thereby cell polarity establishment. The method rapidly and accurately calculates the angle between the MTOC and the IS for thousands of cells. We also show that the system can be adapted to score anterior-posterior polarity establishment of epithelial cells. This general approach represents a significant advancement over manual cell polarity scoring, which is subject to experimenter bias and requires more time and effort to evaluate large numbers of cells.
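
    The score described is the angle between the MTOC and the synapse relative to the cell. A minimal sketch of that geometry is given below, assuming 2D centroid, MTOC and contact-point coordinates obtained from segmented images; the coordinates are hypothetical and the published system works on thousands of cells in batch.

```python
import numpy as np

def polarity_angle_degrees(cell_centroid, mtoc, contact_point):
    """Angle (degrees) between the centroid->MTOC and centroid->contact vectors.

    Small angles indicate that the MTOC has repositioned toward the cell-cell contact.
    """
    v_mtoc = np.asarray(mtoc, float) - np.asarray(cell_centroid, float)
    v_contact = np.asarray(contact_point, float) - np.asarray(cell_centroid, float)
    cos_theta = np.dot(v_mtoc, v_contact) / (np.linalg.norm(v_mtoc) * np.linalg.norm(v_contact))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Hypothetical coordinates (pixels) from a segmented image
print(polarity_angle_degrees(cell_centroid=(100, 100), mtoc=(112, 104),
                             contact_point=(130, 95)))
```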

  5. ESSENTIALS: software for rapid analysis of high throughput transposon insertion sequencing data.

    Directory of Open Access Journals (Sweden)

    Aldert Zomer

    Full Text Available High-throughput analysis of genome-wide random transposon mutant libraries is a powerful tool for (conditionally) essential gene discovery. Recently, several next-generation sequencing approaches, e.g. Tn-seq/INseq, HITS and TraDIS, have been developed that accurately map the site of transposon insertions by mutant-specific amplification and sequence readout of DNA flanking the transposon insertion site, assigning a measure of essentiality based on the number of reads per insertion site flanking sequence or per gene. However, analysis of these large and complex datasets is hampered by the lack of an easy-to-use, automated tool for transposon insertion sequencing data. To fill this gap, we developed ESSENTIALS, an open source, web-based software tool for researchers in the genomics field utilizing transposon insertion sequencing analysis. It accurately predicts (conditionally) essential genes and offers the flexibility of using different sample normalization methods, genomic location bias correction, data preprocessing steps, appropriate statistical tests and various visualizations to examine the results, while requiring only a minimum of input and hands-on work from the researcher. We successfully applied ESSENTIALS to in-house and published Tn-seq, TraDIS and HITS datasets and we show that the various pre- and post-processing steps on the sequence reads and count data with ESSENTIALS considerably improve the sensitivity and specificity of predicted gene essentiality.

  6. A high throughput system for the preparation of single stranded templates grown in microculture.

    Science.gov (United States)

    Kolner, D E; Guilfoyle, R A; Smith, L M

    1994-01-01

    A high throughput system for the preparation of single stranded M13 sequencing templates is described. Supernatants from clones grown in 48-well plates are treated with a chaotropic agent to dissociate the phage coat protein. Using a semi-automated cell harvester, the free nucleic acid is bound to a glass fiber filter in the presence of chaotrope and then washed with ethanol by aspiration. Individual glass fiber discs are punched out on the cell harvester and dried briefly. The DNA samples are then eluted in water by centrifugation. The processing time from 96 microcultures to sequence quality templates is approximately 1 hr. Assuming the ability to sequence 400 bases per clone, a 0.5 megabase per day genome sequencing facility will require 6250 purified templates a week. Toward accomplishing this goal we have developed a procedure which is a modification of a method that uses a chaotropic agent and glass fiber filter (Kristensen et al., 1987). By exploiting the ability of a cell harvester to uniformly aspirate and wash 96 samples, a rapid system for high quality template preparation has been developed. Other semi-automated systems for template preparation have been developed using commercially available robotic workstations like the Biomek (Mardis and Roe, 1989). Although minimal human intervention is required, processing time is at least twice as long. Custom systems based on paramagnetic beads (Hawkins et al., 1992) produce DNA in insufficient quantity for direct sequencing and therefore require cycle sequencing. These systems require custom programing, have a fairly high initial cost and have not proven to be as fast as the method reported here.

  7. GiA Roots: software for the high throughput analysis of plant root system architecture

    Science.gov (United States)

    2012-01-01

    Background Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. Conclusions We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis. PMID:22834569

  8. High-throughput metal susceptibility testing of microbial biofilms.

    Science.gov (United States)

    Harrison, Joe J; Turner, Raymond J; Ceri, Howard

    2005-10-03

    Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO3^2-) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic cell E. coli JM109 to metals was time

  9. Advanced high throughput MOX fuel fabrication technology and sustainable development

    International Nuclear Information System (INIS)

    Krellmann, Juergen

    2005-01-01

    The MELOX plant in the south of France and the La Hague reprocessing plant are the two industrial facilities in charge of closing the nuclear fuel cycle in France. Started up in 1995, MELOX has since accumulated solid know-how in recycling plutonium recovered from spent uranium fuel into MOX: a fuel blend composed of both uranium and plutonium oxides. Converting recovered Pu into a proliferation-resistant material that can readily be used to power a civil nuclear reactor, MOX fabrication offers a sustainable solution to safely take advantage of plutonium's high energy content. Being the first large-capacity industrial facility dedicated to MOX fuel fabrication, MELOX distinguishes itself from the first-generation MOX plants by its high capacity (around 200 tHM versus around 40 tHM) and several unique operational features designed to improve productivity, reliability and flexibility while maintaining high safety standards. Providing an exemplary reference for high throughput MOX fabrication with 1,000 tHM produced since start-up, the unique process and technologies implemented at MELOX are currently inspiring other MOX plant construction projects (in Japan with the J-MOX plant, in the US and in Russia as part of the weapon-grade plutonium inventory reduction). Spurred by the growing international demand, MELOX has embarked upon an ambitious production development and diversification plan. Starting from an annual level of 100 tons of heavy metal (tHM), MELOX's demonstrated production capacity is continuously increasing: MELOX is now aiming for a minimum of 140 tHM by the end of 2005, with the ultimate ambition of reaching the full capacity of the plant (around 200 tHM) in the near future. With regard to its activity, MELOX also remains deeply committed to sustainable development through its consolidated involvement within the AREVA group. The French minister of Industry, on August 26th 2005, acknowledged the benefits of MOX fuel production at MELOX: 'In

  10. High Throughput, Real-time, Dual-readout Testing of Intracellular Antimicrobial Activity and Eukaryotic Cell Cytotoxicity.

    Science.gov (United States)

    Chiaraviglio, Lucius; Kang, Yoon-Suk; Kirby, James E

    2016-11-16

    Traditional measures of intracellular antimicrobial activity and eukaryotic cell cytotoxicity rely on endpoint assays. Such endpoint assays require several additional experimental steps prior to readout, such as cell lysis, colony forming unit determination, or reagent addition. When performing thousands of assays, for example, during high-throughput screening, the downstream effort required for these types of assays is considerable. Therefore, to facilitate high-throughput antimicrobial discovery, we developed a real-time assay to simultaneously identify inhibitors of intracellular bacterial growth and assess eukaryotic cell cytotoxicity. Specifically, real-time intracellular bacterial growth detection was enabled by marking bacterial screening strains with either a bacterial lux operon (1st-generation assay) or fluorescent protein reporters (2nd-generation, orthogonal assay). A non-toxic, cell membrane-impermeant, nucleic acid-binding dye was also added during initial infection of macrophages. These dyes are excluded from viable cells. However, non-viable host cells lose membrane integrity, permitting entry and fluorescent labeling of nuclear DNA (deoxyribonucleic acid). Notably, DNA binding is associated with a large increase in fluorescent quantum yield that provides a solution-based readout of host cell death. We have used this combined assay to perform a high-throughput screen in microplate format, and to assess intracellular growth and cytotoxicity by microscopy. Notably, antimicrobials may demonstrate synergy in which the combined effect of two or more antimicrobials when applied together is greater than when applied separately. Testing for in vitro synergy against intracellular pathogens is normally a prodigious task as combinatorial permutations of antibiotics at different concentrations must be assessed. However, we found that our real-time assay combined with automated, digital dispensing technology permitted facile synergy testing. Using these

  11. High-throughput detection of prostate cancer in histological sections using probabilistic pairwise Markov models.

    Science.gov (United States)

    Monaco, James P; Tomaszewski, John E; Feldman, Michael D; Hagemann, Ian; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2010-08-01

    In this paper we present a high-throughput system for detecting regions of carcinoma of the prostate (CaP) in HSs from radical prostatectomies (RPs) using probabilistic pairwise Markov models (PPMMs), a novel type of Markov random field (MRF). At diagnostic resolution a digitized HS can contain 80K x 70K pixels - far too many for current automated Gleason grading algorithms to process. However, grading can be separated into two distinct steps: (1) detecting cancerous regions and (2) then grading these regions. The detection step does not require diagnostic resolution and can be performed much more quickly. Thus, we introduce a CaP detection system capable of analyzing an entire digitized whole-mount HS (2 x 1.75 cm²) in under three minutes (on a desktop computer) while achieving a CaP detection sensitivity and specificity of 0.87 and 0.90, respectively. We obtain this high throughput by tailoring the system to analyze the HSs at low resolution (8 μm per pixel). This motivates the following algorithm: (Step 1) glands are segmented, (Step 2) the segmented glands are classified as malignant or benign, and (Step 3) the malignant glands are consolidated into continuous regions. The classification of individual glands leverages two features: gland size and the tendency for proximate glands to share the same class. The latter feature describes a spatial dependency which we model using a Markov prior. Typically, Markov priors are expressed as the product of potential functions. Unfortunately, potential functions are mathematical abstractions, and constructing priors through their selection becomes an ad hoc procedure, resulting in simplistic models such as the Potts. Addressing this problem, we introduce PPMMs which formulate priors in terms of probability density functions, allowing the creation of more sophisticated models. To demonstrate the efficacy of our CaP detection system and assess the advantages of using a PPMM prior instead of the Potts, we alternately
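
    The contrast drawn above between potential-function priors and the PPMM can be made concrete with the standard Potts prior, written here in generic textbook notation as an illustration (the PPMM formulation itself is not reproduced from the abstract):

        P(\mathbf{x}) = \frac{1}{Z(\beta)} \exp\Big( \beta \sum_{(i,j) \in \mathcal{E}} \mathbb{1}[x_i = x_j] \Big),

    where x_i \in {benign, malignant} is the class of gland i, \mathcal{E} is the set of proximate gland pairs, \beta is a single smoothing parameter, and Z(\beta) is the normalizing constant. The PPMM approach replaces the implicit potential functions with explicit probability density functions, which is what allows priors richer than this one-parameter smoothing term.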

  12. FTIR spectroscopy as a unified method for simultaneous analysis of intra- and extracellular metabolites in high-throughput screening of microbial bioprocesses.

    Science.gov (United States)

    Kosa, Gergely; Shapaval, Volha; Kohler, Achim; Zimmermann, Boris

    2017-11-13

    Analyses of substrate and metabolites are often bottleneck activities in high-throughput screening of microbial bioprocesses. We have assessed Fourier transform infrared spectroscopy (FTIR), in combination with high throughput micro-bioreactors and multivariate statistical analyses, for analysis of metabolites in high-throughput screening of microbial bioprocesses. In our previous study, we have demonstrated that high-throughput (HTS) FTIR can be used for estimating content and composition of intracellular metabolites, namely triglyceride accumulation in oleaginous filamentous fungi. As a continuation of that research, in the present study HTS FTIR was evaluated as a unified method for simultaneous quantification of intra- and extracellular metabolites and substrate consumption. As a proof of concept, a high-throughput microcultivation of oleaginous filamentous fungi was conducted in order to monitor production of citric acid (extracellular metabolite) and triglyceride lipids (intracellular metabolites), as well as consumption of glucose in the cultivation medium. HTS FTIR analyses of supernatant samples were compared with attenuated total reflection (ATR) FTIR, an established method for bioprocess monitoring. Glucose and citric acid content of growth media was quantified by high performance liquid chromatography (HPLC). Partial least squares regression (PLSR) between HPLC glucose and citric acid data and the corresponding FTIR spectral data was used to set up calibration models. PLSR results for HTS measurements were very similar to the results obtained with ATR methodology, with high coefficients of determination (0.91-0.98) and low error values (4.9-8.6%) for both glucose and citric acid estimates. The study has demonstrated that intra- and extracellular metabolites, as well as nutrients in the cultivation medium, can be monitored by a unified approach by HTS FTIR. The proof-of-concept study has validated that HTS FTIR, in combination with Duetz
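
    As a rough illustration of the calibration step, the sketch below fits a partial least squares regression between spectra and HPLC reference values. The array shapes, the number of latent components, and the use of cross-validated predictions are assumptions made for this example, not details taken from the study.

        # Minimal PLSR calibration sketch (illustrative; not the authors' exact pipeline).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        # Assumed inputs: FTIR absorbance spectra (samples x wavenumbers) and
        # HPLC reference concentrations (samples x 2: glucose, citric acid).
        rng = np.random.default_rng(0)
        spectra = rng.normal(size=(60, 800))          # placeholder spectra
        reference = rng.uniform(0, 50, size=(60, 2))  # placeholder HPLC values (g/L)

        pls = PLSRegression(n_components=8)           # component count is an assumption
        predicted = cross_val_predict(pls, spectra, reference, cv=10)

        # Report R2 and relative RMSE per analyte, analogous to the figures quoted above.
        for k, name in enumerate(["glucose", "citric acid"]):
            resid = predicted[:, k] - reference[:, k]
            r2 = 1.0 - resid.var() / reference[:, k].var()
            rel_rmse = 100.0 * np.sqrt((resid ** 2).mean()) / reference[:, k].mean()
            print(f"{name}: R2 = {r2:.2f}, relative RMSE = {rel_rmse:.1f}%")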

  13. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    Science.gov (United States)

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high-throughput analysis.

  14. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Directory of Open Access Journals (Sweden)

    Marcin Słomka

    2017-11-01

    Full Text Available High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogeneous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of an HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  15. Development and Applications of a High Throughput Genotyping Tool for Polyploid Crops: Single Nucleotide Polymorphism (SNP) Array

    Science.gov (United States)

    You, Qian; Yang, Xiping; Peng, Ze; Xu, Liping; Wang, Jianping

    2018-01-01

    Polyploid species play significant roles in agriculture and food production. Many crop species are polyploid, such as potato, wheat, strawberry, and sugarcane. Genotyping has been a daunting task for genetic studies of polyploid crops, which lags far behind that of diploid crop species. A single nucleotide polymorphism (SNP) array is considered to be a high-throughput, relatively cost-efficient and automated genotyping approach. However, there are significant challenges for SNP identification in complex, polyploid genomes, which has seriously slowed SNP discovery and array development in polyploid species. Ploidy is a significant factor impacting SNP qualities and validation rates of SNP markers in SNP arrays, which have proven to be very important tools for genetic studies and molecular breeding. In this review, we (1) discussed the pros and cons of SNP arrays in general for high-throughput genotyping, (2) presented the challenges of and solutions to SNP calling in polyploid species, (3) summarized the SNP selection criteria and considerations of SNP array design for polyploid species, (4) illustrated SNP array applications in several different polyploid crop species, then (5) discussed challenges, available software, and their accuracy comparisons for genotype calling based on SNP array data in polyploids, and finally (6) provided a series of SNP array design and genotype calling recommendations. This review presents a complete overview of SNP array development and applications in polyploid crops, which will benefit research in molecular breeding and genetics of crops with complex genomes. PMID:29467780

  16. A workflow to increase verification rate of chromosomal structural rearrangements using high-throughput next-generation sequencing.

    Science.gov (United States)

    Quek, Kelly; Nones, Katia; Patch, Ann-Marie; Fink, J Lynn; Newell, Felicity; Cloonan, Nicole; Miller, David; Fadlullah, Muhammad Z H; Kassahn, Karin; Christ, Angelika N; Bruxner, Timothy J C; Manning, Suzanne; Harliwong, Ivon; Idrisoglu, Senel; Nourse, Craig; Nourbakhsh, Ehsan; Wani, Shivangi; Steptoe, Anita; Anderson, Matthew; Holmes, Oliver; Leonard, Conrad; Taylor, Darrin; Wood, Scott; Xu, Qinying; Wilson, Peter; Biankin, Andrew V; Pearson, John V; Waddell, Nic; Grimmond, Sean M

    2014-07-01

    Somatic rearrangements, which are commonly found in human cancer genomes, contribute to the progression and maintenance of cancers. Conventionally, the verification of somatic rearrangements comprises many manual steps and Sanger sequencing. This is labor intensive when verifying a large number of rearrangements in a large cohort. To increase the verification throughput, we devised a high-throughput workflow that utilizes benchtop next-generation sequencing and in-house bioinformatics tools to link the laboratory processes. In the proposed workflow, primers are automatically designed. PCR and an optional gel electrophoresis step to confirm the somatic nature of the rearrangements are performed. PCR products of somatic events are pooled for Ion Torrent PGM and/or Illumina MiSeq sequencing, the resulting sequence reads are assembled into consensus contigs by a consensus assembler, and an automated BLAT is used to resolve the breakpoints to base level. We compared sequences and breakpoints of verified somatic rearrangements between the conventional and high-throughput workflow. The results showed that next-generation sequencing methods are comparable to conventional Sanger sequencing. The identified breakpoints obtained from next-generation sequencing methods were highly accurate and reproducible. Furthermore, the proposed workflow allows hundreds of events to be processed in a shorter time frame compared with the conventional workflow.

  17. High-Throughput Screening Platform for the Discovery of New Immunomodulator Molecules from Natural Product Extract Libraries.

    Science.gov (United States)

    Pérez Del Palacio, José; Díaz, Caridad; de la Cruz, Mercedes; Annang, Frederick; Martín, Jesús; Pérez-Victoria, Ignacio; González-Menéndez, Víctor; de Pedro, Nuria; Tormo, José R; Algieri, Francesca; Rodriguez-Nogales, Alba; Rodríguez-Cabezas, M Elena; Reyes, Fernando; Genilloud, Olga; Vicente, Francisca; Gálvez, Julio

    2016-07-01

    It is widely accepted that central nervous system inflammation and systemic inflammation play a significant role in the progression of chronic neurodegenerative diseases such as Alzheimer's disease and Parkinson's disease, neurotropic viral infections, stroke, paraneoplastic disorders, traumatic brain injury, and multiple sclerosis. Therefore, it seems reasonable to propose that the use of anti-inflammatory drugs might diminish the cumulative effects of inflammation. Indeed, some epidemiological studies suggest that sustained use of anti-inflammatory drugs may prevent or slow down the progression of neurodegenerative diseases. However, the anti-inflammatory drugs and biologics used clinically have the disadvantage of causing side effects and a high treatment cost. Alternatively, natural products offer great potential for the identification and development of bioactive lead compounds into drugs for treating inflammatory diseases with an improved safety profile. In this work, we present a validated high-throughput screening approach in 96-well plate format for the discovery of new molecules with anti-inflammatory/immunomodulatory activity. The in vitro models are based on the quantitation of nitrite levels in RAW264.7 murine macrophages and interleukin-8 in Caco-2 cells. We have used this platform in a pilot project to screen a subset of 5976 noncytotoxic crude microbial extracts from the MEDINA microbial natural product collection. To our knowledge, this is the first report of a high-throughput screening of microbial natural product extracts for the discovery of immunomodulators. © 2016 Society for Laboratory Automation and Screening.

  18. Ultra-High-Throughput Screening of Natural Product Extracts to Identify Proapoptotic Inhibitors of Bcl-2 Family Proteins.

    Science.gov (United States)

    Hassig, Christian A; Zeng, Fu-Yue; Kung, Paul; Kiankarimi, Mehrak; Kim, Sylvia; Diaz, Paul W; Zhai, Dayong; Welsh, Kate; Morshedian, Shana; Su, Ying; O'Keefe, Barry; Newman, David J; Rusman, Yudi; Kaur, Harneet; Salomon, Christine E; Brown, Susan G; Baire, Beeraiah; Michel, Andrew R; Hoye, Thomas R; Francis, Subhashree; Georg, Gunda I; Walters, Michael A; Divlianska, Daniela B; Roth, Gregory P; Wright, Amy E; Reed, John C

    2014-09-01

    Antiapoptotic Bcl-2 family proteins are validated cancer targets composed of six related proteins. From a drug discovery perspective, these are challenging targets that exert their cellular functions through protein-protein interactions (PPIs). Although several isoform-selective inhibitors have been developed using structure-based design or high-throughput screening (HTS) of synthetic chemical libraries, no large-scale screen of natural product collections has been reported. A competitive displacement fluorescence polarization (FP) screen of nearly 150,000 natural product extracts was conducted against all six antiapoptotic Bcl-2 family proteins using fluorochrome-conjugated peptide ligands that mimic functionally relevant PPIs. The screens were conducted in 1536-well format and displayed satisfactory overall HTS statistics, with Z'-factor values ranging from 0.72 to 0.83 and a hit confirmation rate between 16% and 64%. Confirmed active extracts were orthogonally tested in a luminescent assay for caspase-3/7 activation in tumor cells. Active extracts were resupplied, and effort toward the isolation of pure active components was initiated through iterative bioassay-guided fractionation. Several previously described altertoxins were isolated from a microbial source, and the pure compounds demonstrate activity in both Bcl-2 FP and caspase cellular assays. The studies demonstrate the feasibility of ultra-high-throughput screening using natural product sources and highlight some of the challenges associated with this approach. © 2014 Society for Laboratory Automation and Screening.
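
    The Z'-factor values quoted above (0.72 to 0.83) follow the standard plate-quality statistic computed from positive- and negative-control wells; a minimal helper is sketched below, with invented control readings rather than data from the screen.

        import numpy as np

        def z_prime(pos_controls, neg_controls):
            """Standard Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
            pos = np.asarray(pos_controls, dtype=float)
            neg = np.asarray(neg_controls, dtype=float)
            return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

        # Invented fluorescence-polarization control readings (mP units):
        positives = [62, 60, 63, 61, 59, 64]   # uninhibited tracer binding
        negatives = [21, 20, 23, 19, 22, 20]   # fully displaced tracer
        print(f"Z' = {z_prime(positives, negatives):.2f}")  # values above ~0.5 indicate a robust assay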

  19. High-Throughput Silencing Using the CRISPR-Cas9 System: A Review of the Benefits and Challenges.

    Science.gov (United States)

    Wade, Mark

    2015-09-01

    The clustered regularly interspaced short palindromic repeats (CRISPR)/Cas system has been seized upon with a fervor enjoyed previously by small interfering RNA (siRNA) and short hairpin RNA (shRNA) technologies and has enormous potential for high-throughput functional genomics studies. The decision to use this approach must be balanced with respect to adoption of existing platforms versus awaiting the development of more "mature" next-generation systems. Here, experience from siRNA and shRNA screening plays an important role, as issues such as targeting efficiency, pooling strategies, and off-target effects with those technologies are already framing debates in the CRISPR field. CRISPR/Cas can be exploited not only to knock out genes but also to up- or down-regulate gene transcription, in some cases in a multiplex fashion. This provides a powerful tool for studying the interaction among multiple signaling cascades in the same genetic background. Furthermore, the documented success of CRISPR/Cas-mediated gene correction (or the corollary, introduction of disease-specific mutations) provides proof of concept for the rapid generation of isogenic cell lines for high-throughput screening. In this review, the advantages and limitations of CRISPR/Cas are discussed and current and future applications are highlighted. It is envisaged that complementarities between CRISPR, siRNA, and shRNA will ensure that all three technologies remain critical to the success of future functional genomics projects. © 2015 Society for Laboratory Automation and Screening.

  20. RGB picture vegetation indexes for High-Throughput Phenotyping Platforms (HTPPs)

    Science.gov (United States)

    Kefauver, Shawn C.; El-Haddad, George; Vergara-Diaz, Omar; Araus, José Luis

    2015-10-01

    Extreme and abnormal weather events, as well as the more gradual meteorological changes associated with climate change, often coincide with not only increased abiotic risks (such as increases in temperature and decreases in precipitation), but also increased biotic risks due to environmental conditions that favor the rapid spread of crop pests and diseases. Durum wheat is, by cultivated area, the most widely grown cereal in the southern and eastern margins of the Mediterranean Basin. It is of strategic importance for Mediterranean agriculture to develop new varieties of durum wheat with greater production potential, better adaptation to increasingly adverse environmental conditions (drought) and better grain quality. Similarly, maize is the top staple crop for low-income populations in Sub-Saharan Africa and is currently suffering from the appearance of new diseases, which, together with increased abiotic stresses from climate change, are challenging the very sustainability of African societies. Current constraints in field phenotyping remain a major bottleneck for future breeding advances, but RGB-based High-Throughput Phenotyping Platforms (HTPPs) have shown promise for rapidly developing both disease-resistant and weather-resilient crops. RGB cameras have proven cost-effective in studies assessing the effect of abiotic stresses, but have yet to be fully exploited to phenotype disease resistance. Recent analyses of durum wheat in Spain have shown RGB vegetation indexes to consistently outperform multispectral indexes such as NDVI (Normalized Difference Vegetation Index) in disease and yield prediction. Towards HTPP development for breeding maize disease resistance, some of the same RGB picture vegetation indexes outperformed NDVI, with R2 values up to 0.65, compared to 0.56 for NDVI. Specifically, hue, a*, u*, and Green Area (GA), as produced by the FIJI and BreedPix open source software, performed similarly to or better than NDVI in predicting yield and disease severity conditions
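
    To make the index definitions above concrete, the sketch below derives mean hue and a Green Area (GA) fraction from a canopy RGB image. The 60-180 degree hue band used for GA follows the common BreedPix convention but should be read as an assumption here, and the image file name is a placeholder.

        # Illustrative RGB-index computation (mean hue and Green Area); not the FIJI/BreedPix code.
        import numpy as np
        from matplotlib.colors import rgb_to_hsv
        from matplotlib.image import imread

        img = imread("plot_canopy.png")             # placeholder file; PNG values are in [0, 1]
        rgb = img[..., :3]                          # drop the alpha channel if present
        hue_deg = rgb_to_hsv(rgb)[..., 0] * 360.0   # hue in degrees

        mean_hue = hue_deg.mean()
        green_area = np.mean((hue_deg >= 60) & (hue_deg <= 180))  # assumed green hue band

        print(f"mean hue = {mean_hue:.1f} deg, GA = {green_area:.2f}")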

  1. Advancing gut microbiome research using cultivation

    DEFF Research Database (Denmark)

    Sommer, Morten OA

    2015-01-01

    Culture-independent approaches have driven the field of microbiome research and illuminated intricate relationships between the gut microbiota and human health. However, definitively associating phenotypes to specific strains or elucidating physiological interactions is challenging for metagenomic approaches. Recently, a number of new approaches to gut microbiota cultivation have emerged through the integration of high-throughput phylogenetic mapping and new simplified cultivation methods. These methodologies are described along with their potential use within microbiome research. Deployment of novel...

  2. A modular segmented-flow platform for 3D cell cultivation.

    Science.gov (United States)

    Lemke, Karen; Förster, Tobias; Römer, Robert; Quade, Mandy; Wiedemeier, Stefan; Grodrian, Andreas; Gastrock, Gunter

    2015-07-10

    In vitro 3D cell cultivation promises to mimic tissue in vivo more realistically than 2D cell cultivation with respect to cell-cell and cell-matrix interactions. Therefore, a scalable 3D cultivation platform was developed. This platform, called pipe-based bioreactors (pbb), is based on segmented-flow technology: aqueous droplets are embedded in a water-immiscible carrier fluid. The droplet volumes range from 60 nL to 20 μL, and the droplets are used as bioreactors lined up in a tubing like pearls on a string. The modular automated platform consists of several modules: a fluid-management module for high-throughput droplet generation for self-assembly- or scaffold-based 3D cell cultivation, a storage module for incubation and storage, and an analysis module for monitoring cell aggregation and proliferation based on microscopy or photometry. In this report, the self-assembly of murine embryonic stem cells (mESCs) into uniformly sized embryoid bodies (EBs), cell proliferation, cell viability, and the influence on cell differentiation to cardiomyocytes are described. The integration of a dosage module for medium exchange or agent addition will enable pbb to serve as a long-term 3D cell cultivation system for studying stem cell differentiation, e.g. cardiac myogenesis, or for diagnostic and therapeutic testing in personalized medicine. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. A High Throughput Workflow Environment for Cosmological Simulations

    Science.gov (United States)

    Brandon, Erickson; Evrard, A. E.; Singh, R.; Marru, S.; Pierce, M.; Becker, M. R.; Kravtsov, A.; Busha, M. T.; Wechsler, R. H.; Ricker, P. M.; DES Simulations Working Group

    2013-01-01

    The Simulation Working Group (SimWG) of the Dark Energy Survey (DES) is collaborating with an XSEDE science gateway team to develop a distributed workflow management layer for the production of wide-area synthetic galaxy catalogs from large N-body simulations. We use the suite of tools in Airavata, an Apache Incubator project, to generate and archive multiple 10^10-particle N-body simulations of nested volumes on XSEDE supercomputers. Lightcone outputs are moved via Globus Online to SLAC, where they are transformed into multi-band, catalog-level descriptions of gravitationally lensed galaxies covering 10,000 sq deg to high redshift. We outline the method and discuss efficiency and provenance improvements brought about in N-body production. Plans to automate data movement and post-processing within the workflow are sketched, as are risks associated with working in an environment of constantly evolving services.

  4. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  5. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3.04 "Propulsion Systems," Busek Co. Inc. will develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  6. High-throughput sequencing enhanced phage display identifies peptides that bind mycobacteria

    CSIR Research Space (South Africa)

    Ngubane, NAC

    2013-11-01

    Full Text Available these clones using both random clone picking and high throughput sequencing. We demonstrate that random clone picking does not necessarily identify highly enriched clones. We further showed that the clone displaying the CPLHARLPC peptide which was identified...

  7. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3-04 "Propulsion Systems," Busek proposes to develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  8. [HTRF-based high-throughput PGE2 release prohibition model and application in discovering traditional Chinese medicine active ingredients].

    Science.gov (United States)

    Bai, Zhi-Ru; Fei, Hong-Qiang; Li, Na; Cao, Liang; Zhang, Chen-Feng; Wang, Tuan-Jie; Ding, Gang; Wang, Zhen-Zhong; Xiao, Wei

    2016-02-01

    Prostaglandin (PG) E2 is an active substance in pathological and physiological mechanisms, such as inflammation and pain. An in vitro high-throughput assay for screening inhibitors of PGE2 production is a useful method for identifying antiphlogistic and analgesic candidates. The assay was based on an LPS-induced PGE2 production model using a homogeneous time-resolved fluorescence (HTRF) PGE2 testing kit combined with liquid-handling automation and detection instruments. The critical steps, including cell density optimization and IC50 determination for a positive compound, were taken to verify the stability and sensitivity of the assay. Low intra-plate, inter-plate and day-to-day variability was observed in this 384-well, high-throughput format assay. In total, 5,121 samples were selected from the company's traditional Chinese medicine (TCM) material base library and used to screen for PGE2 inhibitors. In this model, the cell plating density was 2,000 cells per well; the average IC50 value for positive compounds was (7.3±0.1) μmol; the Z' factor for test plates was more than 0.5 and averaged 0.7. Among the 5,121 samples, 228 components exhibited a PGE2 production inhibition rate of more than 50%, and 23 components exhibited more than 80%. This model reached the expected standards of data stability and accuracy, indicating the reliability and authenticity of the screening results. The automated screening system makes the model fast and efficient, with an average daily screening capacity exceeding 14,000 data points, and provides a new approach for discovering new anti-inflammatory and analgesic drugs and for quickly screening the effective constituents of TCM at an early stage. Copyright© by the Chinese Pharmaceutical Association.
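
    The IC50 determination step amounts to fitting a four-parameter logistic curve to inhibition data; a minimal sketch, using invented placeholder concentrations and responses rather than values from the screen, might look like this:

        # Four-parameter logistic IC50 fit on invented placeholder data.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ic50, hill):
            """Increasing 4PL curve: response rises from `bottom` to `top` around `ic50`."""
            return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

        conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])       # placeholder concentrations
        inhibition = np.array([5, 12, 30, 52, 75, 90, 96])   # % inhibition of PGE2 production

        params, _ = curve_fit(four_pl, conc, inhibition, p0=[0, 100, 5, 1])
        print(f"fitted IC50 ~ {params[2]:.2f} (same units as conc)")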

  9. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    Directory of Open Access Journals (Sweden)

    Salvo-Chirnside Eliane

    2011-12-01

    Full Text Available Abstract The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96-well plate format using silica membrane-based methodology. Consistent and reproducible yields of high-quality RNA are obtained, averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96-well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure, providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  10. Application of high-throughput sequencing in understanding human oral microbiome related with health and disease

    OpenAIRE

    Chen, Hui; Jiang, Wen

    2014-01-01

    The oral microbiome is one of the most diverse habitats in the human body and is closely related to oral health and disease. As the technology develops, high-throughput sequencing has become a popular approach for oral microbial analysis. Oral bacterial profiles have been studied to explore the relationship between microbial diversity and oral diseases such as caries and periodontal disease. This review describes the application of high-throughput sequencing for characterization...

  11. High-throughput screening of antagonists for the orphan G-protein coupled receptor GPR139.

    Science.gov (United States)

    Wang, Jia; Zhu, Lin-yun; Liu, Qing; Hentzer, Morten; Smith, Garrick Paul; Wang, Ming-wei

    2015-07-01

    To discover antagonists of the orphan G-protein coupled receptor GPR139 through high-throughput screening of a collection of diverse small molecules. Calcium mobilization assays were used to identify initial hits and for subsequent confirmation studies. Five small molecule antagonists, representing 4 different scaffolds, were identified following high-throughput screening of 16 000 synthetic compounds. The findings provide important tools for further study of this orphan G-protein coupled receptor.

  12. Genetic profiles of cervical tumors by high-throughput sequencing for personalized medical care

    International Nuclear Information System (INIS)

    Muller, Etienne; Brault, Baptiste; Holmes, Allyson; Legros, Angelina; Jeannot, Emmanuelle; Campitelli, Maura; Rousselin, Antoine; Goardon, Nicolas; Frébourg, Thierry; Krieger, Sophie; Crouet, Hubert; Nicolas, Alain; Sastre, Xavier; Vaur, Dominique; Castéra, Laurent

    2015-01-01

    Cancer treatment has been facing a major evolution since the advent of targeted therapies. Building genetic profiles could predict sensitivity or resistance to these therapies and highlight disease-specific abnormalities, supporting personalized patient care. In the context of biomedical research and clinical diagnosis, our laboratory has developed an oncogenic panel comprising 226 genes and a dedicated bioinformatic pipeline to explore somatic mutations in cervical carcinomas, using high-throughput sequencing. Twenty-nine tumors were sequenced for exons within the 226 genes. The automated pipeline used includes a database and a filtration system dedicated to identifying mutations of interest and excluding false positive and germline mutations. A total of 176 mutational events were found among the 29 tumors. Our cervical tumor mutational landscape shows that most mutations are found in PIK3CA (E545K, E542K) and KRAS (G12D, G13D) and others in FBXW7 (R465C, R505G, R479Q). Mutations have also been found in ALK (V1149L, A1266T) and EGFR (T259M). These results showed that 48% of patients display at least one deleterious mutation in genes already targeted by Food and Drug Administration-approved therapies. Considering deleterious mutations, 59% of patients could be eligible for clinical trials. Sequencing hundreds of genes in a clinical context has become feasible, in terms of time and cost. In the near future, such an analysis could be part of a battery of examinations during the diagnosis and treatment of cancer, helping to detect sensitivity or resistance to targeted therapies and allowing advances towards personalized oncology.

  13. A high throughput serum bactericidal assay for antibodies to Haemophilus influenzae type b.

    Science.gov (United States)

    Kim, Han Wool; Kim, Kyung-Hyo; Kim, JiHye; Nahm, Moon H

    2016-09-05

    The protective capacities of antibodies induced with Haemophilus influenzae type b (Hib) vaccines can be directly assessed in vitro with a Hib-specific serum bactericidal assay (SBA). However, the conventional SBA requires several tedious steps including manual counting of bacterial colonies, and therefore, it is seldom used. To overcome these limitations, we have improved the conventional SBA by using frozen target bacteria and by developing an automated colony counting method based on agar plates with the chromogenic dye 2,3,5-triphenyltetrazolium chloride (TTC). These changes enabled us to analyze about 100 serum samples per day per person by SBA. When the intra- and inter-assay precisions were studied, this assay showed a coefficient of variation (CV) ranging from 1 to 38%. To monitor the long-term assay stability for assays involving different bacteria lots, complement lots, and operators, we analyzed bactericidal indices of quality control samples obtained over a 6-year period and found the CV to be about 35-50%. Lastly, our SBA results were compared with the ELISA results obtained using 90 serum samples from children. We showed that the bactericidal index correlated with IgG anti-Hib antibody levels (r = 0.84), with a bactericidal index of 10 corresponding approximately to 0.15 μg/mL IgG, the widely accepted protective level of antibody. We describe a simple high-throughput SBA for anti-Hib antibodies that would be useful for evaluating various Hib vaccines. While additional work will be needed to standardize the assay, this SBA should greatly facilitate studies of Hib vaccines.

  14. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    Full Text Available BACKGROUND: Bacterial viruses (phages) play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes, and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phages. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl-purified phage particles), and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL) or of a whole genome shotgun library (WGSL), or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  15. Estimation of immune cell densities in immune cell conglomerates: an approach for high-throughput quantification.

    Directory of Open Access Journals (Sweden)

    Niels Halama

    2009-11-01

    Full Text Available Determining the correct number of positive immune cells in immunohistological sections of colorectal cancer and other tumor entities is emerging as an important clinical predictor and therapy selector for an individual patient. This task is usually obstructed by cell conglomerates of various sizes. We here show that, at least in colorectal cancer, the inclusion of immune cell conglomerates is indispensable for estimating reliable patient cell counts. Integrating virtual microscopy and image processing principally allows the high-throughput evaluation of complete tissue slides. For such large-scale systems we demonstrate a robust quantitative image processing algorithm for the reproducible quantification of conglomerates of CD3-positive T cells in colorectal cancer. While isolated cells (28 to 80 μm²) are counted directly, the number of cells contained in a conglomerate is estimated by dividing the area of the conglomerate in thin tissue sections (≤6 μm) by the median area covered by an isolated T cell, which we determined to be 58 μm². We applied our algorithm to large numbers of CD3-positive T cell conglomerates and compared the results to cell counts obtained manually by two independent observers. While, especially for high cell counts, the manual counting showed a deviation of up to 400 cells/mm² (41% variation), algorithm-determined T cell numbers generally lay in between the manually observed cell numbers but with perfect reproducibility. In summary, we recommend our approach as an objective and robust strategy for quantifying immune cell densities in immunohistological sections which can be directly implemented into automated full slide image processing systems.
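
    The counting rule described above reduces to simple arithmetic per segmented object; a short sketch of that rule, with invented object areas for illustration, is:

        # Estimate CD3-positive T cell counts from segmented object areas (in um^2), following
        # the rule above: isolated objects (28-80 um^2) count as one cell, larger conglomerates
        # are divided by the median isolated-cell area of 58 um^2. Example areas are invented.
        object_areas_um2 = [35.0, 52.0, 78.0, 160.0, 410.0, 1240.0]

        estimated_cells = 0.0
        for area in object_areas_um2:
            if 28.0 <= area <= 80.0:
                estimated_cells += 1.0            # isolated T cell
            elif area > 80.0:
                estimated_cells += area / 58.0    # conglomerate: area-based estimate
        print(f"estimated T cells: {estimated_cells:.0f}")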

  16. High-throughput electrophysiological assays for voltage gated ion channels using SyncroPatch 768PE.

    Directory of Open Access Journals (Sweden)

    Tianbo Li

    Full Text Available Ion channels regulate a variety of physiological processes and represent an important class of drug target. Among the many methods of studying ion channel function, patch clamp electrophysiology is considered the gold standard by providing the ultimate precision and flexibility. However, its utility in ion channel drug discovery is impeded by low throughput. Additionally, characterization of endogenous ion channels in primary cells remains technically challenging. In recent years, many automated patch clamp (APC) platforms have been developed to overcome these challenges, albeit with varying throughput, data quality and success rate. In this study, we utilized the SyncroPatch 768PE, one of the latest-generation APC platforms, which conducts parallel recording from two 384-well modules with giga-seal data quality, to push both of these boundaries. By optimizing various cell patching parameters and a two-step voltage protocol, we developed a high-throughput APC assay for the voltage-gated sodium channel Nav1.7. By testing the IC50 values of a group of Nav1.7 reference compounds, this assay proved to be highly consistent with manual patch clamp (R > 0.9). In a pilot screening of 10,000 compounds, the success rate, defined by > 500 MΩ seal resistance and >500 pA peak current, was 79%. The assay was robust, with a daily throughput of ~6,000 data points and a Z' factor of 0.72. Using the same platform, we also successfully recorded the endogenous voltage-gated potassium channel Kv1.3 in primary T cells. Together, our data suggest that SyncroPatch 768PE provides a powerful platform for ion channel research and drug discovery.

  17. An Ecometric Study of Recent Microfossils using High-throughput Imaging

    Science.gov (United States)

    Elder, L. E.; Hull, P. M.; Hsiang, A. Y.; Kahanamoku, S.

    2016-02-01

    The era of Big Data has ushered in the potential to collect population-level information in a manageable time frame. Taxon-free morphological trait analysis, referred to as ecometrics, can be used to examine and compare ecological dynamics between communities with entirely different species compositions. Until recently, population-level studies of morphology were difficult because of the time-intensive task of collecting measurements. To overcome this, we implemented advances in imaging technology and created software to automate measurements. This high-throughput set of methods collects assemblage-scale data, with methods tuned to foraminiferal samples (e.g., light objects on a dark background). Methods include serial focused dark-field microscopy, custom software (AutoMorph) to batch process images, extract 2D and 3D shape parameters and frames, and implement landmark-free geometric morphometric analyses. Informatics pipelines were created to store, catalog and share images through the Yale Peabody Museum (YPM; peabody.yale.edu). We openly share software and images to enhance future data discovery. In less than a year we have generated over 25 TB of high-resolution semi-3D images for this initial study. Here, we take the first step towards developing ecometric approaches for open ocean microfossil communities with a calibration study of community shape in recent sediments. We will present an overview of the 'shape' of modern planktonic foraminiferal communities from 25 Atlantic core top samples (23 sites in the North and Equatorial Atlantic; 2 sites in the South Atlantic). In total, more than 100,000 microfossils and fragments were imaged from these sites' sediment cores, an unprecedented morphometric sample set. Correlates of community shape, including diversity, temperature, and latitude, will be discussed. These methods have also been applied to images of limpets and fish teeth to date, and have the potential to be used on modern taxa to extract meaningful

  18. High throughput testing and characterization of IR readouts and hybrids

    Science.gov (United States)

    Mandl, William J.; Bui, Khang X.; Patel, Manilal J.

    1992-07-01

    A program to upgrade the test capability for IR focal plane arrays and readouts is in progress at Aerojet. The objective of the development is to reduce the number and complexity of the steps in the test process, reduce in-socket test time and provide a simplified setup procedure for production testing. There are two areas of study in the program. One is concerned with examining multiple fabrication sources for readout circuits. The application of a commercially available automated test system for production testing and engineering characterization of readouts and focal plane arrays is discussed. The Sentry Series 80 mixed-signal tester is being fixtured for low-noise measurements and interfacing to dewars for cryogenic testing. The multiuser foreground/background operating system software has the advantage of allowing noise and other statistical calculations to be performed in the background without impeding test measurements. It also has the advantage in production of requiring no manual instrumentation setup or interconnection. The improvements in test throughput and analysis capability will be shown by adapting this class of tester, as opposed to assembling test instruments into a custom-made, computer-controlled test approach.

  19. A Three-groups Model for High Throughput Survival Screens

    Science.gov (United States)

    Shaby, Benjamin A.; Skibinski, Gaia; Ando, Michael; LaDow, Eva S.; Finkbeiner, Steven

    2016-01-01

    Summary Amyotrophic lateral sclerosis (ALS) is a neurodegenerative condition characterized by the progressive deterioration of motor neurons in the cortex and spinal cord. Using an automated robotic microscope platform that enables the longitudinal tracking of thousands of single neurons, we examine the effects of a large library of compounds on the survival of primary neurons expressing a mutation known to cause ALS. The goal of our analysis is to identify the few potentially beneficial compounds among the many assayed, the vast majority of which do not extend neuronal survival. This resembles the large-scale simultaneous inference scenario familiar from microarray analysis, but transferred to the survival analysis setting due to the novel experimental setup. We apply a three-component mixture model to censored survival times of thousands of individual neurons subjected to hundreds of different compounds. The shrinkage induced by our model significantly improves performance in simulations relative to performing treatment-wise survival analysis and subsequent multiple testing adjustment. Our analysis identified compounds that provide insight into potential novel therapeutic strategies for ALS. PMID:26821783
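
    One plausible, simplified way to write down such a mixture likelihood over censored survival times, consistent with the description above but not reproduced from the paper (whose exact hierarchical structure is not given in the abstract), is

        L(\theta) = \prod_{i=1}^{n} \sum_{k=1}^{3} \pi_k \, f_k(t_i \mid \theta_k)^{\delta_i} \, S_k(t_i \mid \theta_k)^{1 - \delta_i},

    where t_i is the observed survival or censoring time of neuron i, \delta_i indicates whether death was observed (1) or the observation was censored (0), f_k and S_k are the density and survival function of mixture component k (for example, harmful, null, and beneficial effects), and \pi_k are the mixture weights. Posterior component probabilities then play a role analogous to local false discovery rates in microarray-style simultaneous inference.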

  20. NMRbot: Python scripts enable high-throughput data collection on current Bruker BioSpin NMR spectrometers.

    Science.gov (United States)

    Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L

    2013-06-01

    To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.

  1. High-throughput automated microfluidic sample preparation for accurate microbial genomics.

    Science.gov (United States)

    Kim, Soohong; De Jonghe, Joachim; Kulesa, Anthony B; Feldman, David; Vatanen, Tommi; Bhattacharyya, Roby P; Berdy, Brittany; Gomez, James; Nolan, Jill; Epstein, Slava; Blainey, Paul C

    2017-01-27

    Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are so effective that sample preparation is now the key limiting factor. Here, we introduce a microfluidic sample preparation platform that integrates the key steps of cells-to-sequencing-library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing the sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also leveraged the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully-integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications.

  2. Automated high-throughput infusion ESI-MS with direct coupling to a microtiter plate

    Czech Academy of Sciences Publication Activity Database

    Felten, C.; Foret, František; Minarik, M.; Goetzinger, W.; Karger, B. L.

    2001-01-01

    Vol. 73, No. 7 (2001), pp. 1449-1454, ISSN 0003-2700. Institutional research plan: CEZ:AV0Z4031919. Keywords: electrospray * mass-spectrometry. Subject RIV: CB - Analytical Chemistry, Separation. Impact factor: 4.532, year: 2001

  3. High throughput microfabricated CE/ESI-MS: automated sampling from a microwell plate

    Czech Academy of Sciences Publication Activity Database

    Zhang, B.; Foret, František; Karger, B. L.

    2001-01-01

    Vol. 73, No. 11 (2001), pp. 2675-2681, ISSN 0003-2700. Institutional research plan: CEZ:AV0Z4031919. Keywords: electrospray mass-spectrometry * dead volume * electrophoresis. Subject RIV: CB - Analytical Chemistry, Separation. Impact factor: 4.532, year: 2001

  4. Automation of gene assignments to metabolic pathways using high-throughput expression data

    Directory of Open Access Journals (Sweden)

    Yona Golan

    2005-08-01

    Full Text Available Abstract Background Accurate assignment of genes to pathways is essential in order to understand the functional role of genes and to map the existing pathways in a given genome. Existing algorithms predict pathways by extrapolating experimental data in one organism to other organisms for which this data is not available. However, current systems assign all genes that belong to a specific EC family to all the pathways that contain the corresponding enzymatic reaction, and thus introduce ambiguity. Results Here we describe an algorithm for assignment of genes to cellular pathways that addresses this problem by selectively assigning specific genes to pathways. Our algorithm uses the set of experimentally elucidated metabolic pathways from MetaCyc, together with statistical models of enzyme families and expression data to assign genes to enzyme families and pathways by optimizing correlated co-expression, while minimizing conflicts due to shared assignments among pathways. Our algorithm also identifies alternative ("backup") genes and addresses the multi-domain nature of proteins. We apply our model to assign genes to pathways in the yeast genome and compare the results for genes that were assigned experimentally. Our assignments are consistent with the experimentally verified assignments and reflect characteristic properties of cellular pathways. Conclusion We present an algorithm for automatic assignment of genes to metabolic pathways. The algorithm utilizes expression data and reduces the ambiguity that characterizes assignments that are based only on EC numbers.
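
    As a toy illustration of the co-expression criterion, and not the published algorithm, one could score a candidate gene-to-pathway assignment by summing pairwise expression correlations within each pathway while penalizing genes shared across pathways:

        # Toy scoring of gene-to-pathway assignments by co-expression (illustrative only).
        from itertools import combinations
        import numpy as np

        def assignment_score(expr, pathways, shared_penalty=0.1):
            """expr: gene -> expression profile; pathways: pathway -> list of assigned genes."""
            score = 0.0
            for genes in pathways.values():
                for g1, g2 in combinations(genes, 2):
                    score += np.corrcoef(expr[g1], expr[g2])[0, 1]   # reward co-expression
            counts = {}
            for genes in pathways.values():
                for g in genes:
                    counts[g] = counts.get(g, 0) + 1
            score -= shared_penalty * sum(c - 1 for c in counts.values() if c > 1)  # penalize ambiguity
            return score

        # Example with made-up expression profiles:
        expr = {"g1": [1, 2, 3, 4], "g2": [1.1, 2.0, 2.9, 4.2], "g3": [4, 3, 2, 1]}
        print(assignment_score(expr, {"pathway_A": ["g1", "g2"], "pathway_B": ["g3"]}))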

  5. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H.

    2017-01-01

    As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues ... cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 ... day.

  6. High-throughput phenotyping of plant resistance to aphids by automated video tracking

    NARCIS (Netherlands)

    Kloth, K.J.; Broeke, ten C.J.M.; Thoen, H.P.M.; Hanhart-van den Brink, M.; Wiegers, G.L.; Krips, O.E.; Noldus, L.P.J.J.; Dicke, M.; Jongsma, M.A.

    2015-01-01

    Background: Piercing-sucking insects are major vectors of plant viruses causing significant yield losses in crops. Functional genomics of plant resistance to these insects would greatly benefit from the availability of high-throughput, quantitative phenotyping methods. Results: We have developed an

  7. Advancing a distributed multi-scale computing framework for large-scale high-throughput discovery in materials science.

    Science.gov (United States)

    Knap, J; Spear, C E; Borodin, O; Leiter, K W

    2015-10-30

    We describe the development of a large-scale high-throughput application for discovery in materials science. Our point of departure is a computational framework for distributed multi-scale computation. We augment the original framework with a specialized module whose role is to route evaluation requests needed by the high-throughput application to a collection of available computational resources. We evaluate the feasibility and performance of the resulting high-throughput computational framework by carrying out a high-throughput study of battery solvents. Our results indicate that distributed multi-scale computing, by virtue of its adaptive nature, is particularly well-suited for building high-throughput applications.
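
    A bare-bones illustration of such a routing layer, dispatching evaluation requests to a pool of local workers, is sketched below; the function names and request format are invented for the example, and the framework's actual interfaces are not described in the abstract.

        # Minimal evaluation-request router (illustrative sketch, not the framework's API).
        from concurrent.futures import ProcessPoolExecutor, as_completed

        def evaluate(request):
            """Placeholder for one multi-scale evaluation (e.g., one candidate solvent)."""
            candidate, parameters = request
            return candidate, sum(parameters)        # stand-in for the real computation

        def route(requests, max_workers=4):
            """Dispatch requests to a pool of workers and collect results as they finish."""
            results = {}
            with ProcessPoolExecutor(max_workers=max_workers) as pool:
                futures = {pool.submit(evaluate, req): req for req in requests}
                for fut in as_completed(futures):
                    candidate, value = fut.result()
                    results[candidate] = value
            return results

        if __name__ == "__main__":
            requests = [(f"candidate_{i}", [i * 0.5, i + 1.0]) for i in range(8)]
            print(route(requests))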

  8. Sample Processing Impacts the Viability and Cultivability of the Sponge Microbiome

    OpenAIRE

    Esteves, Ana I. S.; Amer, Nimra; Nguyen, Mary; Thomas, Torsten

    2016-01-01

    Sponges host complex microbial communities of recognized ecological and biotechnological importance. Extensive cultivation efforts have been made to isolate sponge bacteria, but most still elude cultivation. To identify the bottlenecks of sponge bacterial cultivation, we combined high-throughput 16S rRNA gene sequencing with a variety of cultivation media and incubation conditions. We aimed to determine the extent to which sample processing and cultivation conditions can impact bacterial viab...

  9. High-Throughput Accurate Single-Cell Screening of Euglena gracilis with Fluorescence-Assisted Optofluidic Time-Stretch Microscopy.

    Directory of Open Access Journals (Sweden)

    Baoshan Guo

    Full Text Available The development of reliable, sustainable, and economical sources of alternative fuels is an important, but challenging goal for the world. As an alternative to liquid fossil fuels, algal biofuel is expected to play a key role in alleviating global warming since algae absorb atmospheric CO2 via photosynthesis. Among various algae for fuel production, Euglena gracilis is an attractive microalgal species as it is known to produce wax ester (good for biodiesel and aviation fuel) within lipid droplets. To date, while there exist many techniques for inducing microalgal cells to produce and accumulate lipid with high efficiency, few analytical methods are available for characterizing a population of such lipid-accumulated microalgae including E. gracilis with high throughput, high accuracy, and single-cell resolution simultaneously. Here we demonstrate high-throughput, high-accuracy, single-cell screening of E. gracilis with fluorescence-assisted optofluidic time-stretch microscopy, a method that combines the strengths of microfluidic cell focusing, optical time-stretch microscopy, and fluorescence detection used in conventional flow cytometry. Specifically, our fluorescence-assisted optofluidic time-stretch microscope consists of an optical time-stretch microscope and a fluorescence analyzer on top of a hydrodynamically focusing microfluidic device and can detect fluorescence from every E. gracilis cell in a population and simultaneously obtain its image with a high throughput of 10,000 cells/s. With the multi-dimensional information acquired by the system, we classify nitrogen-sufficient (ordinary) and nitrogen-deficient (lipid-accumulated) E. gracilis cells with a low false positive rate of 1.0%. This method holds promise for evaluating cultivation techniques and selective breeding for microalgae-based biofuel production.

  10. MicroRNA from Moringa oleifera: Identification by High Throughput Sequencing and Their Potential Contribution to Plant Medicinal Value.

    Science.gov (United States)

    Pirrò, Stefano; Zanella, Letizia; Kenzo, Maurice; Montesano, Carla; Minutolo, Antonella; Potestà, Marina; Sobze, Martin Sanou; Canini, Antonella; Cirilli, Marco; Muleo, Rosario; Colizzi, Vittorio; Galgani, Andrea

    2016-01-01

    Moringa oleifera is a widespread plant with substantial nutritional and medicinal value. We postulated that microRNAs (miRNAs), which are endogenous, noncoding small RNAs regulating gene expression at the post-transcriptional level, might contribute to the medicinal properties of plants of this species after ingestion into the human body, regulating human gene expression. However, knowledge about miRNAs in Moringa is scarce. To test the hypothesis that these miRNAs have pharmacological potential, we conducted a high-throughput sequencing analysis using the Illumina platform. A total of 31,290,964 raw reads were produced from a library of small RNA isolated from M. oleifera seeds. We identified 94 conserved and two novel miRNAs that were validated by qRT-PCR assays. qRT-PCR trials on the expression of 20 Moringa miRNAs showed that they are conserved across multiple plant species, as determined by their detection in tissues of other common crop plants. In silico analyses predicted target genes for the conserved miRNAs, which in turn allowed us to relate the miRNAs to the regulation of physiological processes. Some of the predicted plant miRNAs have functional homology to their mammalian counterparts and regulated human genes when they were transfected into cell lines. To our knowledge, this is the first report of M. oleifera miRNAs discovered by high-throughput sequencing and bioinformatics analysis, and it provides new insight into a potential cross-species control of human gene expression. The widespread cultivation and consumption of M. oleifera, for nutritional and medicinal purposes, brings humans into close contact with products and extracts of this plant species. The potential for miRNA transfer should be evaluated as one possible mechanism of action to account for beneficial properties of this valuable species.

  11. High-Throughput Accurate Single-Cell Screening of Euglena gracilis with Fluorescence-Assisted Optofluidic Time-Stretch Microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Ito, Takuro; Jiang, Yiyue; Ozeki, Yasuyuki; Goda, Keisuke

    2016-01-01

    The development of reliable, sustainable, and economical sources of alternative fuels is an important, but challenging goal for the world. As an alternative to liquid fossil fuels, algal biofuel is expected to play a key role in alleviating global warming since algae absorb atmospheric CO2 via photosynthesis. Among various algae for fuel production, Euglena gracilis is an attractive microalgal species as it is known to produce wax ester (good for biodiesel and aviation fuel) within lipid droplets. To date, while there exist many techniques for inducing microalgal cells to produce and accumulate lipid with high efficiency, few analytical methods are available for characterizing a population of such lipid-accumulated microalgae including E. gracilis with high throughput, high accuracy, and single-cell resolution simultaneously. Here we demonstrate high-throughput, high-accuracy, single-cell screening of E. gracilis with fluorescence-assisted optofluidic time-stretch microscopy-a method that combines the strengths of microfluidic cell focusing, optical time-stretch microscopy, and fluorescence detection used in conventional flow cytometry. Specifically, our fluorescence-assisted optofluidic time-stretch microscope consists of an optical time-stretch microscope and a fluorescence analyzer on top of a hydrodynamically focusing microfluidic device and can detect fluorescence from every E. gracilis cell in a population and simultaneously obtain its image with a high throughput of 10,000 cells/s. With the multi-dimensional information acquired by the system, we classify nitrogen-sufficient (ordinary) and nitrogen-deficient (lipid-accumulated) E. gracilis cells with a low false positive rate of 1.0%. This method holds promise for evaluating cultivation techniques and selective breeding for microalgae-based biofuel production.

  12. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, the most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and, in this respect, comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach enabling breeders to use a genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  13. High-throughput quality control of DMSO acoustic dispensing using photometric dye methods.

    Science.gov (United States)

    Quintero, Catherine; Tran, Kristen; Szewczak, Alexander A

    2013-08-01

    One high-throughput technology gaining widespread adoption in industry and academia is acoustic liquid dispensing, in which focused sound waves eject nanoliter-sized droplets from a solution into a recipient microplate. This technology allows for direct dispensing of small-molecule compounds or reagents dissolved in DMSO, while keeping a low final concentration of organic solvent in an assay. However, acoustic dispensing presents unique quality control (QC) challenges when measuring the accuracy and precision of small dispense volumes ranging from 2.5 to 100 nL. As part of an effort to develop a rapid and cost-effective QC method for acoustic dispensing of 100% DMSO, we implemented the first high-throughput photometric dual-dye-based QC protocol in the nanoliter volume range. This technical note validates the new photometric 100% DMSO QC method and highlights its cost-effectiveness when compared with conventional low-throughput fluorimetric QC methods. In addition, a potential software solution is described for the analysis, storage, and display of accumulated high-throughput QC data, called LabGauge. As the need for high-throughput QC grows, conventional low-throughput methods can no longer meet demand. Validated high-throughput techniques, such as the dual-dye photometric method, will need to be implemented.
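
    The paper does not reproduce its calculation in this abstract, but a common dual-dye ratiometric scheme infers the transferred volume from the absorbance ratio of a dye carried in the DMSO transfer and a second dye pre-dispensed in a known aqueous backfill; the ratio cancels the unknown pathlength. The sketch below illustrates only that general principle under stated assumptions; all coefficients, concentrations, and readings are placeholders.

    ```python
    # Hedged sketch of a dual-dye ratiometric volume calculation (not the
    # authors' exact protocol): dye A travels with the DMSO transfer, dye B
    # is pre-dispensed in a known backfill volume; the absorbance ratio is
    # pathlength-independent, so the unknown transfer volume follows from
    # Beer-Lambert. All concentrations and coefficients below are placeholders.

    def dispensed_volume_nl(abs_a, abs_b, eps_a, conc_a, eps_b, conc_b, backfill_nl):
        """Return the acoustically dispensed volume in nL.

        abs_a, abs_b   -- measured absorbances of the transfer dye and backfill dye
        eps_a, eps_b   -- molar absorptivities at the two wavelengths (M^-1 cm^-1)
        conc_a, conc_b -- stock concentrations of the two dyes (M)
        backfill_nl    -- known aqueous backfill volume (nL)
        """
        return (abs_a / abs_b) * (eps_b * conc_b) / (eps_a * conc_a) * backfill_nl

    # Example: a nominal 25 nL transfer recovered from a hypothetical plate read.
    print(dispensed_volume_nl(abs_a=0.048, abs_b=0.48,
                              eps_a=80000, conc_a=5e-3,
                              eps_b=10000, conc_b=1e-3,
                              backfill_nl=10000))
    ```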

  14. High-throughput cryopreservation of spermatozoa of blue catfish (Ictalurus furcatus): Establishment of an approach for commercial-scale processing.

    Science.gov (United States)

    Hu, E; Yang, Huiping; Tiersch, Terrence R

    2011-02-01

    Hybrid catfish created by crossing of female channel catfish (Ictalurus punctatus) and male blue catfish (Ictalurus furcatus) are being used increasingly in foodfish aquaculture because of their fast growth and efficient food conversion. However, the availability of blue catfish males is limited, and their peak spawning is at a different time than that of the channel catfish. As such, cryopreservation of sperm of blue catfish could improve production of hybrid catfish, and has been studied in the laboratory and tested for feasibility in a commercial dairy bull cryopreservation facility. However, an approach for commercially relevant production of cryopreserved blue catfish sperm is still needed. The goal of this study was to develop practical approaches for commercial-scale sperm cryopreservation of blue catfish by use of an automated high-throughput system (MAPI, CryoBioSystem Co.). The objectives were to: (1) refine cooling rate and cryoprotectant concentration, and evaluate their interactions; (2) evaluate the effect of sperm concentration on cryopreservation; (3) refine cryoprotectant concentration based on the highest effective sperm concentration; (4) compare the effect of thawing samples at 20 or 40°C; (5) evaluate the fertility of thawed sperm at a research scale by fertilizing with channel catfish eggs; (6) test the post-thaw motility and fertility of sperm from individual males in a commercial setting, and (7) test for correlation of cryopreservation results with biological indices used for male evaluation. The optimal cooling rate was 5°C/min (Micro Digitcool, IMV) for high-throughput cryopreservation using CBS high-biosecurity 0.5-ml straws with 10% methanol, and a concentration of 1×10⁹ sperm/ml. There was no difference in post-thaw motility when samples were thawed at 20°C for 40 s or 40°C for 20 s. After fertilization, the percentage of neurulation (Stage V embryos) was 80±21%, and percentage of embryonic mobility (Stage VI embryo) was 51±22

  15. Single-cell-based image analysis of high-throughput cell array screens for quantification of viral infection.

    Science.gov (United States)

    Matula, Petr; Kumar, Anil; Wörz, Ilka; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl

    2009-04-01

    The identification of eukaryotic genes involved in virus entry and replication is important for understanding viral infection. Our goal is to develop a siRNA-based screening system using cell arrays and high-throughput (HT) fluorescence microscopy. A central issue is efficient, robust, and automated single-cell-based analysis of massive image datasets. We have developed an image analysis approach that comprises (i) a novel, gradient-based thresholding scheme for cell nuclei segmentation which does not require subsequent postprocessing steps for separation of clustered nuclei, (ii) quantification of the virus signal in the neighborhood of cell nuclei, (iii) localization of regions with transfected cells by combining model-based circle fitting and grid fitting, (iv) cell classification as infected or noninfected, and (v) image quality control (e.g., identification of out-of-focus images). We compared the results of our nucleus segmentation approach with a previously developed scheme of adaptive thresholding with subsequent separation of nuclear clusters. Our approach, which does not require a postprocessing step for the separation of nuclear clusters, correctly segmented 97.1% of the nuclei, whereas the previous scheme achieved 95.8%. Using our algorithm for the detection of out-of-focus images, we obtained a high discrimination power of 99.4%. Our overall approach has been applied to more than 55,000 images of cells infected by either hepatitis C or dengue virus. Reduced infection rates were correctly detected in positive siRNA controls, as well as for siRNAs targeting, for example, cellular genes involved in viral infection. Our image analysis approach allows for the automatic and accurate determination of changes in viral infection based on high-throughput single-cell-based siRNA cell array imaging experiments. (c) 2008 International Society for Advancement of Cytometry.
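
    The gradient-based thresholding scheme is specific to this work and is not reproduced here; the sketch below shows only a generic nucleus segmentation and counting step (Gaussian smoothing, Otsu threshold, size filtering) of the kind such a pipeline builds on, assuming scikit-image is available.

    ```python
    # Generic nucleus-segmentation sketch (scikit-image), standing in for the
    # paper-specific gradient-based scheme described above.
    import numpy as np
    from skimage import filters, measure, morphology

    def count_nuclei(nuclear_stain_image, min_area=50):
        """Segment and count nuclei in a single-channel nuclear-stain image."""
        smoothed = filters.gaussian(nuclear_stain_image, sigma=2)
        mask = smoothed > filters.threshold_otsu(smoothed)
        mask = morphology.remove_small_objects(mask, min_size=min_area)
        labels = measure.label(mask)
        return labels, int(labels.max())

    if __name__ == "__main__":
        img = np.random.rand(256, 256)   # stand-in for a DAPI image
        _, n = count_nuclei(img)
        print(f"{n} nuclei detected")
    ```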

  16. A novel high throughput assay for anthelmintic drug screening and resistance diagnosis by real-time monitoring of parasite motility.

    Directory of Open Access Journals (Sweden)

    Michael J Smout

    Full Text Available BACKGROUND: Helminth parasites cause untold morbidity and mortality to billions of people and livestock. Anthelmintic drugs are available but resistance is a problem in livestock parasites, and is a looming threat for human helminths. Testing the efficacy of available anthelmintic drugs and development of new drugs is hindered by the lack of objective high-throughput screening methods. Currently, drug effect is assessed by observing motility or development of parasites using laborious, subjective, low-throughput methods. METHODOLOGY/PRINCIPAL FINDINGS: Here we describe a novel application for a real-time cell monitoring device (xCELLigence) that can simply and objectively assess anthelmintic effects by measuring parasite motility in real time in a fully automated high-throughput fashion. We quantitatively assessed motility and determined real-time IC50 values of different anthelmintic drugs against several developmental stages of major helminth pathogens of humans and livestock, including larval Haemonchus contortus and Strongyloides ratti, and adult hookworms and blood flukes. The assay enabled quantification of the onset of egg hatching in real time, and the impact of drugs on hatch rate, as well as discriminating between the effects of drugs on motility of drug-susceptible and -resistant isolates of H. contortus. CONCLUSIONS/SIGNIFICANCE: Our findings indicate that this technique will be suitable for discovery and development of new anthelmintic drugs as well as for detection of phenotypic resistance to existing drugs for the majority of helminths and other pathogens where motility is a measure of pathogen viability. The method is also amenable to use for other purposes where motility is assessed, such as gene silencing or antibody-mediated killing.
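
    The xCELLigence readout itself is proprietary, but the downstream IC50 estimation the abstract refers to is a standard dose-response fit. A minimal sketch, assuming a normalised motility index and using invented data:

    ```python
    # Fit a four-parameter logistic curve to a normalised motility index to
    # recover an IC50; the concentrations and motility values are made up.
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, bottom, top, ic50, hill):
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

    conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])          # µM, hypothetical
    motility = np.array([0.98, 0.97, 0.93, 0.80, 0.55, 0.30, 0.12, 0.05])

    params, _ = curve_fit(four_pl, conc, motility, p0=[0.0, 1.0, 1.0, 1.0])
    print(f"IC50 ≈ {params[2]:.2f} µM, Hill slope ≈ {params[3]:.2f}")
    ```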

  17. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Full Text Available Abstract Background Protein-protein interaction data used in the creation or prediction of molecular networks is usually obtained from large scale or high-throughput experiments. This experimental data is liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the number of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68% respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that include sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
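
    The reliability score described above combines per-feature evidence into a single likelihood ratio; under a naive-Bayes-style independence assumption this reduces to multiplying the individual ratios. A minimal sketch with placeholder values (not the paper's numbers):

    ```python
    # Combine per-feature likelihood ratios L_i = P(feature_i | true interaction)
    #                                             / P(feature_i | false interaction)
    # under an independence assumption; values below are placeholders.
    def combined_likelihood_ratio(feature_lrs):
        lr = 1.0
        for value in feature_lrs.values():
            lr *= value
        return lr

    interaction = {
        "interacting_pfam_domains": 8.0,   # hypothetical LRs for one yeast pair
        "shared_go_annotation": 3.5,
        "sequence_homology_support": 2.0,
    }
    print(combined_likelihood_ratio(interaction))   # 56.0 -> likely a true interaction
    ```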

  18. High-Throughput Combinatorial Development of High-Entropy Alloys For Light-Weight Structural Applications

    Energy Technology Data Exchange (ETDEWEB)

    Van Duren, Jeroen K; Koch, Carl; Luo, Alan; Sample, Vivek; Sachdev, Anil

    2017-12-29

    on Al-Cr-Fe-Ni, shows compressive strain >10% and specific compressive yield strength of 229 MPa x cc/g, yet does not show ductility in tensile tests due to cleavage. When replacing Cr in Al-Cr-Fe-based 4- and 5-element LDHEA with Mn, hardness drops 2x. Combined with compression test results, including those on the ternaries Al-Cr-Fe and Al-Mn-Fe, these data suggest that Al-Mn-Fe-based LDHEA are still worth pursuing. These initial results only represent one compressive stress-strain curve per composition without any property optimization. As such, reproducibility needs to be followed by optimization to show their full potential. When including Li, Mg, and Zn, single-phase Li-Mg-Al-Ti-Zn LDHEA has been found with a specific ultimate compressive strength of 289 MPa x cc/g. Al-Ti-Mn-Zn showed a specific ultimate compressive strength of 73 MPa x cc/g. These initial results after hot isostatic pressing (HIP) of the ball-milled powders represent the lower end of what is possible, since no secondary processing (e.g. extrusion) has been performed to optimize strength and ductility. Compositions for multi-phase (e.g. dual-phase) LDHEA were identified largely by automated searches through CALPHAD databases, while screening for large face-centered-cubic (FCC) volume fractions, followed by experimental verification. This resulted in several new alloys. Li-Mg-Al-Mn-Fe and Mg-Mn-Fe-Co ball-milled powders upon HIP show specific ultimate compressive strengths of 198 MPa x cc/g and 45 MPa x cc/g, respectively. Several malleable quaternary Al-Zn-based alloys have been found upon arc/induction melting, yet with limited specific compressive yield strength (<75 MPa x cc/g). These initial results are all without any optimization for strength and/or ductility. High-throughput experimentation allowed us to triple the existing experimental HEA database as published in the past 10 years in less than 2 years, at a rate 10x higher than previous methods. Furthermore, we showed that high-throughput

  19. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    Science.gov (United States)

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially-processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  20. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide...... the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. Availability and implementation: We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template...
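
    The abstract names the three key sequence properties (aromaticity, hydropathy, isoelectric point); a minimal sketch of a classifier built on those features, assuming Biopython and scikit-learn are available and using toy sequences rather than the HPA data, might look like this:

    ```python
    # Toy feature-based expression/solubility classifier; sequences and labels
    # are invented and the model is illustrative only.
    from Bio.SeqUtils.ProtParam import ProteinAnalysis
    from sklearn.linear_model import LogisticRegression

    def features(seq):
        pa = ProteinAnalysis(seq)
        return [pa.aromaticity(), pa.gravy(), pa.isoelectric_point()]

    train_seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "MWWFWWPLLLLLLVVVAAA",
                  "MDEEDDEEKKKRRRQQQNNN", "MAAAGGGSSSTTTPPPKKK"]
    labels = [1, 0, 1, 1]   # 1 = expressed/soluble, 0 = not (toy labels)

    model = LogisticRegression().fit([features(s) for s in train_seqs], labels)
    print(model.predict([features("MKKLLPTAAAGLLLLAAQPAMA")]))
    ```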

  1. A platform for high-throughput screening of DNA-encoded catalyst libraries in organic solvents.

    Science.gov (United States)

    Hook, K Delaney; Chambers, John T; Hili, Ryan

    2017-10-01

    We have developed a novel high-throughput screening platform for the discovery of small-molecule catalysts for bond-forming reactions. The method employs an in vitro selection for bond-formation using amphiphilic DNA-encoded small molecules charged with reaction substrate, which enables selections to be conducted in a variety of organic or aqueous solvents. Using the amine-catalysed aldol reaction as a catalytic model and high-throughput DNA sequencing as a selection read-out, we demonstrate the 1200-fold enrichment of a known aldol catalyst from a library of 16.7-million uncompetitive library members.
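
    The enrichment quoted above is derived from sequencing read counts before and after selection; a minimal sketch of that calculation with invented counts (chosen only so the example returns a 1200-fold result):

    ```python
    # Fold enrichment of one library member = post-selection read fraction
    # divided by pre-selection read fraction. Counts below are invented.
    def fold_enrichment(reads_after, total_after, reads_before, total_before):
        return (reads_after / total_after) / (reads_before / total_before)

    # e.g. a catalyst-encoding barcode rising from 3 in 5e6 reads to 3600 in 5e6
    print(fold_enrichment(3600, 5_000_000, 3, 5_000_000))   # -> 1200.0
    ```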

  2. Construction of small RNA cDNA libraries for high-throughput sequencing.

    Science.gov (United States)

    Lu, Cheng; Shedge, Vikas

    2011-01-01

    Small RNAs (smRNAs) play an essential role in virtually every aspect of growth and development, by regulating gene expression at the post-transcriptional and/or transcriptional level. New high-throughput sequencing technology allows for a comprehensive coverage of smRNAs in any given biological sample, and has been widely used for profiling smRNA populations in various developmental stages, tissue and cell types, or normal and disease states. In this article, we describe the method used in our laboratory to construct smRNA cDNA libraries for high-throughput sequencing.

  3. Quantitative high-throughput screen identifies inhibitors of the Schistosoma mansoni redox cascade.

    Directory of Open Access Journals (Sweden)

    Anton Simeonov

    2008-01-01

    Full Text Available Schistosomiasis is a tropical disease associated with high morbidity and mortality, currently affecting over 200 million people worldwide. Praziquantel is the only drug used to treat the disease, and with its increased use the probability of developing drug resistance has grown significantly. The Schistosoma parasites can survive for up to decades in the human host due in part to a unique set of antioxidant enzymes that continuously degrade the reactive oxygen species produced by the host's innate immune response. Two principal components of this defense system have been recently identified in S. mansoni as thioredoxin/glutathione reductase (TGR) and peroxiredoxin (Prx), and as such these enzymes present attractive new targets for anti-schistosomiasis drug development. Inhibition of TGR/Prx activity was screened in a dual-enzyme format with reducing equivalents being transferred from NADPH to glutathione via a TGR-catalyzed reaction and then to hydrogen peroxide via a Prx-catalyzed step. A fully automated quantitative high-throughput (qHTS) experiment was performed against a collection of 71,028 compounds tested as 7- to 15-point concentration series at 5 microL reaction volume in 1536-well plate format. In order to generate a robust data set and to minimize the effect of compound autofluorescence, apparent reaction rates derived from a kinetic read were utilized instead of end-point measurements. Actives identified from the screen, along with previously untested analogues, were subjected to confirmatory experiments using the screening assay and subsequently against the individual targets in secondary assays. Several novel active series were identified which inhibited TGR at a range of potencies, with IC50s ranging from micromolar to the assay response limit (approximately 25 nM). This is, to our knowledge, the first report of a large-scale HTS to identify lead compounds for a helminthic disease, and provides a paradigm that can be used to jump
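
    Using apparent rates from a kinetic read rather than an end-point amounts to fitting a slope per well before dose-response analysis; a minimal sketch with toy data:

    ```python
    # Derive an apparent reaction rate from a kinetic read: fit a line to
    # signal vs. time for each well and keep the slope. Toy data only.
    import numpy as np

    def apparent_rate(timepoints_s, signal):
        """Return the initial-rate estimate (signal units per second)."""
        slope, _intercept = np.polyfit(timepoints_s, signal, 1)
        return slope

    t = np.array([0, 30, 60, 90, 120, 150])
    well = np.array([100, 160, 222, 278, 341, 398])   # hypothetical kinetic read
    print(f"apparent rate ≈ {apparent_rate(t, well):.2f} units/s")
    ```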

  4. A high-throughput, in-vitro assay for Bacillus thuringiensis insecticidal proteins.

    Science.gov (United States)

    Izumi Willcoxon, Michi; Dennis, Jaclyn R; Lau, Sabina I; Xie, Weiping; You, You; Leng, Song; Fong, Ryan C; Yamamoto, Takashi

    2016-01-10

    A high-throughput, in-vitro assay for Bacillus thuringiensis (Bt) insecticidal proteins designated as Cry was developed and evaluated for screening a large number of Cry protein variants produced by DNA shuffling. This automation-amenable assay exploits an insect cell line expressing a single receptor of Bt Cry proteins. The Cry toxin used to develop this assay is a variant of the Cry1Ab protein called IP1-88, which was produced previously by DNA shuffling. Cell mortality caused by the activated Bt Cry toxin was determined by chemical cell viability assay in 96/384-well microtiter plates utilizing CellTiter 96(®) obtained from Promega. A widely-accepted mode-of-action theory of certain Bt Cry proteins suggests that the activated toxin binds to one or more receptors and forms a pore through the insect gut epithelial cell apical membrane. A number of insect proteins such as cadherin-like protein (Cad), aminopeptidase-N (APN), alkaline phosphatase (ALP) and ABC transporter (ABCC) have been identified as the receptors of Bt Cry toxins. In this study, Bt Cry toxin receptors Ostrinia nubilalis (European corn borer) cadherin-like protein (On-Cad) and aminopeptidase-N 1 and 3 (On-APN1, On-APN3) and Spodoptera frugiperda (fall armyworm) cadherin-like protein (Sf-Cad) were cloned in an insect cell line, Sf21, and a mammalian cell line, Expi293F. It was observed by ligand blotting and immunofluorescence microscopy that trypsin-activated IP1-88 bound to On-Cad and On-APN1, but not Sf-Cad or On-APN3. In contrast, IP1-88 bound only to APN1 in BBMV (Brush Border Membrane Vesicles) prepared from the third and fourth-instar O. nubilalis larval midgut. The sensitivity of the recombinant cells to the toxin was then tested. IP1-88 showed no toxicity to non-recombinant Sf21 and Expi293F. Toxicity was observed only when the On-Cad gene was cloned and expressed. Sf-Cad and On-APN1 were not able to make those cells sensitive to the toxin. Since the expression of On-Cad alone was

  5. Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries.

    Science.gov (United States)

    Haghighattalab, Atena; González Pérez, Lorena; Mondal, Suchismita; Singh, Daljit; Schinstock, Dale; Rutkoski, Jessica; Ortiz-Monasterio, Ivan; Singh, Ravi Prakash; Goodin, Douglas; Poland, Jesse

    2016-01-01

    Low cost unmanned aerial systems (UAS) have great potential for rapid proximal measurements of plants in agriculture. In the context of plant breeding and genetics, current approaches for phenotyping a large number of breeding lines under field conditions require substantial investments in time, cost, and labor. For field-based high-throughput phenotyping (HTP), UAS platforms can provide high-resolution measurements for small plot research, while enabling the rapid assessment of tens-of-thousands of field plots. The objective of this study was to complete a baseline assessment of the utility of UAS for assessment of field trials as commonly implemented in wheat breeding programs. We developed a semi-automated image-processing pipeline to extract plot level data from UAS imagery. The image dataset was processed using a photogrammetric pipeline based on image orientation and radiometric calibration to produce orthomosaic images. We also examined the relationships between vegetation indices (VIs) extracted from high spatial resolution multispectral imagery collected with two different UAS systems (eBee Ag carrying MultiSpec 4C camera, and IRIS+ quadcopter carrying modified NIR Canon S100) and ground truth spectral data from a hand-held spectroradiometer. We found good correlation between the VIs obtained from UAS platforms and ground-truth measurements and observed high broad-sense heritability for VIs. We determined that radiometric calibration methods developed for satellite imagery significantly improved the precision of VIs from the UAS. We observed VIs extracted from calibrated images of the Canon S100 had a significantly higher correlation to the spectroradiometer (r = 0.76) than VIs from the MultiSpec 4C camera (r = 0.64). Their correlation to spectroradiometer readings was as high as or higher than repeated measurements with the spectroradiometer per se. The approaches described here for UAS imaging and extraction of proximal sensing data enable collection of HTP
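
    The plot-level vegetation-index step the abstract refers to reduces, in its simplest form, to computing an index such as NDVI per plot and correlating it with ground-truth spectra. The sketch below shows only that reduced form with invented values; the paper's photogrammetric and radiometric calibration pipeline is not reproduced.

    ```python
    # Compute NDVI per plot from mean red/NIR reflectance and correlate with
    # spectroradiometer ground truth; all arrays are invented.
    import numpy as np

    def ndvi(nir, red):
        return (nir - red) / (nir + red)

    plot_nir = np.array([0.52, 0.47, 0.61, 0.39, 0.55])     # plot-mean reflectance
    plot_red = np.array([0.08, 0.10, 0.06, 0.14, 0.07])
    uas_ndvi = ndvi(plot_nir, plot_red)

    ground_ndvi = np.array([0.72, 0.63, 0.80, 0.45, 0.76])  # hand-held readings
    r = np.corrcoef(uas_ndvi, ground_ndvi)[0, 1]
    print(f"Pearson r = {r:.2f}")
    ```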

  6. Towards high-throughput molecular detection of Plasmodium: new approaches and molecular markers

    Directory of Open Access Journals (Sweden)

    Rogier Christophe

    2009-04-01

    molecular methods. Dot18S and CYTB, the new methods reported herein, are highly sensitive, allow parasite DNA extraction as well as genus- and species-specific diagnosis of several hundred samples, and are amenable to high-throughput scaling up for larger sample sizes. Such methods provide novel information on malaria prevalence and epidemiology and are suited for active malaria detection. The usefulness of such sensitive malaria diagnosis tools, especially in low endemic areas where eradication plans are now on-going, is discussed in this paper.

  7. Ensembler: Enabling High-Throughput Molecular Simulations at the Superfamily Scale.

    Directory of Open Access Journals (Sweden)

    Daniel L Parton

    2016-06-01

    Full Text Available The rapidly expanding body of available genomic and protein structural data provides a rich resource for understanding protein dynamics with biomolecular simulation. While computational infrastructure has grown rapidly, simulations on an omics scale are not yet widespread, primarily because software infrastructure to enable simulations at this scale has not kept pace. It should now be possible to study protein dynamics across entire (super)families, exploiting both available structural biology data and conformational similarities across homologous proteins. Here, we present a new tool for enabling high-throughput simulation in the genomics era. Ensembler takes any set of sequences-from a single sequence to an entire superfamily-and shepherds them through various stages of modeling and refinement to produce simulation-ready structures. This includes comparative modeling to all relevant PDB structures (which may span multiple conformational states of interest), reconstruction of missing loops, addition of missing atoms, culling of nearly identical structures, assignment of appropriate protonation states, solvation in explicit solvent, and refinement and filtering with molecular simulation to ensure stable simulation. The output of this pipeline is an ensemble of structures ready for subsequent molecular simulations using computer clusters, supercomputers, or distributed computing projects like Folding@home. Ensembler thus automates much of the time-consuming process of preparing protein models suitable for simulation, while allowing scalability up to entire superfamilies. A particular advantage of this approach can be found in the construction of kinetic models of conformational dynamics-such as Markov state models (MSMs)-which benefit from a diverse array of initial configurations that span the accessible conformational states to aid sampling. We demonstrate the power of this approach by constructing models for all catalytic domains in the human

  8. WE-E-BRE-07: High-Throughput Mapping of Proton Biologic Effect

    Energy Technology Data Exchange (ETDEWEB)

    Bronk, L; Guan, F; Kerr, M; Dinh, J; Titt, U; Mirkovic, D; Lin, S; Mohan, R; Grosshans, D [UT MD Anderson Cancer Center, Houston, TX (United States)

    2014-06-15

    Purpose: To systematically relate the relative biological effectiveness (RBE) of proton therapy to beam linear energy transfer (LET) and dose. Methods: Using a custom irradiation apparatus previously characterized by our group, H460 NSCLC cells were irradiated using a clinical 80 MeV spot scanning proton beam. Utilizing this system allowed for high-throughput clonogenic assays performed in 96-well tissue culture plates as opposed to the traditional 6-well technique. Each column in the 96-well plate received a set LET-dose combination. By altering the total number of dose repaintings, numerous dose-LET configurations were examined to effectively generate surviving fraction (SF) data over the entire Bragg peak. The clonogenic assay was performed post-irradiation using an INCell Analyzer for colony quantification. SF data were fit to the linear-quadratic model for analysis. Results: Irradiation with increasing LETs resulted in decreased cell survival largely independent of dose. A significant correlation between LET and SF was identified by two-way ANOVA and the extra sum-of-squares F test. This trend was obscured at the lower LET values in the plateau region of the Bragg peak; however, it was clear for LET values at and beyond the Bragg peak. Data fits revealed the SF at a dose of 2 Gy (SF2) to be 0.48 for the lowest tested LET (1.55 keV/μm), 0.47 at the end of the plateau region (4.74 keV/μm) and 0.33 for protons at the Bragg peak (10.35 keV/μm). Beyond the Bragg peak we measured SF2s of 0.16 for 15.01 keV/μm, 0.02 for 16.79 keV/μm, and 0.004 for 18.06 keV/μm. Conclusion: We have shown that our methodology enables high-content automated screening for proton irradiations over a range of LETs. The observed decrease in cellular SF in high LET regions confirms an increased RBE of the radiation and suggests further evaluation of proton RBE values is necessary to optimize clinical outcomes. Rosalie B. Hite Graduate Fellowship in Cancer Research, NIH Program Project Grant P01CA021239.
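
    The linear-quadratic fit and SF2 values reported above follow from the standard model SF(D) = exp(-(αD + βD²)); a minimal fitting sketch with invented survival data:

    ```python
    # Fit the linear-quadratic model to clonogenic survival data and report SF2;
    # the dose/survival points below are invented, not the measured H460 values.
    import numpy as np
    from scipy.optimize import curve_fit

    def lq_model(dose, alpha, beta):
        return np.exp(-(alpha * dose + beta * dose ** 2))

    dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0])          # Gy
    sf = np.array([1.00, 0.72, 0.48, 0.18, 0.05])       # hypothetical survival

    (alpha, beta), _ = curve_fit(lq_model, dose, sf, p0=[0.3, 0.03])
    print(f"alpha={alpha:.3f}/Gy, beta={beta:.3f}/Gy^2, SF2={lq_model(2.0, alpha, beta):.2f}")
    ```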

  9. A high-throughput phenotypic screen identifies clofazimine as a potential treatment for cryptosporidiosis.

    Directory of Open Access Journals (Sweden)

    Melissa S Love

    2017-02-01

    Full Text Available Cryptosporidiosis has emerged as a leading cause of non-viral diarrhea in children under five years of age in the developing world, yet the current standard of care to treat Cryptosporidium infections, nitazoxanide, demonstrates limited and immune-dependent efficacy. Given the lack of treatments with universal efficacy, drug discovery efforts against cryptosporidiosis are necessary to find therapeutics more efficacious than the standard of care. To date, cryptosporidiosis drug discovery efforts have been limited to a few targeted mechanisms in the parasite and whole cell phenotypic screens against small, focused collections of compounds. Using a previous screen as a basis, we initiated the largest known drug discovery effort to identify novel anticryptosporidial agents. A high-content imaging assay for inhibitors of Cryptosporidium parvum proliferation within a human intestinal epithelial cell line was miniaturized and automated to enable high-throughput phenotypic screening against a large, diverse library of small molecules. A screen of 78,942 compounds identified 12 anticryptosporidial hits with sub-micromolar activity, including clofazimine, an FDA-approved drug for the treatment of leprosy, which demonstrated potent and selective in vitro activity (EC50 = 15 nM) against C. parvum. Clofazimine also displayed activity against C. hominis-the other most clinically-relevant species of Cryptosporidium. Importantly, clofazimine is known to accumulate within epithelial cells of the small intestine, the primary site of Cryptosporidium infection. In a mouse model of acute cryptosporidiosis, a once daily dosage regimen for three consecutive days or a single high dose resulted in reduction of oocyst shedding below the limit detectable by flow cytometry. Recently, a target product profile (TPP) for an anticryptosporidial compound was proposed by Huston et al. and highlights the need for a short dosing regimen (< 7 days) and formulations for children < 2

  10. Fully Automated On-Chip Imaging Flow Cytometry System with Disposable Contamination-Free Plastic Re-Cultivation Chip

    Directory of Open Access Journals (Sweden)

    Tomoyuki Kaneko

    2011-06-01

    Full Text Available We have developed a novel imaging cytometry system using a poly(methyl methacrylate) (PMMA)-based microfluidic chip. The system was contamination-free, because sample suspensions contacted only the flammable PMMA chip and no other component of the system. The transparency and low fluorescence of PMMA were suitable for microscopic imaging of cells flowing through microchannels on the chip. Sample particles flowing through microchannels on the chip were discriminated by an image-recognition unit with a high-speed camera in real time at the rate of 200 events/s, e.g., microparticles 2.5 μm and 3.0 μm in diameter were differentiated with an error rate of less than 2%. Desired cells were separated automatically from other cells by electrophoretic or dielectrophoretic force one by one with a separation efficiency of 90%. Cells in suspension with fluorescent dye were separated using the same kind of microfluidic chip. A sample of 5 μL with 1 × 10⁶ particles/mL was processed within 40 min. Separated cells could be cultured on the microfluidic chip without contamination. The whole operation of sample handling was automated using a 3D micropipetting system. These results showed that the novel imaging flow cytometry system is practically applicable for biological research and clinical diagnostics.
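
    The size discrimination performed by the image-recognition unit can be illustrated, in a much-simplified offline form, by thresholding a frame and classifying labelled objects by equivalent diameter; the sketch below assumes scikit-image and is not the system's real-time implementation.

    ```python
    # Generic size-based discrimination of particles in a bright-field frame;
    # the pixel size and cutoff are hypothetical.
    import numpy as np
    from skimage import filters, measure

    def classify_particles(frame, pixel_size_um, cutoff_um=2.75):
        """Label particles and split them by equivalent diameter."""
        mask = frame < filters.threshold_otsu(frame)   # dark particles, bright field
        labels = measure.label(mask)
        small, large = [], []
        for region in measure.regionprops(labels):
            d_um = region.equivalent_diameter * pixel_size_um
            (small if d_um < cutoff_um else large).append(region.label)
        return small, large

    if __name__ == "__main__":
        frame = np.ones((128, 128))
        frame[60:65, 60:65] = 0.2                      # one fake dark particle
        print(classify_particles(frame, pixel_size_um=0.5))
    ```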

  11. Multiplexing spheroid volume, resazurin and acid phosphatase viability assays for high-throughput screening of tumour spheroids and stem cell neurospheres.

    Directory of Open Access Journals (Sweden)

    Delyan P Ivanov

    Full Text Available Three-dimensional cell culture has many advantages over monolayer cultures, and spheroids have been hailed as the best current representation of small avascular tumours in vitro. However, their adoption in regular screening programs has been hindered by uneven culture growth, poor reproducibility and lack of high-throughput analysis methods for 3D. The objective of this study was to develop a method for a quick and reliable anticancer drug screen in 3D for tumour and human foetal brain tissue in order to investigate drug effectiveness and selective cytotoxic effects. Commercially available ultra-low attachment 96-well round-bottom plates were employed to culture spheroids in a rapid, reproducible manner amenable to automation. A set of three mechanistically different methods for spheroid health assessment (spheroid volume, metabolic activity and acid phosphatase enzyme activity) were validated against cell numbers in healthy and drug-treated spheroids. An automated open-source ImageJ macro was developed to enable high-throughput volume measurements. Although spheroid volume determination was superior to the other assays, multiplexing it with resazurin reduction and phosphatase activity produced a richer picture of spheroid condition. The ability to distinguish between effects on malignant and the proliferating component of normal brain was tested using etoposide on the UW228-3 medulloblastoma cell line and human neural stem cells. At levels below 10 µM, etoposide exhibited higher toxicity towards proliferating stem cells, whereas at concentrations above 10 µM the tumour spheroids were affected to a greater extent. The high-throughput assay procedures use ready-made plates, open-source software and are compatible with standard plate readers, therefore offering high predictive power with substantial savings in time and money.
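
    The open-source ImageJ macro is not reproduced here; the sketch below is a Python analogue of the volume readout only, estimating spheroid volume from the thresholded projected area under a spherical assumption.

    ```python
    # Estimate spheroid volume from a single bright-field image: threshold,
    # take the projected area of the largest object, and assume a sphere.
    import math
    import numpy as np
    from skimage import filters, measure

    def spheroid_volume_um3(image, um_per_pixel):
        mask = image < filters.threshold_otsu(image)     # dark spheroid, bright well
        labels = measure.label(mask)
        regions = measure.regionprops(labels)
        area_px = max(r.area for r in regions)           # largest object = spheroid
        radius_um = math.sqrt(area_px / math.pi) * um_per_pixel
        return (4.0 / 3.0) * math.pi * radius_um ** 3

    if __name__ == "__main__":
        img = np.ones((200, 200))
        rr, cc = np.ogrid[:200, :200]
        img[(rr - 100) ** 2 + (cc - 100) ** 2 < 40 ** 2] = 0.3   # fake spheroid
        print(f"{spheroid_volume_um3(img, um_per_pixel=5.0):.3e} µm^3")
    ```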

  12. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based

  13. Evaluation of Simple and Inexpensive High-Throughput Methods for Phytic Acid Determination

    DEFF Research Database (Denmark)

    Raboy, Victor; Johnson, Amy; Bilyeu, Kristin

    2017-01-01

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal Biochem...

  14. High-Throughput Production of Proteins in E. coli for Structural Studies.

    Science.gov (United States)

    Black, Charikleia; Barker, John J; Hitchman, Richard B; Kwong, Hok Sau; Festenstein, Sam; Acton, Thomas B

    2017-01-01

    We have developed a standardized and efficient workflow for high-throughput (HT) protein expression in E. coli and parallel purification which can be tailored to the downstream application of the target proteins. It includes a one-step purification for the purposes of functional assays and a two-step protocol for crystallographic studies, with the option of on-column tag removal.

  15. High-throughput genotoxicity assay identifies antioxidants as inducers of DNA damage response and cell death

    Science.gov (United States)

    Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...

  16. High throughput "omics" approaches to assess the effects of phytochemicals in human health studies

    Czech Academy of Sciences Publication Activity Database

    Ovesná, J.; Slabý, O.; Toussaint, O.; Kodíček, M.; Maršík, Petr; Pouchová, V.; Vaněk, Tomáš

    2008-01-01

    Roč. 99, E-S1 (2008), ES127-ES134 ISSN 0007-1145 R&D Projects: GA MŠk(CZ) 1P05OC054 Institutional research plan: CEZ:AV0Z50380511 Keywords : Nutrigenomics * Phytochemicals * High throughput platforms Subject RIV: GM - Food Processing Impact factor: 2.764, year: 2008

  17. Roche genome sequencer FLX based high-throughput sequencing of ancient DNA

    DEFF Research Database (Denmark)

    Alquezar-Planas, David E; Fordyce, Sarah Louise

    2012-01-01

    Since the development of so-called "next generation" high-throughput sequencing in 2005, this technology has been applied to a variety of fields. Such applications include disease studies, evolutionary investigations, and ancient DNA. Each application requires a specialized protocol to ensure tha...

  18. Retrofit Strategies for Incorporating Xenobiotic Metabolism into High Throughput Screening Assays (EMGS)

    Science.gov (United States)

    The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracterization...

  19. A high-throughput, precipitating colorimetric sandwich ELISA microarray for shiga toxins

    Science.gov (United States)

    Shiga toxins 1 and 2 (Stx1 and Stx2) from Shiga toxin-producing E. coli (STEC) bacteria were simultaneously detected with a newly developed, high-throughput antibody microarray platform. The proteinaceous toxins were immobilized and sandwiched between biorecognition elements (monoclonal antibodies)...

  20. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    CERN Document Server

    Blume, H; Bourenkov, G P; Kosciesza, D; Bartunik, H D

    2001-01-01

    The wiggler beamline BW6 at DORIS has been optimized for de-novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed which provide adequate conditions for high-throughput applications, e.g., in structural genomics.

  1. A high-throughput sample preparation method for cellular proteomics using 96-well filter plates.

    NARCIS (Netherlands)

    Switzar, L.; van Angeren, J.A; Pinkse, M; Kool, J.; Niessen, W.M.A.

    2013-01-01

    A high-throughput sample preparation protocol based on the use of 96-well molecular weight cutoff (MWCO) filter plates was developed for shotgun proteomics of cell lysates. All sample preparation steps, including cell lysis, buffer exchange, protein denaturation, reduction, alkylation and

  2. A genome-enabled, high-throughput, and multiplexed fingerprinting platform for strawberry (Fragaria L.)

    Science.gov (United States)

    Strawberry (Fragaria L.) genotypes bear remarkable phenotypic similarity, even across ploidy levels. Additionally, breeding programs seek to introgress alleles from wild germplasm, so objective molecular description of genetic variation has great value. In this report, a high-throughput, robust prot...

  3. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis

  4. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  5. A high-throughput screening assay for distinguishing nitrile hydratases from nitrilases

    Directory of Open Access Journals (Sweden)

    Leticia Mara Lima Angelini

    2015-03-01

    Full Text Available A modified colorimetric high-throughput screen, based on pH changes and combined with an amidase inhibitor, is capable of distinguishing between nitrilases and nitrile hydratases. This enzymatic screen relies on a binary response and is suitable for the first step of hierarchical screening projects.

  6. High-throughput parallel SPM for metrology, defect and mask inspection

    NARCIS (Netherlands)

    Sadeghian Marnani, H.; Herfst, R.W.; Dool, T.C. van den; Crowcombe, W.E.; Winters, J.; Kramers, G.F.I.J.

    2014-01-01

    Scanning probe microscopy (SPM) is a promising candidate for accurate assessment of metrology and defects on wafers and masks; however, it has traditionally been too slow for high-throughput applications, although recent developments have significantly increased the speed of SPM [1,2]. In this paper we

  7. High Throughput Screening Methodologies Classified for Major Drug Target Classes According to Target Signaling Pathways

    NARCIS (Netherlands)

    Kool, J.; Lingeman, H.; Niessen, W.M.A.; Irth, H.

    2010-01-01

    Over the years, many different high throughput screening technologies and subsequently follow-up methodologies have been developed. All of these can be categorized, for example according to measurement of analyte classes, assay mechanisms, readout principles, or screening of drug target classes.

  8. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free of t...

  9. High-throughput mapping of cell-wall polymers within and between plants using novel microarrays

    DEFF Research Database (Denmark)

    Moller, Isabel Eva; Sørensen, Iben; Bernal Giraldo, Adriana Jimena

    2007-01-01

    We describe here a methodology that enables the occurrence of cell-wall glycans to be systematically mapped throughout plants in a semi-quantitative high-throughput fashion. The technique (comprehensive microarray polymer profiling, or CoMPP) integrates the sequential extraction of glycans from...

  10. Human variability in high-throughput risk prioritization of environmental chemicals (Texas AM U. webinar)

    Science.gov (United States)

    We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most which hav...

  11. INDeGenIUS, a new method for high-throughput identification of ...

    Indian Academy of Sciences (India)

    Investigation of the predicted GIs in pathogens may lead to identification of potential drug/vaccine candidates. [Shrivastava S, Reddy Ch V S K and Mande S S 2010 INDeGenIUS, a new method for high-throughput identification of specialized functional islands in completely sequenced organisms; J. Biosci. 35 351–364] DOI ...

  12. High-throughput screening of tick-borne pathogens in Europe

    DEFF Research Database (Denmark)

    Michelet, Lorraine; Delannoy, Sabine; Devillers, Elodie

    2014-01-01

    was conducted on 7050 Ixodes ricinus nymphs collected from France, Denmark, and the Netherlands using a powerful new high-throughput approach. This advanced methodology permitted the simultaneous detection of 25 bacterial and 12 parasitic species (including Borrelia, Anaplasma, Ehrlichia, Rickettsia...

  13. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    Science.gov (United States)

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPAToxCast program requires an efficient, transparent, and reproducible data pipeline.Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...

  14. Development of a thyroperoxidase inhibition assay for high-throughput screening

    Science.gov (United States)

    High-throughput screening (HTPS) assays to detect inhibitors of thyroperoxidase (TPO), the enzymatic catalyst for thyroid hormone (TH) synthesis, are not currently available. Herein we describe the development of a HTPS TPO inhibition assay. Rat thyroid microsomes and a fluores...

  15. High-throughput sequencing of forensic genetic samples using punches of FTA cards with buccal swabs

    DEFF Research Database (Denmark)

    Kampmann, Marie-Louise; Buchard, Anders; Børsting, Claus

    2016-01-01

    Here, we demonstrate that punches from buccal swab samples preserved on FTA cards can be used for high-throughput DNA sequencing, also known as massively parallel sequencing (MPS). We typed 44 reference samples with the HID-Ion AmpliSeq Identity Panel using washed 1.2 mm punches from FTA cards wi...

  16. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  17. ESSENTIALS: Software for Rapid Analysis of High Throughput Transposon Insertion Sequencing Data.

    NARCIS (Netherlands)

    Zomer, A.L.; Burghout, P.J.; Bootsma, H.J.; Hermans, P.W.M.; Hijum, S.A.F.T. van

    2012-01-01

    High-throughput analysis of genome-wide random transposon mutant libraries is a powerful tool for (conditional) essential gene discovery. Recently, several next-generation sequencing approaches, e.g. Tn-seq/INseq, HITS and TraDIS, have been developed that accurately map the site of transposon

  18. Increasing ecological inference from high throughput sequencing of fungi in the environment through a tagging approach

    Science.gov (United States)

    D. Lee Taylor; Michael G. Booth; Jack W. McFarland; Ian C. Herriott; Niall J. Lennon; Chad Nusbaum; Thomas G. Marr

    2008-01-01

    High throughput sequencing methods are widely used in analyses of microbial diversity but are generally applied to small numbers of samples, which precludes charaterization of patterns of microbial diversity across space and time. We have designed a primer-tagging approach that allows pooling and subsequent sorting of numerous samples, which is directed to...

  19. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High -Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  20. High throughput generated micro-aggregates of chondrocytes stimulate cartilage formation in vitro and in vivo

    NARCIS (Netherlands)

    Moreira Teixeira, Liliana; Leijten, Jeroen Christianus Hermanus; Sobral, J.; Jin, R.; van Apeldoorn, Aart A.; Feijen, Jan; van Blitterswijk, Clemens; Dijkstra, Pieter J.; Karperien, Hermanus Bernardus Johannes

    2012-01-01

    Cell-based cartilage repair strategies such as matrix-induced autologous chondrocyte implantation (MACI) could be improved by enhancing cell performance. We hypothesised that micro-aggregates of chondrocytes generated in high-throughput prior to implantation in a defect could stimulate cartilaginous

  1. Incorporating Human Dosimetry and Exposure into High-Throughput In Vitro Toxicity Screening

    Science.gov (United States)

    Many chemicals in commerce today have undergone limited or no safety testing. To reduce the number of untested chemicals and prioritize limited testing resources, several governmental programs are using high-throughput in vitro screens for assessing chemical effects across multip...

  2. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Directory of Open Access Journals (Sweden)

    Hong Zhou

    Full Text Available Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for the evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable to measure fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased

  3. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Science.gov (United States)

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for the evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated a decreased survival rate in...
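
    The fibrin-degradation readout described above can be approximated with standard image-analysis tools. The sketch below is a minimal Python illustration (not the authors' MATLAB® code); the file name, pixel size, the assumption that cleared fibrin appears as the largest bright region, and the use of Otsu thresholding are all choices made for the example.

      # Minimal sketch of a fibrin-degradation area measurement (assumed workflow,
      # not the published MATLAB pipeline): threshold a brightfield micrograph and
      # report the largest connected bright region in physical units.
      from skimage import io, filters, measure

      def degradation_area(image_path, um_per_pixel=1.0):
          img = io.imread(image_path, as_gray=True)
          thresh = filters.threshold_otsu(img)              # global Otsu threshold
          binary = img > thresh                             # cleared fibrin assumed to appear bright
          regions = measure.regionprops(measure.label(binary))
          if not regions:
              return 0.0
          largest = max(regions, key=lambda r: r.area)      # assume the clear halo is the largest region
          return largest.area * um_per_pixel ** 2           # area in square micrometres

      # Hypothetical call: degradation_area("follicle_day6.tif", um_per_pixel=0.65)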

  4. Insect-derived cecropins display activity against Acinetobacter baumannii in a whole-animal high-throughput Caenorhabditis elegans model.

    Science.gov (United States)

    Jayamani, Elamparithi; Rajamuthiah, Rajmohan; Larkins-Ford, Jonah; Fuchs, Beth Burgwyn; Conery, Annie L; Vilcinskas, Andreas; Ausubel, Frederick M; Mylonakis, Eleftherios

    2015-03-01

    The rise of multidrug-resistant Acinetobacter baumannii and a concomitant decrease in antibiotic treatment options warrants a search for new classes of antibacterial agents. We have found that A. baumannii is pathogenic and lethal to the model host organism Caenorhabditis elegans and have exploited this phenomenon to develop an automated, high-throughput, high-content screening assay in liquid culture that can be used to identify novel antibiotics effective against A. baumannii. The screening assay involves coincubating C. elegans with A. baumannii in 384-well plates containing potential antibacterial compounds. At the end of the incubation period, worms are stained with a dye that stains only dead animals, and images are acquired using automated microscopy and then analyzed using an automated image analysis program. This robust assay yields a Z' factor consistently greater than 0.7. In a pilot experiment to test the efficacy of the assay, we screened a small custom library of synthetic antimicrobial peptides (AMPs) that were synthesized using publicly available sequence data and/or transcriptomic data from immune-challenged insects. We identified cecropin A and 14 other cecropin or cecropin-like peptides that were able to enhance C. elegans survival in the presence of A. baumannii. Interestingly, one particular hit, BR003-cecropin A, a cationic peptide synthesized by the mosquito Aedes aegypti, showed antibiotic activity against a panel of Gram-negative bacteria and exhibited a low MIC (5 μg/ml) against A. baumannii. BR003-cecropin A causes membrane permeability in A. baumannii, which could be the underlying mechanism of its lethality. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
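
    For readers unfamiliar with the quality metric quoted above, the Z' factor relates the separation between positive and negative controls to their variability; values above 0.5 are generally taken to indicate an excellent screening assay. The small Python helper below (a generic illustration, not code from the study) computes Z' from control readings; the example numbers are invented.

      # Generic Z'-factor calculation (Zhang et al., 1999):
      # Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
      import numpy as np

      def z_prime(positive_controls, negative_controls):
          pos = np.asarray(positive_controls, dtype=float)
          neg = np.asarray(negative_controls, dtype=float)
          return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

      # Hypothetical plate-control readings:
      print(z_prime([0.95, 0.97, 0.93, 0.96], [0.12, 0.10, 0.14, 0.11]))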

  5. MAPPI-DAT: data management and analysis for protein-protein interaction data from the high-throughput MAPPIT cell microarray platform.

    Science.gov (United States)

    Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart

    2017-05-01

    Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction-Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments. MAPPI-DAT is developed in Python, using R for data analysis and MySQL as the data management system. MAPPI-DAT is cross-platform and can be run on Microsoft Windows, Linux and OS X/macOS. The source code and a Microsoft Windows executable are freely available under the permissive Apache2 open source license at https://github.com/compomics/MAPPI-DAT. jan.tavernier@vib-ugent.be or lennart.martens@vib-ugent.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  6. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. To run these analyses quickly, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  7. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amounts of data that often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as Open Source software (https://github.com/pmadanecki/htdp).
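
    For readers who prefer scripting to a GUI, the class of filter-and-merge operation described above can be sketched in a few lines of Python with pandas. This is an independent illustration, not part of HTDP; the file names, column names and threshold are invented for the example.

      # Illustrative filter/merge of character-delimited column data against an
      # external criteria file -- conceptually similar to an HTDP task.
      import pandas as pd

      variants = pd.read_csv("variants.tsv", sep="\t")                   # hypothetical input table
      genes_of_interest = pd.read_csv("gene_list.txt", header=None, names=["gene"])

      filtered = variants[variants["quality"] >= 30]                     # simple threshold filter (assumed column)
      merged = filtered.merge(genes_of_interest, on="gene")              # keep rows matching the external list
      merged.to_csv("filtered_variants.tsv", sep="\t", index=False)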

  8. Raman-Activated Droplet Sorting (RADS) for Label-Free High-Throughput Screening of Microalgal Single-Cells.

    Science.gov (United States)

    Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo

    2017-11-21

    Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, by employing the AXT-producing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to DEP-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. The AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an eight-fold enrichment ratio, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, which is equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves the vitality of cells, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.

  9. Mining environmental high-throughput sequence data sets to identify divergent amplicon clusters for phylogenetic reconstruction and morphotype visualization.

    Science.gov (United States)

    Gimmler, Anna; Stoeck, Thorsten

    2015-08-01

    Environmental high-throughput sequencing (envHTS) is a very powerful tool, which in protistan ecology is predominantly used for the exploration of diversity and its geographic and local patterns. We here used a pyrosequenced V4-SSU rDNA data set from a solar saltern pond as a test case to exploit such massive protistan amplicon data sets beyond this descriptive purpose. Therefore, we combined a Swarm-based blastn network including 11 579 ciliate V4 amplicons to identify divergent amplicon clusters with targeted polymerase chain reaction (PCR) primer design for full-length small subunit ribosomal DNA retrieval and probe design for fluorescence in situ hybridization (FISH). This powerful strategy allows envHTS data sets to be exploited to (i) reveal the phylogenetic position of the taxon behind divergent amplicons; (ii) improve the phylogenetic resolution and evolutionary history of specific taxon groups; (iii) solidly assess an amplicon's (species') degree of similarity to its closest described relative; (iv) visualize the morphotype behind a divergent amplicon cluster; (v) rapidly FISH-screen many environmental samples for the geographic/habitat distribution and abundance of the respective organism and (vi) monitor the success of enrichment strategies in live samples for cultivation and isolation of the respective organisms. © 2015 Society for Applied Microbiology and John Wiley & Sons Ltd.

  10. High-throughput growth prediction for Lactuca sativa L. seedlings using chlorophyll fluorescence in a plant factory with artificial lighting

    Directory of Open Access Journals (Sweden)

    Shogo Moriyuki

    2016-03-01

    Full Text Available Poorly grown plants resulting from individual variation lead to large profit losses for plant factories, which use large amounts of electric power for cultivation. Thus, identifying and culling low-grade seedlings at an early stage, using so-called seedling diagnosis technology, plays an important role in avoiding large losses in plant factories. In this study, we developed a high-throughput diagnosis system using the measurement of chlorophyll fluorescence (CF) in a commercial large-scale plant factory, which produces about 5,000 lettuce plants every day. At an early stage (6 days after sowing), a CF image of 6,000 seedlings was captured every 4 hours on the final greening day by a high-sensitivity CCD camera and an automatic transferring machine, and biological indices were extracted. Using machine learning, plant growth can be predicted with a high degree of accuracy based on biological indices including leaf size, amount of CF, and circadian rhythms in CF. Growth prediction was improved by the addition of temporal information on CF. The present data also provide new insights into the relationships between growth and temporal information regulated by the inherent biological clock.
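
    A minimal sketch of the growth-prediction step is given below in Python with scikit-learn; the feature set (leaf area, total CF, circadian amplitude and phase), the file names and the choice of a random-forest regressor are assumptions made for illustration and are not taken from the study.

      # Hypothetical growth prediction from chlorophyll-fluorescence (CF) derived indices.
      # Feature layout, files and model choice are illustrative only.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import cross_val_score

      X = np.load("cf_indices.npy")        # one row per seedling: [leaf_area, total_CF, circ_amplitude, circ_phase]
      y = np.load("fresh_weight.npy")      # growth target, e.g. fresh weight at harvest

      model = RandomForestRegressor(n_estimators=200, random_state=0)
      scores = cross_val_score(model, X, y, cv=5, scoring="r2")
      print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))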

  11. Yeast diversity during the fermentation of Andean chicha: A comparison of high-throughput sequencing and culture-dependent approaches.

    Science.gov (United States)

    Mendoza, Lucía M; Neef, Alexander; Vignolo, Graciela; Belloch, Carmela

    2017-10-01

    The diversity and dynamics of yeasts associated with the fermentation of the Argentinian maize-based beverage chicha were investigated. Samples taken at different stages from two chicha productions were analyzed by culture-dependent and culture-independent methods. Five hundred and ninety-six yeast isolates were obtained by classical microbiological methods and 16 species were identified by RFLPs and sequencing of the D1/D2 26S rRNA gene. Genetic typing of isolates from the dominant species, Saccharomyces cerevisiae, by PCR of delta elements revealed up to 42 different patterns. High-throughput sequencing (HTS) of D1/D2 26S rRNA gene amplicons from chicha samples detected more than one hundred yeast species and almost fifty filamentous fungi taxa. Analysis of the data revealed that yeasts dominated the fermentation, although a significant percentage of filamentous fungi appeared in the first step of the process. Statistical analysis of the results showed that very few taxa were represented by more than 1% of the reads per sample at any step of the process. S. cerevisiae represented more than 90% of the reads in the fermentative samples. Other yeast species dominated the pre-fermentative steps and abounded in fermented samples when S. cerevisiae was present at percentages below 90%. Most yeast species detected by pyrosequencing were not recovered by cultivation. In contrast, the cultivation-based methodology detected very few yeast taxa, and most of them corresponded to very few reads in the pyrosequencing analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. PubChem BioAssay: A Decade's Development toward Open High-Throughput Screening Data Sharing.

    Science.gov (United States)

    Wang, Yanli; Cheng, Tiejun; Bryant, Stephen H

    2017-07-01

    High-throughput screening (HTS) is now routinely conducted for drug discovery by both pharmaceutical companies and screening centers at academic institutions and universities. Rapid advances in assay development, robot automation, and computer technology have led to the generation of terabytes of data in screening laboratories. Despite this technological progress toward HTS productivity, far less effort has been devoted to HTS data integration and sharing. As a result, the huge amount of HTS data was rarely made available to the public. To fill this gap, the PubChem BioAssay database ( https://www.ncbi.nlm.nih.gov/pcassay/ ) was set up in 2004 to provide open access to screening results tested on chemicals and RNAi reagents. With more than 10 years' development and contributions from the community, PubChem has now become the largest public repository for chemical structures and biological data, providing an information platform that supports drug development, medicinal chemistry studies, and chemical biology research worldwide. This work presents a review of the HTS data content in the PubChem BioAssay database and the progress of data deposition to stimulate knowledge discovery and data sharing. It also provides a description of the database's data standard and basic utilities facilitating information access and use for new users.

  13. A high-throughput method for the quantitative analysis of indole-3-acetic acid and other auxins from plant tissue.

    Science.gov (United States)

    Barkawi, Lana S; Tam, Yuen-Yee; Tillman, Julie A; Pederson, Ben; Calio, Jessica; Al-Amier, Hussein; Emerick, Michael; Normanly, Jennifer; Cohen, Jerry D

    2008-01-15

    To investigate novel pathways involved in auxin biosynthesis, transport, metabolism, and response, we have developed a high-throughput screen for indole-3-acetic acid (IAA) levels. Historically, the quantitative analysis of IAA has been a cumbersome and time-consuming process that does not lend itself to the screening of large numbers of samples. The method described here can be performed with or without an automated liquid handler and involves purification solely by solid-phase extraction in a 96-well format, allowing the analysis of up to 96 samples per day. In preparation for quantitative analysis by selected ion monitoring-gas chromatography-mass spectrometry, the carboxylic acid moiety of IAA is derivatized by methylation. The derivatization of the IAA described here was also done in a 96-well format in which up to 96 samples can be methylated at once, minimizing the handling of the toxic reagent, diazomethane. To this end, we have designed a custom diazomethane generator that can safely withstand high flow and accommodate larger volumes. The method for IAA analysis is robust and accurate over a range of plant tissue weights and can be used to screen for and quantify other indolic auxins and compounds including indole-3-butyric acid, 4-chloro-indole-3-acetic acid, and indole-3-propionic acid.

  14. Human Leukocyte Antigen Typing Using a Knowledge Base Coupled with a High Throughput Oligonucleotide Probe Array Analysis

    Directory of Open Access Journals (Sweden)

    Vladimir Brusic

    2014-11-01

    Full Text Available Human leukocyte antigens (HLA) are important biomarkers since multiple diseases, drug toxicity, and vaccine responses reveal strong HLA associations. Current clinical HLA typing is an elimination process requiring serial testing. We present an alternative: an in situ synthesized DNA-based microarray method that contains hundreds of thousands of probes representing a complete overlapping set covering 1,610 clinically relevant HLA class I alleles, accompanied by computational tools for assigning HLA type to 4-digit resolution. Our proof-of-concept experiment included 21 blood samples, 18 cell lines, and multiple controls. The method is accurate, robust, and amenable to automation. Typing errors were restricted to homozygous samples or those with very closely related alleles from the same locus, but were readily resolved by targeted DNA sequencing validation of flagged samples. High-throughput HLA typing technologies that are effective, yet inexpensive, can be used to analyze the world's populations, benefiting both global public health and personalized health care.

  15. SSR_pipeline: a bioinformatic infrastructure for identifying microsatellites from paired-end Illumina high-throughput DNA sequencing data

    Science.gov (United States)

    Miller, Mark P.; Knaus, Brian J.; Mullins, Thomas D.; Haig, Susan M.

    2013-01-01

    SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (e.g., microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains 3 analysis modules along with a fourth control module that can automate analyses of large volumes of data. The modules are used to 1) identify the subset of paired-end sequences that pass Illumina quality standards, 2) align paired-end reads into a single composite DNA sequence, and 3) identify sequences that possess microsatellites (both simple and compound) conforming to user-specified parameters. The microsatellite search algorithm is extremely efficient, and we have used it to identify repeats with motifs from 2 to 25 bp in length. Each of the 3 analysis modules can also be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc.). We demonstrate use of the program with data from the brine fly Ephydra packardi (Diptera: Ephydridae) and provide empirical timing benchmarks to illustrate program performance on a common desktop computer environment. We further show that the Illumina platform is capable of identifying large numbers of microsatellites, even when using unenriched sample libraries and a very small percentage of the sequencing capacity from a single DNA sequencing run. All modules from SSR_pipeline are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, and Windows).

  16. SSR_pipeline: a bioinformatic infrastructure for identifying microsatellites from paired-end Illumina high-throughput DNA sequencing data.

    Science.gov (United States)

    Miller, Mark P; Knaus, Brian J; Mullins, Thomas D; Haig, Susan M

    2013-01-01

    SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (e.g., microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains 3 analysis modules along with a fourth control module that can automate analyses of large volumes of data. The modules are used to 1) identify the subset of paired-end sequences that pass Illumina quality standards, 2) align paired-end reads into a single composite DNA sequence, and 3) identify sequences that possess microsatellites (both simple and compound) conforming to user-specified parameters. The microsatellite search algorithm is extremely efficient, and we have used it to identify repeats with motifs from 2 to 25 bp in length. Each of the 3 analysis modules can also be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc.). We demonstrate use of the program with data from the brine fly Ephydra packardi (Diptera: Ephydridae) and provide empirical timing benchmarks to illustrate program performance on a common desktop computer environment. We further show that the Illumina platform is capable of identifying large numbers of microsatellites, even when using unenriched sample libraries and a very small percentage of the sequencing capacity from a single DNA sequencing run. All modules from SSR_pipeline are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, and Windows).
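
    To illustrate the kind of search such a module performs, the short Python sketch below finds perfect simple sequence repeats with a regular expression. The motif-length range and minimum copy number are arbitrary example parameters and do not reproduce SSR_pipeline's actual algorithm.

      # Toy microsatellite (SSR) finder: report perfect repeats of 2-6 bp motifs
      # occurring at least `min_copies` times in a row. Parameters are illustrative.
      import re

      def find_ssrs(seq, min_copies=4):
          seq = seq.upper()
          # group 2 = repeat motif, group 1 = the whole repeat tract
          pattern = re.compile(r"(([ACGT]{2,6}?)\2{%d,})" % (min_copies - 1))
          for match in pattern.finditer(seq):
              tract, motif = match.group(1), match.group(2)
              yield match.start(), motif, len(tract) // len(motif)

      for start, motif, copies in find_ssrs("GGATATATATATGGCCAGAGAGAGAGTT"):
          print("position %d: (%s)x%d" % (start, motif, copies))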

  17. IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.

    Science.gov (United States)

    Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis

    2018-04-01

    Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full-plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. Once integrated as an Internet of Things (IoT) device, the balance can provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time, or remotely for timely intervention when dispense errors occur. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
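
    The gravimetric check at the heart of such a system reduces to a simple calculation, sketched below in Python; the tolerance, liquid density and example readings are placeholders, and the snippet illustrates the arithmetic rather than the published Arduino/web implementation.

      # Sketch of a gravimetric dispense check: convert a balance mass delta into a
      # dispensed volume and flag deviations beyond a tolerance. All numbers are illustrative.
      def check_dispense(mass_before_g, mass_after_g, target_ul, density_g_per_ml=1.0, tolerance=0.05):
          dispensed_ul = (mass_after_g - mass_before_g) / density_g_per_ml * 1000.0
          relative_error = (dispensed_ul - target_ul) / target_ul
          return dispensed_ul, abs(relative_error) <= tolerance

      volume, ok = check_dispense(12.3456, 12.4430, target_ul=100.0)
      print("dispensed %.1f uL, within tolerance: %s" % (volume, ok))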

  18. Spatial Mapping of Protein Abundances in the Mouse Brain by Voxelation Integrated with High-Throughput Liquid Chromatography - Mass Spectrometry

    International Nuclear Information System (INIS)

    Petyuk, Vladislav A.; Qian, Weijun; Chin, Mark H.; Wang, Haixing H.; Livesay, Eric A.; Monroe, Matthew E.; Adkins, Joshua N.; Jaitly, Navdeep; Anderson, David J.; Camp, David G.; Smith, Desmond J.; Smith, Richard D.

    2007-01-01

    Temporally and spatially resolved mapping of protein abundance patterns within the mammalian brain is of significant interest for understanding brain function and the molecular etiologies of neurodegenerative diseases; however, such imaging efforts have been greatly challenged by the complexity of the proteome, the throughput and sensitivity of applied analytical methodologies, and accurate quantitation of protein abundances across the brain. Here, we describe a methodology for comprehensive spatial proteome mapping that addresses these challenges by employing voxelation integrated with automated microscale sample processing, a high-throughput LC system coupled with a high-resolution Fourier transform ion cyclotron resonance mass spectrometer, and a "universal" stable isotope labeled reference sample approach for robust quantitation. We applied this methodology as a proof-of-concept trial for the analysis of protein distribution within a single coronal slice of a C57BL/6J mouse brain. For relative quantitation of protein abundances across the slice, an 18O-isotopically labeled reference sample, derived from a whole control coronal slice from another mouse, was spiked into each voxel sample and stable isotopic intensity ratios were used to obtain measures of relative protein abundances. In total, we generated maps of protein abundance patterns for 1,028 proteins. The significant agreement of the protein distributions with previously reported data supports the validity of this methodology, which opens new opportunities for studying the spatial brain proteome and its dynamics during the course of disease progression and other important biological and health-related processes in a discovery-driven fashion.

  19. Towards high-throughput phenotyping of complex patterned behaviors in rodents: focus on mouse self-grooming and its sequencing.

    Science.gov (United States)

    Kyzar, Evan; Gaikwad, Siddharth; Roth, Andrew; Green, Jeremy; Pham, Mimi; Stewart, Adam; Liang, Yiqing; Kobla, Vikrant; Kalueff, Allan V

    2011-12-01

    Increasingly recognized in biological psychiatry, rodent self-grooming is a complex patterned behavior with an evolutionarily conserved cephalo-caudal progression. While grooming is traditionally assessed by latency, frequency and duration, its sequencing represents another important domain sensitive to various experimental manipulations. Such behavioral complexity requires novel objective approaches to quantify rodent grooming, in addition to time-consuming and highly variable manual observation. The present study combined modern behavior-recognition video-tracking technologies (CleverSys, Inc.) with manual observation to characterize in depth spontaneous (novelty-induced) and artificial (water-induced) self-grooming in adult male C57BL/6J mice. We specifically focused on individual episodes of grooming (paw licking, head washing, body/leg washing, and tail/genital grooming), their duration and transitions between episodes. Overall, the frequency, duration and transitions detected using the automated approach correlated significantly with manual observations (R = 0.51-0.7, p < 0.05) for both spontaneous and water-induced grooming, also indicating that behavior-recognition tools can be applied to characterize both the amount and sequential organization (patterning) of rodent grooming. Together with further refinement and methodological advancement, this approach will foster high-throughput neurophenotyping of grooming, with multiple applications in drug screening and testing of genetically modified animals. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. A High-Throughput Microfluidic Platform for Mammalian Cell Transfection and Culturing

    Science.gov (United States)

    Woodruff, Kristina; Maerkl, Sebastian J.

    2016-01-01

    Mammalian synthetic biology could be augmented through the development of high-throughput microfluidic systems that integrate cellular transfection, culturing, and imaging. We created a microfluidic chip that cultures cells and implements 280 independent transfections at up to 99% efficiency. The chip can perform co-transfections, in which the number of cells expressing each protein and the average protein expression level can be precisely tuned as a function of input DNA concentration, and synthetic gene circuits can be optimized on chip. We co-transfected four plasmids to test a histidine kinase signaling pathway and mapped the dose dependence of this network on the level of one of its constituents. The chip is readily integrated with high-content imaging, enabling the evaluation of cellular behavior and protein expression dynamics over time. These features make the transfection chip applicable to high-throughput mammalian protein and synthetic biology studies. PMID:27030663

  1. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high-content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchanne...

  2. Determining the optimal size of small molecule mixtures for high throughput NMR screening

    International Nuclear Information System (INIS)

    Mercier, Kelly A.; Powers, Robert

    2005-01-01

    High-throughput screening (HTS) using NMR spectroscopy has become a common component of the drug discovery effort and is widely used throughout the pharmaceutical industry. NMR provides additional information about the nature of small molecule-protein interactions compared to traditional HTS methods. In order to achieve comparable efficiency, small molecules are often screened as mixtures in NMR-based assays. Nevertheless, an analysis of the efficiency of mixtures and a corresponding determination of the optimum mixture size (OMS) that minimizes the amount of material and instrumentation time required for an NMR screen has been lacking. A model for calculating OMS based on the application of the hypergeometric distribution function to determine the probability of a 'hit' for various mixture sizes and hit rates is presented. An alternative method for the deconvolution of large screening mixtures is also discussed. These methods have been applied in a high-throughput NMR screening assay using a small, directed library.
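
    The probability model mentioned above is easy to reproduce; the Python sketch below (an illustration based on the stated use of the hypergeometric distribution, not the authors' exact model) gives the chance that a mixture of a given size contains at least one active compound, for an assumed library size and hit rate.

      # Probability that a mixture of `mixture_size` compounds, drawn from a library of
      # `library_size` compounds containing `n_actives` hits, includes at least one hit.
      # Hypergeometric model; the library size and hit rate below are arbitrary examples.
      from scipy.stats import hypergeom

      def prob_hit_in_mixture(library_size, n_actives, mixture_size):
          # sf(0) = P(X >= 1) for X ~ Hypergeom(M=library_size, n=n_actives, N=mixture_size)
          return hypergeom.sf(0, library_size, n_actives, mixture_size)

      for m in (2, 5, 10, 20):
          print("mixture of %2d: P(>=1 hit) = %.3f" % (m, prob_hit_in_mixture(10000, 100, m)))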

  3. A high-throughput exploration of magnetic materials by using structure predicting methods

    Science.gov (United States)

    Arapan, S.; Nieves, P.; Cuesta-López, S.

    2018-02-01

    We study the capability of a structure-predicting method based on a genetic/evolutionary algorithm for the high-throughput exploration of magnetic materials. We use the USPEX and VASP codes to predict stable structures and generate low-energy metastable structures for a set of representative magnetic systems comprising intermetallic alloys, oxides, interstitial compounds, and systems containing rare-earth elements, for both ferromagnetic and antiferromagnetic ordering. We have modified the interface between the USPEX and VASP codes to improve the performance of structural optimization as well as to perform calculations in a high-throughput manner. We show that exploring the structure phase space with a structure-predicting technique reveals large sets of low-energy metastable structures, which not only improve currently existing databases, but also may provide understanding and solutions to stabilize and synthesize magnetic materials suitable for permanent magnet applications.

  4. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy

    Directory of Open Access Journals (Sweden)

    Yiu Wai Lai, Michael Krause, Alan Savan, Sigurd Thienhaus, Nektarios Koukourakis, Martin R Hofmann and Alfred Ludwig

    2011-01-01

    Full Text Available A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.

  5. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review recent advances in the immune repertoire study of infectious diseases achieved by traditional techniques and by high-throughput sequencing techniques. High-throughput sequencing techniques enable the determination of the complementarity-determining regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for the further development of novel diagnostic markers, immunotherapies and vaccines.

  6. High-throughput investigation of catalysts for JP-8 fuel cracking to liquefied petroleum gas.

    Science.gov (United States)

    Bedenbaugh, John E; Kim, Sungtak; Sasmaz, Erdem; Lauterbach, Jochen

    2013-09-09

    Portable power technologies for military applications necessitate the production of fuels similar to LPG from existing feedstocks. Catalytic cracking of military jet fuel to form a mixture of C₂-C₄ hydrocarbons was investigated using high-throughput experimentation. Cracking experiments were performed in a gas-phase, 16-sample high-throughput reactor. Zeolite ZSM-5 catalysts with low Si/Al ratios (≤25) demonstrated the highest production of C₂-C₄ hydrocarbons at moderate reaction temperatures (623-823 K). ZSM-5 catalysts were optimized for JP-8 cracking activity to LPG through varying reaction temperature and framework Si/Al ratio. The reducing atmosphere required during catalytic cracking resulted in coking of the catalyst and a commensurate decrease in conversion rate. Rare earth metal promoters for ZSM-5 catalysts were screened to reduce coking deactivation rates, while noble metal promoters reduced onset temperatures for coke burnoff regeneration.

  7. High-throughput miniaturized microfluidic microscopy with radially parallelized channel geometry.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Bhat, Bindu Prabhath; Nirupa Julius, Lourdes Albina; Gorthi, Sai Siva

    2016-03-01

    In this article, we present a novel approach to throughput enhancement in miniaturized microfluidic microscopy systems. Using the presented approach, we demonstrate an inexpensive yet high-throughput analytical instrument. Using the high-throughput analytical instrument, we have been able to achieve about 125,880 cells per minute (more than one hundred and twenty five thousand cells per minute), even while employing cost-effective low frame rate cameras (120 fps). The throughput achieved here is a notable progression in the field of diagnostics as it enables rapid quantitative testing and analysis. We demonstrate the applicability of the instrument to point-of-care diagnostics, by performing blood cell counting. We report a comparative analysis between the counts (in cells per μl) obtained from our instrument, with that of a commercially available hematology analyzer.

  8. Multiple and high-throughput droplet reactions via combination of microsampling technique and microfluidic chip

    KAUST Repository

    Wu, Jinbo

    2012-11-20

    Microdroplets offer unique compartments for accommodating a large number of chemical and biological reactions in tiny volumes with precise control. A major concern in droplet-based microfluidics is the difficulty of addressing droplets individually while achieving high throughput at the same time. Here, we have combined an improved cartridge sampling technique with a microfluidic chip to perform droplet screenings and aggressive reactions with minimal (nanoliter-scale) reagent consumption. The droplet composition, distance, volume (nanoliter to subnanoliter scale), number, and sequence could be precisely and digitally programmed through the improved sampling technique, while sample evaporation and cross-contamination are effectively eliminated. Our combined device provides a simple model to utilize multiple droplets for various reactions with low reagent consumption and high throughput. © 2012 American Chemical Society.

  9. Current developments in high-throughput analysis for microalgae cellular contents.

    Science.gov (United States)

    Lee, Tsung-Hua; Chang, Jo-Shu; Wang, Hsiang-Yu

    2013-11-01

    Microalgae have emerged as one of the most promising feedstocks for biofuels and bio-based chemical production. However, due to the lack of effective tools enabling rapid and high-throughput analysis of the content of microalgae biomass, the efficiency of screening and identification of microalgae with desired functional components from the natural environment is usually quite low. Moreover, the real-time monitoring of the production of target components from microalgae is also difficult. Recently, research efforts focusing on overcoming this limitation have started. In this review, the recent development of high-throughput methods for analyzing microalgae cellular contents is summarized. The future prospects and impacts of these detection methods in microalgae-related processing and industries are also addressed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Studying binding specificities of peptide recognition modules by high-throughput phage display selections.

    Science.gov (United States)

    Huang, Haiming; Sidhu, Sachdev S

    2011-01-01

    Peptide recognition modules (PRMs) play critical roles in cellular processes, including differentiation, proliferation and cytoskeleton organization. PRMs normally bind to short linear motifs in protein ligands, and by so doing recruit proteins into signaling complexes. Based on the binding specificity profile of a PRM, one can predict putative natural interaction partners by searching genome databases. Candidate interaction partners can in turn provide clues to assemble potential in vivo protein complexes that the PRM may be involved with. Combinatorial peptide libraries have proven to be effective tools for profiling the binding specificities of PRMs. Herein, we describe high-throughput methods for the expression and purification of PRM proteins and the use of peptide-phage libraries for PRM specificity profiling. These high-throughput methods greatly expedite the study of PRM families on a genome-wide scale.

  11. High-throughput screening of antibiotic-resistant bacteria in picodroplets.

    Science.gov (United States)

    Liu, X; Painter, R E; Enesa, K; Holmes, D; Whyte, G; Garlisi, C G; Monsma, F J; Rehak, M; Craig, F F; Smith, C A

    2016-04-26

    The prevalence of clinically-relevant bacterial strains resistant to current antibiotic therapies is increasing and has been recognized as a major health threat. For example, multidrug-resistant tuberculosis and methicillin-resistant Staphylococcus aureus are of global concern. Novel methodologies are needed to identify new targets or novel compounds unaffected by pre-existing resistance mechanisms. Recently, water-in-oil picodroplets have been used as an alternative to conventional high-throughput methods, especially for phenotypic screening. Here we demonstrate a novel microfluidic-based picodroplet platform which enables high-throughput assessment and isolation of antibiotic-resistant bacteria in a label-free manner. As a proof-of-concept, the system was used to isolate fusidic acid-resistant mutants and estimate the frequency of resistance among a population of Escherichia coli (strain HS151). This approach can be used for rapid screening of rare antibiotic-resistant mutants to help identify novel compound/target pairs.

  12. High-throughput screening assay of hepatitis C virus helicase inhibitors using fluorescence-quenching phenomenon

    International Nuclear Information System (INIS)

    Tani, Hidenori; Akimitsu, Nobuyoshi; Fujita, Osamu; Matsuda, Yasuyoshi; Miyata, Ryo; Tsuneda, Satoshi; Igarashi, Masayuki; Sekiguchi, Yuji; Noda, Naohiro

    2009-01-01

    We have developed a novel high-throughput screening assay for hepatitis C virus (HCV) nonstructural protein 3 (NS3) helicase inhibitors using the fluorescence-quenching phenomenon caused by photoinduced electron transfer between fluorescent dyes and guanine bases. We prepared double-stranded DNA (dsDNA) with a 5'-fluorescent-dye (BODIPY FL)-labeled strand hybridized with a complementary strand, the 3'-end of which has guanine bases. When dsDNA is unwound by helicase, the dye emits fluorescence owing to its release from the guanine bases. Our results demonstrate that this assay is suitable for the quantitative measurement of HCV NS3 helicase activity and useful for high-throughput screening for inhibitors. Furthermore, we applied this assay to screening for NS3 helicase inhibitors in cell extracts of microorganisms, and found several cell extracts containing potential inhibitors.

  13. High-throughput method to predict extrusion pressure of ceramic pastes.

    Science.gov (United States)

    Cao, Kevin; Liu, Yang; Tucker, Christopher; Baumann, Michael; Grit, Grote; Lakso, Steven

    2014-04-14

    A new method was developed to measure the rheology of extrudable ceramic pastes using a Hamilton MicroLab Star liquid handler. The Hamilton instrument, normally used for high-throughput liquid processing, was expanded to function as a low-pressure capillary rheometer. Diluted ceramic pastes were forced through the modified pipettes, which produced pressure-drop data that were converted to standard rheology data. A known ceramic paste containing cellulose ether was made and diluted to various concentrations in water. The most dilute paste samples were tested in the Hamilton instrument, and the more typical, highly concentrated ceramic pastes were tested with a hydraulic ram extruder fitted with a capillary die and pressure measurement system. The rheology data from this study indicate that the dilute high-throughput method using the Hamilton instrument correlates with, and can predict, the rheology of concentrated ceramic pastes normally used in ceramic extrusion production processes.
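
    The conversion from pressure-drop data to rheology rests on standard capillary-flow relations, sketched below in Python; the geometry and dispense rate are invented, and entrance effects and the Rabinowitsch correction are deliberately ignored to keep the example short.

      # Standard capillary-flow relations for converting a measured pressure drop into
      # apparent rheology. Entrance effects and the Rabinowitsch correction are neglected.
      import math

      def capillary_rheology(delta_p_pa, flow_rate_m3_s, radius_m, length_m):
          wall_shear_stress = delta_p_pa * radius_m / (2.0 * length_m)            # Pa
          apparent_shear_rate = 4.0 * flow_rate_m3_s / (math.pi * radius_m ** 3)  # 1/s
          apparent_viscosity = wall_shear_stress / apparent_shear_rate            # Pa*s
          return wall_shear_stress, apparent_shear_rate, apparent_viscosity

      # Hypothetical pipette-tip geometry and dispense rate:
      print(capillary_rheology(delta_p_pa=2.0e5, flow_rate_m3_s=1.0e-9,
                               radius_m=2.5e-4, length_m=2.0e-2))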

  14. Development and Application of a High Throughput Protein Unfolding Kinetic Assay

    Science.gov (United States)

    Wang, Qiang; Waterhouse, Nicklas; Feyijinmi, Olusegun; Dominguez, Matthew J.; Martinez, Lisa M.; Sharp, Zoey; Service, Rachel; Bothe, Jameson R.; Stollar, Elliott J.

    2016-01-01

    The kinetics of folding and unfolding underlie protein stability, and quantification of these rates provides important insights into the folding process. Here, we present a simple high-throughput protein unfolding kinetic assay using a plate reader that is applicable to studies of the majority of two-state folding proteins. We validate the assay by measuring kinetic unfolding data for the SH3 (Src Homology 3) domain from Actin Binding Protein 1 (AbpSH3) and its stabilized mutants. The results of our approach are in excellent agreement with published values. We further combine our kinetic assay with a plate reader equilibrium assay to obtain indirect estimates of folding rates, and use these approaches to characterize an AbpSH3-peptide hybrid. Our high-throughput protein unfolding kinetic assays allow accurate screening of libraries of mutants by providing both kinetic and equilibrium measurements and provide a means for in-depth ϕ-value analyses. PMID:26745729
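
    For a two-state protein, an unfolding trace recorded on a plate reader is typically fit to a single exponential. The Python sketch below shows such a fit with SciPy on synthetic data; the signal model, noise level and parameter names are invented for the example and are not taken from the study.

      # Fit a single-exponential unfolding trace, S(t) = S_inf + dS * exp(-k_obs * t),
      # as is standard for two-state unfolding kinetics. Data below are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def unfolding_trace(t, s_inf, delta_s, k_obs):
          return s_inf + delta_s * np.exp(-k_obs * t)

      t = np.linspace(0, 300, 60)                                    # time in seconds
      rng = np.random.default_rng(0)
      signal = unfolding_trace(t, 0.2, 0.8, 0.02) + rng.normal(0, 0.01, t.size)

      popt, _ = curve_fit(unfolding_trace, t, signal, p0=(0.1, 1.0, 0.01))
      print("fitted k_obs = %.4f s^-1" % popt[2])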

  15. Combinatorial Method/High Throughput Strategies for Hydrogel Optimization in Tissue Engineering Applications

    Directory of Open Access Journals (Sweden)

    Laura A. Smith Callahan

    2016-06-01

    Full Text Available Combinatorial method/high throughput strategies, which have long been used in the pharmaceutical industry, have recently been applied to hydrogel optimization for tissue engineering applications. Although many combinatorial methods have been developed, few are suitable for use in tissue engineering hydrogel optimization. Currently, only three approaches (design of experiment, arrays and continuous gradients have been utilized. This review highlights recent work with each approach. The benefits and disadvantages of design of experiment, array and continuous gradient approaches depending on study objectives and the general advantages of using combinatorial methods for hydrogel optimization over traditional optimization strategies will be discussed. Fabrication considerations for combinatorial method/high throughput samples will additionally be addressed to provide an assessment of the current state of the field, and potential future contributions to expedited material optimization and design.

  16. A robust, high-throughput method for computing maize ear, cob, and kernel attributes automatically from images.

    Science.gov (United States)

    Miller, Nathan D; Haase, Nicholas J; Lee, Jonghyun; Kaeppler, Shawn M; de Leon, Natalia; Spalding, Edgar P

    2017-01-01

    Grain yield of the maize plant depends on the sizes, shapes, and numbers of ears and the kernels they bear. An automated pipeline that can measure these components of yield from easily-obtained digital images is needed to advance our understanding of this globally important crop. Here we present three custom algorithms designed to compute such yield components automatically from digital images acquired by a low-cost platform. One algorithm determines the average space each kernel occupies along the cob axis using a sliding-window Fourier transform analysis of image intensity features. A second counts individual kernels removed from ears, including those in clusters. A third measures each kernel's major and minor axis after a Bayesian analysis of contour points identifies the kernel tip. Dimensionless ear and kernel shape traits that may interrelate yield components are measured by principal components analysis of contour point sets. Increased objectivity and speed compared to typical manual methods are achieved without loss of accuracy as evidenced by high correlations with ground truth measurements and simulated data. Millimeter-scale differences among ear, cob, and kernel traits that ranged more than 2.5-fold across a diverse group of inbred maize lines were resolved. This system for measuring maize ear, cob, and kernel attributes is being used by multiple research groups as an automated Web service running on community high-throughput computing and distributed data storage infrastructure. Users may create their own workflow using the source code that is staged for download on a public repository. © 2016 The Authors. The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
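
    The first algorithm's idea, estimating kernel spacing from periodic intensity features along the cob axis, can be sketched with a windowed Fourier analysis. The Python example below is a simplified stand-in for the published method; the window size, pixel scale and synthetic profile are invented.

      # Estimate the dominant spatial period (kernel pitch) of a 1-D intensity profile
      # taken along the cob axis, using a sliding-window Fourier transform.
      # Simplified illustration; window size and the synthetic profile are arbitrary.
      import numpy as np

      def kernel_pitch(profile, window=128, mm_per_pixel=0.1):
          pitches = []
          for start in range(0, len(profile) - window, window // 2):
              segment = profile[start:start + window]
              segment = segment - segment.mean()
              spectrum = np.abs(np.fft.rfft(segment))
              freqs = np.fft.rfftfreq(window, d=mm_per_pixel)   # cycles per mm
              peak = np.argmax(spectrum[1:]) + 1                # skip the DC component
              pitches.append(1.0 / freqs[peak])                 # period in mm
          return float(np.median(pitches))

      x = np.arange(1024)
      profile = 1.0 + 0.5 * np.sin(2 * np.pi * x / 64.0)        # synthetic 6.4 mm (64-pixel) periodicity
      print("estimated kernel pitch: %.1f mm" % kernel_pitch(profile))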

  17. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    OpenAIRE

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-01-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. A range of both polar and non-polar chemotypes a...

  18. Development of Microfluidic Systems Enabling High-Throughput Single-Cell Protein Characterization

    OpenAIRE

    Fan, Beiyuan; Li, Xiufeng; Chen, Deyong; Peng, Hongshang; Wang, Junbo; Chen, Jian

    2016-01-01

    This article reviews recent developments in microfluidic systems enabling high-throughput characterization of single-cell proteins. Four key perspectives of microfluidic platforms are included in this review: (1) microfluidic fluorescent flow cytometry; (2) droplet based microfluidic flow cytometry; (3) large-array micro wells (microengraving); and (4) large-array micro chambers (barcode microchips). We examine the advantages and limitations of each technique and discuss future research oppor...

  19. A Reconfigurable Low Power High Throughput Architecture for Deep Network Training

    OpenAIRE

    Hasan, Raqibul; Taha, Tarek

    2016-01-01

    General purpose computing systems are used for a large variety of applications. Extensive support for flexibility in these systems limits their energy efficiency. Neural networks, including deep networks, are widely used for signal processing and pattern recognition applications. In this paper we propose a multicore architecture for deep neural network based processing. Memristor crossbars are utilized to provide low power high throughput execution of neural networks. The system has both tr...

  20. High-throughput recombinant protein expression in Escherichia coli: current status and future perspectives

    OpenAIRE

    Jia, Baolei; Jeon, Che Ok

    2016-01-01

    The ease of genetic manipulation, low cost, rapid growth and number of previous studies have made Escherichia coli one of the most widely used microorganism species for producing recombinant proteins. In this post-genomic era, challenges remain to rapidly express and purify large numbers of proteins for academic and commercial purposes in a high-throughput manner. In this review, we describe several state-of-the-art approaches that are suitable for the cloning, expression and purification, co...

  1. High-throughput functional genomic methods to analyze the effects of dietary lipids.

    Science.gov (United States)

    Puskás, László G; Ménesi, Dalma; Fehér, Liliána Z; Kitajka, Klára

    2006-12-01

    The applications of 'omics' (genomics, transcriptomics, proteomics and metabolomics) technologies in nutritional studies have opened new possibilities to understand the effects and the action of different diets in both healthy and diseased states, and help to define personalized diets and to develop new drugs that revert or prevent the negative dietary effects. Several single nucleotide polymorphisms have already been investigated for potential gene-diet interactions in the response to different lipid diets. It is also well known that, besides the known cellular effects of lipid nutrition, dietary lipids influence gene expression in a tissue-, concentration- and age-dependent manner. Protein expression and post-translational changes due to different diets have been reported as well. To understand the molecular basis of the effects and roles of dietary lipids, high-throughput functional genomic methods such as DNA or protein microarrays, high-throughput NMR and mass spectrometry are needed to assess the changes in a global way at the genome, transcriptome, proteome and metabolome levels. The present review focuses on different high-throughput technologies from the aspect of assessing the effects of dietary lipids, including cholesterol and polyunsaturated fatty acids. Several genes were identified that exhibited altered expression in response to fish-oil treatment of human lung cancer cells, including protein kinase C, natriuretic peptide receptor-A, PKNbeta, interleukin-1 receptor associated kinase-1 (IRAK-1) and diacylglycerol kinase genes, by using high-throughput quantitative real-time PCR. Other results obtained from cholesterol- and polyunsaturated fatty acid-fed animals by using DNA and protein microarrays are also discussed.

  2. Fluorescence-based high-throughput screening of dicer cleavage activity

    Czech Academy of Sciences Publication Activity Database

    Podolská, Kateřina; Sedlák, David; Bartůněk, Petr; Svoboda, Petr

    2014-01-01

    Vol. 19, No. 3 (2014), pp. 417-426. ISSN 1087-0571. R&D Projects: GA ČR GA13-29531S; GA MŠk(CZ) LC06077; GA MŠk LM2011022. Grant - others: EMBO(DE) 1483. Institutional support: RVO:68378050. Keywords: Dicer * siRNA * high-throughput screening. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 2.423, year: 2014

  3. Patterning cell using Si-stencil for high-throughput assay

    KAUST Repository

    Wu, Jinbo

    2011-01-01

    In this communication, we report a newly developed cell patterning methodology based on a silicon stencil, which exhibits advantages such as easy handling, reusability, a hydrophilic surface and mature fabrication technologies. Cell arrays obtained by this method were used to investigate cell growth under a temperature gradient, which demonstrated the possibility of studying cell behavior in a high-throughput assay. This journal is © The Royal Society of Chemistry 2011.

  4. High-throughput, image-based screening of pooled genetic-variant libraries.

    Science.gov (United States)

    Emanuel, George; Moffitt, Jeffrey R; Zhuang, Xiaowei

    2017-12-01

    We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in individual cells. We achieve genotyping by introducing barcoded genetic variants into cells as pooled libraries and reading the barcodes out using massively multiplexed fluorescence in situ hybridization. To demonstrate the power of image-based pooled screening, we identified brighter and more photostable variants of the fluorescent protein YFAST among 60,000 variants.
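
    The readout described above pairs each cell's measured hybridization signals with a codebook of variant barcodes. As a purely illustrative sketch (the published codebook and error-correction scheme are not reproduced here), the snippet below assigns a measured bit vector to its nearest codeword by Hamming distance.

```python
# Minimal sketch of barcode decoding by nearest codeword (Hamming distance).
# Codebook and measurement are invented for illustration only.
import numpy as np

def decode(measured_bits: np.ndarray, codebook: np.ndarray, max_errors: int = 1):
    """Return the index of the closest codeword, or None if the best match
    differs in more than `max_errors` bits."""
    distances = (codebook != measured_bits).sum(axis=1)
    best = int(distances.argmin())
    return best if distances[best] <= max_errors else None

codebook = np.array([[1, 0, 1, 0, 1, 0],   # variant 0
                     [0, 1, 1, 0, 0, 1],   # variant 1
                     [1, 1, 0, 1, 0, 0]])  # variant 2
measured = np.array([1, 0, 1, 0, 0, 0])    # one bit flipped relative to variant 0
print(decode(measured, codebook))          # -> 0
```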

  5. AMBIENT: Active Modules for Bipartite Networks - using high-throughput transcriptomic data to dissect metabolic response

    OpenAIRE

    Bryant, William A; Sternberg, Michael JE; Pinney, John W

    2013-01-01

    Background With the continued proliferation of high-throughput biological experiments, there is a pressing need for tools to integrate the data produced in ways that yield biologically meaningful conclusions. Many microarray studies have analysed transcriptomic data from a pathway perspective, for instance by testing for KEGG pathway enrichment in sets of upregulated genes. However, the increasing availability of species-specific metabolic models provides the opportunity to analyse these da...

  6. High-throughput and low-latency network communication with NetIO

    OpenAIRE

    Schumacher, Jorn; Plessl, Christian; Vandelli, Wainer

    2017-01-01

    HPC network technologies like InfiniBand, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target the HPC community exclusively and are not well suited for DAQ appli...

  7. Geochip: A high throughput genomic tool for linking community structure to functions

    Energy Technology Data Exchange (ETDEWEB)

    Van Nostrand, Joy D.; Liang, Yuting; He, Zhili; Li, Guanghe; Zhou, Jizhong

    2009-01-30

    GeoChip is a comprehensive functional gene array that targets key functional genes involved in the geochemical cycling of N, C, and P, sulfate reduction, metal resistance and reduction, and contaminant degradation. Studies have shown the GeoChip to be a sensitive, specific, and high-throughput tool for microbial community analysis that has the power to link geochemical processes with microbial community structure. However, several challenges remain regarding the development and applications of microarrays for microbial community analysis.

  8. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...
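
    As a minimal sketch of how such microplate data are typically converted to enzyme activity (1 IU = 1 µmol reducing sugar released per minute), the snippet below fits a sugar standard curve and applies it to a sample reading. The standard concentrations, absorbances, incubation time and volumes are hypothetical; the published assay's exact conditions are not reproduced here.

```python
# Hypothetical conversion of microplate absorbance readings from a
# reducing-sugar assay into enzyme activity (IU per ml of enzyme solution).
import numpy as np

# Xylose standard curve: µmol per well vs background-corrected absorbance (invented).
std_umol = np.array([0.00, 0.05, 0.10, 0.20, 0.40])
std_abs  = np.array([0.00, 0.11, 0.22, 0.45, 0.88])
slope, intercept = np.polyfit(std_abs, std_umol, 1)  # linear fit of µmol on absorbance

sample_abs = 0.30        # background-corrected sample absorbance (invented)
incubation_min = 15.0    # reaction time in minutes (assumed)
enzyme_ml = 0.05         # volume of enzyme solution per well, ml (assumed)

umol_released = slope * sample_abs + intercept
activity_iu_per_ml = umol_released / incubation_min / enzyme_ml
print(f"{activity_iu_per_ml:.3f} IU ml^-1")
```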

  9. High Throughput Single-cell and Multiple-cell Micro-encapsulation

    OpenAIRE

    Lagus, Todd P.; Edd, Jon F.

    2012-01-01

    Microfluidic encapsulation methods have been previously utilized to capture cells in picoliter-scale aqueous, monodisperse drops, providing confinement from a bulk fluid environment with applications in high throughput screening, cytometry, and mass spectrometry. We describe a method to not only encapsulate single cells, but to repeatedly capture a set number of cells (here we demonstrate one- and two-cell encapsulation) to study both isolation and the interactions between cells in groups of ...
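
    For context (not part of the paper's method), random drop encapsulation follows Poisson statistics, which is the baseline that controlled single- and multiple-cell capture aims to improve upon. The sketch below, with an arbitrarily chosen loading density, shows why dilute random loading leaves most drops empty.

```python
# Poisson baseline for random cell encapsulation in drops; λ is the mean
# number of cells per drop. The loading density here is an arbitrary example.
from math import exp, factorial

def poisson_fraction(k: int, lam: float) -> float:
    """Fraction of drops expected to contain exactly k cells."""
    return (lam ** k) * exp(-lam) / factorial(k)

lam = 0.3  # a dilute loading often chosen to keep multi-cell drops rare
for k in range(4):
    print(k, round(poisson_fraction(k, lam), 3))
# At λ = 0.3 only ~22% of drops hold exactly one cell, illustrating why
# deterministic single- and two-cell capture is valuable.
```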

  10. Investigation of the antimicrobial activity of soy peptides by developing a high throughput drug screening assay

    OpenAIRE

    Dhayakaran, Rekha; Neethirajan, Suresh; Weng, Xuan

    2016-01-01

    Background Antimicrobial resistance is a great concern in the medical community as well as in the food industry. Soy peptides were tested against bacterial biofilms for their antimicrobial activity. A high-throughput drug screening assay was developed using microfluidic technology, Raman spectroscopy and optical microscopy for rapid screening of antimicrobials and rapid identification of pathogens. Methods Synthesized PGTAVFK and IKAFKEATKVDKVVVLWTA soy peptides were tested against Pseudomonas aer...

  11. High-throughput expression of animal venom toxins in Escherichia coli to generate a large library of oxidized disulphide-reticulated peptides for drug discovery.

    Science.gov (United States)

    Turchetto, Jeremy; Sequeira, Ana Filipa; Ramond, Laurie; Peysson, Fanny; Brás, Joana L A; Saez, Natalie J; Duhoo, Yoan; Blémont, Marilyne; Guerreiro, Catarina I P D; Quinton, Loic; De Pauw, Edwin; Gilles, Nicolas; Darbon, Hervé; Fontes, Carlos M G A; Vincentelli, Renaud

    2017-01-17

    Animal venoms are complex molecular cocktails containing a wide range of biologically active disulphide-reticulated peptides that target, with high selectivity and efficacy, a variety of membrane receptors. Disulphide-reticulated peptides have evolved to display improved specificity, low immunogenicity and much higher resistance to degradation than linear peptides. These properties make venom peptides attractive candidates for drug development. However, recombinant expression of reticulated peptides containing disulphide bonds is challenging, especially when associated with the production of large libraries of bioactive molecules for drug screening. To date, no comprehensive recombinant library of natural venom peptides is available as an alternative to synthetic chemical libraries for high-throughput screening to identify novel therapeutics. In the accompanying paper an efficient system for the expression and purification of oxidized disulphide-reticulated venom peptides in Escherichia coli is described. Here we report the development of a high-throughput automated platform, which could be adapted to the production of other peptide families, to generate the largest library of recombinant venom peptides assembled to date. The peptides were produced in the periplasm of E. coli using redox-active DsbC as a fusion tag, thus allowing the efficient formation of correctly folded disulphide bridges. TEV protease was used to remove fusion tags and recover the animal venom peptides in their native state. Globally, within nine months, out of a total of 4992 synthetic genes encoding a representative diversity of venom peptides, a library containing 2736 recombinant disulphide-reticulated peptides was generated. The data revealed that the animal venom peptides produced in the bacterial host were natively folded and, thus, are putatively biologically active. Overall this study reveals that high-throughput expression of animal venom peptides in E. coli can generate large

  12. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high-throughput genotyping widely accessible, but the associated high-throughput phenotyping is now the major limiting factor in the genetic analysis of traits. This paper evaluates the use of thermal imaging for the high-throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency of genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both by normalising the data, expressing genotype temperatures as differences from image means, and by the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that can be discriminated. The results showed a clear negative association between canopy temperature and final tuber yield for this population when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
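
    A minimal sketch of the normalisation described above, using invented data: each genotype's canopy temperature is expressed as a deviation from the mean of the thermal image it was measured in, so genotypes can be ranked across images taken under different conditions. Column names and values are hypothetical.

```python
# Hypothetical canopy-temperature normalisation: deviation from image mean.
import pandas as pd

df = pd.DataFrame({
    "image":    ["img1", "img1", "img1", "img2", "img2", "img2"],
    "genotype": ["G1",   "G2",   "G3",   "G1",   "G2",   "G3"],
    "temp_C":   [28.4,   29.1,   27.9,   31.0,   31.8,   30.5],
})

# Deviation of each plot from the mean temperature of its own image.
df["temp_norm"] = df["temp_C"] - df.groupby("image")["temp_C"].transform("mean")

# Genotype ranking based on normalised temperature, averaged over images.
print(df.groupby("genotype")["temp_norm"].mean().sort_values())
```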

  13. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will want guidance on best practices before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and standardization of reaction conditions according to experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about the considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequence analysis.

  14. High-throughput gene expression profiling of memory differentiation in primary human T cells

    Directory of Open Access Journals (Sweden)

    Russell Kate

    2008-08-01

    Full Text Available Abstract Background The differentiation of naive T and B cells into memory lymphocytes is essential for immunity to pathogens. Therapeutic manipulation of this cellular differentiation program could improve vaccine efficacy and the in vitro expansion of memory cells. However, chemical screens to identify compounds that induce memory differentiation have been limited by (1) the lack of reporter-gene or functional assays that can distinguish naive and memory-phenotype T cells at high throughput and (2) the lack of a suitable cell line representative of naive T cells. Results Here, we describe a method for gene-expression-based screening that allows primary naive and memory-phenotype lymphocytes to be discriminated based on complex gene signatures corresponding to these differentiation states. We used ligation-mediated amplification and a fluorescent, bead-based detection system to simultaneously quantify 55 transcripts representing naive and memory-phenotype signatures in purified populations of human T cells. The use of a multi-gene panel allowed better resolution than any constituent single gene. The method was precise, correlated well with Affymetrix microarray data, and could be easily scaled up for high throughput. Conclusion This method provides a generic solution for high-throughput differentiation screens in primary human T cells where no single-gene or functional assay is available. This screening platform will allow the identification of small molecules, genes or soluble factors that direct memory differentiation in naive human lymphocytes.
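
    One way a multi-gene panel like this can be summarised is as a single signature score per sample, for example the mean expression of memory-signature transcripts minus the mean expression of naive-signature transcripts. The sketch below illustrates that idea with invented gene names and values; the actual 55-gene panel and scoring used in the study are not reproduced here.

```python
# Hypothetical multi-gene signature score for discriminating naive vs memory
# phenotypes. Gene names and expression values are invented for illustration.
import numpy as np

naive_genes  = ["NAIVE_A", "NAIVE_B", "NAIVE_C"]   # placeholder panel members
memory_genes = ["MEM_A", "MEM_B", "MEM_C"]

expression = {  # normalised expression values for one sample (invented)
    "NAIVE_A": 2.1, "NAIVE_B": 1.8, "NAIVE_C": 2.4,
    "MEM_A": 0.6, "MEM_B": 0.9, "MEM_C": 0.7,
}

def signature_score(expr: dict) -> float:
    """Positive scores suggest a memory-like profile, negative a naive-like one."""
    mem = np.mean([expr[g] for g in memory_genes])
    naive = np.mean([expr[g] for g in naive_genes])
    return float(mem - naive)

print(signature_score(expression))  # negative -> naive-phenotype sample
```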

  15. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, picking relevant hits from such screens and generating testable hypotheses often requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of a user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
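
    The core cross-referencing idea is a set intersection between a user-supplied gene-symbol list and published hit lists, with datasets ranked by overlap. The sketch below shows that idea only; the dataset names and gene symbols are invented and do not reflect CrossCheck's actual database contents or scoring.

```python
# Minimal sketch of gene-list cross-referencing against published hit lists.
# All names below are invented for illustration.
user_genes = {"TP53", "ATM", "CHEK2", "BRCA1"}

published_datasets = {
    "RNAi_screen_A":   {"ATM", "CHEK2", "MYC", "KRAS"},
    "CRISPR_screen_B": {"TP53", "BRCA1", "ATM", "PTEN"},
    "Phospho_study_C": {"EGFR", "MAPK1"},
}

# Shared genes per dataset.
overlaps = {name: sorted(user_genes & hits) for name, hits in published_datasets.items()}

# Rank datasets by number of shared genes.
for name, shared in sorted(overlaps.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(name, len(shared), shared)
```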

  16. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Andre; Bruno, Kenneth S.; Collett, James R.; Baker, Scott E.; Seiboth, Bernhard; Kubicek, Christian P.; Schmoll, Monika

    2012-01-02

    The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  17. A high-throughput surface plasmon resonance biosensor based on differential interferometric imaging

    International Nuclear Information System (INIS)

    Wang, Daqian; Ding, Lili; Zhang, Wei; Zhang, Enyao; Yu, Xinglong; Luo, Zhaofeng; Ou, Huichao

    2012-01-01

    A new high-throughput surface plasmon resonance (SPR) biosensor based on differential interferometric imaging is reported. The two SPR interferograms of the sensing surface are imaged on two CCD cameras, with a phase difference of 180° between them. The refractive index related factor (RIRF) of the sensing surface is calculated from the two simultaneously acquired interferograms. The simulation results indicate that the RIRF exhibits a linear relationship with the refractive index of the sensing surface and is unaffected by the noise, drift and intensity distribution of the light source. Affinity and kinetic information can be extracted in real time from continuously acquired RIRF distributions. Refractometry experiments show that the dynamic detection range of the SPR differential interferometric imaging system can exceed 0.015 refractive index units (RIU), with a refractive index resolution down to 0.45 RU (1 RU = 1 × 10⁻⁶ RIU). Imaging and protein microarray experiments demonstrate the capability for high-throughput detection. The aptamer experiments demonstrate that the SPR sensor based on differential interferometric imaging has a great capability to be implemented for high-throughput aptamer kinetic evaluation. These results suggest that this biosensor has the potential to be utilized in proteomics and drug discovery after further improvement. (paper)
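
    Because the RIRF is reported to be linear in refractive index, new readings can be converted to RIU through a simple linear calibration against known standards. The sketch below shows only that calibration step with invented data; the actual computation of the RIRF from the two interferograms is not reproduced here.

```python
# Hypothetical linear calibration of a refractive-index-linear readout (RIRF).
import numpy as np

ri_standards  = np.array([1.3330, 1.3370, 1.3410, 1.3450])  # known RIU values (assumed)
rirf_readings = np.array([0.102, 0.311, 0.518, 0.729])      # invented RIRF readouts

slope, intercept = np.polyfit(ri_standards, rirf_readings, 1)

def rirf_to_ri(rirf: float) -> float:
    """Convert a measured RIRF value back to refractive index (RIU)."""
    return (rirf - intercept) / slope

print(f"{rirf_to_ri(0.415):.5f} RIU")
```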

  18. High-throughput purification of recombinant proteins using self-cleaving intein tags.

    Science.gov (United States)

    Coolbaugh, M J; Shakalli Tang, M J; Wood, D W

    2017-01-01

    High throughput methods for recombinant protein production using E. coli typically involve the use of affinity tags for simple purification of the protein of interest. One drawback of these techniques is the occasional need for tag removal before study, which can be hard to predict. In this work, we demonstrate two high throughput purification methods for untagged protein targets based on simple and cost-effective self-cleaving intein tags. Two model proteins, E. coli beta-galactosidase (βGal) and superfolder green fluorescent protein (sfGFP), were purified using self-cleaving versions of the conventional chitin-binding domain (CBD) affinity tag and the nonchromatographic elastin-like-polypeptide (ELP) precipitation tag in a 96-well filter plate format. Initial tests with shake flask cultures confirmed that the intein purification scheme could be scaled down, with >90% pure product generated in a single step using both methods. The scheme was then validated in a high throughput expression platform using 24-well plate cultures followed by purification in 96-well plates. For both tags and with both target proteins, the purified product was consistently obtained in a single-step, with low well-to-well and plate-to-plate variability. This simple method thus allows the reproducible production of highly pure untagged recombinant proteins in a convenient microtiter plate format. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Directory of Open Access Journals (Sweden)

    Schuster André

    2012-01-01

    Full Text Available Abstract Background The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. Results Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. Conclusions Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  20. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    Science.gov (United States)

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. Collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side by side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047