WorldWideScience

Sample records for high throughput optimization

  1. Chemometric Optimization Studies in Catalysis Employing High-Throughput Experimentation

    NARCIS (Netherlands)

    Pereira, S.R.M.

    2008-01-01

    The main topic of this thesis is the investigation of the synergies between High-Throughput Experimentation (HTE) and Chemometric Optimization methodologies in Catalysis research and of the use of such methodologies to maximize the advantages of using HTE methods. Several case studies were analysed

  2. High throughput RNAi assay optimization using adherent cell cytometry

    Directory of Open Access Journals (Sweden)

    Pradhan Leena

    2011-04-01

    Background: siRNA technology is a promising tool for gene therapy of vascular disease. Due to the multitude of reagents and cell types, RNAi experiment optimization can be time-consuming. In this study, adherent cell cytometry was used to rapidly optimize siRNA transfection in human aortic vascular smooth muscle cells (AoSMC). Methods: AoSMC were seeded at a density of 3000-8000 cells/well of a 96-well plate. 24 hours later, AoSMC were transfected with either non-targeting unlabeled siRNA (50 nM) or non-targeting labeled siRNA, siGLO Red (5 or 50 nM), using no transfection reagent, HiPerfect, or Lipofectamine RNAiMax. For counting cells, Hoechst nuclear stain or CellTracker Green was used. For data analysis, an adherent cell cytometer (Celigo®) was used. Data were normalized to the transfection-reagent-alone group and expressed as red pixel count/cell. Results: After 24 hours, none of the transfection conditions led to cell loss. Red fluorescence counts were normalized to the AoSMC count. RNAiMax was more potent than HiPerfect or no transfection reagent at 5 nM siGLO Red (4.12 +/-1.04 vs. 0.70 +/-0.26 vs. 0.15 +/-0.13 red pixels/cell) and 50 nM siGLO Red (6.49 +/-1.81 vs. 2.52 +/-0.67 vs. 0.34 +/-0.19). Fluorescence expression results supported gene knockdown achieved by using MARCKS-targeting siRNA in AoSMC. Conclusions: This study underscores that RNAi delivery depends heavily on the choice of delivery method. Adherent cell cytometry can be used as a high-throughput screening tool for the optimization of RNAi assays. This technology can accelerate in vitro cell assays and thus save costs.
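    The per-cell normalization described above (red fluorescence pixel count divided by cell count, relative to a reagent-alone control) can be sketched as follows; the well counts are invented for illustration, chosen only to reproduce the reported values:

```python
def red_pixels_per_cell(red_pixel_count, cell_count):
    """Total red fluorescence pixel count divided by the number of cells."""
    return red_pixel_count / cell_count

# siGLO Red at 50 nM under three delivery conditions (invented raw counts
# chosen to reproduce the reported ~6.49, ~2.52, and ~0.34 red pixels/cell)
wells = {
    "RNAiMax": {"red": 38940, "cells": 6000},
    "HiPerfect": {"red": 15120, "cells": 6000},
    "no_reagent": {"red": 2040, "cells": 6000},
}
normalized = {name: red_pixels_per_cell(w["red"], w["cells"])
              for name, w in wells.items()}
```
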

  4. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Background: In a high-throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high-throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude; cell populations can have variances that depend on their mean fluorescence intensities and may exhibit heavily skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of the data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high-throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high-throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results: We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized
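    The parameter-optimization idea can be illustrated generically. The sketch below is not the authors' flowCore implementation: it picks the cofactor of a hyperbolic-arcsine transform by maximizing a Gaussian log-likelihood of the transformed data (including the Jacobian of the transform, so likelihoods are comparable across cofactors); the candidate cofactors and simulated intensities are invented:

```python
import numpy as np

def arcsinh_loglik(x, b):
    """Gaussian log-likelihood of y = arcsinh(x/b), including the Jacobian
    term dy/dx = 1/sqrt(x^2 + b^2) so values are comparable across b."""
    y = np.arcsinh(x / b)
    mu, sd = y.mean(), y.std()
    ll = -0.5 * np.sum(((y - mu) / sd) ** 2) - x.size * np.log(sd)
    ll += -0.5 * np.sum(np.log(x ** 2 + b ** 2))  # sum of log|dy/dx|
    return ll

def best_cofactor(x, candidates):
    """Pick the cofactor that maximizes the transformed-data likelihood."""
    return max(candidates, key=lambda b: arcsinh_loglik(x, b))

rng = np.random.default_rng(0)
intensities = rng.lognormal(mean=4.0, sigma=1.0, size=5000)  # skewed, positive
cofactor = best_cofactor(intensities, [1, 5, 10, 50, 100, 500, 1000])
```
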

  5. Rapid optimization of metal nanoparticle surface modification with high-throughput gel electrophoresis.

    Science.gov (United States)

    Beskorovaynyy, Alexander V; Kopitsyn, Dmitry S; Novikov, Andrei A; Ziangirova, Maya; Skorikova, Galina S; Kotelev, Mikhail S; Gushchin, Pavel A; Ivanov, Evgeniy V; Getmansky, Michael D; Itzkan, Irving; Muradov, Alexander V; Vinokurov, Vladimir A; Perelman, Lev T

    2014-02-25

    The ability to effectively control and optimize surface modification of metal nanoparticles is paramount to the ability to employ metal nanoparticles as diagnostic and therapeutic agents in biology and medicine. Here we present a high-throughput two-dimensional-grid gel electrophoresis cell (2D-GEC)-based method, capable of optimizing the surface modification of as many as 96 samples of metal nanoparticles in approximately 1 h. The 2D-GEC method determines not only the average zeta-potential of the modified particles but also the homogeneity of the surface modification by measuring the distance between the front of the sample track and the area where the maximum optical density is achieved. The method was tested for optimizing pH and concentration of the modifiers (pM) for functionalizing gold nanorods with thiol-containing acidic agents.

  6. On the optimal trimming of high-throughput mRNA sequence data

    Directory of Open Access Journals (Sweden)

    Matthew D MacManes

    2014-01-01

    The widespread and rapid adoption of high-throughput sequencing technologies has afforded researchers the opportunity to gain a deep understanding of genome level processes that underlie evolutionary change, and perhaps more importantly, the links between genotype and phenotype. In particular, researchers interested in functional biology and adaptation have used these technologies to sequence mRNA transcriptomes of specific tissues, which in turn are often compared to other tissues, or other individuals with different phenotypes. While these techniques are extremely powerful, careful attention to data quality is required. In particular, because high-throughput sequencing is more error-prone than traditional Sanger sequencing, quality trimming of sequence reads should be an important step in all data processing pipelines. While several software packages for quality trimming exist, no general guidelines for the specifics of trimming have been developed. Here, using empirically derived sequence data, I provide general recommendations regarding the optimal strength of trimming, specifically in mRNA-Seq studies. Although very aggressive quality trimming is common, this study suggests that a more gentle trimming, specifically of those nucleotides whose Phred score < 2 or < 5, is optimal for most studies across a wide variety of metrics.
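    A minimal sketch of the kind of quality trimming discussed above, assuming Phred+33-encoded quality strings; this is a generic 3'-end trimmer for illustration, not the specific software evaluated in the study:

```python
def trim_read(seq, qual, threshold=2, offset=33):
    """Remove bases from the 3' end whose Phred score falls below threshold.
    qual is a Phred+33 encoded quality string (ASCII 33, '!', encodes Q0)."""
    scores = [ord(c) - offset for c in qual]
    end = len(seq)
    while end > 0 and scores[end - 1] < threshold:
        end -= 1
    return seq[:end], qual[:end]

# '#' encodes Q2 and '!' encodes Q0 in Phred+33
gentle = trim_read("ACGTACGT", "IIIIII#!", threshold=2)   # ('ACGTACG', 'IIIIII#')
stricter = trim_read("ACGTACGT", "IIIIII#!", threshold=5)  # ('ACGTAC', 'IIIIII')
```

With threshold 2 only the Q0 base is removed, while threshold 5 also drops the Q2 base, matching the gentle-versus-aggressive trade-off the abstract describes.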

  7. Optimizing synchrotron microCT for high-throughput phenotyping of zebrafish

    Science.gov (United States)

    La Rivière, Patrick J.; Clark, Darin; Rojek, Alexandra; Vargas, Phillip; Xiao, Xianghui; DeCarlo, Francesco; Kindlmann, Gordon; Cheng, Keith

    2010-09-01

    We are creating a state-of-the-art 2D and 3D imaging atlas of zebrafish development. The atlas employs both 2D histology slides and 3D benchtop and synchrotron microCT results. Through this atlas, we expect to document normal and abnormal organogenesis, to reveal new levels of structural detail, and to advance image informatics as a form of systems biology. The zebrafish has become a widely used model organism in biological and biomedical research for studies of vertebrate development and gene function. In this work, we will report on efforts to optimize synchrotron microCT imaging parameters for zebrafish at crucial developmental stages. The aim of these studies is to establish protocols for high-throughput phenotyping of normal, mutant and diseased zebrafish. We have developed staining and embedding protocols using different heavy metal stains (osmium tetroxide and uranyl acetate) and different embedding media (Embed 812 and glycol methacrylate). We have explored the use of edge subtraction and multi-energy techniques for contrast enhancement and we have examined the use of different sample-detector distances with unstained samples to explore and optimize phase-contrast enhancement effects. We will report principally on our efforts to optimize energy choice for single- and multi-energy studies as well as our efforts to optimize the degree of phase contrast enhancement.

  8. 3D printed high-throughput hydrothermal reactionware for discovery, optimization, and scale-up.

    Science.gov (United States)

    Kitson, Philip J; Marshall, Ross J; Long, Deliang; Forgan, Ross S; Cronin, Leroy

    2014-11-17

    3D printing techniques allow the laboratory-scale design and production of reactionware tailored to specific experimental requirements. To increase the range and versatility of reactionware devices, sealed, monolithic reactors suitable for use in hydrothermal synthesis have been digitally designed and realized. The fabrication process allows the introduction of reaction mixtures directly into the reactors during the production, and also enables the manufacture of devices of varying scales and geometries unavailable in traditional equipment. The utility of these devices is shown by the use of 3D printed, high-throughput array reactors to discover two new coordination polymers, optimize the synthesis of one of these, and scale-up its synthesis using larger reactors produced on the same 3D printer. Reactors were also used to produce phase-pure samples of coordination polymers MIL-96 and HKUST-1, in yields comparable to synthesis in traditional apparatus. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. End-to-End Optimization of High-Throughput DNA Sequencing.

    Science.gov (United States)

    O'Reilly, Eliza; Baccelli, Francois; De Veciana, Gustavo; Vikalo, Haris

    2016-10-01

    At the core of Illumina's high-throughput DNA sequencing platforms lies a biophysical surface process, bridge amplification, that results in a random geometry of clusters of homogeneous short DNA fragments, typically hundreds of base pairs long. The statistical properties of this random process and the lengths of the fragments are critical, as they affect the information that can subsequently be extracted, that is, the density of successfully inferred DNA fragment reads. The ensembles of overlapping DNA fragment reads are then used to computationally reconstruct the much longer target genome sequence. The success of the reconstruction in turn depends on having a sufficiently large ensemble of DNA fragments that are sufficiently long. In this article, using stochastic geometry, we model and optimize the end-to-end flow cell synthesis and target genome sequencing process, linking and partially controlling the statistics of the physical processes to the success of the final computational step. Based on a rough calibration of our model, we provide, for the first time, a mathematical framework capturing the salient features of the sequencing platform that serves as a basis for optimizing cost, performance, and/or sensitivity analysis to various parameters.
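    The paper's stochastic-geometry model is not reproduced here, but the classic Lander-Waterman calculation conveys the underlying trade-off between the size of the read ensemble and genome coverage; the read counts below are illustrative only:

```python
import math

def expected_covered_fraction(n_reads, read_len, genome_len):
    """Lander-Waterman model: with mean coverage c = n*L/G, the expected
    fraction of the genome covered by at least one read is 1 - exp(-c)."""
    c = n_reads * read_len / genome_len
    return 1.0 - math.exp(-c)

# 1 million 150 bp reads on a 30 Mb genome give 5x mean coverage,
# so roughly 99.3% of the genome is expected to be covered at least once
frac = expected_covered_fraction(1_000_000, 150, 30_000_000)
```
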

  10. Development of an optimized medium, strain and high-throughput culturing methods for Methylobacterium extorquens.

    Directory of Open Access Journals (Sweden)

    Nigel F Delaney

    Methylobacterium extorquens strains are the best-studied methylotrophic model system, and their metabolism of single-carbon compounds has been studied for over 50 years. Here we develop a new system for high-throughput batch culture of M. extorquens in microtiter plates by jointly optimizing the properties of the organism, the growth media, and the culturing system. After removing cellulose synthase genes in M. extorquens strains AM1 and PA1 to prevent biofilm formation, we found that currently available lab automation equipment, integrated and managed by open source software, makes possible reliable estimates of the exponential growth rate. Using this system, we developed an optimized growth medium for M. extorquens using response surface methodologies. We found that media that used EDTA as a metal chelator inhibited growth and led to inconsistent culture conditions. In contrast, the new medium we developed, with a PIPES buffer and metals chelated by citrate, allowed for fast and more consistent growth rates. This new Methylobacterium PIPES ('MP') medium was also robust to large deviations in its component ingredients, which avoids batch effects from experiments that use media prepared at different times. MP medium allows for faster and more consistent growth than other media used for M. extorquens.
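    Estimating an exponential growth rate from plate-reader time courses, as mentioned above, is typically a log-linear fit; a minimal sketch with synthetic optical-density data (the study's actual analysis software is not specified here):

```python
import numpy as np

def growth_rate_per_h(times_h, od):
    """Exponential growth rate from a linear fit of log(OD) versus time;
    the doubling time is ln(2) divided by this rate."""
    slope, _intercept = np.polyfit(times_h, np.log(od), 1)
    return slope

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
od = 0.05 * np.exp(0.35 * t)        # synthetic culture growing at 0.35 per hour
mu = growth_rate_per_h(t, od)        # recovers ~0.35
doubling_time_h = np.log(2) / mu
```
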

  11. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  12. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    Directory of Open Access Journals (Sweden)

    Astrid Junker

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small scale plant cultivations, HT systems have much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed that elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested, and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system did match well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications.

  13. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems.

    Science.gov (United States)

    Junker, Astrid; Muraya, Moses M; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E; Meyer, Rhonda C; Riewe, David; Altmann, Thomas

    2014-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small scale plant cultivations, HT systems have much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed that elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested, and optimized with regard to growth substrate, soil coverage, watering regime, experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system did match well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications.
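    One standard way to handle the environmental inhomogeneities mentioned above is a randomized block layout; the sketch below is a generic illustration with invented genotype labels, not the authors' actual design procedure:

```python
import random

def randomized_blocks(genotypes, n_blocks, seed=0):
    """Place each genotype once per block (e.g. one greenhouse zone or
    conveyor lane per block), with a fresh random order in every block so
    that positional effects average out across replicates."""
    rng = random.Random(seed)
    layout = []
    for _ in range(n_blocks):
        order = list(genotypes)
        rng.shuffle(order)
        layout.append(order)
    return layout

layout = randomized_blocks(["G1", "G2", "G3", "G4"], n_blocks=3)
```
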

  14. High-Throughput Synthesis, Screening, and Scale-Up of Optimized Conducting Indium Tin Oxides.

    Science.gov (United States)

    Marchand, Peter; Makwana, Neel M; Tighe, Christopher J; Gruar, Robert I; Parkin, Ivan P; Carmalt, Claire J; Darr, Jawwad A

    2016-02-08

    A high-throughput optimization and subsequent scale-up methodology has been used for the synthesis of conductive tin-doped indium oxide (known as ITO) nanoparticles. ITO nanoparticles with up to 12 at % Sn were synthesized using a laboratory scale (15 g/hour by dry mass) continuous hydrothermal synthesis process, and the as-synthesized powders were characterized by powder X-ray diffraction, transmission electron microscopy, energy-dispersive X-ray analysis, and X-ray photoelectron spectroscopy. Under standard synthetic conditions, either the cubic In2O3 phase, or a mixture of InO(OH) and In2O3 phases were observed in the as-synthesized materials. These materials were pressed into compacts and heat-treated in an inert atmosphere, and their electrical resistivities were then measured using the Van der Pauw method. Sn doping yielded resistivities of ∼ 10(-2) Ω cm for most samples with the lowest resistivity of 6.0 × 10(-3) Ω cm (exceptionally conductive for such pressed nanopowders) at a Sn concentration of 10 at %. Thereafter, the optimized lab-scale composition was scaled-up using a pilot-scale continuous hydrothermal synthesis process (at a rate of 100 g/hour by dry mass), and a comparable resistivity of 9.4 × 10(-3) Ω cm was obtained. The use of the synthesized TCO nanomaterials for thin film fabrication was finally demonstrated by deposition of a transparent, conductive film using a simple spin-coating process.
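    The Van der Pauw resistivity measurement used above amounts to solving a transcendental equation for the sheet resistance; a sketch under the method's usual assumptions (uniform thickness, small contacts at the sample edge), with bisection as an illustrative solver:

```python
import math

def sheet_resistance(r1, r2, tol=1e-10):
    """Solve the Van der Pauw equation
        exp(-pi*r1/Rs) + exp(-pi*r2/Rs) = 1
    for the sheet resistance Rs. The left side minus one is increasing in
    Rs, so a simple bisection after bracketing the root suffices."""
    f = lambda rs: math.exp(-math.pi * r1 / rs) + math.exp(-math.pi * r2 / rs) - 1.0
    lo, hi = 1e-12, 1.0
    while f(hi) < 0:            # expand until the root is bracketed
        hi *= 2.0
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def resistivity(r1, r2, thickness_cm):
    """Bulk resistivity (ohm*cm) = sheet resistance * sample thickness."""
    return sheet_resistance(r1, r2) * thickness_cm

# symmetric case: both transresistances equal R gives Rs = pi*R/ln(2)
rs_sym = sheet_resistance(1.0, 1.0)
```
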

  16. Optimization of Methanotrophic Growth and Production of Poly(3-Hydroxybutyrate) in a High-Throughput Microbioreactor System

    Science.gov (United States)

    Criddle, Craig S.

    2015-01-01

    Production of poly(3-hydroxybutyrate) (P3HB) from methane has economic and environmental advantages over production from agricultural feedstocks. Identification of high-productivity strains and optimal growth conditions is critical to efficient conversion of methane to polymer. Current culture conditions, including serum bottles, shake flasks, and agar plates, are labor-intensive and therefore insufficient for systematic screening and isolation. Gas chromatography, the standard method for analysis of P3HB content in bacterial biomass, is also incompatible with high-throughput screening. Growth in aerated microtiter plates coupled with a 96-well Nile red flow-cytometric assay creates an integrated microbioreactor system for high-throughput growth and analysis of P3HB-producing methanotrophic cultures, eliminating the need for individual manipulation of experimental replicates. This system was tested in practice to conduct medium optimization for P3HB production in pure cultures of Methylocystis parvus OBBP. Optimization gave insight into unexpected interactions: for example, low calcium concentrations significantly enhanced P3HB production under nitrogen-limited conditions. Optimization of calcium and copper concentrations in the growth medium increased final P3HB content from 18.1% to 49.4% and P3HB concentration from 0.69 g/liter to 3.43 g/liter while reducing doubling time from 10.6 h to 8.6 h. The ability to culture and analyze thousands of replicates with high mass transfer in completely mixed culture promises to streamline medium optimization and allow the detection and isolation of highly productive strains. Applications for this system are numerous, encompassing analysis of biofuels and other lipid inclusions, as well as analysis of heterotrophic and photosynthetic systems. PMID:25956771
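    Medium optimization of the kind described above usually begins with a factorial screen of candidate component levels; a minimal sketch with invented calcium and copper levels (the study's actual design points are not given in the abstract):

```python
from itertools import product

# Hypothetical concentration levels; the abstract reports only that calcium
# and copper were the variables optimized, not the levels screened.
calcium_uM = [0, 5, 25, 125]
copper_uM = [0.1, 1, 10]
replicates = 3

# Full-factorial screen: one microtiter well per (Ca, Cu, replicate) triple
conditions = [
    {"Ca_uM": ca, "Cu_uM": cu, "rep": r}
    for ca, cu in product(calcium_uM, copper_uM)
    for r in range(replicates)
]
# 4 Ca levels x 3 Cu levels x 3 replicates = 36 wells of a 96-well plate
```
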

  17. Optimized negative staining: a high-throughput protocol for examining small and asymmetric protein structure by electron microscopy.

    Science.gov (United States)

    Rames, Matthew; Yu, Yadong; Ren, Gang

    2014-08-15

    Structural determination is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass between 40 and 200 kDa, a robust and high-throughput method with nanometer-resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach which has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which usually present rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution and high-contrast images are even favorable for individual-protein (a single object, no averaging) 3D reconstruction, such as of a 160 kDa antibody, through the method of electron tomography. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples. Considering that cryo-EM rarely succeeds in imaging proteins of less than 200 kDa, and has yet to produce a published study involving the screening of over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. Hopefully the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics, and mechanisms.

  18. A high-throughput reactor system for optimization of Mo–V–Nb mixed oxide catalyst composition in ethane ODH

    KAUST Repository

    Zhu, Haibo

    2015-01-01

    Seventy-five Mo-V-Nb mixed oxide catalysts with a broad range of compositions were prepared by a simple evaporation method and were screened for the ethane oxidative dehydrogenation (ODH) reaction. The compositions of these 75 catalysts were systematically changed by varying the Nb loading and the Mo/V molar ratio. Characterization by XRD, XPS, H2-TPR and SEM revealed that an intimate structure is formed among the three components. The strong interaction among different components leads to the formation of a new phase or an "intimate structure". The dependency of conversion and selectivity on the catalyst composition was clearly demonstrated by the results of high-throughput testing. The optimized Mo-V-Nb molar composition was confirmed to comprise a Nb content of 4-8%, a Mo content of 70-83%, and a V content of 12-25%. The enhanced catalytic performance of the mixed oxides is evidently due to the synergistic effects of the different components. The optimized compositions for ethane ODH revealed in our high-throughput tests, and the structural information provided by our characterization studies, can serve as the starting point for future efforts to improve the catalytic performance of Mo-V-Nb oxides. This journal is © The Royal Society of Chemistry.
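    The composition space screened above can be sketched as a constrained grid; the 5% step size is an assumption for illustration, and the filter simply restates the optimal window reported in the abstract:

```python
def composition_grid(step=5):
    """All (Mo, V, Nb) molar compositions in percent that sum to 100,
    enumerated on a regular grid."""
    comps = []
    for nb in range(0, 101, step):
        for mo in range(0, 101 - nb, step):
            comps.append((mo, 100 - nb - mo, nb))   # (Mo, V, Nb)
    return comps

grid = composition_grid(step=5)
# window reported in the abstract: Nb 4-8%, Mo 70-83%, V 12-25%
optimal = [(mo, v, nb) for mo, v, nb in grid
           if 4 <= nb <= 8 and 70 <= mo <= 83 and 12 <= v <= 25]
```

On a 5% grid only three compositions fall inside the reported optimal window, which illustrates why a systematic screen over many compositions is needed to locate it.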

  19. High throughput methodology for synthesis, screening, and optimization of solid state lithium ion electrolytes.

    Science.gov (United States)

    Beal, Mark S; Hayden, Brian E; Le Gall, Thierry; Lee, Christopher E; Lu, Xiaojuan; Mirsaneh, Mehdi; Mormiche, Claire; Pasero, Denis; Smith, Duncan C A; Weld, Andrew; Yada, Chihiro; Yokoishi, Shoji

    2011-07-11

    A study of the lithium ion conductor Li(3x)La(2/3-x)TiO(3) solid solution and the surrounding composition space was carried out using a high throughput physical vapor deposition system. An optimum total ionic conductivity value of 5.45 × 10(-4) S cm(-1) was obtained for the composition Li(0.17)La(0.29)Ti(0.54) (Li(3x)La(2/3-x)TiO(3), x = 0.11). This optimum value was calculated using an artificial neural network model based on the empirical data. Due to the large scale of the data set produced and the complexity of synthesis, informatics tools were required to analyze the data. Partition analysis was carried out to determine the synthetic parameters of importance and their threshold values. Multivariate curve resolution and principal component analysis were applied to the diffraction data set. This analysis enabled the construction of phase distribution diagrams, illustrating both the phases obtained and the compositional zones in which they occur. The synthetic technique presented has significant advantages over other thin film and bulk methodologies, in terms of both the compositional range covered and the nature of the materials produced.

  20. Comprehensive evaluation and optimization of amplicon library preparation methods for high-throughput antibody sequencing.

    Directory of Open Access Journals (Sweden)

    Ulrike Menzel

    High-throughput sequencing (HTS) of antibody repertoire libraries has become a powerful tool in the field of systems immunology. However, numerous sources of bias in HTS workflows may affect the obtained antibody repertoire data. A crucial step in antibody library preparation is the addition of short platform-specific nucleotide adapter sequences. As of yet, the impact of the method of adapter addition on experimental library preparation and the resulting antibody repertoire HTS datasets has not been thoroughly investigated. Therefore, we compared three standard library preparation methods by performing Illumina HTS on antibody variable heavy genes from murine antibody-secreting cells. Clonal overlap and rank statistics demonstrated that the investigated methods produced equivalent HTS datasets. PCR-based methods were experimentally superior to ligation with respect to speed, efficiency, and practicality. Finally, using a two-step PCR based method we established a protocol for antibody repertoire library generation, beginning from inputs as low as 1 ng of total RNA. In summary, this study represents a major advance towards a standardized experimental framework for antibody HTS, thus opening up the potential for systems-based, cross-experiment meta-analyses of antibody repertoires.
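    Clonal overlap between repertoires, used above to compare library preparation methods, is often quantified with a Jaccard-style index; a sketch with invented clone identifiers (the study's exact overlap statistic may differ):

```python
def clonal_overlap(rep_a, rep_b):
    """Jaccard-style overlap between two repertoire datasets, each given as
    a set of clone identifiers (e.g. CDR3 amino acid sequences)."""
    a, b = set(rep_a), set(rep_b)
    return len(a & b) / len(a | b)

# invented clone identifiers for two library preparation methods
ligation = {"CARDYW", "CARGGW", "CTRDLW"}
two_step_pcr = {"CARDYW", "CARGGW", "CARSSW"}
overlap = clonal_overlap(ligation, two_step_pcr)  # 2 shared of 4 total = 0.5
```
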

  1. Compositional optimization of polyimide-based SEPPI membranes using a genetic algorithm and high-throughput techniques.

    Science.gov (United States)

    Vandezande, Pieter; Gevers, Lieven E M; Weyens, Nele; Vankelecom, Ivo F J

    2009-03-09

    Asymmetric, nanosized zeolite-filled solvent resistant nanofiltration (SRNF) membranes, prepared from emulsified polyimide (PI) solutions via the earlier reported solidification of emulsified polymer solutions via phase inversion (SEPPI) method, were optimized for their performance in the separation of rose bengal (RB) from 2-propanol (IPA). All membranes were prepared and tested in a parallelized, miniaturized, and automated manner using laboratory-developed high-throughput experimentation techniques. Nine different synthesis parameters related to the composition of the casting solutions were thus optimized. In a first, "conventional" approach, a preliminary systematic screening was carried out, in which only four constituents were used, that is, Matrimid PI, NMP as solvent, THF as volatile cosolvent, and an NMP-based zeolite precursor sol as emulsifying agent. A combinatorial strategy, based on a genetic algorithm and a self-adaptive evolutionary strategy, was then applied to optimize the SRNF performance of PI-based SEPPI membranes. This directed approach allowed the screening of an extended, 9-dimensional parameter space, comprising two extra solvents, the two corresponding nanosized zeolite suspensions, as well as another cosolvent. Coupling with high-throughput techniques allowed the preparation of three generations of casting solutions, 176 compositions in total, resulting in 125 testable membranes. With IPA permeances up to 3.3 L m(-2) h(-1) bar(-1) and RB rejections around 98%, the combinatorially optimized membranes scored significantly better with respect to fluxes and selectivities than the best membranes obtained in the systematic screening. The best SEPPI membranes also showed much higher IPA permeances than two commercial SRNF membranes at similar or slightly lower RB rejections.
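    The genetic-algorithm strategy described above can be illustrated with a toy optimizer. This is a generic sketch (truncation selection, uniform crossover, Gaussian mutation over a 9-dimensional parameter vector), not the paper's self-adaptive evolutionary strategy, and the fitness function is invented:

```python
import random

def genetic_optimize(fitness, n_params, pop_size=20, generations=30, seed=1):
    """Toy genetic algorithm over parameter vectors in [0, 1]^n: keep the
    top half each generation, fill the rest with uniform-crossover children
    carrying one Gaussian-mutated gene."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [ai if rng.random() < 0.5 else bi for ai, bi in zip(a, b)]
            i = rng.randrange(n_params)
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# invented fitness: the "best membrane" has all nine parameters near 0.7
best = genetic_optimize(lambda x: -sum((xi - 0.7) ** 2 for xi in x), n_params=9)
```

Keeping the parents each generation (elitism) guarantees that the best composition found so far is never lost, mirroring how each generation of casting solutions builds on the previous one.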

  2. Optimized high-throughput microRNA expression profiling provides novel biomarker assessment of clinical prostate and breast cancer biopsies

    Directory of Open Access Journals (Sweden)

    Fedele Vita

    2006-06-01

Full Text Available Abstract Background Recent studies indicate that microRNAs (miRNAs) are mechanistically involved in the development of various human malignancies, suggesting that they represent a promising new class of cancer biomarkers. However, previously reported methods for measuring miRNA expression consume large amounts of tissue, prohibiting high-throughput miRNA profiling from typically small clinical samples such as excision or core needle biopsies of breast or prostate cancer. Here we describe a novel combination of linear amplification and labeling of miRNA for highly sensitive expression microarray profiling requiring only picogram quantities of purified microRNA. Results Comparison of microarray and qRT-PCR measured miRNA levels from two different prostate cancer cell lines showed concordance between the two platforms (Pearson correlation, R² = 0.81), and extension of the amplification, labeling and microarray platform was successfully demonstrated using clinical core and excision biopsy samples from breast and prostate cancer patients. Unsupervised clustering analysis of the prostate biopsy microarrays separated advanced and metastatic prostate cancers from pooled normal prostatic samples and from a non-malignant precursor lesion. Unsupervised clustering of the breast cancer microarrays significantly distinguished ErbB2-positive/ER-negative, ErbB2-positive/ER-positive, and ErbB2-negative/ER-positive breast cancer phenotypes (Fisher exact test, p = 0.03). In addition, supervised analysis of these microarray profiles identified distinct miRNA subsets distinguishing ErbB2-positive from ErbB2-negative and ER-positive from ER-negative breast cancers, independent of other clinically important parameters (patient age, tumor size, node status and proliferation index).
Conclusion In sum, these findings demonstrate that optimized high-throughput microRNA expression profiling offers novel biomarker identification from typically small clinical samples such as breast

  3. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  4. Throughput Optimization of Continuous Biopharmaceutical Manufacturing Facilities.

    Science.gov (United States)

    Garcia, Fernando A; Vandiver, Michael W

    2017-01-01

    In order to operate profitably under different product demand scenarios, biopharmaceutical companies must design their facilities with mass output flexibility in mind. Traditional biologics manufacturing technologies pose operational challenges in this regard due to their high costs and slow equipment turnaround times, restricting the types of products and mass quantities that can be processed. Modern plant design, however, has facilitated the development of lean and efficient bioprocessing facilities through footprint reduction and adoption of disposable and continuous manufacturing technologies. These development efforts have proven to be crucial in seeking to drastically reduce the high costs typically associated with the manufacturing of recombinant proteins. In this work, mathematical modeling is used to optimize annual production schedules for a single-product commercial facility operating with a continuous upstream and discrete batch downstream platform. Utilizing cell culture duration and volumetric productivity as process variables in the model, and annual plant throughput as the optimization objective, 3-D surface plots are created to understand the effect of process and facility design on expected mass output. The model shows that once a plant has been fully debottlenecked it is capable of processing well over a metric ton of product per year. Moreover, the analysis helped to uncover a major limiting constraint on plant performance, the stability of the neutralized viral inactivated pool, which may indicate that this should be a focus of attention during future process development efforts. LAY ABSTRACT: Biopharmaceutical process modeling can be used to design and optimize manufacturing facilities and help companies achieve a predetermined set of goals. One way to perform optimization is by making the most efficient use of process equipment in order to minimize the expenditure of capital, labor and plant resources. To that end, this paper introduces a
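The abstract's 3-D throughput surface over cell culture duration and volumetric productivity can be illustrated with a toy mass-output model. All constants (bioreactor volume, downstream yield, turnaround time) and the batch-counting logic are invented for the sketch and are much simpler than a real facility model:

```python
# Toy model: annual mass output of a continuous-upstream facility as a
# function of culture duration and volumetric productivity.
BIOREACTOR_VOLUME_L = 500    # assumed perfusion bioreactor working volume
YIELD_FRACTION = 0.7         # assumed overall downstream recovery
TURNAROUND_DAYS = 7          # assumed reset time between production runs

def annual_output_kg(culture_days, productivity_g_per_l_day):
    runs_per_year = 365 // (culture_days + TURNAROUND_DAYS)
    mass_per_run_kg = (productivity_g_per_l_day * BIOREACTOR_VOLUME_L
                       * culture_days * YIELD_FRACTION) / 1000.0
    return runs_per_year * mass_per_run_kg

# Sweep the two process variables to build the kind of response surface
# described in the abstract (here just a dict keyed by grid point).
surface = {
    (d, p): annual_output_kg(d, p)
    for d in (30, 45, 60)
    for p in (0.5, 1.0, 1.5)
}
```

Plotting `surface` against its two axes gives the 3-D surface view; in a real study each grid point would come from a full facility simulation including debottlenecking constraints such as the neutralized pool stability noted above.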

  5. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  6. Large-scale tracking and classification for automatic analysis of cell migration and proliferation, and experimental optimization of high-throughput screens of neuroblastoma cells.

    Science.gov (United States)

    Harder, Nathalie; Batra, Richa; Diessl, Nicolle; Gogolin, Sina; Eils, Roland; Westermann, Frank; König, Rainer; Rohr, Karl

    2015-06-01

    Computational approaches for automatic analysis of image-based high-throughput and high-content screens are gaining increased importance to cope with the large amounts of data generated by automated microscopy systems. Typically, automatic image analysis is used to extract phenotypic information once all images of a screen have been acquired. However, also in earlier stages of large-scale experiments image analysis is important, in particular, to support and accelerate the tedious and time-consuming optimization of the experimental conditions and technical settings. We here present a novel approach for automatic, large-scale analysis and experimental optimization with application to a screen on neuroblastoma cell lines. Our approach consists of cell segmentation, tracking, feature extraction, classification, and model-based error correction. The approach can be used for experimental optimization by extracting quantitative information which allows experimentalists to optimally choose and to verify the experimental parameters. This involves systematically studying the global cell movement and proliferation behavior. Moreover, we performed a comprehensive phenotypic analysis of a large-scale neuroblastoma screen including the detection of rare division events such as multi-polar divisions. Major challenges of the analyzed high-throughput data are the relatively low spatio-temporal resolution in conjunction with densely growing cells as well as the high variability of the data. To account for the data variability we optimized feature extraction and classification, and introduced a gray value normalization technique as well as a novel approach for automatic model-based correction of classification errors. In total, we analyzed 4,400 real image sequences, covering observation periods of around 120 h each. We performed an extensive quantitative evaluation, which showed that our approach yields high accuracies of 92.2% for segmentation, 98.2% for tracking, and 86.5% for

  7. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

Full Text Available In research laboratories devoted to both organic synthesis and the extraction of natural products, many compounds that may potentially exhibit biological activity are obtained every day. It is therefore necessary to have in vitro assays that provide reliable information for further evaluation in in vivo systems. From this point of view, the use of high-throughput screening assays has intensified in recent years. Such assays should be optimized and validated to give accurate and precise, i.e. reliable, results. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as the detection system. It focuses particularly on the selection of the method, optimization to determine the best experimental conditions, validation, application of the optimized and validated method to real samples, and finally maintenance and possible transfer to a new laboratory.
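An enzymatic inhibition screen read by UV-Visible spectrophotometry ultimately reduces to a few standard calculations. A minimal sketch, assuming each plate carries uninhibited control wells and no-enzyme blank wells, together with the widely used Z'-factor (Zhang et al.) for judging whether an optimized assay is screening-ready; the function names are mine, not the review's:

```python
import statistics

def percent_inhibition(a_sample, a_control, a_blank):
    # Control = uninhibited enzyme reaction; blank = no-enzyme background.
    return 100.0 * (1.0 - (a_sample - a_blank) / (a_control - a_blank))

def z_prime(positives, negatives):
    # Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    # Values >= 0.5 conventionally indicate an excellent assay window.
    sp, sn = statistics.stdev(positives), statistics.stdev(negatives)
    mp, mn = statistics.mean(positives), statistics.mean(negatives)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)
```

For example, a sample absorbance halfway between blank and control gives 50% inhibition, and replicate control/blank absorbances feed `z_prime` during validation.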

  8. Optimization of a micro-scale, high throughput process development tool and the demonstration of comparable process performance and product quality with biopharmaceutical manufacturing processes.

    Science.gov (United States)

    Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J

    2017-07-14

In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench-scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Optimal Thawing of Cryopreserved Peripheral Blood Mononuclear Cells for Use in High-Throughput Human Immune Monitoring Studies

    Directory of Open Access Journals (Sweden)

    Ramu A. Subbramanian

    2012-07-01

Full Text Available Cryopreserved peripheral blood mononuclear cells (PBMC) constitute an important component of immune monitoring studies as they allow for efficient batch-testing of samples as well as for the validation and extension of original studies in the future. In this study, we systematically test the permutations of PBMC thawing practices commonly employed in the field and identify conditions that pose high and low risk to the viability of PBMC and their functionality in downstream ELISPOT assays. The study identifies the addition of ice-chilled washing media to thawed cells at the same temperature as a high-risk practice, as it yields significantly lower viability and functionality of recovered PBMC when compared to warming the cryovials to 37 °C and adding a warm washing medium. We found thawed PBMC in cryovials could be kept for up to 30 minutes at 37 °C in the presence of DMSO before commencement of washing, which surprisingly identifies exposure to DMSO as a low-risk step during the thawing process. This latter finding is of considerable practical relevance since it permits batch-thawing of PBMC in high-throughput immune monitoring environments.

  10. Establishing a high throughput method for medium optimization – a case study using Streptomyces lividans as host for production of heterologous protein

    DEFF Research Database (Denmark)

    Rattleff, Stig; Thykaer, Jette; Lantz, Anna Eliasson

    2012-01-01

    composition can have great effect on the cellular performance, in particular on heterologous protein production. It is a parameter that can be adjusted regardless of GMO concerns or knowledge of genomic sequence. Optimizing medium composition can be labor intensive opening up for introducing automation......Actinomycetes are widely known for production of antibiotics, though as hosts for heterologous protein expression they show great potential which should be further developed. Streptomyces lividans is especially interesting due to very low endogenous protease activity and the capability to secrete....... In this study a potential high throughput method was tested for optimizing medium composition, with respect to nitrogen, to improve heterologous protein production in S. lividans, using mRFP as a model protein. A large number of nitrogen sources were tested in an initial, highly automated, screen. Subsequently...

  11. Development and optimization of ultra-high performance supercritical fluid chromatography mass spectrometry method for high-throughput determination of tocopherols and tocotrienols in human serum.

    Science.gov (United States)

    Pilařová, Veronika; Gottvald, Tomáš; Svoboda, Pavel; Novák, Ondřej; Benešová, Karolína; Běláková, Sylvie; Nováková, Lucie

    2016-08-31

The goal of this study was to develop an effective supercritical fluid chromatography method using single-quadrupole MS for the analysis of all isomeric forms of vitamin E. Two fast and effective methods, a high-resolution one and a high-speed one, for the determination of 8 vitamin E isomers in human serum were ultimately developed. Rapid high-throughput liquid-liquid extraction was selected as the sample preparation step. Sample pretreatment of 100 μL human serum consisted of protein precipitation with 200 μL ethanol and liquid-liquid extraction with 400 μL hexane/dichloromethane (80/20, v/v). The separation was performed on a BEH 2-EP (3.0 × 100 mm, 1.7 μm) stationary phase, using isocratic elution with carbon dioxide and 10 mM ammonium formate in methanol in a 98:2 ratio for the high-resolution method (run time 4.5 min) and in a 95:5 ratio for the high-speed method (run time 2.5 min). Method development included optimization of key parameters: the choice of a suitable stationary phase and the composition of the mobile phase, where the influence of various modifiers, their ratios, and additives was tested, as well as optimization of fine-tuning parameters including BPR pressure, flow rate and column temperature. Quantification of all isomeric forms was performed using SIM (single ion monitoring) experiments in ESI positive ion mode. Both the high-speed and high-resolution chromatographic methods were validated in terms of precision, accuracy, range, linearity, LOD, LOQ and matrix effects using the same LLE procedure. The high-resolution method provided more sensitive results (LOD: 0.017-0.083 μg mL⁻¹) and better linearity (r² > 0.9930) than the high-speed one (LOD: 0.083-0.25 μg mL⁻¹, r² > 0.9877) at the cost of double the analysis time. Copyright © 2016 Elsevier B.V. All rights reserved.
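LOD and LOQ figures of the kind reported above are conventionally derived from a calibration line. A minimal sketch of that generic calculation, using the ICH-style estimates LOD = 3.3σ/S and LOQ = 10σ/S with the slope S from least squares and σ taken as the standard deviation of the regression residuals; the calibration data in the test are hypothetical, not the study's:

```python
import statistics

def linear_fit(x, y):
    # Least-squares slope and intercept of the calibration line.
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_loq(x, y):
    # ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S, with
    # sigma = standard deviation of the calibration residuals.
    slope, intercept = linear_fit(x, y)
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sigma = statistics.stdev(residuals)
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```

Note that σ may alternatively be taken from the standard deviation of blank responses or of the intercept; the residual-based variant shown here is one of the accepted options.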

  12. Applications of High Throughput Sequencing for Immunology and Clinical Diagnostics

    OpenAIRE

    Kim, Hyunsung John

    2014-01-01

    High throughput sequencing methods have fundamentally shifted the manner in which biological experiments are performed. In this dissertation, conventional and novel high throughput sequencing and bioinformatics methods are applied to immunology and diagnostics. In order to study rare subsets of cells, an RNA sequencing method was first optimized for use with minimal levels of RNA and cellular input. The optimized RNA sequencing method was then applied to study the transcriptional differences ...

  13. Optimizing ultrafast wide field-of-view illumination for high-throughput multi-photon imaging and screening of mutant fluorescent proteins

    Science.gov (United States)

    Stoltzfus, Caleb; Mikhailov, Alexandr; Rebane, Aleksander

    2017-02-01

Fluorescence induced by two-photon absorption (2PA) and three-photon absorption (3PA) is becoming an increasingly important tool for deep-tissue microscopy, especially in conjunction with genetically-encoded functional probes such as fluorescent proteins (FPs). Unfortunately, the efficacy of the multi-photon excitation of FPs is notoriously low, and because relations between a biological fluorophore's nonlinear-optical properties and its molecular structure are inherently complex, there are no practical avenues available that would allow boosting the performance of current FPs. Here we describe a novel method, where we apply directed evolution to optimize the 2PA properties of EGFP. Key to the success of this approach is high-throughput screening of mutants, allowing selection of variants with promising 2PA and 3PA properties in a broad near-IR excitation wavelength range. For this purpose, we construct and test a wide field-of-view (FOV), femtosecond imaging system that we then use to quantify the multi-photon excited fluorescence in the 550-1600 nm range of tens of thousands of E. coli colonies expressing randomly mutated FPs in a standard 10 cm diameter Petri dish configuration. We present a quantitative analysis of different factors that are currently limiting the maximum throughput of femtosecond multi-photon screening techniques and also report quantitative measurements of absolute 2PA and 3PA cross-section spectra.

  14. Effective Optimization of Antibody Affinity by Phage Display Integrated with High-Throughput DNA Synthesis and Sequencing Technologies.

    Directory of Open Access Journals (Sweden)

    Dongmei Hu

Full Text Available Phage display technology has been widely used for antibody affinity maturation for decades. Limited library sequence diversity, together with excessive redundancy and the labour-consuming procedure for candidate identification, are two major obstacles to widespread adoption of this technology. We hereby describe a novel library generation and screening approach to address these problems. The approach started with the targeted diversification of multiple complementarity determining regions (CDRs) of a humanized anti-ErbB2 antibody, HuA21, with a small perturbation mutagenesis strategy. A combination of three degenerate codons, NWG, NWC, and NSG, was chosen for amino acid saturation mutagenesis without introducing cysteine or stop residues. In total, 7,749 degenerate oligonucleotides were synthesized on two microchips and released to construct five single-chain antibody fragment (scFv) gene libraries with 4 × 10⁶ DNA sequences. Deep sequencing of the unselected and selected phage libraries using the Illumina platform allowed for an in-depth evaluation of the enrichment landscapes in CDR sequences and amino acid substitutions. Potent candidates were identified according to their high frequencies using NGS analysis, by-passing the need for primary screening of target-binding clones. Furthermore, a subsequent library built by recombination of the 10 most abundant variants from four CDRs was constructed and screened, and a mutant with 158-fold increased affinity (Kd = 25.5 pM) was obtained. These results suggest the potential application of the developed methodology for optimizing the binding properties of other antibodies and biomolecules.
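Frequency-based candidate identification from deep sequencing can be sketched as an enrichment ranking between the unselected and selected pools. The three-letter "variants" below are placeholders for real CDR sequences, and the pseudocount is my assumption for handling variants unseen before selection, not a detail from the paper:

```python
from collections import Counter

def enrichment_ranking(unselected_reads, selected_reads, pseudocount=1):
    # Rank variants by post/pre frequency enrichment after panning; the
    # pseudocount avoids division by zero for variants absent pre-selection.
    pre, post = Counter(unselected_reads), Counter(selected_reads)
    n_pre, n_post = sum(pre.values()), sum(post.values())

    def ratio(variant):
        f_pre = (pre[variant] + pseudocount) / (n_pre + pseudocount)
        f_post = (post[variant] + pseudocount) / (n_post + pseudocount)
        return f_post / f_pre

    return sorted(post, key=ratio, reverse=True)

# Placeholder "CDR variants" standing in for sequenced reads.
ranked = enrichment_ranking(
    ["GFT", "GFT", "SYA", "NWG", "SYA", "GFT"],
    ["SYA", "SYA", "SYA", "GFT", "NWG", "SYA"],
)
```

In the published workflow the top-ranked variants would go directly into the recombination library, bypassing clone-by-clone binding screens.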

  15. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

Full Text Available Abstract Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post-acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Results We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management; generation of protein, peptide and transition lists; and validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in a SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted

  16. Optimization of Monte Carlo particle transport parameters and validation of a novel high throughput experimental setup to measure the biological effects of particle beams.

    Science.gov (United States)

    Patel, Darshana; Bronk, Lawrence; Guan, Fada; Peeler, Christopher R; Brons, Stephan; Dokic, Ivana; Abdollahi, Amir; Rittmüller, Claudia; Jäkel, Oliver; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2017-11-01

Accurate modeling of the relative biological effectiveness (RBE) of particle beams requires increased systematic in vitro studies with human cell lines, with care towards minimizing uncertainties in biological assays as well as physical parameters. In this study, we describe a novel high-throughput experimental setup and an optimized parameterization of the Monte Carlo (MC) simulation technique that is universally applicable for accurate determination of the RBE of clinical ion beams. Clonogenic cell-survival measurements on a human lung cancer cell line (H460) are presented using proton irradiation. Experiments were performed at the Heidelberg Ion Therapy Center (HIT) with support from the Deutsches Krebsforschungszentrum (DKFZ) in Heidelberg, Germany, using a mono-energetic horizontal proton beam. A custom-made variable range selector was designed for the horizontal beam line using the Geant4 MC toolkit. This unique setup enabled a high-throughput clonogenic assay investigation of multiple, well-defined doses and linear energy transfer (LET) values per irradiation for human lung cancer cells (H460) cultured in a 96-well plate. Sensitivity studies based on application of different physics lists in conjunction with different electromagnetic constructors and production threshold values to the MC simulations were undertaken for accurate assessment of the calculated dose and the dose-averaged LET (LETd). These studies were extended to helium and carbon ion beams. Sensitivity analysis of the MC parameterization revealed substantial dependence of the dose and LETd values on both the choice of physics list and the production threshold values. While the dose and LETd calculations using the FTFP_BERT_LIV, FTFP_BERT_EMZ, FTFP_BERT_PEN and QGSP_BIC_EMY physics lists agree well with each other for all three ions, they show large differences when compared to the FTFP_BERT physics list with the default electromagnetic constructor. For carbon ions, the dose corresponding to the largest LETd

  17. Scanning droplet cell for high throughput electrochemical and photoelectrochemical measurements

    Science.gov (United States)

    Gregoire, John M.; Xiang, Chengxiang; Liu, Xiaonao; Marcin, Martin; Jin, Jian

    2013-02-01

High throughput electrochemical techniques are widely applied in material discovery and optimization. For many applications, the most desirable electrochemical characterization requires a three-electrode cell under potentiostat control. In high throughput screening, a material library is explored by either employing an array of such cells, or rastering a single cell over the library. To attain this latter capability with unprecedented throughput, we have developed a highly integrated, compact scanning droplet cell that is optimized for rapid electrochemical and photoelectrochemical measurements. Using this cell, we screened a quaternary oxide library as (photo)electrocatalysts for the oxygen evolution (water splitting) reaction. High quality electrochemical measurements were carried out and key electrocatalytic properties were identified for each of 5456 samples with a throughput of 4 s per sample.

  18. Throughput optimization for dual collaborative spectrum sensing with dynamic scheduling

    Science.gov (United States)

    Cui, Cuimei; Yang, Dezhi

    2017-07-01

Cognitive radio technology is envisaged to alleviate both spectrum inefficiency and spectrum scarcity problems by exploiting the existing licensed spectrum opportunistically. However, cognitive radio ad hoc networks (CRAHNs) impose unique challenges due to the highly dynamic scheduling of the available spectrum, diverse quality of service (QoS) requirements, as well as hidden terminals and shadow fading issues in a harsh radio environment. To solve these problems, this paper proposes a dynamic and variable time-division multiple-access scheduling mechanism (DV-TDMA) incorporating a dual collaborative spectrum sensing scheme for CRAHNs. This study involves cross-layered cooperation between the Physical (PHY) layer and Medium Access Control (MAC) layer under consideration of average sensing time, sensing accuracy and the average throughput of cognitive radio users (CRs). Moreover, a multiple-objective optimization algorithm is proposed to maximize the average throughput of CRs while still meeting QoS requirements on sensing time and detection error. Finally, performance evaluation is conducted through simulations, and the simulation results reveal that this optimization algorithm can significantly improve throughput and sensing accuracy and reduce average sensing time.
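The sensing-time/throughput tension at the heart of such optimization can be illustrated with a toy model: longer sensing lowers the false-alarm probability but leaves less of each frame for data transmission. The exponential false-alarm model and all constants below are my simplifying assumptions, standing in for the full energy-detector expressions used in the cognitive radio literature:

```python
import math

# Toy sensing-throughput tradeoff for a secondary (cognitive) user.
FRAME_MS = 100.0   # frame duration T
CAPACITY = 6.0     # achievable rate when the idle channel is used (bit/s/Hz)
P_IDLE = 0.8       # probability the licensed channel is idle

def false_alarm(tau_ms, decay=0.5):
    # Assumed exponential decay of the false-alarm probability with
    # sensing time; a stand-in for the Q-function energy-detector form.
    return math.exp(-decay * tau_ms)

def throughput(tau_ms):
    # Transmit for the remaining (T - tau) fraction of the frame,
    # and only benefit when the channel is idle and not falsely flagged.
    return ((FRAME_MS - tau_ms) / FRAME_MS) * CAPACITY * P_IDLE \
        * (1.0 - false_alarm(tau_ms))

# Grid search for the sensing time maximizing secondary-user throughput.
best_tau = max((t / 10.0 for t in range(1, 1000)), key=throughput)
```

The interior optimum (neither minimal nor maximal sensing) is exactly the kind of operating point a multiple-objective optimizer would target, with detection-error constraints added on top.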

  19. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  20. Optimized automated data analysis for the cytokinesis‐block micronucleus assay using imaging flow cytometry for high throughput radiation biodosimetry

    Science.gov (United States)

    Rodrigues, M. A.; Probst, C. E.; Beaton‐Green, L. A.

    2016-01-01

    Abstract The cytokinesis‐block micronucleus (CBMN) assay is a well‐established technique that can be employed in triage radiation biodosimetry to estimate whole body doses of radiation to potentially exposed individuals through quantitation of the frequency of micronuclei (MN) in binucleated lymphocyte cells (BNCs). The assay has been partially automated using traditional microscope‐based methods and most recently has been modified for application on the ImageStreamX (ISX) imaging flow cytometer. This modification has allowed for a similar number of BNCs to be automatically scored as compared to traditional microscopy in a much shorter time period. However, the MN frequency measured was much lower than both manual and automated slide‐based methods of performing the assay. This work describes the optimized analysis template which implements newly developed functions in the IDEAS® data analysis software for the ISX that enhances specificity for BNCs and increases the frequency of scored MN. A new dose response calibration curve is presented in which the average rate of MN per BNC is of similar magnitude to those presented in the literature using automated CBMN slide scoring methods. In addition, dose estimates were generated for nine irradiated, blinded samples and were found to be within ±0.5 Gy of the delivered dose. Results demonstrate that the improved identification accuracy for MN and BNCs in the ISX‐based version of the CBMN assay will translate to increased accuracy when estimating unknown radiation doses received by exposed individuals following large‐scale radiological or nuclear emergencies. © 2016 The Authors. Cytometry Part A published by Wiley Periodicals, Inc. on behalf of ISAC PMID:27272602
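Dose estimation from a micronucleus dose-response calibration curve typically inverts a linear-quadratic fit. A minimal sketch with invented coefficients (not the study's fitted curve), showing the round trip from dose to MN yield and back:

```python
import math

# Illustrative linear-quadratic calibration curve relating micronuclei
# per binucleated cell (Y) to absorbed dose D in Gy:
#   Y = C + alpha*D + beta*D^2
# The coefficients are assumptions for the sketch.
C, ALPHA, BETA = 0.01, 0.02, 0.05

def mn_yield(dose_gy):
    return C + ALPHA * dose_gy + BETA * dose_gy ** 2

def estimate_dose(observed_yield):
    # Invert the curve: solve beta*D^2 + alpha*D + (C - Y) = 0 for D >= 0.
    disc = ALPHA ** 2 - 4.0 * BETA * (C - observed_yield)
    return (-ALPHA + math.sqrt(disc)) / (2.0 * BETA)
```

Applied to a blinded sample, `estimate_dose` on the measured MN/BNC rate yields the kind of ±0.5 Gy estimates reported above, once the coefficients come from a properly fitted calibration.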

  1. Improving Data Transfer Throughput with Direct Search Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Balaprakash, Prasanna; Morozov, Vitali; Kettimuthu, Rajkumar; Kumaran, Kalyan; Foster, Ian

    2016-01-01

Improving data transfer throughput over high-speed long-distance networks has become increasingly difficult. Numerous factors such as nondeterministic congestion, dynamics of the transfer protocol, and multiuser and multitask source and destination endpoints, as well as interactions among these factors, contribute to this difficulty. A promising approach to improving throughput consists in using parallel streams at the application layer. We formulate and solve the problem of choosing the number of such streams from a mathematical optimization perspective. We propose the use of direct search methods, a class of easy-to-implement and lightweight mathematical optimization algorithms, to improve the performance of data transfers by dynamically adapting the number of parallel streams in a manner that does not require domain expertise, instrumentation, analytical models, or historic data. We apply our method to transfers performed with the GridFTP protocol, and illustrate the effectiveness of the proposed algorithm when used within Globus, a state-of-the-art data transfer tool, on production WAN links and servers. We show that when compared to user default settings our direct search methods can achieve up to 10x performance improvement under certain conditions. We also show that our method can overcome performance degradation due to external compute and network load on source endpoints, a common scenario at high performance computing facilities.
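A direct search over the stream count can be sketched as a derivative-free compass search: probe neighboring stream counts, move toward higher measured throughput, and halve the step when no neighbor improves. This is a generic sketch, not Globus's actual algorithm; `measure` stands in for timing a real GridFTP transfer at a given parallelism:

```python
def direct_search(measure, start=4, lo=1, hi=64, max_probes=20):
    """Derivative-free compass search over the number of parallel streams.

    `measure(n)` is any callable returning achieved throughput for n
    streams; in practice it would wrap a short timed transfer.
    """
    best, best_rate, probes = start, measure(start), 1
    step = max(1, (hi - lo) // 8)
    while step >= 1 and probes < max_probes:
        moved = False
        for cand in (best - step, best + step):
            if lo <= cand <= hi and probes < max_probes:
                rate = measure(cand)
                probes += 1
                if rate > best_rate:
                    best, best_rate, moved = cand, rate, True
        if not moved:
            step //= 2  # no neighbor improved: refine around the current best
    return best

# Synthetic concave throughput curve peaking at 16 streams.
best = direct_search(lambda n: -(n - 16) ** 2)
```

Because each probe costs a real (noisy) transfer measurement, capping `max_probes` matters as much as the search rule itself; this is why light-weight direct search methods fit the problem well.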

  2. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S r...

  3. Resolution- and throughput-enhanced spectroscopy using a high-throughput computational slit

    Science.gov (United States)

    Kazemzadeh, Farnoud; Wong, Alexander

    2016-09-01

    There exists a fundamental tradeoff between spectral resolution and efficiency, or throughput, for all optical spectrometers. The primary factors affecting the spectral resolution and throughput of an optical spectrometer are the size of the entrance aperture and the optical power of the focusing element. Thus far, joint optimization of these two factors has proven difficult. Here, we introduce the concept of high-throughput computational slits (HTCS), a numerical technique for improving both the effective spectral resolution and the efficiency of a spectrometer. The proposed HTCS approach was experimentally validated using an optical spectrometer configured with a 200 um entrance aperture (test) and a 50 um entrance aperture (control), demonstrating improvements in spectral resolution of ~50% over the control and improvements in efficiency of more than 2 times over the efficiency of the largest entrance aperture used in the study, while producing highly accurate spectra.

  4. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  5. High Throughput Architecture for High Performance NoC

    OpenAIRE

    Ghany, Mohamed A. Abd El; El-Moursy, Magdy A.; Ismail, Mohammed

    2010-01-01

    In this chapter, the high throughput NoC architecture is proposed to increase the throughput of the switch in NoC. The proposed architecture can also improve the latency of the network. The proposed high throughput interconnect architecture is applied on different NoC architectures. The architecture increases the throughput of the network by more than 38% while preserving the average latency. The area of high throughput NoC switch is decreased by 18% as compared to the area of BFT switch. The...

  6. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Development of next generation batteries requires a breakthrough in materials. The traditional one-by-one method, which is suited to synthesizing a large number of single-composition materials, is time-consuming and costly. High-throughput and combinatorial experimentation is an effective method to synthesize and characterize a huge number of materials over a broader compositional region in a short time, which greatly speeds up the discovery and optimization of materials at lower cost. In this work, high throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentations are introduced.

  7. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    Science.gov (United States)

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  8. An Overlay Architecture for Throughput Optimal Multipath Routing

    Science.gov (United States)

    2017-01-14

    throughput optimal routing policy that has been studied for decades. Its strength lies in discovering multipath routes and utilizing them optimally... computing differential backlogs across the overlay edges, e.g. node 2 computes W^6_{2,5} = Q^6_2 − Q^6_5 and W^6_{2,6} = Q^6_2 − Q^6_6. Simulation results in... require the threshold computation and associated knowledge of the underlay topology. Overlay Backpressure (OBP): Redefine the differential backlog as, W
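    The backpressure rule in this excerpt computes, per commodity, the queue-backlog difference between a node and each overlay neighbor, then forwards toward the largest positive difference. A minimal sketch; the queue values below are hypothetical, chosen to mirror the excerpt's node-2, commodity-6 example.

```python
def differential_backlog(queues, node, neighbor, commodity):
    """W^c_{i,j} = Q^c_i - Q^c_j: backlog difference for one commodity
    between a node and an overlay neighbor."""
    return queues[node][commodity] - queues[neighbor][commodity]

def backpressure_next_hop(queues, node, neighbors, commodity):
    """Forward the commodity toward the neighbor with the largest
    positive differential backlog; hold the packet if none is positive."""
    best, best_w = None, 0
    for nbr in neighbors:
        w = differential_backlog(queues, node, nbr, commodity)
        if w > best_w:
            best, best_w = nbr, w
    return best, best_w

# Hypothetical queue state: Q^6_2 = 9, Q^6_5 = 4, Q^6_6 = 7.
queues = {2: {6: 9}, 5: {6: 4}, 6: {6: 7}}
print(backpressure_next_hop(queues, 2, [5, 6], 6))  # -> (5, 5)
```

    Routing toward the steepest backlog gradient is what makes the policy throughput optimal: queues drain toward less-congested paths without any explicit route computation.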

  9. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  10. High-throughput crystallization screening.

    Science.gov (United States)

    Skarina, Tatiana; Xu, Xiaohui; Evdokimova, Elena; Savchenko, Alexei

    2014-01-01

    Protein structure determination by X-ray crystallography is dependent on obtaining a single protein crystal suitable for diffraction data collection. Due to this requirement, protein crystallization represents a key step in protein structure determination. The conditions for protein crystallization have to be determined empirically for each protein, making this step also a bottleneck in the structure determination process. Typical protein crystallization practice involves parallel setup and monitoring of a considerable number of individual protein crystallization experiments (also called crystallization trials). In these trials the aliquots of purified protein are mixed with a range of solutions composed of a precipitating agent, buffer, and sometimes an additive that have been previously successful in prompting protein crystallization. The individual chemical conditions in which a particular protein shows signs of crystallization are used as a starting point for further crystallization experiments. The goal is to optimize the formation of individual protein crystals of sufficient size and quality to make them suitable for diffraction data collection. Thus the composition of the primary crystallization screen is critical for successful crystallization. Systematic analysis of crystallization experiments carried out on several hundred proteins as part of large-scale structural genomics efforts allowed the optimization of the protein crystallization protocol and identification of a minimal set of 96 crystallization solutions (the "TRAP" screen) that, in our experience, led to crystallization of the maximum number of proteins.

  11. Trade-Off Analysis in High-Throughput Materials Exploration.

    Science.gov (United States)

    Volety, Kalpana K; Huyberechts, Guido P J

    2017-03-13

    This Research Article presents a strategy to identify the optimum compositions in metal alloys with certain desired properties in a high-throughput screening environment, using a multiobjective optimization approach. In addition to identifying the optimum compositions in a primary screening, the strategy also points to regions in the compositional space where further exploration in a secondary screening could be carried out. The strategy for the primary screening is a combination of two multiobjective optimization approaches, namely Pareto optimality and desirability functions. The experimental data used in the present study have been collected from over 200 different compositions belonging to four different alloy systems. The metal alloys (comprising Fe, Ti, Al, Nb, Hf, Zr) are synthesized and screened using high-throughput technologies. The advantages of this approach over simpler traditional approaches, such as ranking and calculating figures of merit, are discussed.
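    The Pareto-optimality half of such a strategy reduces to a non-domination filter over the screened compositions. A minimal sketch, assuming two maximize-both objectives; the alloy labels and (strength, ductility) scores are invented for illustration, not the article's data.

```python
def dominates(a, b):
    """a dominates b if a is at least as good in every objective
    (larger is better here) and strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (strength, ductility) scores for four screened alloys.
alloys = {"A": (5, 1), "B": (3, 4), "C": (4, 3), "D": (2, 2)}
front = pareto_front(list(alloys.values()))
print(sorted(front))  # -> [(3, 4), (4, 3), (5, 1)]
```

    Alloy D is dropped because C beats it in both objectives; the surviving set is the trade-off frontier that desirability functions would then rank.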

  12. High-throughput evaluation of synthetic metabolic pathways.

    Science.gov (United States)

    Klesmith, Justin R; Whitehead, Timothy A

    2016-03-01

    A central challenge in the field of metabolic engineering is the efficient identification of a metabolic pathway genotype that maximizes specific productivity over a robust range of process conditions. Here we review current methods for optimizing specific productivity of metabolic pathways in living cells. New tools for library generation, computational analysis of pathway sequence-flux space, and high-throughput screening and selection techniques are discussed.

  13. Economic consequences of high throughput maskless lithography

    Science.gov (United States)

    Hartley, John G.; Govindaraju, Lakshmi

    2005-11-01

    Many people in the semiconductor industry bemoan the high costs of masks and view mask cost as one of the significant barriers to bringing new chip designs to market. All that is needed is a viable maskless technology and the problem will go away. Numerous sites around the world are working on maskless lithography but inevitably, the question asked is "Wouldn't a one wafer per hour maskless tool make a really good mask writer?" Of course, the answer is yes; the hesitation you hear in the answer isn't based on technology concerns, it's financial. The industry needs maskless lithography because mask costs are too high. Mask costs are too high because mask pattern generators (PG's) are slow and expensive. If mask PG's become much faster, mask costs go down, the maskless market goes away and the PG supplier is faced with an even smaller tool demand from the mask shops. Technical success becomes financial suicide - or does it? In this paper we will present the results of a model that examines some of the consequences of introducing high throughput maskless pattern generation. Specific features in the model include tool throughput for masks and wafers, market segmentation by node for masks and wafers, and mask cost as an entry barrier to new chip designs. How does the availability of low cost masks and maskless tools affect the industry's tool makeup, and what is the ultimate potential market for high throughput maskless pattern generators?

  14. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high-dimensional neuroinformatic representations, an index containing O(1E3-1E4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  15. A high throughput spectral image microscopy system

    Science.gov (United States)

    Gesley, M.; Puri, R.

    2018-01-01

    A high throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To overcome the rate limits of flow cytometry and the use of fluorophore tags, a system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require the use of exogenous stain to render cells with submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.

  16. High-throughput theoretical design of lithium battery materials

    Science.gov (United States)

    Shi-Gang, Ling; Jian, Gao; Rui-Juan, Xiao; Li-Quan, Chen

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. Project supported by the National Natural Science Foundation of China (Grant Nos. 11234013 and 51172274) and the National High Technology Research and Development Program of China (Grant No. 2015AA034201).
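    Criteria-driven screening of a structural database, as described above, amounts to filtering candidate records against property thresholds. A minimal sketch; the candidate records, property names, and cutoffs are all hypothetical, not from the review.

```python
def screen(candidates, criteria):
    """Keep entries whose properties satisfy every (key, predicate) rule."""
    return [c for c in candidates
            if all(rule(c[key]) for key, rule in criteria.items())]

# Hypothetical computed properties for three cathode candidates.
db = [
    {"name": "A", "capacity_mAh_g": 250, "strain_pct": 1.2},
    {"name": "B", "capacity_mAh_g": 180, "strain_pct": 0.4},
    {"name": "C", "capacity_mAh_g": 260, "strain_pct": 0.6},
]
hits = screen(db, {
    "capacity_mAh_g": lambda v: v >= 200,  # energy-density criterion
    "strain_pct":     lambda v: v <= 1.0,  # low-strain criterion
})
print([c["name"] for c in hits])  # -> ['C']
```

    In a real workflow the predicates would encode the search criteria (energy density, safety proxies, ionic conductivity) applied to calculated properties from the automatic simulation flow.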

  17. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  18. Optimization of a high-throughput CTAB-based protocol for the extraction of qPCR-grade DNA from rumen fluid, plant and bacterial pure cultures.

    Science.gov (United States)

    Minas, Konstantinos; McEwan, Neil R; Newbold, Charles Jamie; Scott, Karen P

    2011-12-01

    The quality and yield of extracted DNA are critical for the majority of downstream applications in molecular biology. Moreover, molecular techniques such as quantitative real-time PCR (qPCR) are becoming increasingly widespread; thus, validation and cross-laboratory comparison of data require standardization of upstream experimental procedures. DNA extraction methods depend on the type and size of starting material(s) used. As such, the extraction of template DNA is arguably the most significant variable when cross-comparing data from different laboratories. Here, we describe a reliable, inexpensive and rapid method of DNA purification that is equally applicable to small-scale, large-scale, or high-throughput purification of DNA. The protocol relies on a CTAB-based buffer for cell lysis and further purification of DNA with phenol:chloroform:isoamyl alcohol. The protocol has been used successfully for DNA purification from rumen fluid and plant cells. Moreover, after slight alterations, the same protocol was used for large-scale extraction of DNA from pure cultures of Gram-positive and Gram-negative bacteria. The yield of the DNA obtained with this method exceeded that from the same samples using commercial kits, and the quality was confirmed by successful qPCR applications. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  19. A high-throughput protein refolding screen in 96-well format combined with design of experiments to optimize the refolding conditions.

    Science.gov (United States)

    Dechavanne, Vincent; Barrillat, Nicolas; Borlat, Frederic; Hermant, Aurélie; Magnenat, Laurent; Paquet, Mikael; Antonsson, Bruno; Chevalet, Laurent

    2011-02-01

    Production of correctly folded and biologically active proteins in Escherichia coli can be a challenging process. Frequently, proteins are recovered as insoluble inclusion bodies and need to be denatured and refolded into the correct structure. To address this, a refolding screening process based on a 96-well assay format supported by design of experiments (DOE) was developed for identification of optimal refolding conditions. After a first generic screen of 96 different refolding conditions, the parameters that produced the best yield were further explored in a focused DOE-based screen. The refolding efficiency and the quality of the refolded protein were analyzed by RP-HPLC and SDS-PAGE. The results were analyzed by the DOE software to identify the optimal concentrations of the critical additives. The optimal refolding conditions suggested by DOE were verified in medium-scale refolding tests, which confirmed the reliability of the predictions. Finally, the refolded protein was purified and its biological activity was tested in vitro. The screen was applied for the refolding of Interleukin 17F (IL-17F), stromal-cell-derived factor-1 (SDF-1α/CXCL12), B cell-attracting chemokine 1 (BCA-1/CXCL13), granulocyte macrophage colony stimulating factor (GM-CSF) and the complement factor C5a. This procedure identified refolding conditions for all the tested proteins. For the proteins where refolding conditions were already available, the optimized conditions identified in the screening process increased the yields between 50% and 100%. Thus, the method described herein is a useful tool to determine the feasibility of refolding and to identify high-yield scalable refolding conditions optimized for each individual protein. Copyright © 2010 Elsevier Inc. All rights reserved.
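    A generic first-pass DOE screen like the one described enumerates combinations of factor levels, one refolding condition per well. A minimal full-factorial sketch; the factor names and levels below are invented examples, not the paper's actual screen design.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels; each row is one
    refolding condition to dispense into a 96-well plate."""
    names = sorted(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

# Hypothetical additives and levels for a focused refolding screen.
screen = full_factorial({
    "pH": [7.0, 8.0],
    "arginine_mM": [0, 250, 500],
    "redox_ratio": ["1:1", "5:1"],
})
print(len(screen))  # -> 12 conditions (2 x 3 x 2)
```

    DOE software then fits a response-surface model to the measured yields over this grid to suggest optimal additive concentrations; the full factorial here only generates the conditions to test.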

  20. Determining the optimal number of individual samples to pool for quantification of average herd levels of antimicrobial resistance genes in Danish pig herds using high-throughput qPCR

    DEFF Research Database (Denmark)

    Clasen, Julie; Mellerup, Anders; Olsen, John Elmerdahl

    2016-01-01

    The primary objective of this study was to determine the minimum number of individual fecal samples to pool together in order to obtain a representative sample for herd level quantification of antimicrobial resistance (AMR) genes in a Danish pig herd, using a novel high-throughput qPCR assay. The secondary objective was to assess the agreement between different methods of sample pooling. Quantification of AMR was achieved using a high-throughput qPCR method to quantify the levels of seven AMR genes (ermB, ermF, sulI, sulII, tet(M), tet(O) and tet(W)). A large variation in the levels of AMR genes was found between individual samples. As the number of samples in a pool increased, a decrease in sample variation was observed. It was concluded that the optimal pooling size is five samples, as an almost steady state in the variation was observed when pooling this number of samples. Good agreement between...
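    Why variation shrinks with pool size can be illustrated with a toy simulation: a pooled sample approximates the mean of its members, whose standard deviation falls roughly as 1/sqrt(pool size), with diminishing returns past a handful of samples. The herd values and parameters below are synthetic, not the study's data.

```python
import random
import statistics

def pooled_level(individuals, pool_size, rng):
    """A pooled sample approximates the mean gene level of its members."""
    return statistics.mean(rng.sample(individuals, pool_size))

def variation_by_pool_size(individuals, sizes, repeats=200, seed=0):
    """Standard deviation of repeated pooled measurements per pool size."""
    rng = random.Random(seed)
    out = {}
    for k in sizes:
        levels = [pooled_level(individuals, k, rng) for _ in range(repeats)]
        out[k] = statistics.stdev(levels)
    return out

# Hypothetical per-pig AMR gene levels (log10 copies) for one herd.
rng = random.Random(1)
herd = [rng.gauss(5.0, 1.0) for _ in range(50)]
sd = variation_by_pool_size(herd, [1, 2, 5, 10])
print({k: round(v, 2) for k, v in sd.items()})
```

    The drop from pool size 1 to 5 is large while the drop from 5 to 10 is modest, which is consistent with the study's conclusion that five samples is a practical pooling size.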

  1. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  2. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  3. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques have been applied to many research fields nowadays. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries. In order to get finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so that the product doesn't have to have different optical properties from the substrate. UV absorption detection allows almost universal detection for organic molecules. Thus, no modifications of either the substrate or the product molecules are necessary. This technique has the potential to be used in screening of local distribution variations of specific bio-molecules in a tissue or in screening of multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD. Analyzing the light intensity change over time on an enzyme spot gives information on the reaction rate. The same microarray can be used many times. Thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
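    Extracting a reaction rate from an enzyme spot's intensity-over-time trace amounts to fitting an initial slope. A minimal least-squares sketch; the CCD readings below are invented, and intensity is assumed proportional to product concentration.

```python
def reaction_rate(times, intensities):
    """Least-squares slope of spot intensity vs. time: a proxy for the
    initial rate of product formation on one enzyme spot."""
    n = len(times)
    mt = sum(times) / n
    mi = sum(intensities) / n
    num = sum((t - mt) * (i - mi) for t, i in zip(times, intensities))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Hypothetical CCD readings (arbitrary units) at 0, 10, ..., 40 s.
times = [0, 10, 20, 30, 40]
signal = [100, 121, 139, 162, 180]
print(round(reaction_rate(times, signal), 2))  # -> 2.01
```

    Running this fit per spot across an imaged microarray is what turns one CCD time series into hundreds of parallel kinetic measurements.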

  4. High throughput assays for analyzing transcription factors.

    Science.gov (United States)

    Li, Xianqiang; Jiang, Xin; Yaoi, Takuro

    2006-06-01

    Transcription factors are a group of proteins that modulate the expression of genes involved in many biological processes, such as cell growth and differentiation. Alterations in transcription factor function are associated with many human diseases, and therefore these proteins are attractive potential drug targets. A key issue in the development of such therapeutics is the generation of effective tools that can be used for high throughput discovery of the critical transcription factors involved in human diseases, and the measurement of their activities in a variety of disease or compound-treated samples. Here, a number of innovative arrays and 96-well format assays for profiling and measuring the activities of transcription factors will be discussed.

  5. High-throughput hyperdimensional vertebrate phenotyping.

    Science.gov (United States)

    Pardo-Martin, Carlos; Allalou, Amin; Medina, Jaime; Eimon, Peter M; Wählby, Carolina; Fatih Yanik, Mehmet

    2013-01-01

    Most gene mutations and biologically active molecules cause complex responses in animals that cannot be predicted by cell culture models. Yet animal studies remain too slow and their analyses are often limited to only a few readouts. Here we demonstrate high-throughput optical projection tomography with micrometre resolution and hyperdimensional screening of entire vertebrates in tens of seconds using a simple fluidic system. Hundreds of independent morphological features and complex phenotypes are automatically captured in three dimensions with unprecedented speed and detail in semitransparent zebrafish larvae. By clustering quantitative phenotypic signatures, we can detect and classify even subtle alterations in many biological processes simultaneously. We term our approach hyperdimensional in vivo phenotyping. To illustrate the power of hyperdimensional in vivo phenotyping, we have analysed the effects of several classes of teratogens on cartilage formation using 200 independent morphological measurements, and identified similarities and differences that correlate well with their known mechanisms of actions in mammals.
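    Clustering quantitative phenotypic signatures, as described above, can be sketched with a minimal k-means over feature vectors (seeded with the first k points for determinism). The signatures below are invented stand-ins for normalized morphological measurements, kept stdlib-only for self-containment.

```python
import math

def kmeans(points, k, iters=10):
    """Minimal k-means on phenotypic signature vectors; the first k
    points seed the centroids."""
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            groups[nearest].append(p)
        for j, g in enumerate(groups):
            if g:  # move each centroid to the mean of its members
                centroids[j] = [sum(x) / len(g) for x in zip(*g)]
    return groups

# Hypothetical 3-feature signatures: two treated larvae, two controls.
sigs = [(0.9, 0.8, 0.7), (0.1, 0.2, 0.1), (0.85, 0.75, 0.8), (0.15, 0.1, 0.2)]
groups = kmeans(sigs, 2)
print([len(g) for g in groups])  # -> [2, 2]
```

    With hundreds of morphological features per larva the same grouping step separates teratogen classes by the pattern of their phenotypic signatures rather than by any single readout.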

  6. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  7. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...... equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing......, focusing on oft encountered problems in data processing, such as quality assurance, mapping, normalization, visualization, and interpretation. Presented in the second part are scientific endeavors representing solutions to problems of two sub-genres of next generation sequencing. For the first flavor, RNA-sequencing...

  8. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come… equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing…). For the second flavor, DNA-seq, a study presenting genome wide profiling of transcription factor CEBP/A in liver cells undergoing regeneration after partial hepatectomy (article IV) is included.

  9. High Throughput Spectroscopic Catalyst Screening via Surface Plasmon Spectroscopy

    Science.gov (United States)

    2015-07-15

    Final report for AOARD Grant 144064 (contract FA2386-14-1-4064), "High Throughput Spectroscopic Catalyst Screening by Surface Plasmon Spectroscopy"; dates covered 26 June 2014 to 25 March 2015; report dated July 15, 2015.

  10. Achieving High Throughput for Data Transfer over ATM Networks

    Science.gov (United States)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
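
    The "keep the pipe full, but hold packet loss under five percent" strategy can be sketched as a toy rate-paced simulation. This is a minimal illustration, not the paper's protocol: the batch size, the rate-adjustment rule, and the toy congestion model are all assumptions.

```python
import random

def paced_transfer(n_packets, start_rate, loss_at_rate, target_loss=0.05, seed=0):
    """Simulate an unreliable, rate-paced transfer that tries to keep the
    pipe full while holding measured packet loss under `target_loss`.

    `loss_at_rate(rate)` stands in for the network's congestion response
    (illustrative only).
    """
    rng = random.Random(seed)
    rate, sent, lost, batch = start_rate, 0, 0, 100
    for start in range(0, n_packets, batch):
        p_loss = loss_at_rate(rate)
        for _ in range(min(batch, n_packets - start)):
            sent += 1
            if rng.random() < p_loss:
                lost += 1
        observed = lost / sent
        if observed > target_loss:        # too lossy: back off
            rate *= 0.9
        elif observed < target_loss / 2:  # comfortably low: probe upward
            rate *= 1.05
    return sent, lost, rate

# Toy congestion model: loss grows once the send rate exceeds 100 units.
sent, lost, final_rate = paced_transfer(
    5000, start_rate=150.0,
    loss_at_rate=lambda r: max(0.0, (r - 100.0) / 1000.0))
print(sent, lost, round(lost / sent, 3))
```

    The pacing rule trades a small, bounded loss rate for sustained throughput, which is the bargain the paper describes for loss-tolerant image transfers.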

  11. High Throughput Direct Detection Doppler Lidar Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Lite Cycles, Inc. (LCI) proposes to develop a direct-detection Doppler lidar (D3L) technology called ELITE that improves the system optical throughput by more than...

  12. Optimizing SIEM Throughput on the Cloud Using Parallelization.

    Directory of Open Access Journals (Sweden)

    Masoom Alam

    Full Text Available Processing large amounts of data in real time to identify security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSPs), mostly hosting their applications on the Cloud, receive events at very high rates, varying from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of OSTROM, a security framework built on the Esper complex event processing (CEP) engine, under parallel and non-parallel computational frameworks. We explain three architectures under which Esper can be used to process events, and we investigated the effect on throughput, memory, and CPU usage in each configuration. The results indicate that the engine's performance is limited by the rate of incoming events rather than by the queries being processed. The architecture in which one quarter of the total events is submitted to each instance and all queries are processed by all units shows the best results in terms of throughput, memory, and CPU usage.
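
    The best-performing architecture (events sharded round-robin, every unit evaluating every query) can be sketched in a few lines. The event schema and the predicate "queries" below are hypothetical stand-ins for Esper CEP statements, not OSTROM's actual rules.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for CEP queries: named predicates over event dicts.
QUERIES = {
    "failed_login": lambda e: e["type"] == "login" and not e["ok"],
    "large_transfer": lambda e: e["type"] == "transfer" and e["bytes"] > 10_000,
}

def run_queries(shard):
    """Each processing unit evaluates ALL queries over its shard of events."""
    return [(name, e["id"]) for e in shard
            for name, query in QUERIES.items() if query(e)]

def parallel_match(events, n_units=4):
    # Round-robin sharding: each unit receives 1/n of the incoming stream.
    shards = [events[i::n_units] for i in range(n_units)]
    with ThreadPoolExecutor(max_workers=n_units) as pool:
        results = list(pool.map(run_queries, shards))
    return sorted(match for part in results for match in part)

events = [
    {"id": 1, "type": "login", "ok": False},
    {"id": 2, "type": "transfer", "bytes": 50_000},
    {"id": 3, "type": "login", "ok": True},
    {"id": 4, "type": "transfer", "bytes": 12},
]
print(parallel_match(events))
```

    Because every unit runs every query, the alerts are identical to a serial pass regardless of how the stream is sharded; only the throughput changes.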

  13. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  14. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification...

  15. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen) genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. This approach first selects sequences containing the SNPs present in all alleles of the locus of interest, and next calculates the number of base changes necessary to convert a candidate probe sequence to the closest subsequence within the set of sequences likely to be present in the sample, including the remainder of the human genome, in order to identify those candidate probes that are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, it is possible that preliminary steps such as PCR amplification are no longer necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest-resolution typing can be achieved for the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
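
    Step two of the approach, counting the base changes that separate a candidate probe from its closest off-target subsequence, can be sketched as a sliding-window Hamming distance. The sequences and the mismatch cutoff below are illustrative assumptions, not the published design parameters.

```python
def min_mismatches(probe, background):
    """Fewest base changes needed to turn `probe` into some same-length
    subsequence of `background` (the off-target search)."""
    k = len(probe)
    return min(
        sum(a != b for a, b in zip(probe, background[i:i + k]))
        for i in range(len(background) - k + 1))

def ultraspecific(candidates, background, min_distance=3):
    """Keep only candidates at least `min_distance` mismatches away from
    every off-target window: the "ultraspecific" probes."""
    return [p for p in candidates
            if min_mismatches(p, background) >= min_distance]

background = "ACGTACGTTTGCAACGT"
candidates = ["ACGTACGT",   # exact off-target hit: rejected
              "TTTTGGGG"]   # far from every window: kept
print(ultraspecific(candidates, background))
```

    In practice the background set would include all alleles of the locus plus the rest of the human genome, so an index-based search would replace this brute-force scan.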

  16. High-throughput crystallography for structural genomics.

    Science.gov (United States)

    Joachimiak, Andrzej

    2009-10-01

    Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress and now more than 55000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal, and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact.

  17. High-throughput Crystallography for Structural Genomics

    Science.gov (United States)

    Joachimiak, Andrzej

    2009-01-01

    Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress and now over 53,000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact. PMID:19765976

  18. High-throughput screening: update on practices and success.

    Science.gov (United States)

    Fox, Sandra; Farr-Jones, Shauna; Sopchak, Lynne; Boggs, Amy; Nicely, Helen Wang; Khoury, Richard; Biros, Michael

    2006-10-01

    High-throughput screening (HTS) has become an important part of drug discovery at most pharmaceutical and many biotechnology companies worldwide, and use of HTS technologies is expanding into new areas. Target validation, assay development, secondary screening, ADME/Tox, and lead optimization are among the areas in which there is an increasing use of HTS technologies. It is becoming fully integrated within drug discovery, both upstream and downstream, which includes increasing use of cell-based assays and high-content screening (HCS) technologies to achieve more physiologically relevant results and to find higher quality leads. In addition, HTS laboratories are continually evaluating new technologies as they struggle to increase their success rate for finding drug candidates. The material in this article is based on a 900-page HTS industry report involving 54 HTS directors representing 58 HTS laboratories and 34 suppliers.

  19. Modeling of 5 ' nuclease real-time responses for optimization of a high-throughput enrichment PCR procedure for Salmonella enterica

    DEFF Research Database (Denmark)

    Knutsson, R.; Löfström, Charlotta; Grage, H.

    2002-01-01

    The performance of a 5' nuclease real-time PCR assay was studied to optimize an automated method of detection of preenriched Salmonella enterica cells in buffered peptone water (BPW). The concentrations and interactions of the PCR reagents were evaluated on the basis of two detection responses…, the threshold cycle (C-T) and the fluorescence intensity by a normalized reporter value (DeltaR(n)). The C-T response was identified as the most suitable for detection modeling to describe the PCR performances of different samples. DNA extracted from S. enterica serovar Enteritidis was studied in double…/microwell for the AmpliTaq Gold mixture. To verify the improved amplification capacity of the rTth mixture, BPW was inoculated with 1 CFU of S. enterica serovar Enteritidis per ml and the mixture was incubated at 30°C. Samples for PCR were withdrawn every 4 h during a 36-h enrichment. Use of the rTth mixture…
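
    The threshold-cycle (C-T) response can be illustrated as the fractional cycle at which the normalized reporter fluorescence first crosses a threshold. The toy amplification curve and the linear interpolation between cycles below are assumptions for illustration, not the study's fitted model.

```python
def threshold_cycle(fluor, threshold):
    """Estimate C_T from per-cycle fluorescence (fluor[0] = after cycle 1),
    interpolating linearly between the cycles that bracket the threshold."""
    for i in range(1, len(fluor)):
        lo, hi = fluor[i - 1], fluor[i]
        if lo < threshold <= hi:
            return i + (threshold - lo) / (hi - lo)  # fractional 1-indexed cycle
    return None  # threshold never reached: no amplification detected

# Toy sigmoidal amplification curve (normalized reporter units).
curve = [0.01, 0.01, 0.02, 0.05, 0.12, 0.30, 0.62, 0.88, 0.97, 1.00]
print(round(threshold_cycle(curve, 0.2), 2))  # prints 5.44
```

    A lower C-T indicates earlier detection, which is why C-T was the natural response variable for comparing the amplification capacity of the two reagent mixtures.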

  20. Structuring intuition with theory: The high-throughput way

    Science.gov (United States)

    Fornari, Marco

    2015-03-01

    First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple frame to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies to guide the discovery process by quantitatively structuring the scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.

  1. High-Throughput Automation in Chemical Process Development.

    Science.gov (United States)

    Selekman, Joshua A; Qiu, Jun; Tran, Kristy; Stevens, Jason; Rosso, Victor; Simmons, Eric; Xiao, Yi; Janey, Jacob

    2017-06-07

    High-throughput (HT) techniques built upon laboratory automation technology and coupled to statistical experimental design and parallel experimentation have enabled the acceleration of chemical process development across multiple industries. HT technologies are often applied to interrogate wide, often multidimensional experimental spaces to inform the design and optimization of any number of unit operations that chemical engineers use in process development. In this review, we outline the evolution of HT technology and provide a comprehensive overview of how HT automation is used throughout different industries, with a particular focus on chemical and pharmaceutical process development. In addition, we highlight the common strategies of how HT automation is incorporated into routine development activities to maximize its impact in various academic and industrial settings.

  2. Optimizing Decoy State Enabled Quantum Key Distribution Systems to Maximize Quantum Throughput and Detect Photon Number Splitting Attacks with High Confidence

    OpenAIRE

    Mailloux, Logan O.; Grimaila, Michael R.; Hodson, Douglas D.; Engle, Ryan D.; McLaughlin, Colin V.; Gerald B. Baumgartner

    2016-01-01

    Quantum Key Distribution (QKD) is an innovative quantum communications protocol which exploits the laws of quantum mechanics to generate unconditionally secure cryptographic keying material between two geographically separated parties. The unique nature of QKD shows promise for high-security applications such as those found in banking, government, and military environments. However, QKD systems contain implementation non-idealities which can negatively impact their performance and security. In…

  3. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek is developing a high throughput nominal 100-W Hall Effect Thruster. This device is well sized for spacecraft ranging in size from several tens of kilograms to...

  4. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    Science.gov (United States)

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  5. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek Co. Inc. proposes to develop a high throughput, nominal 100 W Hall Effect Thruster (HET). This HET will be sized for small spacecraft (< 180 kg), including...

  6. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  7. Materiomics - High-Throughput Screening of Biomaterial Properties

    NARCIS (Netherlands)

    de Boer, Jan; van Blitterswijk, Clemens

    2013-01-01

    This complete, yet concise, guide introduces you to the rapidly developing field of high throughput screening of biomaterials: materiomics. Bringing together the key concepts and methodologies used to determine biomaterial properties, you will understand the adaptation and application of materiomics

  8. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  9. High-throughput microcavitation bubble induced cellular mechanotransduction

    Science.gov (United States)

    Compton, Jonathan Lee

    Focused pulsed laser irradiation allows for the deposition of energy with high spatial and temporal resolution. These attributes provide an optimal tool for non-contact manipulation in cellular biology such as laser microsurgery, cell membrane permeabilization, as well as targeted cell death. In this thesis we investigate the direct physical effects produced by laser-generated microcavitation bubbles in adherent cell cultures. We examine how variation in pulse durations (180 ps - 6 ns) and pulse energy (0.5 - 40 μJ) affect microcavitation bubble (μCB) generated cell lysis, necrosis, and molecular delivery. To compare the effects of pulse duration we employ classical fluid dynamics modeling to quantify the perturbation caused on cell populations from μCB-generated microTsunamis (a transient microscale burst of hydrodynamic shear stress). Through time-resolved imaging we capture the μCB dynamics at various energies and pulse durations. Moreover, the mathematical modeling provides information regarding the cellular exposure to time-varying shear stress and impulse as a function of radial location from the μCB center. We demonstrate that the resultant cellular effect can be predicted based on the total impulse across a two order of magnitude span of pulse duration and pulse energy. We also examine the region of cells beyond the zone of molecular delivery to investigate possible cellular reactions to μTsunami exposure. Our studies have shown that cellular mechanotransduction occurs within cell populations spanning an area of up to 1 mm² surrounding the μCB. Visualization of mechanotransduction is achieved through the visualization of intracellular calcium signaling via fluorescence microscopy that occurs due to the ability of the μTsunami-generated shear stresses to stimulate G-protein coupled receptors at the apical cell surface. Moreover, we have shown that the observed signaling can be attenuated in a dose-dependent manner using 2-APB, which is a known

  10. High-throughput hydrophilic interaction chromatography coupled to tandem mass spectrometry for the optimized quantification of the anti-Gram-negatives antibiotic colistin A/B and its pro-drug colistimethate.

    Science.gov (United States)

    Mercier, Thomas; Tissot, Fréderic; Gardiol, Céline; Corti, Natascia; Wehrli, Stéphane; Guidi, Monia; Csajka, Chantal; Buclin, Thierry; Couet, William; Marchetti, Oscar; Decosterd, Laurent A

    2014-11-21

    Colistin is a last-resort antibacterial treatment in critically ill patients with multi-drug resistant Gram-negative infections. As appropriate colistin exposure is the key to maximizing efficacy while minimizing toxicity, individualized dosing optimization guided by therapeutic drug monitoring is a top clinical priority. The objective of the present work was to develop a rapid and robust HPLC-MS/MS assay for quantification of colistin plasma concentrations. This novel methodology, validated according to international standards, simultaneously quantifies the microbiologically active compounds colistin A and B, plus the pro-drug colistin methanesulfonate (colistimethate, CMS). 96-well micro-elution SPE on Oasis Hydrophilic-Lipophilic-Balanced (HLB) sorbent, followed by direct analysis by hydrophilic interaction liquid chromatography (HILIC) on an Ethylene Bridged Hybrid (BEH) Amide column coupled to tandem mass spectrometry, allows a high throughput with no significant matrix effect. The technique is highly sensitive (limit of quantification 0.014 and 0.006 μg/mL for colistin A and B), precise (intra-/inter-assay CV 0.6-8.4%) and accurate (intra-/inter-assay deviation from nominal concentrations -4.4 to +6.3%) over the clinically relevant analytical range 0.05-20 μg/mL. Colistin A and B in plasma and whole blood samples are reliably quantified over 48 h at room temperature and at +4°C (<6% deviation from nominal values) and after three freeze-thaw cycles. Colistimethate acidic hydrolysis (1M H2SO4) to colistin A and B in plasma was completed in vitro after 15 min of sonication, while the pro-drug hydrolyzed spontaneously in plasma ex vivo after 4 h at room temperature: this information is of utmost importance for the interpretation of analytical results. Quantification is precise and accurate when using serum, citrated or EDTA plasma as the biological matrix, while use of heparin plasma is not appropriate. This new analytical technique providing optimized quantification in real
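
    The accuracy figures quoted (deviation of back-calculated from nominal concentrations) follow from regressing calibration standards onto a response line. This sketch assumes a simple unweighted linear response and uses hypothetical peak-area ratios; real LC-MS/MS calibration typically uses weighted regression against stable-isotope internal standards.

```python
def fit_line(concs, responses):
    """Ordinary least-squares calibration line: response = a * conc + b."""
    n = len(concs)
    mx, my = sum(concs) / n, sum(responses) / n
    num = sum((x - mx) * (y - my) for x, y in zip(concs, responses))
    den = sum((x - mx) ** 2 for x in concs)
    a = num / den
    return a, my - a * mx

def accuracy_pct(nominal, response, a, b):
    """Deviation (%) of the back-calculated concentration from nominal."""
    back_calculated = (response - b) / a
    return 100.0 * (back_calculated - nominal) / nominal

# Hypothetical standards spanning the 0.05-20 ug/mL range (area ratios).
concs = [0.05, 0.5, 2.0, 10.0, 20.0]
areas = [0.011, 0.10, 0.41, 2.02, 3.98]
a, b = fit_line(concs, areas)
print(round(accuracy_pct(10.0, 2.02, a, b), 1))
```

    An assay passes validation when every back-calculated standard stays within the accepted deviation band, as the -4.4 to +6.3% figures above illustrate.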

  11. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    CERN Document Server

    Blume, H; Bourenkov, G P; Kosciesza, D; Bartunik, H D

    2001-01-01

    The wiggler beamline BW6 at DORIS has been optimized for de novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed, which provide adequate conditions for high-throughput applications, e.g., in structural genomics.

  12. The JCSG high-throughput structural biology pipeline.

    Science.gov (United States)

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  13. Perspective: Composition–structure–property mapping in high-throughput experiments: Turning data into knowledge

    Directory of Open Access Journals (Sweden)

    Jason R. Hattrick-Simpers

    2016-05-01

    Full Text Available With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains transforming structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. We review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.

  14. Macro-to-micro structural proteomics: native source proteins for high-throughput crystallization.

    Science.gov (United States)

    Totir, Monica; Echols, Nathaniel; Nanao, Max; Gee, Christine L; Moskaleva, Alisa; Gradia, Scott; Iavarone, Anthony T; Berger, James M; May, Andrew P; Zubieta, Chloe; Alber, Tom

    2012-01-01

    Structural biology and structural genomics projects routinely rely on recombinantly expressed proteins, but many proteins and complexes are difficult to obtain by this approach. We investigated native source proteins for high-throughput protein crystallography applications. The Escherichia coli proteome was fractionated, purified, crystallized, and structurally characterized. Macro-scale fermentation and fractionation were used to subdivide the soluble proteome into 408 unique fractions, of which 295 yielded crystals in microfluidic crystallization chips. Of the 295 crystals, 152 were selected for optimization, diffraction screening, and data collection. Twenty-three structures were determined, four of which were novel. This study demonstrates the utility of native source proteins for high-throughput crystallography.

  15. High-throughput optical coherence tomography at 800 nm.

    Science.gov (United States)

    Goda, Keisuke; Fard, Ali; Malik, Omer; Fu, Gilbert; Quach, Alan; Jalali, Bahram

    2012-08-27

    We report high-throughput optical coherence tomography (OCT) that offers 1,000 times higher axial scan rate than conventional OCT in the 800 nm spectral range. This is made possible by employing photonic time-stretch for chirping a pulse train and transforming it into a passive swept source. We demonstrate a record high axial scan rate of 90.9 MHz. To show the utility of our method, we also demonstrate real-time observation of laser ablation dynamics. Our high-throughput OCT is expected to be useful for industrial applications where the speed of conventional OCT falls short.

  16. A novel high throughput method to investigate polymer dissolution.

    Science.gov (United States)

    Zhang, Ying; Mallapragada, Surya K; Narasimhan, Balaji

    2010-02-16

    The dissolution behavior of polystyrene (PS) in biodiesel was studied by developing a novel high throughput approach based on Fourier-transform infrared (FTIR) microscopy. A multiwell device for high throughput dissolution testing was fabricated using a photolithographic rapid prototyping method. The dissolution of PS films in each well was tracked by following the characteristic IR band of PS and the effect of PS molecular weight and temperature on the dissolution rate was simultaneously investigated. The results were validated with conventional gravimetric methods. The high throughput method can be extended to evaluate the dissolution profiles of a large number of samples, or to simultaneously investigate the effect of variables such as polydispersity, crystallinity, and mixed solvents.

  17. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.|info:eu-repo/dai/nl/074334603; Folkers, G.E.|info:eu-repo/dai/nl/162277202

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  18. Screening and synthesis: high throughput technologies applied to parasitology.

    Science.gov (United States)

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  19. High throughput calorimetry for evaluating enzymatic reactions generating phosphate.

    Science.gov (United States)

    Hoflack, Lieve; De Groeve, Manu; Desmet, Tom; Van Gerwen, Peter; Soetaert, Wim

    2010-05-01

    A calorimetric assay is described for the high-throughput screening of enzymes that produce inorganic phosphate. In the current example, cellobiose phosphorylase (EC 2.4.1.20) is tested for its ability to synthesise rare disaccharides. The generated phosphate is measured in a high-throughput calorimeter by coupling the reaction to pyruvate oxidase and catalase. This procedure allows for the simultaneous analysis of 48 reactions in microtiter plate format and has been validated by comparison with a colorimetric phosphate assay. The proposed assay has a coefficient of variation of 3.14% and is useful for screening enzyme libraries for enhanced activity and substrate libraries for enzyme promiscuity.
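
    The assay's quoted precision (a coefficient of variation of 3.14%) is the relative standard deviation across replicate wells. A minimal sketch with hypothetical replicate signals:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation (%): sample standard deviation over mean."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical heat-signal readings from replicate microtiter wells.
wells = [0.98, 1.02, 1.00, 0.97, 1.03]
print(round(cv_percent(wells), 2))  # prints 2.55
```

    A CV in the low single digits is what makes a plate-format assay usable for ranking enzyme variants, since activity differences smaller than the CV cannot be resolved.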

  20. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  1. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  2. Generating Mouse Models Using Zygote Electroporation of Nucleases (ZEN) Technology with High Efficiency and Throughput.

    Science.gov (United States)

    Wang, Wenbo; Zhang, Yingfan; Wang, Haoyi

    2017-01-01

    Mouse models with genetic modifications are widely used in biological and biomedical research. Although application of the CRISPR-Cas9 system has greatly accelerated the generation of genetically modified mice, the delivery method, which depends on manual injection of the components into embryos, remains a bottleneck: it is laborious, low throughput, and technically demanding. To overcome this limitation, we invented and optimized the ZEN (zygote electroporation of nucleases) technology to deliver CRISPR-Cas9 reagents via electroporation. Using ZEN, we were able to generate genetically modified mouse models with high efficiency and throughput. Here, we describe the protocol in detail.

  3. High Throughput Multispectral Image Processing with Applications in Food Science.

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Machine vision has recently been gaining attention in food science and in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème, and table olives, so as to increase objectivity, data reproducibility, and speed of quality assessment while lowering the cost of information extraction, all without human intervention. The outcome of image processing is then propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian mixture models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we demonstrate its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.

  4. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances on previous high-throughput methods for screening biomass recalcitrance have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and adjusted enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have reduced plate-to-plate variation of control biomass recalcitrance and improved confidence in sugar release differences between samples. With smaller errors, plant researchers can have a higher degree of assurance that more low-recalcitrance candidates can be identified. In summary, significant changes to the plate reactor, control biomass preparation, pretreatment conditions, and enzyme formulation significantly reduced sample and control replicate variability. Reactor plate metallurgy significantly impacts sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to the specific biomass type, i.e., woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers also improved conversion levels, likely due to water activity impacts on enzyme structure and substrate interactions; it was not attempted here because of the need to continually desalt and re-validate precise enzyme concentration and activity.

  5. Optimal Throughput and Energy Efficiency for Wireless Sensor Networks: Multiple Access and Multipacket Reception

    Directory of Open Access Journals (Sweden)

    Li Wenjun

    2005-01-01

    We investigate two important aspects of sensor network design: throughput and energy efficiency. We consider the uplink reachback problem, where the receiver is equipped with multiple antennas and linear multiuser detectors. We first assume Rayleigh flat fading and analyze two MAC schemes: round-robin and slotted ALOHA. We optimize the average number of transmissions per slot and the transmission power for two purposes: maximizing the throughput, or minimizing the effective energy (defined as the average energy consumption per successfully received packet) subject to a throughput constraint. For each MAC scheme with a given linear detector, we derive the maximum asymptotic throughput as the signal-to-noise ratio goes to infinity. It is shown that the minimum effective energy grows rapidly as the throughput constraint approaches the maximum asymptotic throughput. By comparing the optimal performance of different MAC schemes equipped with different detectors, we draw important tradeoffs involved in sensor network design. Finally, we show that multiuser scheduling greatly enhances system performance in a shadow fading environment.
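
    The paper's multipacket-reception analysis is not reproduced here, but the classic single-reception slotted-ALOHA model it generalizes makes the two optimization targets concrete. A sketch (the unit per-transmission energy is a placeholder, not a value from the paper):

```python
import math

def slotted_aloha_throughput(n: int, p: float) -> float:
    """Expected successfully received packets per slot when each of n nodes
    transmits independently with probability p and a slot succeeds only if
    exactly one node transmits (classic collision channel)."""
    return n * p * (1 - p) ** (n - 1)

def effective_energy(n: int, p: float, e_tx: float = 1.0) -> float:
    """Average transmit energy per successfully received packet:
    energy spent per slot (n * p * e_tx) divided by the throughput."""
    s = slotted_aloha_throughput(n, p)
    return float("inf") if s == 0 else n * p * e_tx / s

# Throughput is maximized at p = 1/n and approaches 1/e for large n,
# which is the flavor of asymptotic result the abstract refers to.
n = 50
s_opt = slotted_aloha_throughput(n, 1 / n)   # close to 1/e ~ 0.368
```

    Note that here the effective energy reduces to e_tx / (1 - p)^(n - 1), so pushing p (and hence throughput demand) upward makes each delivered packet rapidly more expensive, mirroring the paper's observation that effective energy grows quickly near the maximum throughput.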

  6. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high-throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier, and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.

  7. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimal system need not be the one with the highest product titers, but the one resulting in superior overall process performance across up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme glutathione-S-transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capture were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  8. High throughput defect detection with multiple parallel electron beams

    NARCIS (Netherlands)

    Himbergen, H.M.P. van; Nijkerk, M.D.; Jager, P.W.H. de; Hosman, T.C.; Kruit, P.

    2007-01-01

    A new concept for high throughput defect detection with multiple parallel electron beams is described. As many as 30 000 beams can be placed on a footprint of 1 in.², each beam having its own microcolumn and detection system without cross-talk. Based on the International Technology Roadmap for

  9. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured at efficiencies of 91% and 54% at flow rates of 1 mL/min and 4 mL/min, respectively. Integration of s...

  10. High-Throughput Toxicity Testing: New Strategies for ...

    Science.gov (United States)

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it
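
    As an illustration of the dose-response evaluation that qHTS incorporates, the standard four-parameter Hill (logistic) model can be fit to a titration series to recover an IC50. This sketch uses a crude grid search rather than any particular screening pipeline's fitter, and every parameter value is illustrative:

```python
import numpy as np

def hill(conc, top, bottom, ic50, slope):
    # Four-parameter logistic (Hill) dose-response model used in qHTS fits.
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

def fit_ic50(conc, resp, top, bottom, slope):
    """Crude IC50 estimate by grid search over a log-spaced concentration
    range; a real pipeline would use nonlinear least squares."""
    grid = np.logspace(-3, 2, 2001)   # 1 nM .. 100 uM, expressed in uM
    sse = [np.sum((hill(conc, top, bottom, g, slope) - resp) ** 2) for g in grid]
    return grid[int(np.argmin(sse))]

# Simulated 8-point titration (uM) generated from a known IC50 of 1 uM.
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = hill(conc, 100.0, 0.0, 1.0, 1.2)
ic50 = fit_ic50(conc, resp, top=100.0, bottom=0.0, slope=1.2)
```

    On this noiseless simulated curve the grid search recovers the IC50 of 1 µM; the point is only to show the dose-response shape that distinguishes qHTS from single-concentration screening.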

  11. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  12. High-throughput screening, predictive modeling and computational embryology - Abstract

    Science.gov (United States)

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  13. High-throughput screening, predictive modeling and computational embryology

    Science.gov (United States)

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  14. High-throughput sequencing in mitochondrial DNA research.

    Science.gov (United States)

    Ye, Fei; Samuels, David C; Clark, Travis; Guo, Yan

    2014-07-01

    Next-generation sequencing, also known as high-throughput sequencing, has greatly enhanced researchers' ability to conduct biomedical research on all levels. Mitochondrial research has also benefitted greatly from high-throughput sequencing; sequencing technology now allows for screening of all 16,569 base pairs of the mitochondrial genome simultaneously for SNPs and low level heteroplasmy and, in some cases, the estimation of mitochondrial DNA copy number. It is important to realize the full potential of high-throughput sequencing for the advancement of mitochondrial research. To this end, we review how high-throughput sequencing has impacted mitochondrial research in the categories of SNPs, low level heteroplasmy, copy number, and structural variants. We also discuss the different types of mitochondrial DNA sequencing and their pros and cons. Based on previous studies conducted by various groups, we provide strategies for processing mitochondrial DNA sequencing data, including assembly, variant calling, and quality control. Copyright © 2014 Elsevier B.V. and Mitochondria Research Society. All rights reserved.
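
    As a small illustration of the low-level heteroplasmy calls discussed above, the quantity of interest at a position is the minor-allele fraction among aligned reads; deep coverage is what lets high-throughput sequencing resolve small fractions. A sketch (the depth threshold is an illustrative assumption, not a cutoff from the review):

```python
def heteroplasmy_fraction(ref_count: int, alt_count: int, min_depth: int = 500):
    """Minor-allele fraction at one mtDNA position from aligned read counts.
    Returns None when coverage is too shallow to call low-level heteroplasmy
    reliably. The default 500x is an illustrative threshold only."""
    depth = ref_count + alt_count
    if depth < min_depth:
        return None
    return min(ref_count, alt_count) / depth

# 20 alternate reads out of 1000 -> 2% heteroplasmy; 51x coverage -> no call.
assert heteroplasmy_fraction(980, 20, min_depth=100) == 0.02
assert heteroplasmy_fraction(50, 1, min_depth=100) is None
```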

  15. Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays

    Science.gov (United States)

    High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...

  16. High-Throughput Microfluidics for the Screening of Yeast Libraries.

    Science.gov (United States)

    Huang, Mingtao; Joensson, Haakan N; Nielsen, Jens

    2018-01-01

    Cell factory development is critically important for efficient biological production of chemicals, biofuels, and pharmaceuticals. Many rounds of the Design-Build-Test-Learn cycle may be required before an engineered strain meets the specific metrics required for industrial application. The bioindustry prefers products in secreted form (secreted products or extracellular metabolites), as this lowers the cost of downstream processing, reduces the metabolic burden on cell hosts, and allows necessary modifications of the final products, such as biopharmaceuticals. Yet products in secreted form result in the disconnection of phenotype from genotype, which limits throughput in the Test step when identifying desired variants from large libraries of mutant strains. In droplet microfluidic screening, single cells are encapsulated in individual droplets, enabling high-throughput processing and sorting of single cells or clones. Encapsulation in droplets allows this technology to overcome the throughput limitations present in traditional methods for screening by extracellular phenotypes. In this chapter, we describe a protocol/guideline for high-throughput droplet microfluidics screening of yeast libraries for higher protein secretion. This protocol can be adapted to screening for a range of other extracellular products from yeast or other hosts.

  17. Applications of Biophysics in High-Throughput Screening Hit Validation.

    Science.gov (United States)

    Genick, Christine Clougherty; Barlier, Danielle; Monna, Dominique; Brunner, Reto; Bé, Céline; Scheufler, Clemens; Ottl, Johannes

    2014-06-01

    For approximately a decade, biophysical methods have been used to validate positive hits selected from high-throughput screening (HTS) campaigns with the goal to verify binding interactions using label-free assays. By applying label-free readouts, screen artifacts created by compound interference and fluorescence are discovered, enabling further characterization of the hits for their target specificity and selectivity. The use of several biophysical methods to extract this type of high-content information is required to prevent the promotion of false positives to the next level of hit validation and to select the best candidates for further chemical optimization. The typical technologies applied in this arena include dynamic light scattering, turbidometry, resonance waveguide, surface plasmon resonance, differential scanning fluorimetry, mass spectrometry, and others. Each technology can provide different types of information to enable the characterization of the binding interaction. Thus, these technologies can be incorporated in a hit-validation strategy not only according to the profile of chemical matter that is desired by the medicinal chemists, but also in a manner that is in agreement with the target protein's amenability to the screening format. Here, we present the results of screening strategies using biophysics with the objective to evaluate the approaches, discuss the advantages and challenges, and summarize the benefits in reference to lead discovery. In summary, the biophysics screens presented here demonstrated various hit rates from a list of ~2000 preselected, IC50-validated hits from HTS (an IC50 is the inhibitor concentration at which 50% inhibition of activity is observed). There are several lessons learned from these biophysical screens, which will be discussed in this article. © 2014 Society for Laboratory Automation and Screening.

  18. A high throughput droplet based electroporation system

    Science.gov (United States)

    Yoo, Byeongsun; Ahn, Myungmo; Im, Dojin; Kang, Inseok

    2014-11-01

    Delivery of exogenous genetic materials across the cell membrane is a powerful and popular research tool for bioengineering. Among conventional non-viral DNA delivery methods, electroporation (EP) is one of the most widely used technologies and is a standard lab procedure in molecular biology. We developed a novel digital microfluidic electroporation system which achieves higher transgene expression efficiency and better cell viability than conventional EP techniques. We present the successful performance of the digital EP system for transformation of various cell lines by investigating the effects of EP conditions, such as electric pulse voltage, number, and duration, on cell viability and transfection efficiency in comparison with a conventional bulk EP system. Through numerical analysis, we have also calculated the electric field distribution around the cells precisely to verify the effect of the electric field on the high efficiency of the digital EP system. Furthermore, parallelization of the EP processes has been developed to increase transformation productivity. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (Grant Number: 2013R1A1A2011956).

  19. A high-throughput media design approach for high performance mammalian fed-batch cultures.

    Science.gov (United States)

    Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé

    2013-01-01

    An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaken 96-deepwell plates. Starting from a proprietary chemically defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both the cell expansion and fed-batch production phases in a single experiment. Three approaches were chosen to extract the most from the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design-Expert software. Finally, a multivariate analysis enabled identification of individual potentially critical components for further optimization. Applying this high-throughput method to a fed-batch, rather than a simple batch, process opens new perspectives for medium and feed development, enabling identification of an optimized process in a short time frame.
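
    The custom mixture design itself is proprietary, but the reported count of 376 blends is consistent with one simple construction: 16 stock media combined pairwise (C(16,2) = 120 pairs) at 3 mixing ratios, plus the 16 unblended media, gives 360 + 16 = 376. Whether the actual design used exactly three ratios per pair is an assumption; the sketch below only demonstrates the enumeration:

```python
from itertools import combinations

def binary_blend_design(media, ratios=(0.25, 0.50, 0.75)):
    """All pairwise (binary) blends of the stock media at the given mixing
    ratios, plus each pure medium. Each blend is a tuple of (medium, fraction)
    pairs. The ratio values are illustrative, not from the paper."""
    blends = [((m, 1.0),) for m in media]              # unblended media
    for a, b in combinations(media, 2):                # every pair of media
        for r in ratios:
            blends.append(((a, r), (b, round(1.0 - r, 2))))
    return blends

design = binary_blend_design([f"M{i:02d}" for i in range(1, 17)])
# 16 pure media + 120 pairs x 3 ratios = 376 conditions, matching the abstract.
```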

  20. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels...... channel targets accessible for drug screening. Specifically, genuine HTS parallel processing techniques based on arrays of planar silicon chips are being developed, but also lower throughput sequential techniques may be of value in compound screening, lead optimization, and safety screening....... The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery....

  1. Validation of high throughput sequencing and microbial forensics applications

    OpenAIRE

    Budowle, Bruce; Connell, Nancy D.; Bielecka-Oder, Anna; Rita R Colwell; Corbett, Cindi R.; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A.; Murch, Randall S; Sajantila, Antti; Schemes, Sarah E; Ternus, Krista L; Turner, Stephen D

    2014-01-01

    Abstract High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results a...

  2. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from endogenous taxon reference genes, GMO constructs, screening targets, construct-specific and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection, and it demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  3. Hypoxia-sensitive reporter system for high-throughput screening.

    Science.gov (United States)

    Tsujita, Tadayuki; Kawaguchi, Shin-ichi; Dan, Takashi; Baird, Liam; Miyata, Toshio; Yamamoto, Masayuki

    2015-02-01

    The induction of anti-hypoxic stress enzymes and proteins has the potential to be a potent therapeutic strategy to prevent the progression of ischemic heart, kidney or brain diseases. To realize this idea, small chemical compounds, which mimic hypoxic conditions by activating the PHD-HIF-α system, have been developed. However, to date, none of these compounds were identified by monitoring the transcriptional activation of hypoxia-inducible factors (HIFs). Thus, to facilitate the discovery of potent inducers of HIF-α, we have developed an effective high-throughput screening (HTS) system to directly monitor the output of HIF-α transcription. We generated a HIF-α-dependent reporter system that responds to hypoxic stimuli in a concentration- and time-dependent manner. This system was developed through multiple optimization steps, resulting in the generation of a construct that consists of the secretion-type luciferase gene (Metridia luciferase, MLuc) under the transcriptional regulation of an enhancer containing 7 copies of 40-bp hypoxia responsive element (HRE) upstream of a mini-TATA promoter. This construct was stably integrated into the human neuroblastoma cell line, SK-N-BE(2)c, to generate a reporter system, named SKN:HRE-MLuc. To improve this system and to increase its suitability for the HTS platform, we incorporated the next generation luciferase, Nano luciferase (NLuc), whose longer half-life provides us with flexibility for the use of this reporter. We thus generated a stably transformed clone with NLuc, named SKN:HRE-NLuc, and found that it showed significantly improved reporter activity compared to SKN:HRE-MLuc. In this study, we have successfully developed the SKN:HRE-NLuc screening system as an efficient platform for future HTS.

  4. High-Throughput Phase-Field Design of High-Energy-Density Polymer Nanocomposites.

    Science.gov (United States)

    Shen, Zhong-Hui; Wang, Jian-Jun; Lin, Yuanhua; Nan, Ce-Wen; Chen, Long-Qing; Shen, Yang

    2017-11-22

    Understanding the dielectric breakdown behavior of polymer nanocomposites is crucial to the design of high-energy-density dielectric materials with reliable performances. It is however challenging to predict the breakdown behavior due to the complicated factors involved in this highly nonequilibrium process. In this work, a comprehensive phase-field model is developed to investigate the breakdown behavior of polymer nanocomposites under electrostatic stimuli. It is found that the breakdown strength and path significantly depend on the microstructure of the nanocomposite. The predicted breakdown strengths for polymer nanocomposites with specific microstructures agree with existing experimental measurements. Using this phase-field model, a high throughput calculation is performed to seek the optimal microstructure. Based on the high-throughput calculation, a sandwich microstructure for PVDF-BaTiO3 nanocomposite is designed, where the upper and lower layers are filled with parallel nanosheets and the middle layer is filled with vertical nanofibers. It has an enhanced energy density of 2.44 times that of the pure PVDF polymer. The present work provides a computational approach for understanding the electrostatic breakdown, and it is expected to stimulate future experimental efforts on synthesizing polymer nanocomposites with novel microstructures to achieve high performances. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Background: The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results: This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion: MUMmerGPU is a low-cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  6. High-throughput sequence alignment using Graphics Processing Units.

    Science.gov (United States)

    Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh

    2007-12-10

    The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
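
    MUMmerGPU aligns queries against a reference stored as a suffix tree. A CPU-side sketch of the same exact-matching step using a suffix array instead (a naive stand-in for illustration, not MUMmerGPU's data structure or construction algorithm):

```python
import bisect

def suffix_array(s: str):
    # Naive O(n^2 log n) construction; fine for a sketch, not for genomes.
    return sorted(range(len(s)), key=lambda i: s[i:])

def find_exact(ref: str, query: str, sa=None):
    """All start positions of exact occurrences of query in ref, found by
    binary search over the suffix array: every occurrence of query is a
    prefix of some suffix, and those suffixes are contiguous in sorted order."""
    sa = sa if sa is not None else suffix_array(ref)
    keys = [ref[i:i + len(query)] for i in sa]   # query-length prefixes, sorted
    lo = bisect.bisect_left(keys, query)
    hi = bisect.bisect_right(keys, query)
    return sorted(sa[lo:hi])

ref = "ACGTACGTGA"
hits = find_exact(ref, "ACGT")   # occurrences at positions 0 and 4
```

    The GPU program's advantage comes from running many such query lookups in parallel; the lookup logic per query is the same kind of tree/array traversal shown here.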

  7. Human transcriptome array for high-throughput clinical studies

    Science.gov (United States)

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N.; Schweitzer, Anthony C.; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A.; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D.; Moldawer, Lyle L.; Maier, Ronald V.; Tompkins, Ronald G.; Wong, Wing Hung; Davis, Ronald W.; Xiao, Wenzhong; Toner, Mehmet; Warren, H. Shaw; Schoenfeld, David A.; Rahme, Laurence; McDonald-Smith, Grace P.; Hayden, Douglas; Mason, Philip; Fagan, Shawn; Yu, Yong-Ming; Cobb, J. Perren; Remick, Daniel G.; Mannick, John A.; Lederer, James A.; Gamelli, Richard L.; Silver, Geoffrey M.; West, Michael A.; Shapiro, Michael B.; Smith, Richard; Camp, David G.; Qian, Weijun; Tibshirani, Rob; Lowry, Stephen; Calvano, Steven; Chaudry, Irshad; Cohen, Mitchell; Moore, Ernest E.; Johnson, Jeffrey; Baker, Henry V.; Efron, Philip A.; Balis, Ulysses G. J.; Billiar, Timothy R.; Ochoa, Juan B.; Sperry, Jason L.; Miller-Graziano, Carol L.; De, Asit K.; Bankey, Paul E.; Herndon, David N.; Finnerty, Celeste C.; Jeschke, Marc G.; Minei, Joseph P.; Arnoldo, Brett D.; Hunt, John L.; Horton, Jureta; Cobb, J. Perren; Brownstein, Bernard; Freeman, Bradley; Nathens, Avery B.; Cuschieri, Joseph; Gibran, Nicole; Klein, Matthew; O'Keefe, Grant

    2011-01-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq at sufficient depth for the discovery of transcriptome elements relevant to the disease process, followed by high-throughput and reliable screening of these elements on thousands of patient samples using custom-designed arrays. PMID:21317363

  8. Computational analysis of high-throughput flow cytometry data.

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2012-08-01

    Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells and appropriate analytical tools are readily available to handle this traditional role of the technology. Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires both advances in hardware and analytical tools. The historical perspective of flow cytometry operation is discussed, as well as how the field has changed and what the key changes have been. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible.

  9. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires both advances in hardware and analytical tools. The historical perspective of flow cytometry operation is discussed, as well as how the field has changed and what the key changes have been. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  10. High throughput screening of starch structures using carbohydrate microarrays.

    Science.gov (United States)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Motawia, Mohammed Saddik; Shaik, Shahnoor Sultana; Mikkelsen, Maria Dalgaard; Krunic, Susanne Langgaard; Fangel, Jonatan Ulrik; Willats, William George Tycho; Blennow, Andreas

    2016-07-29

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content and branching degrees could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Typically, extremes of branching or linearity were detected less strongly than normal starch structures. The method offers the potential for rapidly analysing resistant and slowly digested dietary starches.

  11. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  12. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content and branching degrees could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Typically, extremes of branching or linearity were detected less strongly than normal starch structures. The method offers the potential for rapidly analysing resistant and slowly digested dietary starches.

  13. High-throughput screening for modulators of cellular contractile force

    CERN Document Server

    Park, Chan Young; Tambe, Dhananjay; Chen, Bohao; Lavoie, Tera; Dowell, Maria; Simeonov, Anton; Maloney, David J; Marinkovic, Aleksandar; Tschumperlin, Daniel J; Burger, Stephanie; Frykenberg, Matthew; Butler, James P; Stamer, W Daniel; Johnson, Mark; Solway, Julian; Fredberg, Jeffrey J; Krishnan, Ramaswamy

    2014-01-01

    When cellular contractile forces are central to pathophysiology, these forces comprise a logical target of therapy. Nevertheless, existing high-throughput screens are limited to upstream signaling intermediates with poorly defined relationship to such a physiological endpoint. Using cellular force as the target, here we screened libraries to identify novel drug candidates in the case of human airway smooth muscle cells in the context of asthma, and also in the case of Schlemm's canal endothelial cells in the context of glaucoma. This approach identified several drug candidates for both asthma and glaucoma. We attained rates of 1000 compounds per screening day, thus establishing a force-based cellular platform for high-throughput drug discovery.

  14. High-throughput optical screening of cellular mechanotransduction

    OpenAIRE

    Compton, JL; Luo, JC; Ma, H.; Botvinick, E; Venugopalan, V

    2014-01-01

    We introduce an optical platform for rapid, high-throughput screening of exogenous molecules that affect cellular mechanotransduction. Our method initiates mechanotransduction in adherent cells using single laser-microbeam generated microcavitation bubbles without requiring flow chambers or microfluidics. These microcavitation bubbles expose adherent cells to a microtsunami, a transient microscale burst of hydrodynamic shear stress, which stimulates cells over areas approaching 1 mm2. We demo...

  15. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  16. The high-throughput highway to computational materials design.

    Science.gov (United States)

    Curtarolo, Stefano; Hart, Gus L W; Nardelli, Marco Buongiorno; Mingo, Natalio; Sanvito, Stefano; Levy, Ohad

    2013-03-01

    High-throughput computational materials design is an emerging area of materials science. By combining advanced thermodynamic and electronic-structure methods with intelligent data mining and database construction, and exploiting the power of current supercomputer architectures, scientists generate, manage and analyse enormous data repositories for the discovery of novel materials. In this Review we provide a current snapshot of this rapidly evolving field, and highlight the challenges and opportunities that lie ahead.

  17. Web-based visual analysis for high-throughput genomics.

    Science.gov (United States)

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy.
Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics.

  18. Graph-based signal integration for high-throughput phenotyping.

    Science.gov (United States)

    Herskovic, Jorge R; Subramanian, Devika; Cohen, Trevor; Bozzo-Silva, Pamela A; Bearden, Charles F; Bernstam, Elmer V

    2012-01-01

    Electronic Health Records aggregated in Clinical Data Warehouses (CDWs) promise to revolutionize Comparative Effectiveness Research and suggest new avenues of research. However, the effectiveness of CDWs is diminished by the lack of properly labeled data. We present a novel approach that integrates knowledge from the CDW, the biomedical literature, and the Unified Medical Language System (UMLS) to perform high-throughput phenotyping. In this paper, we automatically construct a graphical knowledge model and then use it to phenotype breast cancer patients. We compare the performance of this approach to using MetaMap when labeling records. MetaMap's overall accuracy at identifying breast cancer patients was 51.1% (n=428); recall=85.4%, precision=26.2%, and F1=40.1%. Our unsupervised graph-based high-throughput phenotyping had an accuracy of 84.1%; recall=46.3%, precision=61.2%, and F1=52.8%. We conclude that our approach is a promising alternative for unsupervised high-throughput phenotyping.
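    As a sanity check on the figures above, an F1 score is the harmonic mean of precision and recall, and the reported values can be recomputed directly:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Figures reported in the abstract, as fractions.
print(round(f1_score(0.262, 0.854), 3))  # → 0.401 (MetaMap, matches reported F1=40.1%)
print(round(f1_score(0.612, 0.463), 3))  # → 0.527 (graph-based; reported 52.8%, within input rounding)
```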

  19. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

    Full Text Available Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.
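    The transparent splitting of a large analysis into smaller parts, as described above, follows a generic scatter pattern: partition the runs of a task into contiguous chunks, one per pool job. A minimal sketch of the partitioning step (a hypothetical helper, not Condor-COPASI's actual code):

```python
def split_into_jobs(n_runs, runs_per_job):
    """Partition run indices 0..n_runs-1 into contiguous chunks, one per job."""
    return [list(range(start, min(start + runs_per_job, n_runs)))
            for start in range(0, n_runs, runs_per_job)]

# e.g. a 10-run parameter scan split across jobs of at most 4 runs each
print(split_into_jobs(10, 4))  # → [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

    Each chunk would then be submitted as one job to the pool, with results gathered and merged once all jobs complete.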

  20. High-throughput computational and experimental techniques in structural genomics.

    Science.gov (United States)

    Chance, Mark R; Fiser, Andras; Sali, Andrej; Pieper, Ursula; Eswar, Narayanan; Xu, Guiping; Fajardo, J Eduardo; Radhakannan, Thirumuruhan; Marinkovic, Nebojsa

    2004-10-01

    Structural genomics has as its goal the provision of structural information for all possible ORF sequences through a combination of experimental and computational approaches. The access to genome sequences and cloning resources from an ever-widening array of organisms is driving high-throughput structural studies by the New York Structural Genomics Research Consortium. In this report, we outline the progress of the Consortium in establishing its pipeline for structural genomics, and some of the experimental and bioinformatics efforts leading to structural annotation of proteins. The Consortium has established a pipeline for structural biology studies, automated modeling of ORF sequences using solved (template) structures, and a novel high-throughput approach (metallomics) to examining the metal binding to purified protein targets. The Consortium has so far produced 493 purified proteins from >1077 expression vectors. A total of 95 have resulted in crystal structures, and 81 are deposited in the Protein Data Bank (PDB). Comparative modeling of these structures has generated >40,000 structural models. We also initiated a high-throughput metal analysis of the purified proteins; this has determined that 10%-15% of the targets contain a stoichiometric structural or catalytic transition metal atom. The progress of the structural genomics centers in the U.S. and around the world suggests that the goal of providing useful structural information on most all ORF domains will be realized. This projected resource will provide structural biology information important to understanding the function of most proteins of the cell.

  1. 76 FR 28990 - Ultra High Throughput Sequencing for Clinical Diagnostic Applications-Approaches To Assess...

    Science.gov (United States)

    2011-05-19

    ... Clinical Diagnostic Applications--Approaches To Assess Analytical Validity.'' The purpose of the public... approaches to assess analytical validity of ultra high throughput sequencing for clinical diagnostic... HUMAN SERVICES Food and Drug Administration Ultra High Throughput Sequencing for Clinical Diagnostic...

  2. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  3. A High-Throughput Microfluidic Platform for Mammalian Cell Transfection and Culturing

    Science.gov (United States)

    Woodruff, Kristina; Maerkl, Sebastian J.

    2016-01-01

    Mammalian synthetic biology could be augmented through the development of high-throughput microfluidic systems that integrate cellular transfection, culturing, and imaging. We created a microfluidic chip that cultures cells and implements 280 independent transfections at up to 99% efficiency. The chip can perform co-transfections, in which the number of cells expressing each protein and the average protein expression level can be precisely tuned as a function of input DNA concentration and synthetic gene circuits can be optimized on chip. We co-transfected four plasmids to test a histidine kinase signaling pathway and mapped the dose dependence of this network on the level of one of its constituents. The chip is readily integrated with high-content imaging, enabling the evaluation of cellular behavior and protein expression dynamics over time. These features make the transfection chip applicable to high-throughput mammalian protein and synthetic biology studies. PMID:27030663

  4. Large scale library generation for high throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Erik Borgström

    Full Text Available BACKGROUND: Large efforts have recently been made to automate the sample preparation protocols for massively parallel sequencing in order to match the increasing instrument throughput. Still, the size selection through agarose gel electrophoresis separation is a labor-intensive bottleneck of these protocols. METHODOLOGY/PRINCIPAL FINDINGS: In this study a method for automatic library preparation and size selection on a liquid handling robot is presented. The method utilizes selective precipitation of certain sizes of DNA molecules on to paramagnetic beads for cleanup and selection after standard enzymatic reactions. CONCLUSIONS/SIGNIFICANCE: The method is used to generate libraries for de novo and re-sequencing on the Illumina HiSeq 2000 instrument with a throughput of 12 samples per instrument in approximately 4 hours. The resulting output data show quality scores and pass filter rates comparable to manually prepared samples. The sample size distribution can be adjusted for each application, and the method is suitable for all high throughput DNA processing protocols seeking to control size intervals.

  5. High throughput workflow for coacervate formation and characterization in shampoo systems.

    Science.gov (United States)

    Kalantar, T H; Tucker, C J; Zalusky, A S; Boomgaard, T A; Wilson, B E; Ladika, M; Jordan, S L; Li, W K; Zhang, X; Goh, C G

    2007-01-01

    Cationic cellulosic polymers find wide utility as benefit agents in shampoo. Deposition of these polymers onto hair has been shown to mend split-ends, improve appearance and wet combing, as well as provide controlled delivery of insoluble actives. The deposition is thought to be enhanced by the formation of a polymer/surfactant complex that phase-separates from the bulk solution upon dilution. A standard characterization method has been developed to characterize the coacervate formation upon dilution, but the test is time and material prohibitive. We have developed a semi-automated high throughput workflow to characterize the coacervate-forming behavior of different shampoo formulations. A procedure that allows testing of real use shampoo dilutions without first formulating a complete shampoo was identified. This procedure was adapted to a Tecan liquid handler by optimizing the parameters for liquid dispensing as well as for mixing. The high throughput workflow enabled preparation and testing of hundreds of formulations with different types and levels of cationic cellulosic polymers and surfactants, and for each formulation a haze diagram was constructed. Optimal formulations and their dilutions that give substantial coacervate formation (determined by haze measurements) were identified. Results from this high throughput workflow were shown to reproduce standard haze and bench-top turbidity measurements, and this workflow has the advantages of using less material and allowing more variables to be tested with significant time savings.
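    The haze-diagram construction described above can be sketched as thresholding turbidity readings across a dilution series: a formulation whose haze rises sharply on dilution indicates coacervate formation. The threshold and readings below are illustrative assumptions, not values from the study:

```python
def coacervate_dilutions(haze_by_dilution, baseline, threshold=2.0):
    """Return dilution factors whose haze exceeds the undiluted baseline
    by `threshold`-fold, taken here as indicating phase separation."""
    return [d for d, h in sorted(haze_by_dilution.items())
            if h > threshold * baseline]

# Hypothetical turbidity readings (arbitrary units) at several dilution factors
readings = {1: 12.0, 5: 95.0, 10: 140.0, 20: 20.0}
print(coacervate_dilutions(readings, baseline=12.0))  # → [5, 10]
```

    On a liquid handler, such a scan would be run per formulation, and the flagged dilution windows compared across the library to pick optimal polymer/surfactant combinations.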

  6. Applications of high-throughput clonogenic survival assays in high-LET particle microbeams

    Directory of Open Access Journals (Sweden)

    Antonios eGeorgantzoglou

    2016-01-01

    Full Text Available Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-LET particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells’ clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron field size high-LET microbeam. Post-irradiation, time-lapse imaging follows cells’ response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on cells’ capacity to divide at least 4-5 times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.

  7. Applications of High-Throughput Clonogenic Survival Assays in High-LET Particle Microbeams.

    Science.gov (United States)

    Georgantzoglou, Antonios; Merchant, Michael J; Jeynes, Jonathan C G; Mayhead, Natalie; Punia, Natasha; Butler, Rachel E; Jena, Rajesh

    2015-01-01

    Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-linear energy transfer (LET) particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells' clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron field size high-LET microbeam. Post-irradiation, time-lapse imaging follows cells' response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on cells' capacity to divide at least four to five times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.
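    The trajectory-linking step described above — matching each cell to its nearest position in the next time frame — can be sketched as a greedy nearest-neighbour assignment. This is a simplification of the authors' method, and the distance gate below is an illustrative assumption:

```python
import math

def link_frames(prev_pts, next_pts, max_dist=20.0):
    """Greedily link each point in prev_pts to its nearest unclaimed point
    in next_pts (within max_dist); returns {prev_index: next_index}."""
    links, taken = {}, set()
    for i, p in enumerate(prev_pts):
        best, best_d = None, max_dist
        for j, q in enumerate(next_pts):
            if j in taken:
                continue
            d = math.dist(p, q)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            links[i] = best
            taken.add(best)
    return links

frame0 = [(10.0, 10.0), (50.0, 50.0)]   # detected cell centres at time t
frame1 = [(52.0, 49.0), (11.0, 12.0)]   # detections at time t+1
print(link_frames(frame0, frame1))  # → {0: 1, 1: 0}
```

    Repeating this link step over all consecutive frames yields per-cell trajectories, on which division events can then be scored.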

  8. Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation

    Science.gov (United States)

    Potyrailo, Radislav A.; Mirsky, Vladimir M.

    New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.

  9. High-throughput investigation of catalysts for JP-8 fuel cracking to liquefied petroleum gas.

    Science.gov (United States)

    Bedenbaugh, John E; Kim, Sungtak; Sasmaz, Erdem; Lauterbach, Jochen

    2013-09-09

    Portable power technologies for military applications necessitate the production of fuels similar to LPG from existing feedstocks. Catalytic cracking of military jet fuel to form a mixture of C₂-C₄ hydrocarbons was investigated using high-throughput experimentation. Cracking experiments were performed in a gas-phase, 16-sample high-throughput reactor. Zeolite ZSM-5 catalysts with low Si/Al ratios (≤25) demonstrated the highest production of C₂-C₄ hydrocarbons at moderate reaction temperatures (623-823 K). ZSM-5 catalysts were optimized for JP-8 cracking activity to LPG through varying reaction temperature and framework Si/Al ratio. The reducing atmosphere required during catalytic cracking resulted in coking of the catalyst and a commensurate decrease in conversion rate. Rare earth metal promoters for ZSM-5 catalysts were screened to reduce coking deactivation rates, while noble metal promoters reduced onset temperatures for coke burnoff regeneration.

  10. Macro-to-micro structural proteomics: native source proteins for high-throughput crystallization.

    Directory of Open Access Journals (Sweden)

    Monica Totir

    Full Text Available Structural biology and structural genomics projects routinely rely on recombinantly expressed proteins, but many proteins and complexes are difficult to obtain by this approach. We investigated native source proteins for high-throughput protein crystallography applications. The Escherichia coli proteome was fractionated, purified, crystallized, and structurally characterized. Macro-scale fermentation and fractionation were used to subdivide the soluble proteome into 408 unique fractions of which 295 fractions yielded crystals in microfluidic crystallization chips. Of the 295 crystals, 152 were selected for optimization, diffraction screening, and data collection. Twenty-three structures were determined, four of which were novel. This study demonstrates the utility of native source proteins for high-throughput crystallography.

  11. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    Science.gov (United States)

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies have focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as the Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of configurable physical- and link-layer features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency, and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic, and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between the attainable throughput per station and the attainable number of stations, as well as the influence of RAW, TIM, and TCP parameters on both. We found that up to 20 continuously streaming IP cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning …

  12. High-throughput cultivation and screening platform for unicellular phototrophs.

    Science.gov (United States)

    Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus

    2014-09-16

    High-throughput cultivation and screening methods allow parallel, miniaturized, and cost-efficient processing of many samples. These methods, however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. The cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration, and MALDI-TOF-MS, as well as a novel vitality measurement protocol, has already been established and can be run during cultivation. Growth measurements can be used as inputs for the system to trigger periodic automatic dilutions and therefore a semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid- and long-term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows for high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and be adaptable for even more. A variety of analyses are already established, and the platform is easily expandable both in quality, i.e., with further parameters to screen for additional targets, and in quantity, i.e., the size or number of processed samples.
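
The growth-triggered dilution behind such semi-continuous cultivation reduces to a simple conservation calculation; the sketch below is illustrative only and is not the platform's actual control software (function name, OD values, and volumes are hypothetical).

```python
def dilution_volumes(volume_ml, od, target_od):
    """Volume of culture to keep, and fresh medium to add, to return a
    well to target_od at the same working volume (conserves od * volume).
    All parameter values used here are hypothetical examples."""
    if od <= target_od:
        raise ValueError("culture has not yet outgrown the target density")
    keep_ml = volume_ml * target_od / od
    return keep_ml, volume_ml - keep_ml
```

For example, a 2.0 ml culture that has grown to OD 1.0 and should be reset to OD 0.25 keeps 0.5 ml of culture and receives 1.5 ml of fresh medium.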

  13. High throughput inclusion body sizing: Nano particle tracking analysis.

    Science.gov (United States)

    Reichelt, Wieland N; Kaineder, Andreas; Brillmann, Markus; Neutsch, Lukas; Taschauer, Alexander; Lohninger, Hans; Herwig, Christoph

    2017-06-01

    The expression of pharmaceutically relevant proteins in Escherichia coli frequently triggers inclusion body (IB) formation caused by protein aggregation. In the scientific literature, substantial effort has been devoted to the quantification of IB size. However, the particle-based methods used up to this point to analyze the physical properties of representative numbers of IBs lack sensitivity and/or orthogonal verification. Using high-pressure freezing and automated freeze substitution for transmission electron microscopy (TEM), the cytosolic inclusion body structure was preserved within the cells. TEM imaging in combination with manual grey-scale image segmentation allowed quantification of the relative areas covered by the inclusion body within the cytosol. As a high-throughput method, nanoparticle tracking analysis (NTA) enables one to derive the diameter of inclusion bodies in cell homogenate from a measurement of their Brownian motion. The NTA analysis of fixated (glutaraldehyde) and non-fixated IBs suggests that high-pressure homogenization annihilates the native physiological shape of IBs. Nevertheless, the ratio of particle counts of non-fixated and fixated samples could potentially serve as a factor for particle stickiness. In this contribution, we establish image segmentation of TEM pictures as an orthogonal method to size biological particles in the cytosol of cells. More importantly, NTA has been established as a particle-based, fast, and high-throughput method (1000-3000 particles), thus constituting a much more accurate and representative analysis than currently available methods. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
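
NTA infers particle size from Brownian motion via the Stokes-Einstein relation. A minimal sketch of that inference follows, assuming 2-D tracking and water-like viscosity at 25 °C; real instruments apply further corrections, and the function name and defaults are illustrative.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(msd_m2, dt_s, temp_k=298.15, viscosity=8.9e-4):
    """Particle diameter from the 2-D mean squared displacement per frame.
    MSD = 4 * D * dt for 2-D Brownian motion, and the Stokes-Einstein
    relation gives D = kT / (3 * pi * eta * d)."""
    diffusion = msd_m2 / (4.0 * dt_s)  # diffusion coefficient, m^2/s
    return K_B * temp_k / (3.0 * math.pi * viscosity * diffusion)
```

A particle diffusing at 1e-12 m^2/s under these assumptions comes out at roughly half a micron, in the size range typical of inclusion bodies.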

  14. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen

    2017-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such screens are molecular libraries, that is, microtiter plates with solubilized reagents such as siRNAs, shRNAs, miRNA inhibitors or mimics, and sgRNAs, or small compounds, that is, drugs. These reagents are typically condensed to provide enough material for covering several screens. Library plates thus need to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links (connects…
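
The plate explosion that SAVANAH manages can be illustrated with a toy bookkeeping function; the function name, dilution factor, and counts are hypothetical and do not reflect SAVANAH's API.

```python
def dilution_series(stock_conc_um, factor, n_copies, n_dilutions):
    """Concentrations along one serial dilution of a library plate, and
    the total number of plates to track when every library copy spawns
    its own dilution series."""
    concs = [stock_conc_um / factor ** i for i in range(1, n_dilutions + 1)]
    total_plates = n_copies * (1 + n_dilutions)  # stock plate + dilutions
    return concs, total_plates
```

Even this small example (4 library copies, 3 ten-fold dilutions each) already yields 16 plates to keep consistent, which is the tracking problem the tool addresses.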

  15. Plant chip for high-throughput phenotyping of Arabidopsis.

    Science.gov (United States)

    Jiang, Huawei; Xu, Zhen; Aluru, Maneesha R; Dong, Liang

    2014-04-07

    We report on the development of a vertical and transparent microfluidic chip for high-throughput phenotyping of Arabidopsis thaliana plants. Multiple Arabidopsis seeds can be germinated and grown hydroponically over more than two weeks in the chip, thus enabling large-scale and quantitative monitoring of plant phenotypes. The novel vertical arrangement of this microfluidic device not only allows for normal gravitropic growth of the plants but also, more importantly, makes it convenient to continuously monitor phenotypic changes in plants at the whole organismal level, including seed germination and root and shoot growth (hypocotyls, cotyledons, and leaves), as well as at the cellular level. We also developed a hydrodynamic trapping method to automatically place single seeds into seed holding sites of the device and to avoid potential damage to seeds that might occur during manual loading. We demonstrated general utility of this microfluidic device by showing clear visible phenotypes of the immutans mutant of Arabidopsis, and we also showed changes occurring during plant-pathogen interactions at different developmental stages. Arabidopsis plants grown in the device maintained normal morphological and physiological behaviour, and distinct phenotypic variations consistent with a priori data were observed via high-resolution images taken in real time. Moreover, the timeline for different developmental stages for plants grown in this device was highly comparable to growth using a conventional agar plate method. This prototype plant chip technology is expected to lead to the establishment of a powerful experimental and cost-effective framework for high-throughput and precise plant phenotyping.

  16. High-throughput genomics enhances tomato breeding efficiency.

    Science.gov (United States)

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-03-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper, and eggplant, since it exhibits a reduced genome size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are currently available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assists the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers tightly linked to resistance genes, as well as cloned resistance genes, are available and could be used for high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of the genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are then exploited for simultaneously tracking favourable allele combinations in breeding programs using high-throughput genomic technologies. The aim is to pyramid, in the genetic background of commercial cultivars, alleles that increase their performance. In conclusion, tomato breeding strategies supported by advanced technologies are expected to deliver improved genotypes with increased productivity and lower costs, even for complex traits.

  17. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
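
The inclusion rule described above, admitting an incoming observation only if it is sufficiently dissimilar from those already retained, might look like this in its simplest distance-based form; the report's actual similarity measures may differ, and the threshold is a hypothetical parameter.

```python
import math

def accept(point, kept, min_distance=1.0):
    """Admit an incoming observation only if its Euclidean distance to
    every retained point meets a threshold; mutates `kept` in place.
    A minimal sketch of similarity-based adaptive sampling."""
    if all(math.dist(point, q) >= min_distance for q in kept):
        kept.append(point)
        return True
    return False
```

Because each decision touches only the retained set, the check stays fast enough to run per observation on a high-rate stream, at the cost of growing linearly with the number of kept points.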

  18. High-throughput sequencing: a roadmap toward community ecology.

    Science.gov (United States)

    Poisot, Timothée; Péquin, Bérangère; Gravel, Dominique

    2013-04-01

    High-throughput sequencing is becoming increasingly important in microbial ecology, yet it is surprisingly under-used to generate or test biogeographic hypotheses. In this contribution, we highlight how adding these methods to the ecologist toolbox will allow the detection of new patterns, and will help our understanding of the structure and dynamics of diversity. Starting with a review of ecological questions that can be addressed, we move on to the technical and analytical issues that will benefit from an increased collaboration between different disciplines.

  19. UAV-based high-throughput phenotyping in legume crops

    Science.gov (United States)

    Sankaran, Sindhuja; Khot, Lav R.; Quirós, Juan; Vandemark, George J.; McGee, Rebecca J.

    2016-05-01

    In plant breeding, one of the biggest obstacles to genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. Therefore, the major objective of this research was to evaluate the feasibility of utilizing high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices are strongly correlated with the phenotyping traits.

  20. REDItools: high-throughput RNA editing detection made easy.

    Science.gov (United States)

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools is written in the Python programming language and freely available at http://code.google.com/p/reditools/. Contact: ernesto.picardi@uniba.it or graziano.pesole@uniba.it. Supplementary data are available at Bioinformatics online.

  1. High throughput platforms for structural genomics of integral membrane proteins.

    Science.gov (United States)

    Mancia, Filippo; Love, James

    2011-08-01

    Structural genomics approaches for integral membrane proteins have been postulated for over a decade, yet specific efforts lag years behind their soluble counterparts. Indeed, high-throughput methodologies for the production and characterization of prokaryotic integral membrane proteins are only now emerging, while large-scale efforts for eukaryotic ones are still in their infancy. Presented here is a review of recent literature on actively ongoing structural genomics initiatives for membrane proteins, with a focus on those implementing techniques aimed at increasing our rate of success for this class of macromolecules. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern of polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for the design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  3. Bifrost: Stream processing framework for high-throughput applications

    Science.gov (United States)

    Barsdell, Ben; Price, Daniel; Cranmer, Miles; Garsden, Hugh; Dowell, Jayce

    2017-11-01

    Bifrost is a stream processing framework that eases the development of high-throughput processing CPU/GPU pipelines. It is designed for digital signal processing (DSP) applications within radio astronomy. Bifrost uses a flexible ring buffer implementation that allows different signal processing blocks to be connected to form a pipeline. Each block may be assigned to a CPU core, and the ring buffers are used to transport data to and from blocks. Processing blocks may be run on either the CPU or GPU, and the ring buffer will take care of memory copies between the CPU and GPU spaces.

  4. High-throughput DNA sequencing: a genomic data manufacturing process.

    Science.gov (United States)

    Huang, G M

    1999-01-01

    The progress trends in automated DNA sequencing operation are reviewed. Technological development in sequencing instruments, enzymatic chemistry and robotic stations has resulted in ever-increasing capacity of sequence data production. This progress leads to a higher demand on laboratory information management and data quality assessment. High-throughput laboratories face the challenge of organizational management, as well as technology management. Engineering principles of process control should be adopted in this biological data manufacturing procedure. While various systems attempt to provide solutions to automate different parts of, or even the entire process, new technical advances will continue to change the paradigm and provide new challenges.

  5. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  6. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin Wu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of …
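
The pipelining idea can be illustrated with a toy single-process sketch in which generator stages stream candidates through prediction and ranking. This is purely illustrative: real HTC distributes such stages across a cluster, and the stage names, marker effects, and genotype coding here are hypothetical.

```python
def stream_genotypes(genotypes):
    # Stage 1: stream one candidate's marker vector at a time
    yield from genotypes

def predict_merit(stream, marker_effects):
    # Stage 2: genomic prediction as a weighted sum of marker effects
    for genotype in stream:
        yield sum(x * b for x, b in zip(genotype, marker_effects))

def rank_candidates(stream):
    # Stage 3: rank candidates by predicted genetic merit, best first
    return sorted(enumerate(stream), key=lambda item: item[1], reverse=True)
```

Because the stages are generators, each candidate flows through the pipeline as soon as it is read rather than waiting for a full batch, which is the throughput gain pipelining provides; in a cluster, each stage would instead run as its own job connected by files or queues.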

  7. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both the compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High-throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  8. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront-division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging on the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are deposited in front of the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the samples' motion inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain and with intrinsically useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently of the magnification factor. The throughput of the imaging system is dramatically increased, as STDH provides unlimited FoV and refocusable imaging of samples inside the liquid volume with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged, and tracked in 3D, with high throughput, red blood cells moving inside the channel volume under non-ideal flow conditions.

  9. An Updated Protocol for High Throughput Plant Tissue Sectioning

    Directory of Open Access Journals (Sweden)

    Jonathan A. Atkinson

    2017-10-01

    Full Text Available Quantification of the tissue and cellular structure of plant material is essential for the study of a variety of plant sciences applications. Currently, many methods for sectioning plant material are either low throughput or involve free-hand sectioning, which requires a significant amount of practice. Here, we present an updated method to provide rapid and high-quality cross sections, primarily of root tissue, but which can also be readily applied to other tissues such as leaves or stems. To increase the throughput of traditional agarose embedding and sectioning, custom-designed 3D-printed molds were utilized to embed 5–15 roots in a block for sectioning in a single cut. A single fluorescent stain in combination with laser scanning confocal microscopy was used to obtain high-quality images of thick sections. The provided CAD files allow production of the embedding molds described here through a number of online 3D printing services. Although originally developed for roots, this method provides rapid, high-quality cross sections of many plant tissue types, making it suitable for use in forward genetic screens for differences in specific cell structures or developmental changes. To demonstrate the utility of the technique, the two parent lines of the wheat (Triticum aestivum) Chinese Spring × Paragon doubled haploid mapping population were phenotyped for root anatomical differences. Significant differences in adventitious root cross-section area, stele area, xylem, phloem, metaxylem, and cortical cell file count were found.

  10. High-throughput search for improved transparent conducting oxides

    Science.gov (United States)

    Miglio, Anna

    High-throughput methodologies are a very useful computational tool to explore the space of binary and ternary oxides. We use these methods to search for new and improved transparent conducting oxides (TCOs). TCOs exhibit both visible transparency and good carrier mobility and underpin many energy and electronic applications (e.g. photovoltaics, transparent transistors). We find several potential new n-type and p-type TCOs with a low effective mass. Combining different ab initio approaches, we characterize candidate oxides by their effective mass (mobility), band gap (transparency) and dopability. We present several compounds, not considered previously as TCOs, and discuss the chemical rationale for their promising properties. This analysis is useful to formulate design strategies for future high mobility oxides and has led to follow-up studies including preliminary experimental characterization of a p-type TCO candidate with unexpected chemistry. G. Hautier, A. Miglio, D. Waroquiers, G.-M. Rignanese, and X. Gonze, "How Does Chemistry Influence Electron Effective Mass in Oxides? A High-Throughput Computational Analysis", Chem. Mater. 26, 5447 (2014). G. Hautier, A. Miglio, G. Ceder, G.-M. Rignanese, and X. Gonze, "Identification and design principles of low hole effective mass p-type transparent conducting oxides", Nature Commun. 4, 2292 (2013).

  11. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fenglei [Iowa State Univ., Ames, IA (United States)

    2006-08-09

    The purposes of our research were: (1) to develop an economical, easy-to-use, automated, high-throughput system for large-scale protein crystallization screening; (2) to develop a new protein crystallization method with high screening efficiency, low protein consumption, and complete compatibility with the high-throughput screening system; (3) to determine the structure of lactate dehydrogenase complexed with NADH by X-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated that large-scale protein crystallization screening can be performed in a high-throughput manner with low cost and easy operation. The overall system integrates liquid dispensing, crystallization, and detection, and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants at nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible-light detector for detecting protein crystallization screening results. This detection scheme is capable of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high-throughput and non-destructive manner. The entire system, from liquid dispensing and crystallization to crystal detection, is essentially parallel, high throughput, and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption, and compatibility with automation and high throughput. In this crystallization method, a gas-permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for a given amount of protein. In addition …

  12. High throughput instruments, methods, and informatics for systems biology.

    Energy Technology Data Exchange (ETDEWEB)

    Sinclair, Michael B.; Cowie, Jim R. (New Mexico State University, Las Cruces, NM); Van Benthem, Mark Hilary; Wylie, Brian Neil; Davidson, George S.; Haaland, David Michael; Timlin, Jerilyn Ann; Aragon, Anthony D. (University of New Mexico, Albuquerque, NM); Keenan, Michael Robert; Boyack, Kevin W.; Thomas, Edward Victor; Werner-Washburne, Margaret C. (University of New Mexico, Albuquerque, NM); Mosquera-Caro, Monica P. (University of New Mexico, Albuquerque, NM); Martinez, M. Juanita (University of New Mexico, Albuquerque, NM); Martin, Shawn Bryan; Willman, Cheryl L. (University of New Mexico, Albuquerque, NM)

    2003-12-01

    High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes, or, for instance, in responses to therapies. Microarrays represent one such high throughput method, and they continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data, dramatically improving the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.
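    As an illustration of the informatics step, a per-gene significance test is one of the simplest ways to flag differentially expressed genes. The sketch below is our own (the function name and the Bonferroni correction are not the project's actual method); it assumes replicated log-intensity measurements:

```python
import numpy as np
from scipy import stats

def differential_genes(control: np.ndarray, treated: np.ndarray,
                       alpha: float = 0.01) -> np.ndarray:
    """Return indices of genes whose expression differs significantly.

    control, treated: arrays of shape (n_replicates, n_genes) holding
    log-intensities. A per-gene two-sample t-test with a Bonferroni
    correction is the simplest approach; production microarray pipelines
    use moderated statistics and more careful error models.
    """
    _, p = stats.ttest_ind(control, treated, axis=0)
    n_genes = control.shape[1]
    return np.where(p < alpha / n_genes)[0]  # Bonferroni-adjusted threshold
```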

  13. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
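    A minimal sketch of the kind of foci quantitation FociQuant automates, assuming a 2D fluorescence image and a known background level (the thresholding-plus-connected-components approach and the function name are illustrative; FociQuant's own algorithm may differ):

```python
import numpy as np
from scipy import ndimage

def quantify_foci(image: np.ndarray, background: float) -> np.ndarray:
    """Measure the integrated fluorescence of each focus in a 2D image.

    Pixels above `background` are segmented into connected components
    (candidate foci); each focus is scored by its background-subtracted
    integrated intensity, a simple surrogate for local protein amount.
    """
    mask = image > background
    labels, n = ndimage.label(mask)          # 4-connected components
    sums = ndimage.sum(image - background, labels, index=range(1, n + 1))
    return np.asarray(sums)
```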

  14. High-throughput technology for novel SO2 oxidation catalysts

    Directory of Open Access Journals (Sweden)

    Jonas Loskyll, Klaus Stoewe and Wilhelm F Maier

    2011-01-01

    Full Text Available We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered in their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of the elements used for prescreening and of those remaining after screening of the first catalyst generations.

  15. Fusion genes and their discovery using high throughput sequencing.

    Science.gov (United States)

    Annala, M J; Parker, B C; Zhang, W; Nykter, M

    2013-11-01

    Fusion genes are hybrid genes that combine parts of two or more original genes. They can form as a result of chromosomal rearrangements or abnormal transcription, and have been shown to act as drivers of malignant transformation and progression in many human cancers. The biological significance of fusion genes, together with their specificity to cancer cells, has made them excellent targets for molecular therapy. Fusion genes are also used as diagnostic and prognostic markers to confirm cancer diagnosis and monitor response to molecular therapies. High-throughput sequencing has enabled the systematic discovery of fusion genes in a wide variety of cancer types. In this review, we describe the history of fusion genes in cancer and the ways in which fusion genes form and affect cellular function. We also describe computational methodologies for detecting fusion genes from high-throughput sequencing experiments, and the most common sources of error that lead to false discovery of fusion genes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
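    One computational strategy alluded to above is the detection of discordant read pairs whose two mates map to different genes. A toy sketch of that counting step (gene assignments are assumed precomputed; real callers add split-read validation and extensive artifact filtering, and the function name and threshold are ours):

```python
from collections import Counter

def candidate_fusions(read_pairs, min_support: int = 3) -> dict:
    """Nominate fusion candidates from discordant paired-end alignments.

    read_pairs: iterable of (gene_of_mate1, gene_of_mate2) tuples for
    pairs whose mates map to different genes. Gene pairs recurrently
    supported by at least `min_support` read pairs become candidates.
    """
    # Sort each pair so (A, B) and (B, A) count as the same fusion.
    counts = Counter(tuple(sorted(p)) for p in read_pairs)
    return {pair: n for pair, n in counts.items() if n >= min_support}
```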

  16. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples, which have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards of known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), and suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  17. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant, storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  18. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, establishing it as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have developed a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system built from network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
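    The population-level readout can be illustrated with a short sketch: given the detected body orientations, count the fraction of fish heading upstream within some angular tolerance. The tolerance value and function name are our assumptions, not the paper's parameters:

```python
def rheotaxis_fraction(headings_deg, flow_deg=0.0, tolerance_deg=45.0) -> float:
    """Fraction of fish oriented head-to-current.

    headings_deg: detected body orientations of individual fish (degrees).
    A fish counts as performing rheotaxis if its heading lies within
    `tolerance_deg` of the upstream direction `flow_deg`; the population
    fraction is one simple behavioral readout for dose-response curves.
    """
    def angdiff(a, b):
        # Smallest absolute angular difference, wrapped to [0, 180].
        return abs((a - b + 180.0) % 360.0 - 180.0)

    hits = sum(1 for h in headings_deg if angdiff(h, flow_deg) <= tolerance_deg)
    return hits / len(headings_deg)
```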

  19. Nanoliter high-throughput PCR for DNA and RNA profiling.

    Science.gov (United States)

    Brenan, Colin J H; Roberts, Douglas; Hurley, James

    2009-01-01

    The increasing emphasis in life science research on utilization of genetic and genomic information underlies the need for high-throughput technologies capable of analyzing the expression of multiple genes or the presence of informative single nucleotide polymorphisms (SNPs) in large-scale, population-based applications. Human disease research, disease diagnosis, personalized therapeutics, environmental monitoring, blood testing, and identification of genetic traits impacting agricultural practices, both in terms of food quality and production efficiency, are a few areas where such systems are in demand. This has stimulated the need for PCR technologies that preserve the intrinsic analytical benefits of PCR yet enable higher throughputs without increasing the time to answer, labor and reagent expenses, or workflow complexity. An example of such a system, based on a high-density array of nanoliter PCR assays, is described here. Functionally equivalent to a microtiter plate, the nanoplate system makes possible up to 3,072 simultaneous end-point or real-time PCR measurements in a device the size of a standard microscope slide. Methods for SNP genotyping with end-point TaqMan PCR assays and quantitative measurement of gene expression with SYBR Green I real-time PCR are outlined, and illustrative data showing system performance are provided.
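    For the gene expression use case, real-time PCR Ct values are commonly reduced by the comparative Ct (2^-ΔΔCt) method. The abstract does not specify the analysis used, so the following is a sketch of that standard reduction only:

```python
def relative_expression(ct_target_sample: float, ct_ref_sample: float,
                        ct_target_control: float, ct_ref_control: float) -> float:
    """Relative gene expression by the comparative Ct (2^-ddCt) method.

    Ct values come from real-time PCR amplification curves; the target
    gene is normalized to a reference gene in both sample and control,
    and the fold change is 2 raised to minus the difference of the deltas.
    """
    d_sample = ct_target_sample - ct_ref_sample      # delta Ct, sample
    d_control = ct_target_control - ct_ref_control   # delta Ct, control
    return 2.0 ** -(d_sample - d_control)            # fold change
```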

  20. Cross-Layer Throughput Optimization in Cognitive Radio Networks with SINR Constraints

    Directory of Open Access Journals (Sweden)

    Miao Ma

    2010-01-01

    Full Text Available Recently, there has been research on the design of cross-layer protocols for cognitive radio (CR) networks, where the Protocol Model is used to model the radio interference. In this paper we consider a multihop, multi-channel CR network. We use a more realistic Signal-to-Interference-plus-Noise Ratio (SINR) model for radio interference and study the following cross-layer throughput optimization problem: (1) Given a set of secondary users with random but fixed locations, and a set of traffic flows, what is the max-min achievable throughput? (2) To achieve the optimum, how should we choose the set of active links, assign channels to each active link, and route the flows? To this end, we present a formal mathematical formulation with the objective of maximizing the minimum end-to-end flow throughput. Since the formulation is in the form of a mixed integer nonlinear program (MINLP), which is in general a hard problem, we develop a heuristic method by solving a relaxation of the original problem, followed by rounding and simple local optimization. Simulation results show that the heuristic approach performs very well; that is, the solutions obtained by the heuristic are very close to the global optimum obtained via LINGO.
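    The SINR interference model at the heart of the formulation can be sketched directly; feasibility of a candidate link set then amounts to checking each link's SINR against its threshold. The matrix layout and function name below are our conventions, not the paper's notation:

```python
import numpy as np

def link_sinrs(gain, power, noise) -> np.ndarray:
    """SINR of each active link under the physical interference model.

    gain: (n, n) matrix with gain[i][j] = channel gain from transmitter j
    to receiver i (diagonal = intended links); power: transmit power of
    each link; noise: receiver noise power. A link set is feasible only
    if every SINR meets its threshold, which is the constraint the SINR
    model uses in place of the simpler Protocol Model.
    """
    gain = np.asarray(gain, dtype=float)
    power = np.asarray(power, dtype=float)
    signal = np.diag(gain) * power                # intended received power
    interference = gain @ power - signal          # everyone else's power
    return signal / (interference + noise)
```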

  1. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations.

    Science.gov (United States)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage and volume change of the bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
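    The kind of bulk-parameter-only screening the review argues is insufficient can be sketched as a simple filter; the descriptor names and threshold values below are illustrative only, not those of any published pipeline:

```python
def screen_cathodes(candidates, min_voltage=3.0, min_capacity=150.0,
                    max_volume_change=0.1) -> list:
    """Filter candidate cathode materials on computed bulk descriptors.

    candidates: dicts with computed 'voltage' (V), 'capacity' (mAh/g) and
    fractional 'volume_change' on cycling. The review's point is that
    such bulk-only filters should be augmented with defect, surface/
    interface, doping and nanosize descriptors before trusting the hits.
    """
    return [c for c in candidates
            if c["voltage"] >= min_voltage
            and c["capacity"] >= min_capacity
            and abs(c["volume_change"]) <= max_volume_change]
```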

  2. Quantitative description on structure–property relationships of Li-ion battery materials for high-throughput computations

    Science.gov (United States)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Abstract Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage and volume change of the bulk were considered. It is important to include more structure–property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure–property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials. PMID:28458737

  3. Optimal power allocation based on sum-throughput maximization for energy harvesting cognitive radio networks

    Science.gov (United States)

    Xie, Zhenwei; Zhu, Qi

    2017-01-01

    In this study, an optimal power allocation algorithm that maximizes the sum-throughput in energy harvesting cognitive radio networks is proposed. Under the causality constraints on the energy harvested (from solar radiation, electromagnetic waves and so on) by the two secondary users (SUs), and the interference constraint at the primary user (PU), the sum-throughput maximization problem is formulated. The algorithm decomposes the interference threshold constraint into power upper bounds for the two SUs. The power allocation problems of the two SUs can then be solved by a directional water-filling algorithm (DWA) with the respective power upper bounds. The paper gives the algorithm steps and simulation results; the simulations verify that the proposed algorithm has clear advantages over the other two algorithms.
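    The building block of the DWA is classic water-filling with a per-slot power cap (the role played here by the interference-derived upper bounds). A simplified single-user sketch that ignores the energy-causality (forward-flow-only) constraint of the full directional algorithm; the function name and bisection approach are our choices:

```python
import numpy as np

def capped_water_filling(noise, total_power, p_max) -> np.ndarray:
    """Water-filling power allocation with a per-slot power cap.

    Allocates `total_power` across slots with noise levels `noise` to
    maximize sum log(1 + p/noise), with each p clipped to [0, p_max].
    The water level is found by bisection; if the caps bind everywhere,
    the budget may not be fully used.
    """
    noise = np.asarray(noise, dtype=float)
    lo, hi = 0.0, total_power + noise.max() + p_max
    for _ in range(100):                       # bisect on the water level
        level = 0.5 * (lo + hi)
        p = np.clip(level - noise, 0.0, p_max)
        if p.sum() > total_power:
            hi = level
        else:
            lo = level
    return np.clip(0.5 * (lo + hi) - noise, 0.0, p_max)
```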

  4. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online.
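    Two of the three key properties named above can be computed directly from a fragment's sequence. A sketch using the standard Kyte-Doolittle hydropathy scale (the function name is ours; the actual classifiers and feature engineering live in the linked notebooks):

```python
# Kyte-Doolittle hydropathy values for the 20 standard amino acids.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def fragment_features(seq: str) -> tuple:
    """Aromaticity and mean hydropathy of a protein fragment.

    aromaticity: fraction of Phe/Trp/Tyr residues; hydropathy: mean
    Kyte-Doolittle score (GRAVY). Together with the isoelectric point,
    these are the key inputs reported for the expression/solubility
    predictions.
    """
    seq = seq.upper()
    aromaticity = sum(seq.count(a) for a in "FWY") / len(seq)
    hydropathy = sum(KD[a] for a in seq) / len(seq)
    return aromaticity, hydropathy
```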

  5. Evaluation of a pooled strategy for high-throughput sequencing of cosmid clones from metagenomic libraries.

    Directory of Open Access Journals (Sweden)

    Kathy N Lam

    Full Text Available High-throughput sequencing methods have been instrumental in the growing field of metagenomics, with technological improvements enabling greater throughput at decreased costs. Nonetheless, the economy of high-throughput sequencing cannot be fully leveraged in the subdiscipline of functional metagenomics. In this area of research, environmental DNA is typically cloned to generate large-insert libraries from which individual clones are isolated, based on specific activities of interest. Sequence data are required for complete characterization of such clones, but the sequencing of a large set of clones requires individual barcode-based sample preparation; this can become costly, as the cost of clone barcoding scales linearly with the number of clones processed, and thus sequencing a large number of metagenomic clones often remains cost-prohibitive. We investigated a hybrid Sanger/Illumina pooled sequencing strategy that omits barcoding altogether, and we evaluated this strategy by comparing the pooled sequencing results to reference sequence data obtained from traditional barcode-based sequencing of the same set of clones. Using identity and coverage metrics in our evaluation, we show that pooled sequencing can generate high-quality sequence data, without producing problematic chimeras. Though caveats of a pooled strategy exist and further optimization of the method is required to improve recovery of complete clone sequences and to avoid circumstances that generate unrecoverable clone sequences, our results demonstrate that pooled sequencing represents an effective and low-cost alternative for sequencing large sets of metagenomic clones.

  6. A robust robotic high-throughput antibody purification platform.

    Science.gov (United States)

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market, with annual sales of more than 40 billion US$ in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns® from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO® from Tecan) has been a successful approach to establishing high-throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high-throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at sub-milligram level, with yields of up to 2 mg from 4 mL of cell-free culture supernatant. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. High-throughput ballistic injection nanorheology to measure cell mechanics

    Science.gov (United States)

    Wu, Pei-Hsun; Hale, Christopher M; Chen, Wei-Chiang; Lee, Jerry S H; Tseng, Yiider; Wirtz, Denis

    2015-01-01

    High-throughput ballistic injection nanorheology is a method for the quantitative study of cell mechanics. Cell mechanics are measured by ballistic injection of submicron particles into the cytoplasm of living cells and tracking the spontaneous displacement of the particles at high spatial resolution. The trajectories of the cytoplasm-embedded particles are transformed into mean-squared displacements, which are subsequently transformed into frequency-dependent viscoelastic moduli and time-dependent creep compliance of the cytoplasm. This method allows for the study of a wide range of cellular conditions, including cells inside a 3D matrix, cells subjected to shear flows and biochemical stimuli, and cells in a live animal. Ballistic injection lasts < 1 min and is followed by overnight incubation. Multiple particle tracking for one cell lasts < 1 min. Forty cells can be examined in < 1 h. PMID:22222790
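    The trajectory-to-MSD step can be sketched directly: a time-averaged mean-squared displacement over a single 2D particle track (the function name and frame-lag convention are our choices; conversion of the MSD into moduli and creep compliance is a further step not shown):

```python
import numpy as np

def mean_squared_displacement(track, max_lag: int) -> np.ndarray:
    """Time-averaged MSD of one particle trajectory.

    track: (n_frames, 2) array of x, y positions of a cytoplasm-embedded
    particle; returns the MSD at lags 1..max_lag (in frames). The MSD
    curve is the quantity subsequently converted into viscoelastic
    moduli and creep compliance.
    """
    track = np.asarray(track, dtype=float)
    return np.array([
        np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
        for lag in range(1, max_lag + 1)
    ])
```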

  8. Machine Learning for High-Throughput Stress Phenotyping in Plants.

    Science.gov (United States)

    Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh Kumar; Sarkar, Soumik

    2016-02-01

    Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. The Principles and Practice of Distributed High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity was recognized and formalized in the 1970s. For more than three decades these principles of Distributed Computing have guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor is based on novel distributed computing technologies and is driven by the evolving needs of High Throughput scientific applications. We will review the principles that underpin our work, the distributed computing frameworks and technologies we developed, and the lessons we learned from delivering effective and dependable software tools in an ever-changing landscape of computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds. About the speaker: Miron Livny received a B.Sc. degree in Physics and Mat...

  10. Single-platelet nanomechanics measured by high-throughput cytometry

    Science.gov (United States)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2017-02-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  11. High-throughput antibody development and retrospective epitope mapping

    DEFF Research Database (Denmark)

    Rydahl, Maja Gro

    Plant cell walls are composed of an interlinked network of polysaccharides, glycoproteins and phenolic polymers. When addressing the diverse polysaccharides in green plants, including land plants and the ancestral green algae, there are significant overlaps in the cell wall structures. Yet, there are noteworthy differences in the less evolved species of algae as compared to land plants. The dynamic process orchestrating the deposition of these biopolymers, both in algae and higher plants, is complex and highly heterogeneous, yet immensely important for the development and differentiation of the cell … of green algae, during the development into land plants. Hence, there is a pressing need for rethinking the glycomic toolbox by developing new, high-throughput (HTP) technology in order to acquire information on the location and relative abundance of diverse cell wall polymers. In this dissertation

  12. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  13. Ethoscopes: An open platform for high-throughput ethomics.

    Science.gov (United States)

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J; French, Alice S; Jamasb, Arian R; Gilestro, Giorgio F

    2017-10-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  14. Ethoscopes: An open platform for high-throughput ethomics

    Science.gov (United States)

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J.; French, Alice S.; Jamasb, Arian R.

    2017-01-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope. PMID:29049280

  15. High-throughput drawing and testing of metallic glass nanostructures.

    Science.gov (United States)

    Hasan, Molla; Kumar, Golden

    2017-03-02

    Thermoplastic embossing of metallic glasses promises direct imprinting of metal nanostructures using templates. However, embossing high-aspect-ratio nanostructures faces unworkable flow resistance due to friction and non-wetting conditions at the template interface. Herein, we show that these inherent challenges of embossing can be reversed by thermoplastic drawing using templates. The flow resistance not only remains independent of wetting but also decreases with increasing feature aspect-ratio. Arrays of assembled nanotips, nanowires, and nanotubes with aspect-ratios exceeding 1000 can be produced through controlled elongation and fracture of metallic glass structures. In contrast to embossing, the drawing approach generates two sets of nanostructures upon final fracture; one set remains anchored to the metallic glass substrate while the second set is assembled on the template. This method can be readily adapted for high-throughput fabrication and testing of nanoscale tensile specimens, enabling rapid screening of size-effects in mechanical behavior.

  16. Statistically invalid classification of high throughput gene expression data

    Science.gov (United States)

    Barbash, Shahar; Soreq, Hermona

    2013-01-01

    Classification analysis based on high throughput data is a common feature in neuroscience and other fields of science, with a rapidly increasing impact on both basic biology and disease-related studies. The outcome of such classifications often serves to delineate novel biochemical mechanisms in health and disease states, identify new targets for therapeutic interference, and develop innovative diagnostic approaches. Given the importance of such studies, we screened 111 recently published high-impact manuscripts involving classification analysis of gene expression, and found that 58 of them (53%) based their conclusions on a statistically invalid method that can lead to bias in the statistical sense (a true classification accuracy lower than the reported classification accuracy). In this report we characterize the potential methodological error and its scope, investigate how it is influenced by different experimental parameters, and describe statistically valid methods for avoiding such classification mistakes. PMID:23346359
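One common statistically invalid practice of the kind described above is selecting features on the full dataset before cross-validation, which leaks test labels into the classifier. The sketch below is only an illustration of that bias (made-up noise data, a simple nearest-centroid classifier, and NumPy assumed available), not the survey's methodology: on data with no real signal, out-of-fold feature selection reports roughly chance accuracy, while whole-dataset selection reports a much higher, spurious accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 40, 1000, 10                      # samples, noise features, features kept
X = rng.standard_normal((n, p))             # pure noise: no real class signal
y = np.repeat([0, 1], n // 2)

def top_k_by_corr(X, y, k):
    """Indices of the k features most correlated with the labels."""
    yc = y - y.mean()
    corr = np.abs((X - X.mean(0)).T @ yc)
    return np.argsort(corr)[-k:]

def nearest_centroid_acc(Xtr, ytr, Xte, yte):
    """Accuracy of a minimal nearest-centroid classifier."""
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    pred = (np.linalg.norm(Xte - c1, axis=1) <
            np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return (pred == yte).mean()

def cv_accuracy(select_inside_fold):
    """5-fold CV accuracy; the flag controls where feature selection happens."""
    folds = np.array_split(rng.permutation(n), 5)
    accs = []
    for te in folds:
        tr = np.setdiff1d(np.arange(n), te)
        # Invalid protocol: features chosen on ALL data (test labels leak in).
        # Valid protocol: features chosen on the training fold only.
        idx = (top_k_by_corr(X[tr], y[tr], k) if select_inside_fold
               else top_k_by_corr(X, y, k))
        accs.append(nearest_centroid_acc(X[tr][:, idx], y[tr],
                                         X[te][:, idx], y[te]))
    return float(np.mean(accs))

biased = cv_accuracy(select_inside_fold=False)
valid = cv_accuracy(select_inside_fold=True)
print(f"biased CV accuracy: {biased:.2f}, valid CV accuracy: {valid:.2f}")
```

On pure noise the valid estimate hovers near 0.5, while the biased one is markedly inflated, which is exactly the gap between reported and true accuracy the abstract warns about.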

  17. High throughput sequencing of microRNAs in chicken somites.

    Science.gov (United States)

    Rathjen, Tina; Pais, Helio; Sweetman, Dylan; Moulton, Vincent; Munsterberg, Andrea; Dalmay, Tamas

    2009-05-06

    High throughput Solexa sequencing technology was applied to identify microRNAs in somites of developing chicken embryos. We obtained 651,273 reads, from which 340,415 were mapped to the chicken genome representing 1701 distinct sequences. Eighty-five of these were known microRNAs and 42 novel miRNA candidates were identified. Accumulation of 18 of 42 sequences was confirmed by Northern blot analysis. Ten of the 18 sequences are new variants of known miRNAs and eight short RNAs are novel miRNAs. Six of these eight have not been reported by other deep sequencing projects. One of the six new miRNAs is highly enriched in somite tissue suggesting that deep sequencing of other specific tissues has the potential to identify novel tissue specific miRNAs.

  18. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Full Text Available Abstract Background Preparation of large quantities of high quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesion IN Genome), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample of magnetic beads plus other consumables that other methods also need.

  19. A high throughput DNA extraction method with high yield and quality.

    Science.gov (United States)

    Xin, Zhanguo; Chen, Junping

    2012-07-28

    Preparation of large quantities of high quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesion IN Genome), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample of magnetic beads plus other consumables that other methods also need.

  20. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, helping to avoid the data glut associated with rapid, high-sample-throughput analysis.

  1. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, helping to avoid the data glut associated with rapid, high-sample-throughput analysis.

  2. High-throughput phenotyping of seminal root traits in wheat.

    Science.gov (United States)

    Richard, Cecile Ai; Hickey, Lee T; Fletcher, Susan; Jennings, Raeleen; Chenu, Karine; Christopher, Jack T

    2015-01-01

    Water availability is a major limiting factor for wheat (Triticum aestivum L.) production in rain-fed agricultural systems worldwide. Root system architecture has important functional implications for the timing and extent of soil water extraction, yet selection for root architectural traits in breeding programs has been limited by a lack of suitable phenotyping methods. The aim of this research was to develop low-cost, high-throughput phenotyping methods to facilitate selection for desirable root architectural traits. Here, we report two methods, one using clear pots and the other using growth pouches, to assess the angle and the number of seminal roots in wheat seedlings, two proxy traits associated with the root architecture of mature wheat plants. Both methods revealed genetic variation for seminal root angle and number in the panel of 24 wheat cultivars. The clear pot method provided higher heritability and higher genetic correlations across experiments compared to the growth pouch method. In addition, the clear pot method was more efficient, requiring less time, space, and labour than the growth pouch method. Therefore, the clear pot method was considered the most suitable for large-scale, high-throughput screening of seedling root characteristics in crop improvement programs. The clear pot method could be easily integrated into breeding programs targeting drought tolerance to rapidly enrich breeding populations with desirable alleles. For instance, selection for narrow root angle and a high number of seminal roots could lead to deeper root systems with higher branching at depth. Such root characteristics are highly desirable in wheat to cope with anticipated future climate conditions, particularly where crops rely heavily on stored soil moisture at depth, including some Australian, Indian, South American, and African cropping regions.

  3. High-Throughput Genomics Enhances Tomato Breeding Efficiency

    Science.gov (United States)

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-01-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper, eggplant, since it exhibits a reduced genomic size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are actually available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assist the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers highly linked to resistance genes and cloned resistance genes are available and could be used for a high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are then exploited for tracking simultaneously favourable allele combinations in breeding programs using high-throughput genomic technologies. This aims at pyramiding in the genetic background of commercial cultivars alleles that increase their performances. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes even for complex traits. PMID:19721805

  4. High-Throughput Screening of the Asymmetric Decarboxylative Alkylation Reaction of Enolate-Stabilized Enol Carbonates

    KAUST Repository

    Stoltz, Brian

    2010-06-14

    The use of high-throughput screening allowed for the optimization of reaction conditions for the palladium-catalyzed asymmetric decarboxylative alkylation reaction of enolate-stabilized enol carbonates. Changing to a non-polar reaction solvent and to an electron-deficient PHOX derivative as ligand from our standard reaction conditions improved the enantioselectivity for the alkylation of a ketal-protected, 1,3-diketone-derived enol carbonate from 28% ee to 84% ee. Similar improvements in enantioselectivity were seen for a β-keto ester-derived and an α-phenyl cyclohexanone-derived enol carbonate.

  5. Curating and Preparing High-Throughput Screening Data for Quantitative Structure-Activity Relationship Modeling.

    Science.gov (United States)

    Kim, Marlene T; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2016-01-01

    Publicly available bioassay data often contains errors. Curating massive bioassay data, especially high-throughput screening (HTS) data, for Quantitative Structure-Activity Relationship (QSAR) modeling requires the assistance of automated data curation tools. Automated data curation tools are beneficial to users, especially those without prior computer skills, because many platforms have been developed and optimized based on standardized requirements. As a result, the users do not need to extensively configure the curation tool prior to the application procedure. In this chapter, a freely available automatic tool to curate and prepare HTS data for QSAR modeling purposes will be described.

  6. An extractive removal step optimized for a high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis in conifer tree rings

    Science.gov (United States)

    Wen Lin; Asko Noormets; John S. King; Ge Sun; Steve McNulty; Jean-Christophe Domec; Lucas Cernusak

    2017-01-01

    Stable isotope ratios (δ13C and δ18O) of tree-ring α-cellulose are important tools in paleoclimatology, ecology, plant physiology and genetics. The Multiple Sample Isolation System for Solids (MSISS) was a major advance in the tree-ring α-cellulose extraction methods, offering greater throughput and reduced labor input compared to traditional alternatives. However, the...

  7. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Full Text Available Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in glycosylation regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, the use of chromatographic monoliths enables a very fast analysis cycle, thus significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation and managed to increase the sensitivity approximately three times compared to the initial experimental conditions, with very good reproducibility.

  8. Surrogate-assisted feature extraction for high-throughput phenotyping.

    Science.gov (United States)

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision and natural language processing counts, acting as noisy surrogates to the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature extraction for phenotyping procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were remarkably higher than those from the other two, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the needed number of gold-standard labels. SAFE also potentially identifies important features missed by automated feature extraction for phenotyping or experts.
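The silver-standard idea above can be sketched in a few lines: surrogate counts (ICD codes, NLP mentions) define a noisy label, and candidate features most predictive of that label are retained. This is a toy illustration with simulated counts and made-up parameters, not the published SAFE implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n_patients, n_concepts = 500, 30

# Hypothetical EHR concept counts; concepts 0 and 1 track the true phenotype.
true_pheno = rng.random(n_patients) < 0.3
counts = rng.poisson(1.0, (n_patients, n_concepts)).astype(float)
counts[:, 0] += 4 * true_pheno          # ICD code count, a noisy surrogate
counts[:, 1] += 3 * true_pheno          # NLP mention count, another surrogate

# Silver-standard label: patients whose combined surrogate count is high.
surrogate = counts[:, 0] + counts[:, 1]
silver = (surrogate >= np.quantile(surrogate, 0.7)).astype(float)

def separation(feature, label):
    """How strongly a feature's standardized mean differs between label groups."""
    f = (feature - feature.mean()) / (feature.std() + 1e-9)
    return abs(f[label == 1].mean() - f[label == 0].mean())

# Keep the candidate features most predictive of the silver labels.
scores = np.array([separation(counts[:, j], silver) for j in range(n_concepts)])
selected = np.argsort(scores)[-5:]       # top 5 informative concepts
print("selected concepts:", sorted(selected.tolist()))
```

Because the silver label is driven by concepts 0 and 1, those surface among the selected features without any gold-standard labels, which is the labor-saving point of the approach.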

  9. A Primer on High-Throughput Computing for Genomic Selection

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high-throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
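The batch-parallel pattern underlying such HTC pipelines — many independent evaluations farmed out to a fixed pool of workers — can be illustrated with Python's standard library. This is a minimal sketch with a toy workload and an illustrative worker count, not cluster middleware:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_trait(trait_id):
    """Stand-in for one computationally heavy genomic evaluation job."""
    total = sum(i * i for i in range(50_000))
    return trait_id, total

traits = list(range(8))                          # eight independent jobs
with ThreadPoolExecutor(max_workers=4) as pool:  # four concurrent "cluster slots"
    results = dict(pool.map(evaluate_trait, traits))
print(f"evaluated {len(results)} traits")
```

On a real cluster the same shape appears at a larger scale: a scheduler (rather than an executor object) assigns each independent trait or marker-panel evaluation to a free node, and throughput grows with the number of slots rather than with single-job speed.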

  10. An improved high throughput sequencing method for studying oomycete communities

    DEFF Research Database (Denmark)

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-01-01

    Culture-independent studies using next generation sequencing have revolutionized microbial ecology; however, oomycete ecology in soils is severely lagging behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomycete...... agricultural fields in Denmark, and 11 samples from carrot tissue with symptoms of Pythium infection. Sequence data from the Pythium and Phytophthora mock communities showed that our strategy successfully detected all included species. Taxonomic assignments of OTUs from 26 soil samples showed that 95...... the usefulness of the method not only in soil DNA but also in a plant DNA background. In conclusion, we demonstrate a successful approach for pyrosequencing of oomycete communities using ITS1 as the barcode sequence with well-known primers for oomycete DNA amplification....

  11. High-Throughput Screening Using Fourier-Transform Infrared Imaging

    Directory of Open Access Journals (Sweden)

    Erdem Sasmaz

    2015-06-01

    Full Text Available Efficient parallel screening of combinatorial libraries is one of the most challenging aspects of the high-throughput (HT) heterogeneous catalysis workflow. Today, a number of methods have been used in HT catalyst studies, including various optical, mass-spectrometry, and gas-chromatography techniques. Of these, rapid-scanning Fourier-transform infrared (FTIR) imaging is one of the fastest and most versatile screening techniques. Here, the new design of the 16-channel HT reactor is presented and test results for its accuracy and reproducibility are shown. The performance of the system was evaluated through the oxidation of CO over commercial Pd/Al2O3 and cobalt oxide nanoparticles synthesized with different reducer-reductant molar ratios, surfactant types, metal and surfactant concentrations, synthesis temperatures, and ramp rates.

  12. High throughput parametric studies of the structure of complex nanomaterials

    Science.gov (United States)

    Tian, Peng

    The structure of nanoscale materials is difficult to study because crystallography, the gold-standard for structure studies, no longer works at the nanoscale. New tools are needed to study nanostructure. Furthermore, it is important to study the evolution of nanostructure of complex nanostructured materials as a function of various parameters such as temperature or other environmental variables. These are called parametric studies because an environmental parameter is being varied. This means that the new tools for studying nanostructure also need to be extended to work quickly and on large numbers of datasets. This thesis describes the development of new tools for high throughput studies of complex and nanostructured materials, and their application to study the structural evolution of bulk, and nanoparticles of, MnAs as a function of temperature. The tool for high throughput analysis of the bulk material was developed as part of this PhD thesis work and is called SrRietveld. A large part of making a new tool is to validate it and we did this for SrRietveld by carrying out a high-throughput study of uncertainties coming from the program using different ways of estimating the uncertainty. This tool was applied to study structural changes in MnAs as a function of temperature. We were also interested in studying different MnAs nanoparticles fabricated through different methods because of their applications in information storage. PDFgui, an existing tool for analyzing nanoparticles using Pair distribution function (PDF) refinement, was used in these cases. Comparing the results from the analysis by SrRietveld and PDFgui, we got more comprehensive structure information about MnAs. The layout of the thesis is as follows. First, the background knowledge about material structures is given. The conventional crystallographic analysis is introduced in both theoretical and practical ways. For high throughput study, the next-generation Rietveld analysis program: Sr

  13. Interactive Visual Analysis of High Throughput Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Potok, Thomas E [ORNL; Patton, Robert M [ORNL; Goodall, John R [ORNL; Maness, Christopher S [ORNL; Senter, James K [ORNL

    2012-01-01

    The scale, velocity, and dynamic nature of large scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  14. High-Throughput Mass Spectrometry Applied to Structural Genomics

    Directory of Open Access Journals (Sweden)

    Rod Chalk

    2014-10-01

    Full Text Available Mass spectrometry (MS) remains under-utilized for the analysis of expressed proteins because it is inaccessible to the non-specialist, and sample turnaround from service labs is slow. Here, we describe 3.5 min liquid chromatography-mass spectrometry (LC-MS) and 16 min LC-MS/MS methods which are tailored to the validation and characterization of recombinant proteins in a high throughput structural biology pipeline. We illustrate the type and scope of MS data typically obtained from a 96-well expression and purification test for both soluble and integral membrane proteins (IMPs), and describe their utility in the selection of constructs for scale-up structural work, leading to cost and efficiency savings. We propose that the value of MS data lies in how quickly it becomes available and that this can fundamentally change the way in which it is used.

  15. Applications of High-Throughput Nucleotide Sequencing (PhD)

    DEFF Research Database (Denmark)

    Waage, Johannes

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...... equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing......). For the second flavor, DNA-seq, a study presenting genome wide profiling of transcription factor CEBP/A in liver cells undergoing regeneration after partial hepatectomy (article IV) is included....

  16. Automated high-throughput behavioral analyses in zebrafish larvae.

    Science.gov (United States)

    Richendrfer, Holly; Créton, Robbert

    2013-07-04

    We have created a novel high-throughput imaging system for the analysis of behavior in 7-day-old zebrafish larvae in multi-lane plates. This system measures spontaneous behaviors and the response to an aversive stimulus, which is shown to the larvae via a PowerPoint presentation. The recorded images are analyzed with an ImageJ macro, which automatically splits the color channels, subtracts the background, and applies a threshold to identify the placement of individual larvae in the lanes. We can then import the coordinates into an Excel sheet to quantify swim speed, preference for edge or side of the lane, resting behavior, thigmotaxis, distance between larvae, and avoidance behavior. Subtle changes in behavior are easily detected using our system, making it useful for behavioral analyses after exposure to environmental toxicants or pharmaceuticals.
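The per-larva metrics listed above can be computed directly from frame-by-frame centroid coordinates. A minimal sketch for two of them (swim speed and an edge-preference proxy for thigmotaxis), using hypothetical track data and lane dimensions rather than the authors' ImageJ/Excel pipeline:

```python
import math

# Hypothetical per-frame (x, y) centroids for one larva in a 40 x 10 mm lane,
# sampled at 1 frame per second (all values are illustrative assumptions).
track = [(5.0, 5.0), (6.0, 5.0), (6.0, 6.5), (8.0, 6.5), (8.0, 9.5)]
lane_w, lane_h, edge_margin = 40.0, 10.0, 1.0

# Swim speed: total path length divided by elapsed time.
path = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
speed = path / (len(track) - 1)              # mm per second

# Thigmotaxis proxy: fraction of frames spent within 1 mm of a lane wall.
near_edge = sum(1 for x, y in track
                if x < edge_margin or x > lane_w - edge_margin
                or y < edge_margin or y > lane_h - edge_margin)
edge_fraction = near_edge / len(track)

print(f"speed = {speed:.3f} mm/s, edge fraction = {edge_fraction:.2f}")
```

The same coordinate stream also yields resting behavior (frames below a displacement threshold) and inter-larva distance (pairwise distances between simultaneous centroids), which is why a single tracking pass supports so many readouts.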

  17. High-throughput ab-initio dilute solute diffusion database

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-01-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world. PMID:27434308
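As a worked illustration of the error metric quoted above, a weighted RMS error over activation barriers can be computed as follows. All barrier values and weights here are made up for illustration and are not taken from the database:

```python
import math

# Toy comparison of computed vs. experimental activation barriers (eV),
# with per-point weights (hypothetical numbers).
calc = [1.30, 0.95, 2.10, 1.55]
expt = [1.25, 1.05, 2.00, 1.60]
w = [1.0, 2.0, 1.0, 1.0]

# Weighted RMS error: sqrt( sum(w_i * (calc_i - expt_i)^2) / sum(w_i) )
wrmse = math.sqrt(sum(wi * (c - e) ** 2 for wi, c, e in zip(w, calc, expt))
                  / sum(w))
print(f"weighted RMS error = {wrmse:.3f} eV")
```

Weighting lets better-constrained experimental points count more toward the summary error, which matters when comparing hundreds of solute-host systems of varying data quality.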

  18. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    ...beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of siRNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, hereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using a robotic screening platform. Furthermore, we automated sample tracking and data analysis by developing a bundled bioinformatics tool named "MIRACLE". Automation and RPPA-based viability/toxicity readouts enable rapid testing of large sample numbers, while granting the possibility for flexible consecutive...

  19. Optimal throughput for cognitive radio with energy harvesting in fading wireless channel.

    Science.gov (United States)

    Vu-Van, Hiep; Koo, Insoo

    2014-01-01

    Energy resource management is a crucial problem for a device with a finite-capacity battery. In this paper, cognitive radio is considered to be a device with an energy harvester that can harvest energy from a non-RF energy resource while performing its other actions. Harvested energy is stored in a finite-capacity battery. At the start of each time slot, the cognitive radio must decide whether to remain silent or to carry out spectrum sensing, based on the idle probability of the primary user and the remaining energy, in order to maximize the throughput of the cognitive radio system. In addition, optimal sensing energy and adaptive transmission power control are also investigated in this paper to utilize the limited energy of the cognitive radio effectively. Finding an optimal approach is formulated as a partially observable Markov decision process. The simulation results show that the proposed optimal decision scheme outperforms the myopic scheme, in which only the current throughput is considered when making a decision.
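    The silent-vs-sense trade-off described above can be sketched as a finite-horizon dynamic program over discrete battery levels. This is a minimal sketch, assuming illustrative values for the harvest rate, sensing cost, idle probability and per-slot reward (none are taken from the paper, and the full POMDP with imperfect sensing observations is simplified away):

```python
def plan(horizon, capacity, e_sense=2, e_harvest=1, p_idle=0.6, reward=1.0):
    """Backward induction over (time slot, battery level).

    Action "silent": harvest only. Action "sense": spend e_sense energy
    units and earn the expected throughput p_idle * reward for the slot.
    All parameter values are illustrative assumptions.
    """
    V = [[0.0] * (capacity + 1) for _ in range(horizon + 1)]
    policy = [[None] * (capacity + 1) for _ in range(horizon)]
    for t in range(horizon - 1, -1, -1):
        for b in range(capacity + 1):
            # Stay silent: battery only charges (up to capacity).
            b_silent = min(capacity, b + e_harvest)
            v_silent = V[t + 1][b_silent]
            # Sense: feasible only with enough stored energy.
            if b >= e_sense:
                b_sense = min(capacity, b - e_sense + e_harvest)
                v_sense = p_idle * reward + V[t + 1][b_sense]
            else:
                v_sense = float("-inf")
            if v_sense >= v_silent:
                V[t][b], policy[t][b] = v_sense, "sense"
            else:
                V[t][b], policy[t][b] = v_silent, "silent"
    return V, policy

V, policy = plan(horizon=10, capacity=5)
# An empty battery forces silence; a full one makes sensing worthwhile.
```

    The myopic scheme the paper compares against corresponds to dropping the `V[t + 1]` terms and sensing whenever the battery allows.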

  20. A High-Throughput Antibody-Based Microarray Typing Platform

    Science.gov (United States)

    Andrew, Gehring; Charles, Barnett; Chu, Ted; DebRoy, Chitrita; D'Souza, Doris; Eaker, Shannon; Fratamico, Pina; Gillespie, Barbara; Hegde, Narasimha; Jones, Kevin; Lin, Jun; Oliver, Stephen; Paoli, George; Perera, Ashan; Uknalis, Joseph

    2013-01-01

    Many rapid methods have been developed for screening foods for the presence of pathogenic microorganisms. Rapid methods that have the additional ability to identify microorganisms via multiplexed immunological recognition have the potential for classification or typing of microbial contaminants, thus facilitating epidemiological investigations that aim to identify outbreaks and trace back the contamination to its source. This manuscript introduces a novel, high-throughput typing platform that employs microarrayed multiwell plate substrates and laser-induced fluorescence of the nucleic acid intercalating dye/stain SYBR Gold for detection of antibody-captured bacteria. The aim of this study was to use this platform to compare different sets of antibodies raised against the same pathogens, as well as to demonstrate its potential effectiveness for serotyping. To that end, two sets of antibodies raised against each of the “Big Six” non-O157 Shiga toxin-producing E. coli (STEC) as well as E. coli O157:H7 were array-printed into microtiter plates, and serial dilutions of the bacteria were added and subsequently detected. Though antibody specificity was not sufficient for the development of an STEC serotyping method, the STEC antibody sets performed reasonably well, showing that specificity increased at lower capture antibody concentrations or, conversely, at lower bacterial target concentrations. The favorable results indicated that with sufficiently selective and ideally concentrated sets of biorecognition elements (e.g., antibodies or aptamers), this high-throughput platform can be used to rapidly type microbial isolates derived from food samples within ca. 80 min of total assay time. It can also potentially be used to detect the pathogens from food enrichments and at least serve as a platform for testing antibodies. PMID:23645110

  1. Compound Cytotoxicity Profiling Using Quantitative High-Throughput Screening

    Science.gov (United States)

    Xia, Menghang; Huang, Ruili; Witt, Kristine L.; Southall, Noel; Fostel, Jennifer; Cho, Ming-Hsuang; Jadhav, Ajit; Smith, Cynthia S.; Inglese, James; Portier, Christopher J.; Tice, Raymond R.; Austin, Christopher P.

    2008-01-01

    Background The propensity of compounds to produce adverse health effects in humans is generally evaluated using animal-based test methods. Such methods can be relatively expensive, low-throughput, and associated with pain suffered by the treated animals. In addition, differences in species biology may confound extrapolation to human health effects. Objective The National Toxicology Program and the National Institutes of Health Chemical Genomics Center are collaborating to identify a battery of cell-based screens to prioritize compounds for further toxicologic evaluation. Methods A collection of 1,408 compounds previously tested in one or more traditional toxicologic assays were profiled for cytotoxicity using quantitative high-throughput screening (qHTS) in 13 human and rodent cell types derived from six common targets of xenobiotic toxicity (liver, blood, kidney, nerve, lung, skin). Selected cytotoxicants were further tested to define response kinetics. Results qHTS of these compounds produced robust and reproducible results, which allowed cross-compound, cross-cell type, and cross-species comparisons. Some compounds were cytotoxic to all cell types at similar concentrations, whereas others exhibited species- or cell type–specific cytotoxicity. Closely related cell types and analogous cell types in human and rodent frequently showed different patterns of cytotoxicity. Some compounds inducing similar levels of cytotoxicity showed distinct time dependence in kinetic studies, consistent with known mechanisms of toxicity. Conclusions The generation of high-quality cytotoxicity data on this large library of known compounds using qHTS demonstrates the potential of this methodology to profile a much broader array of assays and compounds, which, in aggregate, may be valuable for prioritizing compounds for further toxicologic evaluation, identifying compounds with particular mechanisms of action, and potentially predicting in vivo biological response. PMID:18335092

  2. High throughput phenotyping for aphid resistance in large plant collections

    Directory of Open Access Journals (Sweden)

    Chen Xi

    2012-08-01

    Full Text Available Abstract Background Phloem-feeding insects are among the most devastating pests worldwide. They not only cause damage by feeding from the phloem, thereby depleting the plant of photo-assimilates, but also by vectoring viruses. Until now, the main way to prevent such problems has been the frequent use of insecticides. Applying resistant varieties would be a more environmentally friendly and sustainable solution. For this, resistant sources need to be identified first. To date, no methods have been suitable for high-throughput phenotyping of plant germplasm to identify sources of resistance towards phloem-feeding insects. Results In this paper we present a high-throughput screening system to identify plants with increased resistance against aphids. Its versatility is demonstrated using an Arabidopsis thaliana activation tag mutant line collection. This system consists of the green peach aphid Myzus persicae (Sulzer) and the circulative virus Turnip yellows virus (TuYV). In an initial screening, with one plant representing one mutant line, 13 virus-free mutant lines were identified by ELISA. Using seeds produced from these lines, the putative candidates were re-evaluated and characterized, resulting in nine lines with increased resistance towards the aphid. Conclusions This M. persicae-TuYV screening system is an efficient, reliable and quick procedure to identify, among thousands of mutated lines, those resistant to aphids. In our study, nine mutant lines with increased resistance against the aphid were selected from among 5160 mutant lines in just 5 months by one person. The system can be extended to other phloem-feeding insects and circulative viruses to identify insect-resistant sources from several collections, including, for example, genebanks and artificially prepared mutant collections.

  3. High-throughput DNA extraction of forensic adhesive tapes.

    Science.gov (United States)

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has, since its introduction in the early 2000s, become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward, while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  4. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  5. A Self-Driven and Adaptive Adjusting Teaching Learning Method for Optimizing Optical Multicast Network Throughput

    Science.gov (United States)

    Liu, Huanlin; Xu, Yifan; Chen, Yong; Zhang, Mingjia

    2016-09-01

    With the development of point-to-multipoint applications, network resources are becoming scarcer and wavelength channels more crowded in optical networks. Multicast routing based on network coding can greatly increase resource utilization, but maximizing network throughput remains difficult when the differences between the multicast receiving nodes are ignored. To make full use of the destination nodes' receiving ability and so maximize optical multicast network throughput, a new optical multicast routing algorithm based on teaching-learning-based optimization (MR-iTLBO) is proposed in this paper. To increase the diversity of learning, MR-iTLBO adopts a self-driven learning method and introduces the mutation operator of genetic algorithms to keep the search from settling into a local optimum. To increase the learners' efficiency, an adaptive learning factor is designed to adjust the learning process. Moreover, a reconfiguration scheme based on a probability vector is devised to expand the algorithm's global search capability. The simulation results show that performance in terms of network throughput and convergence rate is improved significantly with respect to TLBO and a variant TLBO.
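    MR-iTLBO extends teaching-learning-based optimization (TLBO). A minimal sketch of generic TLBO's teacher and learner phases on a toy box-constrained objective follows; this is the textbook algorithm, not the paper's network-coded multicast variant, and all parameters are illustrative:

```python
import random

def tlbo(f, dim, bounds, pop_size=20, iters=100, seed=1):
    """Minimize f over a box via basic teaching-learning-based optimization."""
    rng = random.Random(seed)
    lo, hi = bounds
    clip = lambda x: [min(hi, max(lo, v)) for v in x]
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        teacher = min(pop, key=f)  # best learner acts as the teacher
        for i, x in enumerate(pop):
            # Teacher phase: move toward the teacher, away from the mean.
            tf = rng.choice([1, 2])  # teaching factor
            cand = clip([x[d] + rng.random() * (teacher[d] - tf * mean[d])
                         for d in range(dim)])
            if f(cand) < f(x):
                pop[i] = x = cand
            # Learner phase: interact with a random peer, moving toward
            # the better of the pair (greedy acceptance again).
            peer = pop[rng.randrange(pop_size)]
            sign = 1 if f(x) < f(peer) else -1
            cand = clip([x[d] + sign * rng.random() * (x[d] - peer[d])
                         for d in range(dim)])
            if f(cand) < f(x):
                pop[i] = cand
    return min(pop, key=f)

sphere = lambda x: sum(v * v for v in x)
best = tlbo(sphere, dim=3, bounds=(-5.0, 5.0))
```

    The paper's additions (self-driven learning, mutation, adaptive learning factor, probability-vector reconfiguration) would slot into these two phases.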

  6. High-throughput microfluidic device for single cell analysis using multiple integrated soft lithographic pumps.

    Science.gov (United States)

    Patabadige, Damith E W; Mickleburgh, Tom; Ferris, Lorin; Brummer, Gage; Culbertson, Anne H; Culbertson, Christopher T

    2016-05-01

    The ability to accurately control fluid transport in microfluidic devices is key for developing high-throughput methods for single cell analysis. Making small, reproducible changes to flow rates, however, to optimize lysis and injection using pumps external to the microfluidic device are challenging and time-consuming. To improve the throughput and increase the number of cells analyzed, we have integrated previously reported micropumps into a microfluidic device that can increase the cell analysis rate to ∼1000 cells/h and operate for over an hour continuously. In order to increase the flow rates sufficiently to handle cells at a higher throughput, three sets of pumps were multiplexed. These pumps are simple, low-cost, durable, easy to fabricate, and biocompatible. They provide precise control of the flow rate up to 9.2 nL/s. These devices were used to automatically transport, lyse, and electrophoretically separate T-Lymphocyte cells loaded with Oregon green and 6-carboxyfluorescein. Peak overlap statistics predicted the number of fully resolved single-cell electropherograms seen. In addition, there was no change in the average fluorescent dye peak areas indicating that the cells remained intact and the dyes did not leak out of the cells over the 1 h analysis time. The cell lysate peak area distribution followed that expected of an asynchronous steady-state population of immortalized cells. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Optimization of Comminution Circuit Throughput and Product Size Distribution by Simulation and Control

    Energy Technology Data Exchange (ETDEWEB)

    S.K. Kawatra; T.C. Eisele; T. Weldum; D. Larsen; R. Mariani; J. Pletka

    2005-07-01

    The goal of this project was to improve the energy efficiency of industrial crushing and grinding operations (comminution). Mathematical models of the comminution process were used to study methods for optimizing the product size distribution, so that the amount of excessively fine material produced could be minimized. The goal was to save energy by reducing the amount of material that was ground below the target size, while simultaneously reducing the quantity of materials wasted as "slimes" that were too fine to be useful. Extensive plant sampling and mathematical modeling of the grinding circuits were carried out to determine how to correct this problem. The approaches taken included (1) modeling of the circuit to determine process bottlenecks that restrict flowrates in one area while forcing other parts of the circuit to overgrind the material; (2) modeling of hydrocyclones to determine the mechanisms responsible for retaining fine, high-density particles in the circuit until they are overground, and improving existing models to accurately account for this behavior; and (3) evaluation of the potential of advanced technologies to improve comminution efficiency and produce sharper product size distributions with less overgrinding. The mathematical models were used to simulate novel circuits for minimizing overgrinding and increasing throughput, and it is estimated that a single plant grinding 15 million tons of ore per year saves up to 82.5 million kWh/year, or 8.6 × 10^11 BTU/year. Implementation of this technology in the midwestern iron ore industry, which grinds an estimated 150 million tons of ore annually to produce over 50 million tons of iron ore concentrate, would save an estimated 1 × 10^13 BTU/year.
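    The stated savings imply a site-to-source conversion: 82.5 million kWh/year maps to roughly 8.6 × 10^11 BTU/year only if each delivered kWh is costed at a primary-energy heat rate near 10,400 BTU, rather than the direct 3,412 BTU/kWh equivalence. A quick check (the heat-rate value is inferred here, not stated in the report):

```python
# Reproduce the abstract's kWh-to-BTU figure under an assumed heat rate.
KWH_SAVED = 82.5e6          # kWh/year, from the report
HEAT_RATE = 10_400          # BTU of primary fuel per delivered kWh (assumed)
DIRECT = 3_412              # BTU/kWh, pure unit conversion (for contrast)

btu_primary = KWH_SAVED * HEAT_RATE   # ~8.6e11 BTU/year, matching the report
btu_direct = KWH_SAVED * DIRECT       # ~2.8e11 BTU/year, site energy only
print(f"primary: {btu_primary:.2e} BTU/yr, site: {btu_direct:.2e} BTU/yr")
```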

  8. Emerging metrology for high-throughput nanomaterial genotoxicology.

    Science.gov (United States)

    Nelson, Bryant C; Wright, Christa W; Ibuki, Yuko; Moreno-Villanueva, Maria; Karlsson, Hanna L; Hendriks, Giel; Sims, Christopher M; Singh, Neenu; Doak, Shareen H

    2017-01-01

    The rapid development of the engineered nanomaterial (ENM) manufacturing industry has accelerated the incorporation of ENMs into a wide variety of consumer products across the globe. Unintentionally or not, some of these ENMs may be introduced into the environment or come into contact with humans or other organisms, resulting in unexpected biological effects. It is thus prudent to have rapid and robust analytical metrology in place that can be used to critically assess and/or predict the cytotoxicity, as well as the potential genotoxicity, of these ENMs. Many of the traditional genotoxicity test methods [e.g. unscheduled DNA synthesis assay, bacterial reverse mutation (Ames) test, etc.] for determining the DNA damaging potential of chemical and biological compounds are not suitable for the evaluation of ENMs, due to a variety of methodological issues ranging from potential assay interferences to problems centered on low sample throughput. Recently, a number of sensitive, high-throughput genotoxicity assays/platforms (CometChip assay, flow cytometry/micronucleus assay, flow cytometry/γ-H2AX assay, automated 'Fluorimetric Detection of Alkaline DNA Unwinding' (FADU) assay, ToxTracker reporter assay) have been developed, based on substantial modifications and enhancements of traditional genotoxicity assays. These new assays have been used for the rapid measurement of DNA damage (strand breaks), chromosomal damage (micronuclei) and for detecting upregulated DNA damage signalling pathways resulting from ENM exposures. In this critical review, we describe and discuss the fundamental measurement principles and measurement endpoints of these new assays, as well as the modes of operation, analytical metrics and potential interferences, as applicable to ENM exposures. An unbiased discussion of the major technical advantages and limitations of each assay for evaluating and predicting the genotoxic potential of ENMs is also provided. Published by Oxford University Press on

  9. High-throughput massively parallel sequencing for fetal aneuploidy detection from maternal plasma.

    Directory of Open Access Journals (Sweden)

    Taylor J Jensen

    Full Text Available Circulating cell-free (ccf) fetal DNA comprises 3-20% of all the cell-free DNA present in maternal plasma. Numerous research and clinical studies have described the analysis of ccf DNA using next generation sequencing for the detection of fetal aneuploidies with high sensitivity and specificity. We sought to extend the utility of this approach by assessing semi-automated library preparation, higher sample multiplexing during sequencing, and improved bioinformatic tools to enable a higher throughput, more efficient assay while maintaining or improving clinical performance. Whole blood (10 mL) was collected from pregnant female donors and plasma separated using centrifugation. Ccf DNA was extracted using column-based methods. Libraries were prepared using an optimized semi-automated library preparation method and sequenced on an Illumina HiSeq2000 sequencer in a 12-plex format. Z-scores were calculated for affected chromosomes using a robust method after normalization and genomic segment filtering. Classification was based upon a standard normal transformed cutoff value of z = 3 for chromosome 21 and z = 3.95 for chromosomes 18 and 13. Two parallel assay development studies using a total of more than 1900 ccf DNA samples were performed to evaluate the technical feasibility of automating library preparation and increasing the sample multiplexing level. These processes were subsequently combined and a study of 1587 samples was completed to verify the stability of the process-optimized assay. Finally, an unblinded clinical evaluation of 1269 euploid and aneuploid samples utilizing this high-throughput assay coupled to improved bioinformatic procedures was performed. We were able to correctly detect all aneuploid cases with extremely low false positive rates of 0.09%, <0.01%, and 0.08% for trisomies 21, 18, and 13, respectively. These data suggest that the developed laboratory methods in concert with improved bioinformatic approaches enable higher sample
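    The z-score classification step can be sketched as follows. The reference fractions below are invented for illustration; a real pipeline derives them from normalized, segment-filtered read counts per chromosome, as the abstract notes:

```python
import statistics

def classify(chrom, fraction, euploid_fractions):
    """z-score of a sample's chromosome fraction against a euploid reference.

    Cutoffs follow the abstract: z = 3 for chr21, z = 3.95 for chr18/chr13.
    """
    mu = statistics.mean(euploid_fractions)
    sd = statistics.stdev(euploid_fractions)
    z = (fraction - mu) / sd
    cutoff = 3.0 if chrom == "chr21" else 3.95
    return z, ("aneuploidy detected" if z > cutoff else "no call")

# Hypothetical euploid chr21 fractions (percent of reads), clustering at 1.30%.
reference = [1.30, 1.31, 1.29, 1.30, 1.32, 1.28, 1.30, 1.31, 1.29, 1.30]
z, call = classify("chr21", 1.45, reference)  # elevated fraction -> high z
```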

  10. High-throughput retrotransposon-based fluorescent markers: improved information content and allele discrimination

    Directory of Open Access Journals (Sweden)

    Baker David

    2009-07-01

    Full Text Available Abstract Background Dense genetic maps, together with the efficiency and accuracy of their construction, are integral to genetic studies and marker assisted selection for plant breeding. High-throughput multiplex markers that are robust and reproducible can contribute to both efficiency and accuracy. Multiplex markers are often dominant and so have low information content, this coupled with the pressure to find alternatives to radio-labelling, has led us to adapt the SSAP (sequence specific amplified polymorphism marker method from a 33P labelling procedure to fluorescently tagged markers analysed from an automated ABI 3730 xl platform. This method is illustrated for multiplexed SSAP markers based on retrotransposon insertions of pea and is applicable for the rapid and efficient generation of markers from genomes where repetitive element sequence information is available for primer design. We cross-reference SSAP markers previously generated using the 33P manual PAGE system to fluorescent peaks, and use these high-throughput fluorescent SSAP markers for further genetic studies in Pisum. Results The optimal conditions for the fluorescent-labelling method used a triplex set of primers in the PCR. These included a fluorescently labelled specific primer together with its unlabelled counterpart, plus an adapter-based primer with two bases of selection on the 3' end. The introduction of the unlabelled specific primer helped to optimise the fluorescent signal across the range of fragment sizes expected, and eliminated the need for extensive dilutions of PCR amplicons. The software (GeneMarker Version 1.6 used for the high-throughput data analysis provided an assessment of amplicon size in nucleotides, peak areas and fluorescence intensity in a table format, so providing additional information content for each marker. 
The method has been tested in a small-scale study with 12 pea accessions resulting in 467 polymorphic fluorescent SSAP markers of which

  11. High-Throughput Printing Process for Flexible Electronics

    Science.gov (United States)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  12. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires a non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow building distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...

  13. Field high-throughput phenotyping: the new crop breeding frontier.

    Science.gov (United States)

    Araus, José Luis; Cairns, Jill E

    2014-01-01

    Constraints in field phenotyping capability limit our ability to dissect the genetics of quantitative traits, particularly those related to yield and stress tolerance (e.g., yield potential as well as increased drought, heat tolerance, and nutrient efficiency, etc.). The development of effective field-based high-throughput phenotyping platforms (HTPPs) remains a bottleneck for future breeding advances. However, progress in sensors, aeronautics, and high-performance computing are paving the way. Here, we review recent advances in field HTPPs, which should combine at an affordable cost, high capacity for data recording, scoring and processing, and non-invasive remote sensing methods, together with automated environmental data collection. Laboratory analyses of key plant parts may complement direct phenotyping under field conditions. Improvements in user-friendly data management together with a more powerful interpretation of results should increase the use of field HTPPs, therefore increasing the efficiency of crop genetic improvement to meet the needs of future generations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. The complete automation of cell culture: improvements for high-throughput and high-content screening.

    Science.gov (United States)

    Jain, Shushant; Sondervan, David; Rizzu, Patrizia; Bochdanovits, Zoltan; Caminada, Daniel; Heutink, Peter

    2011-09-01

    Genomic approaches provide enormous amounts of raw data with regard to genetic variation, the diversity of RNA species, and protein complement. High-throughput (HT) and high-content (HC) cellular screens are ideally suited to contextualize the information gathered from other "omic" approaches into networks and can be used for the identification of therapeutic targets. Current methods used for HT-HC screens are laborious, time-consuming, and prone to human error. The authors thus developed an automated high-throughput system with an integrated fluorescent imager for HC screens called the AI.CELLHOST. The implementation of user-defined culturing and assay plate setup parameters allows parallel operation of multiple screens in diverse mammalian cell types. The authors demonstrate that such a system is able to successfully maintain different cell lines in culture for extended periods of time as well as significantly increasing throughput, accuracy, and reproducibility of HT and HC screens.

  15. A high-throughput splinkerette-PCR method for the isolation and sequencing of retroviral insertion sites

    DEFF Research Database (Denmark)

    Uren, Anthony G; Mikkers, Harald; Kool, Jaap

    2009-01-01

    sites has been a major limitation to performing screens on this scale. Here we present a method for the high-throughput isolation of insertion sites using a highly efficient splinkerette-PCR method coupled with capillary or 454 sequencing. This protocol includes a description of the procedure for DNA...... optimized for the murine leukemia virus (MuLV), and can easily be performed in a 96-well plate format for the efficient multiplex isolation of insertion sites....

  16. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  17. High-Throughput Combinatorial Development of High-Entropy Alloys For Light-Weight Structural Applications

    Energy Technology Data Exchange (ETDEWEB)

    Van Duren, Jeroen K; Koch, Carl; Luo, Alan; Sample, Vivek; Sachdev, Anil

    2017-12-29

    on Al-Cr-Fe-Ni, shows compressive strain >10% and specific compressive yield strength of 229 MPa x cc/g, yet does not show ductility in tensile tests due to cleavage. When replacing Cr in Al-Cr-Fe-based 4- and 5-element LDHEA with Mn, hardness drops 2x. Combined with compression test results, including those on the ternaries Al-Cr-Fe and Al-Mn-Fe, these findings suggest that Al-Mn-Fe-based LDHEA are still worth pursuing. These initial results only represent one compressive stress-strain curve per composition without any property optimization. As such, reproducibility needs to be followed by optimization to show their full potential. When including Li, Mg, and Zn, single-phase Li-Mg-Al-Ti-Zn LDHEA has been found with a specific ultimate compressive strength of 289 MPa x cc/g. Al-Ti-Mn-Zn showed a specific ultimate compressive strength of 73 MPa x cc/g. These initial results after hot isostatic pressing (HIP) of the ball-milled powders represent the lower end of what is possible, since no secondary processing (e.g. extrusion) has been performed to optimize strength and ductility. Compositions for multi-phase (e.g. dual-phase) LDHEA were identified largely by automated searches through CALPHAD databases, while screening for large face-centered-cubic (FCC) volume fractions, followed by experimental verification. This resulted in several new alloys. Li-Mg-Al-Mn-Fe and Mg-Mn-Fe-Co ball-milled powders upon HIP show specific ultimate compressive strengths of 198 MPa x cc/g and 45 MPa x cc/g, respectively. Several malleable quaternary Al-Zn-based alloys have been found upon arc/induction melting, yet with limited specific compressive yield strength (<75 MPa x cc/g). These initial results are all without any optimization for strength and/or ductility. High-throughput experimentation allowed us to triple the existing experimental HEA database as published in the past 10 years in less than 2 years, at a rate 10x higher than previous methods.
Furthermore, we showed that high-throughput

  18. Quantitative High-Throughput Screening Using a Coincidence Reporter Biocircuit.

    Science.gov (United States)

    Schuck, Brittany W; MacArthur, Ryan; Inglese, James

    2017-04-10

    Reporter-biased artifacts, i.e., compounds that interact directly with the reporter enzyme used in a high-throughput screening (HTS) assay rather than with the biological process or pharmacology being interrogated, are now widely recognized to reduce the efficiency and quality of HTS used for chemical probe and therapeutic development. Furthermore, narrow or single-concentration HTS perpetuates false negatives during primary screening campaigns. Titration-based HTS, or quantitative HTS (qHTS), and coincidence reporter technology can be employed to reduce false negatives and false positives, respectively, thereby increasing the quality and efficiency of primary screening efforts, where the number of compounds investigated can range from tens of thousands to millions. The three protocols described here allow for generation of a coincidence reporter (CR) biocircuit to interrogate a biological or pharmacological question of interest, generation of a stable cell line expressing the CR biocircuit, and qHTS using the CR biocircuit to efficiently identify high-quality biologically active small molecules. © 2017 John Wiley & Sons, Inc.
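
The coincidence logic behind such a biocircuit can be illustrated with a small sketch (the thresholds and fold-change values below are hypothetical): a genuine biological effect modulates both co-expressed reporters together, while a reporter-biased artifact perturbs only one.

```python
def classify_well(reporter_a, reporter_b, active_below=0.5):
    """Classify a compound well from two reporter fold-changes
    (1.0 = no effect; values below `active_below` count as activity).

    'hit'      -- both co-expressed reporters respond (coincident signal)
    'artifact' -- only one reporter responds (reporter-biased signal)
    'inactive' -- neither reporter responds
    """
    a_active = reporter_a < active_below
    b_active = reporter_b < active_below
    if a_active and b_active:
        return "hit"
    if a_active or b_active:
        return "artifact"
    return "inactive"
```

In a qHTS setting this classification would be applied at each concentration of the titration, so a full concentration-response curve supports or refutes each call.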

  19. High-throughput literature mining to support read-across ...

    Science.gov (United States)

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing biological similarity. Many research efforts are underway, such as structuring mechanistic data in adverse outcome pathways and investigating the utility of high throughput (HT)/high content (HC) screening data. A largely untapped resource for read-across to date is the biomedical literature. This information has the potential to support read-across by facilitating the identification of valid source analogues with similar biological and toxicological profiles, as well as providing the mechanistic understanding for any prediction made. A key challenge in using biomedical literature is to convert and translate its unstructured form into a computable format that can be linked to chemical structure. We developed a novel text-mining strategy to represent literature information for read-across. Keywords were used to organize literature into toxicity signatures at the chemical level. These signatures were integrated with HT in vitro data and curated chemical structures. A rule-based algorithm assessed the strength of the literature relationship, providing a mechanism to rank and visualize the signature as literature ToxPIs (LitToxPIs). LitToxPIs were developed for over 6,000 chemicals for a varie

  20. High Throughput Heuristics for Prioritizing Human Exposure to ...

    Science.gov (United States)

    The risk posed to human health by any of the thousands of untested anthropogenic chemicals in our environment is a function of both the potential hazard presented by the chemical and the possibility of being exposed. Without the capacity to make quantitative, albeit uncertain, forecasts of exposure, the putative risk of adverse health effect from a chemical cannot be evaluated. We used Bayesian methodology to infer ranges of exposure intakes that are consistent with biomarkers of chemical exposures identified in urine samples from the U.S. population by the National Health and Nutrition Examination Survey (NHANES). We perform linear regression on inferred exposure for demographic subsets of NHANES demarcated by age, gender, and weight, using high throughput chemical descriptors gleaned from databases and chemical structure-based calculators. We find that five of these descriptors are capable of explaining roughly 50% of the variability across chemicals for all the demographic groups examined, including children aged 6-11. For the thousands of chemicals with no other source of information, this approach allows rapid and efficient prediction of average exposure intake of environmental chemicals. The methods described in this manuscript provide a much-improved methodology for high-throughput prioritization of human exposure to environmental chemicals. The manuscript includes a ranking of 7785 environmental chemicals with respect to potential human exposure, including most of the Tox21 in vit
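
As an illustration of the regression step, a minimal ordinary-least-squares sketch in dependency-free Python (the single descriptor and intake values below are invented for demonstration; the study regresses Bayesian-inferred intakes on five high-throughput descriptors):

```python
def ols_fit(x, y):
    """Ordinary least squares for one predictor: (slope, intercept, r2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical descriptor (log10 production volume) vs. log10 inferred
# intake (mg/kg/day) for five chemicals -- illustrative numbers only
log_volume = [2.0, 3.0, 4.0, 5.0, 6.0]
log_intake = [-7.9, -7.2, -6.1, -5.2, -4.1]
slope, intercept, r2 = ols_fit(log_volume, log_intake)
```

The r2 value plays the role of the "variability explained" statistic quoted in the abstract.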

  1. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Barth

    2008-02-06

    AVA Solar has developed a very low cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct, and test a production prototype system, fabricate PV modules, and test the module performance. The module manufacturing costs in the original proposal were estimated at $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated, and installed. Using this automated, high throughput system, high efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained two rounds of private equity funding, expanded to 50 employees, and initiated the development of a large-scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry-leading cost, which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40-$60 Billion by 2012. Currently, a crystalline silicon raw material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  2. Efficient Management of High-Throughput Screening Libraries with SAVANAH.

    Science.gov (United States)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen; Christiansen, Helle; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2017-02-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such screens are molecular libraries: microtiter plates with solubilized reagents such as siRNAs, shRNAs, miRNA inhibitors or mimics, and sgRNAs, or small compounds (i.e., drugs). These reagents are typically concentrated to provide enough material for covering several screens. Library plates thus need to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links sample information from the library to experimental results from the assay plates. All results can be exported to the R statistical environment or piped into HiTSeekR ( http://hitseekr.compbio.sdu.dk ) for comprehensive follow-up analyses. In summary, SAVANAH supports the HTS community in managing and analyzing HTS experiments with an emphasis on serially diluted molecular libraries.
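
The core bookkeeping problem SAVANAH addresses, tracking concentrations across serial dilutions, can be sketched as follows (the stock concentration and dilution factors are hypothetical; SAVANAH itself also tracks plate and well identities and links them to assay results):

```python
def trace_concentrations(stock_nM, dilution_factors):
    """Concentration after each serial dilution step, starting from stock."""
    concs = [stock_nM]
    for factor in dilution_factors:
        concs.append(concs[-1] / factor)
    return concs

# Hypothetical: a 20 uM (20000 nM) siRNA mother plate diluted 1:10 into a
# daughter plate and then 1:4 into the assay plate
concs = trace_concentrations(20000.0, [10, 4])  # [20000.0, 2000.0, 500.0] nM
```

Each element of the returned list corresponds to one plate generation, which is exactly the mapping a library manager must preserve to relate assay-plate readouts back to library wells.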

  3. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers.

    Science.gov (United States)

    Yi, Yunhai; You, Xinxin; Bian, Chao; Chen, Shixi; Lv, Zhao; Qiu, Limei; Shi, Qiong

    2017-11-22

    Widespread existence of antimicrobial peptides (AMPs) has been reported in various animals with comprehensive biological activities, which is consistent with the important roles of AMPs as the first line of the host defense system. However, no big-data-based analysis on AMPs from any fish species is available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic with less time out of water, while the latter is primarily terrestrial with extended periods of time on land. Within these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically overlapped between BP and PM; 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with strong inhibitory activity against Micrococcus luteus. In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for ongoing development of marine drugs.

  4. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers

    Directory of Open Access Journals (Sweden)

    Yunhai Yi

    2017-11-01

    Full Text Available Widespread existence of antimicrobial peptides (AMPs) has been reported in various animals with comprehensive biological activities, which is consistent with the important roles of AMPs as the first line of the host defense system. However, no big-data-based analysis on AMPs from any fish species is available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic with less time out of water, while the latter is primarily terrestrial with extended periods of time on land. Within these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically overlapped between BP and PM; 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with strong inhibitory activity against Micrococcus luteus. In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for ongoing development of marine drugs.

  5. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    Directory of Open Access Journals (Sweden)

    Julio Alonso-Padilla

    2014-12-01

    Full Text Available The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects, and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective of reliably accelerating the drug discovery process against Chagas disease, several advances have been made in the last few years. The availability of engineered reporter-gene-expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS), as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasite inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti-T. cruzi drug entities in the near future, are reviewed here.

  6. Validation of high throughput sequencing and microbial forensics applications.

    Science.gov (United States)

    Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic, or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays, and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation, or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of the specific application, and the criteria described here comprise a foundation for investigators to establish, validate, and implement HTS as a tool in microbial forensics, enhancing public safety and national security.

  7. SNP-PHAGE – High throughput SNP discovery pipeline

    Directory of Open Access Journals (Sweden)

    Cregan Perry B

    2006-10-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) as defined here are single-base sequence changes or short insertions/deletions between or within individuals of a given species. As a result of their abundance and the availability of high-throughput analysis technologies, SNP markers have begun to replace other traditional markers such as restriction fragment length polymorphisms (RFLPs), amplified fragment length polymorphisms (AFLPs), and simple sequence repeats (SSRs, or microsatellite markers) for fine mapping and association studies in several species. For SNP discovery from chromatogram data, several bioinformatics programs have to be combined to generate an analysis pipeline. Results have to be stored in a relational database to facilitate interrogation through queries or to generate data for further analyses such as determination of linkage disequilibrium and identification of common haplotypes. Although these tasks are routinely performed by several groups, an integrated open source SNP discovery pipeline that can be easily adapted by new groups interested in SNP marker development is currently unavailable. Results We developed SNP-PHAGE (SNP discovery Pipeline with additional features for identification of common haplotypes within a sequence tagged site (Haplotype Analysis) and GenBank (dbSNP) submissions). This tool was applied for analyzing sequence traces from diverse soybean genotypes to discover over 10,000 SNPs. This package was developed on a UNIX/Linux platform, written in Perl, and uses a MySQL database. Scripts to generate a user-friendly web interface are also provided, with common queries for preliminary data analysis. A machine learning tool developed by this group for increasing the efficiency of SNP discovery is integrated as part of this package as an optional feature. The SNP-PHAGE package is being made available open source at http://bfgl.anri.barc.usda.gov/ML/snp-phage/. Conclusion SNP-PHAGE provides a bioinformatics
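
A toy version of the SNP-calling step, reduced to finding variable positions in aligned sequences, might look like the sketch below (illustrative only; SNP-PHAGE works from chromatogram traces with base-quality scores and a machine-learning filter, none of which is modeled here):

```python
def call_snps(alignment, min_minor_count=2):
    """Call SNP positions from equal-length aligned sequences.

    A position is reported when at least two distinct bases occur and
    the minor allele is seen at least `min_minor_count` times -- a crude
    guard against singleton sequencing errors.
    """
    snps = []
    for pos in range(len(alignment[0])):
        counts = {}
        for seq in alignment:
            base = seq[pos]
            if base != "N":                      # skip ambiguous calls
                counts[base] = counts.get(base, 0) + 1
        if len(counts) >= 2 and sorted(counts.values())[-2] >= min_minor_count:
            snps.append((pos, counts))
    return snps

# Four hypothetical genotypes: position 3 is a real T/A SNP, position 7
# has a singleton 'A' that the minor-allele filter discards
genotypes = ["ACGTACGT", "ACGAACGT", "ACGAACGT", "ACGTACGA"]
snps = call_snps(genotypes)
```

The minor-allele-count filter stands in, very roughly, for the quality-based error filtering a real pipeline applies before database storage.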

  8. The Complete Automation of Cell Culture: Improvements for High-Throughput and High-Content Screening

    NARCIS (Netherlands)

    Jain, S.; Sondervan, D.; Rizzu, P.; Bochdanovits, Z.; Caminada, D.; Heutink, P.

    2011-01-01

    Genomic approaches provide enormous amounts of raw data with regard to genetic variation, the diversity of RNA species, and the protein complement. High-throughput (HT) and high-content (HC) cellular screens are ideally suited to contextualize the information gathered from other "omic" approaches into

  9. High-throughput tissue extraction protocol for NMR- and MS-based metabolomics.

    Science.gov (United States)

    Wu, Huifeng; Southam, Andrew D; Hines, Adam; Viant, Mark R

    2008-01-15

    In metabolomics, tissues typically are extracted by grinding in liquid nitrogen followed by the stepwise addition of solvents. This is time-consuming and difficult to automate, and the multiple steps can introduce variability. Here we optimize tissue extraction methods compatible with high-throughput, reproducible nuclear magnetic resonance (NMR) spectroscopy- and mass spectrometry (MS)-based metabolomics. Previously, we concluded that methanol/chloroform/water extraction is preferable for metabolomics, and we further optimized this here using fish liver and an automated Precellys 24 bead-based homogenizer, allowing rapid extraction of multiple samples without carryover. We compared three solvent addition strategies: stepwise, two-step, and all solvents simultaneously. Then we evaluated strategies for improved partitioning of metabolites between solvent phases, including the addition of extra water and different partition times. Polar extracts were analyzed by NMR and principal components analysis, and the two-step approach was preferable based on lipid partitioning, reproducibility, yield, and throughput. Longer partitioning or extra water increased yield and decreased lipids in the polar phase but caused metabolic decay in these extracts. Overall, we conclude that the two-step method with extra water provides good quality data but that the two-step method with 10 min partitioning provides a more accurate snapshot of the metabolome. Finally, when validating the two-step strategy using NMR and MS metabolomics, we showed that technical variability was considerably smaller than biological variability.

  10. Performance of high-throughput DNA quantification methods

    Directory of Open Access Journals (Sweden)

    Chanock Stephen J

    2003-10-01

    Full Text Available Abstract Background The accuracy and precision of estimates of DNA concentration are critical factors for efficient use of DNA samples in high-throughput genotype and sequence analyses. We evaluated the performance of spectrophotometric (OD) DNA quantification and compared it to two fluorometric quantification methods, the PicoGreen® assay (PG) and a novel real-time quantitative genomic PCR assay (QG) specific to a region at the human BRCA1 locus. Twenty-two lymphoblastoid cell line DNA samples with an initial concentration of ~350 ng/µL were diluted to 20 ng/µL. DNA concentration was estimated by OD and further diluted to 5 ng/µL. The concentrations of multiple aliquots of the final dilution were measured by the OD, QG, and PG methods. The effects of manual and robotic laboratory sample handling procedures on the estimates of DNA concentration were assessed using variance components analyses. Results The OD method was the DNA quantification method most concordant with the reference sample among the three methods evaluated. A large fraction of the total variance for all three methods (36.0–95.7%) was explained by sample-to-sample variation, whereas the amount of variance attributable to sample handling was small (0.8–17.5%). Residual error (3.2–59.4%), corresponding to un-modelled factors, contributed to a greater extent to the total variation than the sample handling procedures. Conclusion The application of a specific DNA quantification method to a particular molecular genetic laboratory protocol must take into account the accuracy and precision of the specific method, as well as the requirements of the experimental workflow with respect to sample volumes and throughput. While OD was the most concordant and precise DNA quantification method in this study, the information provided by the quantitative PCR assay regarding the suitability of DNA samples for PCR may be an essential factor for some protocols, despite the decreased concordance and
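
The variance-components idea, partitioning measurement variability into sample-to-sample and residual parts, can be sketched with a balanced one-way ANOVA in dependency-free Python (the triplicate readings below are invented; the study additionally models sample-handling factors):

```python
def variance_components(groups):
    """One-way random-effects variance components from replicate groups.

    Returns (between_group, within_group) variance estimates using the
    classical ANOVA method of moments; a balanced design is assumed.
    """
    k = len(groups)              # number of DNA samples
    n = len(groups[0])           # replicate measurements per sample
    grand = sum(sum(g) for g in groups) / (k * n)
    ms_within = sum(sum((x - sum(g) / n) ** 2 for x in g)
                    for g in groups) / (k * (n - 1))
    ms_between = n * sum((sum(g) / n - grand) ** 2 for g in groups) / (k - 1)
    var_between = max(0.0, (ms_between - ms_within) / n)
    return var_between, ms_within

# Hypothetical triplicate concentration readings (ng/uL) for three samples
readings = [[5.1, 5.0, 5.2], [4.4, 4.5, 4.6], [5.8, 5.9, 5.7]]
vb, vw = variance_components(readings)
```

Here vb dominating vw mirrors the paper's finding that sample-to-sample variation explains most of the total variance.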

  11. High throughput comet assay to study genotoxicity of nanomaterials

    Directory of Open Access Journals (Sweden)

    Naouale El Yamani

    2015-06-01

    Full Text Available The unique physicochemical properties of engineered nanomaterials (NMs have accelerated their use in diverse industrial and domestic products. Although their presence in consumer products represents a major concern for public health safety, their potential impact on human health is poorly understood. There is therefore an urgent need to clarify the toxic effects of NMs and to elucidate the mechanisms involved. In view of the large number of NMs currently being used, high throughput (HTP screening technologies are clearly needed for efficient assessment of toxicity. The comet assay is the most used method in nanogenotoxicity studies and has great potential for increasing throughput as it is fast, versatile and robust; simple technical modifications of the assay make it possible to test many compounds (NMs in a single experiment. The standard gel of 70-100 μL contains thousands of cells, of which only a tiny fraction are actually scored. Reducing the gel to a volume of 5 μL, with just a few hundred cells, allows twelve gels to be set on a standard slide, or 96 as a standard 8x12 array. For the 12 gel format, standard slides precoated with agarose are placed on a metal template and gels are set on the positions marked on the template. The HTP comet assay, incorporating digestion of DNA with formamidopyrimidine DNA glycosylase (FPG to detect oxidised purines, has recently been applied to study the potential induction of genotoxicity by NMs via reactive oxygen. In the NanoTEST project we investigated the genotoxic potential of several well-characterized metal and polymeric nanoparticles with the comet assay. All in vitro studies were harmonized; i.e. NMs were from the same batch, and identical dispersion protocols, exposure time, concentration range, culture conditions, and time-courses were used. As a kidney model, Cos-1 fibroblast-like kidney cells were treated with different concentrations of iron oxide NMs, and cells embedded in minigels (12

  12. Recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers

    DEFF Research Database (Denmark)

    Andersen, Lars Dyrskjøt; Zieger, Karsten; Ørntoft, Torben Falck

    2007-01-01

    individually contributed to the management of the disease. However, the development of high-throughput techniques for simultaneous assessment of a large number of markers has allowed classification of tumors into clinically relevant molecular subgroups beyond those possible by pathological classification. Here, we review the recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers.

  13. High-throughput and computational approaches for diagnostic and prognostic host tuberculosis biomarkers

    Directory of Open Access Journals (Sweden)

    January Weiner

    2017-03-01

    Full Text Available High-throughput techniques strive to identify new biomarkers that will be useful for the diagnosis, treatment, and prevention of tuberculosis (TB). However, their analysis and interpretation pose considerable challenges. Recent developments in the high-throughput detection of host biomarkers in TB are reported in this review.

  14. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA's ToxCast program utilizes a wide variety of high-throughput s...

  15. High-throughput microfluidic line scan imaging for cytological characterization

    Science.gov (United States)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult because of motion blur and because data loss during frame readout interrupts acquisition as cells move through the chamber at relatively high speed. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidic chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X, 1.4 NA oil immersion objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidic chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell, which may subsequently be mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable in throughput to cell counting by flow cytometry, provided efficient examination, including counting, classification, and differentiation, of saliva, blood, and cultured human cancer cells.
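
The line-rate/velocity matching described above reduces to simple arithmetic: each scan line should advance exactly one object-plane pixel between exposures. A sketch (the 5 µm sensor pixel pitch is an assumed value for illustration; the abstract does not state the camera's pitch):

```python
def required_line_rate_hz(flow_velocity_um_s, pixel_pitch_um, magnification):
    """Line rate at which one scan line corresponds to one object-plane
    pixel of travel, so moving cells are neither stretched nor compressed.

    object-plane pixel size = sensor pixel pitch / magnification
    line rate               = flow velocity / object-plane pixel size
    """
    object_pixel_um = pixel_pitch_um / magnification
    return flow_velocity_um_s / object_pixel_um

# Assumed 5 um sensor pixels behind the 40x objective give 0.125 um
# object-plane pixels; cells at 5 mm/s then need a 40 kHz line rate,
# matching the camera's quoted maximum
rate = required_line_rate_hz(5000.0, 5.0, 40.0)
```

Running the rate too high relative to the flow stretches cells along the scan axis; too low compresses them, which is why the abstract stresses tight control of both parameters.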

  16. A high-throughput Arabidopsis reverse genetics system.

    Science.gov (United States)

    Sessions, Allen; Burke, Ellen; Presting, Gernot; Aux, George; McElver, John; Patton, David; Dietrich, Bob; Ho, Patrick; Bacwaden, Johana; Ko, Cynthia; Clarke, Joseph D; Cotton, David; Bullis, David; Snell, Jennifer; Miguel, Trini; Hutchison, Don; Kimmerly, Bill; Mitzel, Theresa; Katagiri, Fumiaki; Glazebrook, Jane; Law, Marc; Goff, Stephen A

    2002-12-01

    A collection of Arabidopsis lines with T-DNA insertions in known sites was generated to increase the efficiency of functional genomics. A high-throughput modified thermal asymmetric interlaced (TAIL)-PCR protocol was developed and used to amplify DNA fragments flanking the T-DNA left borders from approximately 100000 transformed lines. A total of 85108 TAIL-PCR products from 52964 T-DNA lines were sequenced and compared with the Arabidopsis genome to determine the positions of T-DNAs in each line. Predicted T-DNA insertion sites, when mapped, showed a bias against predicted coding sequences. Predicted insertion mutations in genes of interest can be identified using Arabidopsis Gene Index name searches or by BLAST (Basic Local Alignment Search Tool) search. Insertions can be confirmed by simple PCR assays on individual lines. Predicted insertions were confirmed in 257 of 340 lines tested (76%). This resource has been named SAIL (Syngenta Arabidopsis Insertion Library) and is available to the scientific community at www.tmri.org.

  17. Use of High Throughput Screening Data in IARC Monograph ...

    Science.gov (United States)

    Purpose: Evaluation of carcinogenic mechanisms serves a critical role in IARC monograph evaluations, and can lead to “upgrade” or “downgrade” of the carcinogenicity conclusions based on human and animal evidence alone. Three recent IARC monograph Working Groups (110, 112, and 113) pioneered analysis of high throughput in vitro screening data from the U.S. Environmental Protection Agency’s ToxCast program in evaluations of carcinogenic mechanisms. Methods: For monograph 110, ToxCast assay data across multiple nuclear receptors were used to test the hypothesis that PFOA acts exclusively through the PPAR family of receptors, with activity profiles compared to several prototypical nuclear receptor-activating compounds. For monographs 112 and 113, ToxCast assays were systematically evaluated and used as an additional data stream in the overall evaluation of the mechanistic evidence. Specifically, ToxCast assays were mapped to 10 “key characteristics of carcinogens” recently identified by an IARC expert group, and chemicals’ bioactivity profiles were evaluated both in absolute terms (number of relevant assays positive for bioactivity) and relative terms (ranking with respect to other compounds evaluated by IARC, using the ToxPi methodology). Results: PFOA activates multiple nuclear receptors in addition to the PPAR family in the ToxCast assays. ToxCast assays offered substantial coverage for 5 of the 10 “key characteristics,” with the greates

  18. High-throughput optical screening of cellular mechanotransduction

    Science.gov (United States)

    Compton, Jonathan L.; Luo, Justin C.; Ma, Huan; Botvinick, Elliot; Venugopalan, Vasan

    2014-09-01

    We introduce an optical platform for rapid, high-throughput screening of exogenous molecules that affect cellular mechanotransduction. Our method initiates mechanotransduction in adherent cells using single laser-microbeam generated microcavitation bubbles without requiring flow chambers or microfluidics. These microcavitation bubbles expose adherent cells to a microtsunami, a transient microscale burst of hydrodynamic shear stress, which stimulates cells over areas approaching 1 mm². We demonstrate microtsunami-initiated mechanosignalling in primary human endothelial cells. This observed signalling is consistent with G-protein-coupled receptor stimulation, resulting in Ca²⁺ release by the endoplasmic reticulum. Moreover, we demonstrate the dose-dependent modulation of microtsunami-induced Ca²⁺ signalling by introducing a known inhibitor to this pathway. The imaging of Ca²⁺ signalling and its modulation by exogenous molecules demonstrates the capacity to initiate and assess cellular mechanosignalling in real time. We utilize this capability to screen the effects of a set of small molecules on cellular mechanotransduction in 96-well plates using standard imaging cytometry.

  19. Strategies for high-throughput gene cloning and expression.

    Science.gov (United States)

    Dieckman, L J; Hanly, W C; Collart, E R

    2006-01-01

    High-throughput approaches for gene cloning and expression require the development of new, nonstandard tools for use by molecular biologists and biochemists. We have developed and implemented a series of methods that enable the production of expression constructs in 96-well plate format. A screening process is described that facilitates the identification of bacterial clones expressing soluble protein. Application of the solubility screen then provides a plate map that identifies the location of wells containing clones producing soluble proteins. A series of semi-automated methods can then be applied for validation of solubility and production of freezer stocks for the protein production group. This process provides an 80% success rate for the identification of clones producing soluble protein and results in a significant decrease in the level of effort required for the labor-intensive components of validation and preparation of freezer stocks. This process is customized for large-scale structural genomics programs that rely on the production of large amounts of soluble proteins for crystallization trials.

  20. A high-throughput biliverdin assay using infrared fluorescence.

    Science.gov (United States)

    Berlec, Aleš; Štrukelj, Borut

    2014-07-01

Biliverdin is an intermediate of heme degradation with an established role in veterinary clinical diagnostics of liver-related diseases. The need for chromatographic assays has so far prevented its wider use in diagnostic laboratories. The current report describes a simple, fast, high-throughput, and inexpensive assay, based on the interaction of biliverdin with infrared fluorescent protein (iRFP), that yields functional protein exhibiting infrared fluorescence. The assay is linear in the range of 0-10 µmol/l of biliverdin, has a limit of detection of 0.02 µmol/l, and has a limit of quantification of 0.03 µmol/l. The assay is accurate, with relative error less than 0.15, and precise, with coefficient of variation less than 5% in the concentration range of 2-9 µmol/l of biliverdin. More than 95% of biliverdin was recovered from biological samples by simple dimethyl sulfoxide extraction. There was almost no interference by hemin, although bilirubin caused an increase in the apparent biliverdin concentration, probably due to spontaneous oxidation of bilirubin to biliverdin. The newly developed biliverdin assay is appropriate for reliable quantification of large numbers of samples in veterinary medicine.
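The reported linearity and detection limits can be illustrated with a standard calibration workflow. The sketch below fits a hypothetical fluorescence-vs-concentration series and derives ICH-style LOD/LOQ estimates; the data points and the 3.3σ/10σ convention are assumptions for illustration, not taken from the cited assay.

```python
import numpy as np

# Hypothetical biliverdin calibration series: fluorescence signal
# (arbitrary units) vs. concentration (µmol/l). Values are invented.
conc = np.array([0, 1, 2, 4, 6, 8, 10], dtype=float)
signal = np.array([50, 148, 252, 449, 652, 848, 1051], dtype=float)

# Linear calibration fit: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

# Residual standard deviation of the fit (ddof=2: two fitted parameters)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

# ICH-style detection and quantification limits (a common convention,
# not necessarily the one used in the cited report)
lod = 3.3 * residual_sd / slope
loq = 10 * residual_sd / slope
```

For a well-behaved linear assay, LOD and LOQ scale with the calibration residuals divided by the slope, which is why a steep, low-noise calibration line yields sub-micromolar limits.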

  1. Functional approach to high-throughput plant growth analysis

    Science.gov (United States)

    2013-01-01

Method Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which yields a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, by taking the complex leaf overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module, GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA on the cfq mutant plants under fluctuating light reveals the correlation between low photosynthetic rates and small plant area (compared to wild type), which raises the hypothesis that knocking out cfq changes the sensitivity of the energy distribution under fluctuating light conditions to repress leaf growth. Availability HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437
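The growth-modeling step in GMA can be illustrated with a minimal sketch: fitting a nonlinear (here, logistic) model to a per-plant area time series. The synthetic data, the choice of a logistic form, and the parameter values below are assumptions for illustration; HPGA's actual model is not specified in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, A0, r):
    """Logistic growth curve: plant area at time t, with carrying
    capacity K, initial area A0, and relative growth rate r."""
    return K / (1 + (K / A0 - 1) * np.exp(-r * t))

# Synthetic daily plant-area measurements (hypothetical units, mm^2)
t = np.arange(0, 21, dtype=float)
rng = np.random.default_rng(0)
area = logistic(t, 400.0, 20.0, 0.35) + rng.normal(0.0, 5.0, t.size)

# Fit the nonlinear growth model, then evaluate the smooth growth curve
(K, A0, r), _ = curve_fit(logistic, t, area, p0=[300.0, 10.0, 0.2])
fitted = logistic(t, K, A0, r)
```

Once each plant has a fitted curve, downstream functional data analysis can compare curve shapes (e.g. growth rates over time) across genotypes rather than single endpoint measurements.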

  2. Mouse eye enucleation for remote high-throughput phenotyping.

    Science.gov (United States)

    Mahajan, Vinit B; Skeie, Jessica M; Assefnia, Amir H; Mahajan, Maryann; Tsang, Stephen H

    2011-11-19

    The mouse eye is an important genetic model for the translational study of human ophthalmic disease. Blinding diseases in humans, such as macular degeneration, photoreceptor degeneration, cataract, glaucoma, retinoblastoma, and diabetic retinopathy have been recapitulated in transgenic mice.(1-5) Most transgenic and knockout mice have been generated by laboratories to study non-ophthalmic diseases, but genetic conservation between organ systems suggests that many of the same genes may also play a role in ocular development and disease. Hence, these mice represent an important resource for discovering new genotype-phenotype correlations in the eye. Because these mice are scattered across the globe, it is difficult to acquire, maintain, and phenotype them in an efficient, cost-effective manner. Thus, most high-throughput ophthalmic phenotyping screens are restricted to a few locations that require on-site, ophthalmic expertise to examine eyes in live mice. (6-9) An alternative approach developed by our laboratory is a method for remote tissue-acquisition that can be used in large or small-scale surveys of transgenic mouse eyes. Standardized procedures for video-based surgical skill transfer, tissue fixation, and shipping allow any lab to collect whole eyes from mutant animals and send them for molecular and morphological phenotyping. In this video article, we present techniques to enucleate and transfer both unfixed and perfusion fixed mouse eyes for remote phenotyping analyses.

  3. High-throughput membrane surface modification to control NOM fouling.

    Science.gov (United States)

    Zhou, Mingyan; Liu, Hongwei; Kilduff, James E; Langer, Robert; Anderson, Daniel G; Belfort, Georges

    2009-05-15

A novel method for synthesis and screening of fouling-resistant membrane surfaces was developed by combining a high-throughput platform (HTP) approach with photoinduced graft polymerization (PGP) for facile modification of commercial poly(aryl sulfone) membranes. This method is an inexpensive, fast, simple, reproducible, and scalable approach to identify fouling-resistant surfaces appropriate for a specific feed. In this research, natural organic matter (NOM)-resistant surfaces were synthesized and identified from a library of 66 monomers. Surfaces were prepared via graft polymerization onto poly(ether sulfone) (PES) membranes and were evaluated using an assay involving NOM adsorption, followed by pressure-driven filtration. In this work, new and previously tested low-fouling surfaces for NOM are identified, and their ability to mitigate NOM and protein (bovine serum albumin) fouling is compared. The best-performing monomers were the zwitterion [2-(methacryloyloxy)ethyl]dimethyl-(3-sulfopropyl)ammonium hydroxide and diacetone acrylamide, a neutral monomer containing an amide group. Other excellent surfaces were synthesized from amides, amines, basic monomers, and long-chain poly(ethylene) glycols. Bench-scale studies conducted for selected monomers verified the scalability of HTP-PGP results. The results and the synthesis and screening method presented here offer new opportunities for choosing new membrane chemistries that minimize NOM fouling.

  4. Assessing the utility and limitations of high throughput virtual screening

    Directory of Open Access Journals (Sweden)

    Paul Daniel Phillips

    2016-05-01

Full Text Available Due to low cost, speed, and unmatched ability to explore large numbers of compounds, high throughput virtual screening and molecular docking engines have become widely utilized by computational scientists. It is generally accepted that docking engines, such as AutoDock, produce reliable qualitative results for ligand-macromolecular receptor binding, and molecular docking results are commonly reported in the literature in the absence of complementary wet lab experimental data. In this investigation, three variants of the sixteen amino acid peptide α-conotoxin MII were docked to a homology model of the α3β2-nicotinic acetylcholine receptor. DockoMatic version 2.0 was used to perform a virtual screen of each peptide ligand against the receptor for ten docking trials consisting of 100 AutoDock cycles per trial. The results were analyzed for both variation in the calculated binding energy obtained from AutoDock and the orientation of the bound peptide within the receptor. The results show that, while no clear correlation exists between a consistent ligand binding pose and the calculated binding energy, AutoDock is able to determine a consistent positioning of the bound peptide in the majority of trials when at least ten trials were evaluated.
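The trial-to-trial variation in calculated binding energy that the study examines can be summarized as in the sketch below; the energy values are invented for illustration and are not results from the paper.

```python
import statistics

# Hypothetical lowest binding energies (kcal/mol) from ten independent
# docking trials of one peptide variant. Values are illustrative only.
trials = [-8.1, -7.9, -9.4, -8.0, -8.2, -7.8, -9.1, -8.3, -8.0, -7.7]

mean_e = statistics.mean(trials)    # central tendency across trials
spread = statistics.stdev(trials)   # trial-to-trial variability

# A spread that is large relative to differences between ligands signals
# that the calculated binding energy alone is an unreliable ranking
# criterion, which is why pose consistency is assessed separately.
```

Repeating this per ligand variant, and separately tallying how often the lowest-energy pose falls in the same orientation cluster, reproduces the two axes of analysis described in the abstract.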

  5. Savant: genome browser for high-throughput sequencing data.

    Science.gov (United States)

    Fiume, Marc; Williams, Vanessa; Brook, Andrew; Brudno, Michael

    2010-08-15

    The advent of high-throughput sequencing (HTS) technologies has made it affordable to sequence many individuals' genomes. Simultaneously the computational analysis of the large volumes of data generated by the new sequencing machines remains a challenge. While a plethora of tools are available to map the resulting reads to a reference genome, and to conduct primary analysis of the mappings, it is often necessary to visually examine the results and underlying data to confirm predictions and understand the functional effects, especially in the context of other datasets. We introduce Savant, the Sequence Annotation, Visualization and ANalysis Tool, a desktop visualization and analysis browser for genomic data. Savant was developed for visualizing and analyzing HTS data, with special care taken to enable dynamic visualization in the presence of gigabases of genomic reads and references the size of the human genome. Savant supports the visualization of genome-based sequence, point, interval and continuous datasets, and multiple visualization modes that enable easy identification of genomic variants (including single nucleotide polymorphisms, structural and copy number variants), and functional genomic information (e.g. peaks in ChIP-seq data) in the context of genomic annotations. Savant is freely available at http://compbio.cs.toronto.edu/savant.

  6. Generation of RNAi Libraries for High-Throughput Screens

    Directory of Open Access Journals (Sweden)

    Julie Clark

    2006-01-01

Full Text Available The completion of genome sequencing for several organisms has created a great demand for genomic tools that can systematically analyze the growing wealth of data. In contrast to the classical reverse genetics approach of creating specific knockout cell lines or animals, which is time-consuming and expensive, RNA-mediated interference (RNAi) has emerged as a fast, simple, and cost-effective technique for gene knockdown at large scale. Since its discovery as a gene silencing response to double-stranded RNA (dsRNA) with homology to endogenous genes in Caenorhabditis elegans (C. elegans), RNAi technology has been adapted to various high-throughput screens (HTS) for genome-wide loss-of-function (LOF) analysis. Biochemical insights into the endogenous mechanism of RNAi have led to advances in RNAi methodology, including RNAi molecule synthesis, delivery, and sequence design. In this article, we will briefly review these various RNAi library designs and discuss the benefits and drawbacks of each library strategy.

  7. The Utilization of Formalin Fixed-Paraffin-Embedded Specimens in High Throughput Genomic Studies

    Directory of Open Access Journals (Sweden)

    Pan Zhang

    2017-01-01

    Full Text Available High throughput genomic assays empower us to study the entire human genome in short time with reasonable cost. Formalin fixed-paraffin-embedded (FFPE tissue processing remains the most economical approach for longitudinal tissue specimen storage. Therefore, the ability to apply high throughput genomic applications to FFPE specimens can expand clinical assays and discovery. Many studies have measured the accuracy and repeatability of data generated from FFPE specimens using high throughput genomic assays. Together, these studies demonstrate feasibility and provide crucial guidance for future studies using FFPE specimens. Here, we summarize the findings of these studies and discuss the limitations of high throughput data generated from FFPE specimens across several platforms that include microarray, high throughput sequencing, and NanoString.

  8. High-Throughput Yeast-Based Reporter Assay to Identify Compounds with Anti-inflammatory Potential.

    Science.gov (United States)

    Garcia, G; Santos, C Nunes do; Menezes, R

    2016-01-01

The association between altered proteostasis and inflammatory responses has been increasingly recognized; therefore, the identification and characterization of novel compounds with anti-inflammatory potential will certainly have a great impact on the therapeutics of protein-misfolding diseases such as degenerative disorders. Although cell-based screens are powerful approaches to identify potential therapeutic compounds, establishing robust inflammation models amenable to high-throughput screening remains a challenge. To bridge this gap, we have exploited the use of yeasts as a platform to identify lead compounds with anti-inflammatory properties. The yeast cell model described here relies on the high degree of homology between the mammalian and yeast Ca(2+)/calcineurin pathways, which converge on the activation of the orthologous proteins NFAT and Crz1, respectively. It consists of a recombinant yeast strain encoding the lacZ gene under the control of Crz1-recognition elements to facilitate the identification of compounds interfering with Crz1 activation through easy monitoring of β-galactosidase activity. Here, we describe in detail a protocol optimized for high-throughput screening of compounds with potential anti-inflammatory activity, as well as a protocol to validate positive hits using an alternative β-galactosidase substrate.

  9. High-throughput combinatorial chemical bath deposition: The case of doping Cu (In, Ga) Se film with antimony

    Science.gov (United States)

    Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong

    2018-01-01

Conventional wet-process methods for designing and preparing thin films remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is ideally suited to preparing high-throughput combinatorial material libraries with low decomposition temperatures and high water- or oxygen-sensitivity at relatively high temperature. To validate this system, a Cu(In, Ga)Se (CIGS) thin film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can be used to systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties, through accurately monitoring the doping content and material composition. According to the characterization results, a Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model has been put forward. Beyond the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin film material systems.

  10. Versatile High Throughput Microarray Analysis for Marine Glycobiology

    DEFF Research Database (Denmark)

    Asunción Salmeán, Armando

    Algal cell walls are a type of extracellular matrix mainly made of polysaccharides, highly diverse, complex and heterogeneous. They possess unique and original polymers in their composition including several polysaccharides with industrial relevance such as agar, agarose, carrageenans (red algae...... decaying tissue as a nutrient source. We successfully optimized the method and characterized four new carbohydrate-recognizing proteins (two carrageenan binders, one arabynoxylan binder and one xyloglucan binder), which may be used as analytical probes for polysaccharide research. Finally, we also wanted...

  11. High-throughput flow cytometry data normalization for clinical trials.

    Science.gov (United States)

    Finak, Greg; Jiang, Wenxin; Krouse, Kevin; Wei, Chungwen; Sanz, Ignacio; Phippard, Deborah; Asare, Adam; De Rosa, Stephen C; Self, Steve; Gottardo, Raphael

    2014-03-01

Flow cytometry datasets from clinical trials are very large and usually highly standardized, focusing on endpoints that are well defined a priori. Staining variability of individual markers is not uncommon and complicates manual gating, requiring the analyst to adapt gates for each sample, which is unwieldy for large datasets. It can lead to unreliable measurements, especially if a template-gating approach is used without further correction to the gates. In this article, a computational framework is presented for normalizing the fluorescence intensity of multiple markers in specific cell populations across samples that is suitable for high-throughput processing of large clinical trial datasets. Previous approaches to normalization have been global, applied to all cells or to data with debris removed; they provided no mechanism to handle specific cell subsets. This approach integrates tightly with the gating process, so that normalization is performed during gating and is local to the specific cell subsets exhibiting variability. This improves peak alignment and the performance of the algorithm. The performance of this algorithm is demonstrated on two clinical trial datasets from the HIV Vaccine Trials Network (HVTN) and the Immune Tolerance Network (ITN). In the ITN dataset we show that local normalization combined with template gating can account for sample-to-sample variability as effectively as manual gating. In the HVTN dataset, it is shown that local normalization mitigates false-positive vaccine response calls in an intracellular cytokine staining assay. In both datasets, local normalization performs better than global normalization. The normalization framework allows the use of template gates even in the presence of sample-to-sample staining variability, mitigates the subjectivity and bias of manual gating, and decreases the time necessary to analyze large datasets. © 2013 International Society for Advancement of Cytometry.
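The peak-alignment idea behind such normalization can be sketched in one dimension: shift each sample's intensities so its density peak lands on a reference landmark. This is a simplified illustration, assuming synthetic data; the published framework applies the correction per gated cell subset ("local") rather than to whole samples.

```python
import numpy as np

def landmark_shift(sample, reference_peak, bins=64):
    """Shift a sample's fluorescence intensities so that the sample's
    density peak (mode) aligns with a reference landmark intensity."""
    hist, edges = np.histogram(sample, bins=bins)
    i = int(np.argmax(hist))
    peak = 0.5 * (edges[i] + edges[i + 1])  # center of the modal bin
    return sample + (reference_peak - peak)

rng = np.random.default_rng(1)
# Three hypothetical samples whose stained population drifts in intensity
samples = [rng.normal(loc, 0.3, 20000) for loc in (2.0, 2.4, 1.7)]
aligned = [landmark_shift(s, reference_peak=2.0) for s in samples]
```

After alignment, a single template gate placed around the reference landmark captures the same population in every sample, which is the property that makes template gating robust to staining variability.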

  12. Mining Chemical Activity Status from High-Throughput Screening Assays.

    Science.gov (United States)

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B

    2015-01-01

High-throughput screening (HTS) experiments provide a valuable resource that reports the biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than other methods, we carried out a comprehensive comparison with several other methods, evaluating them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.
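DRAMOTE's core ingredient is a modified minority oversampling technique. A generic SMOTE-style baseline, which interpolates new minority points between nearest neighbours, can be sketched as below; this illustrates the family of methods, not the published modification itself, and the data are synthetic.

```python
import numpy as np

def smote_like(X_min, n_new, k=5, seed=0):
    """Generate synthetic minority-class samples by interpolating between
    a random minority point and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]   # nearest neighbours, self excluded
        j = rng.choice(nbrs)
        lam = rng.random()              # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

# Few "active" compounds in feature space (synthetic descriptors)
actives = np.random.default_rng(2).normal(0, 1, (20, 4))
synthetic = smote_like(actives, n_new=60)
```

Oversampling the active class before training mitigates the class imbalance typical of HTS data, where actives are often a tiny fraction of the screened library.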

  13. Mining Chemical Activity Status from High-Throughput Screening Assays

    KAUST Repository

    Soufan, Othman

    2015-12-14

High-throughput screening (HTS) experiments provide a valuable resource that reports the biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than other methods, we carried out a comprehensive comparison with several other methods, evaluating them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.

  14. High-Throughput Neuroimaging-Genetics Computational Infrastructure

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2014-04-01

Full Text Available Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate and disseminate novel scientific methods, computational resources and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval and aggregation. Computational processing involves the necessary software, hardware and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical and phenotypic data and meta-data. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. A unique feature of this architecture is the Pipeline environment, which integrates data management, processing, transfer and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services. These pipeline workflows are represented as portable XML objects which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer's and Parkinson's data, we provide several examples of translational applications using this infrastructure.

  15. An extractive removal step optimized for a high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis in conifer tree rings.

    Science.gov (United States)

    Lin, Wen; Noormets, Asko; King, John S; Sun, Ge; McNulty, Steve; Domec, Jean-Christophe; Cernusak, Lucas

    2017-01-31

Stable isotope ratios (δ13C and δ18O) of tree-ring α-cellulose are important tools in paleoclimatology, ecology, plant physiology and genetics. The Multiple Sample Isolation System for Solids (MSISS) was a major advance in tree-ring α-cellulose extraction methods, offering greater throughput and reduced labor input compared to traditional alternatives. However, the usability of the method for resinous conifer species may be limited by the need to remove extractives from some conifer species in a separate pretreatment step. Here we test the necessity of pretreatment for α-cellulose extraction in loblolly pine (Pinus taeda L.), and the efficiency of a modified acetone-based ambient-temperature step for the removal of extractives (i) in loblolly pine from five geographic locations representing its natural range in the southeastern USA, and (ii) in five other common coniferous species (black spruce (Picea mariana Mill.), Fraser fir (Abies fraseri (Pursh) Poir.), Douglas fir (Pseudotsuga menziesii (Mirb.) Franco), Norway spruce (Picea abies (L.) Karst) and ponderosa pine (Pinus ponderosa D.)) with contrasting extractive profiles. The differences in δ13C values between the new and traditional pretreatment methods were within the precision of the isotope ratio mass spectrometry method used (±0.2‰), and the differences in δ18O values were not statistically significant. Although some unanticipated results were observed in Fraser fir, the new ambient-temperature technique was deemed as effective as the more labor-intensive and toxic traditional pretreatment protocol. The proposed technique requires a separate acetone-inert multiport system similar to the MSISS, and the execution of both pretreatment and main extraction steps allows for the simultaneous treatment of up to several hundred microsamples from resinous softwood, while the need for additional labor remains minimal.

  16. Robo-Lector – a novel platform for automated high-throughput cultivations in microtiter plates with high information content

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-08-01

culture plate, where subsequently similar growth kinetics could be obtained. Conclusion The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescent protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization.

  17. High throughput phenotyping of left and right ventricular cardiomyopathy in calcineurin transgene mice.

    Science.gov (United States)

    Moreth, Kristin; Afonso, Luciana Caminha; Fuchs, Helmut; Gailus-Durner, Valérie; Katus, Hugo A; Bekeredjian, Raffi; Lehman, Lorenz; Hrabě de Angelis, Martin

    2015-04-01

Consistent protocols for the assessment of diastolic and systolic cardiac function, needed to assure the comparability of existing data on preclinical models, are missing. Calcineurin transgenic (CN) mice are a preclinical model for hypertrophic and failing hearts. We aimed at evaluating left and right ventricular structural and functional remodeling in CN hearts with an optimized phenotyping protocol. We developed a protocol using techniques and indices comparable to those from human diagnostics for comprehensive in vivo cardiac screening, using high-frequency echocardiography, Doppler, electrocardiography and cardiac magnetic resonance (CMR) techniques. We measured left and right ventricular dimensions and function, pulmonary and mitral flow patterns and the heart's electrophysiology non-invasively in a high-throughput manner. Phenotyping CN hearts revealed the occurrence of new symptoms and allowed insights into the diverse phenotype of hypertrophic failing hearts.

  18. High-Throughput Immunogenetics for Clinical and Research Applications in Immunohematology: Potential and Challenges

    NARCIS (Netherlands)

    Langerak, A.W.; Bruggemann, M.; Davi, F.; Darzentas, N.; Dongen, J.J. van; Gonzalez, D.; Cazzaniga, G.; Giudicelli, V.; Lefranc, M.P.; Giraud, M.; Macintyre, E.A.; Hummel, M.; Pott, C.; Groenen, P.J.T.A.; Stamatopoulos, K.

    2017-01-01

    Analysis and interpretation of Ig and TCR gene rearrangements in the conventional, low-throughput way have their limitations in terms of resolution, coverage, and biases. With the advent of high-throughput, next-generation sequencing (NGS) technologies, a deeper analysis of Ig and/or TCR (IG/TR)

  19. A high-throughput fluorescence polarization assay for inhibitors of gyrase B.

    Science.gov (United States)

    Glaser, Bryan T; Malerich, Jeremiah P; Duellman, Sarah J; Fong, Julie; Hutson, Christopher; Fine, Richard M; Keblansky, Boris; Tang, Mary J; Madrid, Peter B

    2011-02-01

    DNA gyrase, a type II topoisomerase that introduces negative supercoils into DNA, is a validated antibacterial drug target. The holoenzyme is composed of 2 subunits, gyrase A (GyrA) and gyrase B (GyrB), which form a functional A(2)B(2) heterotetramer required for bacterial viability. A novel fluorescence polarization (FP) assay has been developed and optimized to detect inhibitors that bind to the adenosine triphosphate (ATP) binding domain of GyrB. Guided by the crystal structure of the natural product novobiocin bound to GyrB, a novel novobiocin-Texas Red probe (Novo-TRX) was designed and synthesized for use in a high-throughput FP assay. The binding kinetics of the interaction of Novo-TRX with GyrB from Francisella tularensis has been characterized, as well as the effect of common buffer additives on the interaction. The assay was developed into a 21-µL, 384-well assay format and has been validated for use in high-throughput screening against a collection of Food and Drug Administration-approved compounds. The assay performed with an average Z' factor of 0.80 and was able to identify GyrB inhibitors from a screening library.
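The reported Z' factor of 0.80 is the standard screening-quality statistic computed from positive and negative control wells, Z' = 1 − 3(σ_p + σ_n)/|μ_p − μ_n|. The sketch below computes it from hypothetical fluorescence polarization control readings; the mP values are illustrative, not taken from the paper.

```python
import statistics

def z_prime(pos, neg):
    """Z' factor for assay quality: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 conventionally indicate an assay robust enough for HTS."""
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

# Illustrative FP control wells (mP units, hypothetical values)
bound = [180, 182, 179, 181, 183, 180]   # Novo-TRX probe bound to GyrB
displaced = [60, 62, 59, 61, 63, 58]     # probe displaced by excess novobiocin
z = z_prime(bound, displaced)
```

Because Z' shrinks as control variability grows relative to the assay window, a wide polarization difference between bound and displaced probe is what makes a 384-well FP assay screenable.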

  20. Mining the structural genomics pipeline: identification of protein properties that affect high-throughput experimental analysis.

    Science.gov (United States)

    Goh, Chern-Sing; Lan, Ning; Douglas, Shawn M; Wu, Baolin; Echols, Nathaniel; Smith, Andrew; Milburn, Duncan; Montelione, Gaetano T; Zhao, Hongyu; Gerstein, Mark

    2004-02-06

Structural genomics projects represent major undertakings that will change our understanding of proteins. They generate unique datasets that, for the first time, present a standardized view of proteins in terms of their physical and chemical properties. By analyzing these datasets here, we are able to discover correlations between a protein's characteristics and its progress through each stage of the structural genomics pipeline, from cloning, expression, purification, and ultimately to structural determination. First, we use tree-based analyses (decision trees and random forest algorithms) to discover the most significant protein features that influence a protein's amenability to high-throughput experimentation. Based on this, we identify potential bottlenecks in various stages of the structural genomics process through specialized "pipeline schematics". We find that the properties of a protein that are most significant are: (i) whether it is conserved across many organisms; (ii) the percentage composition of charged residues; (iii) the occurrence of hydrophobic patches; (iv) the number of binding partners it has; and (v) its length. Conversely, a number of other properties that might have been thought to be important, such as nuclear localization signals, are not significant. Thus, using our tree-based analyses, we are able to identify combinations of features that best differentiate the small group of proteins for which a structure has been determined from all the currently selected targets. This information may prove useful in optimizing high-throughput experimentation. Further information is available from http://mining.nesg.org/.
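The tree-based feature ranking can be sketched with a random forest over synthetic protein descriptors. Everything below, the data, the label rule, and the feature values, is fabricated for illustration; only the feature list mirrors the properties named in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 500
# Hypothetical per-protein features, mirroring those found significant:
# conservation score, fraction of charged residues, hydrophobic-patch
# count, number of binding partners, sequence length.
X = np.column_stack([
    rng.random(n),                 # 0: conservation
    rng.normal(0.25, 0.05, n),     # 1: charged fraction
    rng.poisson(2, n),             # 2: hydrophobic patches
    rng.poisson(3, n),             # 3: binding partners
    rng.integers(50, 800, n),      # 4: length
])
# Toy label: "structure solved" is made more likely for conserved proteins
# with few hydrophobic patches (a purely synthetic relationship).
p = 1 / (1 + np.exp(-(3 * X[:, 0] - 0.8 * X[:, 2])))
y = rng.random(n) < p

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Features ranked by impurity-based importance, most informative first
ranking = np.argsort(clf.feature_importances_)[::-1]
```

On real pipeline data the same ranking step surfaces which protein properties predict success at each stage, which is what drives the "pipeline schematic" bottleneck analysis.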

  1. High-throughput testing of terpenoid biosynthesis candidate genes using transient expression in Nicotiana benthamiana.

    Science.gov (United States)

    Bach, Søren Spanner; Bassard, Jean-Étienne; Andersen-Ranberg, Johan; Møldrup, Morten Emil; Simonsen, Henrik Toft; Hamberger, Björn

    2014-01-01

    To respond to the rapidly growing number of genes putatively involved in terpenoid metabolism, a robust high-throughput platform for functional testing is needed. An in planta expression system offers several advantages, such as the capacity to produce correctly folded and active enzymes localized to their native compartments, unlike microbial or prokaryotic expression systems. Two inherent drawbacks of plant-based expression systems, the time-consuming generation of transgenic plant lines and challenging gene stacking, can be circumvented by transient expression in Nicotiana benthamiana. In this chapter we describe an expression platform for rapid testing of candidate terpenoid biosynthetic genes based on Agrobacterium-mediated gene expression in N. benthamiana leaves. Simultaneous expression of multiple genes is facilitated by co-infiltration of leaves with several engineered Agrobacterium strains, possibly making this the fastest and most convenient system for the assembly of plant terpenoid biosynthetic routes. Tools for cloning of expression plasmids, N. benthamiana culturing, Agrobacterium preparation, leaf infiltration, metabolite extraction, and automated GC-MS data mining are provided. With all steps optimized for high throughput, this in planta expression platform is particularly suited for testing large panels of candidate genes in all possible permutations.

  2. High-throughput screening of PLGA thin films utilizing hydrophobic fluorescent dyes for hydrophobic drug compounds.

    Science.gov (United States)

    Steele, Terry W J; Huang, Charlotte L; Kumar, Saranya; Widjaja, Effendi; Chiang Boey, Freddy Yin; Loo, Joachim S C; Venkatraman, Subbu S

    2011-10-01

    Hydrophobic, antirestenotic drugs such as paclitaxel (PCTX) and rapamycin are often incorporated into thin film coatings for local delivery using implantable medical devices and polymers such as drug-eluting stents and balloons. Selecting the optimum coating formulation through screening the release profile of these drugs in thin films is time consuming and labor intensive. We describe here a high-throughput assay utilizing three model hydrophobic fluorescent compounds: fluorescein diacetate (FDAc), coumarin-6, and rhodamine 6G that were incorporated into poly(d,l-lactide-co-glycolide) (PLGA) and PLGA-polyethylene glycol films. Raman microscopy determined the hydrophobic fluorescent dye distribution within the PLGA thin films in comparison with that of PCTX. Their subsequent release was screened in a high-throughput assay and directly compared with HPLC quantification of PCTX release. It was observed that PCTX controlled-release kinetics could be mimicked by a hydrophobic dye that had similar octanol-water partition coefficient values and homogeneous dissolution in a PLGA matrix as the drug. In particular, FDAc was found to be the optimal hydrophobic dye at modeling the burst release as well as the total amount of PCTX released over a period of 30 days. Copyright © 2011 Wiley-Liss, Inc.

  3. Formation of Linear Gradient of Antibiotics on Microfluidic Chips for High-throughput Antibiotic Susceptibility Testing

    Science.gov (United States)

    Kim, Seunggyu; Lee, Seokhun; Jeon, Jessie S.

    2017-11-01

    To determine the most effective antimicrobial treatments of infectious pathogens, a high-throughput antibiotic susceptibility test (AST) is critically required. However, the conventional AST requires at least 16 hours to reach the minimum observable population. Therefore, we developed a microfluidic system that allows maintenance of a linear antibiotic concentration and measurement of local bacterial density. Based on the Stokes-Einstein equation, the flow rate in the microchannel was optimized so that linearization was achieved within 10 minutes, taking into account the diffusion coefficient of each antibiotic in the agar gel. As a result, the minimum inhibitory concentration (MIC) of each antibiotic against P. aeruginosa could be determined as early as 6 hours after application of the linear antibiotic concentration gradient. In conclusion, our system proved the efficacy of a high-throughput AST platform through MIC comparison with the Clinical and Laboratory Standards Institute (CLSI) range of antibiotics. This work was supported by the Climate Change Research Hub (Grant No. N11170060) of the KAIST and by the Brain Korea 21 Plus project.
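    The Stokes-Einstein relation mentioned above gives the diffusion coefficient that sets how quickly a concentration gradient forms. A minimal sketch follows; the molecular radius, viscosity, and channel width are illustrative values chosen here, not figures from the paper.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_d(temp_k, viscosity_pa_s, radius_m):
    """Diffusion coefficient D = kT / (6*pi*eta*r) for a sphere in a fluid."""
    return K_B * temp_k / (6 * math.pi * viscosity_pa_s * radius_m)

def diffusion_time(length_m, d):
    """Characteristic 1-D diffusion time over a distance L: t ~ L^2 / (2D)."""
    return length_m ** 2 / (2 * d)

# Hypothetical small antibiotic-like molecule (r ~ 0.5 nm) in water at 25 C,
# diffusing across a 100 um channel.
d = stokes_einstein_d(298.15, 8.9e-4, 0.5e-9)  # roughly 5e-10 m^2/s
t = diffusion_time(100e-6, d)                  # on the order of 10 s
```

    The point of such an estimate in device design is that halving the channel width cuts the equilibration time fourfold, which is how geometry and flow rate can be traded off to hit a target linearization time.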

  4. Profiling the main cell wall polysaccharides of grapevine leaves using high-throughput and fractionation methods.

    Science.gov (United States)

    Moore, John P; Nguema-Ona, Eric; Fangel, Jonatan U; Willats, William G T; Hugo, Annatjie; Vivier, Melané A

    2014-01-01

    Vitis species include Vitis vinifera, the domesticated grapevine used for wine and grape production and considered the world's most important fruit crop. A cell wall preparation, isolated from fully expanded, photosynthetically active leaves, was fractionated with chemical and enzymatic reagents, and the various extracts obtained were assayed using high-throughput cell wall profiling tools according to a previously optimized and validated workflow. The bulk of the homogalacturonan-rich pectin present was efficiently extracted using CDTA treatment, whereas over half of the grapevine leaf cell wall consisted of vascular veins, comprised of xylans and cellulose. The main hemicellulose component was found to be xyloglucan, and an enzymatic oligosaccharide fingerprinting approach was used to analyze the grapevine leaf xyloglucan fraction. When Paenibacillus sp. xyloglucanase was applied, the main subunits released were XXFG and XLFG, whereas the less specific Trichoderma reesei EGII was also able to release the XXXG motif as well as other oligomers, likely of mannan and xylan origin. This latter enzyme would thus be useful for screening for xyloglucan-, xylan- and mannan-linked cell wall alterations in laboratory and field grapevine populations. This methodology is well suited to high-throughput cell wall profiling of grapevine mutant and transgenic plants for investigating a range of biological processes, specifically plant disease studies and plant-pathogen interactions, where the cell wall plays a crucial role. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    Energy Technology Data Exchange (ETDEWEB)

    Ni, Jing [Iowa State Univ., Ames, IA (United States)

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores the surface enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman band, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, a simultaneous detection of a dual-analyte immunoassay was demonstrated. 
The use of this scheme for quantitative analysis was also studied, and preliminary dose-response curves from an immunoassay of a

  6. Missing call bias in high-throughput genotyping

    Directory of Open Access Journals (Sweden)

    Lin Rong

    2009-03-01

    Background The advent of high-throughput and cost-effective genotyping platforms made genome-wide association (GWA) studies a reality. While the primary focus has been on reducing genotyping error, the problems associated with missing calls are largely overlooked. Results To probe the effect of missing calls on GWA studies, we demonstrated experimentally the prevalence and severity of the problem of missing call bias (MCB) in four genotyping technologies (Affymetrix 500 K SNP array, SNPstream, TaqMan, and Illumina Beadlab). Subsequently, we showed theoretically that MCB leads to biased conclusions in the subsequent analyses, including estimation of allele/genotype frequencies, the measurement of HWE, and association tests under various modes of inheritance. We showed that MCB usually leads to power loss in association tests, and that this power loss is greater than what would result from an equivalent, unbiased reduction in sample size. We also compared the bias in allele frequency estimation and in association tests introduced by MCB with that introduced by genotyping errors. Our results illustrated that in most cases the bias can be greatly reduced by increasing the call rate at the cost of a higher genotyping error rate. Conclusion The commonly used 'no-call' procedure for observations of borderline quality should be modified. If the objective is to minimize the bias, the cut-off for call rate and that for genotyping error rate should be properly coupled in GWA studies. We suggest that the current QC cut-off for call rate should be increased, while the cut-off for genotyping error rate can be relaxed accordingly.
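    The bias mechanism can be illustrated with a toy simulation: if heterozygote calls are dropped preferentially (a hypothetical missingness model chosen purely for illustration), the allele frequency estimated from the remaining calls is pulled away from the true value.

```python
import random

def estimate_allele_freq(genotypes):
    """Minor-allele frequency from genotypes coded as 0/1/2 allele copies."""
    return sum(genotypes) / (2 * len(genotypes))

def simulate_mcb(p, n, het_missing_rate, seed=0):
    """Return (estimate from all genotypes, estimate after biased no-calls).

    Toy model: heterozygous samples fail to call at `het_missing_rate`,
    homozygous samples always call.
    """
    rng = random.Random(seed)
    genotypes = [(rng.random() < p) + (rng.random() < p) for _ in range(n)]
    called = [g for g in genotypes
              if not (g == 1 and rng.random() < het_missing_rate)]
    return estimate_allele_freq(genotypes), estimate_allele_freq(called)

full_est, biased_est = simulate_mcb(p=0.3, n=20000, het_missing_rate=0.5)
# biased_est systematically underestimates the true frequency of 0.3
```

    With half the heterozygotes missing, the expected estimate drops from 0.30 to about 0.25, a deterministic shift that no increase in sample size can remove, which is the paper's core point.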

  7. A high-throughput sample preparation method for cellular proteomics using 96-well filter plates.

    Science.gov (United States)

    Switzar, Linda; van Angeren, Jordy; Pinkse, Martijn; Kool, Jeroen; Niessen, Wilfried M A

    2013-10-01

    A high-throughput sample preparation protocol based on the use of 96-well molecular weight cutoff (MWCO) filter plates was developed for shotgun proteomics of cell lysates. All sample preparation steps, including cell lysis, buffer exchange, protein denaturation, reduction, alkylation and proteolytic digestion, are performed in a 96-well plate format, making the platform extremely well suited for processing large numbers of samples and directly compatible with functional assays for cellular proteomics. In addition, the use of a single plate for all sample preparation steps following cell lysis reduces potential sample losses and allows for automation. The MWCO filter also enables sample concentration, thereby increasing the overall sensitivity, and the implementation of washing steps involving organic solvents, for example, to remove cell membrane constituents. The optimized protocol allowed for higher throughput with improved sensitivity in terms of the number of identified cellular proteins when compared to an established protocol employing gel-filtration columns. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. High-throughput DNA Stretching in Continuous Elongational Flow for Genome Sequence Scanning

    Science.gov (United States)

    Meltzer, Robert; Griffis, Joshua; Safranovitch, Mikhail; Malkin, Gene; Cameron, Douglas

    2014-03-01

    Genome Sequence Scanning (GSS) identifies and compares bacterial genomes by stretching long (60 - 300 kb) genomic DNA restriction fragments and scanning for site-selective fluorescent probes. Practical application of GSS requires: 1) high throughput data acquisition, 2) efficient DNA stretching, 3) reproducible DNA elasticity in the presence of intercalating fluorescent dyes. GSS utilizes a pseudo-two-dimensional micron-scale funnel with convergent sheathing flows to stretch one molecule at a time in continuous elongational flow and center the DNA stream over diffraction-limited confocal laser excitation spots. Funnel geometry has been optimized to maximize throughput of DNA within the desired length range (>10 million nucleobases per second). A constant-strain detection channel maximizes stretching efficiency by applying a constant parabolic tension profile to each molecule, minimizing relaxation and flow-induced tumbling. The effect of intercalator on DNA elasticity is experimentally controlled by reacting one molecule of DNA at a time in convergent sheathing flows of the dye. Derivations of accelerating flow and non-linear tension distribution permit alignment of detected fluorescence traces to theoretical templates derived from whole-genome sequence data.

  9. Space Link Extension (SLE) Emulation for High-Throughput Network Communication

    Science.gov (United States)

    Murawski, Robert W.; Tchorowski, Nicole; Golden, Bert

    2014-01-01

    As the data rate requirements for space communications increase, significant stress is placed not only on the wireless satellite communication links, but also on the ground networks which forward data from end-users to remote ground stations. These wide area network (WAN) connections add delay and jitter to the end-to-end satellite communication link, effects which can have significant impacts on the wireless communication link. It is imperative that any ground communication protocol can react to these effects such that the ground network does not become a bottleneck in the communication path to the satellite. In this paper, we present our SCENIC Emulation Lab testbed, which was developed to test the CCSDS SLE protocol implementations proposed for use on future NASA communication networks. Our results show that in the presence of realistic levels of network delay, high-throughput SLE communication links can experience significant data rate throttling. Based on our observations, we present some insight into why this data throttling happens, and trace the probable issue back to the non-optimal blocking communication supported by the CCSDS SLE API recommended practices. These issues were also presented to the SLE implementation developers, who, based on our reports, produced a new SLE release that, as we show, fixes the SLE blocking issue and greatly improves protocol throughput. In this paper, we also discuss future developments for our end-to-end emulation lab and how these improvements can be used to develop and test future space communication technologies.

  10. A high-throughput microfluidic approach for 1000-fold leukocyte reduction of platelet-rich plasma

    Science.gov (United States)

    Xia, Hui; Strachan, Briony C.; Gifford, Sean C.; Shevkoplyas, Sergey S.

    2016-10-01

    Leukocyte reduction of donated blood products substantially reduces the risk of a number of transfusion-related complications. Current ‘leukoreduction’ filters operate by trapping leukocytes within specialized filtration material, while allowing desired blood components to pass through. However, the continuous release of inflammatory cytokines from the retained leukocytes, as well as the potential for platelet activation and clogging, are significant drawbacks of conventional ‘dead end’ filtration. To address these limitations, here we demonstrate our newly-developed ‘controlled incremental filtration’ (CIF) approach to perform high-throughput microfluidic removal of leukocytes from platelet-rich plasma (PRP) in a continuous flow regime. Leukocytes are separated from platelets within the PRP by progressively syphoning clarified PRP away from the concentrated leukocyte flowstream. Filtrate PRP collected from an optimally-designed CIF device typically showed a ~1000-fold (i.e. 99.9%) reduction in leukocyte concentration, while recovering >80% of the original platelets, at volumetric throughputs of ~1 mL/min. These results suggest that the CIF approach will enable users in many fields to now apply the advantages of microfluidic devices to particle separation, even for applications requiring macroscale flowrates.
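    As a quick consistency check on the figures above, a 1000-fold reduction is the same as a 3-log (99.9%) removal:

```python
import math

def log10_reduction(c_in, c_out):
    """Log10 reduction of a cell concentration across a filtration step."""
    return math.log10(c_in / c_out)

fold = 1000
lr = log10_reduction(1.0, 1.0 / fold)  # 3.0 logs
pct_removed = (1 - 1 / fold) * 100     # 99.9 percent removed
```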

  11. Development of a semi-automated high throughput transient transfection system.

    Science.gov (United States)

    Bos, Aaron B; Duque, Joseph N; Bhakta, Sunil; Farahi, Farzam; Chirdon, Lindsay A; Junutula, Jagath R; Harms, Peter D; Wong, Athena W

    2014-06-20

    Transient transfection of mammalian cells provides a rapid method of producing protein for research purposes. Combining the transient transfection protein expression system with new automation technologies developed for the biotechnology industry would enable a high throughput protein production platform that could be utilized to generate a variety of different proteins in a short amount of time. These proteins could be used for an assortment of studies including proof of concept, antibody development, and biological structure and function. Here we describe such a platform: a semi-automated process for PEI-mediated transient protein production in tubespins at a throughput of 96 transfections at a time using a Biomek FX(P) liquid handling system. In one batch, 96 different proteins can be produced in milligram amounts by PEI transfection of HEK293 cells cultured in 50 mL tubespins. Methods were developed for the liquid handling system to automate the different processes associated with transient transfections such as initial cell seeding, DNA:PEI complex activation and DNA:PEI complex addition to the cells. Increasing DNA:PEI complex incubation time resulted in lower protein expression. To minimize protein production variability, the methods were further optimized to achieve consistent cell seeding, control the DNA:PEI incubation time and prevent cross-contamination among different tubespins. This semi-automated transfection process was applied to express 520 variants of a human IgG1 (hu IgG1) antibody. Published by Elsevier B.V.

  12. New catalysts for carboxylation of propylene glycol to propylene carbonate via high-throughput screening.

    Science.gov (United States)

    Castro-Osma, José A; Comerford, James W; Heyn, Richard H; North, Michael; Tangstad, Elisabeth

    2015-01-01

    High throughput methodologies screened 81 different metal salts and metal salt combinations as catalysts for the carboxylation of propylene glycol to propylene carbonate, as compared to a 5 mol% Zn(OAc)2/p-chlorobenzene sulfonic acid benchmark catalyst. The reactions were run with added acetonitrile (MeCN) as a chemical water trap. Two new catalysts were thereby discovered, zinc trifluoromethanesulfonate (Zn(OTf)2) and zinc p-toluenesulfonate. The optimal reaction parameters for the former catalyst were screened. Zn(OTf)2 gave an overall propylene carbonate yield of greater than 50% in 24 h, twice as large as the previous best literature yield with MeCN as a water trap, with 69% selectivity and 75% conversion of propylene glycol at 145 °C and 50 bar CO2 pressure.
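    The reported figures are mutually consistent under the usual relation yield = conversion × selectivity; a one-line check with the abstract's values:

```python
def overall_yield(conversion, selectivity):
    """Product yield as the product of substrate conversion and selectivity."""
    return conversion * selectivity

# 75% conversion at 69% selectivity gives about 52% propylene carbonate
# yield, matching the reported "greater than 50%".
y = overall_yield(0.75, 0.69)
```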

  13. High-throughput detection of ethanol-producing cyanobacteria in a microdroplet platform.

    Science.gov (United States)

    Abalde-Cela, Sara; Gould, Anna; Liu, Xin; Kazamia, Elena; Smith, Alison G; Abell, Chris

    2015-05-06

    Ethanol production by microorganisms is an important renewable energy source. Most processes involve fermentation of sugars from plant feedstock, but there is increasing interest in direct ethanol production by photosynthetic organisms. To facilitate this, a high-throughput screening technique for the detection of ethanol is required. Here, a method for the quantitative detection of ethanol in a microdroplet-based platform is described that can be used for screening cyanobacterial strains to identify those with the highest ethanol productivity levels. The detection of ethanol by enzymatic assay was optimized both in bulk and in microdroplets. In parallel, the encapsulation of engineered ethanol-producing cyanobacteria in microdroplets and their growth dynamics in microdroplet reservoirs were demonstrated. The combination of modular microdroplet operations including droplet generation for cyanobacteria encapsulation, droplet re-injection and pico-injection, and laser-induced fluorescence, were used to create this new platform to screen genetically engineered strains of cyanobacteria with different levels of ethanol production.

  14. Bacterial Microcolonies in Gel Beads for High-Throughput Screening of Libraries in Synthetic Biology.

    Science.gov (United States)

    Duarte, José M; Barbier, Içvara; Schaerli, Yolanda

    2017-11-17

    Synthetic biologists increasingly rely on directed evolution to optimize engineered biological systems. Applying an appropriate screening or selection method for identifying the potentially rare library members with the desired properties is a crucial step for success in these experiments. Special challenges include substantial cell-to-cell variability and the requirement to check multiple states (e.g., being ON or OFF depending on the input). Here, we present a high-throughput screening method that addresses these challenges. First, we encapsulate single bacteria into microfluidic agarose gel beads. After incubation, they harbor monoclonal bacterial microcolonies (e.g., expressing a synthetic construct) and can be sorted according to their fluorescence by fluorescence-activated cell sorting (FACS). We determine enrichment rates and demonstrate that we can measure the average fluorescent signals of microcolonies containing phenotypically heterogeneous cells, obviating the problem of cell-to-cell variability. Finally, we apply this method to sort a pBAD promoter library at ON and OFF states.

  15. Mapping and Classifying Molecules from a High-Throughput Structural Database

    CERN Document Server

    De, Sandip; Ingram, Teresa; Baldauf, Carsten; Ceriotti, Michele

    2016-01-01

    High-throughput computational materials design promises to greatly accelerate the process of discovering new materials and compounds, and of optimizing their properties. The large databases of structures and properties that result from computational searches, as well as the agglomeration of data of heterogeneous provenance, lead to considerable challenges when it comes to navigating the database, representing its structure at a glance, understanding structure-property relations, eliminating duplicates and identifying inconsistencies. Here we present a case study, based on a data set of conformers of amino acids and dipeptides, of how machine-learning techniques can help address these issues. We will exploit a recently developed strategy to define a metric between structures, and use it as the basis of both clustering and dimensionality reduction techniques, showing how these can help reveal structure-property relations, identify outliers and inconsistent structures, and rationalise how perturbations (e.g. b...

  16. High pressure inertial focusing for separation and concentration of bacteria at high throughput

    Science.gov (United States)

    Cruz, F. J.; Hjort, K.

    2017-11-01

    Inertial focusing is a phenomenon where particles migrate across streamlines in microchannels and focus at well-defined, size-dependent equilibrium points of the cross section. It can be exploited for focusing, separation and concentration of particles at high throughput and high efficiency. As particles decrease in size, smaller channels and higher pressures are required; hence, new designs are needed to decrease the pressure drop. In this work a novel design was adapted to focus and separate 1 µm from 3 µm spherical polystyrene particles. In addition, 0.5 µm spherical polystyrene particles were separated, although in a band instead of a single line. The ability to separate, concentrate and focus bacteria, its simplicity of use and its high throughput make this technology a candidate for daily routines in laboratories and hospitals.

  17. High-throughput extraction and quantification method for targeted metabolomics in murine tissues.

    Science.gov (United States)

    Zukunft, Sven; Prehn, Cornelia; Röhring, Cornelia; Möller, Gabriele; Hrabě de Angelis, Martin; Adamski, Jerzy; Tokarz, Janina

    2018-01-01

    Global metabolomics analyses using body fluids provide valuable results for the understanding and prediction of diseases. However, the mechanism of a disease is often tissue-based and it is advantageous to analyze metabolomic changes directly in the tissue. Metabolomics from tissue samples faces many challenges like tissue collection, homogenization, and metabolite extraction. We aimed to establish a metabolite extraction protocol optimized for tissue metabolite quantification by the targeted metabolomics AbsoluteIDQ™ p180 Kit (Biocrates). The extraction method should be non-selective, applicable to different kinds and amounts of tissues, monophasic, reproducible, and amenable to high throughput. We quantified metabolites in samples of eleven murine tissues after extraction with three solvents (methanol, phosphate buffer, ethanol/phosphate buffer mixture) in two tissue to solvent ratios and analyzed the extraction yield, ionization efficiency, and reproducibility. We found methanol and ethanol/phosphate buffer to be superior to phosphate buffer in regard to extraction yield, reproducibility, and ionization efficiency for all metabolites measured. Phosphate buffer, however, outperformed both organic solvents for amino acids and biogenic amines but yielded unsatisfactory results for lipids. The observed matrix effects of tissue extracts were smaller or in a similar range compared to those of human plasma. We provide for each murine tissue type an optimized high-throughput metabolite extraction protocol, which yields the best results for extraction, reproducibility, and quantification of metabolites in the p180 kit. Although the performance of the extraction protocol was monitored by the p180 kit, the protocol can be applicable to other targeted metabolomics assays.

  18. High-throughput receptor-based assay for the detection of spirolides by chemiluminescence.

    Science.gov (United States)

    Rodríguez, Laura P; Vilariño, Natalia; Molgó, Jordi; Aráoz, Rómulo; Botana, Luis M

    2013-12-01

    The spirolides are marine toxins that belong to a new class of macrocyclic imines produced by dinoflagellates. In this study a previously described solid-phase receptor-based assay for the detection of spirolides was optimized for high-throughput screening and prevalidated. This method is based on the competition between 13-desmethyl spirolide C and biotin-α-bungarotoxin immobilized on a streptavidin-coated surface, for binding to nicotinic acetylcholine receptors (nAChR). In this inhibition assay the amount of nAChR bound to the well surface is quantified using a specific antibody, followed by a second anti-mouse IgG antibody labeled with horseradish peroxidase (HRP). The assay protocol was optimized for 384-well microplates, which allowed a reduction of the amount of reagents per sample and an increase in the number of samples per plate versus previously published receptor-based assays. The sensitivity of the assay for 13-desmethyl spirolide C ranged from 5 to 150 ng/mL. The performance of the assay in scallop extracts was adequate, with an estimated detection limit for 13-desmethyl spirolide C of 50 μg/kg of shellfish meat. The recovery rate of 13-desmethyl spirolide C for spiked samples with this assay was 80% and the inter-assay coefficient of variation was 8%. This 384-well microplate chemiluminescence method can be used as a high-throughput screening assay to detect 13-desmethyl spirolide C in shellfish meat in order to reduce the number of samples to be processed through bioassays or analytical methods. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. High throughput modular chambers for rapid evaluation of anesthetic sensitivity

    Directory of Open Access Journals (Sweden)

    Eckmann David M

    2006-11-01

    Background Anesthetic sensitivity is determined by the interaction of multiple genes. Hence, a dissection of genetic contributors would be aided by precise and high-throughput behavioral screens. Traditionally, anesthetic phenotyping has addressed only induction of anesthesia, evaluated with dose-response curves, while ignoring potentially important data on emergence from anesthesia. Methods We designed and built a controlled-environment apparatus to permit rapid phenotyping of twenty-four mice simultaneously. We used the loss of righting reflex to indicate anesthetic-induced unconsciousness. After fitting the data to a sigmoidal dose-response curve with variable slope, we calculated the MACLORR (EC50), the Hill coefficient, and the 95% confidence intervals bracketing these values. Upon termination of the anesthetic, Emergence timeRR was determined and expressed as the mean ± standard error for each inhaled anesthetic. Results In agreement with several previously published reports, we find that the MACLORR of halothane, isoflurane, and sevoflurane in 8–12 week old C57BL/6J mice is 0.79% (95% confidence interval = 0.78–0.79%), 0.91% (95% confidence interval = 0.90–0.93%), and 1.96% (95% confidence interval = 1.94–1.97%), respectively. Hill coefficients for halothane, isoflurane, and sevoflurane are 24.7 (95% confidence interval = 19.8–29.7), 19.2 (95% confidence interval = 14.0–24.3), and 33.1 (95% confidence interval = 27.3–38.8), respectively. After roughly 2.5 MACLORR • hr exposures, mice take 16.00 ± 1.07, 6.19 ± 0.32, and 2.15 ± 0.12 minutes to emerge from halothane, isoflurane, and sevoflurane, respectively. Conclusion This system enabled assessment of inhaled anesthetic responsiveness with a higher precision than that previously reported. It is broadly adaptable for delivering an inhaled therapeutic (or toxin) to a population while monitoring its vital signs, motor reflexes, and providing precise control
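    Fitting the variable-slope sigmoid to extract the EC50 (MACLORR) and the Hill coefficient can be done with a standard least-squares fit. The sketch below uses synthetic, noise-free dose-response data with halothane-like parameters purely for illustration; it is not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, ec50, n_hill):
    """Fraction of mice showing loss of righting reflex at a given dose."""
    return dose ** n_hill / (ec50 ** n_hill + dose ** n_hill)

def fit_mac_lorr(doses, responses, p0=(0.8, 20.0)):
    """Fit EC50 (the MACLORR) and the Hill coefficient to dose-response data."""
    popt, _ = curve_fit(hill, doses, responses, p0=p0, maxfev=10000)
    return popt  # (ec50, n_hill)

# Synthetic data: true EC50 = 0.79 % atm, Hill coefficient = 25.
doses = np.linspace(0.6, 1.0, 9)
responses = hill(doses, 0.79, 25.0)
ec50, n_hill = fit_mac_lorr(doses, responses)
```

    Very steep curves (Hill coefficients of 20-30, as reported above) make the fit sensitive to the starting guess, so a p0 near the expected EC50 is the pragmatic choice here.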

  20. High-throughput metal susceptibility testing of microbial biofilms

    Directory of Open Access Journals (Sweden)

    Turner Raymond J

    2005-10-01

    Background Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO32-) than corresponding planktonic cultures.
Regardless of the growth medium, the tolerance of biofilm and planktonic

  1. High-throughput metal susceptibility testing of microbial biofilms

    Science.gov (United States)

    Harrison, Joe J; Turner, Raymond J; Ceri, Howard

    2005-01-01

    Background Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO32-) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic cell E.
coli JM109 to metals

  2. A high-throughput, restriction-free cloning and screening strategy based on ccdB-gene replacement.

    Science.gov (United States)

    Lund, Bjarte Aarmo; Leiros, Hanna-Kirsti Schrøder; Bjerga, Gro Elin Kjæreng

    2014-03-10

    In high-throughput demanding fields, such as biotechnology and structural biology, molecular cloning is an essential tool in obtaining high yields of recombinant protein. Here, we address recently developed restriction-free methods in cloning, and present a more cost-efficient protocol that has been optimized to improve both cloning and clone screening. In our case study, three homologous β-lactamase genes were successfully cloned using these restriction-free protocols. To clone the genes, we chose a gene replacement strategy, where the recombinant genes contained overhangs that targeted a region of the expression vector including a cytotoxin-encoding ccdB-gene. We provide further evidence that gene replacement can be applied with high-throughput cloning protocols. Targeting a replacement of the ccdB-gene was found to be very successful for counterselection using these protocols. This eliminated the need for treatment with the restriction enzyme DpnI that has so far been the preferred clone selection approach. We thus present an optimized cloning protocol using a restriction-free ccdB-gene replacement strategy, which allows for parallel cloning at a high-throughput level.

  3. Beamline AR-NW12A: high-throughput beamline for macromolecular crystallography at the Photon Factory.

    Science.gov (United States)

    Chavas, L M G; Matsugaki, N; Yamada, Y; Hiraki, M; Igarashi, N; Suzuki, M; Wakatsuki, S

    2012-05-01

    AR-NW12A is an in-vacuum undulator beamline optimized for high-throughput macromolecular crystallography experiments as one of the five macromolecular crystallography (MX) beamlines at the Photon Factory. This report provides details of the beamline design, covering its optical specifications, hardware set-up, control software, and the latest developments for MX experiments. The experimental environment presents state-of-the-art instrumentation for high-throughput projects with a high-precision goniometer with an adaptable goniometer head, and a UV-light sample visualization system. Combined with an efficient automounting robot modified from the SSRL SAM system, a remote control system enables fully automated and remote-access X-ray diffraction experiments.

  4. Simultaneous and high-throughput analysis of iodo-trihalomethanes, haloacetonitriles, and halonitromethanes in drinking water using solid-phase microextraction/gas chromatography-mass spectrometry: an optimization of sample preparation.

    Science.gov (United States)

    Luo, Qian; Chen, Xichao; Wei, Zi; Xu, Xiong; Wang, Donghong; Wang, Zijian

    2014-10-24

    When iodide and natural organic matter are present in raw water, the formation of iodo-trihalomethanes (Iodo-THMs), haloacetonitriles (HANs), and halonitromethanes (HNMs) poses a potential health risk because these compounds have been reported to be more toxic than their brominated or chlorinated analogs. In this work, simultaneous analysis of Iodo-THMs, HANs, and HNMs in drinking water samples in a single cleanup and chromatographic run is proposed. The DVB/CAR/PDMS fiber was found to be the most suitable for all target compounds, although 75 μm CAR/PDMS was better for chlorinated HANs and 65 μm PDMS/DVB for brominated HNMs. After optimization of the SPME parameters (DVB/CAR/PDMS fiber, extraction for 30 min at 40 °C, addition of 40% w/v salt, (NH4)2SO4 as a quenching agent, and desorption for 3 min at 170 °C), detection limits ranged from 1 to 50 ng/L for the different analogs, with a linear range of at least two orders of magnitude. Good recoveries (78.6-104.7%) were obtained for spiked samples from a wide range of treated drinking waters, demonstrating that the method is applicable to real drinking water samples. Matrix effects were negligible for treated water samples with total organic carbon concentrations below 2.9 mg/L. A survey of two drinking water treatment plants showed that the highest proportions of Iodo-THMs, HANs, and HNMs occurred in treated water, with concentrations of the 13 detected compounds ranging from ng/L to μg/L levels. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. High-throughput design of low-activation, high-strength creep-resistant steels for nuclear-reactor applications

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qi; Zwaag, Sybrand van der [Novel Aerospace Materials Group, Faculty of Aerospace Engineering, Delft University of Technology, Kluyverweg 1, 2629 HS, Delft (Netherlands); Xu, Wei, E-mail: xuwei@ral.neu.edu.cn [State Key Laboratory of Rolling and Automation, Northeastern University, 110819, Shenyang (China); Novel Aerospace Materials Group, Faculty of Aerospace Engineering, Delft University of Technology, Kluyverweg 1, 2629 HS, Delft (Netherlands)

    2016-02-15

    Reduced-activation ferritic/martensitic steels are prime candidate materials for structural applications in nuclear power reactors. However, their creep strength is much lower than that of creep-resistant steels developed for conventional fossil-fired power plants, as alloying elements with a high neutron activation cannot be used. To improve the creep strength while maintaining a low activation, a high-throughput computational alloy design model coupling thermodynamics, precipitate-coarsening kinetics and a genetic optimization algorithm is developed. Twelve relevant alloying elements with either low or high activation are considered simultaneously. The activity levels at 0–10 years after the end of irradiation are taken as an optimization parameter. The creep-strength values (after exposure for 10 years at 650 °C) are estimated on the basis of solid-solution strengthening and precipitation hardening (taking into account precipitate coarsening). Potential alloy compositions leading to a high austenite fraction or a high percentage of undesirable second-phase particles are rejected automatically in the optimization cycle. The newly identified alloys show much higher precipitation hardening and solid-solution strengthening at the same activity level as existing reduced-activation ferritic/martensitic steels.
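    The record above describes a genetic algorithm searching alloy compositions under an activation constraint. As a loose illustration of that optimization loop only (not the authors' model: the element bounds, activity weights and linear "strength" surrogate below are all invented stand-ins for the thermodynamic and coarsening calculations), a minimal sketch:

```python
import random

random.seed(0)

# Invented composition bounds (wt%) and fictitious activation weights.
BOUNDS = {"W": (0.0, 3.0), "Ta": (0.0, 0.2), "V": (0.0, 0.5), "C": (0.05, 0.2)}
ACTIVITY = {"W": 0.5, "Ta": 2.0, "V": 0.1, "C": 0.05}
ACTIVITY_LIMIT = 2.0

def random_alloy():
    return {el: random.uniform(lo, hi) for el, (lo, hi) in BOUNDS.items()}

def strength(alloy):
    # Toy linear surrogate for solid-solution + precipitation strengthening.
    return 60 * alloy["W"] + 300 * alloy["Ta"] + 90 * alloy["V"] + 400 * alloy["C"]

def feasible(alloy):
    # Stand-in for the automatic rejection of compositions that violate
    # constraints (here: an activity budget).
    return sum(ACTIVITY[el] * x for el, x in alloy.items()) <= ACTIVITY_LIMIT

def crossover(a, b):
    return {el: random.choice((a[el], b[el])) for el in BOUNDS}

def mutate(alloy, rate=0.2):
    out = dict(alloy)
    for el, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            out[el] = random.uniform(lo, hi)
    return out

def evolve(generations=40, size=60):
    pop = [random_alloy() for _ in range(size)]
    for _ in range(generations):
        # Reject infeasible candidates, keep an elite, breed the rest.
        pop = [a for a in pop if feasible(a)] or [random_alloy() for _ in range(size)]
        pop.sort(key=strength, reverse=True)
        elite = pop[: size // 4]
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(size - len(elite))]
    return max((a for a in pop if feasible(a)), key=strength)

best = evolve()
```

    The real design model evaluates each candidate with thermodynamic and precipitate-coarsening computations in place of the toy `strength` and `feasible` functions.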

  6. Engineering High Affinity Protein-Protein Interactions Using a High-Throughput Microcapillary Array Platform.

    Science.gov (United States)

    Lim, Sungwon; Chen, Bob; Kariolis, Mihalis S; Dimov, Ivan K; Baer, Thomas M; Cochran, Jennifer R

    2017-02-17

    Affinity maturation of protein-protein interactions requires iterative rounds of protein library generation and high-throughput screening to identify variants that bind with increased affinity to a target of interest. We recently developed a multipurpose protein engineering platform, termed μSCALE (Microcapillary Single Cell Analysis and Laser Extraction). This technology enables high-throughput screening of libraries of millions of cells expressing protein variants based on their binding properties or functional activity. Here, we demonstrate the first use of the μSCALE platform for affinity maturation of a protein-protein binding interaction. In this proof-of-concept study, we engineered an extracellular domain of the Axl receptor tyrosine kinase to bind more tightly to its ligand Gas6. Within 2 weeks, two iterative rounds of library generation and screening resulted in engineered Axl variants with a 50-fold decrease in kinetic dissociation rate, highlighting the use of μSCALE as a new tool for directed evolution.

  7. Microscopy with microlens arrays: high throughput, high resolution and light-field imaging.

    Science.gov (United States)

    Orth, Antony; Crozier, Kenneth

    2012-06-04

    We demonstrate highly parallelized fluorescence scanning microscopy using a refractive microlens array. Fluorescent beads and rat femur tissue are imaged over a 5.5 mm x 5.5 mm field of view at a pixel throughput of up to 4 megapixels/s and a resolution of 706 nm. We also demonstrate the ability to extract different perspective views of a pile of microspheres.
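    For a rough sense of scale, the figures quoted above can be combined in a back-of-envelope calculation (assuming, purely for illustration, one acquired pixel per resolution element):

```python
fov_um = 5500.0      # 5.5 mm field of view per side
res_um = 0.706       # stated resolution
rate_px_s = 4.0e6    # stated peak pixel throughput, 4 megapixels/s

pixels_per_side = fov_um / res_um       # ~7.8e3
total_pixels = pixels_per_side ** 2     # ~6.1e7 resolution elements
seconds = total_pixels / rate_px_s      # ~15 s to cover one field of view
```

    In practice the sampling density and duty cycle differ, so this is only an order-of-magnitude estimate.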

  8. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3.04 "Propulsion Systems," Busek Co. Inc. will develop a high throughput Hall effect thruster with a nominal peak power of 1 kW and wide...

  9. High-Throughput Approaches to Pinpoint Function within the Noncoding Genome.

    Science.gov (United States)

    Montalbano, Antonino; Canver, Matthew C; Sanjana, Neville E

    2017-10-05

    The clustered regularly interspaced short palindromic repeats (CRISPR)-Cas nuclease system is a powerful tool for genome editing, and its simple programmability has enabled high-throughput genetic and epigenetic studies. These high-throughput approaches offer investigators a toolkit for functional interrogation of not only protein-coding genes but also noncoding DNA. Historically, noncoding DNA has lacked the detailed characterization that has been applied to protein-coding genes in large part because there has not been a robust set of methodologies for perturbing these regions. Although the majority of high-throughput CRISPR screens have focused on the coding genome to date, an increasing number of CRISPR screens targeting noncoding genomic regions continue to emerge. Here, we review high-throughput CRISPR-based approaches to uncover and understand functional elements within the noncoding genome and discuss practical aspects of noncoding library design and screen analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    Science.gov (United States)

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties (cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, among others) have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  11. High-throughput phenotyping of multicellular organisms: finding the link between genotype and phenotype

    OpenAIRE

    Sozzani, Rosangela; Benfey, Philip N

    2011-01-01

    High-throughput phenotyping approaches (phenomics) are being combined with genome-wide genetic screens to identify alterations in phenotype that result from gene inactivation. Here we highlight promising technologies for 'phenome-scale' analyses in multicellular organisms.

  12. High-throughput phenotyping of multicellular organisms: finding the link between genotype and phenotype

    Science.gov (United States)

    2011-01-01

    High-throughput phenotyping approaches (phenomics) are being combined with genome-wide genetic screens to identify alterations in phenotype that result from gene inactivation. Here we highlight promising technologies for 'phenome-scale' analyses in multicellular organisms. PMID:21457493

  13. EMPeror: a tool for visualizing high-throughput microbial community data

    National Research Council Canada - National Science Library

    Vázquez-Baeza, Yoshiki; Pirrung, Meg; Gonzalez, Antonio; Knight, Rob

    2013-01-01

    As microbial ecologists take advantage of high-throughput sequencing technologies to describe microbial communities across ever-increasing numbers of samples, new analysis tools are required to relate...

  14. High-throughput system-wide engineering and screening for microbial biotechnology.

    Science.gov (United States)

    Vervoort, Yannick; Linares, Alicia Gutiérrez; Roncoroni, Miguel; Liu, Chengxun; Steensels, Jan; Verstrepen, Kevin J

    2017-08-01

    Genetic engineering and screening of large numbers of cells or populations is a crucial bottleneck in today's systems biology and applied (micro)biology. Instead of using standard methods in bottles, flasks or 96-well plates, scientists are increasingly relying on high-throughput strategies that miniaturize their experiments to the nanoliter and picoliter scale and the single-cell level. In this review, we summarize different high-throughput system-wide genome engineering and screening strategies for microbes. More specifically, we emphasize the use of multiplex automated genome evolution (MAGE) and CRISPR/Cas systems for high-throughput genome engineering and the application of (lab-on-chip) nanoreactors for high-throughput single-cell or population screening. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. High-throughput screening for novel inhibitors of Neisseria gonorrhoeae penicillin-binding protein 2.

    Directory of Open Access Journals (Sweden)

    Alena Fedarovich

    Full Text Available The increasing prevalence of N. gonorrhoeae strains exhibiting decreased susceptibility to third-generation cephalosporins and the recent isolation of two distinct strains with high-level resistance to cefixime or ceftriaxone herald the possible demise of β-lactam antibiotics as effective treatments for gonorrhea. To identify new compounds that inhibit penicillin-binding proteins (PBPs), which are proven targets for β-lactam antibiotics, we developed a high-throughput assay that uses fluorescence polarization (FP) to distinguish the fluorescent penicillin, Bocillin-FL, in free or PBP-bound form. This assay was used to screen a 50,000-compound library for potential inhibitors of N. gonorrhoeae PBP 2, and 32 compounds were identified that exhibited >50% inhibition of Bocillin-FL binding to PBP 2. These included a cephalosporin that provided validation of the assay. After elimination of compounds that failed to exhibit concentration-dependent inhibition, the antimicrobial activity of the remaining 24 was tested. Of these, 7 showed antimicrobial activity against susceptible and penicillin- or cephalosporin-resistant strains of N. gonorrhoeae. In molecular docking simulations using the crystal structure of PBP 2, two of these inhibitors docked into the active site of the enzyme, and each mediates interactions with the active-site serine nucleophile. This study demonstrates the validity of an FP-based assay for finding novel inhibitors of PBPs and paves the way for more comprehensive high-throughput screening against highly resistant strains of N. gonorrhoeae. It also provides a set of lead compounds for optimization of anti-gonococcal agents.
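    The >50% inhibition cut-off used in this screen amounts to simple normalization of each fluorescence-polarization reading against bound and free controls. A minimal sketch of that hit-picking arithmetic, with invented mP control values and hypothetical compound names:

```python
def percent_inhibition(mP, bound_ctrl, free_ctrl):
    # 0% = tracer fully PBP-bound (no inhibition), 100% = fully displaced.
    return 100.0 * (bound_ctrl - mP) / (bound_ctrl - free_ctrl)

bound_ctrl = 180.0   # mP, Bocillin-FL fully bound to PBP (made-up value)
free_ctrl = 60.0     # mP, free Bocillin-FL (made-up value)

# Hypothetical single-point screening reads for four compounds.
reads = {"cpd-01": 95.0, "cpd-02": 170.0, "cpd-03": 118.0, "cpd-04": 65.0}

hits = {name: percent_inhibition(mP, bound_ctrl, free_ctrl)
        for name, mP in reads.items()
        if percent_inhibition(mP, bound_ctrl, free_ctrl) > 50.0}
```

    Real screens additionally apply plate-level quality metrics before calling hits, which this sketch omits.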

  16. An image analysis toolbox for high-throughput C. elegans assays.

    Science.gov (United States)

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H; Riklin-Raviv, Tammy; Conery, Annie L; O'Rourke, Eyleen J; Sokolnicki, Katherine L; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M; Carpenter, Anne E

    2012-04-22

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available through the open-source CellProfiler project and enables objective scoring of whole-worm high-throughput image-based assays of C. elegans for the study of diverse biological pathways that are relevant to human disease.

  17. Deep Recurrent Neural Network for Mobile Human Activity Recognition with High Throughput

    OpenAIRE

    Inoue, Masaya; Inoue, Sozo; Nishida, Takeshi

    2016-01-01

    In this paper, we propose a method for human activity recognition with high throughput from raw accelerometer data using a deep recurrent neural network (DRNN), and investigate various architectures and their combinations to find the best parameter values. Here, "high throughput" refers to a short recognition time. We investigated various parameters and architectures of the DRNN by using a training dataset of 432 trials with 6 activity classes from 7 people. The maximum recognition ...
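    As a toy illustration of the preprocessing behind such recognition pipelines, the sketch below segments a synthetic tri-axial accelerometer stream into short windows; the shorter the window, the shorter the recognition delay. The threshold rule is a placeholder for the recurrent classifier, not the authors' method:

```python
import math

def windows(stream, size, step):
    # Fixed-size sliding windows over the raw sample stream.
    return [stream[i:i + size] for i in range(0, len(stream) - size + 1, step)]

def mean_magnitude(window):
    return sum(math.sqrt(x * x + y * y + z * z) for x, y, z in window) / len(window)

# Synthetic stream: 64 "still" samples (gravity only), then 64 "moving" ones.
stream = [(0.0, 0.0, 9.8)] * 64 + [(3.0, 2.0, 11.0)] * 64
labels = ["moving" if mean_magnitude(w) > 10.0 else "still"
          for w in windows(stream, size=32, step=32)]
```

    In the paper's setting, each window (or each raw sample) would instead be fed to the trained DRNN for classification.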

  18. Construction and analysis of high-density linkage map using high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Dongyuan Liu

    Full Text Available Linkage maps enable the study of important biological questions. The construction of high-density linkage maps has become more feasible with the advent of next-generation sequencing (NGS), which eases SNP discovery and high-throughput genotyping of large populations. However, the explosion in marker numbers and the genotyping errors inherent in NGS data challenge the computational efficiency and map quality of linkage-mapping methods. Here we report the HighMap method for constructing high-density linkage maps from NGS data. HighMap employs an iterative ordering and error-correction strategy based on a k-nearest-neighbor algorithm and a Monte Carlo multipoint maximum-likelihood algorithm. A simulation study shows HighMap can create a linkage map with three times as many markers as ordering-only methods while offering more accurate marker orders and stable genetic distances. Using HighMap, we constructed a common carp linkage map with 10,004 markers. The singleton rate was less than one-ninth of that generated by JoinMap4.1. Its total map distance was 5,908 cM, consistent with reports on low-density maps. HighMap is an efficient method for constructing high-density, high-quality linkage maps from high-throughput population NGS data. It will facilitate genome assembly, comparative genomic analysis, and QTL studies. HighMap is available at http://highmap.biomarker.com.cn/.
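    The "singletons" this record mentions are genotype calls that disagree with their flanking markers in map order, implying an unlikely double crossover within a tiny interval; such calls are usually genotyping errors. A greatly simplified stand-in for that correction idea (not HighMap's actual k-nearest-neighbor algorithm):

```python
def correct_singletons(column):
    """column: one individual's genotype calls across markers in map order.

    A call that differs from both identical flanking calls is treated as a
    probable genotyping error and replaced by the flanking genotype.
    """
    fixed = list(column)
    for i in range(1, len(column) - 1):
        prev_, here, next_ = column[i - 1], column[i], column[i + 1]
        if here != prev_ and here != next_ and prev_ == next_:
            fixed[i] = prev_
    return fixed

# Toy example: the isolated "B" at index 2 looks like a double crossover.
calls = ["A", "A", "B", "A", "A", "B", "B"]
corrected = correct_singletons(calls)
```

    The genuine method weighs evidence from several neighboring markers and from a multipoint likelihood model rather than applying a hard two-neighbor rule.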

  19. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality.

    Science.gov (United States)

    Albert, Océane; Reintsch, Wolfgang E; Chan, Peter; Robaire, Bernard

    2016-05-01

    Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3-5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software.
Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA

  20. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality

    Science.gov (United States)

    Albert, Océane; Reintsch, Wolfgang E.; Chan, Peter; Robaire, Bernard

    2016-01-01

    STUDY QUESTION Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? SUMMARY ANSWER We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. WHAT IS KNOWN ALREADY The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. STUDY DESIGN, SIZE, DURATION The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3–5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. PARTICIPANTS/MATERIALS, SETTING, METHODS Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. MAIN RESULTS AND THE ROLE OF CHANCE We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi

  1. Embedded image enhancement for high-throughput cameras

    Science.gov (United States)

    Geerts, Stan J. C.; Cornelissen, Dion; de With, Peter H. N.

    2014-03-01

    This paper presents image enhancement for a novel Ultra-High-Definition (UHD) video camera offering 4K images and higher. Conventional image enhancement techniques need to be reconsidered for the high-resolution images and the low-light sensitivity of the new sensor. We study two image enhancement functions and evaluate and optimize the algorithms for embedded implementation in programmable logic (FPGA). The enhancement study involves high-quality Auto White Balancing (AWB) and Local Contrast Enhancement (LCE). We have compared multiple algorithms from literature, both with objective and subjective metrics. In order to objectively compare Local Contrast (LC), an existing LC metric is modified for LC measurement in UHD images. For AWB, we have found that color histogram stretching offers a subjective high image quality and it is among the algorithms with the lowest complexity, while giving only a small balancing error. We impose a color-to-color gain constraint, which improves robustness of low-light images. For local contrast enhancement, a combination of contrast preserving gamma and single-scale Retinex is selected. A modified bilateral filter is designed to prevent halo artifacts, while significantly reducing the complexity and simultaneously preserving quality. We show that by cascading contrast preserving gamma and single-scale Retinex, the visibility of details is improved towards the level appropriate for high-quality surveillance applications. The user is offered control over the amount of enhancement. Also, we discuss the mapping of those functions on a heterogeneous platform to come to an effective implementation while preserving quality and robustness.
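    As an illustration of the white-balancing approach described above, the sketch below applies per-channel histogram stretching with a color-to-color gain clamp for robustness. The pixel values and the 1.5x ratio limit are invented, and a real implementation would operate on full 2-D images (and in FPGA logic rather than Python):

```python
def stretch_gains(pixels, max_ratio=1.5):
    """Per-channel 'stretch to white' gains, clamped relative to green."""
    gains = []
    for ch in range(3):
        hi = max(p[ch] for p in pixels)
        gains.append(255.0 / hi if hi else 1.0)
    ref = gains[1]  # green channel as the reference
    # Color-to-color gain constraint: no channel amplified far beyond green.
    return [min(g, max_ratio * ref) for g in gains]

def apply_gains(pixels, gains):
    return [tuple(min(255, int(round(p[ch] * gains[ch]))) for ch in range(3))
            for p in pixels]

# Toy warm-cast "image": blue channel systematically low.
pixels = [(200, 180, 120), (100, 90, 60), (50, 45, 30)]
gains = stretch_gains(pixels)
balanced = apply_gains(pixels, gains)
```

    Production pipelines typically stretch against robust percentiles rather than the channel maximum, which this sketch omits for brevity.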

  2. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    Science.gov (United States)

    JR, Luft; EH, Snell; GT, DeTitta

    2011-01-01

    Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  3. Protocol: A high-throughput DNA extraction system suitable for conifers.

    Science.gov (United States)

    Bashalkhanov, Stanislav; Rajora, Om P

    2008-08-01

    High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve change to the buffer system and improving the protocol so that it almost doubles the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP) and another for high-throughput (HTP) DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce and a pine species. A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica-spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost effective solution for processing large numbers of conifer samples.

  4. High-throughput genome editing and phenotyping facilitated by high resolution melting curve analysis.

    Science.gov (United States)

    Thomas, Holly R; Percival, Stefanie M; Yoder, Bradley K; Parant, John M

    2014-01-01

    With the goal to generate and characterize the phenotypes of null alleles in all genes within an organism and the recent advances in custom nucleases, genome editing limitations have moved from mutation generation to mutation detection. We previously demonstrated that High Resolution Melting (HRM) analysis is a rapid and efficient means of genotyping known zebrafish mutants. Here we establish optimized conditions for HRM based detection of novel mutant alleles. Using these conditions, we demonstrate that HRM is highly efficient at mutation detection across multiple genome editing platforms (ZFNs, TALENs, and CRISPRs); we observed nuclease generated HRM positive targeting in 1 of 6 (16%) open pool derived ZFNs, 14 of 23 (60%) TALENs, and 58 of 77 (75%) CRISPR nucleases. Successful targeting, based on HRM of G0 embryos correlates well with successful germline transmission (46 of 47 nucleases); yet, surprisingly mutations in the somatic tail DNA weakly correlate with mutations in the germline F1 progeny DNA. This suggests that analysis of G0 tail DNA is a good indicator of the efficiency of the nuclease, but not necessarily a good indicator of germline alleles that will be present in the F1s. However, we demonstrate that small amplicon HRM curve profiles of F1 progeny DNA can be used to differentiate between specific mutant alleles, facilitating rare allele identification and isolation; and that HRM is a powerful technique for screening possible off-target mutations that may be generated by the nucleases. Our data suggest that micro-homology based alternative NHEJ repair is primarily utilized in the generation of CRISPR mutant alleles and allows us to predict likelihood of generating a null allele. Lastly, we demonstrate that HRM can be used to quickly distinguish genotype-phenotype correlations within F1 embryos derived from G0 intercrosses. 
Together these data indicate that custom nucleases, in conjunction with the ease and speed of HRM, will facilitate future high-throughput

  5. High-throughput genome editing and phenotyping facilitated by high resolution melting curve analysis.

    Directory of Open Access Journals (Sweden)

    Holly R Thomas

    Full Text Available With the goal to generate and characterize the phenotypes of null alleles in all genes within an organism and the recent advances in custom nucleases, genome editing limitations have moved from mutation generation to mutation detection. We previously demonstrated that High Resolution Melting (HRM) analysis is a rapid and efficient means of genotyping known zebrafish mutants. Here we establish optimized conditions for HRM based detection of novel mutant alleles. Using these conditions, we demonstrate that HRM is highly efficient at mutation detection across multiple genome editing platforms (ZFNs, TALENs, and CRISPRs); we observed nuclease generated HRM positive targeting in 1 of 6 (16%) open pool derived ZFNs, 14 of 23 (60%) TALENs, and 58 of 77 (75%) CRISPR nucleases. Successful targeting, based on HRM of G0 embryos correlates well with successful germline transmission (46 of 47 nucleases); yet, surprisingly mutations in the somatic tail DNA weakly correlate with mutations in the germline F1 progeny DNA. This suggests that analysis of G0 tail DNA is a good indicator of the efficiency of the nuclease, but not necessarily a good indicator of germline alleles that will be present in the F1s. However, we demonstrate that small amplicon HRM curve profiles of F1 progeny DNA can be used to differentiate between specific mutant alleles, facilitating rare allele identification and isolation; and that HRM is a powerful technique for screening possible off-target mutations that may be generated by the nucleases. Our data suggest that micro-homology based alternative NHEJ repair is primarily utilized in the generation of CRISPR mutant alleles and allows us to predict likelihood of generating a null allele. Lastly, we demonstrate that HRM can be used to quickly distinguish genotype-phenotype correlations within F1 embryos derived from G0 intercrosses. Together these data indicate that custom nucleases, in conjunction with the ease and speed of HRM, will
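
    HRM genotyping works by normalizing the fluorescence melt curves of sample amplicons and comparing them against a wild-type reference; a mutant allele shifts the melting temperature and shows up as a systematic deviation in the difference curve. A minimal sketch of that comparison, using an idealized logistic melt transition (the temperatures, Tm values, and the 5% call threshold are hypothetical illustrations, not parameters from the paper):

    ```python
    import math

    def melt_curve(temps, tm, width=1.5):
        """Idealized HRM melt curve: fraction of double-stranded DNA
        remaining at each temperature, modeled as a logistic transition
        centered on the amplicon's melting temperature Tm."""
        return [1.0 / (1.0 + math.exp((t - tm) / width)) for t in temps]

    def normalize(curve):
        """Scale fluorescence to the 0-100% range, as HRM software does
        before comparing curves."""
        lo, hi = min(curve), max(curve)
        return [100.0 * (v - lo) / (hi - lo) for v in curve]

    def difference_curve(sample, reference):
        """Subtract a wild-type reference from a sample curve; mutant
        alleles appear as a systematic deviation from zero."""
        return [s - r for s, r in zip(sample, reference)]

    temps = [70.0 + 0.2 * i for i in range(100)]      # 70-90 C scan
    wild_type = normalize(melt_curve(temps, tm=80.0))
    mutant = normalize(melt_curve(temps, tm=79.2))    # indel lowers Tm
    diff = difference_curve(mutant, wild_type)

    # A mutant is called when the peak deviation exceeds a noise threshold.
    is_mutant = max(abs(d) for d in diff) > 5.0
    print(is_mutant)
    ```

    In practice the same difference-curve logic underlies both novel-allele detection in G0 embryos and the small-amplicon profiling used to distinguish specific F1 alleles.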

  6. High-throughput FTIR-based bioprocess analysis of recombinant cyprosin production.

    Science.gov (United States)

    Sampaio, Pedro N; Sales, Kevin C; Rosa, Filipa O; Lopes, Marta B; Calado, Cecília R C

    2017-01-01

    To increase the knowledge of the recombinant cyprosin production process in Saccharomyces cerevisiae cultures, it is relevant to implement efficient bioprocess monitoring techniques. The present work focuses on the implementation of a mid-infrared (MIR) spectroscopy-based tool for monitoring the recombinant culture in a rapid, economic, and high-throughput (using a microplate system) mode. Multivariate data analysis on the MIR spectra of culture samples was conducted. Principal component analysis (PCA) enabled capturing the general metabolic status of the yeast cells, as replicated samples appear grouped together in the score plot and groups of culture samples according to the main growth phase can be clearly distinguished. The PCA-loading vectors also revealed spectral regions, and the corresponding chemical functional groups and biomolecules, that mostly contributed to the cell biomolecular fingerprint associated with the culture growth phase. These data were corroborated by the analysis of the samples' second derivative spectra. Partial least square (PLS) regression models built based on the MIR spectra showed high predictive ability for estimating the bioprocess critical variables: biomass (R² = 0.99, RMSEP 2.8%); cyprosin activity (R² = 0.98, RMSEP 3.9%); glucose (R² = 0.93, RMSECV 7.2%); galactose (R² = 0.97, RMSEP 4.6%); ethanol (R² = 0.97, RMSEP 5.3%); and acetate (R² = 0.95, RMSEP 7.0%). In conclusion, high-throughput MIR spectroscopy and multivariate data analysis were effective in identifying the main growth phases and specific cyprosin production phases along the yeast culture as well as in quantifying the critical variables of the process. This knowledge will promote future process optimization and control of the recombinant cyprosin bioprocess according to the Quality by Design framework.
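
    The model-quality figures quoted above can be read as follows: R² is the fraction of variance in the reference assay explained by the PLS predictions, and RMSEP is the root mean square error of prediction, here expressed as a percentage (normalized below to the observed range; the abstract does not state the paper's exact normalization convention). A sketch with hypothetical biomass data:

    ```python
    import math

    def r_squared(observed, predicted):
        """Coefficient of determination between reference assays and
        model predictions."""
        mean_obs = sum(observed) / len(observed)
        ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - ss_res / ss_tot

    def rmsep_percent(observed, predicted):
        """Root mean square error of prediction as a percentage of the
        observed range (one common convention; the paper's may differ)."""
        n = len(observed)
        rmse = math.sqrt(sum((o - p) ** 2
                             for o, p in zip(observed, predicted)) / n)
        return 100.0 * rmse / (max(observed) - min(observed))

    # Hypothetical biomass values (g/L): offline assay vs. PLS estimate.
    ref = [1.0, 2.1, 3.9, 6.2, 8.0, 9.8]
    pred = [1.1, 2.0, 4.1, 6.0, 8.3, 9.6]

    print(round(r_squared(ref, pred), 3), round(rmsep_percent(ref, pred), 1))
    ```

    A model with R² near 0.99 and RMSEP of a few percent, as reported for biomass, corresponds to predictions that track the reference assay almost within measurement noise.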

  7. Melter Throughput Enhancements for High-Iron HLW

    Energy Technology Data Exchange (ETDEWEB)

    Kruger, A. A. [Department of Energy, Office of River Protection, Richland, Washington (United States); Gan, Hoa [The Catholic University of America, Washington, DC (United States); Joseph, Innocent [The Catholic University of America, Washington, DC (United States); Pegg, Ian L. [The Catholic University of America, Washington, DC (United States); Matlack, Keith S. [The Catholic University of America, Washington, DC (United States); Chaudhuri, Malabika [The Catholic University of America, Washington, DC (United States); Kot, Wing [The Catholic University of America, Washington, DC (United States)

    2012-12-26

    This report describes work performed to develop and test new glass and feed formulations in order to increase glass melting rates in high waste loading glass formulations for HLW with high concentrations of iron. Testing was designed to identify glass and melter feed formulations that optimize waste loading and waste processing rate while meeting all processing and product quality requirements. The work included preparation and characterization of crucible melts to assess melt rate using a vertical gradient furnace system and to develop new formulations with enhanced melt rate. Testing evaluated the effects of waste loading on glass properties and the maximum waste loading that can be achieved. The results from crucible-scale testing supported subsequent DuraMelter 100 (DM100) tests designed to examine the effects of enhanced glass and feed formulations on waste processing rate and product quality. The DM100 was selected as the platform for these tests due to its extensive previous use in processing rate determination for various HLW streams and glass compositions.

  8. Direct multiplex sequencing (DMPS)--a novel method for targeted high-throughput sequencing of ancient and highly degraded DNA

    National Research Council Canada - National Science Library

    Stiller, Mathias; Knapp, Michael; Stenzel, Udo; Hofreiter, Michael; Meyer, Matthias

    2009-01-01

    Although the emergence of high-throughput sequencing technologies has enabled whole-genome sequencing from extinct organisms, little progress has been made in accelerating targeted sequencing from highly degraded DNA...

  9. A practical strategy for using miniature chromatography columns in a standardized high-throughput workflow for purification development of monoclonal antibodies.

    Science.gov (United States)

    Welsh, John P; Petroff, Matthew G; Rowicki, Patricia; Bao, Haiying; Linden, Thomas; Roush, David J; Pollard, Jennifer M

    2014-01-01

    The emergence of monoclonal antibody (mAb) therapies has created a need for faster and more efficient bioprocess development strategies in order to meet timeline and material demands. In this work, a high-throughput process development (HTPD) strategy implementing several high-throughput chromatography purification techniques is described. Namely, batch incubations are used to scout feasible operating conditions, miniature columns are then used to determine separation of impurities, and, finally, a limited number of lab scale columns are tested to confirm the conditions identified using high-throughput techniques and to provide a path toward large scale processing. This multistep approach builds upon previous HTPD work by combining, in a unique sequential fashion, the flexibility and throughput of batch incubations with the increased separation characteristics for the packed bed format of miniature columns. Additionally, in order to assess the applicability of using miniature columns in this workflow, transport considerations were compared with traditional lab scale columns, and performances were mapped for the two techniques. The high-throughput strategy was utilized to determine optimal operating conditions with two different types of resins for a difficult separation of a mAb monomer from aggregates. Other more detailed prediction models are cited, but the intent of this work was to use high-throughput strategies as a general guide for scaling and assessing operating space rather than as a precise model to exactly predict performance. © 2014 American Institute of Chemical Engineers.

  10. A high-throughput method for the detection of homoeologous gene deletions in hexaploid wheat

    Directory of Open Access Journals (Sweden)

    Li Zhongyi

    2010-11-01

    4500 M2 mutant wheat lines generated by heavy ion irradiation, we detected multiple mutants with deletions of each TaPFT1 homoeologue, and confirmed these deletions using a CAPS method. We have subsequently designed, optimized, and applied this method for the screening of homoeologous deletions of three additional wheat genes putatively involved in plant disease resistance. Conclusions We have developed a method for automated, high-throughput screening to identify deletions of individual homoeologues of a wheat gene. This method is also potentially applicable to other polyploid plants.

  11. Throughput Analysis for a High-Performance FPGA-Accelerated Real-Time Search Application

    Directory of Open Access Journals (Sweden)

    Wim Vanderbauwhede

    2012-01-01

    Full Text Available We propose an FPGA design for the relevancy computation part of a high-throughput real-time search application. The application matches terms in a stream of documents against a static profile, held in off-chip memory. We present a mathematical analysis of the throughput of the application and apply it to the problem of scaling the Bloom filter used to discard nonmatches.
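
    Scaling the Bloom filter in such a design rests on the standard false-positive formula p = (1 − e^(−kn/m))^k and the optimal hash count k = (m/n)·ln 2. A sketch of the sizing arithmetic (the one-million-term profile and 1% false-positive target are illustrative choices, not figures from the paper):

    ```python
    import math

    def false_positive_rate(m_bits, n_items, k_hashes):
        """Classical Bloom filter false-positive probability:
        p = (1 - e^(-k*n/m))^k."""
        return (1.0 - math.exp(-k_hashes * n_items / m_bits)) ** k_hashes

    def optimal_hashes(m_bits, n_items):
        """Number of hash functions minimizing the false-positive rate:
        k = (m/n) * ln 2."""
        return max(1, round((m_bits / n_items) * math.log(2)))

    def bits_needed(n_items, target_fpr):
        """Filter size for a target false-positive rate:
        m = -n * ln(p) / (ln 2)^2."""
        return math.ceil(-n_items * math.log(target_fpr) / math.log(2) ** 2)

    # Sizing a filter for a hypothetical 1M-term profile at 1% false positives.
    n = 1_000_000
    m = bits_needed(n, 0.01)        # about 9.6 bits per stored term
    k = optimal_hashes(m, n)
    fpr = false_positive_rate(m, n, k)
    print(m // n, k, fpr < 0.011)
    ```

    The same formulas show why discarding non-matches cheaply is attractive on an FPGA: the memory cost grows only with the logarithm of the desired false-positive rate, independently of term length.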

  12. Powerful DMD-based light sources with a high throughput virtual slit

    Science.gov (United States)

    Hajian, Arsen R.; Gooding, Ed; Gunn, Thomas; Bradbury, Steven

    2016-02-01

    Many DMD-based programmable light sources consist of a white light source and a pair of spectrometers operating in subtractive mode. A DMD between the two spectrometers shapes the delivered spectrum. Since both spectrometers must (1) fit within a small volume, and (2) provide significant spectral resolution, a narrow intermediary slit is required. Another approach is to use a spectrometer designed around a High Throughput Virtual Slit, which enables higher spectral resolution than is achievable with conventional spectroscopy by manipulating the beam profile in pupil space. Conventional imaging spectrograph designs image the entrance slit onto the exit focal plane after dispersing the spectrum. Most often, near 1:1 imaging optics are used in order to optimize both entrance aperture and spectral resolution. This approach limits the spectral resolution to the product of the dispersion and the slit width. Achieving high spectral resolution in a compact instrument necessarily requires a narrow entrance slit, which limits instrumental throughput (étendue). By reshaping the pupil with reflective optics, HTVS-equipped instruments create a tall, narrow image profile at the exit focal plane without altering the NA, typically delivering 5X or better spectral resolution than is achievable with a conventional design. This approach works equally well in DMD-based programmable light sources as in single stage spectrometers. Assuming a 5X improvement in étendue, a 500 W source can be replaced by a 100 W equivalent, creating a cooler, more efficient tunable light source with equal power density over the desired bandwidth without compromising output power.
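
    Two quantitative claims above lend themselves to a quick check: the conventional resolution limit is the product of the linear dispersion and the slit width, and a 5X étendue gain is what allows a 100 W source to replace a 500 W one. A back-of-the-envelope sketch under assumed instrument numbers (the 8 nm/mm dispersion and 100 µm slit are hypothetical, not from the paper):

    ```python
    def resolution_nm(dispersion_nm_per_mm, slit_width_mm):
        """Conventional spectrograph: resolution limit is the product of
        the linear (reciprocal) dispersion and the entrance slit width."""
        return dispersion_nm_per_mm * slit_width_mm

    # Hypothetical compact instrument: 8 nm/mm dispersion, 100 um slit.
    conventional = resolution_nm(8.0, 0.100)   # nm
    htvs = conventional / 5.0                  # ~5x gain from pupil reshaping
    lamp_w = 500 / 5                           # 5x etendue gain -> 100 W source

    print(conventional, htvs, lamp_w)
    ```

    The point of the calculation: for a fixed dispersion, the only conventional route to finer resolution is a narrower slit (and hence lower throughput), whereas the HTVS gain comes without shrinking the entrance aperture.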

  13. Development of an NMR microprobe procedure for high-throughput environmental metabolomics of Daphnia magna.

    Science.gov (United States)

    Nagato, Edward G; Lankadurai, Brian P; Soong, Ronald; Simpson, André J; Simpson, Myrna J

    2015-09-01

    Nuclear magnetic resonance (NMR) is the primary platform used in high-throughput environmental metabolomics studies because its non-selectivity is well suited for non-targeted approaches. However, standard NMR probes may limit the use of NMR-based metabolomics for tiny organisms because of the sample volumes required for routine metabolic profiling. As a result, keystone ecological species, such as the water flea Daphnia magna, are not commonly studied owing to the analytical challenges associated with NMR-based approaches. Here, the use of a 1.7-mm NMR microprobe in analyzing tissue extracts from D. magna is tested. Three different extraction procedures (D2O-based buffer, Bligh and Dyer, and acetonitrile : methanol : water) were compared in terms of the yields and breadth of polar metabolites. The D2O buffer extraction yielded the most metabolites and resulted in the best reproducibility. Varying amounts of D. magna dry mass were extracted to optimize metabolite isolation from D. magna tissues. A ratio of 1-1.5-mg dry mass to 40 µl of extraction solvent provided excellent signal-to-noise and spectral resolution using ¹H NMR. The metabolite profile of a single daphnid was also investigated (approximately 0.2 mg). However, the signal-to-noise of the ¹H NMR was considerably lower, and while feasible for select applications would likely not be appropriate for high-throughput NMR-based metabolomics. Two-dimensional NMR experiments on D. magna extracts were also performed using the 1.7-mm NMR probe to confirm ¹H NMR metabolite assignments. This study provides an NMR-based analytical framework for future metabolomics studies that use D. magna in ecological and ecotoxicity studies. Copyright © 2015 John Wiley & Sons, Ltd.

  14. High throughput optimization of content-loaded nanoparticles

    DEFF Research Database (Denmark)

    2016-01-01

    The present invention relates to tagged particles and the identification and characterization of particles based on their tag. In particular, the present invention relates to a method for the production of a multitude of uniquely tagged particles comprised of a range of components selected from t...

  15. Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing.

    Science.gov (United States)

    Shafer, Aaron B A; Northrup, Joseph M; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B W

    2016-01-01

    Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations.

  16. Status Of The Development Of A Thin Foil High Throughput X-Ray Telescope For The Soviet Spectrum X-Gamma Mission

    DEFF Research Database (Denmark)

    WESTERGAARD, NJ; BYRNAK, BP; Christensen, Finn Erland

    1989-01-01

    modification of this design is optimized with respect to high energy throughput of the telescope. The mechanical design and the status of the surface preparation technologies are described. Various X-ray and optical test facilities for the measurement of surface roughness, "orange peel", and figure errors...

  17. Some Aspects of Crystal Centering During X-ray High-throughput Protein Crystallography Experiment

    Science.gov (United States)

    Gaponov, Yu. A.; Matsugaki, N.; Sasajima, K.; Igarashi, N.; Wakatsuki, S.

    A set of algorithms and procedures for crystal loop centering during high-throughput X-ray protein crystallography experiments has been designed and developed. Loop detection locates the loop ending point (opposite the loop pin) by analyzing image cross-section (digital image column) profiles, and a preliminary recognition step estimates the loop size and position from the same profile analysis. Fine recognition is then performed with the Hooke-Jeeves pattern search method, using an ellipse as the fitting pattern, and a procedure for restoring a missing coordinate of the crystal loop is also described. Building on these algorithms, an optimal auto-centering procedure and an optimal manual centering procedure (the Two Clicks Procedure) have been developed. All procedures have been integrated into the control software system PCCS installed at the crystallography beamlines BL5A (Photon Factory) and NW12 (PF-AR) at KEK.
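
    Hooke-Jeeves pattern search, the method used here for the fine ellipse fit, alternates exploratory probes along each coordinate axis with pattern (extrapolation) moves in the improving direction, shrinking the step size when neither helps. A minimal generic implementation on a toy objective (the actual ellipse-to-loop-edge objective from the paper is not reproduced):

    ```python
    def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
        """Minimal Hooke-Jeeves pattern search: exploratory moves along
        each coordinate axis, a pattern move extrapolating the improving
        direction, and step shrinking when no move helps."""
        base = list(x0)
        fbase = f(base)
        for _ in range(max_iter):
            if step < tol:
                break
            # Exploratory phase: probe +/- step along every coordinate.
            trial, ftrial = list(base), fbase
            for i in range(len(base)):
                for delta in (step, -step):
                    cand = list(trial)
                    cand[i] += delta
                    fc = f(cand)
                    if fc < ftrial:
                        trial, ftrial = cand, fc
                        break
            if ftrial < fbase:
                # Pattern move: extrapolate along the successful direction.
                pattern = [2 * t - b for t, b in zip(trial, base)]
                fp = f(pattern)
                base, fbase = (pattern, fp) if fp < ftrial else (trial, ftrial)
            else:
                step *= shrink  # no improvement: refine the mesh
        return base, fbase

    # Toy stand-in for the ellipse-fit objective: distance from (3, -2).
    obj = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2
    best, fbest = hooke_jeeves(obj, [0.0, 0.0])
    print(round(best[0], 3), round(best[1], 3))
    ```

    Pattern search is well suited to this task because the fit objective is computed from image profiles and has no usable analytic gradient.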

  18. Bis-benzimidazole hits against Naegleria fowleri discovered with new high-throughput screens.

    Science.gov (United States)

    Rice, Christopher A; Colon, Beatrice L; Alp, Mehmet; Göker, Hakan; Boykin, David W; Kyle, Dennis E

    2015-04-01

    Naegleria fowleri is a pathogenic free-living amoeba (FLA) that causes an acute fatal disease known as primary amoebic meningoencephalitis (PAM). The major problem for infections with any pathogenic FLA is a lack of effective therapeutics, since PAM has a case mortality rate approaching 99%. Clearly, new drugs that are potent and have rapid onset of action are needed to enhance the treatment regimens for PAM. Diamidines have demonstrated potency against multiple pathogens, including FLA, and are known to cross the blood-brain barrier to cure other protozoan diseases of the central nervous system. Therefore, amidino derivatives serve as an important chemotype for discovery of new drugs. In this study, we validated two new in vitro assays suitable for medium- or high-throughput drug discovery and used these for N. fowleri. We next screened over 150 amidino derivatives of multiple structural classes and identified two hit series with nM potency that are suitable for further lead optimization as new drugs for this neglected disease. These include both mono- and diamidino derivatives, with the most potent compound (DB173) having a 50% inhibitory concentration (IC50) of 177 nM. Similarly, we identified 10 additional analogues with IC50s of 500 times more potent than pentamidine. In summary, the mono- and diamidino derivatives offer potential for lead optimization to develop new drugs to treat central nervous system infections with N. fowleri. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  19. Application of Titration-Based Screening for the Rapid Pilot Testing of High-Throughput Assays.

    Science.gov (United States)

    Zhang, Ji-Hu; Kang, Zhao B; Ardayfio, Ophelia; Ho, Pei-i; Smith, Thomas; Wallace, Iain; Bowes, Scott; Hill, W Adam; Auld, Douglas S

    2014-06-01

    Pilot testing of an assay intended for high-throughput screening (HTS) with small compound sets is a necessary but often time-consuming step in the validation of an assay protocol. When the initial testing concentration is less than optimal, this can involve iterative testing at different concentrations to further evaluate the pilot outcome, which can be even more time-consuming. Quantitative HTS (qHTS) enables flexible and rapid collection of assay performance statistics, hits at different concentrations, and concentration-response curves in a single experiment. Here we describe the qHTS process for pilot testing in which eight-point concentration-response curves are produced using an interplate asymmetric dilution protocol in which the first four concentrations are used to represent the range of typical HTS screening concentrations and the last four concentrations are added for robust curve fitting to determine potency/efficacy values. We also describe how these data can be analyzed to predict the frequency of false-positives, false-negatives, hit rates, and confirmation rates for the HTS process as a function of screening concentration. By taking into account the compound pharmacology, this pilot-testing paradigm enables rapid assessment of the assay performance and choosing the optimal concentration for the large-scale HTS in one experiment. © 2013 Society for Laboratory Automation and Screening.
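
    The described interplate asymmetric dilution can be sketched as an eight-point series whose first four points span typical HTS screening concentrations (coarse steps) while the last four add finer steps for robust fitting of a four-parameter logistic (Hill) concentration-response model. All numbers below (top concentration, dilution factors, model parameters) are hypothetical illustrations, not the protocol's actual values:

    ```python
    def dilution_series(top_uM, factors):
        """Concentration series from a top concentration and per-step
        dilution factors (asymmetric: coarse steps first, finer later)."""
        concs = [top_uM]
        for f in factors:
            concs.append(concs[-1] / f)
        return concs

    def four_pl(conc, bottom, top, ic50, hill):
        """Four-parameter logistic (Hill) model of % activity remaining
        as a function of compound concentration."""
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

    # Hypothetical eight-point series: four wide steps covering usual HTS
    # screening concentrations, then four finer steps for curve fitting.
    concs = dilution_series(30.0, [5, 5, 5, 2, 2, 2, 2])
    responses = [four_pl(c, bottom=0.0, top=100.0, ic50=1.0, hill=1.0)
                 for c in concs]
    for c, r in zip(concs, responses):
        print(f"{c:8.3f} uM -> {r:5.1f}% activity")
    ```

    Fitting this model per compound is what lets a single qHTS experiment yield potency/efficacy values and, by slicing at each tested concentration, the predicted hit and false-positive rates for a conventional single-concentration screen.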

  20. Optimizing MRI Logistics: Prospective Analysis of Performance, Efficiency, and Patient Throughput.

    Science.gov (United States)

    Beker, Kevin; Garces-Descovich, Alejandro; Mangosing, Jason; Cabral-Goncalves, Ines; Hallett, Donna; Mortele, Koenraad J

    2017-10-01

    The objective of this study is to optimize MRI logistics through evaluation of MRI workflow and analysis of performance, efficiency, and patient throughput in a tertiary care academic center. For 2 weeks, workflow data from two outpatient MRI scanners were prospectively collected and stratified by value added to the process (i.e., value-added time, business value-added time, or non-value-added time). Two separate time cycles were measured: the actual MRI process cycle as well as the complete length of patient stay in the department. In addition, the impact and frequency of delays across all observations were measured. A total of 305 MRI examinations were evaluated, including body (34.1%), neurologic (28.9%), musculoskeletal (21.0%), and breast examinations (16.1%). The MRI process cycle lasted a mean of 50.97 ± 24.4 (SD) minutes per examination; the mean non-value-added time was 13.21 ± 18.77 minutes (25.87% of the total process cycle time). The mean length-of-stay cycle was 83.51 ± 33.63 minutes; the mean non-value-added time was 24.33 ± 24.84 minutes (29.14% of the total patient stay). The delay with the highest frequency (5.57%) was IV or port placement, which had a mean delay of 22.82 minutes. The delay with the greatest impact on time was MRI arthrography for which joint injection of contrast medium was necessary but was not accounted for in the schedule (mean delay, 42.2 minutes; frequency, 1.64%). Of 305 patients, 34 (11.15%) did not arrive at or before their scheduled time. Non-value-added time represents approximately one-third of the total MRI process cycle and patient length of stay. Identifying specific delays may expedite the application of targeted improvement strategies, potentially increasing revenue, efficiency, and overall patient satisfaction.
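
    The reported shares follow directly from the stated mean durations; the small differences from the quoted 25.87% and 29.14% reflect rounding of the published means:

    ```python
    def non_value_added_share(nva_minutes, total_minutes):
        """Percentage of a cycle spent on non-value-added activities."""
        return 100.0 * nva_minutes / total_minutes

    # Mean minutes per examination, as reported in the abstract.
    process = non_value_added_share(13.21, 50.97)  # MRI process cycle
    stay = non_value_added_share(24.33, 83.51)     # full length of stay

    print(f"{process:.2f}% {stay:.2f}%")
    ```

    Either way the arithmetic supports the conclusion: roughly a quarter to a third of both the scanner cycle and the patient's total stay is non-value-added time.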

  1. High-throughput Analysis of Large Microscopy Image Datasets on CPU-GPU Cluster Platforms.

    Science.gov (United States)

    Teodoro, George; Pan, Tony; Kurc, Tahsin M; Kong, Jun; Cooper, Lee A D; Podhorszki, Norbert; Klasky, Scott; Saltz, Joel H

    2013-05-01

    Analysis of large pathology image datasets offers significant opportunities for the investigation of disease morphology, but the resource requirements of analysis pipelines limit the scale of such studies. Motivated by a brain cancer study, we propose and evaluate a parallel image analysis application pipeline for high throughput computation of large datasets of high resolution pathology tissue images on distributed CPU-GPU platforms. To achieve efficient execution on these hybrid systems, we have built runtime support that allows us to express the cancer image analysis application as a hierarchical data processing pipeline. The application is implemented as a coarse-grain pipeline of stages, where each stage may be further partitioned into another pipeline of fine-grain operations. The fine-grain operations are efficiently managed and scheduled for computation on CPUs and GPUs using performance aware scheduling techniques along with several optimizations, including architecture aware process placement, data locality conscious task assignment, data prefetching, and asynchronous data copy. These optimizations are employed to maximize the utilization of the aggregate computing power of CPUs and GPUs and minimize data copy overheads. Our experimental evaluation shows that the cooperative use of CPUs and GPUs achieves significant improvements on top of GPU-only versions (up to 1.6×) and that the execution of the application as a set of fine-grain operations provides more opportunities for runtime optimizations and attains better performance than coarser-grain, monolithic implementations used in other works. An implementation of the cancer image analysis pipeline using the runtime support was able to process an image dataset consisting of 36,848 4Kx4K-pixel image tiles (about 1.8TB uncompressed) in less than 4 minutes (150 tiles/second) on 100 nodes of a state-of-the-art hybrid cluster system.
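
    The closing throughput figures are internally consistent and worth unpacking: 36,848 tiles in at most 4 minutes is just over 150 tiles/second, and moving 1.8 TB of uncompressed data in that window implies roughly 7.5 GB/s of aggregate I/O across the 100 nodes:

    ```python
    tiles = 36_848
    seconds = 4 * 60                  # "less than 4 minutes" -> lower bound
    rate = tiles / seconds            # tiles/second, lower bound
    data_tb = 1.8                     # uncompressed dataset size
    bandwidth_gb_s = data_tb * 1000 / seconds

    print(round(rate), round(bandwidth_gb_s, 1))
    ```

    The bandwidth figure illustrates why the paper's data prefetching and asynchronous copy optimizations matter: sustaining the compute rate requires keeping multi-GB/s data movement off the critical path.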

  2. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchanne...

  3. High-throughput atomic force microscopes operating in parallel

    Science.gov (United States)

    Sadeghian, Hamed; Herfst, Rodolf; Dekker, Bert; Winters, Jasper; Bijnagte, Tom; Rijnbeek, Ramon

    2017-03-01

    Atomic force microscopy (AFM) is an essential nanoinstrument technique for several applications such as cell biology and nanoelectronics metrology and inspection. The need for statistically significant sample sizes means that data collection can be an extremely lengthy process in AFM. A single AFM instrument is, however, very slow and ill-suited to scanning large areas, resulting in very low measurement throughput. We address this challenge by parallelizing AFM instruments. The parallelization is achieved by miniaturizing the AFM instrument and operating many of them simultaneously. This instrument has the advantages that each miniaturized AFM can be operated independently and that the advances in the field of AFM, both in terms of speed and imaging modalities, can be implemented more easily. Moreover, a parallel AFM instrument also allows one to measure several physical parameters simultaneously; while one instrument measures nano-scale topography, another instrument can measure mechanical, electrical, or thermal properties, making it a lab-on-an-instrument. In this paper, a proof of principle of such a parallel AFM instrument has been demonstrated by analyzing the topography of large samples such as semiconductor wafers. This nanoinstrument provides new research opportunities in the nanometrology of wafers and nanolithography masks by enabling real die-to-die and wafer-level measurements and in cell biology by measuring the nano-scale properties of a large number of cells.

  4. Software Switching for High Throughput Data Acquisition Networks

    CERN Document Server

    AUTHOR|(CDS)2089787; Lehmann Miotto, Giovanna

    The bursty many-to-one communication pattern, typical for data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. The problem arising from this pattern is widely known in the literature as incast and can be observed as TCP throughput collapse. It is a result of overloading the switch buffers, when a specific node in a network requests data from multiple sources. This will become even more demanding for future upgrades of the experiments at the Large Hadron Collider at CERN. It is questionable whether commodity TCP/IP and Ethernet technologies in their current form will be still able to effectively adapt to bursty traffic without losing packets due to the scarcity of buffers in the networking hardware. This thesis provides an analysis of TCP/IP performance in data acquisition networks and presents a novel approach to incast congestion in these networks based on software-based packet forwarding. Our first contribution lies in confirming the strong analogies bet...

  5. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach enabling breeders to use genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  6. Preselection of shotgun clones by oligonucleotide fingerprinting: an efficient and high throughput strategy to reduce redundancy in large-scale sequencing projects

    National Research Council Canada - National Science Library

    Radelof, U; Hennig, S; Seranski, P; Steinfath, M; Ramser, J; Reinhardt, R; Poustka, A; Francis, F; Lehrach, H

    1998-01-01

    .... To reduce the overall effort and cost of those projects and to accelerate the sequencing throughput, we have developed an efficient, high throughput oligonucleotide fingerprinting protocol to select...

  7. High Throughput Sequencing of Extracellular RNA from Human Plasma.

    Directory of Open Access Journals (Sweden)

    Kirsty M Danielson

    Full Text Available The presence and relative stability of extracellular RNAs (exRNAs in biofluids has led to an emerging recognition of their promise as 'liquid biopsies' for diseases. Most prior studies on discovery of exRNAs as disease-specific biomarkers have focused on microRNAs (miRNAs using technologies such as qRT-PCR and microarrays. The recent application of next-generation sequencing to discovery of exRNA biomarkers has revealed the presence of potential novel miRNAs as well as other RNA species such as tRNAs, snoRNAs, piRNAs and lncRNAs in biofluids. At the same time, the use of RNA sequencing for biofluids poses unique challenges, including low amounts of input RNAs, the presence of exRNAs in different compartments with varying degrees of vulnerability to isolation techniques, and the high abundance of specific RNA species (thereby limiting the sensitivity of detection of less abundant species. Moreover, discovery in human diseases often relies on archival biospecimens of varying age and limiting amounts of samples. In this study, we have tested RNA isolation methods to optimize profiling exRNAs by RNA sequencing in individuals without any known diseases. Our findings are consistent with other recent studies that detect microRNAs and ribosomal RNAs as the major exRNA species in plasma. Similar to other recent studies, we found that the landscape of biofluid microRNA transcriptome is dominated by several abundant microRNAs that appear to comprise conserved extracellular miRNAs. There is reasonable correlation of sets of conserved miRNAs across biological replicates, and even across other data sets obtained at different investigative sites. Conversely, the detection of less abundant miRNAs is far more dependent on the exact methodology of RNA isolation and profiling. This study highlights the challenges in detecting and quantifying less abundant plasma miRNAs in health and disease using RNA sequencing platforms.

  8. Recent progress using high-throughput sequencing technologies in plant molecular breeding.

    Science.gov (United States)

    Gao, Qiang; Yue, Guidong; Li, Wenqi; Wang, Junyi; Xu, Jiaohui; Yin, Ye

    2012-04-01

    High-throughput sequencing is a revolutionary technological innovation in DNA sequencing. This technology has an ultra-low cost per base of sequencing and an overwhelmingly high data output. High-throughput sequencing has brought novel research methods and solutions to the research fields of genomics and post-genomics. Furthermore, this technology is leading to a new molecular breeding revolution that has landmark significance for scientific research and enables us to launch multi-level, multi-faceted, and multi-extent studies in the fields of crop genetics, genomics, and crop breeding. In this paper, we review progress in the application of high-throughput sequencing technologies to plant molecular breeding studies. © 2012 Institute of Botany, Chinese Academy of Sciences.

  9. Repeated Assessment by High-Throughput Assay Demonstrates that Sperm DNA Methylation Levels Are Highly Reproducible

    Science.gov (United States)

    Cortessis, Victoria K.; Siegmund, Kimberly; Houshdaran, Sahar; Laird, Peter W.; Sokol, Rebecca Z.

    2011-01-01

    Objective To assess the reliability of a high-throughput assay of sperm DNA methylation. Design Observational study comparing DNA methylation of sperm isolated from three divided and twelve longitudinally collected semen samples. Setting Academic medical center. Patients One man undergoing screening semen analysis during evaluation of the infertile couple and two healthy fertile male volunteers. Interventions Spermatozoa were separated from seminal plasma and somatic cells using gradient separation. DNA was extracted from spermatozoa, and DNA methylation was assessed at 1,505 DNA-sequence-specific sites. Main Outcome Measures Repeatability of sperm DNA methylation measures, estimated by correlation coefficients. Results DNA methylation levels were highly correlated within matched sets of divided samples (all r≥0.97) and longitudinal samples (average r=0.97). Conclusions The described methodology reliably assesses methylation of sperm DNA at large numbers of sites. Methylation profiles were consistent over time. High-throughput assessment of sperm DNA methylation is a promising tool for studying the role of epigenetic state in male fertility. PMID:22035967
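    The repeatability measure reported here is simply a correlation coefficient between replicate methylation profiles; a minimal sketch with invented beta values (fraction methylated per CpG site), not data from the study:

```python
import numpy as np

def repeatability(beta_a, beta_b):
    """Pearson correlation between two replicate methylation profiles
    (beta values in [0, 1], one entry per assayed CpG site)."""
    return float(np.corrcoef(beta_a, beta_b)[0, 1])

# Invented beta values at 8 CpG sites from a divided semen sample
a = np.array([0.05, 0.90, 0.45, 0.10, 0.80, 0.33, 0.60, 0.15])
b = np.array([0.06, 0.88, 0.47, 0.09, 0.82, 0.30, 0.58, 0.17])
r = repeatability(a, b)
```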

  10. High-Throughput Continuous Hydrothermal Synthesis of Transparent Conducting Aluminum and Gallium Co-doped Zinc Oxides.

    Science.gov (United States)

    Howard, Dougal P; Marchand, Peter; McCafferty, Liam; Carmalt, Claire J; Parkin, Ivan P; Darr, Jawwad A

    2017-04-10

    High-throughput continuous hydrothermal flow synthesis was used to generate a library of aluminum- and gallium-codoped zinc oxide nanoparticles of specific atomic ratios. Resistivities of the materials were determined by Hall effect measurements on heat-treated pressed discs and the results collated into a conductivity-composition map. Optimal resistivities of ∼9 × 10⁻³ Ω cm were reproducibly achieved for several samples, for example, codoped ZnO with 2 at% Ga and 1 at% Al. The optimum sample on balance of performance and cost was deemed to be ZnO codoped with 3 at% Al and 1 at% Ga.

  11. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for studies of biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup, so their throughput is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, where 25 or more runs can be conducted per day. The cost and effort needed to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has increased the throughput of data, which has been leveraged into grant support, helped attract new faculty hires, and led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  12. High throughput microplate respiratory measurements using minimal quantities of isolated mitochondria.

    Directory of Open Access Journals (Sweden)

    George W Rogers

    Full Text Available Recently developed technologies have enabled multi-well measurement of O2 consumption, increasing the rate of mitochondrial research, particularly regarding the mechanism of action of drugs and proteins that modulate metabolism. Among these technologies, the Seahorse XF24 Analyzer was designed for use with intact cells attached in a monolayer to a multi-well tissue culture plate. In order to have a high-throughput assay system in which both energy demand and substrate availability can be tightly controlled, we have developed a protocol to expand the application of the XF24 Analyzer to include isolated mitochondria. Acquisition of optimal rates requires assay conditions that are unexpectedly distinct from those of conventional polarography. The optimized conditions, derived from experiments with isolated mouse liver mitochondria, allow multi-well assessment of rates of respiration and proton production by mitochondria attached to the bottom of the XF assay plate, and require extremely small quantities of material (1-10 µg of mitochondrial protein per well). Sequential measurement of basal, State 3, State 4, and uncoupler-stimulated respiration can be made in each well through additions of reagents from the injection ports. We describe optimization and validation of this technique using isolated mouse liver and rat heart mitochondria, and apply the approach to discover that inclusion of phosphatase inhibitors in the preparation of the heart mitochondria results in a specific decrease in rates of Complex I-dependent respiration. We believe this new technique will be particularly useful for drug screening and for generating previously unobtainable respiratory data on small mitochondrial samples.
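    State 3 and State 4 rates measured this way are typically summarized as a respiratory control ratio (RCR); a short sketch with invented oxygen consumption rates (not values from the study):

```python
def respiratory_control_ratio(state3, state4):
    """RCR = State 3 (ADP-stimulated) / State 4 (post-ADP) respiration.
    Higher values indicate better-coupled mitochondria."""
    return state3 / state4

# Invented oxygen consumption rates (pmol O2/min per µg protein) from one well
ocr = {"basal": 50.0, "state3": 400.0, "state4": 80.0, "uncoupled": 450.0}
rcr = respiratory_control_ratio(ocr["state3"], ocr["state4"])
```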

  13. A High-Throughput Hardware Architecture for the H.264/AVC Half-Pixel Motion Estimation Targeting High-Definition Videos

    Directory of Open Access Journals (Sweden)

    Marcel M. Corrêa

    2011-01-01

    Full Text Available This paper presents a high-performance hardware architecture for the H.264/AVC Half-Pixel Motion Estimation that targets high-definition videos. This design can process very high-definition videos like QHDTV (3840×2048) in real time (30 frames per second). It also presents an optimized arrangement of interpolated samples, which is the main key to achieving an efficient search. The interpolation process is interleaved with the SAD calculation and comparison, enabling the high throughput. The architecture was fully described in VHDL, synthesized for two different Xilinx FPGA devices, and it achieved very good results when compared to related works.

  14. High-Throughput Metabolic Profiling of Soybean Leaves by Fourier Transform Ion Cyclotron Resonance Mass Spectrometry.

    Science.gov (United States)

    Yilmaz, Ali; Rudolph, Heather L; Hurst, Jerod J; Wood, Troy D

    2016-01-19

    As a relatively recent research field, plant metabolomics has gained increasing interest in the past few years and has been applied to answer biological questions through large-scale qualitative and quantitative analyses of the plant metabolome. The combination of sensitivity and selectivity offered by mass spectrometry (MS) for measurement of many metabolites in a single shot makes it an indispensable platform in metabolomics. In this regard, Fourier-transform ion cyclotron resonance (FTICR) has the unique advantage of delivering high mass resolving power and mass accuracy simultaneously, making it ideal for the study of complex mixtures such as plant extracts. Here we optimize soybean leaf extraction methods compatible with high-throughput reproducible MS-based metabolomics. In addition, matrix-assisted laser desorption ionization (MALDI) and direct LDI of soybean leaves are compared for metabolite profiling. The extraction method combined with electrospray (ESI)-FTICR is supported by the significant reduction of chlorophyll and its related metabolites as the growing season moves from midsummer to the autumn harvest day. To our knowledge, this is the first time ESI-FTICR MS and MALDI-FTICR MS have been used in a complementary manner for the metabolic profiling of plant leaves collected at different time points during the growing season.
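    FTICR's combination of resolving power and mass accuracy is usually quantified as a parts-per-million error against a theoretical m/z when annotating metabolites; a one-function sketch with illustrative (invented) values:

```python
def ppm_error(measured_mz, theoretical_mz):
    """Relative mass error in parts per million (ppm)."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Illustrative measured vs. theoretical m/z for a metabolite ion (invented values)
err = ppm_error(measured_mz=893.5430, theoretical_mz=893.5426)
```

    Sub-ppm errors of this kind are what allow elemental-formula assignment directly from accurate mass.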

  15. Microfabrication of a High-Throughput Nanochannel Delivery/Filtration System

    Science.gov (United States)

    Ferrari, Mauro; Liu, Xuewu; Grattoni, Alessandro; Fine, Daniel; Hosali, Sharath; Goodall, Randi; Medema, Ryan; Hudson, Lee

    2011-01-01

    A microfabrication process is proposed to produce a nanopore membrane for continuous passive drug release to maintain constant drug concentrations in the patient's blood throughout the delivery period. Based on silicon microfabrication technology, the dimensions of the nanochannel area, as well as the microchannel area, can be precisely controlled, thus providing a steady, constant drug release rate within an extended time period. The multilayered nanochannel structures extend the release rate range of a single-layer nanochannel system, and allow a wide range of pre-defined porosity to achieve any arbitrary drug release rate using any preferred nanochannel size. This membrane system could also be applied to molecular filtration or isolation. In this case, the nanochannel length can be reduced to the nanofabrication limit, i.e., tens of nanometers. The nanochannel delivery system membrane is composed of a sandwich of a thin top layer, the horizontal nanochannels, and a thicker bottom wafer. The thin top layer houses an array of microchannels that offers the inlet port for diffusing molecules. It also works as a lid for the nanochannels by providing the channels a top surface. The nanochannels are fabricated by a sacrificial layer technique that obtains smooth surfaces and precisely controlled dimensions. The structure of this nanopore membrane is optimized to yield high mechanical strength and high throughput.
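    The constant (zero-order) release rate of such a channel can be estimated from Fick's first law at steady state, assuming a saturated reservoir and a sink condition at the outlet; a sketch with invented parameter values, not figures from the paper:

```python
def zero_order_release_rate(D, area, concentration, length):
    """Steady-state diffusive release through a channel (Fick's first law):
    rate = D * A * C / L. Units here: D in cm^2/s, A in cm^2,
    C in mg/cm^3, L in cm, giving a rate in mg/s."""
    return D * area * concentration / length

# Invented, illustrative parameters for a single nanochannel membrane
rate = zero_order_release_rate(D=1e-6, area=1e-4, concentration=100.0, length=1e-4)
```

    Because the rate scales with A/L, stacking nanochannel layers (changing L) or tuning porosity (changing A) gives the pre-defined release rates the abstract describes.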

  16. An ultra-compact, high-throughput molecular beam epitaxy growth system

    Energy Technology Data Exchange (ETDEWEB)

    Baker, A. A.; Hesjedal, T. [Clarendon Laboratory, Department of Physics, University of Oxford, Oxford OX1 3PU (United Kingdom); Diamond Light Source, Didcot OX11 0DE (United Kingdom); Braun, W., E-mail: w.braun@fkf.mpg.de, E-mail: fischer@createc.de; Rembold, S.; Fischer, A., E-mail: w.braun@fkf.mpg.de, E-mail: fischer@createc.de [CreaTec Fischer and Co. GmbH, Industriestr. 9, 74391 Erligheim (Germany); Gassler, G. [Dr. Gassler Electron Devices GmbH, List Str. 4, 89079 Ulm (Germany)

    2015-04-15

    We present a miniaturized molecular beam epitaxy (miniMBE) system with an outer diameter of 206 mm, optimized for flexible and high-throughput operation. The three-chamber system, used here for oxide growth, consists of a sample loading chamber, a storage chamber, and a growth chamber. The growth chamber is equipped with eight identical effusion cell ports with linear shutters, one larger port for either a multi-pocket electron beam evaporator or an oxygen plasma source, an integrated cryoshroud, retractable beam-flux monitor or quartz-crystal microbalance, reflection high energy electron diffraction, substrate manipulator, main shutter, and quadrupole mass spectrometer. The system can be combined with ultrahigh vacuum (UHV) end stations on synchrotron and neutron beamlines, or equivalently with other complex surface analysis systems, including low-temperature scanning probe microscopy systems. Substrate handling is compatible with most UHV surface characterization systems, as the miniMBE can accommodate standard surface science sample holders. We introduce the design of the system, and its specific capabilities and operational parameters, and we demonstrate the epitaxial thin film growth of magnetoelectric Cr₂O₃ on c-plane sapphire and ferrimagnetic Fe₃O₄ on MgO (001).

  17. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality-control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use of these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
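    Quality control of arrayed screening data of the kind discussed here commonly starts with the Z'-factor computed from a plate's control wells; a short sketch with invented readouts:

```python
import numpy as np

def z_prime(pos, neg):
    """Z'-factor plate-quality metric from positive/negative control wells:
    1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 indicate a robust assay window."""
    pos = np.asarray(pos, dtype=float)
    neg = np.asarray(neg, dtype=float)
    return 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Invented control-well readouts from one plate
positives = [100, 102, 98, 101, 99]
negatives = [10, 12, 9, 11, 8]
z = z_prime(positives, negatives)
```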

  18. High-throughput high-resolution class I HLA genotyping in East Africa.

    Directory of Open Access Journals (Sweden)

    Rebecca N Koehler

    Full Text Available HLA, the most genetically diverse loci in the human genome, play a crucial role in host-pathogen interaction by mediating innate and adaptive cellular immune responses. A vast number of infectious diseases affect East Africa, including HIV/AIDS, malaria, and tuberculosis, but the HLA genetic diversity in this region remains incompletely described. This is a major obstacle for the design and evaluation of preventive vaccines. Available HLA typing techniques that provide the 4-digit resolution needed to interpret immune responses lack sufficient throughput for large immunoepidemiological studies. Here we present a novel HLA typing assay bridging the gap between high resolution and high throughput. The assay is based on real-time PCR using sequence-specific primers (SSP) and can genotype carriers of the 49 most common East African class I HLA-A, -B, and -C alleles at the 4-digit level. Using a validation panel of 175 samples from Kampala, Uganda, previously defined by sequence-based typing, the new assay performed with 100% sensitivity and specificity. The assay was also used to define the HLA genetic complexity of a previously uncharacterized Tanzanian population, demonstrating its inclusion in the major East African genetic cluster. The availability of genotyping tools with this capacity will be extremely useful in the identification of correlates of immune protection and the evaluation of candidate vaccine efficacy.

  19. The sva package for removing batch effects and other unwanted variation in high-throughput experiments.

    Science.gov (United States)

    Leek, Jeffrey T; Johnson, W Evan; Parker, Hilary S; Jaffe, Andrew E; Storey, John D

    2012-03-15

    Heterogeneity and latent variables are now widely recognized as major sources of bias and variability in high-throughput experiments. The best-known source of latent variation in genomic experiments is batch effects: when samples are processed on different days, in different groups, or by different people. However, there are also a large number of other variables that may have a major impact on high-throughput measurements. Here we describe the sva package for identifying, estimating and removing unwanted sources of variation in high-throughput experiments. The sva package supports surrogate variable estimation with the sva function, direct adjustment for known batch effects with the ComBat function, and adjustment for batch and latent variables in prediction problems with the fsva function.
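    The sva package itself is an R/Bioconductor tool; purely as a conceptual Python illustration (not the package's algorithm, which adds surrogate variable estimation and empirical Bayes shrinkage), the simplest location-only batch adjustment looks like this:

```python
import numpy as np

def center_batches(X, batches):
    """Minimal batch adjustment: for each feature (row), subtract each
    batch's mean and add back the grand mean, so batches share a common
    location. A toy, location-only analogue of ComBat-style correction."""
    X = np.asarray(X, dtype=float)
    out = X.copy()
    grand = X.mean(axis=1, keepdims=True)
    for b in set(batches):
        idx = [i for i, v in enumerate(batches) if v == b]
        out[:, idx] -= X[:, idx].mean(axis=1, keepdims=True) - grand
    return out

# One feature measured in two batches, with a +5 shift in batch "B"
X = np.array([[1.0, 2.0, 3.0, 6.0, 7.0, 8.0]])
adj = center_batches(X, ["A", "A", "A", "B", "B", "B"])
```

    After adjustment the two batch means coincide while the grand mean is preserved; real tools additionally protect biological covariates from being removed along with the batch signal.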

  20. Plant phenomics and high-throughput phenotyping: accelerating rice functional genomics using multidisciplinary technologies.

    Science.gov (United States)

    Yang, Wanneng; Duan, Lingfeng; Chen, Guoxing; Xiong, Lizhong; Liu, Qian

    2013-05-01

    The functional analysis of the rice genome has entered into a high-throughput stage, and a project named RICE2020 has been proposed to determine the function of every gene in the rice genome by the year 2020. However, as compared with the robustness of genetic techniques, the evaluation of rice phenotypic traits is still performed manually, and the process is subjective, inefficient, destructive and error-prone. To overcome these limitations and help rice phenomics more closely parallel rice genomics, reliable, automatic, multifunctional, and high-throughput phenotyping platforms should be developed. In this article, we discuss the key plant phenotyping technologies, particularly photonics-based technologies, and then introduce their current applications in rice (wheat or barley) phenomics. We also note the major challenges in rice phenomics and are confident that these reliable high-throughput phenotyping tools will give plant scientists new perspectives on the information encoded in the rice genome. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. The AFLOW Standard for High-throughput Materials Science Calculations

    Science.gov (United States)

    2015-01-01

    Engineering, Physics and Chemistry, Duke University, Durham, NC 27708, USA. Article history: Received 31 May 2015; received in revised … with the potentials provided with the DFT software package. In VASP, these include Ultra-Soft Pseudopotentials (USPP) [22,23] and Projector-Augmented … intensive reciprocal space. This approach is prone to aliasing errors, and requires the optimization of real-space projectors if these are to be …

  2. Protocol: A high-throughput DNA extraction system suitable for conifers

    Directory of Open Access Journals (Sweden)

    Rajora Om P

    2008-08-01

    Full Text Available Abstract Background High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. Results We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve changes to the buffer system and improvements that almost double the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP) and another for high-throughput (HTP) DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce species and a pine species. Conclusion A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica-spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost-effective solution for processing large numbers of conifer samples.

  3. High throughput discovery of families of high activity WGS catalysts: part I--history and methodology.

    Science.gov (United States)

    Yaccato, Karin; Carhart, Ray; Hagemeyer, Alfred; Herrmann, Michael; Lesik, Andreas; Strasser, Peter; Volpe, Anthony; Turner, Howard; Weinberg, Henry; Grasselli, Robert K; Brooks, Christopher J; Pigos, John M

    2010-05-01

    State-of-the-art water gas shift catalysts (FeCr for high-temperature shift and CuZn for low-temperature shift) are not active enough to be used in fuel processors for the production of hydrogen from hydrocarbon fuels for fuel cells. The need for drastically lower catalyst volumes has triggered a search for novel WGS catalysts that are an order of magnitude more active than current systems. Novel catalytic materials for the high-, medium- and low-temperature water gas shift reactions have been discovered by application of combinatorial methodologies. Catalyst libraries were synthesized on 4 inch wafers in 16 × 16 arrays and screened in a high throughput scanning mass spectrometer in the temperature range 200 °C to 400 °C. More than 200 wafers were screened under various conditions and more than 250,000 experiments were conducted to comprehensively examine catalyst performance for various binary, ternary and higher-order compositions.

  4. Rapid and high-throughput detection of highly pathogenic bacteria by Ibis PLEX-ID technology.

    Directory of Open Access Journals (Sweden)

    Daniela Jacob

    Full Text Available In this manuscript, we describe the identification of highly pathogenic bacteria using an assay coupling biothreat group-specific PCR with electrospray ionization mass spectrometry (PCR/ESI-MS) run on an Ibis PLEX-ID high-throughput platform. The biothreat cluster assay identifies most of the potential bioterrorism-relevant microorganisms, including Bacillus anthracis, Francisella tularensis, Yersinia pestis, Burkholderia mallei and pseudomallei, Brucella species, and Coxiella burnetii. DNA from 45 different reference materials with different formulations and different concentrations was chosen and sent to a service screening laboratory that uses the PCR/ESI-MS platform to provide a microbial identification service. The standard reference materials were produced from a repository built up in the framework of the EU-funded project "Establishment of Quality Assurances for Detection of Highly Pathogenic Bacteria of Potential Bioterrorism Risk" (EQADeBa). All samples were correctly identified at least to the genus level.

  5. Complementing high-throughput X-ray powder diffraction data with quantum-chemical calculations

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; van de Streek, Jacco; Rantanen, Jukka

    2012-01-01

    High-throughput crystallisation and characterisation platforms provide an efficient means to carry out solid-form screening during the pre-formulation phase. To determine the crystal structures of identified new solid phases, however, usually requires independent crystallisation trials to produce … obtained only during high-energy processing such as spray drying or milling …

  6. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  7. Accelerating the design of solar thermal fuel materials through high throughput simulations.

    Science.gov (United States)

    Liu, Yun; Grossman, Jeffrey C

    2014-12-10

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  8. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting under high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
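    A fluorescence lifetime is extracted from a TCSPC histogram by fitting an exponential decay to the photon arrival times; a noiseless, illustrative sketch (real data would need Poisson weighting and instrument-response deconvolution):

```python
import numpy as np

# Simulated TCSPC histogram: monoexponential decay with lifetime tau = 2.0 ns
tau_true = 2.0
t = np.linspace(0, 10, 200)            # time bins (ns)
counts = 1000.0 * np.exp(-t / tau_true)

# Log-linear least-squares fit: ln(counts) = ln(A) - t/tau
slope, intercept = np.polyfit(t, np.log(counts), 1)
tau_fit = -1.0 / slope
```

    Discriminating positive from negative samples then reduces to thresholding the fitted lifetime, which is what makes the readout robust to intensity fluctuations between droplets.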

  9. High-Throughput 3D Tumor Culture in a Recyclable Microfluidic Platform.

    Science.gov (United States)

    Liu, Wenming; Wang, Jinyi

    2017-01-01

    Three-dimensional (3D) tumor culture miniaturized platforms are of importance to biomimetic model construction and pathophysiological studies. Controllable and high-throughput production of 3D tumors is desirable to make cell-based manipulation dynamic and efficient at micro-scale. Moreover, the 3D culture platform being reusable is convenient to research scholars. In this chapter, we describe a dynamically controlled 3D tumor manipulation and culture method using pneumatic microstructure-based microfluidics, which has potential applications in the fields of tissue engineering, tumor biology, and clinical medicine in a high-throughput way.

  10. Applications of high-throughput plant phenotyping to study nutrient use efficiency.

    Science.gov (United States)

    Berger, Bettina; de Regt, Bas; Tester, Mark

    2013-01-01

    Remote sensing and spectral reflectance measurements of plants have long been used to assess the growth and nutrient status of plants in a noninvasive manner. With improved imaging and computer technologies, these approaches can now be used at high throughput for more extensive physiological and genetic studies. Here, we present an example of how high-throughput imaging can be used to study the growth of plants exposed to different nutrient levels. In addition, the color of the leaves can be used to estimate the leaf chlorophyll and nitrogen status of the plant.
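    One common way to turn leaf color into a chlorophyll/nitrogen proxy is a greenness index such as Excess Green (ExG) computed on normalized RGB values; a minimal sketch with invented pixel values:

```python
def excess_green(r, g, b):
    """Excess Green index ExG = 2g - r - b on chromatic coordinates
    (each channel divided by r+g+b), a standard RGB greenness proxy."""
    total = r + g + b
    r, g, b = r / total, g / total, b / total
    return 2 * g - r - b

# Invented pixels: a green healthy-leaf pixel vs. a yellowing, deficient one
healthy = excess_green(60, 140, 50)
yellowing = excess_green(150, 140, 60)
```

    Averaging such an index over the segmented plant region gives a per-plant greenness score that can be tracked across nutrient treatments.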

  11. A platform for high-throughput screening of DNA-encoded catalyst libraries in organic solvents.

    Science.gov (United States)

    Hook, K Delaney; Chambers, John T; Hili, Ryan

    2017-10-01

    We have developed a novel high-throughput screening platform for the discovery of small-molecule catalysts for bond-forming reactions. The method employs an in vitro selection for bond formation using amphiphilic DNA-encoded small molecules charged with reaction substrate, which enables selections to be conducted in a variety of organic or aqueous solvents. Using the amine-catalysed aldol reaction as a catalytic model and high-throughput DNA sequencing as a selection read-out, we demonstrate the 1200-fold enrichment of a known aldol catalyst from a library of 16.7 million uncompetitive library members.
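    The fold enrichment quoted above is computed from sequencing read fractions before and after selection; a sketch with invented counts chosen to reproduce a 1200-fold figure:

```python
def fold_enrichment(pre_count, pre_total, post_count, post_total):
    """Enrichment of one library member: its read fraction after selection
    divided by its read fraction before selection."""
    return (post_count / post_total) / (pre_count / pre_total)

# Invented sequencing counts for one catalyst-encoding DNA barcode
fe = fold_enrichment(pre_count=10, pre_total=1_000_000,
                     post_count=12_000, post_total=1_000_000)
```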

  12. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    Science.gov (United States)

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can rapidly extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially-processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  13. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    DEFF Research Database (Denmark)

    Csiszar, Susan A.; Ernstoff, Alexi; Fantke, Peter

    2016-01-01

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or population per mass … intakes were associated with body lotion. Bioactive doses derived from high-throughput in vitro toxicity data were combined with the estimated PiFs to demonstrate an approach to estimate bioactive equivalent chemical content and to screen chemicals for risk.

  14. HT-Paxos: High Throughput State-Machine Replication Protocol for Large Clustered Data Centers

    Directory of Open Access Journals (Sweden)

    Vinit Kumar

    2015-01-01

    Full Text Available Paxos is a prominent theory of state-machine replication. Recent data-intensive systems that implement state-machine replication generally require high throughput. Earlier versions of Paxos, such as classical Paxos, fast Paxos, and generalized Paxos, focus mainly on fault tolerance and latency but fall short in throughput and scalability. A major reason for this is the heavyweight leader; by offloading the leader, we can further increase the throughput of the system. Ring Paxos, Multiring Paxos, and S-Paxos are a few prominent attempts in this direction for clustered data centers. In this paper, we propose HT-Paxos, a variant of Paxos that is well suited to any large clustered data center. HT-Paxos offloads the leader much more significantly and hence increases the throughput and scalability of the system, while at the same time, among high-throughput state-machine replication protocols, it provides reasonably low latency and response time.

  15. High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field.

    Science.gov (United States)

    Shakoor, Nadia; Lee, Scott; Mockler, Todd C

    2017-08-01

    Effective implementation of technology that facilitates accurate and high-throughput screening of thousands of field-grown lines is critical for accelerating crop improvement and breeding strategies for higher yield and disease tolerance. Progress in the development of field-based high throughput phenotyping methods has advanced considerably in the last 10 years through technological progress in sensor development and high-performance computing. Here, we review recent advances in high throughput field phenotyping technologies designed to inform the genetics of quantitative traits, including crop yield and disease tolerance. Successful application of phenotyping platforms to advance crop breeding and identify and monitor disease requires: (1) high resolution of imaging and environmental sensors; (2) quality data products that facilitate computer vision, machine learning and GIS; (3) capacity infrastructure for data management and analysis; and (4) automated environmental data collection. Accelerated breeding for agriculturally relevant crop traits is key to the development of improved varieties and is critically dependent on high-resolution, high-throughput field-scale phenotyping technologies that can efficiently discriminate better performing lines within a larger population and across multiple environments. Copyright © 2017. Published by Elsevier Ltd.

  16. Liquid Phase Multiplex High-Throughput Screening of Metagenomic Libraries Using p-Nitrophenyl-Linked Substrates for Accessory Lignocellulosic Enzymes.

    Science.gov (United States)

    Smart, Mariette; Huddy, Robert J; Cowan, Don A; Trindade, Marla

    2017-01-01

    To access the genetic potential contained in large metagenomic libraries, suitable high-throughput functional screening methods are required. Here we describe a high-throughput screening approach which enables the rapid identification of metagenomic library clones expressing functional accessory lignocellulosic enzymes. The high-throughput nature of this method hinges on the multiplexing of both the E. coli metagenomic library clones and the colorimetric p-nitrophenyl linked substrates which allows for the simultaneous screening for β-glucosidases, β-xylosidases, and α-L-arabinofuranosidases. This method is readily automated and compatible with high-throughput robotic screening systems.

  17. How to design a good photoresist solvent package using solubility parameters and high-throughput research

    Science.gov (United States)

    Tate, Michael P.; Cutler, Charlotte; Sakillaris, Mike; Kaufman, Michael; Estelle, Thomas; Mohler, Carol; Tucker, Chris; Thackeray, Jim

    2014-03-01

    Understanding fundamental properties of photoresists and how interactions between photoresist components affect performance targets is crucial to the continued success of photoresists. More specifically, polymer solubility is critical to the overall performance capability of the photoresist formulation. While several theories describe polymer-solvent solubility, the most common industrially applied method is Hansen solubility parameters. Hansen's method, based on regular solution theory, describes a solute's ability to dissolve in a solvent or solvent blend using four physical properties determined experimentally through regression of solubility data in many known solvents. The four physical parameters are dispersion, polarity, hydrogen bonding, and radius of interaction. Using these parameters, a relative cohesive energy difference (RED), which describes a polymer's likelihood to dissolve in a given solvent blend, may be calculated. Leveraging a high throughput workflow to prepare and analyze the thousands of samples necessary to calculate Hansen solubility parameters for many different methacrylate-based polymers, we compare the physical descriptors to reveal a large range of polarities and hydrogen bonding. Further, we find that Hansen's model correctly predicts the soluble/insoluble state of 3-component solvent blends where the dispersion, polar, hydrogen-bonding, and radius of interaction values were determined through regression of experimental values. These modeling capabilities have allowed for optimization of photoresist solubility from initial blending through application, providing valuable insights into the nature of the photoresist.
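As a rough sketch of the Hansen machinery the abstract describes, the distance Ra and the RED for a solvent blend can be computed as follows. The polymer and solvent parameter values are invented for illustration; only the standard Hansen formulas (blend parameters as volume-fraction-weighted averages, Ra² = 4(δD₁−δD₂)² + (δP₁−δP₂)² + (δH₁−δH₂)², RED = Ra/R0) are taken as given:

```python
# Hansen solubility sketch: RED < 1 predicts the polymer dissolves.
# All numeric parameter values below are hypothetical.
import math

def blend_parameters(solvents, volume_fractions):
    """Volume-fraction-weighted Hansen parameters (dispersion, polar, H-bond)."""
    return tuple(
        sum(f * s[i] for s, f in zip(solvents, volume_fractions))
        for i in range(3)
    )

def ra(polymer, solvent):
    """Hansen distance: Ra^2 = 4*(dD1-dD2)^2 + (dP1-dP2)^2 + (dH1-dH2)^2."""
    dd = polymer[0] - solvent[0]
    dp = polymer[1] - solvent[1]
    dh = polymer[2] - solvent[2]
    return math.sqrt(4 * dd**2 + dp**2 + dh**2)

def red(polymer, r0, solvent):
    """Relative (cohesive) energy difference relative to the radius R0."""
    return ra(polymer, solvent) / r0

# Hypothetical methacrylate polymer (dD, dP, dH in MPa^0.5) and its R0:
polymer, r0 = (17.0, 9.0, 7.0), 8.0
# Hypothetical 70/30 two-solvent blend:
blend = blend_parameters([(15.6, 5.6, 9.8), (16.0, 7.6, 12.5)], [0.7, 0.3])
print(red(polymer, r0, blend))  # < 1 -> predicted soluble
```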

  18. Sensitivity and accuracy of high-throughput metabarcoding methods for early detection of invasive fish species

    Science.gov (United States)

    Hatzenbuhler, Chelsea; Kelly, John R.; Martinson, John; Okum, Sara; Pilgrim, Erik

    2017-04-01

    High-throughput DNA metabarcoding has gained recognition as a potentially powerful tool for biomonitoring, including early detection of aquatic invasive species (AIS). DNA-based techniques are advancing, but our understanding of the limits to detection for metabarcoding complex samples is inadequate. For detecting AIS at an early stage of invasion when the species is rare, accuracy at low detection limits is key. To evaluate the utility of metabarcoding in future fish community monitoring programs, we conducted several experiments to determine the sensitivity and accuracy of routine metabarcoding methods. Experimental mixes used larval fish tissue from multiple “common” species spiked with varying proportions of tissue from an additional “rare” species. Pyrosequencing of the genetic marker COI (cytochrome c oxidase subunit I) and subsequent sequence data analysis provided experimental evidence of low-level detection of the target “rare” species at biomass percentages as low as 0.02% of total sample biomass. Limits to detection varied interspecifically and were susceptible to amplification bias. Moreover, results showed some data processing methods can skew sequence-based biodiversity measurements from corresponding relative biomass abundances and increase false absences. We suggest caution in interpreting presence/absence and relative abundance in larval fish assemblages until metabarcoding methods are optimized for accuracy and precision.

  19. Continuous high throughput molecular adhesion based cell sorting using ridged microchannels

    Science.gov (United States)

    Tasadduq, Bushra; Wang, Gonghao; Alexeev, Alexander; Sarioglu, Ali Fatih; Sulchek, Todd

    2016-11-01

    Cell molecular interactions govern important physiological processes such as stem cell homing, inflammation, and cancer metastasis, but a lack of separation technologies selective to these interactions makes it challenging to sort cells by their molecular properties. Label-free separation techniques based on size, stiffness, and shape do not provide enough specificity to cell type or correlation with clinical condition. We propose a novel microfluidic device capable of high-throughput, molecule-dependent separation of cells by flowing them through a microchannel decorated with ridges coated with a specific molecule. The unique aspect of this sorting design is an optimized gap size: small enough to lightly squeeze the cells flowing under the ridged part of the channel, increasing the surface area for interaction between the ligand on the cell surface and the coated receptor molecule, yet large enough that biomechanical markers, stiffness and viscoelasticity, do not dominate the separation mechanism. We are able to separate Jurkat cells based on their expression of the PSGL-1 ligand using a ridged channel coated with P-selectin at a flow rate of 0.045 ml/min, achieving 2-fold and 5-fold enrichment of PSGL-1-positive and PSGL-1-negative Jurkat cells, respectively.

  20. FPGA Implementation of Stereo Disparity with High Throughput for Mobility Applications

    Science.gov (United States)

    Villalpando, Carlos Y.; Morfopolous, Arin; Matthies, Larry; Goldberg, Steven

    2011-01-01

    High speed stereo vision can allow unmanned robotic systems to navigate safely in unstructured terrain, but the computational cost can exceed the capacity of typical embedded CPUs. In this paper, we describe an end-to-end stereo computation co-processing system optimized for fast throughput that has been implemented on a single Virtex 4 LX160 FPGA. This system is capable of operating on images from a 1024 x 768 3CCD (true RGB) camera pair at 15 Hz. Data enters the FPGA directly from the cameras via Camera Link and is rectified, pre-filtered and converted into a disparity image all within the FPGA, incurring no CPU load. Once complete, a rectified image and the final disparity image are read out over the PCI bus, for a bandwidth cost of 68 MB/sec. Within the FPGA there are 4 distinct algorithms: Camera Link capture, bilinear rectification, bilateral subtraction pre-filtering and the Sum of Absolute Differences (SAD) disparity. Each module is described in brief along with the data flow and control logic for the system. The system was successfully fielded on Carnegie Mellon University's National Robotics Engineering Center (NREC) Crusher system during extensive field trials in 2007 and 2008 and is being implemented for other surface mobility systems at JPL.
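The SAD disparity stage named above is a standard block-matching algorithm. A minimal software sketch (NumPy, brute force, far slower than the FPGA pipeline and not the JPL implementation) looks like this:

```python
# Sum of Absolute Differences (SAD) block matching: for each left-image
# pixel, try candidate disparities d and keep the one whose window has
# the lowest absolute-difference cost against the right image.
import numpy as np

def sad_disparity(left, right, max_disp=8, win=3):
    """Brute-force SAD block matching (illustrative, O(h*w*max_disp))."""
    h, w = left.shape
    pad = win // 2
    L, R = left.astype(float), right.astype(float)
    disp = np.zeros((h, w), dtype=int)
    for y in range(pad, h - pad):
        for x in range(pad, w - pad):
            patch = L[y - pad:y + pad + 1, x - pad:x + pad + 1]
            best_cost, best_d = np.inf, 0
            # d is bounded so the right-image window stays in bounds.
            for d in range(min(max_disp, x - pad + 1)):
                cand = R[y - pad:y + pad + 1, x - d - pad:x - d + pad + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic check: left image is the right image shifted 2 px rightward,
# so interior pixels should recover disparity 2.
rng = np.random.default_rng(0)
right = rng.integers(0, 255, size=(12, 16)).astype(float)
left = np.roll(right, 2, axis=1)
d = sad_disparity(left, right, max_disp=4)
print(d[5, 6:12])
```

The FPGA gains its speed by evaluating all candidate disparities for a pixel in parallel and streaming windows through line buffers rather than looping as above.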

  1. Experimentally validated novel inhibitors of Helicobacter pylori phosphopantetheine adenylyltransferase discovered by virtual high-throughput screening.

    Science.gov (United States)

    Cheng, Chao-Sheng; Jia, Kai-Fan; Chen, Ting; Chang, Shun-Ya; Lin, Ming-Shen; Yin, Hsien-Sheng

    2013-01-01

    Helicobacter pylori is a major etiologic agent associated with the development and maintenance of human gastritis. The goal of this study was to develop novel antibiotics against H. pylori, and we thus targeted H. pylori phosphopantetheine adenylyltransferase (HpPPAT). PPAT catalyzes the penultimate step in coenzyme A biosynthesis. Its inactivation effectively prevents bacterial viability, making it an attractive target for antibacterial drug discovery. We employed virtual high-throughput screening and the HpPPAT crystal structure to identify compounds in the PubChem database that might act as inhibitors of HpPPAT. d-amethopterin is a potential inhibitor for blocking HpPPAT activity and suppressing H. pylori viability. Following treatment with d-amethopterin, H. pylori exhibited morphological characteristics associated with cell death. d-amethopterin is a mixed inhibitor of HpPPAT activity; it simultaneously occupies the HpPPAT 4'-phosphopantetheine- and ATP-binding sites. Its binding affinity is in the micromolar range, implying that it is sufficiently potent to serve as a lead compound in subsequent drug development. Characterization of the d-amethopterin and HpPPAT interaction network in a docked model will allow us to initiate rational drug optimization to improve the inhibitory efficacy of d-amethopterin. We anticipate that novel, potent, and selective HpPPAT inhibitors will emerge for the treatment of H. pylori infection.

  2. Experimentally validated novel inhibitors of Helicobacter pylori phosphopantetheine adenylyltransferase discovered by virtual high-throughput screening.

    Directory of Open Access Journals (Sweden)

    Chao-Sheng Cheng

    Full Text Available Helicobacter pylori is a major etiologic agent associated with the development and maintenance of human gastritis. The goal of this study was to develop novel antibiotics against H. pylori, and we thus targeted H. pylori phosphopantetheine adenylyltransferase (HpPPAT). PPAT catalyzes the penultimate step in coenzyme A biosynthesis. Its inactivation effectively prevents bacterial viability, making it an attractive target for antibacterial drug discovery. We employed virtual high-throughput screening and the HpPPAT crystal structure to identify compounds in the PubChem database that might act as inhibitors of HpPPAT. d-amethopterin is a potential inhibitor for blocking HpPPAT activity and suppressing H. pylori viability. Following treatment with d-amethopterin, H. pylori exhibited morphological characteristics associated with cell death. d-amethopterin is a mixed inhibitor of HpPPAT activity; it simultaneously occupies the HpPPAT 4'-phosphopantetheine- and ATP-binding sites. Its binding affinity is in the micromolar range, implying that it is sufficiently potent to serve as a lead compound in subsequent drug development. Characterization of the d-amethopterin and HpPPAT interaction network in a docked model will allow us to initiate rational drug optimization to improve the inhibitory efficacy of d-amethopterin. We anticipate that novel, potent, and selective HpPPAT inhibitors will emerge for the treatment of H. pylori infection.

  3. Benchmarking Ligand-Based Virtual High-Throughput Screening with the PubChem Database

    Directory of Open Access Journals (Sweden)

    Mariusz Butkiewicz

    2013-01-01

    Full Text Available With the rapidly increasing availability of High-Throughput Screening (HTS) data in the public domain, such as the PubChem database, methods for ligand-based computer-aided drug discovery (LB-CADD) have the potential to accelerate and reduce the cost of probe development and drug discovery efforts in academia. We assemble nine data sets from realistic HTS campaigns representing major families of drug target proteins for benchmarking LB-CADD methods. Each data set is public domain through PubChem and carefully collated through confirmation screens validating active compounds. These data sets provide the foundation for benchmarking a new cheminformatics framework, BCL::ChemInfo, which is freely available for non-commercial use. Quantitative structure-activity relationship (QSAR) models are built using Artificial Neural Networks (ANNs), Support Vector Machines (SVMs), Decision Trees (DTs), and Kohonen networks (KNs). Problem-specific descriptor optimization protocols are assessed, including Sequential Feature Forward Selection (SFFS) and various information content measures. Measures of predictive power and confidence are evaluated through cross-validation, and a consensus prediction scheme is tested that combines orthogonal machine learning algorithms into a single predictor. Enrichments ranging from 15 to 101 for a TPR cutoff of 25% are observed.
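The consensus scheme and enrichment metric mentioned above can be illustrated with a toy example. The "models" here are stand-in score vectors, not BCL::ChemInfo output, and enrichment is taken in its common form: the active rate in the top-ranked fraction divided by the overall active rate.

```python
# Toy consensus prediction: average min-max-normalized scores from
# several orthogonal models, rank compounds, and compute enrichment.
import numpy as np

def consensus(scores):
    """Average per-model scores after min-max normalization."""
    scores = np.asarray(scores, dtype=float)
    lo = scores.min(axis=1, keepdims=True)
    hi = scores.max(axis=1, keepdims=True)
    return ((scores - lo) / (hi - lo)).mean(axis=0)

def enrichment(ranked_is_active, fraction):
    """Active rate in the top fraction relative to the overall active rate."""
    n_top = max(1, int(len(ranked_is_active) * fraction))
    return np.mean(ranked_is_active[:n_top]) / np.mean(ranked_is_active)

# Toy screen: 3 "models" score 8 compounds; the first 2 compounds are active.
is_active = np.array([1, 1, 0, 0, 0, 0, 0, 0])
model_scores = [
    [0.9, 0.8, 0.1, 0.3, 0.2, 0.1, 0.0, 0.4],
    [0.7, 0.9, 0.2, 0.1, 0.3, 0.2, 0.1, 0.0],
    [0.8, 0.7, 0.3, 0.2, 0.1, 0.0, 0.2, 0.1],
]
c = consensus(model_scores)
order = np.argsort(-c)                      # best-scored compounds first
print(enrichment(is_active[order], 0.25))   # 4.0: both actives rank in the top 25%
```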

  4. Detection of genomic variation by selection of a 9 Mb DNA region and high throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Sergey I Nikolaev

    Full Text Available Detection of the rare polymorphisms and causative mutations of genetic diseases in a targeted genomic area has become a major goal in order to understand genomic and phenotypic variability. We have interrogated repeat-masked regions of 8.9 Mb on human chromosomes 21 (7.8 Mb) and 7 (1.1 Mb) from an individual from the International HapMap Project (NA12872). We have optimized a method of genomic selection for high throughput sequencing. Microarray-based selection and sequencing resulted in 260-fold enrichment, with 41% of reads mapping to the target region. 83% of SNPs in the targeted region had at least 4-fold sequence coverage and 54% at least 15-fold. When assaying HapMap SNPs in NA12872, our sequence genotypes are 91.3% concordant in regions with coverage ≥4-fold, and 97.9% concordant in regions with coverage ≥15-fold. About 81% of the SNPs recovered with both thresholds are listed in dbSNP. We observed that regions with low sequence coverage occur in close proximity to low-complexity DNA. Validation experiments using Sanger sequencing were performed for 46 SNPs with 15-20 fold coverage, with a confirmation rate of 96%, suggesting that DNA selection provides an accurate and cost-effective method for identifying rare genomic variants.

  5. AWV: high-throughput cross-array cross-wafer variation mapping

    Science.gov (United States)

    Yeo, Jeong-Ho; Lee, Byoung-Ho; Lee, Tae-Yong; Greenberg, Gadi; Meshulach, Doron; Ravid, Erez; Levi, Shimon; Kan, Kobi; Shabtay, Saar; Cohen, Yehuda; Rotlevi, Ofer

    2008-03-01

    Minute variations in advanced VLSI manufacturing processes are well known to significantly impact device performance and die yield. These variations drive the need for increased measurement sampling with a minimal impact on Fab productivity. Traditional discrete measurements, such as CDSEM or OCD, provide statistical information for process control and monitoring. Typically these measurements require a relatively long time and cover only a fraction of the wafer area. Across-array across-wafer variation mapping (AWV) suggests a new approach for high throughput, full wafer process variation monitoring using a DUV bright-field inspection tool. With this technique we present full wafer scanning, visualizing the variation trends within a single die and across the wafer. The underlying principle of the AWV inspection method is to measure variations in the reflected light from periodic structures under optimized illumination and collection conditions. Structural changes in the periodic array induce variations in the reflected light. This information is collected and analyzed in real time. In this paper we present the AWV concept, measurements and simulation results. Experiments were performed using a DUV bright-field inspection tool (UVision (TM), Applied Materials) on a memory short loop experiment (SLE), Focus Exposure Matrix (FEM) and normal wafers. AWV and CDSEM results are presented to reflect CD variations within a memory array and across wafers.

  6. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    Science.gov (United States)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor phase nano-coating process that deposits very uniform and conformal thin film materials with sub-angstrom level thickness control on various substrates. These unique properties have made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale because the process throughput is low relative to other deposition techniques, which hinders industrial adoption. In addition to the low throughput, process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs. continuous) or process conditions (e.g., low temperature), the absence of an appropriate reactor design for a specific substrate, and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high pressure, low temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate for achieving fast depositions at these challenging processing conditions. Also in this work, two unique high throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head.
Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD based process called sequential vapor

  7. INSIDIA: A FIJI Macro Delivering High-Throughput and High-Content Spheroid Invasion Analysis.

    Science.gov (United States)

    Moriconi, Chiara; Palmieri, Valentina; Di Santo, Riccardo; Tornillo, Giusy; Papi, Massimiliano; Pilkington, Geoff; De Spirito, Marco; Gumbleton, Mark

    2017-10-01

    Time-series image capture of in vitro 3D spheroidal cancer models embedded within an extracellular matrix affords examination of spheroid growth and cancer cell invasion. However, a customizable, comprehensive and open source solution for the quantitative analysis of such spheroid images is lacking. Here, the authors describe INSIDIA (INvasion SpheroID ImageJ Analysis), an open-source macro implemented as a customizable software algorithm running on the FIJI platform, that enables high-throughput high-content quantitative analysis of spheroid images (both bright-field gray and fluorescent images) with the output of a range of parameters defining the spheroid "tumor" core and its invasive characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
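To illustrate the kind of per-image parameters such an analysis yields, here is a toy mask-based sketch of core versus invasive area. These definitions are illustrative only and are not INSIDIA's exact output parameters:

```python
# Toy spheroid invasion metrics: given binary masks for the whole
# spheroid footprint and its dense core, report areas and a simple
# invasion index (invasive area normalized by core area).
import numpy as np

def invasion_metrics(whole_mask, core_mask):
    """Pixel areas plus invasive area normalized by the core area."""
    whole = int(np.count_nonzero(whole_mask))
    core = int(np.count_nonzero(core_mask))
    invasive = whole - core
    return {"core_area": core,
            "invasive_area": invasive,
            "invasion_index": invasive / core}

# Synthetic masks: a 10x10 core inside a 14x14 invaded footprint.
whole = np.zeros((32, 32), dtype=bool)
core = np.zeros((32, 32), dtype=bool)
whole[9:23, 9:23] = True   # 196 px footprint
core[11:21, 11:21] = True  # 100 px core
m = invasion_metrics(whole, core)
print(m["core_area"], m["invasive_area"], m["invasion_index"])
```

In a time series, tracking such metrics frame by frame separates core growth from outward invasion, which is the kind of readout the macro automates.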

  8. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing high-throughput computing applications common in High Energy Physics work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  9. High pressure inertial focusing for separating and concentrating bacteria at high throughput

    Science.gov (United States)

    Cruz, J.; Hooshmand Zadeh, S.; Graells, T.; Andersson, M.; Malmström, J.; Wu, Z. G.; Hjort, K.

    2017-08-01

    Inertial focusing is a promising microfluidic technology for concentration and separation of particles by size. However, the pressure required increases strongly as particle size decreases. Theory and experimental results for larger particles were used to scale down the phenomenon and find the conditions that focus 1 µm particles. High pressure experiments in robust glass chips were used to demonstrate the alignment. We show how the technique works for 1 µm spherical polystyrene particles and for Escherichia coli, without harming the bacteria at 50 µl min-1. The potential to focus bacteria, the simplicity of use and the high throughput make this technology interesting for healthcare applications, where concentration and purification of a sample may be required as an initial step.

  10. High-throughput time-stretch microscopy with morphological and chemical specificity

    Science.gov (United States)

    Lei, Cheng; Ugawa, Masashi; Nozawa, Taisuke; Ideguchi, Takuro; Di Carlo, Dino; Ota, Sadao; Ozeki, Yasuyuki; Goda, Keisuke

    2016-03-01

    Particle analysis is an effective method in analytical chemistry for sizing and counting microparticles such as emulsions, colloids, and biological cells. However, conventional methods for particle analysis, which fall into two extreme categories, have severe limitations. Sieving and Coulter counting are capable of analyzing particles with high throughput, but due to their lack of detailed information such as morphological and chemical characteristics, they can only provide statistical results with low specificity. On the other hand, CCD or CMOS image sensors can be used to analyze individual microparticles with high content, but due to their slow charge download, the frame rate (hence, the throughput) is significantly limited. Here by integrating a time-stretch optical microscope with a three-color fluorescent analyzer on top of an inertial-focusing microfluidic device, we demonstrate an optofluidic particle analyzer with a sub-micrometer spatial resolution down to 780 nm and a high throughput of 10,000 particles/s. In addition to its morphological specificity, the particle analyzer provides chemical specificity to identify chemical expressions of particles via fluorescence detection. Our results indicate that we can identify different species of microparticles with high specificity without sacrificing throughput. Our method holds promise for high-precision statistical particle analysis in chemical industry and pharmaceutics.

  11. Predicting gene function through systematic analysis and quality assessment of high-throughput data.

    Science.gov (United States)

    Kemmeren, Patrick; Kockelkorn, Thessa T J P; Bijma, Theo; Donders, Rogier; Holstege, Frank C P

    2005-04-15

    Determining gene function is an important challenge arising from the availability of whole genome sequences. Until recently, approaches based on sequence homology were the only high-throughput method for predicting gene function. Use of high-throughput generated experimental data sets for determining gene function has been limited for several reasons. Here a new approach is presented for integration of high-throughput data sets, leading to prediction of function based on relationships supported by multiple types and sources of data. This is achieved with a database containing 125 different high-throughput data sets describing phenotypes, cellular localizations, protein interactions and mRNA expression levels from Saccharomyces cerevisiae, using a bit-vector representation and information content-based ranking. The approach takes characteristic and qualitative differences between the data sets into account, is highly flexible, efficient and scalable. Database queries result in predictions for 543 uncharacterized genes, based on multiple functional relationships each supported by at least three types of experimental data. Some of these are experimentally verified, further demonstrating their reliability. The results also generate insights into the relative merits of different data types and provide a coherent framework for functional genomic datamining. Free availability over the Internet. f.c.p.holstege@med.uu.nl http://www.genomics.med.uu.nl/pub/pk/comb_gen_network.
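The bit-vector, information-content-based ranking described above can be sketched as follows. The gene names, features, and data here are invented for illustration; the only idea taken from the abstract is that genes are represented as binary feature vectors over many data sets and that shared rare features should count more than shared common ones:

```python
# Toy gene-function ranking: each gene is a set of binary features
# (phenotypes, localizations, interactions, expression clusters);
# pairs are scored by the information content of shared features.
import math

# Hypothetical yeast-like genes and features (not real annotations).
genes = {
    "YFG1": {"nuclear", "slow_growth", "binds_YFG2"},
    "YFG2": {"nuclear", "slow_growth", "binds_YFG2"},
    "YFG3": {"nuclear"},
    "YFG4": {"cytoplasmic"},
}

def feature_weights(genes):
    """Information content: rarer features carry more weight."""
    n = len(genes)
    counts = {}
    for feats in genes.values():
        for f in feats:
            counts[f] = counts.get(f, 0) + 1
    return {f: -math.log2(c / n) for f, c in counts.items()}

def similarity(a, b, weights):
    """Sum of information content over features shared by genes a and b."""
    return sum(weights[f] for f in genes[a] & genes[b])

w = feature_weights(genes)
scores = {g: similarity("YFG1", g, w) for g in genes if g != "YFG1"}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

An uncharacterized gene then inherits candidate functions from its highest-scoring partners, with the requirement (as in the abstract) that relationships be supported by multiple data types.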

  12. RECENT PROCESS AND EQUIPMENT IMPROVEMENTS TO INCREASE HIGH LEVEL WASTE THROUGHPUT AT THE DEFENSE WASTE PROCESSING FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Odriscoll, R.; Barnes, A.; Coleman, J.; Glover, T.; Hopkins, R.; Iverson, D.; Leita, J.

    2008-01-15

    The Savannah River Site's (SRS) Defense Waste Processing Facility (DWPF) began stabilizing high level waste (HLW) in a glass matrix in 1996. Over the past few years, there have been several process and equipment improvements at the DWPF to increase the rate at which the high level waste can be stabilized. These improvements have either directly increased waste processing rates or have desensitized the process to upsets, thereby minimizing downtime and increasing production. Optimization of waste throughput with increased HLW loading of the glass resulted in a 6% waste throughput increase based upon operational efficiencies. Improvements in canister production include the pour spout heated bellows liner (5%), glass surge (siphon) protection software (2%), a melter feed pump software logic change to prevent spurious interlocks of the feed pump with subsequent dilution of feed stock (2%), and optimization of the steam atomized scrubber (SAS) operation to minimize downtime (3%), for a total increase in canister production of 12%. A number of process recovery efforts that have allowed continued operation are also discussed: off gas system pluggage and restoration, slurry mix evaporator (SME) tank repair and replacement, remote cleaning of the melter top head center nozzle, remote melter internal inspection, SAS pump J-tube recovery, inadvertent pour scenario resolutions, repair of a dome heater transformer bus bar cooling water leak, and a new infrared camera for determination of glass height in the canister.

  13. Large-scale microfluidics providing high-resolution and high-throughput screening of Caenorhabditis elegans poly-glutamine aggregation model

    Science.gov (United States)

    Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela

    2016-10-01

    Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.

  14. Upscaling and automation of electrophysiology: toward high throughput screening in ion channel drug discovery

    DEFF Research Database (Denmark)

    Asmild, Margit; Oswald, Nicholas; Krzywkowski, Karen M

    2003-01-01

    Effective screening of large compound libraries in ion channel drug discovery requires the development of new electrophysiological techniques with substantially increased throughputs compared to the conventional patch clamp technique. Sophion Bioscience is aiming to meet this challenge by developing two lines of automated patch clamp products: a traditional pipette-based system called Apatchi-1, and a silicon chip-based system, QPatch. The degree of automation spans from semi-automation (Apatchi-1), where a trained technician interacts with the system in a limited way, to complete automation (QPatch 96), where the system works continuously and unattended until screening of a full compound library is completed. The performance of the systems ranges from medium to high throughput.

  15. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    Science.gov (United States)

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. On the contrary, microfluidic 3D culture allows precise control of cell microenvironments and provides higher throughput by orders of magnitude. In this chapter, we look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  16. Lights, camera, action: high-throughput plant phenotyping is ready for a close-up.

    Science.gov (United States)

    Fahlgren, Noah; Gehan, Malia A; Baxter, Ivan

    2015-04-01

    Anticipated population growth, shifting demographics, and environmental variability over the next century are expected to threaten global food security. In the face of these challenges, crop yield for food and fuel must be maintained and improved using fewer input resources. In recent years, genetic tools for profiling crop germplasm have benefited from rapid advances in DNA sequencing, and now similar advances are needed to improve the throughput of plant phenotyping. We highlight recent developments in high-throughput plant phenotyping using robotic-assisted imaging platforms and computer vision-assisted analysis tools. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis

    Directory of Open Access Journals (Sweden)

    Na Wen

    2016-07-01

    Full Text Available This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on the key performances of throughput, multifunctionality, and absolute quantification.

  18. A high throughput Nile red method for quantitative measurement of neutral lipids in microalgae.

    Science.gov (United States)

    Chen, Wei; Zhang, Chengwu; Song, Lirong; Sommerfeld, Milton; Hu, Qiang

    2009-04-01

    Isolation of high neutral lipid-containing microalgae is key to the commercial success of microalgae-based biofuel production. The Nile red fluorescence method has been successfully applied to the determination of lipids in certain microalgae, but has been unsuccessful in many others, particularly those with thick, rigid cell walls that prevent the penetration of the fluorescent dye. The conventional "one sample at a time" method was also time-consuming. In this study, the solvent dimethyl sulfoxide (DMSO) was introduced to microalgal samples as the stain carrier at an elevated temperature. The cellular neutral lipids were determined and quantified using a 96-well plate on a fluorescence spectrophotometer with an excitation wavelength of 530 nm and an emission wavelength of 575 nm. An optimized procedure yielded a high correlation coefficient (R² = 0.998) with the lipid standard triolein and reproducible measurements across replicates. Application of the improved method to several green algal strains gave very reproducible results, with relative standard errors of 8.5% and 3.9% (repeatability) and 8.6% and 4.5% (reproducibility) at two concentration levels (2.0 µg/mL and 20 µg/mL), respectively. Moreover, the detection and quantification limits of the improved Nile red staining method were 0.8 µg/mL and 2.0 µg/mL, respectively, for the neutral lipid standard triolein. The modified method and a conventional gravimetric determination method provided similar results on replicate samples. The 96-well plate-based Nile red method can be used as a high-throughput technique for rapid screening of a broad spectrum of naturally occurring and genetically modified algal strains and mutants for high neutral lipid/oil production.
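    The quantification workflow described above (a linear calibration against a triolein standard, plus detection and quantification limits from blank variability via the 3σ/10σ conventions) can be sketched as follows. The fluorescence readings and blank standard deviation below are hypothetical illustration values, not data from the study.

```python
import numpy as np

# Hypothetical calibration data: triolein standard (ug/mL) vs. Nile red
# fluorescence (arbitrary units, ex 530 nm / em 575 nm) from a 96-well plate.
conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0])
fluor = np.array([12.0, 55.0, 118.0, 232.0, 449.0, 905.0])

# Least-squares linear calibration: fluor = slope * conc + intercept
slope, intercept = np.polyfit(conc, fluor, 1)

# Coefficient of determination R^2 of the calibration line
pred = slope * conc + intercept
ss_res = np.sum((fluor - pred) ** 2)
ss_tot = np.sum((fluor - fluor.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Detection/quantification limits from the standard deviation of blank
# wells, using the common 3*sigma (LOD) and 10*sigma (LOQ) rules.
blank_sd = 1.5  # hypothetical SD of blank-well fluorescence
lod = 3 * blank_sd / slope
loq = 10 * blank_sd / slope

print(f"slope={slope:.2f}, R^2={r2:.4f}, LOD={lod:.2f} ug/mL, LOQ={loq:.2f} ug/mL")
```

Note that the LOQ is always 10/3 of the LOD under these conventions; the study's reported limits (0.8 and 2.0 µg/mL) were determined experimentally.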

  19. Holographic memory module with ultra-high capacity and throughput

    Energy Technology Data Exchange (ETDEWEB)

    Vladimir A. Markov, Ph.D.

    2000-06-04

    High capacity, high transfer rate, random access memory systems are needed to archive and distribute the tremendous volume of digital information being generated, for example, by human genome mapping and online libraries. The development of multi-gigabit per second networks underscores the need for next-generation archival memory systems. During Phase I we conducted the theoretical analysis and accomplished experimental tests that validated the key aspects of the ultra-high density holographic data storage module with high transfer rate. We also inspected the secure nature of the encoding method and estimated the performance of a full-scale system. Two basic architectures were considered: a reversible, compact solid-state configuration with limited capacity, and a very large capacity write-once-read-many memory system.

  20. Multidimensional NMR approaches towards highly resolved, sensitive and high-throughput quantitative metabolomics.

    Science.gov (United States)

    Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick

    2017-02-01

    Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis

  2. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  3. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly

  4. Microfluidic Impedance Flow Cytometry Enabling High-Throughput Single-Cell Electrical Property Characterization

    OpenAIRE

    Jian Chen; Chengcheng Xue; Yang Zhao; Deyong Chen; Min-Hsien Wu; Junbo Wang

    2015-01-01

    This article reviews recent developments in microfluidic impedance flow cytometry for high-throughput electrical property characterization of single cells. Four major perspectives of microfluidic impedance flow cytometry for single-cell characterization are included in this review: (1) early developments of microfluidic impedance flow cytometry for single-cell electrical property characterization; (2) microfluidic impedance flow cytometry with enhanced sensitivity; (3) microfluidic impedance ...

  5. The Power of High-Throughput Experimentation in Homogeneous Catalysis Research for Fine Chemicals

    NARCIS (Netherlands)

    Vries, Johannes G. de; Vries, André H.M. de

    2003-01-01

    The use of high-throughput experimentation (HTE) in homogeneous catalysis research for the production of fine chemicals is an important breakthrough. Whereas in the past stoichiometric chemistry was often preferred because of time-to-market constraints, HTE allows catalytic solutions to be found

  6. Functional characterisation of human glycine receptors in a fluorescence-based high throughput screening assay

    DEFF Research Database (Denmark)

    Jensen, Anders A.

    2005-01-01

    receptors in this assay were found to be in good agreement with those from electrophysiology studies of the receptors expressed in Xenopus oocytes or mammalian cell lines. Hence, this high throughput screening assay will be of great use in future pharmacological studies of glycine receptors, particular...

  7. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    Science.gov (United States)

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...

  8. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based

  9. Roche genome sequencer FLX based high-throughput sequencing of ancient DNA

    DEFF Research Database (Denmark)

    Alquezar-Planas, David E; Fordyce, Sarah Louise

    2012-01-01

    Since the development of so-called "next generation" high-throughput sequencing in 2005, this technology has been applied to a variety of fields. Such applications include disease studies, evolutionary investigations, and ancient DNA. Each application requires a specialized protocol to ensure tha...

  10. 20170913 - Retrofit Strategies for Incorporating Xenobiotic Metabolism into High Throughput Screening Assays (EMGS)

    Science.gov (United States)

    The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracteri...

  11. PLASMA PROTEIN PROFILING AS A HIGH THROUGHPUT TOOL FOR CHEMICAL SCREENING USING A SMALL FISH MODEL

    Science.gov (United States)

    Hudson, R. Tod, Michael J. Hemmer, Kimberly A. Salinas, Sherry S. Wilkinson, James Watts, James T. Winstead, Peggy S. Harris, Amy Kirkpatrick and Calvin C. Walker. In press. Plasma Protein Profiling as a High Throughput Tool for Chemical Screening Using a Small Fish Model (Abstra...

  12. HTPheno: an image analysis pipeline for high-throughput plant phenotyping.

    Science.gov (United States)

    Hartmann, Anja; Czauderna, Tobias; Hoffmann, Roberto; Stein, Nils; Schreiber, Falk

    2011-05-12

    In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.

  13. HTPheno: An image analysis pipeline for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Stein Nils

    2011-05-01

    Full Text Available Abstract Background In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. Results This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. Conclusions HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.
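    Once a plant has been segmented from a colour image, the per-plant parameters HTPheno reports (height, width, projected shoot area) reduce to simple operations on a binary mask. A minimal sketch on a toy mask (not the actual ImageJ plugin code; the silhouette is invented):

```python
import numpy as np

# Toy binary plant mask (True = plant pixel), standing in for a segmented
# side-view image; the real HTPheno plugin segments colour images first.
mask = np.zeros((100, 80), dtype=bool)
mask[30:90, 20:50] = True  # hypothetical rectangular plant silhouette

rows, cols = np.nonzero(mask)
height_px = rows.max() - rows.min() + 1  # vertical extent in pixels
width_px = cols.max() - cols.min() + 1   # horizontal extent in pixels
area_px = int(mask.sum())                # projected shoot area in pixels

print(height_px, width_px, area_px)  # → 60 30 1800
```

Pixel values would be converted to physical units via the imaging system's calibration.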

  18. Investigation of non-halogenated solvent mixtures for high throughput fabrication of polymer-fullerene solar cells

    NARCIS (Netherlands)

    Schmidt-Hansberg, B.; Sanyal, M.; Grossiord, N.; Galagan, Y.O.; Baunach, M.; Klein, M.F.G.; Colsmann, A.; Scharfer, P.; Lemmer, U.; Dosch, H.; Michels, J.J; Barrena, E.; Schabel, W.

    2012-01-01

    The rapidly increasing power conversion efficiencies of organic solar cells are an important prerequisite for low-cost photovoltaics fabricated in high throughput. In this work we suggest indane as a non-halogenated replacement for the commonly used halogenated solvent o-dichlorobenzene. Indane

  15. ESSENTIALS: Software for Rapid Analysis of High Throughput Transposon Insertion Sequencing Data.

    NARCIS (Netherlands)

    Zomer, A.L.; Burghout, P.J.; Bootsma, H.J.; Hermans, P.W.M.; Hijum, S.A.F.T. van

    2012-01-01

    High-throughput analysis of genome-wide random transposon mutant libraries is a powerful tool for (conditional) essential gene discovery. Recently, several next-generation sequencing approaches, e.g. Tn-seq/INseq, HITS and TraDIS, have been developed that accurately map the site of transposon

  16. Development of a thyroperoxidase inhibition assay for high-throughput screening

    Science.gov (United States)

    High-throughput screening (HTPS) assays to detect inhibitors of thyroperoxidase (TPO), the enzymatic catalyst for thyroid hormone (TH) synthesis, are not currently available. Herein we describe the development of a HTPS TPO inhibition assay. Rat thyroid microsomes and a fluores...

  17. Evaluation of Simple and Inexpensive High-Throughput Methods for Phytic Acid Determination

    DEFF Research Database (Denmark)

    Raboy, Victor; Johnson, Amy; Bilyeu, Kristin

    2017-01-01

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal Biochem...

  18. High-throughput genotoxicity assay identifies antioxidants as inducers of DNA damage response and cell death

    Science.gov (United States)

    Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA-damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...

  19. Virtual high-throughput screening and design of 14α-lanosterol ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-07-06

    Jul 6, 2009 ... Hildebert B. Maurice*, Esther Tuarira and Kennedy Mwambete. School of Pharmaceutical Sciences, Institute of Allied Health Sciences, Muhimbili University of ... high throughput screening (Guardiola-Diaz et al., 2001). It is therefore logical to think that developing inhibitors against the mycobacterial ...

  20. Increasing ecological inference from high throughput sequencing of fungi in the environment through a tagging approach

    Science.gov (United States)

    D. Lee Taylor; Michael G. Booth; Jack W. McFarland; Ian C. Herriott; Niall J. Lennon; Chad Nusbaum; Thomas G. Marr

    2008-01-01

    High throughput sequencing methods are widely used in analyses of microbial diversity but are generally applied to small numbers of samples, which precludes characterization of patterns of microbial diversity across space and time. We have designed a primer-tagging approach that allows pooling and subsequent sorting of numerous samples, which is directed to...

  1. A high throughput platform for understanding the influence of excipients on physical and chemical stability

    DEFF Research Database (Denmark)

    Raijada, Dhara; Cornett, Claus; Rantanen, Jukka

    2013-01-01

    The present study puts forward a miniaturized high-throughput platform for understanding the influence of excipient selection and processing on the stability of a given drug compound. Four model drugs (sodium naproxen, theophylline, amlodipine besylate and nitrofurantoin) and ten different excipients were...

  2. A High-Throughput MALDI-TOF Mass Spectrometry-Based Assay of Chitinase Activity

    Science.gov (United States)

    A high-throughput MALDI-TOF mass spectrometric assay is described for measuring chitinolytic enzyme activity. The assay uses unmodified chitin oligosaccharide substrates and is readily achievable on a microliter scale (2 µL total volume, containing 2 µg of substrate and 1 ng of protein). The speed a...

  3. High-throughput siRNA screening applied to the ubiquitin-proteasome system

    DEFF Research Database (Denmark)

    Poulsen, Esben Guldahl; Nielsen, Sofie V.; Pietras, Elin J.

    2016-01-01

    that are not genetically tractable as, for instance, a yeast model system. Here, we describe a method relying on high-throughput cellular imaging of cells transfected with a targeted siRNA library to screen for components involved in degradation of a protein of interest. This method is a rapid and cost-effective tool...

  4. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  5. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free...

  6. High throughput generated micro-aggregates of chondrocytes stimulate cartilage formation in vitro and in vivo

    NARCIS (Netherlands)

    Moreira Teixeira, Liliana; Leijten, Jeroen Christianus Hermanus; Sobral, J.; Jin, R.; van Apeldoorn, Aart A.; Feijen, Jan; van Blitterswijk, Clemens; Dijkstra, Pieter J.; Karperien, Hermanus Bernardus Johannes

    2012-01-01

    Cell-based cartilage repair strategies such as matrix-induced autologous chondrocyte implantation (MACI) could be improved by enhancing cell performance. We hypothesised that micro-aggregates of chondrocytes generated in high-throughput prior to implantation in a defect could stimulate cartilaginous

  7. DNA from buccal swabs suitable for high-throughput SNP multiplex analysis.

    Science.gov (United States)

    McMichael, Gai L; Gibson, Catherine S; O'Callaghan, Michael E; Goldwater, Paul N; Dekker, Gustaaf A; Haan, Eric A; MacLennan, Alastair H

    2009-12-01

    We sought a convenient and reliable method for collecting genetic material that is inexpensive, noninvasive, and suitable for self-collection and mailing, together with a compatible commercial DNA extraction protocol meeting the quantitative and qualitative requirements for high-throughput single nucleotide polymorphism (SNP) multiplex analysis on an automated platform. Buccal swabs were collected from 34 individuals as part of a pilot study to test commercially available buccal swabs and DNA extraction kits. DNA was quantified on a spectrofluorometer with PicoGreen dsDNA reagent prior to testing the DNA integrity with predesigned SNP multiplex assays. Based on the pilot study results, the Catch-All swabs and Isohelix buccal DNA isolation kit were selected for our high-throughput application and extended to a further 1140 samples as part of a large cohort study. The average DNA yield in the pilot study (n=34) was 1.94 ± 0.54 µg with a 94% genotyping pass rate. For the high-throughput application (n=1140), the average DNA yield was 2.44 ± 1.74 µg with a ≥93% genotyping pass rate. The Catch-All buccal swabs are a convenient and cost-effective alternative to blood sampling. Combined with the Isohelix buccal DNA isolation kit, they provided DNA of sufficient quantity and quality for high-throughput SNP multiplex analysis.

  8. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  9. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an

  10. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45...

  11. High throughput system for magnetic manipulation of cells, polymers, and biomaterials

    Science.gov (United States)

    Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.

    2008-01-01

    In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and a power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of S2R+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
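    A power-law force-distance relationship like the reported F ∼ r^(−2.7±0.1) is typically extracted by a linear fit in log-log space, since log F = log A + a·log r. A sketch on synthetic, noise-free data (the prefactor and distances are invented, not calibration values from the paper):

```python
import numpy as np

# Hypothetical force-vs-distance calibration pairs (distance in um,
# force in arbitrary units) generated from an exact power law.
r = np.array([5.0, 10.0, 20.0, 40.0])
F = 100.0 * r ** -2.7  # synthetic data with exponent -2.7

# A power law F = A * r^a is linear in log-log space:
# log F = log A + a * log r, so the slope of a linear fit gives a.
a, logA = np.polyfit(np.log(r), np.log(F), 1)
print(f"fitted exponent a = {a:.2f}")  # recovers -2.7 on noise-free data
```

With real calibration data, the scatter of the fit residuals yields the quoted ±0.1 uncertainty on the exponent.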

  12. The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping

    Science.gov (United States)

    Wei, Weiqi

    2012-01-01

    Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…

  13. A high-throughput method for GMO multi-detection using a microfluidic dynamic array

    NARCIS (Netherlands)

    Brod, F.C.A.; Dijk, van J.P.; Voorhuijzen, M.M.; Dinon, A.Z.; Guimarães, L.H.S.; Scholtens, I.M.J.; Arisi, A.C.M.; Kok, E.J.

    2014-01-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the

  14. High-throughput assessment of context-dependent effects of chromatin proteins

    NARCIS (Netherlands)

    Brueckner, L. (Laura); Van Arensbergen, J. (Joris); Akhtar, W. (Waseem); L. Pagie (Ludo); B. van Steensel (Bas)

    2016-01-01

    Background: Chromatin proteins control gene activity in a concerted manner. We developed a high-throughput assay to study the effects of the local chromatin environment on the regulatory activity of a protein of interest. The assay combines a previously reported multiplexing strategy

  15. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    Science.gov (United States)

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive, imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high-throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high-throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on the implementation of high-throughput-amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source, which causes small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high-throughput protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high-throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll
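    The PSII operating efficiency mentioned above is conventionally computed per pixel or per plant as ΦPSII = (Fm′ − F)/Fm′ (the Genty parameter), where F is the steady-state fluorescence in the light and Fm′ the maximum fluorescence after a saturating pulse. A sketch with hypothetical per-plant image means (not measurements from the study):

```python
import numpy as np

# PSII operating efficiency from kinetic chlorophyll fluorescence:
# PhiPSII = (Fm' - F) / Fm'. Values below are hypothetical per-plant
# mean fluorescence signals in arbitrary units.
F_steady = np.array([420.0, 380.0, 510.0])   # steady-state F in the light
Fm_prime = np.array([980.0, 900.0, 1020.0])  # Fm' after a saturating pulse

phi_psii = (Fm_prime - F_steady) / Fm_prime
print(np.round(phi_psii, 3))
```

In an imaging setup the same expression is evaluated pixel-wise on the plant mask before averaging.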

  16. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    ...the high computing power of the main supercomputer. Each supercomputer is different in node architecture as well as hardware specifications.

  17. HiCTMap: Detection and analysis of chromosome territory structure and position by high-throughput imaging.

    Science.gov (United States)

    Jowhar, Ziad; Gudla, Prabhakar R; Shachar, Sigal; Wangsa, Darawalee; Russ, Jill L; Pegoraro, Gianluca; Ried, Thomas; Raznahan, Armin; Misteli, Tom

    2018-02-10

    The spatial organization of chromosomes in the nuclear space is an extensively studied field that relies on measurements of structural features and 3D positions of chromosomes with high precision and robustness. However, no tools are currently available to image and analyze chromosome territories in a high-throughput format. Here, we have developed High-throughput Chromosome Territory Mapping (HiCTMap), a method for the robust and rapid analysis of 2D and 3D chromosome territory positioning in mammalian cells. HiCTMap is a high-throughput imaging-based chromosome detection method which enables routine analysis of chromosome structure and nuclear position. Using an optimized FISH staining protocol in a 384-well plate format in conjunction with a bespoke automated image analysis workflow, HiCTMap faithfully detects chromosome territories and their position in 2D and 3D in a large population of cells per experimental condition. We apply this novel technique to visualize chromosomes 18, X, and Y in male and female primary human skin fibroblasts, and show accurate detection of the correct number of chromosomes in the respective genotypes. Given the ability to visualize and quantitatively analyze large numbers of nuclei, we use HiCTMap to measure chromosome territory area and volume with high precision and determine the radial position of chromosome territories using either centroid or equidistant-shell analysis. The HiCTMap protocol is also compatible with RNA FISH as demonstrated by simultaneous labeling of X chromosomes and Xist RNA in female cells. We suggest HiCTMap will be a useful tool for routine precision mapping of chromosome territories in a wide range of cell types and tissues. Published by Elsevier Inc.

  18. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
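    At its core, the cross-referencing CrossCheck performs amounts to intersecting a user-supplied gene list with each stored dataset. A toy sketch of that operation (the dataset names and gene contents are invented, not CrossCheck's actual schema or data):

```python
# User-entered hit list from a hypothetical screen.
user_hits = {"TP53", "ATM", "CHEK2", "BRCA1", "GAPDH"}

# Two invented "published datasets" of gene symbols.
published = {
    "RNAi_screen_A": {"ATM", "CHEK2", "MDM2"},
    "CRISPR_screen_B": {"TP53", "BRCA1", "ATM"},
}

# Intersect the user list with every dataset; sort for stable output.
overlaps = {name: sorted(user_hits & genes) for name, genes in published.items()}
for name, shared in overlaps.items():
    print(name, shared)
```

A real implementation would add significance testing of the overlaps against the background gene universe, but the set intersection is the basic primitive.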

  19. Nanostructured biosensing platform-shadow edge lithography for high-throughput nanofabrication.

    Science.gov (United States)

    Bai, John G; Yeo, Woon-Hong; Chung, Jae-Hyun

    2009-02-07

    One of the critical challenges in nanostructured biosensors is to manufacture an addressable array of nanopatterns at low cost. The addressable array (1) provides multiplexing for biomolecule detection and (2) enables direct detection of biomolecules without labeling and amplification. To fabricate such an array of nanostructures, current nanolithography methods are limited by the lack of either high throughput or high resolution. This paper presents a high-resolution and high-throughput nanolithography method using the compensated shadow effect in high-vacuum evaporation. The approach enables the fabrication of uniform nanogaps down to 20 nm in width across a 100 mm silicon wafer. The nanogap pattern is used as a template for the routine fabrication of zero-, one-, and two-dimensional nanostructures with a high yield. The method can facilitate the fabrication of nanostructured biosensors on a wafer scale at a low manufacturing cost.

  20. The development and implementation of high-throughput tools for discovery and characterization of proton exchange membranes

    Science.gov (United States)

    Reed, Keith Gregory

    …supplied by Arkema, Inc. Although mass transport values were generally lower than reported literature values, pervaporation experiments showed that HT-MTA could be used to effectively screen and optimize relative water transport characteristics in PEMs. To further demonstrate the utility of HT-MTA, the instrument was incorporated into the lab's current high-throughput characterization toolset and used to investigate the mechanisms and effects of rapid free radical degradation of Nafion® membranes based on various concentrations of H2O2 and Fe2+. The results showed that changes in Nafion®'s mechanical, conductive, and water transport properties were strong functions of H2O2, and that maximal degradation could be achieved around 50 ppm Fe2+. Furthermore, by including chemical composition analysis techniques in the characterization toolset, the dominating free radical degradation pathways could be deduced. These results are promising for later correlating rapidly aged degradation experiments to in situ fuel cell lifetime testing, which is both time-intensive and costly. The high-throughput toolset was also used to develop a novel optimized blend consisting of polyetherimide (PEI), a low-cost high-performance resin, and sulfonated PEI (S-PEI) made using a relatively mild post-sulfonation reaction with trimethylsilyl chlorosulfonate. The effects of blend composition and thermal annealing on film performance were evaluated and the polymer system was shown to have optimal mechanical and ion-conducting properties between 20-30 wt% S-PEI in the unannealed state. Although the properties of the proposed PEI-based polymer system were below PEMFC performance standards, a PEI film with superior mechanical properties was discovered and should prove to be useful in other applications. In general, this work shows promising results for efficiently developing advanced polymer materials using high-throughput screening techniques.

  1. Using Mendelian inheritance to improve high-throughput SNP discovery.

    Science.gov (United States)

    Chen, Nancy; Van Hout, Cristopher V; Gottipati, Srikanth; Clark, Andrew G

    2014-11-01

    Restriction site-associated DNA sequencing or genotyping-by-sequencing (GBS) approaches allow for rapid and cost-effective discovery and genotyping of thousands of single-nucleotide polymorphisms (SNPs) in multiple individuals. However, rigorous quality control practices are needed to avoid high levels of error and bias with these reduced representation methods. We developed a formal statistical framework for filtering spurious loci, using Mendelian inheritance patterns in nuclear families, that accommodates variable-quality genotype calls and missing data--both rampant issues with GBS data--and for identifying sex-linked SNPs. Simulations predict excellent performance of both the Mendelian filter and the sex-linkage assignment under a variety of conditions. We further evaluate our method by applying it to real GBS data and validating a subset of high-quality SNPs. These results demonstrate that our metric of Mendelian inheritance is a powerful quality filter for GBS loci that is complementary to standard coverage and Hardy-Weinberg filters. The described method, implemented in the software MendelChecker, will improve quality control during SNP discovery in nonmodel as well as model organisms. Copyright © 2014 by the Genetics Society of America.
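
    The hard-call core of a Mendelian filter can be sketched as below. Note that MendelChecker itself uses a statistical framework that accommodates variable-quality genotype calls and missing data, so this simplified consistency check on called trio genotypes is only illustrative.

```python
from itertools import product

def mendelian_consistent(mother, father, child):
    """True if a biallelic child genotype can arise from one allele of each
    parent. Genotypes are tuples of alleles, e.g. (0, 1) for a het call."""
    return any(tuple(sorted((m, f))) == tuple(sorted(child))
               for m, f in product(mother, father))
```

    A locus where many families show inconsistent trios would be flagged as a likely spurious (e.g. paralogous) GBS locus.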

  2. High throughput selection of effective serodiagnostics for Trypanosoma cruzi infection.

    Directory of Open Access Journals (Sweden)

    Gretchen Cooley

    2008-10-01

    Diagnosis of Trypanosoma cruzi infection by direct pathogen detection is complicated by the low parasite burden in subjects persistently infected with this agent of human Chagas disease. Determination of infection status by serological analysis has also been faulty, largely due to the lack of well-characterized parasite reagents for the detection of anti-parasite antibodies. In this study, we screened more than 400 recombinant proteins of T. cruzi, including randomly selected proteins and those known to be highly expressed in the parasite stages present in mammalian hosts, for the ability to detect anti-parasite antibodies in the sera of subjects with confirmed or suspected T. cruzi infection. A set of 16 protein groups was identified and incorporated into a multiplex bead array format which detected 100% of >100 confirmed positive sera and also documented consistent, strong and broad responses in samples undetected or discordant using conventional serologic tests. Each serum had a distinct but highly stable reaction pattern. This diagnostic panel was also useful for monitoring drug treatment efficacy in chronic Chagas disease. These results substantially extend the variety and quality of diagnostic targets for Chagas disease and offer a useful tool for determining treatment success or failure.

  3. High-throughput discovery of novel developmental phenotypes

    Science.gov (United States)

    Dickinson, Mary E.; Flenniken, Ann M.; Ji, Xiao; Teboul, Lydia; Wong, Michael D.; White, Jacqueline K.; Meehan, Terrence F.; Weninger, Wolfgang J.; Westerberg, Henrik; Adissu, Hibret; Baker, Candice N.; Bower, Lynette; Brown, James M.; Caddle, L. Brianna; Chiani, Francesco; Clary, Dave; Cleak, James; Daly, Mark J.; Denegre, James M.; Doe, Brendan; Dolan, Mary E.; Edie, Sarah M.; Fuchs, Helmut; Gailus-Durner, Valerie; Galli, Antonella; Gambadoro, Alessia; Gallegos, Juan; Guo, Shiying; Horner, Neil R.; Hsu, Chih-wei; Johnson, Sara J.; Kalaga, Sowmya; Keith, Lance C.; Lanoue, Louise; Lawson, Thomas N.; Lek, Monkol; Mark, Manuel; Marschall, Susan; Mason, Jeremy; McElwee, Melissa L.; Newbigging, Susan; Nutter, Lauryl M.J.; Peterson, Kevin A.; Ramirez-Solis, Ramiro; Rowland, Douglas J.; Ryder, Edward; Samocha, Kaitlin E.; Seavitt, John R.; Selloum, Mohammed; Szoke-Kovacs, Zsombor; Tamura, Masaru; Trainor, Amanda G; Tudose, Ilinca; Wakana, Shigeharu; Warren, Jonathan; Wendling, Olivia; West, David B.; Wong, Leeyean; Yoshiki, Atsushi; MacArthur, Daniel G.; Tocchini-Valentini, Glauco P.; Gao, Xiang; Flicek, Paul; Bradley, Allan; Skarnes, William C.; Justice, Monica J.; Parkinson, Helen E.; Moore, Mark; Wells, Sara; Braun, Robert E.; Svenson, Karen L.; de Angelis, Martin Hrabe; Herault, Yann; Mohun, Tim; Mallon, Ann-Marie; Henkelman, R. Mark; Brown, Steve D.M.; Adams, David J.; Lloyd, K.C. Kent; McKerlie, Colin; Beaudet, Arthur L.; Bucan, Maja; Murray, Stephen A.

    2016-01-01

    Approximately one third of all mammalian genes are essential for life. Phenotypes resulting from mouse knockouts of these genes have provided tremendous insight into gene function and congenital disorders. As part of the International Mouse Phenotyping Consortium effort to generate and phenotypically characterize 5000 knockout mouse lines, we have identified 410 lethal genes during the production of the first 1751 unique gene knockouts. Using a standardised phenotyping platform that incorporates high-resolution 3D imaging, we identified novel phenotypes at multiple time points for previously uncharacterized genes and additional phenotypes for genes with previously reported mutant phenotypes. Unexpectedly, our analysis reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background. In addition, we show that human disease genes are enriched for essential genes identified in our screen, thus providing a novel dataset that facilitates prioritization and validation of mutations identified in clinical sequencing efforts. PMID:27626380
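
    The abstract does not state which statistical test underlies the enrichment claim; a standard choice for "human disease genes are enriched among essential genes" is a one-sided hypergeometric (Fisher-type) test, sketched here as an assumed illustration.

```python
from math import comb

def hypergeom_tail(k, n, K, N):
    """One-sided enrichment P-value: probability of observing >= k disease
    genes among n essential genes, when K of the N background genes are
    disease genes (hypergeometric upper tail)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)
```

    A small P-value indicates that the overlap between the essential-gene set and the disease-gene set exceeds chance expectation.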

  4. Single DNA molecule patterning for high-throughput epigenetic mapping.

    Science.gov (United States)

    Cerf, Aline; Cipriany, Benjamin R; Benítez, Jaime J; Craighead, Harold G

    2011-11-01

    We present a method for profiling the 5-methyl cytosine distribution on single DNA molecules. Our method combines soft-lithography and molecular elongation to form ordered arrays estimated to contain more than 250 000 individual DNA molecules immobilized on a solid substrate. The methylation state of the DNA is detected and mapped by binding of fluorescently labeled methyl-CpG binding domain peptides to the elongated dsDNA molecules and imaging of their distribution. The stretched molecules are fixed in their extended configuration by adsorption onto the substrate so analysis can be performed with high spatial resolution and signal averaging. We further prove this technique allows imaging of DNA molecules with different methylation states.

  5. High-throughput screening of chromatographic separations: IV. Ion-exchange.

    Science.gov (United States)

    Kelley, Brian D; Switzer, Mary; Bastek, Patrick; Kramarczyk, Jack F; Molnar, Kathleen; Yu, Tianning; Coffman, Jon

    2008-08-01

    Ion-exchange (IEX) chromatography steps are widely applied in protein purification processes because of their high capacity, selectivity, robust operation, and well-understood principles. Optimization of IEX steps typically involves resin screening and selection of the pH and counterion concentrations of the load, wash, and elution steps. Time and material constraints associated with operating laboratory columns often preclude evaluating more than 20-50 conditions during early stages of process development. To overcome this limitation, a high-throughput screening (HTS) system employing a robotic liquid handling system and 96-well filterplates was used to evaluate various operating conditions for IEX steps for monoclonal antibody (mAb) purification. A screening study for an adsorptive cation-exchange step evaluated eight different resins. Sodium chloride concentrations defining the operating boundaries of product binding and elution were established at four different pH levels for each resin. Adsorption isotherms were measured for 24 different pH and salt combinations for a single resin. An anion-exchange flowthrough step was then examined, generating data on mAb adsorption for 48 different combinations of pH and counterion concentration for three different resins. The mAb partition coefficients were calculated and used to estimate the characteristic charge of the resin-protein interaction. Host cell protein and residual Protein A impurity levels were also measured, providing information on selectivity within this operating window. The HTS system shows promise for accelerating process development of IEX steps, enabling rapid acquisition of large datasets addressing the performance of the chromatography step under many different operating conditions. (c) 2008 Wiley Periodicals, Inc.
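
    Estimating the characteristic charge from partition coefficients measured at several counterion concentrations typically uses the stoichiometric displacement model, in which log Kp falls linearly with log salt concentration and the slope magnitude estimates the characteristic charge. The sketch below assumes that model and is not necessarily the authors' exact procedure.

```python
import math

def characteristic_charge(salt_mM, kp):
    """Least-squares slope of log Kp vs. log [counterion]; the negated
    slope estimates the characteristic charge of the protein-resin
    interaction under the stoichiometric displacement model."""
    xs = [math.log10(c) for c in salt_mM]
    ys = [math.log10(k) for k in kp]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

    Each (salt, Kp) pair would come from one well of the 96-well filterplate screen.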

  6. Discovery of Bile Salt Hydrolase Inhibitors Using an Efficient High-Throughput Screening System

    Science.gov (United States)

    Smith, Katie; Zeng, Ximin; Lin, Jun

    2014-01-01

    The global trend of restricting the use of antibiotic growth promoters (AGP) in animal production necessitates the development of valid alternatives to maintain productivity and sustainability of food animals. Previous studies suggest inhibition of bile salt hydrolase (BSH), an intestinal bacteria-produced enzyme that exerts a negative impact on host fat digestion and utilization, is a promising approach to promote animal growth performance. To achieve the long term goal of developing novel alternatives to AGPs, in this study, a rapid and convenient high-throughput screening (HTS) system was developed and successfully used for identification of BSH inhibitors. With the aid of a high-purity BSH from a chicken Lactobacillus salivarius strain, we optimized various screening conditions (e.g. BSH concentration, reaction buffer pH, incubation temperature and length, substrate type and concentration) and established a precipitation-based screening approach to identify BSH inhibitors using 96-well or 384-well microplates. A pilot HTS was performed using a small compound library comprised of 2,240 biologically active and structurally diverse compounds. Among the 107 hits, several promising and potent BSH inhibitors (e.g. riboflavin and phenethyl caffeate) were selected and validated by standard BSH activity assay. Interestingly, the HTS also identified a panel of antibiotics as BSH inhibitors; in particular, various tetracycline antibiotics and roxarsone, the widely used AGP, have been demonstrated to display potent inhibitory effects on BSH. Together, this study developed an efficient HTS system and identified several BSH inhibitors with potential as alternatives to AGP. In addition, the findings from this study also suggest a new mode of action of AGP for promoting animal growth. PMID:24454844

  8. High-throughput characterization of virus-like particles by interlaced size-exclusion chromatography.

    Science.gov (United States)

    Ladd Effio, Christopher; Oelmeier, Stefan A; Hubbuch, Jürgen

    2016-03-04

    The development and manufacturing of safe and effective vaccines relies essentially on the availability of robust and precise analytical techniques. Virus-like particles (VLPs) have emerged as an important and valuable class of vaccines for the containment of infectious diseases. VLPs are produced by recombinant protein expression followed by purification procedures to minimize the levels of process- and product-related impurities. The control of these impurities is necessary during process development and manufacturing. Especially monitoring of the VLP size distribution is important for the characterization of the final vaccine product. Currently used methods require long analysis times and tailor-made assays. In this work, we present a size-exclusion ultra-high performance liquid chromatography (SE-UHPLC) method to characterize VLPs and quantify aggregates within 3.1 min per sample by applying interlaced injections. Four analytical SEC columns were evaluated for the analysis of human B19 parvo-VLPs and murine polyoma-VLPs. The optimized method was successfully used for the characterization of five recombinant protein-based VLPs including human papillomavirus (HPV) VLPs, human enterovirus 71 (EV71) VLPs, and chimeric hepatitis B core antigen (HBcAg) VLPs, pointing out the generic applicability of the assay. Measurements were supported by transmission electron microscopy and dynamic light scattering. It was demonstrated that the iSE-UHPLC method provides a rapid, precise and robust tool for the characterization of VLPs. Two case studies on purification tools for VLP aggregates and storage conditions of HPV VLPs highlight the relevance of the analytical method for high-throughput process development and process monitoring of virus-like particles. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. High-Throughput Mutational Analysis of a Twister Ribozyme.

    Science.gov (United States)

    Kobori, Shungo; Yokobayashi, Yohei

    2016-08-22

    Recent discoveries of new classes of self-cleaving ribozymes in diverse organisms have triggered renewed interest in the chemistry and biology of ribozymes. Functional analysis and engineering of ribozymes often involve performing biochemical assays on multiple ribozyme mutants. However, because each ribozyme mutant must be individually prepared and assayed, the number and variety of mutants that can be studied are severely limited. All of the single and double mutants of a twister ribozyme (a total of 10,296 mutants) were generated and assayed for their self-cleaving activity by exploiting deep sequencing to count the numbers of cleaved and uncleaved sequences for every mutant. Interestingly, we found that the ribozyme is highly robust against mutations, such that 71% and 30% of all single and double mutants, respectively, retain detectable activity under the assay conditions. It was also observed that the structural elements that comprise the ribozyme exhibit distinct sensitivity to mutations. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
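
    Scoring each mutant from deep-sequencing read counts amounts to computing the fraction of cleaved reads per mutant. A minimal sketch follows; the function name and the minimum read-depth cutoff are assumptions for illustration, not details from the paper.

```python
def cleavage_fractions(read_counts, min_reads=10):
    """read_counts: {mutant: (cleaved_reads, uncleaved_reads)} from deep
    sequencing. Returns {mutant: fraction cleaved}, skipping mutants whose
    total read depth is too low to score reliably."""
    return {m: c / (c + u)
            for m, (c, u) in read_counts.items() if c + u >= min_reads}
```

    Applied to all single and double mutants at once, one sequencing run replaces thousands of individual cleavage assays.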

  10. Development of a high-throughput Candida albicans biofilm chip.

    Directory of Open Access Journals (Sweden)

    Anand Srinivasan

    2011-04-01

    Full Text Available We have developed a high-density microarray platform consisting of nano-biofilms of Candida albicans. A robotic microarrayer was used to print yeast cells of C. albicans encapsulated in a collagen matrix at a volume as low as 50 nL onto surface-modified microscope slides. Upon incubation, the cells grow into fully formed "nano-biofilms". The morphological and architectural complexity of these biofilms were evaluated by scanning electron and confocal scanning laser microscopy. The extent of biofilm formation was determined using a microarray scanner from changes in fluorescence intensities due to FUN 1 metabolic processing. This staining technique was also adapted for antifungal susceptibility testing, which demonstrated that, similar to regular biofilms, cells within the on-chip biofilms displayed elevated levels of resistance against antifungal agents (fluconazole and amphotericin B. Thus, results from structural analyses and antifungal susceptibility testing indicated that despite miniaturization, these biofilms display the typical phenotypic properties associated with the biofilm mode of growth. In its final format, the C. albicans biofilm chip (CaBChip is composed of 768 equivalent and spatially distinct nano-biofilms on a single slide; multiple chips can be printed and processed simultaneously. Compared to current methods for the formation of microbial biofilms, namely the 96-well microtiter plate model, this fungal biofilm chip has advantages in terms of miniaturization and automation, which combine to cut reagent use and analysis time, minimize labor intensive steps, and dramatically reduce assay costs. Such a chip should accelerate the antifungal drug discovery process by enabling rapid, convenient and inexpensive screening of hundreds-to-thousands of compounds simultaneously.

  11. Analysis and Simulation of TCP Cubic to Determine Optimal Congestion Window and Throughput Parameters in Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Ary Firnanda

    2017-08-01

    The performance of wireless ad hoc networks often degrades when congestion arises during data transmission. TCP congestion control is used to address this problem, and TCP Cubic is one of its variants. This study tested and compared the default value of the variable b=0.2 against other values of b for use in wireless ad hoc networks, using the experimental method with the NS-3 network simulation software. The results show that, on a wireless ad hoc network with 5% packet loss, the value of b that yields the optimal maximum congestion window is b=0.5, while b=0.1 yields the optimal average throughput.
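
    The variable b studied here is the multiplicative-decrease fraction in CUBIC's window-growth function, W(t) = C(t-K)^3 + Wmax with K = (Wmax*b/C)^(1/3), per the published CUBIC algorithm (Ha et al.; RFC 8312). The sketch below illustrates that function only; it is not the NS-3 TcpCubic implementation.

```python
def cubic_window(t, w_max, b=0.2, c=0.4):
    """TCP CUBIC congestion window (in segments) t seconds after a loss.

    K = (w_max*b/c)**(1/3) is the time at which the window returns to
    w_max; b is the multiplicative-decrease fraction (the paper's b),
    and c is CUBIC's scaling constant."""
    k = (w_max * b / c) ** (1 / 3)
    return c * (t - k) ** 3 + w_max
```

    At t=0 the window equals w_max*(1-b), and it grows back to w_max at t=K, which is why the choice of b shapes both the congestion window and the achievable throughput.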

  12. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  13. New developments of RNAi in Paracoccidioides brasiliensis: prospects for high-throughput, genome-wide, functional genomics.

    Directory of Open Access Journals (Sweden)

    Tercio Goes

    2014-10-01

    The Fungal Genome Initiative of the Broad Institute, in partnership with the Paracoccidioides research community, has recently sequenced the genome of representative isolates of this dimorphic human-pathogenic fungus: Pb18 (S1), Pb03 (PS2) and Pb01. The accomplishment of future high-throughput, genome-wide, functional genomics will rely upon appropriate molecular tools and straightforward techniques to streamline the generation of stable loss-of-function phenotypes. In the past decades, RNAi has emerged as the most robust genetic technique to modulate or to suppress gene expression in diverse eukaryotes, including fungi. These molecular tools and techniques, adapted for RNAi, were until now unavailable for P. brasiliensis. In this paper, we report Agrobacterium tumefaciens-mediated transformation of yeast cells for high-throughput applications, with which transformation frequencies of 150±24 yeast cell transformants per 1×10^6 viable yeast cells were obtained. Our approach is based on a bifunctional selective-marker fusion protein consisting of the Streptoalloteichus hindustanus bleomycin-resistance gene (Shble) and the intrinsically fluorescent monomeric protein mCherry, which was codon-optimized for heterologous expression in P. brasiliensis. We also report successful GP43 gene knock-down through the expression of intron-containing hairpin RNA (ihpRNA) from a Gateway-adapted cassette (cALf) which was purpose-built for gene silencing in a high-throughput manner. Gp43 transcript levels were reduced by 73.1±22.9% with this approach. We are convinced that the genetic transformation technique and the molecular tools described herein will contribute significantly to future Paracoccidioides spp. functional genomics research.

  14. Optimizing High Level Waste Disposal

    Energy Technology Data Exchange (ETDEWEB)

    Dirk Gombert

    2005-09-01

    If society is ever to reap the potential benefits of nuclear energy, technologists must close the fuel cycle completely. A closed cycle equates to a continued supply of fuel and safe reactors, but also reliable and comprehensive closure of waste issues. High level waste (HLW) disposal in borosilicate glass (BSG) is based on 1970s era evaluations. This host matrix is very adaptable to sequestering a wide variety of radionuclides found in raffinates from spent fuel reprocessing. However, it is now known that the current system is far from optimal for disposal of the diverse HLW streams, and proven alternatives are available to reduce costs by billions of dollars. The basis for HLW disposal should be reassessed to consider extensive waste form and process technology research and development efforts, which have been conducted by the United States Department of Energy (USDOE), international agencies and the private sector. Matching the waste form to the waste chemistry and using currently available technology could increase the waste content in waste forms to 50% or more and double processing rates. Optimization of the HLW disposal system would accelerate HLW disposition and increase repository capacity. This does not necessarily require developing new waste forms; the emphasis should be on qualifying existing matrices to demonstrate protection equal to or better than the baseline glass performance. Nor does this proposed effort necessarily require developing new technology concepts; the emphasis is on demonstrating existing technology that is clearly better (reliability, productivity, cost) than current technology, and justifying its use in future or retrofitted facilities. Higher waste processing and disposal efficiency can be realized by performing the engineering analyses and trade studies necessary to select the most efficient methods for processing the full spectrum of wastes across the nuclear complex. This paper will describe technologies being …

  15. Design of novel solar thermal fuels with high-throughput ab initio simulations

    Science.gov (United States)

    Liu, Yun; Grossman, Jeffrey

    2014-03-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. Previously we have predicted a new class of functional materials that have the potential to address these challenges. Recently, we have developed an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening algorithm we have developed can run through large numbers of molecules composed of earth-abundant elements, and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical design principles to guide further STF materials design through the correlation between isomerization enthalpy and structural properties.

  16. High-Throughput 3D Tumor Spheroid Screening Method for Cancer Drug Discovery Using Celigo Image Cytometry.

    Science.gov (United States)

    Kessel, Sarah; Cribbes, Scott; Déry, Olivier; Kuksin, Dmitry; Sincoff, Eric; Qiu, Jean; Chan, Leo Li-Ying

    2017-08-01

    Oncologists have investigated the effect of protein or chemical-based compounds on cancer cells to identify potential drug candidates. Traditionally, the growth inhibitory and cytotoxic effects of the drugs are first measured in 2D in vitro models, and then further tested in 3D xenograft in vivo models. Although the drug candidates can demonstrate promising inhibitory or cytotoxicity results in a 2D environment, similar effects may not be observed under a 3D environment. In this work, we developed an image-based high-throughput screening method for 3D tumor spheroids using the Celigo image cytometer. First, optimal seeding density for tumor spheroid formation was determined by investigating the cell seeding density of U87MG, a human glioblastoma cell line. Next, the dose-response effects of 17-AAG with respect to spheroid size and viability were measured to determine the IC50 value. Finally, the developed high-throughput method was used to measure the dose response of four drugs (17-AAG, paclitaxel, TMZ, and doxorubicin) with respect to the spheroid size and viability. Each experiment was performed simultaneously in the 2D model for comparison. This detection method allowed for a more efficient process to identify highly qualified drug candidates, which may reduce the overall time required to bring a drug to clinical trial.
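
    An IC50 can be determined from dose-response points like these in several ways. The sketch below uses simple log-linear interpolation between the two doses bracketing 50% viability; a four-parameter logistic fit would be the more typical choice in a screening workflow, but interpolation keeps the example dependency-free. The function name is hypothetical, not from the paper.

```python
import math

def ic50(doses, viability):
    """Estimate the IC50 by log-linear interpolation between the two doses
    that bracket 50% viability. doses must be ascending and positive;
    viability values are fractions in [0, 1]. Returns None if 50% viability
    is never crossed."""
    points = list(zip(doses, viability))
    for (d1, v1), (d2, v2) in zip(points, points[1:]):
        if v1 >= 0.5 >= v2:
            frac = (v1 - 0.5) / (v1 - v2)  # position between the two doses
            return 10 ** (math.log10(d1)
                          + frac * (math.log10(d2) - math.log10(d1)))
    return None
```

    The same calculation applies to either the 2D or the 3D spheroid readout, which is what makes side-by-side IC50 comparison between the two models straightforward.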

  17. A high-throughput liquid bead array-based screening technology for Bt presence in GMO manipulation.

    Science.gov (United States)

    Fu, Wei; Wang, Huiyu; Wang, Chenguang; Mei, Lin; Lin, Xiangmei; Han, Xueqing; Zhu, Shuifang

    2016-03-15

    The number of genetically modified organism (GMO) species and their planting areas have grown rapidly during the past ten years. For the purpose of GMO inspection, quarantine and manipulation, we have devised a high-throughput Bt-based GMO screening method based on a liquid bead array. This novel method is based on direct competitive recognition between biotinylated antibodies and bead-coupled antigens, detecting Bt presence in samples containing Bt Cry1Aa, Cry1Ab, Cry1Ac, Cry1Ah, Cry1B, Cry1C, Cry1F, Cry2A, Cry3 or Cry9C. Our method has wide GMO species coverage, such that more than 90% of all commercialized GMO species worldwide can be identified. After optimization and validation of specificity, sensitivity, repeatability and applicability, the method shows high specificity and a quantification sensitivity of 10-50 ng/mL. We then assessed more than 1800 samples from the field and the food market to demonstrate the capacity of our method for high-throughput screening in GMO manipulation. Our method offers an applicable platform for further inspection of and research on GMO plants. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Predictions versus high-throughput experiments in T-cell epitope discovery: competition or synergy?

    DEFF Research Database (Denmark)

    Lundegaard, Claus; Lund, Ole; Nielsen, Morten

    2012-01-01

    Prediction methods as well as experimental methods for T-cell epitope discovery have developed significantly in recent years. High-throughput experimental methods have made it possible to perform full-length protein scans for epitopes restricted to a limited number of MHC alleles. The high costs...... discovery. We expect prediction methods as well as experimental validation methods to continue to develop and that we will soon see clinical trials of products whose development has been guided by prediction methods....

  19. Deep Mutational Scanning: Library Construction, Functional Selection, and High-Throughput Sequencing.

    Science.gov (United States)

    Starita, Lea M; Fields, Stanley

    2015-08-03

    Deep mutational scanning is a highly parallel method that uses high-throughput sequencing to track changes in >10^5 protein variants before and after selection to measure the effects of mutations on protein function. Here we outline the stages of a deep mutational scanning experiment, focusing on the construction of libraries of protein sequence variants and the preparation of Illumina sequencing libraries. © 2015 Cold Spring Harbor Laboratory Press.

  20. A High-Throughput, Adaptive FFT Architecture for FPGA-Based Space-Borne Data Processors

    Science.gov (United States)

    Nguyen, Kayla; Zheng, Jason; He, Yutao; Shah, Biren

    2010-01-01

    Historically, computationally-intensive data processing for space-borne instruments has relied heavily on ground-based computing resources. But with recent advances in the functional densities of Field-Programmable Gate-Arrays (FPGAs), there has been an increasing desire to shift more processing on-board, thereby relaxing the downlink data bandwidth requirements. Fast Fourier Transforms (FFTs) are commonly used building blocks for data processing applications, with a growing need to increase the FFT block size. Many existing FFT architectures have mainly emphasized low power consumption or resource usage, but as the block size of the FFT grows, the throughput is often compromised first. In addition to power and resource constraints, space-borne digital systems are also limited to a small set of space-qualified memory elements, which typically lag behind their commercially available counterparts in capacity and bandwidth. The bandwidth limitation of the external memory creates a bottleneck for a high-throughput FFT design with a large block size. In this paper, we present the Multi-Pass Wide Kernel FFT (MPWK-FFT) architecture for a moderately large block size (32K), with consideration of power consumption and resource usage as well as throughput. We also show that the architecture can be easily adapted to different FFT block sizes with different throughput and power requirements. The resulting design is completely contained within an FPGA without relying on external memories. Implementation results are summarized.
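
    The pass structure of such an FFT can be illustrated in software with a plain iterative radix-2 transform, where each of the log2(N) butterfly passes sweeps the whole block once; this is a generic sketch of the algorithm family, not the MPWK-FFT hardware design:

```python
import cmath

def fft_radix2(x):
    """In-place iterative radix-2 FFT; len(x) must be a power of two.
    Each outer pass sweeps the full array once, mirroring how a
    multi-pass hardware FFT streams a large block through a small kernel."""
    n = len(x)
    assert n and (n & (n - 1)) == 0, "length must be a power of two"
    # bit-reversal reordering
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            x[i], x[j] = x[j], x[i]
    # log2(n) butterfly passes
    m = 2
    while m <= n:
        w_m = cmath.exp(-2j * cmath.pi / m)   # twiddle factor for this pass
        for k in range(0, n, m):
            w = 1.0 + 0j
            for t in range(m // 2):
                u = x[k + t]
                v = w * x[k + t + m // 2]
                x[k + t] = u + v
                x[k + t + m // 2] = u - v
                w *= w_m
        m *= 2
    return x

# a shifted impulse transforms to the complex exponentials
spectrum = fft_radix2([0, 1, 0, 0])
```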

  1. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because blood is easily accessible, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at concentrations orders of magnitude higher. The extreme requirements of measurement sensitivity, dynamic range and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement extremely low throughput, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput, quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  2. An Air-Well sparging minifermenter system for high-throughput protein production.

    Science.gov (United States)

    Deantonio, Cecilia; Sedini, Valentina; Cesaro, Patrizia; Quasso, Fabio; Cotella, Diego; Persichetti, Francesca; Santoro, Claudio; Sblattero, Daniele

    2014-09-14

    Over the last few years, High-Throughput Protein Production (HTPP) has played a crucial role in functional proteomics. High-quality, high-yield and fast recombinant protein production is critical for new HTPP technologies. Escherichia coli is usually the expression system of choice for protein production thanks to its fast growth, ease of handling and high yields of protein produced. Even though shake-flask cultures are widely used, there is an increasing need for easy-to-handle, lab-scale, high-throughput systems. In this article we describe a novel minifermenter system suitable for HTPP. The Air-Well minifermenter system consists of a homogeneous air-sparging device that includes an air diffusion system, and a stainless steel 96-needle plate integrated with a 96-deep-well plate where cultures take place. This system provides aeration to achieve growth to higher optical density compared with classical shaken cultures, without the decrease in pH value and bacterial viability. Moreover, the yield of recombinant protein is up to 3-fold higher, with a considerable improvement in the amount of full-length proteins. High-throughput production of hundreds of proteins in parallel can be obtained by sparging air in a continuous and controlled manner. The system is modular and can be easily modified and scaled up to meet the demands of HTPP.

  3. A Novel High-Throughput Approach to Measure Hydroxyl Radicals Induced by Airborne Particulate Matter

    Directory of Open Access Journals (Sweden)

    Yeongkwon Son

    2015-10-01

    Oxidative stress is one of the key mechanisms linking ambient particulate matter (PM) exposure with various adverse health effects. The oxidative potential of PM has been used to characterize the ability of PM to induce oxidative stress. The hydroxyl radical (•OH) is the most destructive radical produced by PM. However, there is currently no high-throughput approach that can rapidly measure PM-induced •OH for a large number of samples with an automated system. This study evaluated four existing molecular probes (disodium terephthalate, 3′-p-(aminophenyl)fluorescein, coumarin-3-carboxylic acid, and sodium benzoate) for their applicability to measuring •OH induced by PM in a high-throughput cell-free system using fluorescence techniques, based on both our experiments and an assessment of the physicochemical properties of the probes reported in the literature. Disodium terephthalate (TPT) was the most applicable molecular probe for measuring •OH induced by PM, due to its high solubility, the high stability of the corresponding fluorescent product (i.e., 2-hydroxyterephthalic acid), its high yield compared with the other molecular probes, and its stable fluorescence intensity across a wide range of pH environments. TPT was applied in a high-throughput format to measure PM (NIST 1648a)-induced •OH in phosphate buffered saline. The formed fluorescent product was measured at designated time points up to 2 h. The fluorescent product of TPT had a detection limit of 17.59 nM. The soluble fraction of PM contributed approximately 76.9% of the •OH induced by total PM, and the soluble metal ions of PM contributed 57.4% of the overall •OH formation. This study provides a promising, cost-effective, high-throughput method to measure •OH induced by PM on a routine basis.
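
    A detection limit like the 17.59 nM quoted for the TPT product is conventionally computed as three standard deviations of the blank signal divided by the calibration slope. A small sketch of that convention, with made-up blank and calibration numbers (not the paper's data):

```python
from statistics import mean, stdev

def detection_limit(blank_signals, calib_concs, calib_signals):
    """LOD = 3 * sd(blank) / slope, with the slope taken from an
    ordinary least-squares line through the calibration points."""
    mx, my = mean(calib_concs), mean(calib_signals)
    slope = sum((x - mx) * (y - my)
                for x, y in zip(calib_concs, calib_signals)) / \
            sum((x - mx) ** 2 for x in calib_concs)
    return 3 * stdev(blank_signals) / slope

# hypothetical fluorescence readings: 5 blank replicates, 4 standards
lod = detection_limit([1.0, 1.2, 0.8, 1.1, 0.9],   # blank replicates
                      [0, 10, 20, 40],              # nM standards
                      [0.0, 20.0, 40.0, 80.0])      # fluorescence units
```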

  4. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Tanel Pärnamaa

    2017-05-01

    High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy.
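
    The few-shot step, assigning proteins to previously unseen compartments from a handful of labeled feature vectors, can be illustrated with a nearest-centroid classifier; this is a generic stand-in with invented three-dimensional "features", not the paper's actual classifiers or network outputs:

```python
def nearest_centroid(train, query):
    """train: {label: [feature_vectors]}. Assign query to the label
    whose mean feature vector (centroid) is closest in Euclidean distance."""
    def centroid(vecs):
        return [sum(dim) / len(vecs) for dim in zip(*vecs)]
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    cents = {lab: centroid(vs) for lab, vs in train.items()}
    return min(cents, key=lambda lab: dist2(cents[lab], query))

# hypothetical deep-network features for two compartments, two shots each
train = {"nucleus":   [[0.9, 0.1, 0.0], [1.0, 0.2, 0.1]],
         "cytoplasm": [[0.1, 0.8, 0.9], [0.0, 0.9, 1.0]]}
label = nearest_centroid(train, [0.85, 0.15, 0.05])
```

The design point is that a rich feature space learned once makes even a trivially simple classifier effective on new classes with few examples.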

  5. The promise and challenge of high-throughput sequencing of the antibody repertoire

    Science.gov (United States)

    Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R

    2014-01-01

    Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474

  6. tcpl: the ToxCast pipeline for high-throughput screening data.

    Science.gov (United States)

    Filer, Dayne L; Kothiya, Parth; Setzer, R Woodrow; Judson, Richard S; Martin, Matthew T

    2017-02-15

    Large high-throughput screening (HTS) efforts are widely used in drug development and chemical toxicity screening. Wide use and integration of these data can benefit from an efficient, transparent and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform for efficiently storing, normalizing and dose-response modeling of large high-throughput and high-content chemical screening data. The novel dose-response modeling algorithm has been tested against millions of diverse dose-response series, and robustly fits data with outliers and cytotoxicity-related signal loss. tcpl is freely available on the Comprehensive R Archive Network under the GPL-2 license. martin.matt@epa.gov.

  7. Quantitative monitoring of Arabidopsis thaliana growth and development using high-throughput plant phenotyping.

    Science.gov (United States)

    Arend, Daniel; Lange, Matthias; Pape, Jean-Michel; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Mücke, Ingo; Klukas, Christian; Altmann, Thomas; Scholz, Uwe; Junker, Astrid

    2016-08-16

    With the implementation of novel automated, high-throughput methods and facilities in recent years, plant phenomics has developed into a highly interdisciplinary research domain integrating biology, engineering and bioinformatics. Here we present a dataset from a non-invasive, high-throughput plant phenotyping experiment, which uses image- and image-analysis-based approaches to monitor the growth and development of 484 Arabidopsis thaliana (thale cress) plants. The result is a comprehensive dataset of images and extracted phenotypical features. Such datasets require detailed documentation, standardized description of experimental metadata, and sustainable data storage and publication in order to ensure the reproducibility of experiments, data reuse and comparability within the scientific community. Therefore, the dataset presented here has been annotated using the standardized ISA-Tab format, following the recently published recommendations for the semantic description of plant phenotyping experiments.

  8. High-throughput clone screening followed by protein expression cross-check: A visual assay platform.

    Science.gov (United States)

    Bose, Partha Pratim; Kumar, Prakash

    2017-01-01

    In high-throughput biotechnology and structural biology, molecular cloning is an essential prerequisite for attaining high yields of recombinant protein. However, a rapid, cost-effective, easy clone screening protocol is still required to identify colonies with desired insert along with a cross check method to certify the expression of the desired protein as the end product. We report an easy, fast, sensitive and cheap visual clone screening and protein expression cross check protocol employing gold nanoparticle based plasmonic detection phenomenon. This is a non-gel, non-PCR based visual detection technique, which can be used as simultaneous high throughput clone screening followed by the determination of expression of desired protein. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Applying active learning to high-throughput phenotyping algorithms for electronic health records data.

    Science.gov (United States)

    Chen, Yukun; Carroll, Robert J; Hinz, Eugenia R McPeek; Shah, Anushi; Eyler, Anne E; Denny, Joshua C; Xu, Hua

    2013-12-01

    Generalizable, high-throughput phenotyping methods based on supervised machine learning (ML) algorithms could significantly accelerate the use of electronic health records data for clinical and translational research. However, they often require large numbers of annotated samples, which are costly and time-consuming to review. We investigated the use of active learning (AL) in ML-based phenotyping algorithms. We integrated an uncertainty sampling AL approach with support vector machines-based phenotyping algorithms and evaluated its performance using three annotated disease cohorts including rheumatoid arthritis (RA), colorectal cancer (CRC), and venous thromboembolism (VTE). We investigated performance using two types of feature sets: unrefined features, which contained at least all clinical concepts extracted from notes and billing codes; and a smaller set of refined features selected by domain experts. The performance of the AL was compared with a passive learning (PL) approach based on random sampling. Our evaluation showed that AL outperformed PL on three phenotyping tasks. When unrefined features were used in the RA and CRC tasks, AL reduced the number of annotated samples required to achieve an area under the curve (AUC) score of 0.95 by 68% and 23%, respectively. AL also achieved a reduction of 68% for VTE with an optimal AUC of 0.70 using refined features. As expected, refined features improved the performance of phenotyping classifiers and required fewer annotated samples. This study demonstrated that AL can be useful in ML-based phenotyping methods. Moreover, AL and feature engineering based on domain knowledge could be combined to develop efficient and generalizable phenotyping methods.
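
    Uncertainty sampling, the AL strategy evaluated above, simply queries annotators with the unlabeled records whose predicted probability sits closest to the decision boundary. A minimal sketch of that selection step (a generic illustration with hypothetical record IDs and scores, not the paper's SVM pipeline):

```python
def most_uncertain(probs, k):
    """probs: {record_id: P(positive)} from the current classifier.
    Return the k ids closest to the 0.5 decision boundary, i.e. the
    samples whose annotation is expected to be most informative."""
    return sorted(probs, key=lambda rid: abs(probs[rid] - 0.5))[:k]

# hypothetical phenotype probabilities for unlabeled patient records
scores = {"pt1": 0.97, "pt2": 0.52, "pt3": 0.10, "pt4": 0.45, "pt5": 0.70}
to_annotate = most_uncertain(scores, 2)
```

After each annotation round the classifier is retrained and the scores recomputed, which is what drives the reduction in labeled samples relative to random (passive) sampling.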

  10. Development of a high-throughput screen for inhibitors of Epstein-Barr virus EBNA1.

    Science.gov (United States)

    Thompson, Scott; Messick, Troy; Schultz, David C; Reichman, Melvin; Lieberman, Paul M

    2010-10-01

    Latent infection with Epstein-Barr virus (EBV) is a carcinogenic cofactor in several lymphoid and epithelial cell malignancies. At present, there are no small-molecule inhibitors that specifically target EBV latent infection or latency-associated oncoproteins. EBNA1 is an EBV-encoded sequence-specific DNA binding protein that is consistently expressed in EBV-associated tumors and required for stable maintenance of the viral genome in proliferating cells. EBNA1 is also thought to provide cell survival function in latently infected cells. In this work, the authors describe the development of a biochemical high-throughput screening (HTS) method using a homogeneous fluorescence polarization (FP) assay monitoring EBNA1 binding to its cognate DNA binding site. An FP-based counterscreen was developed using another EBV-encoded DNA binding protein, Zta, and its cognate DNA binding site. The authors demonstrate that EBNA1 binding to a fluorescent-labeled DNA probe provides a robust assay with a Z factor consistently greater than 0.6. A pilot screen of a small-molecule library of ~14,000 compounds identified 3 structurally related molecules that selectively inhibit EBNA1 but not Zta. All 3 compounds had activity in a cell-based assay specific for the disruption of EBNA1 transcription repression function. One of the compounds was effective in reducing EBV genome copy number in Raji Burkitt lymphoma cells. These experiments provide a proof of concept that small-molecule inhibitors of EBNA1 can be identified by biochemical HTS of compound libraries. Further screening in conjunction with medicinal chemistry optimization may provide a selective inhibitor of EBNA1 and EBV latent infection.
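
    The Z factor quoted above is the standard HTS assay-quality statistic, Z = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg|. A quick sketch with hypothetical fluorescence-polarization control readings (not the paper's data):

```python
from statistics import mean, stdev

def z_factor(pos, neg):
    """Assay quality from positive and negative control readings;
    Z > 0.5 is conventionally considered an excellent screening assay."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# hypothetical FP controls (mP units)
bound   = [200, 205, 195, 202, 198]   # protein + labeled probe, no inhibitor
unbound = [80, 82, 78, 81, 79]        # free probe
z = z_factor(bound, unbound)
```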

  11. Kaleidaseq: a Web-based tool to monitor data flow in a high throughput sequencing facility.

    Science.gov (United States)

    Dedhia, N N; McCombie, W R

    1998-03-01

    Tracking data flow in high throughput sequencing is important in maintaining a consistent number of successfully sequenced samples, making decisions on scheduling the flow of sequencing steps, resolving problems at various steps and tracking the status of different projects. This is especially critical when the laboratory is handling a multitude of projects. We have built a Web-based data flow tracking package, called Kaleidaseq, which allows us to monitor the flow and quality of sequencing samples through the steps of preparation of library plates, plaque-picking, preparation of templates, conducting sequencing reactions, loading of samples on gels, base-calling the traces, and calculating the quality of the sequenced samples. Kaleidaseq's suite of displays allows for outstanding monitoring of the production sequencing process. The online display of current information that Kaleidaseq provides on both project status and process queues sorted by project enables accurate real-time assessment of the necessary samples that must be processed to complete the project. This information allows the process manager to allocate future resources optimally and schedule tasks according to scientific priorities. Quality of the sequenced samples can be tracked on a daily basis, which allows the sequencing laboratory to maintain a steady performance level and quickly resolve dips in quality. Kaleidaseq has a simple easy-to-use interface that allows access to all major functions and process queues from one Web page. This software package is modular and designed to allow additional processing steps and new monitoring variables to be added and tracked with ease. Access to the underlying relational database is through the Perl DBI interface, which allows for the use of different relational databases. Kaleidaseq is available for free use by the academic community from http://www.cshl.org/kaleidaseq.

  12. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit

    National Research Council Canada - National Science Library

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-01-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated...

  13. Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.

    Science.gov (United States)

    Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil

    2015-07-17

    In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.
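
    The per-plasmid economics follow from simple coverage arithmetic: fold coverage is total sequenced bases over construct length, and cost per construct is run cost over the number of constructs multiplexed. A sketch with round numbers (the read counts and run cost here are illustrative assumptions, not the paper's exact figures):

```python
def fold_coverage(n_reads, read_len, construct_len):
    """Average sequencing depth across one construct."""
    return n_reads * read_len / construct_len

def cost_per_construct(run_cost, n_constructs):
    """Sequencing cost per plasmid when n_constructs share one run."""
    return run_cost / n_constructs

# e.g. 500 reads of 150 bp over a 5 kb plasmid, and a $12,000 run
# (hypothetical) split across 4000 barcoded constructs
depth = fold_coverage(500, 150, 5000)
price = cost_per_construct(12000, 4000)
```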

  14. Streptococcus mutans protein synthesis during mixed-species biofilm development by high-throughput quantitative proteomics.

    Directory of Open Access Journals (Sweden)

    Marlise I Klein

    Biofilms formed on tooth surfaces are comprised of mixed microbiota enmeshed in an extracellular matrix. Oral biofilms are constantly exposed to environmental changes, which influence the microbial composition, matrix formation and expression of virulence. Streptococcus mutans and sucrose are key modulators associated with the evolution of virulent-cariogenic biofilms. In this study, we used a high-throughput quantitative proteomics approach to examine how S. mutans produces relevant proteins that facilitate its establishment and optimal survival during mixed-species biofilm development induced by sucrose. Biofilms of S. mutans, alone or mixed with Actinomyces naeslundii and Streptococcus oralis, were initially formed onto a saliva-coated hydroxyapatite surface under a carbohydrate-limiting condition. Sucrose (1%, w/v) was then introduced to cause environmental changes and to induce biofilm accumulation. A multidimensional protein identification technology (MudPIT) approach detected up to 60% of proteins encoded by S. mutans within biofilms. Specific proteins associated with exopolysaccharide matrix assembly, metabolic and stress adaptation processes were highly abundant as the biofilm transited from earlier to later developmental stages following sucrose introduction. Our results indicate that S. mutans within a mixed-species biofilm community increases the expression of specific genes associated with glucan synthesis and remodeling (gtfBC, dexA) and glucan-binding (gbpB) during this transition (P<0.05). Furthermore, S. mutans up-regulates specific adaptation mechanisms to cope with acidic environments (F1F0-ATPase system, fatty acid biosynthesis, branched chain amino acids metabolism), and molecular chaperones (GroEL). Interestingly, the protein levels and gene expression are in general augmented when S. mutans forms mixed-species biofilms (vs. single-species biofilms) demonstrating fundamental differences in the matrix assembly, survival and biofilm

  15. Streptococcus mutans Protein Synthesis during Mixed-Species Biofilm Development by High-Throughput Quantitative Proteomics

    Science.gov (United States)

    Klein, Marlise I.; Xiao, Jin; Lu, Bingwen; Delahunty, Claire M.; Yates, John R.; Koo, Hyun

    2012-01-01

    Biofilms formed on tooth surfaces are comprised of mixed microbiota enmeshed in an extracellular matrix. Oral biofilms are constantly exposed to environmental changes, which influence the microbial composition, matrix formation and expression of virulence. Streptococcus mutans and sucrose are key modulators associated with the evolution of virulent-cariogenic biofilms. In this study, we used a high-throughput quantitative proteomics approach to examine how S. mutans produces relevant proteins that facilitate its establishment and optimal survival during mixed-species biofilm development induced by sucrose. Biofilms of S. mutans, alone or mixed with Actinomyces naeslundii and Streptococcus oralis, were initially formed onto a saliva-coated hydroxyapatite surface under a carbohydrate-limiting condition. Sucrose (1%, w/v) was then introduced to cause environmental changes, and to induce biofilm accumulation. A multidimensional protein identification technology (MudPIT) approach detected up to 60% of proteins encoded by S. mutans within biofilms. Specific proteins associated with exopolysaccharide matrix assembly, metabolic and stress adaptation processes were highly abundant as the biofilm transited from earlier to later developmental stages following sucrose introduction. Our results indicate that S. mutans within a mixed-species biofilm community increases the expression of specific genes associated with glucan synthesis and remodeling (gtfBC, dexA) and glucan-binding (gbpB) during this transition (P<0.05). Furthermore, S. mutans up-regulates specific adaptation mechanisms to cope with acidic environments (F1F0-ATPase system, fatty acid biosynthesis, branched chain amino acids metabolism), and molecular chaperones (GroEL). Interestingly, the protein levels and gene expression are in general augmented when S. mutans forms mixed-species biofilms (vs. single-species biofilms) demonstrating fundamental differences in the matrix assembly, survival and biofilm maintenance in the presence of other

  16. A low cost and high throughput magnetic bead-based immuno-agglutination assay in confined droplets.

    Science.gov (United States)

    Teste, Bruno; Ali-Cherif, Anaïs; Viovy, Jean Louis; Malaquin, Laurent

    2013-06-21

    Although passive immuno-agglutination assays consist of one-step, simple procedures, they are usually not adapted for high-throughput analyses, and they require expensive and bulky equipment for the quantitation steps. Here we demonstrate a low-cost, multimodal and high-throughput immuno-agglutination assay that relies on a combination of magnetic beads (MBs), droplet microfluidics and magnetic tweezers. Antibody-coated MBs were used as a capture support in the homogeneous phase. Following the immune interaction, water-in-oil droplets containing MBs and analytes were generated and transported in Teflon tubing. When passing between magnetic tweezers, the MBs contained in the droplets were magnetically confined in order to enhance the agglutination rate and kinetics. When the magnetic field is released, the internal recirculation flows in the droplet induce shear forces that favor MB redispersion. In the presence of the analyte, the system preserves specific interactions and MBs stay in the aggregated state, while in the case of a non-specific analyte, redispersion of the particles occurs. The analyte quantitation procedure relies on the MB redispersion rate within the droplet. The influence of different parameters such as magnetic field intensity, flow rate and MB concentration on the agglutination performance has been investigated and optimized. Although the immuno-agglutination assay described in this work may not compete with enzyme-linked immunosorbent assay (ELISA) in terms of sensitivity, it offers major advantages regarding reagent consumption (analysis is performed in sub-microliter droplets) and platform cost, yielding very cheap analyses. Moreover, the fully automated analysis procedure provides reproducible analyses with throughput well above that of existing technologies. We demonstrated the detection of biotinylated alkaline phosphatase in 100 nL sample volumes with an analysis rate of 300 assays per hour and a limit of detection of 100 pM.

  17. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments limited by low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, that provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). The method further allows these thermal properties to be determined for multiple samples rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
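
    For the non-phase-change case, one common route from time-domain temperature data to heat capacity is a lumped-capacitance cooling model, T(t) = T_env + (T_0 − T_env)e^(−t/τ) with τ = C/(hA), so the time constant fitted from the camera's temperature trace yields C = τhA. A sketch under that textbook assumption, with invented readings and coefficients (not the paper's derivation or data):

```python
import math

def time_constant(t1, T1, t2, T2, T_env):
    """Extract tau from two samples of an exponential cooling curve."""
    return (t2 - t1) / math.log((T1 - T_env) / (T2 - T_env))

def heat_capacity(tau, h, area):
    """Lumped-capacitance model: C = tau * h * A  (J/K)."""
    return tau * h * area

# hypothetical IR-camera readings: T_env = 25 C, T(0 s) = 65 C, T(10 s) = 39.7 C
tau = time_constant(0.0, 65.0, 10.0, 39.7, 25.0)
# assumed convection coefficient 50 W/(m^2 K) over a 1 cm^2 sample
C = heat_capacity(tau, h=50.0, area=1e-4)
```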

  18. Synthetic Substrate for Application in both High and Low Throughput Assays for Botulinum Neurotoxin B Protease Inhibitors

    OpenAIRE

    Salzameda, Nicholas T.; Barbieri, Joseph T.; Janda, Kim D.

    2009-01-01

    A FRET peptide substrate was synthesized and evaluated for enzymatic cleavage by the BoNT/B light chain protease. The FRET substrate was found to be useful in both a high throughput assay to uncover initial “hits” and a low throughput HPLC assay to determine kinetic parameters and modes of inhibition.

  19. Design and construction of a first-generation high-throughput integrated robotic molecular biology platform for bioenergy applications.

    Science.gov (United States)

    Hughes, Stephen R; Butt, Tauseef R; Bartolett, Scott; Riedmuller, Steven B; Farrelly, Philip

    2011-08-01

    The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly clone and express heterologous gene open reading frames in bacteria and yeast and to screen large numbers of expressed proteins for optimized function are an important technology for improving microbial strains for biofuel production. The process involves the production of full-length complementary DNA libraries as a source of plasmid-based clones to express the desired proteins in active form for determination of their functions. Proteins that were identified by high-throughput screening as having desired characteristics are overexpressed in microbes to enable them to perform functions that will allow more cost-effective and sustainable production of biofuels. Because the plasmid libraries are composed of several thousand unique genes, automation of the process is essential. This review describes the design and implementation of an automated integrated programmable robotic workcell capable of producing complementary DNA libraries, colony picking, isolating plasmid DNA, transforming yeast and bacteria, expressing protein, and performing appropriate functional assays. These operations will allow tailoring microbial strains to use renewable feedstocks for production of biofuels, bioderived chemicals, fertilizers, and other coproducts for profitable and sustainable biorefineries. Published by Elsevier Inc.

  20. High-Throughput Quantification of Nanoparticle Degradation Using Computational Microscopy and Its Application to Drug Delivery Nanocapsules

    KAUST Repository

    Ray, Aniruddha

    2017-04-25

    Design and synthesis of degradable nanoparticles are very important in drug delivery and biosensing fields. Although accurate assessment of nanoparticle degradation rate would improve the characterization and optimization of drug delivery vehicles, current methods rely on estimating the size of the particles at discrete points over time using, for example, electron microscopy or dynamic light scattering (DLS), among other techniques, all of which have drawbacks and practical limitations. There is a significant need for a high-throughput and cost-effective technology to accurately monitor nanoparticle degradation as a function of time and using small amounts of sample. To address this need, here we present two different computational imaging-based methods for monitoring and quantification of nanoparticle degradation. The first method is suitable for discrete testing, where a computational holographic microscope is designed to track the size changes of protease-sensitive protein-core nanoparticles following degradation, by periodically sampling a subset of particles mixed with proteases. In the second method, a sandwich structure was utilized to observe, in real-time, the change in the properties of liquid nanolenses that were self-assembled around degrading nanoparticles, permitting continuous monitoring and quantification of the degradation process. These cost-effective holographic imaging-based techniques enable high-throughput monitoring of the degradation of any type of nanoparticle, using an extremely small amount of sample volume that is at least 3 orders of magnitude smaller than what is required by, for example, DLS-based techniques.

  1. Assay development and high-throughput screening for small molecule inhibitors of a Vibrio cholerae stress response pathway

    Directory of Open Access Journals (Sweden)

    Stanbery L

    2017-09-01

    Full Text Available Laura Stanbery, Jyl S Matson Department of Medical Microbiology and Immunology, College of Medicine and Life Sciences, The University of Toledo, Toledo, OH, USA Abstract: Antibiotics are important adjuncts to oral rehydration therapy in cholera disease management. However, due to the rapid emergence of resistance to the antibiotics used to treat cholera, therapeutic options are becoming limited. Therefore, there is a critical need to develop additional therapeutics to aid in the treatment of cholera. Previous studies showed that the extracytoplasmic stress response (σE) pathway of Vibrio cholerae is required for full virulence of the organism. The pathway is also required for bacterial growth in the presence of ethanol. Therefore, we exploited this ethanol sensitivity phenotype in order to develop a screen for inhibitors of the pathway, with the aim of also inhibiting virulence of the pathogen. Here we describe the optimization and implementation of our high-throughput screening strategy. From a primary screen of over 100,000 compounds, we have identified seven compounds whose activity was validated in both the primary screen and counterscreens. These compounds have the potential to be developed into therapeutic agents for cholera and will also be valuable probes for uncovering basic molecular mechanisms of an important cause of diarrheal disease. Keywords: Vibrio cholerae, stress response, σE, high-throughput screening

  2. Target-dependent enrichment of virions determines the reduction of high-throughput sequencing in virus discovery.

    Directory of Open Access Journals (Sweden)

    Randi Holm Jensen

    Full Text Available Viral infections cause many different diseases stemming both from well-characterized viral pathogens and from emerging viruses, and the search for novel viruses continues to be of great importance. High-throughput sequencing is an important technology for this purpose. However, viral nucleic acids often constitute a minute proportion of the total genetic material in a sample from infected tissue. Techniques to enrich viral targets in high-throughput sequencing have been reported, but the sensitivity of such methods is not well established. This study compares different library preparation techniques targeting both DNA and RNA with and without virion enrichment. By optimizing the selection of intact virus particles through both physical and enzymatic approaches, we assessed the effectiveness of the specific enrichment of viral sequences as compared to non-enriched sample preparations by selectively looking for and counting read sequences obtained from shotgun sequencing. Using shotgun sequencing of total DNA or RNA, viral targets were detected at concentrations corresponding to the predicted level, providing a foundation for estimating the effectiveness of virion enrichment. Virion enrichment typically produced a 1000-fold increase in the proportion of DNA virus sequences. For RNA virions the gain was less pronounced, with a maximum 13-fold increase. This enrichment varied between the different sample concentrations, with no clear trend. Although less sequencing was required to identify target sequences, it was not evident from our data that virion enrichment achieved a lower detection level than shotgun sequencing.
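
    The fold-enrichment figures quoted above are simply ratios of viral read proportions with and without virion enrichment. A minimal sketch of that calculation (the function name and read counts are illustrative, not values from the study):

    ```python
    def fold_enrichment(viral_enriched, total_enriched, viral_shotgun, total_shotgun):
        """Fold change in the proportion of viral reads after virion enrichment."""
        p_enriched = viral_enriched / total_enriched
        p_shotgun = viral_shotgun / total_shotgun
        return p_enriched / p_shotgun

    # A DNA virus at 0.001% of shotgun reads rising to 1% after enrichment:
    print(fold_enrichment(10_000, 1_000_000, 10, 1_000_000))  # -> 1000.0
    ```

    The same ratio with a 13-fold gain would correspond to the best RNA-virion result reported above.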

  3. CE-SSCP and CE-FLA, simple and high-throughput alternatives for fungal diversity studies.

    Science.gov (United States)

    Zinger, Lucie; Gury, Jérôme; Alibeu, Olivier; Rioux, Delphine; Gielly, Ludovic; Sage, Lucile; Pompanon, François; Geremia, Roberto A

    2008-01-01

    Fungal communities are key components of soil, but the study of their ecological significance is limited by a lack of appropriate methods. For instance, the assessment of fungal occurrence and spatio-temporal variation in soil requires the analysis of a large number of samples. Molecular signature methods provide a useful tool to monitor these microbial communities and can be easily adapted to capillary electrophoresis (CE), allowing high-throughput studies. Here we assess the suitability of CE-FLA (Fragment Length Polymorphism, denaturing conditions) and CE-SSCP (Single-Stranded Conformation Polymorphism, native conditions) for environmental studies, since they require only a short molecular marker and no post-PCR treatment. We amplified the ITS1 region from 22 fungal strains isolated from an alpine ecosystem and from total genomic DNA of alpine and infiltration basin soils. The CE-FLA and CE-SSCP separated 17 and 15 peaks, respectively, from a mixture of 19 strains. For the alpine soil-metagenomic DNA, the FLA displayed more peaks than the SSCP and the converse result was found for infiltration basin sediments. We concluded that CE-FLA and CE-SSCP of the ITS1 region provided complementary information. In order to improve CE-SSCP sensitivity, we tested its resolution according to migration temperature and found 32 degrees C to be optimal. Because of their simplicity, speed and reproducibility, we found that these two methods were promising for high-throughput studies of soil fungal communities.

  4. Inertial-ordering-assisted droplet microfluidics for high-throughput single-cell RNA-sequencing.

    Science.gov (United States)

    Moon, Hui-Sung; Je, Kwanghwi; Min, Jae-Woong; Park, Donghyun; Han, Kyung-Yeon; Shin, Seung-Ho; Park, Woong-Yang; Yoo, Chang Eun; Kim, Shin-Hyun

    2018-02-27

    Single-cell RNA-seq reveals the cellular heterogeneity inherent in a population of cells, which is very important in many clinical and research applications. Recent advances in droplet microfluidics have achieved the automatic isolation, lysis, and labeling of single cells in droplet compartments without complex instrumentation. However, barcoding errors caused by multiple beads in a droplet during cell encapsulation, and the insufficient throughput that results from the low bead concentrations needed to avoid such multiplets, remain important challenges for precise and efficient expression profiling of single cells. In this study, we developed a new droplet-based microfluidic platform that significantly improved throughput while reducing barcoding errors through deterministic encapsulation of inertially ordered beads. Highly concentrated beads carrying oligonucleotide barcodes were spontaneously ordered in a spiral channel by an inertial effect and were encapsulated in droplets one-by-one, while cells were simultaneously encapsulated in the same droplets. The deterministic encapsulation of beads resulted in a high fraction of single-bead droplets and rare multiple-bead droplets even though the bead concentration was increased to 1000 μl⁻¹, which diminished barcoding errors and enabled accurate high-throughput barcoding. We successfully validated our device with single-cell RNA-seq. In addition, we found that multiple-bead droplets, generated using a normal Drop-Seq device with a high concentration of beads, underestimated transcript numbers and overestimated cell numbers. This accurate high-throughput platform can expand the capability and practicality of Drop-Seq in single-cell analysis.
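
    The throughput penalty that deterministic ordering avoids follows directly from Poisson statistics: with random bead loading, keeping multiplets rare forces most droplets to be empty. A back-of-the-envelope sketch (the mean occupancy value is illustrative):

    ```python
    import math

    def poisson_bead_fractions(lam):
        """Fractions of droplets with zero, one, and multiple beads
        under random (Poisson) bead loading at mean occupancy lam."""
        p0 = math.exp(-lam)
        p1 = lam * math.exp(-lam)
        return p0, p1, 1.0 - p0 - p1

    # Dilute loading (lam = 0.1): multiplets are rare (~0.5% of droplets),
    # but roughly 90% of droplets carry no bead at all.
    p0, p1, p_multi = poisson_bead_fractions(0.1)
    ```

    Inertial ordering breaks this trade-off by delivering beads one-by-one instead of at random, so the single-bead fraction can stay high even at high bead concentration.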

  5. Development of High-Throughput Quantitative Assays for Glucose Uptake in Cancer Cell Lines

    Science.gov (United States)

    Hassanein, Mohamed; Weidow, Brandy; Koehler, Elizabeth; Bakane, Naimish; Garbett, Shawn; Shyr, Yu; Quaranta, Vito

    2013-01-01

    Purpose Metabolism, and especially glucose uptake, is a key quantitative cell trait that is closely linked to cancer initiation and progression. Therefore, developing high-throughput assays for measuring glucose uptake in cancer cells would be invaluable for simultaneous comparisons of multiple cell lines and microenvironmental conditions. This study was designed with two specific aims in mind: the first was to develop and validate a high-throughput screening method for quantitative assessment of glucose uptake in “normal” and tumor cells using the fluorescent 2-deoxyglucose analog 2-[N-(7-nitrobenz-2-oxa-1,3-diazol-4-yl)amino]-2-deoxyglucose (2-NBDG), and the second was to develop an image-based, quantitative, single-cell assay for measuring glucose uptake using the same probe to dissect the full spectrum of metabolic variability within populations of tumor cells in vitro at higher resolution. Procedure The kinetics of population-based glucose uptake was evaluated for MCF10A mammary epithelial and CA1d breast cancer cell lines, using 2-NBDG and a fluorometric microplate reader. Glucose uptake for the same cell lines was also examined at the single-cell level using high-content automated microscopy coupled with semi-automated cell-cytometric image analysis approaches. Statistical treatments were also implemented to analyze intra-population variability. Results Our results demonstrate that the high-throughput fluorometric assay using 2-NBDG is a reliable method to assess population-level kinetics of glucose uptake in cell lines in vitro. Similarly, single-cell image-based assays and analyses of 2-NBDG fluorescence proved an effective and accurate means for assessing glucose uptake, which revealed that breast tumor cell lines display intra-population variability that is modulated by growth conditions. Conclusions These studies indicate that 2-NBDG can be used to aid in the high-throughput analysis of the influence of chemotherapeutics on glucose uptake in cancer

  6. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Andre; Bruno, Kenneth S.; Collett, James R.; Baker, Scott E.; Seiboth, Bernhard; Kubicek, Christian P.; Schmoll, Monika

    2012-01-02

    The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  7. PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.

    Science.gov (United States)

    Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R

    2017-10-01

    High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old-type specimens, which yield small amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries on the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.
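
    One way to see why the extra cycles added so few mutations: under the rough, pessimistic assumption that every base is copied once per cycle, expected polymerase errors per read scale linearly with cycle count. A sketch with an assumed error rate in the range typical of non-proofreading polymerases (all numbers here are illustrative, not from the study):

    ```python
    def expected_errors_per_read(read_len, cycles, error_rate=1e-5):
        """Pessimistic linear estimate: assume every base is copied once per
        cycle, each copy introducing errors at `error_rate` per base."""
        return read_len * cycles * error_rate

    # Even at 60 cycles, a ~300 bp COI amplicon accumulates well under one
    # expected polymerase error per read under these assumptions.
    print(expected_errors_per_read(300, 60))
    ```

    This rough estimate is consistent with the study's observation that even 60 cycles did not produce a problematic number of point mutations.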

  8. High-throughput DNA sequencing errors are reduced by orders of magnitude using circle sequencing

    Science.gov (United States)

    Lou, Dianne I.; Hussmann, Jeffrey A.; McBee, Ross M.; Acevedo, Ashley; Andino, Raul; Press, William H.; Sawyer, Sara L.

    2013-01-01

    A major limitation of high-throughput DNA sequencing is the high rate of erroneous base calls produced. For instance, Illumina sequencing machines produce errors at a rate of ∼0.1–1 × 10⁻² per base sequenced. These technologies typically produce billions of base calls per experiment, translating to millions of errors. We have developed a unique library preparation strategy, “circle sequencing,” which allows for robust downstream computational correction of these errors. In this strategy, DNA templates are circularized, copied multiple times in tandem with a rolling circle polymerase, and then sequenced on any high-throughput sequencing machine. Each read produced is computationally processed to obtain a consensus sequence of all linked copies of the original molecule. Physically linking the copies ensures that each copy is independently derived from the original molecule and allows for efficient formation of consensus sequences. The circle-sequencing protocol precedes standard library preparations and is therefore suitable for a broad range of sequencing applications. We tested our method using the Illumina MiSeq platform and obtained errors in our processed sequencing reads at a rate as low as 7.6 × 10⁻⁶ per base sequenced, dramatically improving the error rate of Illumina sequencing and putting error on par with low-throughput, but highly accurate, Sanger sequencing. Circle sequencing also had substantially higher efficiency and lower cost than existing barcode-based schemes for correcting sequencing errors. PMID:24243955
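
    The error-correction step described above reduces, at its core, to a per-position majority vote across the tandem copies of each template. A simplified sketch of that vote (real pipelines must also locate the copy boundaries within each read and handle indels):

    ```python
    from collections import Counter

    def consensus(copies):
        """Majority-vote consensus across tandem copies of one template.
        Because sequencing errors occur independently in each copy, a
        per-position vote recovers the original base with high probability."""
        return "".join(
            Counter(bases).most_common(1)[0][0]
            for bases in zip(*copies)
        )

    # Three rolling-circle copies of the same template, each with one error:
    reads = ["ACGTACGT", "ACGAACGT", "ACGTACCT"]
    print(consensus(reads))  # -> ACGTACGT
    ```

    With three copies and a per-base error rate p, two copies must err at the same position with the same base for the vote to fail, which is why the consensus error rate falls by orders of magnitude.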

  9. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Science.gov (United States)

    Prashar, Ankush; Yildiz, Jane; McNicol, James W; Bryan, Glenn J; Jones, Hamlyn G

    2013-01-01

    The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
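
    The normalising step mentioned above, expressing each genotype's canopy temperature as a difference from its image mean, removes image-to-image environmental drift so that genotypes measured in different images remain comparable. A minimal sketch (genotype labels and temperature values are invented):

    ```python
    def normalize_canopy_temps(image_temps):
        """Express each genotype's mean canopy temperature as a deviation
        from the image mean, removing image-to-image environmental drift."""
        mean_t = sum(image_temps.values()) / len(image_temps)
        return {g: round(t - mean_t, 2) for g, t in image_temps.items()}

    # One thermal image, four genotypes (degrees C); cooler canopies
    # (higher stomatal conductance) get negative deviations.
    image = {"G1": 24.1, "G2": 25.3, "G3": 23.9, "G4": 24.7}
    print(normalize_canopy_temps(image))
    ```

    Averaging these deviations for each genotype across overlapping images then gives the replicated, drift-free rankings described in the abstract.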

  10. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Directory of Open Access Journals (Sweden)

    Schuster André

    2012-01-01

    Full Text Available Abstract Background The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. Results Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. Conclusions Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  11. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.

  12. Droplet microfluidic technology for single-cell high-throughput screening.

    Science.gov (United States)

    Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L

    2009-08-25

    We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at very high throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically coded droplet library enabling the identification of the droplet composition during assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together, our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.

  13. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review the recent advances in immune repertoire studies of infectious diseases achieved by both traditional techniques and high-throughput sequencing techniques. High-throughput sequencing techniques enable the determination of complementary regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for further development of novel diagnostic markers, immunotherapies and vaccines.

  14. Miniaturization of High-Throughput Epigenetic Methyltransferase Assays with Acoustic Liquid Handling.

    Science.gov (United States)

    Edwards, Bonnie; Lesnick, John; Wang, Jing; Tang, Nga; Peters, Carl

    2016-02-01

    Epigenetics continues to emerge as an important target class for drug discovery and cancer research. As programs scale to evaluate many new targets related to epigenetic expression, new tools and techniques are required to enable efficient and reproducible high-throughput epigenetic screening. Assay miniaturization increases screening throughput and reduces operating costs. Echo liquid handlers can transfer compounds, samples, reagents, and beads in submicroliter volumes to high-density assay formats using only acoustic energy, with no contact or tips required. This eliminates tip costs and reduces the risk of reagent carryover. In this study, we demonstrate the miniaturization of a methyltransferase assay using Echo liquid handlers and two different assay technologies: AlphaLISA from PerkinElmer and EPIgeneous HTRF from Cisbio. © 2015 Society for Laboratory Automation and Screening.

  15. An automated system for high-throughput single cell-based breeding

    Science.gov (United States)

    Yoshimoto, Nobuo; Kida, Akiko; Jie, Xu; Kurokawa, Masaya; Iijima, Masumi; Niimi, Tomoaki; Maturana, Andrés D.; Nikaido, Itoshi; Ueda, Hiroki R.; Tatematsu, Kenji; Tanizawa, Katsuyuki; Kondo, Akihiko; Fujii, Ikuo; Kuroda, Shun'ichi

    2013-01-01

    When selecting the most appropriate cells from the huge numbers in a cell library for practical use in regenerative medicine and the production of various biopharmaceuticals, the cell heterogeneity often found even in an isogenic cell population limits the refinement of clonal cell culture. Here, we demonstrated high-throughput screening of the most suitable cells in a cell library by an automated undisruptive single-cell analysis and isolation system, followed by expansion of isolated single cells. This system enabled establishment of the most suitable cells, such as embryonic stem cells with the highest expression of the pluripotency marker Rex1 and hybridomas with the highest antibody secretion, which could not be achieved by conventional high-throughput cell screening systems (e.g., a fluorescence-activated cell sorter). This single cell-based breeding system may be a powerful tool to analyze stochastic fluctuations and delineate their molecular mechanisms. PMID:23378922

  16. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.

  17. Current developments in high-throughput analysis for microalgae cellular contents.

    Science.gov (United States)

    Lee, Tsung-Hua; Chang, Jo-Shu; Wang, Hsiang-Yu

    2013-11-01

    Microalgae have emerged as one of the most promising feedstocks for biofuels and bio-based chemical production. However, due to the lack of effective tools enabling rapid and high-throughput analysis of the content of microalgae biomass, the efficiency of screening and identification of microalgae with desired functional components from the natural environment is usually quite low. Moreover, the real-time monitoring of the production of target components from microalgae is also difficult. Recently, research efforts focusing on overcoming this limitation have started. In this review, the recent development of high-throughput methods for analyzing microalgae cellular contents is summarized. The future prospects and impacts of these detection methods in microalgae-related processing and industries are also addressed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Automated High-Throughput Root Phenotyping of Arabidopsis thaliana Under Nutrient Deficiency Conditions.

    Science.gov (United States)

    Satbhai, Santosh B; Göschl, Christian; Busch, Wolfgang

    2017-01-01

    The central question of genetics is how a genotype determines the phenotype of an organism. Genetic mapping approaches are a key for finding answers to this question. In particular, genome-wide association (GWA) studies have been rapidly adopted to study the architecture of complex quantitative traits. This was only possible due to the improvement of high-throughput and low-cost phenotyping methodologies. In this chapter we provide a detailed protocol for obtaining root trait data from the model species Arabidopsis thaliana using the semiautomated, high-throughput phenotyping pipeline BRAT (Busch-lab Root Analysis Toolchain) for early root growth under the stress condition of iron deficiency. Extracted root trait data can be directly used to perform GWA mapping using the freely accessible web application GWAPP to identify marker polymorphisms associated with the phenotype of interest.

  19. Post-high-throughput screening analysis: an empirical compound prioritization scheme.

    Science.gov (United States)

    Oprea, Tudor I; Bologa, Cristian G; Edwards, Bruce S; Prossnitz, Eric R; Sklar, Larry A

    2005-08-01

    An empirical scheme to evaluate and prioritize screening hits from high-throughput screening (HTS) is proposed. Negative scores are given when chemotypes found in the HTS hits are present in annotated databases such as MDDR and WOMBAT or for testing positive in toxicity-related experiments reported in TOXNET. Positive scores are given for higher measured biological activities, for testing negative in toxicity-related literature, and for good overlap when profiled against drug-related properties. Particular emphasis is placed on estimating aqueous solubility to prioritize in vivo experiments. This empirical scheme is given as an illustration to assist the decision-making process in selecting chemotypes and individual compounds for further experimentation, when confronted with multiple hits from high-throughput experiments. The decision-making process is discussed for a set of G-protein coupled receptor antagonists and validated on a literature example for dihydrofolate reductase inhibition.
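
    A toy version of such an additive scoring scheme, purely to illustrate the logic of rewarding potency and penalizing annotated chemotypes and toxicity flags (the weights, field names, and compounds are invented, not Oprea et al.'s actual values):

    ```python
    def priority_score(hit):
        """Toy additive hit-prioritization score: reward measured potency
        and adequate solubility, penalize known chemotypes and toxicity."""
        score = 0.0
        score += hit["pActivity"]              # higher measured activity
        score -= 2.0 * hit["known_chemotype"]  # chemotype in annotated DBs
        score -= 3.0 * hit["tox_flag"]         # positive toxicity record
        score += 1.0 * hit["soluble"]          # adequate aqueous solubility
        return score

    hits = [
        {"id": "A", "pActivity": 7.2, "known_chemotype": 1, "tox_flag": 0, "soluble": 1},
        {"id": "B", "pActivity": 6.5, "known_chemotype": 0, "tox_flag": 0, "soluble": 1},
    ]
    ranked = sorted(hits, key=priority_score, reverse=True)
    ```

    Note how the less potent compound B outranks A here because A's chemotype is already annotated, which is the kind of trade-off the empirical scheme is designed to surface.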

  20. Towards low-delay and high-throughput cognitive radio vehicular networks

    Directory of Open Access Journals (Sweden)

    Nada Elgaml

    2017-12-01

    Full Text Available Cognitive Radio Vehicular Ad-hoc Networks (CR-VANETs) exploit cognitive radios to allow vehicles to access the unused channels in their radio environment. Thus, CR-VANETs not only suffer from the traditional CR problems, especially spectrum sensing, but also face new challenges due to the highly dynamic nature of VANETs. In this paper, we present a low-delay and high-throughput radio environment assessment scheme for CR-VANETs that can be easily incorporated into the IEEE 802.11p standard developed for VANETs. Simulation results show that the proposed scheme significantly reduces the time needed to obtain the radio environment map and increases the CR-VANET throughput.