WorldWideScience

Sample records for reliable high-throughput high-yield

  1. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Full Text Available Abstract Background Preparation of large quantities of high-quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesions IN Genomes), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high-quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold while consuming 4 times less MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a magnetic-bead cost of $0.10 per sample, plus other consumables that other methods also require.

  2. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    Science.gov (United States)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have increased the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient, automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and show how electron exposure dose impacts EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modification under high-dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.

  3. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Full Text Available Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.

  4. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  5. Improved Yield, Performance and Reliability of High-Actuator-Count Deformable Mirrors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The project team will conduct processing and design research aimed at improving yield, performance, and reliability of high-actuator-count micro-electro-mechanical...

  6. Rapid and reliable high-throughput methods of DNA extraction for use in barcoding and molecular systematics of mushrooms.

    Science.gov (United States)

    Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc

    2010-07-01

    We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent and old (ca. 100-year-old) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore cost, of generating DNA sequences from mushrooms and other fungi vs. traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.

  7. High-throughput electrical characterization for robust overlay lithography control

    Science.gov (United States)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high throughput and robust overlay measurement is a challenge in current 14 nm and upcoming advanced nodes with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where a slight deviation in overlay has a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements due to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance where minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss the challenges of integrating an electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing simultaneous overlay as well as process window and margins from a robust, high throughput and electrical measurement approach.
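
    As a rough illustration of the fitting step described in the record above, the sketch below fits a parabola to (programmed misalignment, measured resistance) pairs and takes the vertex as the overlay error. The data, units and fitting details are invented for the example and are not taken from the paper.

```python
# Illustrative parabolic fit of resistance vs. programmed misalignment.
# All numbers below are invented; only the vertex-extraction idea follows
# the abstract above.
import numpy as np

offset_nm = np.array([-30, -20, -10, 0, 10, 20, 30], dtype=float)         # programmed misalignment
resistance = np.array([152.0, 138.5, 130.2, 126.8, 129.9, 137.7, 151.3])  # measured resistance (ohm)

# Fit R(x) = a*x^2 + b*x + c; the resistance minimum (vertex) marks the
# misalignment at which the two layers are best aligned, i.e. the overlay error.
a, b, c = np.polyfit(offset_nm, resistance, 2)
overlay_error_nm = -b / (2.0 * a)
print(f"estimated overlay error: {overlay_error_nm:.2f} nm")
```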

  8. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, the most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and, in that sense, comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common an empirical approach that enables breeders to use a genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and on near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  9. High-throughput GPU-based LDPC decoding

    Science.gov (United States)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
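
    For readers unfamiliar with iterative LDPC decoding, the toy sketch below runs a hard-decision bit-flipping decoder on a small (7,4) code. This is not the sum-product algorithm of the paper; the serial example only illustrates the parity-check structure that such a decoder iterates over and that a GPU implementation evaluates in parallel.

```python
# Toy hard-decision bit-flipping decoder for a small (7,4) code.
import numpy as np

# Parity-check matrix: each row is one check over a subset of the 7 bits.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)

def bit_flip_decode(received, H, max_iter=20):
    """Each iteration, flip the bit involved in the most failed checks."""
    x = received.copy()
    for _ in range(max_iter):
        syndrome = H.dot(x) % 2
        if not syndrome.any():            # all parity checks satisfied
            return x
        fail_counts = syndrome.dot(H)     # failed checks touching each bit
        x[np.argmax(fail_counts)] ^= 1    # flip the worst offender
    return x

codeword = np.array([1, 0, 1, 1, 0, 1, 0])     # satisfies H @ c mod 2 == 0
noisy = codeword.copy(); noisy[2] ^= 1         # introduce a single bit error
print(bit_flip_decode(noisy, H))               # recovers the original codeword
```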

  10. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Full Text Available Abstract Background Protein-protein interaction data used in the creation or prediction of molecular networks is usually obtained from large scale or high-throughput experiments. This experimental data is liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the proportion of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68% respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that include sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
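
    The sketch below illustrates the general idea of combining per-feature likelihood ratios under a naive independence assumption. The feature names follow the abstract, but the numerical ratios and the combination rule shown here are simplified assumptions for illustration, not the values or the exact Bayesian network used in the paper.

```python
# Naive combination of per-feature likelihood ratios (illustrative values).
# LR = P(feature observed | true interaction) / P(feature observed | spurious).
FEATURE_LR = {
    "interacting_pfam_domains": 8.0,
    "shared_go_annotation":     4.0,
    "sequence_homology":        3.0,
}

def combined_likelihood_ratio(features_present):
    """Multiply the ratios of the observed features, treating them as
    conditionally independent given the interaction status."""
    lr = 1.0
    for name in features_present:
        lr *= FEATURE_LR[name]
    return lr

# An interaction supported by two features gets a higher combined ratio,
# i.e. it is more likely to be real than one supported by a single feature.
print(combined_likelihood_ratio({"interacting_pfam_domains", "shared_go_annotation"}))  # 32.0
print(combined_likelihood_ratio({"sequence_homology"}))                                 # 3.0
```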

  11. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  12. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes, GMO constructs, screening targets, construct-specific and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  13. High-throughput screening to identify inhibitors of lysine demethylases.

    Science.gov (United States)

    Gale, Molly; Yan, Qin

    2015-01-01

    Lysine demethylases (KDMs) are epigenetic regulators whose dysfunction is implicated in the pathology of many human diseases including various types of cancer, inflammation and X-linked intellectual disability. Particular demethylases have been identified as promising therapeutic targets, and tremendous efforts are being devoted toward developing suitable small-molecule inhibitors for clinical and research use. Several high-throughput screening strategies have been developed to screen for small-molecule inhibitors of KDMs, each with advantages and disadvantages in terms of time, cost, effort, reliability and sensitivity. In this Special Report, we review and evaluate the high-throughput screening methods utilized for discovery of novel small-molecule KDM inhibitors.

  14. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  15. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    Science.gov (United States)

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.

  16. Integration of ARTP mutagenesis with biosensor-mediated high-throughput screening to improve L-serine yield in Corynebacterium glutamicum.

    Science.gov (United States)

    Zhang, Xin; Zhang, Xiaomei; Xu, Guoqiang; Zhang, Xiaojuan; Shi, Jinsong; Xu, Zhenghong

    2018-05-03

    L-Serine is widely used in the pharmaceutical, food, and cosmetics industries. Although direct fermentative production of L-serine from sugar in Corynebacterium glutamicum has been achieved, the L-serine yield remains relatively low. In this study, atmospheric and room temperature plasma (ARTP) mutagenesis was used to improve the L-serine yield based on the engineered C. glutamicum ΔSSAAI strain. Subsequently, we developed a novel high-throughput screening method using a biosensor constructed from NCgl0581, a transcriptional factor specifically responsive to L-serine, so that the L-serine concentration within single cells of C. glutamicum can be monitored via fluorescence-activated cell sorting (FACS). Novel L-serine-producing mutants were isolated from a large library of mutagenized cells. The mutant strain A36-pDser was screened from 1.2 × 10⁵ cells, and the magnesium ion concentration in the medium was optimized specifically for this mutant. C. glutamicum A36-pDser accumulated 34.78 g/L L-serine with a yield of 0.35 g/g sucrose, which were 35.9% and 66.7% higher than those of the parent C. glutamicum ΔSSAAI-pDser strain, respectively. The L-serine yield achieved in this mutant was the highest of all reported L-serine-producing strains of C. glutamicum. Moreover, whole-genome sequencing identified 11 non-synonymous mutations in genes associated with metabolic and transport pathways, which might be responsible for the higher L-serine production and better cell growth of C. glutamicum A36-pDser. This study explored an effective mutagenesis strategy and reports a novel high-throughput screening method for the development of L-serine-producing strains.

  17. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    Science.gov (United States)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of fine (nano-sized) fibers of the highest quality, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce scaled-up amounts of nanofibers of the same or better quality than those produced with a single-needle laboratory setup, under varying solution concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were found to be a voltage of 24 kV and a distance to the collector of 15 cm. More dilute solutions resulted in smaller-diameter fibers. The morphologies of the MoO3 nanofibers produced by the traditional and the high throughput setups were found to be very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used as filtration devices, in tissue engineering, and as sensors.

  18. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.

  19. Improvements and impacts of GRCh38 human reference on high throughput sequencing data analysis.

    Science.gov (United States)

    Guo, Yan; Dai, Yulin; Yu, Hui; Zhao, Shilin; Samuels, David C; Shyr, Yu

    2017-03-01

    Analyses of high throughput sequencing data start with alignment against a reference genome, which is the foundation for all re-sequencing data analyses. Each new release of the human reference genome has been augmented with improved accuracy and completeness. It is presumed that the latest release of the human reference genome, GRCh38, will contribute more to high throughput sequencing data analysis by providing greater accuracy, but the amount of improvement had not yet been quantified. We conducted a study to compare the genomic analysis results between the GRCh38 reference and its predecessor GRCh37. Through analyses of alignment, single nucleotide polymorphisms, small insertions/deletions, copy number and structural variants, we show that GRCh38 offers overall more accurate analysis of human sequencing data. More importantly, GRCh38 produced fewer false positive structural variants. In conclusion, GRCh38 is an improvement over GRCh37 not only from the genome assembly aspect, but it also yields more reliable genomic analysis results. Copyright © 2017. Published by Elsevier Inc.

  20. High throughput route selection in multi-rate wireless mesh networks

    Institute of Scientific and Technical Information of China (English)

    WEI Yi-fei; GUO Xiang-li; SONG Mei; SONG Jun-de

    2008-01-01

    Most existing Ad-hoc routing protocols use the shortest path algorithm with a hop count metric to select paths. This is appropriate in single-rate wireless networks, but in multi-rate networks it has a tendency to select paths containing long-distance links that have low data rates and reduced reliability. This article introduces a high throughput routing algorithm utilizing the multi-rate capability and some mesh characteristics of wireless fidelity (WiFi) mesh networks. It uses the medium access control (MAC) transmission time as the routing metric, which is estimated from information passed up from the physical layer. When the proposed algorithm is adopted, Ad-hoc on-demand distance vector (AODV) routing can be extended to high throughput AODV (HT-AODV). Simulation results show that HT-AODV is capable of establishing routes that have high data rates, short end-to-end delay and high network throughput.
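
    The sketch below illustrates why a transmission-time metric can prefer a multi-hop route over a shorter but slower one. The link rates, packet size and cost function are invented for the example; the paper estimates the MAC transmission time from physical-layer information.

```python
# Compare a hop-count choice with a transmission-time choice (invented numbers).
PACKET_BITS = 8 * 1500  # one 1500-byte frame

def path_cost_ms(link_rates_mbps):
    """Total estimated transmission time over the links of a path, in ms."""
    return sum(PACKET_BITS / (r * 1e6) for r in link_rates_mbps) * 1e3

short_slow_path = [1.0]               # 1 hop over a long, 1 Mbps link
long_fast_path = [11.0, 11.0, 11.0]   # 3 hops over short, 11 Mbps links

print(f"1-hop path: {path_cost_ms(short_slow_path):.2f} ms")   # ~12.00 ms
print(f"3-hop path: {path_cost_ms(long_fast_path):.2f} ms")    # ~3.27 ms
# A hop-count metric would pick the 1-hop route; the transmission-time metric
# picks the 3-hop route, which yields higher end-to-end throughput.
```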

  1. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  2. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

    A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr.L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr.L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the "snail") passes over the pumping surface and removes the frost from it either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pump limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible

  3. High-throughput characterization methods for lithium batteries

    Directory of Open Access Journals (Sweden)

    Yingchun Lyu

    2017-09-01

    Full Text Available The development of high-performance lithium ion batteries requires the discovery of new materials and the optimization of key components. In contrast to the traditional one-by-one method, high-throughput methods can synthesize and characterize a large number of compositionally varying samples, accelerating the pace of discovery, development and optimization of materials. Because of rapid progress in thin film and automatic control technologies, thousands of compounds with different compositions can now be synthesized rapidly, even in a single experiment. However, the lack of rapid or combinatorial characterization technologies to match high-throughput synthesis methods limits the application of high-throughput technology. Here, we review a series of representative high-throughput characterization methods used in lithium batteries, including high-throughput structural and electrochemical characterization methods and rapid measuring technologies based on synchrotron light sources.

  4. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone of current drug discovery is high throughput screening assays, which allow examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the sufficiently high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of the high throughput philosophy are emerging and predicted to make a number of ion...

  5. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for identification of drug targets and drug discovery, and recently it has found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are widely used for quantification of protein markers. We reasoned that RPPAs can also be utilized beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of siRNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, thereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using...

  6. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  7. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    Science.gov (United States)

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047

  8. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell-plate-based chemical screening and high-content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening are performed at different scales, primarily in multiwell-plate-based assays with a wide range of readout possibilities and with a focus on ultraminiaturization to allow for affordable screening for academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  9. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high-dimensional neuroinformatic representations, an index containing on the order of 10³–10⁴ discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  10. Evaluation of Capacity on a High Throughput Vol-oxidizer for Operability

    International Nuclear Information System (INIS)

    Kim, Young Hwan; Park, Geun Il; Lee, Jung Won; Jung, Jae Hoo; Kim, Ki Ho; Lee, Yong Soon; Lee, Do Youn; Kim, Su Sung

    2010-01-01

    KAERI is developing a pyro-process. As a piece of process equipment, a high throughput vol-oxidizer which can handle several tens of kg HM per batch was developed to supply U₃O₈ powders to an electrolytic reduction (ER) reactor. To increase the reduction yield, UO₂ pellets should be converted into uniform powders. In this paper, we evaluate the operability of the high throughput vol-oxidizer. The evaluation consisted of three tests: a mechanical motion test, a heating test, and a hull separation test. Using a control system, mechanical motion tests of the vol-oxidizer were conducted, and heating rates were analyzed. Hull separation tests for the recovery rate were also conducted. The test results are going to be applied to assess the operability of the vol-oxidizer. A study on the characteristics of the volatile gas produced during the vol-oxidation process is not included in this work

  11. Library Design-Facilitated High-Throughput Sequencing of Synthetic Peptide Libraries.

    Science.gov (United States)

    Vinogradov, Alexander A; Gates, Zachary P; Zhang, Chi; Quartararo, Anthony J; Halloran, Kathryn H; Pentelute, Bradley L

    2017-11-13

    A methodology to achieve high-throughput de novo sequencing of synthetic peptide mixtures is reported. The approach leverages shotgun nanoliquid chromatography coupled with tandem mass spectrometry-based de novo sequencing of library mixtures (up to 2000 peptides) as well as automated data analysis protocols to filter away incorrect assignments, noise, and synthetic side-products. For increasing the confidence in the sequencing results, mass spectrometry-friendly library designs were developed that enabled unambiguous decoding of up to 600 peptide sequences per hour while maintaining greater than 85% sequence identification rates in most cases. The reliability of the reported decoding strategy was additionally confirmed by matching fragmentation spectra for select authentic peptides identified from library sequencing samples. The methods reported here are directly applicable to screening techniques that yield mixtures of active compounds, including particle sorting of one-bead one-compound libraries and affinity enrichment of synthetic library mixtures performed in solution.

  12. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    Science.gov (United States)

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning
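
    A quick aggregate-rate check of the streaming figures quoted above (per-camera rates as stated in the abstract; protocol overhead and channel contention are ignored, which is part of why the attainable per-station rate drops as more stations are added):

```python
# Aggregate offered load implied by the quoted figures (overhead ignored).
scenarios = {
    "20 cameras @ 160 kbps": 20 * 160,
    "10 cameras @ 255 kbps": 10 * 255,
}
for name, total_kbps in scenarios.items():
    print(f"{name}: aggregate ~ {total_kbps / 1000:.2f} Mbps")
# 20 cameras @ 160 kbps: aggregate ~ 3.20 Mbps
# 10 cameras @ 255 kbps: aggregate ~ 2.55 Mbps
```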

  13. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    Science.gov (United States)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high throughput end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), which are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  14. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  15. REDItools: high-throughput RNA editing detection made easy.

    Science.gov (United States)

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools are written in the Python programming language and are freely available at http://code.google.com/p/reditools/. ernesto.picardi@uniba.it or graziano.pesole@uniba.it Supplementary data are available at Bioinformatics online.

  16. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar Brunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  17. High-yield exfoliation of tungsten disulphide nanosheets by rational mixing of low-boiling-point solvents

    Science.gov (United States)

    Sajedi-Moghaddam, Ali; Saievar-Iranizad, Esmaiel

    2018-01-01

    Developing high-throughput, reliable, and facile approaches for producing atomically thin sheets of transition metal dichalcogenides is of great importance to pave the way for their use in real applications. Here, we report a highly promising route for exfoliating two-dimensional tungsten disulphide sheets by using a binary combination of low-boiling-point solvents. Experimental results show a significant dependence of the exfoliation yield on the type of solvents as well as the relative volume fraction of each solvent. The highest yield was found for an appropriate isopropanol/water combination (20 vol% isopropanol and 80 vol% water), which is approximately 7 times higher than that in pure isopropanol and 4 times higher than that in pure water. The dramatic increase in exfoliation yield can be attributed to the close match between the surface tension of tungsten disulphide and that of the binary solvent system. Furthermore, solvent molecular size also has a profound impact on the exfoliation efficiency, due to steric repulsion.

  18. Applications of high-throughput clonogenic survival assays in high-LET particle microbeams

    Directory of Open Access Journals (Sweden)

    Antonios Georgantzoglou

    2016-01-01

    Full Text Available Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to a lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-LET particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells’ clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron-field-size high-LET microbeam. Post-irradiation, time-lapse imaging follows the cells’ response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on the cells’ capacity to divide at least 4-5 times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.

  19. Applications of High-Throughput Clonogenic Survival Assays in High-LET Particle Microbeams.

    Science.gov (United States)

    Georgantzoglou, Antonios; Merchant, Michael J; Jeynes, Jonathan C G; Mayhead, Natalie; Punia, Natasha; Butler, Rachel E; Jena, Rajesh

    2015-01-01

    Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to a lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-linear energy transfer (LET) particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells' clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron-field-size high-LET microbeam. Post-irradiation, time-lapse imaging follows the cells' response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on the cells' capacity to divide at least four to five times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.
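
    The nearest-position linking step described in the two records above can be sketched as follows. The coordinates and the distance threshold are invented for illustration; the actual pipeline also detects divisions from cell appearance and local corner density, which is not shown here.

```python
# Greedy nearest-position linking between two consecutive frames
# (coordinates and threshold are invented for illustration).
import numpy as np

def link_nearest(prev_pts, next_pts, max_dist=20.0):
    """Return {index in prev_pts: index in next_pts} for the closest
    detection within max_dist pixels."""
    links = {}
    for i, p in enumerate(prev_pts):
        d = np.linalg.norm(next_pts - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            links[i] = j
    return links

frame_t  = np.array([[10.0, 12.0], [55.0, 40.0], [90.0, 75.0]])   # cells at time t
frame_t1 = np.array([[12.5, 13.0], [57.0, 43.5], [88.0, 79.0]])   # cells at time t+1
print(link_nearest(frame_t, frame_t1))   # {0: 0, 1: 1, 2: 2}
```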

  20. Achieving high data throughput in research networks

    International Nuclear Information System (INIS)

    Matthews, W.; Cottrell, L.

    2001-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165 TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40 TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to infrastructure are essential, and detailed understanding of performance issues and the requirements for reliable high throughput transfers is critical. In this talk, results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed
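
    A back-of-the-envelope check of the bandwidth argument in this abstract (the data volumes and link capacity are taken from the text; the uniform-duty-cycle assumption and the growth factors are ours):

```python
# Year-1 average export rate implied by the abstract, then naive doublings.
TB = 1e12                                   # bytes
seconds_per_year = 365 * 24 * 3600

avg_rate_mbps = 20 * TB * 8 / seconds_per_year / 1e6   # ~20 TB exported in year 1
print(f"year-1 average export rate: {avg_rate_mbps:.1f} Mbps")   # ~5.1 Mbps

# Doubling the data each year (and still exporting a third of it) pushes the
# required average rate toward the 155 Mbps OC3 capacity within a few years,
# before accounting for protocol overhead, bursts and the simulated-data imports.
for year, factor in enumerate([1, 2, 4, 8, 16], start=1):
    print(f"year {year}: ~{avg_rate_mbps * factor:.0f} Mbps average")
```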

  1. Achieving High Data Throughput in Research Networks

    International Nuclear Information System (INIS)

    Matthews, W

    2004-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165 TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40 TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to infrastructure are essential, and detailed understanding of performance issues and the requirements for reliable high throughput transfers is critical. In this talk, results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed

  2. High-throughput technology for novel SO2 oxidation catalysts

    International Nuclear Information System (INIS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO₂ oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO₂ to SO₃. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO₂ oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring the SO₂ conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO₃ from the product gas before the UV-Vis analysis. We also give an overview of some elements used for prescreening and those remaining after the screening of the first catalyst generations. (topical review)

  3. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    Science.gov (United States)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for the rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
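
    As background for the mode-decision step mentioned above, the sketch below shows a generic Lagrangian rate-distortion selection among candidate SAO modes. The candidate modes, distortion and bit figures, and the lambda value are invented; this is not the simplified bitrate estimator proposed in the paper.

```python
# Generic Lagrangian mode decision: pick the SAO mode minimizing J = D + lambda*R.
LAMBDA = 10.0   # Lagrange multiplier (would come from the encoder's QP)

candidates = {
    # mode: (distortion after applying the offsets, side-information bits)
    "OFF":       (1200.0, 1),
    "BO":        (950.0, 18),
    "EO_0deg":   (800.0, 22),
    "EO_90deg":  (830.0, 22),
}

def rd_cost(distortion, bits, lam=LAMBDA):
    return distortion + lam * bits

best = min(candidates, key=lambda m: rd_cost(*candidates[m]))
print(best, rd_cost(*candidates[best]))   # EO_0deg 1020.0
```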

  4. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  5. Engineering a vitamin B12 high-throughput screening system by riboswitch sensor in Sinorhizobium meliloti.

    Science.gov (United States)

    Cai, Yingying; Xia, Miaomiao; Dong, Huina; Qian, Yuan; Zhang, Tongcun; Zhu, Beiwei; Wu, Jinchuan; Zhang, Dawei

    2018-05-11

As a very important coenzyme in cell metabolism, vitamin B12 (cobalamin, VB12) has been widely used in the food and medicine fields. The complete biosynthesis of VB12 requires approximately 30 genes, but overexpression of these genes did not result in the expected increase of VB12 production. High-yield VB12-producing strains are usually obtained by mutagenesis treatments, so developing an efficient screening approach is urgently needed. With the help of engineered strains with varied capacities for VB12 production, a riboswitch library was constructed and screened, and the btuB element from Salmonella typhimurium was identified as the best regulatory device. A flow cytometry high-throughput screening system was developed based on the btuB riboswitch with high efficiency to identify positive mutants. Mutation of Sinorhizobium meliloti (S. meliloti) was optimized using the novel mutation technique of atmospheric and room temperature plasma (ARTP). Finally, the mutant S. meliloti MC5-2 was obtained and considered as a candidate for industrial applications. After 7 days of cultivation on a rotary shaker at 30 °C, the VB12 titer of S. meliloti MC5-2 reached 156 ± 4.2 mg/L, which was 21.9% higher than that of the wild type strain S. meliloti 320 (128 ± 3.2 mg/L). The genome of S. meliloti MC5-2 was sequenced, and gene mutations were identified and analyzed. To our knowledge, this is the first time that a riboswitch element has been used in S. meliloti. The flow cytometry high-throughput screening system was successfully developed and a high-yield VB12-producing strain was obtained. The identified and analyzed gene mutations provide useful information for developing high-yield strains by metabolic engineering. Overall, this work provides a useful high-throughput screening method for developing high-VB12-yield strains.

  6. Modular high-throughput test stand for versatile screening of thin-film materials libraries

    International Nuclear Information System (INIS)

    Thienhaus, Sigurd; Hamann, Sven; Ludwig, Alfred

    2011-01-01

    Versatile high-throughput characterization tools are required for the development of new materials using combinatorial techniques. Here, we describe a modular, high-throughput test stand for the screening of thin-film materials libraries, which can carry out automated electrical, magnetic and magnetoresistance measurements in the temperature range of −40 to 300 °C. As a proof of concept, we measured the temperature-dependent resistance of Fe–Pd–Mn ferromagnetic shape-memory alloy materials libraries, revealing reversible martensitic transformations and the associated transformation temperatures. Magneto-optical screening measurements of a materials library identify ferromagnetic samples, whereas resistivity maps support the discovery of new phases. A distance sensor in the same setup allows stress measurements in materials libraries deposited on cantilever arrays. A combination of these methods offers a fast and reliable high-throughput characterization technology for searching for new materials. Using this approach, a composition region has been identified in the Fe–Pd–Mn system that combines ferromagnetism and martensitic transformation.

  7. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  8. Combinatorial chemoenzymatic synthesis and high-throughput screening of sialosides.

    Science.gov (United States)

    Chokhawala, Harshal A; Huang, Shengshu; Lau, Kam; Yu, Hai; Cheng, Jiansong; Thon, Vireak; Hurtado-Ziola, Nancy; Guerrero, Juan A; Varki, Ajit; Chen, Xi

    2008-09-19

    Although the vital roles of structures containing sialic acid in biomolecular recognition are well documented, limited information is available on how sialic acid structural modifications, sialyl linkages, and the underlying glycan structures affect the binding or the activity of sialic acid-recognizing proteins and related downstream biological processes. A novel combinatorial chemoenzymatic method has been developed for the highly efficient synthesis of biotinylated sialosides containing different sialic acid structures and different underlying glycans in 96-well plates from biotinylated sialyltransferase acceptors and sialic acid precursors. By transferring the reaction mixtures to NeutrAvidin-coated plates and assaying for the yields of enzymatic reactions using lectins recognizing sialyltransferase acceptors but not the sialylated products, the biotinylated sialoside products can be directly used, without purification, for high-throughput screening to quickly identify the ligand specificity of sialic acid-binding proteins. For a proof-of-principle experiment, 72 biotinylated alpha2,6-linked sialosides were synthesized in 96-well plates from 4 biotinylated sialyltransferase acceptors and 18 sialic acid precursors using a one-pot three-enzyme system. High-throughput screening assays performed in NeutrAvidin-coated microtiter plates show that whereas Sambucus nigra Lectin binds to alpha2,6-linked sialosides with high promiscuity, human Siglec-2 (CD22) is highly selective for a number of sialic acid structures and the underlying glycans in its sialoside ligands.

  9. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
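The normalization step mentioned above (expressing genotype temperatures as differences from the image mean) is straightforward to reproduce. The following numpy sketch assumes one thermal frame and a boolean mask of plot pixels per genotype; the array shapes and synthetic values are illustrative only.

```python
import numpy as np

# Normalise canopy temperatures by expressing a plot's mean temperature as
# a difference from its image mean, as described in the abstract.  The data
# layout (one thermal frame, a boolean plot mask) is an assumption.

def normalised_canopy_temp(image: np.ndarray, plot_mask: np.ndarray) -> float:
    """Mean plot temperature minus the mean of the whole image."""
    return float(image[plot_mask].mean() - image.mean())

rng = np.random.default_rng(0)
image = 25.0 + rng.normal(0.0, 0.5, size=(120, 160))   # synthetic thermal frame, deg C
mask = np.zeros_like(image, dtype=bool)
mask[40:80, 50:110] = True                              # pixels belonging to one plot
image[mask] -= 0.8                                      # a cooler (better-transpiring) canopy
print(f"relative canopy temperature: {normalised_canopy_temp(image, mask):+.2f} K")
```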

  10. High-throughput screening for industrial enzyme production hosts by droplet microfluidics

    DEFF Research Database (Denmark)

    Sjostrom, Staffan L.; Bai, Yunpeng; Huang, Mingtao

    2014-01-01

A high-throughput method for single cell screening by microfluidic droplet sorting is applied to a whole-genome mutated yeast cell library yielding improved production hosts of secreted industrial enzymes. The sorting method is validated by enriching a yeast strain 14 times based on its α-amylase production, close to the theoretical maximum enrichment. Furthermore, a 10^5-member yeast cell library is screened, yielding a clone with a more than 2-fold increase in α-amylase production. The increase in enzyme production results from an improvement of the cellular functions of the production host...

  11. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  12. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...
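For context, reducing-sugar absorbances from such a microplate assay are typically converted to activity through a xylose standard curve, where 1 IU releases 1 µmol of reducing sugar per minute. The sketch below shows that conversion; the standard-curve readings, incubation time, and enzyme volume are assumed values, not figures from this assay.

```python
import numpy as np

# Convert microplate reducing-sugar absorbances to endoxylanase activity.
# 1 IU releases 1 umol reducing sugar (as xylose) per minute.  The standard
# curve, reaction time and enzyme volume are illustrative assumptions.

std_xylose_umol = np.array([0.0, 0.05, 0.10, 0.20, 0.40])   # umol per well
std_abs         = np.array([0.02, 0.11, 0.21, 0.40, 0.79])  # A562 readings

slope, intercept = np.polyfit(std_xylose_umol, std_abs, 1)

def activity_iu_per_ml(a562: float, minutes: float = 10.0,
                       enzyme_volume_ml: float = 0.05) -> float:
    umol_released = (a562 - intercept) / slope
    return umol_released / minutes / enzyme_volume_ml

print(f"{activity_iu_per_ml(0.35):.3f} IU/ml")
```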

  13. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination.

  14. A simple, high throughput method to locate single copy sequences from Bacterial Artificial Chromosome (BAC libraries using High Resolution Melt analysis

    Directory of Open Access Journals (Sweden)

    Caligari Peter DS

    2010-05-01

Full Text Available Abstract Background The high-throughput anchoring of genetic markers into contigs is required for many ongoing physical mapping projects. Multidimensional BAC pooling strategies for PCR-based screening of large insert libraries are a widely used alternative to high density filter hybridisation of bacterial colonies. To date, concerns over reliability have led most if not all groups engaged in high throughput physical mapping projects to favour BAC DNA isolation prior to amplification by conventional PCR. Results Here, we report the first combined use of Multiplex Tandem PCR (MT-PCR) and High Resolution Melt (HRM) analysis on bacterial stocks of BAC library superpools as a means of rapidly anchoring markers to BAC colonies and thereby integrating genetic and physical maps. We exemplify the approach using a BAC library of the model plant Arabidopsis thaliana. Superpools of twenty-five 384-well plates and two-dimensional matrix pools of the BAC library were prepared for marker screening. The entire procedure requires only around 3 h to anchor one marker. Conclusions A pre-amplification step during MT-PCR allows high multiplexing and increases the sensitivity and reliability of subsequent HRM discrimination. This simple gel-free protocol is more reliable, faster and far less costly than conventional PCR screening. The option to screen 3 genetic markers in parallel in one MT-PCR-HRM reaction using templates from directly pooled stocks of BAC-containing bacteria further reduces the time needed to anchor markers in physical maps of species with large genomes.
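The pooling logic is what makes the screen high throughput: a marker that amplifies in one superpool (identifying the plate) and in one row pool and one column pool of the matrix points to a single well. A minimal sketch of that deconvolution step follows; the pool labels and positive calls are hypothetical.

```python
# Deconvolute multidimensional BAC pools: one positive superpool identifies
# the 384-well plate, and one positive row pool plus one positive column
# pool of the matrix identify the well.  The labels below are hypothetical.

def locate_clone(positive_superpool: int, positive_row: str, positive_col: int) -> str:
    """Map positive pool calls to a plate/well address."""
    return f"plate {positive_superpool:02d}, well {positive_row}{positive_col:02d}"

# HRM analysis scored these pools as positive for the marker of interest:
hits = {"superpool": 7, "row": "H", "column": 15}
print(locate_clone(hits["superpool"], hits["row"], hits["column"]))
# -> plate 07, well H15
```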

  15. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specifically, a high throughput method for determining heterogeneity or interactions of microorganisms is provided.

  16. A robust robotic high-throughput antibody purification platform.

    Science.gov (United States)

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market, with annual sales of more than 40 billion US$ in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns(®) from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO(®) from Tecan) has been a successful approach to establish high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at the sub-milligram level, with yields of up to 2 mg from 4 mL of cell-free culture supernatant. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  18. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  19. Optimization and high-throughput screening of antimicrobial peptides.

    Science.gov (United States)

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries combined with easier and less expensive access to new technologies has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple-target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope for discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving the peptides' selectivity.

  20. Intelligent, net or wireless enabled fluorosensors for high throughput monitoring of assorted crops

    International Nuclear Information System (INIS)

    Barócsi, Attila

    2013-01-01

Phenotypic characterization of assorted crops of different genotypes requires large data sets of diverse types for statistical reliability. Temporal monitoring of plant fluorescence is able to capture the dynamics of the photosynthesis process, which is summarized in a number of parameters for which the genotypic heritability can be calculated. In this paper, an intelligent sensor system is presented that is capable of high-throughput production of baseline-corrected temporal fluorescence curves with many feature points. These are obtained by integrating several (direct and modulated) measurement methods applied at different wavelengths. Simultaneously, the temporal change of the sample's emission temperature and the ambient reference temperature is recorded. Multiple sensors can be deployed easily in large-span greenhouse environments with centralized data collection over a wired or wireless infrastructure. The unique features of the sensors are a compact, embedded signal-guiding fibre optic system, instrument-standard variable tubular detector and source modules, net or wireless enabling for remote control, and fast, quasi real-time data collection. Along with the instrumentation, some representative phenotyping data are also presented that were taken on a subset of a pepper recombinant inbred line population. It is also demonstrated that transient fluorescence feature points yield high heritability, offering a high confidence level for distinguishing the pepper genotypes. (paper)

  1. High-throughput screening (HTS) and modeling of the retinoid ...

    Science.gov (United States)

Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium on the application of high throughput screening and modeling to the retinoid system.

  2. High yield purification of full-length functional hERG K+ channels produced in Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Molbaek, Karen; Scharff-Poulsen, Peter; Hélix-Nielsen, Claus

    2015-01-01

To our knowledge, this is the first reported high-yield production and purification of full-length, tetrameric and functional hERG. This significant breakthrough will be paramount in obtaining hERG crystal structures and in the establishment of new high-throughput hERG drug safety screening assays.

  3. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousands of germination tests, several times a day by a single person.
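Module III of GERMINATOR fits cumulative germination curves and extracts parameters such as maximum germination, the time to 50% germination, and uniformity. The sketch below reproduces that idea with a Hill-type fit in scipy; it is a stand-in for the package's own implementation, and the germination counts are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a cumulative germination curve with a Hill-type function and extract
# summary parameters (maximum germination, t50, uniformity).  This is a
# stand-in for GERMINATOR's curve-fitting module, not its code; the
# germination percentages below are invented.

def hill(t, gmax, t50, h):
    return gmax * t**h / (t50**h + t**h)

hours = np.array([0, 12, 24, 36, 48, 60, 72, 96, 120], dtype=float)
germ  = np.array([0,  2, 10, 45, 70, 82, 88, 90,  91], dtype=float)   # % germinated

(gmax, t50, h), _ = curve_fit(hill, hours, germ, p0=[90.0, 40.0, 4.0],
                              bounds=(0.0, [100.0, 200.0, 20.0]))

def time_at_fraction(frac):
    """Time at which germination reaches frac * gmax (inverse of the Hill curve)."""
    return t50 * (frac / (1.0 - frac)) ** (1.0 / h)

u8416 = time_at_fraction(0.84) - time_at_fraction(0.16)   # uniformity window
print(f"Gmax = {gmax:.1f}%,  t50 = {t50:.1f} h,  U84-16 = {u8416:.1f} h")
```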

  4. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  5. Multitrait, Random Regression, or Simple Repeatability Model in High-Throughput Phenotyping Data Improve Genomic Prediction for Wheat Grain Yield.

    Science.gov (United States)

    Sun, Jin; Rutkoski, Jessica E; Poland, Jesse A; Crossa, José; Jannink, Jean-Luc; Sorrells, Mark E

    2017-07-01

High-throughput phenotyping (HTP) platforms can be used to measure traits that are genetically correlated with wheat (Triticum aestivum L.) grain yield across time. Incorporating such secondary traits in multivariate pedigree and genomic prediction models would be desirable to improve indirect selection for grain yield. In this study, we evaluated three statistical models, simple repeatability (SR), multitrait (MT), and random regression (RR), for the longitudinal data of secondary traits and compared the impact of the proposed models for secondary traits on their predictive abilities for grain yield. Grain yield and secondary traits, canopy temperature (CT) and normalized difference vegetation index (NDVI), were collected in five diverse environments for 557 wheat lines with available pedigree and genomic information. A two-stage analysis was applied for pedigree and genomic selection (GS). First, secondary traits were fitted by SR, MT, or RR models, separately, within each environment. Then, best linear unbiased predictions (BLUPs) of secondary traits from the above models were used in the multivariate prediction models to compare predictive abilities for grain yield. Predictive ability was substantially improved, by 70% on average, in multivariate pedigree and genomic models when secondary traits were included in both training and test populations. Additionally, (i) predictive abilities varied only slightly among the MT, RR, and SR models in this data set, (ii) the results indicated that including BLUPs of secondary traits from the MT model was best under severe drought, and (iii) the RR model was slightly better than the SR and MT models under drought environments. Copyright © 2017 Crop Science Society of America.
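To illustrate the general idea of the two-stage approach, the sketch below predicts yield from markers with and without a correlated secondary trait as an extra covariate, using ridge regression as a stand-in for GBLUP. The simulated genotypes, effect sizes, and genetic correlation are assumptions, not the study's data or models.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Sketch of using a secondary trait (e.g. canopy temperature BLUPs)
# alongside markers to predict grain yield.  Ridge regression stands in
# for GBLUP; the simulated data are not the study's data or model.

rng = np.random.default_rng(1)
n_lines, n_markers = 400, 1000
M = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)   # 0/1/2 genotypes
beta = rng.normal(0, 0.05, n_markers)
g = M @ beta                                                       # true genetic values
yield_ = g + rng.normal(0, 1.0, n_lines)                           # grain yield
canopy_t = -0.6 * g + rng.normal(0, 0.5, n_lines)                  # correlated secondary trait

train = rng.random(n_lines) < 0.8
X_m, X_ms = M, np.column_stack([M, canopy_t])                      # markers only vs + covariate

for name, X in [("markers only", X_m), ("markers + canopy temp", X_ms)]:
    model = Ridge(alpha=100.0).fit(X[train], yield_[train])
    r = np.corrcoef(model.predict(X[~train]), yield_[~train])[0, 1]
    print(f"{name:22s} predictive ability r = {r:.2f}")
```

As in the study, the secondary trait is available for both training and test lines, which is what allows it to lift predictive ability for yield.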

  6. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simpler, high-throughput classification into either pluripotent or nonpluripotent cells in a 7 min analysis while being more cost-effective than conventional genomic tests.

  7. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

Affordable, long-wave infrared hyperspectral imaging calls for the use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager in the spectral range of 7–14 µm with a field of view of 20°×10°. The imager employs a push-broom method implemented by a scanning mirror. High throughput and a demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and a simple assembly.

  8. The Principals and Practice of Distributed High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2016-01-01

The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity was recognized and formalized in the 1970s. For more than three decades these principles of Distributed Computing have guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor is based on novel distributed computing technologies and is driven by the evolving needs of High Throughput scientific applications. We will review the principles that underpin our work, the distributed computing frameworks and technologies we developed, and the lessons we learned from delivering effective and dependable software tools in an ever-changing landscape of computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds. About the speaker: Miron Livny received a B.Sc. degree in Physics and Mat...

  9. Droplet electrospray ionization mass spectrometry for high throughput screening for enzyme inhibitors.

    Science.gov (United States)

    Sun, Shuwen; Kennedy, Robert T

    2014-09-16

    High throughput screening (HTS) is important for identifying molecules with desired properties. Mass spectrometry (MS) is potentially powerful for label-free HTS due to its high sensitivity, speed, and resolution. Segmented flow, where samples are manipulated as droplets separated by an immiscible fluid, is an intriguing format for high throughput MS because it can be used to reliably and precisely manipulate nanoliter volumes and can be directly coupled to electrospray ionization (ESI) MS for rapid analysis. In this study, we describe a "MS Plate Reader" that couples standard multiwell plate HTS workflow to droplet ESI-MS. The MS plate reader can reformat 3072 samples from eight 384-well plates into nanoliter droplets segmented by an immiscible oil at 4.5 samples/s and sequentially analyze them by MS at 2 samples/s. Using the system, a label-free screen for cathepsin B modulators against 1280 chemicals was completed in 45 min with a high Z-factor (>0.72) and no false positives (24 of 24 hits confirmed). The assay revealed 11 structures not previously linked to cathepsin inhibition. For even larger scale screening, reformatting and analysis could be conducted simultaneously, which would enable more than 145,000 samples to be analyzed in 1 day.
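The screen quality reported above is summarized by the Z-factor, Z = 1 − 3(σp + σn)/|μp − μn|. A minimal Python sketch of that statistic follows; the control readings are synthetic.

```python
import numpy as np

# Standard Z-factor used to judge HTS assay quality (the abstract reports
# Z > 0.72).  The control readings below are synthetic, for illustration.

def z_factor(pos_controls, neg_controls):
    pos, neg = np.asarray(pos_controls, float), np.asarray(neg_controls, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

rng = np.random.default_rng(2)
inhibited   = rng.normal(100.0, 4.0, 48)   # e.g. fully inhibited cathepsin B signal
uninhibited = rng.normal(20.0, 3.0, 48)    # uninhibited enzyme activity signal
print(f"Z-factor = {z_factor(inhibited, uninhibited):.2f}")
```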

  10. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy

    International Nuclear Information System (INIS)

    Lai Yiuwai; Hofmann, Martin R; Ludwig, Alfred; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios

    2011-01-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.

  11. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy.

    Science.gov (United States)

    Lai, Yiu Wai; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios; Hofmann, Martin R; Ludwig, Alfred

    2011-10-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.
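In reflection digital holography, a film step of height h produces a phase step Δφ = 4πh/λ, so thickness maps follow directly from the measured phase map. The sketch below illustrates that conversion on a synthetic phase map; the wavelength, mask, and noise level are assumptions, not instrument parameters from the paper.

```python
import numpy as np

# Estimate a patterned film's thickness from a digital-holography phase map.
# For reflection geometry the step height is h = lambda * dphi / (4*pi);
# the wavelength, synthetic phase map and mask are illustrative assumptions.

WAVELENGTH_NM = 532.0

def step_height_nm(phase_map: np.ndarray, film_mask: np.ndarray) -> float:
    dphi = phase_map[film_mask].mean() - phase_map[~film_mask].mean()
    return WAVELENGTH_NM * dphi / (4.0 * np.pi)

rng = np.random.default_rng(3)
phase = rng.normal(0.0, 0.02, size=(200, 200))        # bare substrate phase (rad)
mask = np.zeros_like(phase, dtype=bool)
mask[:, 100:] = True                                   # right half carries the film
phase[mask] += 4.0 * np.pi * 150.0 / WAVELENGTH_NM     # 150 nm step, reflection geometry
print(f"film thickness ~ {step_height_nm(phase, mask):.1f} nm")
```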

  12. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    Science.gov (United States)

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  13. High throughput measurement of high temperature strength of ceramics in controlled atmosphere and its use on solid oxide fuel cell anode supports

    DEFF Research Database (Denmark)

    Frandsen, Henrik Lund; Curran, Declan; Rasmussen, Steffen

    2014-01-01

In the development of structural and functional ceramics for high temperature electrochemical conversion devices such as solid oxide fuel cells, their mechanical properties must be tested at operational conditions, i.e. at high temperature and in controlled atmospheres. Furthermore, a characterization setup is presented for testing multiple samples at operational conditions, providing a high throughput and thus the possibility to achieve high reliability. Optical methods are used to measure deformations without contact, frictionless load measurement is achieved, and multiple samples are handled in one heat-up. The methodology is validated at room temperature and exemplified by measurement of the strength of solid oxide fuel cell anode supports at 800 °C. © 2014 Elsevier B.V. All rights reserved.

  14. High-throughput theoretical design of lithium battery materials

    International Nuclear Information System (INIS)

    Ling Shi-Gang; Gao Jian; Xiao Rui-Juan; Chen Li-Quan

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next generation lithium batteries are of great significance to achieve performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. (topical review)

  15. High throughput salt separation from uranium deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S.W.; Park, K.M.; Kim, J.G.; Kim, I.T.; Park, S.B., E-mail: swkwon@kaeri.re.kr [Korea Atomic Energy Research Inst. (Korea, Republic of)

    2014-07-01

It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites in pyroprocessing. A multilayer porous crucible system was proposed in this study to increase the throughput of the salt distiller. An integrated sieve-crucible assembly was also investigated for the practical use of the porous crucible system. The salt evaporation behaviors were compared between the conventional nonporous crucible and the porous crucible. Two weight-reduction steps took place in the porous crucible, whereas in the nonporous crucible the salt weight was reduced only at high temperature by distillation. The first weight reduction in the porous crucible was caused by liquid salt penetrating out through the perforated crucible as the temperature was raised to the distillation temperature. Multilayer porous crucibles have the benefit of expanding the evaporation surface area. (author)

  16. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

Naval Research Laboratory, Washington, DC 20375-5320, NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula and ...

  17. High-Throughput Block Optical DNA Sequence Identification.

    Science.gov (United States)

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

Optical techniques for molecular diagnostics or DNA sequencing generally rely on small molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400–1400 cm⁻¹. Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
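The k-mer content idea can be made concrete with a few lines of code: each block is reduced to its A/C/G/T counts and matched against the content signatures of reference sequences. The sequences below are made up for illustration.

```python
from collections import Counter

# Reduce DNA k-mers to their A/C/G/T content signatures and match an
# unknown block against reference sequences, mirroring the "block"
# identification idea described above.  Sequences are made up.

def content_signature(kmer: str) -> tuple:
    counts = Counter(kmer.upper())
    return tuple(counts.get(base, 0) for base in "ACGT")

def block_signatures(seq: str, k: int = 10):
    return [content_signature(seq[i:i + k]) for i in range(0, len(seq) - k + 1, k)]

references = {
    "geneA": "ATGGCCATTGTAATGGGCCGCTGAAAGGGT",
    "geneB": "ATGCGTACGTTAGCCGGATCCATGCAGTAC",
}
unknown_block = "TAATGGGCCG"                     # one sequenced 10-mer block
sig = content_signature(unknown_block)
matches = [name for name, seq in references.items() if sig in block_signatures(seq)]
print(sig, "->", matches)                        # -> (2, 2, 4, 2) ['geneA']
```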

  18. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  19. Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections

    Science.gov (United States)

    Tchorowski, Nicole; Murawski, Robert

    2014-01-01

New space missions require higher data rates and new protocols to meet these requirements. These high data rate space communication links push the limitations not only of the space communication links themselves, but also of the ground communication networks and protocols that forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and the spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2 Gbps. The results show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced while the base network throughput remained at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high data rate links, as the link requires a constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station. In cases where queuing the data is not an option
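The throughput collapse attributed to blocking calls follows from simple stop-and-wait arithmetic: if each protocol data unit must be acknowledged before the next is sent, the effective rate is roughly the unit size divided by (RTT plus serialization time). The sketch below uses the 1.2 Gbps rate and 0-200 ms RTTs from the abstract; the 64 KiB unit size is an assumed illustrative value.

```python
# Why blocking transfer calls collapse throughput on a long-delay path:
# each protocol data unit waits a full round trip before the next is sent,
# so the wire idles for one RTT per unit.  The 1.2 Gbps line rate and the
# RTT values come from the abstract; the PDU size is a hypothetical value.

LINE_RATE_BPS = 1.2e9
PDU_BITS = 64 * 1024 * 8          # hypothetical 64 KiB unit per blocking call

def blocking_throughput_bps(rtt_s: float) -> float:
    serialization = PDU_BITS / LINE_RATE_BPS
    return PDU_BITS / (serialization + rtt_s)

for rtt_ms in (0, 50, 120, 200):
    mbps = blocking_throughput_bps(rtt_ms / 1000.0) / 1e6
    print(f"RTT {rtt_ms:3d} ms -> ~{mbps:8.2f} Mbps effective")
```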

  20. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  1. Development of Amorphous/Microcrystalline Silicon Tandem Thin-Film Solar Modules with Low Output Voltage, High Energy Yield, Low Light-Induced Degradation, and High Damp-Heat Reliability

    Directory of Open Access Journals (Sweden)

    Chin-Yi Tsai

    2014-01-01

Full Text Available In this work, tandem amorphous/microcrystalline silicon thin-film solar modules with low output voltage, high energy yield, low light-induced degradation, and high damp-heat reliability were successfully designed and developed. Several key technologies of passivation, transparent-conducting-oxide films, and cell and segment laser scribing were researched, developed, and introduced into the production line to enhance the performance of these low-voltage modules. A 900 kWp photovoltaic system with these low-voltage panels was installed, and its performance ratio has been simulated and projected to be 92.1%, which is 20% higher than that of its crystalline silicon and CdTe counterparts.

  2. High-throughput measurement of rice tillers using a conveyor equipped with x-ray computed tomography

    Science.gov (United States)

    Yang, Wanneng; Xu, Xiaochun; Duan, Lingfeng; Luo, Qingming; Chen, Shangbin; Zeng, Shaoqun; Liu, Qian

    2011-02-01

Tillering is one of the most important agronomic traits because the number of shoots per plant determines panicle number, a key component of grain yield. The conventional method of counting tillers is still manual. Under mass-measurement conditions, accuracy and efficiency gradually degrade as experienced staff become fatigued. Thus, manual measurement, including counting and recording, is not only time consuming but also lacks objectivity. To automate this process, we developed a high-throughput facility, dubbed the high-throughput system for measuring automatically rice tillers (H-SMART), for measuring rice tillers based on a conventional x-ray computed tomography (CT) system and an industrial conveyor. Each pot-grown rice plant was delivered into the CT system for scanning via the conveyor equipment. A filtered back-projection algorithm was used to reconstruct the transverse section image of the rice culms. The number of tillers was then automatically extracted by image segmentation. To evaluate the accuracy of this system, three batches of rice at different growth stages (tillering, heading, or filling) were tested, yielding mean absolute errors of 0.22, 0.36, and 0.36, respectively. Subsequently, the complete machine was used under industry conditions to estimate its efficiency, which was 4320 pots per continuous 24 h workday. Thus, the H-SMART could determine the number of tillers of pot-grown rice plants, providing three advantages over the manual tillering method: absence of human disturbance, automation, and high throughput. This facility expands the application of agricultural photonics in plant phenomics.
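The tiller count is obtained from the reconstructed cross-section by segmenting culm material and counting connected components. The sketch below shows that step on a synthetic slice using scipy.ndimage; it is not the H-SMART software, and the threshold and geometry are invented.

```python
import numpy as np
from scipy import ndimage

# Count rice tillers in a reconstructed CT cross-section by thresholding
# the culm material and counting connected components.  The synthetic
# slice and threshold are illustrative; this is not the H-SMART code.

def count_tillers(slice_img: np.ndarray, threshold: float) -> int:
    binary = slice_img > threshold
    _, n_components = ndimage.label(binary)
    return n_components

# Build a synthetic slice with 5 circular culm cross-sections.
yy, xx = np.mgrid[0:256, 0:256]
slice_img = np.zeros((256, 256))
for cx, cy in [(60, 60), (120, 80), (180, 70), (90, 170), (170, 180)]:
    slice_img[(xx - cx) ** 2 + (yy - cy) ** 2 < 10 ** 2] = 1.0
slice_img += np.random.default_rng(4).normal(0, 0.05, slice_img.shape)

print("tillers counted:", count_tillers(slice_img, threshold=0.5))   # -> 5
```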

  3. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  4. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  5. High-throughput screening and confirmation of 22 banned veterinary drugs in feedstuffs using LC-MS/MS and high-resolution Orbitrap mass spectrometry.

    Science.gov (United States)

    Wang, Xufeng; Liu, Yanghong; Su, Yijuan; Yang, Jianwen; Bian, Kui; Wang, Zongnan; He, Li-Min

    2014-01-15

A new analytical strategy based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) combined with accurate-mass high-resolution Orbitrap mass spectrometry (HR-Orbitrap MS) was developed for high-throughput screening, confirmation, and quantification of 22 banned or unauthorized veterinary drugs in feedstuffs according to Bulletin 235 of the Ministry of Agriculture, China. Feed samples were extracted with acidified acetonitrile, followed by cleanup using a solid-phase extraction cartridge. The extracts were first screened by LC-MS/MS in a single selected reaction monitoring mode. The suspected positive samples were subjected to a specific pretreatment for confirmation and quantification of the analyte of interest with LC-MS/MS and HR-Orbitrap MS. Mean recoveries for all target analytes (except for carbofuran and chlordimeform, which were about 35 and 45%, respectively) ranged from 52.2 to 90.4%, with low relative standard deviations. The strategy was applied to the screening of real samples obtained from local feed markets and to the confirmation of the suspected target analytes. It provides high-throughput, sensitive, and reliable screening, identification, and quantification of banned veterinary drugs in routine monitoring programs for feedstuffs.

  6. High throughput, low set-up time reconfigurable linear feedback shift registers

    NARCIS (Netherlands)

    Nas, R.J.M.; Berkel, van C.H.

    2010-01-01

    This paper presents a hardware design for a scalable, high throughput, configurable LFSR. High throughput is achieved by producing L consecutive outputs per clock cycle with a clock cycle period that, for practical cases, increases only logarithmically with the block size L and the length of the
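Producing L consecutive outputs per clock corresponds, in a software model, to unrolling the LFSR feedback recurrence L steps per call. The sketch below does this for a 16-bit Fibonacci LFSR; the tap polynomial and block size are illustrative choices, not the configuration studied in the paper.

```python
# Software model of an LFSR that emits L consecutive output bits per
# "clock": the feedback recurrence is unrolled L steps.  The 16-bit
# Fibonacci taps (x^16 + x^14 + x^13 + x^11 + 1) and the block size L
# are illustrative, not the configuration studied in the paper.

def lfsr_block(state: int, L: int = 8, width: int = 16):
    """Advance the LFSR L steps; return (new_state, list of L output bits)."""
    out = []
    for _ in range(L):
        out.append(state & 1)                                  # serial output bit
        fb = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (fb << (width - 1))             # shift in feedback at MSB
    return state, out

state = 0xACE1
state, bits = lfsr_block(state, L=8)
print(bits, hex(state))
```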

  7. Optical tools for high-throughput screening of abrasion resistance of combinatorial libraries of organic coatings

    Science.gov (United States)

    Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.

    2002-02-01

Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8×6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coating performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.

  8. Automated cleaning and pre-processing of immunoglobulin gene sequences from high-throughput sequencing

    Directory of Open Access Journals (Sweden)

    Miri eMichaeli

    2012-12-01

Full Text Available High throughput sequencing (HTS) yields tens of thousands to millions of sequences that require a large amount of pre-processing work to clean various artifacts. Such cleaning cannot be performed manually. Existing programs are not suitable for immunoglobulin (Ig) genes, which are variable and often highly mutated. This paper describes Ig-HTS-Cleaner (Ig High Throughput Sequencing Cleaner), a program containing a simple cleaning procedure that successfully deals with pre-processing of Ig sequences derived from HTS, and Ig-Indel-Identifier (Ig Insertion – Deletion Identifier), a program for identifying legitimate and artifact insertions and/or deletions (indels). Our programs were designed for analyzing Ig gene sequences obtained by 454 sequencing, but they are applicable to all types of sequences and sequencing platforms. Ig-HTS-Cleaner and Ig-Indel-Identifier have been implemented in Java and saved as executable JAR files, supported on Linux and MS Windows. No special requirements are needed in order to run the programs, except for correctly constructing the input files as explained in the text. The programs' performance has been tested and validated on real and simulated data sets.

  9. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system need not be the one with the highest product titers, but the one resulting in an overall superior process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  10. Solion ion source for high-efficiency, high-throughput solar cell manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Koo, John, E-mail: john-koo@amat.com; Binns, Brant; Miller, Timothy; Krause, Stephen; Skinner, Wesley; Mullin, James [Applied Materials, Inc., Varian Semiconductor Equipment Business Unit, 35 Dory Road, Gloucester, Massachusetts 01930 (United States)

    2014-02-15

    In this paper, we introduce the Solion ion source for high-throughput solar cell doping. As the source power is increased to enable higher throughput, negative effects degrade the lifetime of the plasma chamber and the extraction electrodes. In order to improve efficiency, we have explored a wide range of electron energies and determined the conditions which best suit production. To extend the lifetime of the source we have developed an in situ cleaning method using only existing hardware. With these combinations, source life-times of >200 h for phosphorous and >100 h for boron ion beams have been achieved while maintaining 1100 cell-per-hour production.

  11. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    Science.gov (United States)

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  12. High-Throughput Combinatorial Development of High-Entropy Alloys For Light-Weight Structural Applications

    Energy Technology Data Exchange (ETDEWEB)

    Van Duren, Jeroen K [Intermolecular, Inc., San Jose, CA (United States); Koch, Carl [North Carolina State Univ., Raleigh, NC (United States); Luo, Alan [The Ohio State Univ., Columbus, OH (United States); Sample, Vivek [Arconic, Pittsburgh, PA (United States); Sachdev, Anil [General Motors, Detroit, MI (United States)

    2017-12-29

The primary limitation of today's lightweight structural alloys is that specific yield strengths (SYS) higher than 200 MPa·cc/g (a typical value for titanium alloys) are extremely difficult to achieve. This holds true especially at a cost lower than 5 dollars/kg (a typical value for magnesium alloys). Recently, high-entropy alloys (HEA) have shown promising SYS, yet the large composition space of HEA makes screening compositions complex and time-consuming. Over the course of this 2-year project we started from 150 billion compositions and reduced the number of potential low-density (<5 g/cc), low-cost (<5 dollars/kg) high-entropy alloy (LDHEA) candidates that are single-phase, disordered, solid-solution (SPSS) to a few thousand compositions. This was accomplished by means of machine learning to guide design for SPSS LDHEA, based on a combination of recursive partitioning, an extensive experimental HEA database compiled from 24 literature sources, and 91 calculated parameters serving as phenomenological selection rules. Machine learning shows an accuracy of 82% in identifying which compositions of a separate, smaller, experimental HEA database are SPSS HEA. Calculation of Phase Diagrams (CALPHAD) shows an accuracy of 71-77% for the alloys supported by the CALPHAD database, where 30% of the compiled HEA database is not supported by CALPHAD. In addition to machine learning and CALPHAD, a third tool was developed to aid the design of SPSS LDHEA. Phase diagrams were calculated by constructing the Gibbs-free-energy convex hull based on easily accessible enthalpy and entropy terms. Surprisingly, accuracy was 78%. Pursuing these LDHEA candidates by high-throughput experimental methods resulted in SPSS LDHEA composed of transition metals (e.g. Cr, Mn, Fe, Ni, Cu) alloyed with Al, yet the high concentration of Al, necessary to bring the mass density below 5.0 g/cc, makes these materials hard and brittle, body-centered-cubic (BCC) alloys. A related, yet multi-phase BCC alloy, based
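Recursive partitioning of the kind described above amounts to training a decision tree on composition-derived descriptors labelled as SPSS or not. The sketch below shows the workflow with two made-up descriptors and a toy labelling rule; it is not the project's model, database, or descriptor set.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Recursive partitioning of the kind used to screen for single-phase,
# solid-solution (SPSS) HEA candidates.  The two descriptors (atomic-size
# mismatch delta and mixing entropy) and the labels are made-up stand-ins
# for the 91 calculated parameters and the literature database in the project.

rng = np.random.default_rng(5)
n = 300
delta = rng.uniform(0.0, 0.12, n)            # atomic size mismatch
s_mix = rng.uniform(8.0, 16.0, n)            # configurational entropy, J/(mol K)
# Toy labelling rule: small mismatch + high entropy favours SPSS.
spss = ((delta < 0.066) & (s_mix > 11.0)).astype(int)

X = np.column_stack([delta, s_mix])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, spss)

candidates = np.array([[0.04, 13.5],         # hypothetical low-density alloy descriptors
                       [0.10, 12.0]])
print(clf.predict(candidates))               # -> [1 0] with this toy rule
```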

  13. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for
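    A minimal sketch of how per-well decomposition kinetics might be extracted from such microplate UV-Vis readings is shown below. The sampling times and absorbance values are made-up placeholders, and a pseudo-first-order model is assumed; this is an illustration, not the published analysis.

    ```python
    # Pseudo-first-order fit of methyl orange decomposition for one well:
    # A(t) = A0 * exp(-k*t)  =>  ln(A/A0) = -k*t  (linear in t).
    import numpy as np

    times_min = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 90.0])       # illustrative
    absorbance = np.array([0.82, 0.66, 0.53, 0.43, 0.35, 0.23])     # illustrative

    slope, intercept = np.polyfit(times_min, np.log(absorbance / absorbance[0]), 1)
    k_apparent = -slope
    print(f"apparent rate constant: {k_apparent:.4f} 1/min")
    ```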

  14. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

    Full Text Available Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats and CRISPR-associated (Cas9-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss of function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  15. High throughput imaging cytometer with acoustic focussing.

    Science.gov (United States)

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.

  16. High-throughput determination of RNA structure by proximity ligation.

    Science.gov (United States)

    Ramani, Vijay; Qiu, Ruolan; Shendure, Jay

    2015-09-01

    We present an unbiased method to globally resolve RNA structures through pairwise contact measurements between interacting regions. RNA proximity ligation (RPL) uses proximity ligation of native RNA followed by deep sequencing to yield chimeric reads with ligation junctions in the vicinity of structurally proximate bases. We apply RPL in both baker's yeast (Saccharomyces cerevisiae) and human cells and generate contact probability maps for ribosomal and other abundant RNAs, including yeast snoRNAs, the RNA subunit of the signal recognition particle and the yeast U2 spliceosomal RNA homolog. RPL measurements correlate with established secondary structures for these RNA molecules, including stem-loop structures and long-range pseudoknots. We anticipate that RPL will complement the current repertoire of computational and experimental approaches in enabling the high-throughput determination of secondary and tertiary RNA structures.
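    The sketch below illustrates, with invented junction coordinates, how pairwise ligation junctions of the kind RPL recovers could be binned into a simple contact map; it is only a toy stand-in for the authors' analysis pipeline.

    ```python
    # Build a binned contact map from (i, j) junction positions parsed from
    # chimeric reads. Positions and transcript length are placeholders.
    import numpy as np

    rna_length, bin_size = 120, 10
    junctions = [(5, 88), (7, 90), (40, 61), (42, 60), (12, 110)]

    n_bins = rna_length // bin_size
    counts = np.zeros((n_bins, n_bins))
    for i, j in junctions:
        a, b = sorted((i // bin_size, j // bin_size))
        counts[a, b] += 1
        counts[b, a] += 1          # keep the map symmetric

    contact_probability = counts / counts.sum()   # crude contact-probability map
    ```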

  17. Newborn screening for X-linked adrenoleukodystrophy: further evidence high throughput screening is feasible.

    Science.gov (United States)

    Theda, Christiane; Gibbons, Katy; Defor, Todd E; Donohue, Pamela K; Golden, W Christopher; Kline, Antonie D; Gulamali-Majid, Fizza; Panny, Susan R; Hubbard, Walter C; Jones, Richard O; Liu, Anita K; Moser, Ann B; Raymond, Gerald V

    2014-01-01

    X-linked adrenoleukodystrophy (ALD) is characterized by adrenal insufficiency and neurologic involvement with onset at variable ages. Plasma very long chain fatty acids are elevated in ALD, even in asymptomatic patients. We demonstrated previously that liquid chromatography tandem mass spectrometry measuring C26:0 lysophosphatidylcholine reliably identifies affected males. We prospectively applied this method to 4689 newborn blood spot samples; no false positives were observed. We show that high throughput neonatal screening for ALD is methodologically feasible. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. HTTK: R Package for High-Throughput Toxicokinetics

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...

  19. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchanne...

  20. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic sequences or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  1. Homologous high-throughput expression and purification of highly conserved E. coli proteins

    Directory of Open Access Journals (Sweden)

    Duchmann Rainer

    2007-06-01

    Full Text Available Abstract Background Genetic factors and a dysregulated immune response towards commensal bacteria contribute to the pathogenesis of Inflammatory Bowel Disease (IBD). Animal models demonstrated that the normal intestinal flora is crucial for the development of intestinal inflammation. However, due to the complexity of the intestinal flora, it has been difficult to design experiments for detection of proinflammatory bacterial antigen(s) involved in the pathogenesis of the disease. Several studies indicated a potential association of E. coli with IBD. In addition, T cell clones of IBD patients were shown to cross react towards antigens from different enteric bacterial species and thus likely responded to conserved bacterial antigens. We therefore chose highly conserved E. coli proteins as candidate antigens for abnormal T cell responses in IBD and used high-throughput techniques for cloning, expression and purification under native conditions of a set of 271 conserved E. coli proteins for downstream immunologic studies. Results As a standardized procedure, genes were PCR amplified and cloned into the expression vector pQTEV2 in order to express proteins N-terminally fused to a seven-histidine-tag. Initial small-scale expression and purification under native conditions by metal chelate affinity chromatography indicated that the vast majority of target proteins were purified in high yields. Targets that revealed low yields after purification probably due to weak solubility were shuttled into Gateway (Invitrogen) destination vectors in order to enhance solubility by N-terminal fusion of maltose binding protein (MBP), N-utilizing substance A (NusA), or glutathione S-transferase (GST) to the target protein. In addition, recombinant proteins were treated with polymyxin B coated magnetic beads in order to remove lipopolysaccharide (LPS). Thus, 73% of the targeted proteins could be expressed and purified in large-scale to give soluble proteins in the range of 500

  2. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments with limited throughput, since only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). The method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In a proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
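    One simple way to turn such time-domain temperature traces into a thermal parameter is to fit each sample spot to a lumped-capacitance cooling model. The sketch below does this on synthetic data; it is only a generic stand-in for the more complete model developed in the paper, and all numbers are placeholders.

    ```python
    # Fit T(t) = T_amb + (T0 - T_amb) * exp(-t / tau) to one spot's cooling trace.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 120.0, 60)                             # seconds
    T = 25.0 + (80.0 - 25.0) * np.exp(-t / 35.0) + rng.normal(0.0, 0.2, t.size)

    def lumped(t, T_amb, T0, tau):
        return T_amb + (T0 - T_amb) * np.exp(-t / tau)

    (T_amb_fit, T0_fit, tau_fit), _ = curve_fit(lumped, t, T, p0=[20.0, 70.0, 20.0])
    print(f"fitted thermal time constant: {tau_fit:.1f} s")
    ```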

  3. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is as well an expanding interest in the use of glycomics-for example in Genome Wide Association Studies-to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  4. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale L

    2005-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  5. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale

    2004-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  6. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  7. High-throughput single nucleotide polymorphism genotyping using nanofluidic Dynamic Arrays

    Directory of Open Access Journals (Sweden)

    Crenshaw Andrew

    2009-01-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) have emerged as the genetic marker of choice for mapping disease loci and candidate gene association studies, because of their high density and relatively even distribution in the human genomes. There is a need for systems allowing medium multiplexing (ten to hundreds of SNPs) with high throughput, which can efficiently and cost-effectively generate genotypes for a very large sample set (thousands of individuals). Methods that are flexible, fast, accurate and cost-effective are urgently needed. This is also important for those who work on high throughput genotyping in non-model systems where off-the-shelf assays are not available and a flexible platform is needed. Results We demonstrate the use of a nanofluidic Integrated Fluidic Circuit (IFC)-based genotyping system for medium-throughput multiplexing known as the Dynamic Array, by genotyping 994 individual human DNA samples on 47 different SNP assays, using nanoliter volumes of reagents. Call rates of greater than 99.5% and call accuracies of greater than 99.8% were achieved from our study, which demonstrates that this is a formidable genotyping platform. The experimental set up is very simple, with a time-to-result for each sample of about 3 hours. Conclusion Our results demonstrate that the Dynamic Array is an excellent genotyping system for medium-throughput multiplexing (30-300 SNPs), which is simple to use and combines rapid throughput with excellent call rates, high concordance and low cost. The exceptional call rates and call accuracy obtained may be of particular interest to those working on validation and replication of genome-wide association (GWA) studies.
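    For reference, call-rate and concordance figures of the kind quoted above are straightforward to compute from a genotype matrix. The sketch below uses a tiny invented call set (with -1 marking a failed call) rather than the study's data.

    ```python
    # Call rate and concordance for a samples x SNPs genotype matrix.
    import numpy as np

    calls = np.array([            # invented allele-dosage calls; -1 = no call
        [0, 1, 2, 1, 0],
        [1, 1, 2, -1, 0],
        [0, 2, 2, 1, 0],
        [2, 1, -1, 1, 0],
    ])
    reference = calls.copy()      # stand-in for duplicate/known genotypes
    reference[0, 1] = 2           # one deliberately discordant call

    made = calls != -1
    call_rate = made.mean()
    concordance = (calls[made] == reference[made]).mean()
    print(f"call rate {call_rate:.1%}, concordance {concordance:.1%}")
    ```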

  8. Chlorophyll fluorescence is a rigorous, high throughput tool to analyze the impacts of genotype, species, and stress on plant and ecosystem productivity

    Science.gov (United States)

    Ewers, B. E.; Pleban, J. R.; Aston, T.; Beverly, D.; Speckman, H. N.; Hosseini, A.; Bretfeld, M.; Edwards, C.; Yarkhunova, Y.; Weinig, C.; Mackay, D. S.

    2017-12-01

    Abiotic and biotic stresses reduce plant productivity, yet high-throughput characterization of plant responses across genotypes, species and stress conditions is limited by both instrumentation and data analysis techniques. Recent developments in chlorophyll a fluorescence measurement at leaf to landscape scales could improve our predictive understanding of plant responses to stressors. We analyzed the interaction of species and stress across two crop types, five gymnosperm and two angiosperm tree species from boreal and montane forests, grasses, forbs and shrubs from sagebrush steppe, and 30 tree species from seasonally wet tropical forest. We also analyzed chlorophyll fluorescence and gas exchange data from twelve Brassica rapa crop accessions and 120 recombinant inbred lines to investigate phenotypic responses to drought. These data represent more than 10,000 measurements of fluorescence and allow us to answer two questions: 1) are the measurements from high-throughput, hand-held and drone-mounted instruments quantitatively similar to those from lower-throughput camera-based and gas-exchange-mounted instruments, and 2) do the measurements find differences among genotypes, species and environmental stresses? We found through regression that the high- and low-throughput instruments agreed across both individual chlorophyll fluorescence components and calculated ratios, and were not different from a 1:1 relationship with a correlation greater than 0.9. We used hierarchical Bayesian modeling to test the second question. We found a linear relationship between the fluorescence-derived quantum yield of PSII and the quantum yield of CO2 assimilation from gas exchange, with a slope of ca. 0.1, indicating that the efficiency of the entire photosynthetic process was about 10% of that of PSII across genotypes, species and drought stress. Posterior estimates of quantum yield revealed that drought-treatment, genotype and species differences were preserved when accounting for measurement uncertainty.
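    A sketch of the instrument-comparison regression described above is given below, using invented paired readings; a slope near 1, an intercept near 0 and r > 0.9 would correspond to the reported 1:1 agreement.

    ```python
    # Compare paired quantum-yield readings from two instruments against a 1:1 line.
    import numpy as np
    from scipy import stats

    handheld = np.array([0.72, 0.68, 0.55, 0.61, 0.48, 0.70, 0.39, 0.65])   # invented
    reference = np.array([0.70, 0.69, 0.57, 0.60, 0.50, 0.71, 0.41, 0.63])  # invented

    fit = stats.linregress(reference, handheld)
    print(f"slope {fit.slope:.2f}, intercept {fit.intercept:.2f}, r {fit.rvalue:.2f}")
    ```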

  9. A target-based high throughput screen yields Trypanosoma brucei hexokinase small molecule inhibitors with antiparasitic activity.

    Directory of Open Access Journals (Sweden)

    Elizabeth R Sharlow

    2010-04-01

    Full Text Available The parasitic protozoan Trypanosoma brucei utilizes glycolysis exclusively for ATP production during infection of the mammalian host. The first step in this metabolic pathway is mediated by hexokinase (TbHK), an enzyme essential to the parasite that transfers the gamma-phosphate of ATP to a hexose. Here we describe the identification and confirmation of novel small molecule inhibitors of bacterially expressed TbHK1, one of two TbHKs expressed by T. brucei, using a high throughput screening assay. Exploiting optimized high throughput screening assay procedures, we interrogated 220,233 unique compounds and identified 239 active compounds, from which ten small molecules were further characterized. Computational chemical cluster analyses indicated that six compounds were structurally related, while the remaining four compounds were classified as unrelated or singletons. All ten compounds were approximately 20-17,000-fold more potent than lonidamine, a previously identified TbHK1 inhibitor. Seven compounds inhibited T. brucei blood stage form parasite growth (0.03

  10. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification. Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated

  11. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  12. A high-throughput, multi-channel photon-counting detector with picosecond timing

    International Nuclear Information System (INIS)

    Lapington, J.S.; Fraser, G.W.; Miller, G.M.; Ashton, T.J.R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  13. Probing biolabels for high throughput biosensing via synchrotron radiation SEIRA technique

    Energy Technology Data Exchange (ETDEWEB)

    Hornemann, Andrea, E-mail: andrea.hornemann@ptb.de; Hoehl, Arne, E-mail: arne.hoehl@ptb.de; Ulm, Gerhard, E-mail: gerhard.ulm@ptb.de; Beckhoff, Burkhard, E-mail: burkhard.beckhoff@ptb.de [Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin (Germany); Eichert, Diane, E-mail: diane.eichert@elettra.eu [Elettra-Sincrotrone Trieste S.C.p.A., Strada Statale 14, Area Science Park, 34149 Trieste (Italy); Flemig, Sabine, E-mail: sabine.flemig@bam.de [BAM Bundesanstalt für Materialforschung und –prüfung, Richard-Willstätter-Str.10, 12489 Berlin (Germany)

    2016-07-27

    Bio-diagnostic assays of high complexity rely on nanoscaled assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High-throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement and the multiplexing capability of Surface-Enhanced InfraRed Absorption (SEIRA) assay formats are well suited to these purposes. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates chemisorbed onto low-cost, biocompatible gold nano-aggregate substrates, has been explored for use in assay platforms. Dried films were analyzed by synchrotron radiation based FTIR/SEIRA spectro-microscopy, and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were examined to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable and multiplexed high-end screening, not only in biodiagnostics but also in in vitro biochemical imaging.
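    The PCA step described above can be prototyped in a few lines. The sketch below applies it to synthetic stand-in spectra (two classes with slightly shifted bands), not to the actual SEIRA dataset; band positions, widths and noise levels are invented.

    ```python
    # PCA on a matrix of synthetic "SEIRA-like" spectra to illustrate spectral
    # discrimination between two fingerprint classes.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    wn = np.linspace(1000.0, 1800.0, 500)                 # wavenumber axis, cm^-1

    def band(center, width=15.0):
        return np.exp(-((wn - center) ** 2) / (2.0 * width ** 2))

    class_a = band(1545.0) + band(1655.0)                 # placeholder class A bands
    class_b = band(1530.0) + band(1670.0)                 # placeholder class B bands
    spectra = np.vstack(
        [class_a + 0.05 * rng.standard_normal(wn.size) for _ in range(30)]
        + [class_b + 0.05 * rng.standard_normal(wn.size) for _ in range(30)]
    )

    scores = PCA(n_components=2).fit_transform(spectra)
    # Plotting scores[:, 0] vs scores[:, 1] should separate the two classes.
    ```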

  14. Probing biolabels for high throughput biosensing via synchrotron radiation SEIRA technique

    International Nuclear Information System (INIS)

    Hornemann, Andrea; Hoehl, Arne; Ulm, Gerhard; Beckhoff, Burkhard; Eichert, Diane; Flemig, Sabine

    2016-01-01

    Bio-diagnostic assays of high complexity rely on nanoscaled assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High-throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement and the multiplexing capability of Surface-Enhanced InfraRed Absorption (SEIRA) assay formats are well suited to these purposes. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates chemisorbed onto low-cost, biocompatible gold nano-aggregate substrates, has been explored for use in assay platforms. Dried films were analyzed by synchrotron radiation based FTIR/SEIRA spectro-microscopy, and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were examined to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable and multiplexed high-end screening, not only in biodiagnostics but also in in vitro biochemical imaging.

  15. High-throughput volumetric reconstruction for 3D wheat plant architecture studies

    Directory of Open Access Journals (Sweden)

    Wei Fang

    2016-09-01

    Full Text Available For many tiller crops, the plant architecture (PA), including the plant fresh weight, plant height, number of tillers, tiller angle and stem diameter, significantly affects the grain yield. In this study, we propose a method based on volumetric reconstruction for high-throughput three-dimensional (3D) wheat PA studies. The proposed methodology involves plant volumetric reconstruction from multiple images, plant model processing and phenotypic parameter estimation and analysis. This study was performed on 80 Triticum aestivum plants, and the results were analyzed. Comparing the automated measurements with manual measurements, the mean absolute percentage error (MAPE) in the plant height and the plant fresh weight was 2.71% (1.08 cm with an average plant height of 40.07 cm) and 10.06% (1.41 g with an average plant fresh weight of 14.06 g), respectively. The root mean square error (RMSE) was 1.37 cm and 1.79 g for the plant height and plant fresh weight, respectively. The correlation coefficients were 0.95 and 0.96 for the plant height and plant fresh weight, respectively. Additionally, the proposed methodology, including plant reconstruction, model processing and trait extraction, required only approximately 20 s on average per plant using parallel computing on a graphics processing unit (GPU), demonstrating that the methodology would be valuable for a high-throughput phenotyping platform.
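    The agreement statistics reported above (MAPE, RMSE and correlation) can be reproduced from paired automated/manual measurements as in the sketch below; the numbers shown are invented and only illustrate the formulas.

    ```python
    # MAPE, RMSE and Pearson correlation for paired automated vs. manual heights (cm).
    import numpy as np

    auto = np.array([38.2, 41.5, 39.9, 43.0, 37.1, 40.8])     # invented values
    manual = np.array([39.0, 40.9, 40.5, 42.1, 36.5, 41.6])   # invented values

    mape = np.mean(np.abs(auto - manual) / manual) * 100.0
    rmse = np.sqrt(np.mean((auto - manual) ** 2))
    r = np.corrcoef(auto, manual)[0, 1]
    print(f"MAPE {mape:.2f}%  RMSE {rmse:.2f} cm  r {r:.2f}")
    ```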

  16. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nano-materials (ENMs) requires tools for rapid and reliable processing and analyses of large HTS datasets. In order to meet this need, a web-based platform for HTS data analyses tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat map and SOM. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of and future advances in HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)
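    As an illustration of the kind of plate normalization such a tool offers, the sketch below applies two generic schemes (percent-of-control and plate-wise z-score) to a synthetic 96-well readout matrix; it is not HDAT's actual implementation, and the control-column layout is assumed.

    ```python
    # Generic plate normalization for a synthetic 8 x 12 (96-well) readout matrix.
    import numpy as np

    rng = np.random.default_rng(3)
    plate = rng.normal(1000.0, 120.0, size=(8, 12))
    plate[:, 11] *= 0.2                  # pretend column 12 holds positive controls
    neg_mean = plate[:, 0].mean()        # pretend column 1 holds negative controls
    pos_mean = plate[:, 11].mean()

    percent_of_control = 100.0 * (plate - pos_mean) / (neg_mean - pos_mean)
    zscore = (plate - plate.mean()) / plate.std()
    ```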

  17. The efficacy of high-throughput sequencing and target enrichment on charred archaeobotanical remains

    DEFF Research Database (Denmark)

    Nistelberger, H. M.; Smith, O.; Wales, Nathan

    2016-01-01

    . It has been suggested that high-throughput sequencing (HTS) technologies coupled with DNA enrichment techniques may overcome some of these limitations. Here we report the findings of HTS and target enrichment on four important archaeological crops (barley, grape, maize and rice) performed in three...... lightly-charred maize cob. Even with target enrichment, this sample failed to yield adequate data required to address fundamental questions in archaeology and biology. We further reanalysed part of an existing dataset on charred plant material, and found all purported endogenous DNA sequences were likely...

  18. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  19. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  20. Combinatorial Strategies and High Throughput Screening in Drug Discovery Targeted to the Channel of Botulinum Neurotoxin

    National Research Council Canada - National Science Library

    Montal, Mauricio

    2006-01-01

    .... The major focus thus far has been the implementation of a reliable and robust high-throughput screen for blockers specific for BoNT using Neuro 2A cells in which BoNTA forms channels with similar properties to those previously characterized in lipid bilayers. The immediate task during the present reporting period involved the detailed characterization of the channel and chaperone activity of BoNTA on Neuro2A cells.

  1. High throughput experimentation for the discovery of new catalysts

    International Nuclear Information System (INIS)

    Thomson, S.; Hoffmann, C.; Johann, T.; Wolf, A.; Schmidt, H.-W.; Farrusseng, D.; Schueth, F.

    2002-01-01

    Full text: The use of combinatorial chemistry to obtain new materials has been developed extensively by the pharmaceutical and biochemical industries, but such approaches have been slow to impact on the field of heterogeneous catalysis. The reasons for this lie in the difficulties associated with the synthesis, characterisation and determination of catalytic properties of such materials. In many synthetic and catalytic reactions, the conditions used are difficult to emulate using High Throughput Experimentation (HTE). Furthermore, the ability to screen these catalysts simultaneously in real time requires the development and/or modification of characterisation methods. Clearly, there is a need for both high throughput synthesis and screening of new and novel reactions, and we describe several new concepts that help to achieve these goals. Although such problems have impeded the development of combinatorial catalysis, the fact remains that many highly attractive processes still exist for which no suitable catalysts have been developed. The ability to decrease the time needed to evaluate catalysts is therefore essential, and this makes the use of high throughput techniques highly desirable. In this presentation we will describe the synthesis, catalytic testing, and novel screening methods developed at the Max Planck Institute. Automated synthesis procedures, performed by the use of a modified Gilson pipette robot, will be described, as will the development of two 16- and 49-sample fixed bed reactors and two 25- and 29-sample three-phase reactors for catalytic testing. We will also present new techniques for the characterisation of catalysts and catalytic products using standard IR microscopy and infrared focal plane array detection, respectively.

  2. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  3. High-throughput phenotyping of large wheat breeding nurseries using unmanned aerial system, remote sensing and GIS techniques

    Science.gov (United States)

    Haghighattalab, Atena

    Wheat breeders are in a race for genetic gain to secure the future nutritional needs of a growing population. Multiple barriers exist in the acceleration of crop improvement. Emerging technologies are reducing these obstacles. Advances in genotyping technologies have significantly decreased the cost of characterizing the genetic make-up of candidate breeding lines. However, this is just part of the equation. Field-based phenotyping informs a breeder's decision as to which lines move forward in the breeding cycle. This has long been the most expensive and time-consuming, though most critical, aspect of breeding. The grand challenge remains in connecting genetic variants to observed phenotypes followed by predicting phenotypes based on the genetic composition of lines or cultivars. In this context, the current study was undertaken to investigate the utility of UAS in assessment field trials in wheat breeding programs. The major objective was to integrate remotely sensed data with geospatial analysis for high throughput phenotyping of large wheat breeding nurseries. The initial step was to develop and validate a semi-automated high-throughput phenotyping pipeline using a low-cost UAS and NIR camera, image processing, and radiometric calibration to build orthomosaic imagery and 3D models. The relationship between plot-level data (vegetation indices and height) extracted from UAS imagery and manual measurements were examined and found to have a high correlation. Data derived from UAS imagery performed as well as manual measurements while exponentially increasing the amount of data available. The high-resolution, high-temporal HTP data extracted from this pipeline offered the opportunity to develop a within season grain yield prediction model. Due to the variety in genotypes and environmental conditions, breeding trials are inherently spatial in nature and vary non-randomly across the field. This makes geographically weighted regression models a good choice as a
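    For context, the plot-level vegetation-index extraction referred to above reduces, at its core, to a per-pixel NDVI calculation followed by aggregation over the plot polygon. The sketch below uses random reflectance rasters in place of calibrated UAS imagery and is only a schematic illustration of that step.

    ```python
    # Per-pixel NDVI = (NIR - Red) / (NIR + Red), then a plot-level mean.
    import numpy as np

    rng = np.random.default_rng(4)
    nir = rng.uniform(0.30, 0.60, size=(50, 50))   # stand-in NIR reflectance raster
    red = rng.uniform(0.03, 0.12, size=(50, 50))   # stand-in red reflectance raster

    ndvi = (nir - red) / (nir + red)
    print(f"plot-level mean NDVI: {ndvi.mean():.3f}")
    ```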

  4. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  5. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.
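    A toy sketch of the Scheduler/Executor split described above is given below: jobs declare their upstream dependencies, the scheduler releases a job once its dependencies have completed, and the executor "runs" it. This is only an illustration of the architecture, not Cyrille2 code; the job names are invented and an acyclic pipeline is assumed.

    ```python
    # Minimal scheduler/executor illustration for a dependency-driven pipeline.
    from collections import deque

    pipeline = {                      # job -> list of upstream jobs (invented names)
        "preprocess": [],
        "align": ["preprocess"],
        "annotate": ["align"],
        "report": ["align", "annotate"],
    }

    def schedule(jobs):
        """Yield jobs so that every dependency runs first (assumes no cycles)."""
        done, queue = set(), deque(sorted(jobs))
        while queue:
            job = queue.popleft()
            if all(dep in done for dep in jobs[job]):
                done.add(job)
                yield job
            else:
                queue.append(job)     # not ready yet; try again later

    def execute(job):
        """Stand-in for the Executor, which would submit the job to a cluster."""
        print(f"running {job}")

    for job in schedule(pipeline):
        execute(job)
    ```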

  6. High-resolution and high-throughput multichannel Fourier transform spectrometer with two-dimensional interferogram warping compensation

    Science.gov (United States)

    Watanabe, A.; Furukawa, H.

    2018-04-01

    The resolution of multichannel Fourier transform (McFT) spectroscopy is insufficient for many applications despite its extreme advantage of high throughput. We propose an improved configuration to realise both high resolution and high throughput using a two-dimensional area sensor. For the spectral resolution, we obtained the interferogram over a larger optical path difference by shifting the area sensor without altering any optical components. The non-linear phase error of the interferometer was successfully corrected using a phase-compensation calculation. Warping compensation was also applied to accumulate the signal across vertical pixels, further increasing the throughput. Our approach significantly improved the resolution and signal-to-noise ratio by factors of 1.7 and 34, respectively. This high-resolution and high-sensitivity McFT spectrometer will be useful for detecting weak light signals such as those in non-invasive diagnosis.
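    The basic interferogram-to-spectrum step behind such an instrument can be sketched as below with a synthetic two-line interferogram; the actual phase-error and warping corrections described in the paper are more involved and are not reproduced here, and all line positions are invented.

    ```python
    # Recover a spectrum from a synthetic interferogram by apodization + FFT.
    import numpy as np

    n = 2048
    opd = np.linspace(0.0, 0.2, n)                     # optical path difference, cm
    lines_cm1 = [1600.0, 1660.0]                       # invented spectral lines
    interferogram = sum(np.cos(2.0 * np.pi * k * opd) for k in lines_cm1)

    spectrum = np.abs(np.fft.rfft(interferogram * np.hanning(n)))
    wavenumber = np.fft.rfftfreq(n, d=opd[1] - opd[0]) # spectral axis in cm^-1

    # The nominal resolution scales as 1/(maximum OPD), which is why shifting the
    # area sensor to record a longer OPD improves resolution.
    ```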

  7. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  8. High-throughput fractionation of human plasma for fast enrichment of low- and high-abundance proteins.

    Science.gov (United States)

    Breen, Lucas; Cao, Lulu; Eom, Kirsten; Srajer Gajdosik, Martina; Camara, Lila; Giacometti, Jasminka; Dupuy, Damian E; Josic, Djuro

    2012-05-01

    Fast, cost-effective and reproducible isolation of IgM from plasma is invaluable to the study of IgM and subsequent understanding of the human immune system. Additionally, vast amounts of information regarding human physiology and disease can be derived from analysis of the low abundance proteome of the plasma. In this study, methods were optimized for both the high-throughput isolation of IgM from human plasma, and the high-throughput isolation and fractionation of low abundance plasma proteins. To optimize the chromatographic isolation of IgM from human plasma, many variables were examined including chromatography resin, mobile phases, and order of chromatographic separations. Purification of IgM was achieved most successfully through isolation of immunoglobulin from human plasma using Protein A chromatography with a specific resin followed by subsequent fractionation using QA strong anion exchange chromatography. Through these optimization experiments, an additional method was established to prepare plasma for analysis of low abundance proteins. This method involved chromatographic depletion of high-abundance plasma proteins and reduction of plasma proteome complexity through further chromatographic fractionation. Purification of IgM was achieved with high purity as confirmed by SDS-PAGE and IgM-specific immunoblot. Isolation and fractionation of low abundance protein was also performed successfully, as confirmed by SDS-PAGE and mass spectrometry analysis followed by label-free quantitative spectral analysis. The level of purity of the isolated IgM allows for further IgM-specific analysis of plasma samples. The developed fractionation scheme can be used for high throughput screening of human plasma in order to identify low and high abundance proteins as potential prognostic and diagnostic disease biomarkers.

  9. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples ... be correlated to the presence of the species that are regarded as “strong” and “weak” floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used...

  10. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high throughput phenotyping system was developed. A 2D LiDAR was applied to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed based on the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by the RANSAC algorithm, and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits, including canopy height, projected canopy area, and plant volume, were derived. Canopy heights ranging from the 85th percentile to the maximum height were computed based on the histogram of the z coordinate for all measured points; projected canopy area was derived by projecting all points onto a ground plane; and a trapezoidal-rule-based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R²-values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, four different cultivars showed similar growth trends and growth rate patterns. Each cultivar continued to grow until ~88 days after planting, and from then on varied little. However, the actual values were cultivar specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually
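    The three plot-level traits described above can be prototyped from a cleaned point cloud as in the sketch below. The synthetic cone-shaped "plant" and the slice count are placeholders, and the trapezoidal integration over horizontal slices is only a simple stand-in for the paper's volume algorithm.

    ```python
    # Canopy height (85th pct), projected canopy area and volume from a point cloud.
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(5)
    n = 5000
    z = rng.uniform(0.0, 0.9, n)                           # heights above ground, m
    radius = 0.25 * (1.0 - z / 0.9) + 0.05                 # crude cone-shaped canopy
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    pts = np.column_stack([radius * np.cos(theta), radius * np.sin(theta), z])

    height_85 = np.percentile(pts[:, 2], 85)               # canopy height summary
    area = ConvexHull(pts[:, :2]).volume                   # for 2D hulls, .volume is the area

    edges = np.linspace(0.0, pts[:, 2].max(), 20)          # horizontal slices
    slice_areas = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = pts[(pts[:, 2] >= lo) & (pts[:, 2] < hi), :2]
        slice_areas.append(ConvexHull(sel).volume if len(sel) > 3 else 0.0)
    volume = np.trapz(slice_areas, edges[:-1])             # trapezoidal-rule volume

    print(f"height(85th pct) {height_85:.2f} m, area {area:.3f} m^2, volume {volume:.4f} m^3")
    ```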

  11. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  12. A direct comparison of remote sensing approaches for high-throughput phenotyping in plant breeding

    Directory of Open Access Journals (Sweden)

    Maria Tattaris

    2016-08-01

    Full Text Available Remote sensing (RS) of plant canopies permits non-intrusive, high-throughput monitoring of plant physiological characteristics. This study compared three RS approaches: a low-flying UAV (unmanned aerial vehicle), proximal sensing, and satellite-based imagery. Two physiological traits were considered, canopy temperature (CT) and a vegetation index (NDVI), to determine the most viable approaches for large-scale crop genetic improvement. The UAV-based platform achieves plot-level resolution while measuring several hundred plots in one mission via high-resolution thermal and multispectral imagery acquired at altitudes of 30-100 m. The satellite measures multispectral imagery from an altitude of 770 km. This information was compared with proximal measurements using IR thermometers and an NDVI sensor at a distance of 0.5-1 m above plots. For robust comparisons, CT and NDVI were assessed on panels of elite cultivars under irrigated and drought conditions, in different thermal regimes, and on un-adapted genetic resources under water deficit. Correlations between airborne data and yield/biomass at maturity were generally higher than the equivalent proximal correlations. NDVI was derived from high-resolution satellite imagery only for larger plots (8.5 x 2.4 m) due to restricted pixel density. The results support the use of UAV-based RS techniques for high-throughput phenotyping for both precision and efficiency.

  13. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Full Text Available Development of next-generation batteries requires a breakthrough in materials. The traditional one-by-one method, which is suited to synthesizing a large number of single-composition materials, is time-consuming and costly. High throughput and combinatorial experimentation is an effective way to synthesize and characterize a huge number of materials over a broader compositional region in a short time, which greatly speeds up the discovery and optimization of materials at lower cost. In this work, high throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  14. High throughput nanostructure-initiator mass spectrometry screening of microbial growth conditions for maximal β-glucosidase production.

    Science.gov (United States)

    Cheng, Xiaoliang; Hiras, Jennifer; Deng, Kai; Bowen, Benjamin; Simmons, Blake A; Adams, Paul D; Singer, Steven W; Northen, Trent R

    2013-01-01

    Production of biofuels via enzymatic hydrolysis of complex plant polysaccharides is a subject of intense global interest. Microbial communities are known to express a wide range of enzymes necessary for the saccharification of lignocellulosic feedstocks and serve as a powerful reservoir for enzyme discovery. However, the growth temperature and conditions that yield high cellulase activity vary widely, and the throughput to identify optimal conditions has been limited by the slow handling and conventional analysis. A rapid method that uses small volumes of isolate culture to resolve specific enzyme activity is needed. In this work, a high throughput nanostructure-initiator mass spectrometry (NIMS)-based approach was developed for screening a thermophilic cellulolytic actinomycete, Thermobispora bispora, for β-glucosidase production under various growth conditions. Media that produced high β-glucosidase activity were found to be I/S + glucose or microcrystalline cellulose (MCC), Medium 84 + rolled oats, and M9TE + MCC at 45°C. Supernatants of cell cultures grown in M9TE + 1% MCC cleaved 2.5 times more substrate at 45°C than at all other temperatures. While T. bispora is reported to grow optimally at 60°C in Medium 84 + rolled oats and M9TE + 1% MCC, approximately 40% more conversion was observed at 45°C. This high throughput NIMS approach may provide an important tool in discovery and characterization of enzymes from environmental microbes for industrial and biofuel applications.

  15. High throughput nanostructure-initiator mass spectrometry screening of microbial growth conditions for maximal β-glucosidase production

    Directory of Open Access Journals (Sweden)

    Xiaoliang eCheng

    2013-12-01

    Full Text Available Production of biofuels via enzymatic hydrolysis of complex plant polysaccharides is a subject of intense global interest. Microbial communities are known to express a wide range of enzymes necessary for the saccharification of lignocellulosic feedstocks and serve as a powerful reservoir for enzyme discovery. However, the growth temperature and conditions that yield high cellulase activity vary widely, and the throughput to identify optimal conditions has been limited by the slow handling and conventional analysis. A rapid method that uses small volumes of isolate culture to resolve specific enzyme activity is needed. In this work, a high throughput nanostructure-initiator mass spectrometry (NIMS)-based approach was developed for screening a thermophilic cellulolytic actinomycete, Thermobispora bispora, for β-glucosidase production under various growth conditions. Media that produced high β-glucosidase activity were found to be I/S + glucose or microcrystalline cellulose (MCC), Medium 84 + rolled oats, and M9TE + MCC at 45 °C. Supernatants of cell cultures grown in M9TE + 1% MCC cleaved 2.5 times more substrate at 45 °C than at all other temperatures. While T. bispora is reported to grow optimally at 60 °C in Medium 84 + rolled oats and M9TE + 1% MCC, approximately 40% more conversion was observed at 45 °C. This high throughput NIMS approach may provide an important tool in discovery and characterization of enzymes from environmental microbes for industrial and biofuel applications.

  16. Fluorescence-based high-throughput screening of dicer cleavage activity.

    Science.gov (United States)

    Podolska, Katerina; Sedlak, David; Bartunek, Petr; Svoboda, Petr

    2014-03-01

    Production of small RNAs by ribonuclease III Dicer is a key step in microRNA and RNA interference pathways, which employ Dicer-produced small RNAs as sequence-specific silencing guides. Further studies and manipulations of microRNA and RNA interference pathways would benefit from identification of small-molecule modulators. Here, we report a study of a fluorescence-based in vitro Dicer cleavage assay, which was adapted for high-throughput screening. The kinetic assay can be performed under single-turnover conditions (35 nM substrate and 70 nM Dicer) in a small volume (5 µL), which makes it suitable for high-throughput screening in a 1536-well format. As a proof of principle, a small library of bioactive compounds was analyzed, demonstrating the potential of the assay.

  17. Fun with High Throughput Toxicokinetics (CalEPA webinar)

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  18. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  19. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible methodologies compared with the next-generation sequencing or ChIP-based methods. However, the PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by the multiplex primer interactions. The detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that ME-qPCR sensitivity is higher than the original qPCR. The absolute limit of detection for ME-qPCR could achieve levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All of these four amplicons could be regarded as the target of second-step PCR. For the second-step amplification, three parallels have been taken for

  20. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Dongfang Li

    2015-10-01

    Full Text Available Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.

  1. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    Science.gov (United States)

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.
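
    The conditioning step described in this record can be illustrated with a short sketch. The abstract does not specify PUFKEY's conditioning algorithm, so the Python snippet below uses a generic hash-based conditioner (SHA-256) as a stand-in, and the SRAM read-out is simulated; it only illustrates the idea of compressing a noisy start-up pattern into a shorter, near-full-entropy seed.

    ```python
    # Illustrative sketch only: the abstract does not specify PUFKEY's
    # conditioning algorithm, so SHA-256 is used here as a stand-in conditioner.
    import hashlib
    import os


    def condition_sram_pattern(raw_pattern: bytes, seed_bits: int = 256) -> bytes:
        """Compress a noisy SRAM start-up pattern into a shorter seed.

        raw_pattern -- the power-up contents of an (uninitialised) SRAM block;
                       only a fraction of its bits are noisy, so many input
                       bits are condensed into each output bit.
        """
        if seed_bits > 256:
            raise ValueError("a single SHA-256 invocation yields at most 256 bits")
        digest = hashlib.sha256(raw_pattern).digest()
        return digest[: seed_bits // 8]


    # Stand-in for a real SRAM read-out: 4 KiB of hypothetical power-up data.
    raw = os.urandom(4096)
    seed = condition_sram_pattern(raw)
    print(seed.hex())
    ```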

  2. Towards Prebiotic Catalytic Amyloids Using High Throughput Screening.

    Directory of Open Access Journals (Sweden)

    Michael P Friedmann

    Full Text Available Enzymes are capable of directing complex stereospecific transformations and of accelerating reaction rates many orders of magnitude. As even the simplest known enzymes comprise thousands of atoms, the question arises as to how such exquisite catalysts evolved. A logical predecessor would be shorter peptides, but they lack the defined structure and size that are apparently necessary for enzyme functions. However, some very short peptides are able to assemble into amyloids, thereby forming a well-defined tertiary structure called the cross-β-sheet, which bestows unique properties upon the peptides. We have hypothesized that amyloids could have been the catalytically active precursor to modern enzymes. To test this hypothesis, we designed an amyloid peptide library that could be screened for catalytic activity. Our approach, amenable to high-throughput methodologies, allowed us to find several peptides and peptide mixtures that form amyloids with esterase activity. These results indicate that amyloids, with their stability in a wide range of conditions and their potential as catalysts with low sequence specificity, would indeed be fitting precursors to modern enzymes. Furthermore, our approach can be efficiently expanded upon in library size, screening conditions, and target activity to yield novel amyloid catalysts with potential applications in aqueous-organic mixtures, at high temperature and in other extreme conditions that could be advantageous for industrial applications.

  3. High-throughput differentiation of heparin from other glycosaminoglycans by pyrolysis mass spectrometry.

    Science.gov (United States)

    Nemes, Peter; Hoover, William J; Keire, David A

    2013-08-06

    Sensors with high chemical specificity and enhanced sample throughput are vital to screening food products and medical devices for chemical or biochemical contaminants that may pose a threat to public health. For example, the rapid detection of oversulfated chondroitin sulfate (OSCS) in heparin could prevent reoccurrence of heparin adulteration that caused hundreds of severe adverse events including deaths worldwide in 2007-2008. Here, rapid pyrolysis is integrated with direct analysis in real time (DART) mass spectrometry to rapidly screen major glycosaminoglycans, including heparin, chondroitin sulfate A, dermatan sulfate, and OSCS. The results demonstrate that, compared to traditional liquid chromatography-based analyses, pyrolysis mass spectrometry achieved at least 250-fold higher sample throughput and was compatible with samples volume-limited to about 300 nL. Pyrolysis yielded an abundance of fragment ions (e.g., 150 different m/z species), many of which were specific to the parent compound. Using multivariate and statistical data analysis models, these data enabled facile differentiation of the glycosaminoglycans with high throughput. After method development was completed, authentically contaminated samples obtained during the heparin crisis by the FDA were analyzed in a blinded manner for OSCS contamination. The lower limit of differentiation and detection were 0.1% (w/w) OSCS in heparin and 100 ng/μL (20 ng) OSCS in water, respectively. For quantitative purposes the linear dynamic range spanned approximately 3 orders of magnitude. Moreover, this chemical readout was successfully employed to find clues in the manufacturing history of the heparin samples that can be used for surveillance purposes. The presented technology and data analysis protocols are anticipated to be readily adaptable to other chemical and biochemical agents and volume-limited samples.
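
    As a rough illustration of the kind of multivariate classification mentioned in this record (the published models are not reproduced here), the sketch below applies PCA followed by logistic regression to synthetic "spectra" with roughly 150 fragment intensities per sample; the data, class counts and model choices are placeholders made for illustration.

    ```python
    # Minimal sketch, not the authors' model: classify glycosaminoglycan
    # pyrolysis spectra (rows = samples, columns = ~150 fragment m/z
    # intensities) with PCA plus a simple classifier. Data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    n_classes, n_per_class, n_mz = 4, 20, 150      # e.g. heparin, CSA, DS, OSCS
    centers = rng.normal(size=(n_classes, n_mz))
    X = np.vstack([c + 0.3 * rng.normal(size=(n_per_class, n_mz)) for c in centers])
    y = np.repeat(np.arange(n_classes), n_per_class)

    model = make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000))
    model.fit(X, y)
    print("training accuracy:", model.score(X, y))
    ```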

  4. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.

  5. Reprocessing yields and material throughput: HTGR recycle demonstration facility

    International Nuclear Information System (INIS)

    Holder, N.; Abraham, L.

    1977-08-01

    Recovery and reuse of residual U-235 and bred U-233 from the HTGR thorium-uranium fuel cycle will contribute significantly to HTGR fuel cycle economics and to uranium resource conservation. The Thorium Utilization National Program Plan for HTGR Fuel Recycle Development includes the demonstration, on a production scale, of reprocessing and refabrication processes in an HTGR Recycle Demonstration Facility (HRDF). This report addresses process yields and material throughput that may be typically expected in the reprocessing of highly enriched uranium fuels in the HRDF. Material flows will serve as guidance in conceptual design of the reprocessing portion of the HRDF. In addition, uranium loss projections, particle breakage limits, and decontamination factor requirements are identified to serve as guidance to the HTGR fuel reprocessing development program

  6. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    Science.gov (United States)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  7. Towards low-delay and high-throughput cognitive radio vehicular networks

    Directory of Open Access Journals (Sweden)

    Nada Elgaml

    2017-12-01

    Full Text Available Cognitive Radio Vehicular Ad-hoc Networks (CR-VANETs) exploit cognitive radios to allow vehicles to access the unused channels in their radio environment. Thus, CR-VANETs not only suffer from the traditional CR problems, especially spectrum sensing, but also face new challenges due to the highly dynamic nature of VANETs. In this paper, we present a low-delay and high-throughput radio environment assessment scheme for CR-VANETs that can be easily incorporated with the IEEE 802.11p standard developed for VANETs. Simulation results show that the proposed scheme significantly reduces the time to get the radio environment map and increases the CR-VANET throughput.

  8. High-throughput screening to enhance oncolytic virus immunotherapy

    Directory of Open Access Journals (Sweden)

    Allan KJ

    2016-04-01

    Full Text Available KJ Allan,1,2 David F Stojdl,1–3 SL Swift1 1Children’s Hospital of Eastern Ontario (CHEO Research Institute, 2Department of Biology, Microbiology and Immunology, 3Department of Pediatrics, University of Ottawa, Ottawa, ON, Canada Abstract: High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. Keywords: oncolytic, virus, screen, high-throughput, cancer, chemical, genomic, immunotherapy

  9. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    Directory of Open Access Journals (Sweden)

    Salvo-Chirnside Eliane

    2011-12-01

    Full Text Available Abstract The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  10. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format.

    Science.gov (United States)

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-12-02

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  11. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection allows nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives information on the reaction rate. The same microarray can be reused many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  12. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
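
    A much simplified sketch of the central idea in this record — scoring trial cumulative distribution functions by how uniform the CDF-transformed order statistics look — is given below; the candidate distributions, the RMS score and the data are illustrative assumptions, not the published algorithm.

    ```python
    # Simplified illustration of the idea in the abstract (not the published
    # method): score trial CDFs by how closely the transformed, sorted sample
    # matches the expected uniform order statistics i/(n+1), then keep the best.
    import numpy as np
    from scipy import stats


    def order_statistic_score(sample, trial_dist):
        """Smaller is better: RMS deviation of CDF-transformed order statistics
        from their uniform expectations."""
        x = np.sort(sample)
        n = len(x)
        u = trial_dist.cdf(x)                    # uniform-looking if trial is right
        expected = np.arange(1, n + 1) / (n + 1.0)
        return np.sqrt(np.mean((u - expected) ** 2))


    data = stats.lognorm(s=0.6).rvs(size=2000, random_state=1)

    candidates = {
        "normal":    stats.norm(*stats.norm.fit(data)),
        "lognormal": stats.lognorm(*stats.lognorm.fit(data, floc=0)),
        "gamma":     stats.gamma(*stats.gamma.fit(data, floc=0)),
    }
    best = min(candidates, key=lambda k: order_statistic_score(data, candidates[k]))
    print("best trial distribution:", best)
    ```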

  13. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    Science.gov (United States)

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3, and the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver a good performance and could tolerate the addition of inorganic salts and many nutrient components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842

  14. High-throughput kinase assays with protein substrates using fluorescent polymer superquenching

    Directory of Open Access Journals (Sweden)

    Weatherford Wendy

    2005-05-01

    Full Text Available Abstract Background High-throughput screening is used by the pharmaceutical industry for identifying lead compounds that interact with targets of pharmacological interest. Because of the key role that aberrant regulation of protein phosphorylation plays in diseases such as cancer, diabetes and hypertension, kinases have become one of the main drug targets. With the exception of antibody-based assays, methods to screen for specific kinase activity are generally restricted to the use of small synthetic peptides as substrates. However, the use of natural protein substrates has the advantage that potential inhibitors can be detected that affect enzyme activity by binding to a site other than the catalytic site. We have previously reported a non-radioactive and non-antibody-based fluorescence quench assay for detection of phosphorylation or dephosphorylation using synthetic peptide substrates. The aim of this work is to develop an assay for detection of phosphorylation of chemically unmodified proteins based on this polymer superquenching platform. Results Using a modified QTL Lightspeed™ assay, phosphorylation of native protein was quantified by the interaction of the phosphorylated proteins with metal-ion coordinating groups co-located with fluorescent polymer deposited onto microspheres. The binding of phospho-protein inhibits a dye-labeled "tracer" peptide from associating to the phosphate-binding sites present on the fluorescent microspheres. The resulting inhibition of quench generates a "turn on" assay, in which the signal correlates with the phosphorylation of the substrate. The assay was tested on three different proteins: Myelin Basic Protein (MBP), Histone H1 and Phosphorylated heat- and acid-stable protein (PHAS-1). Phosphorylation of the proteins was detected by Protein Kinase Cα (PKCα) and by the Interleukin -1 Receptor-associated Kinase 4 (IRAK4). Enzyme inhibition yielded IC50 values that were comparable to those obtained using

  15. High-throughput kinase assays with protein substrates using fluorescent polymer superquenching.

    Science.gov (United States)

    Rininsland, Frauke; Stankewicz, Casey; Weatherford, Wendy; McBranch, Duncan

    2005-05-31

    High-throughput screening is used by the pharmaceutical industry for identifying lead compounds that interact with targets of pharmacological interest. Because of the key role that aberrant regulation of protein phosphorylation plays in diseases such as cancer, diabetes and hypertension, kinases have become one of the main drug targets. With the exception of antibody-based assays, methods to screen for specific kinase activity are generally restricted to the use of small synthetic peptides as substrates. However, the use of natural protein substrates has the advantage that potential inhibitors can be detected that affect enzyme activity by binding to a site other than the catalytic site. We have previously reported a non-radioactive and non-antibody-based fluorescence quench assay for detection of phosphorylation or dephosphorylation using synthetic peptide substrates. The aim of this work is to develop an assay for detection of phosphorylation of chemically unmodified proteins based on this polymer superquenching platform. Using a modified QTL Lightspeed assay, phosphorylation of native protein was quantified by the interaction of the phosphorylated proteins with metal-ion coordinating groups co-located with fluorescent polymer deposited onto microspheres. The binding of phospho-protein inhibits a dye-labeled "tracer" peptide from associating to the phosphate-binding sites present on the fluorescent microspheres. The resulting inhibition of quench generates a "turn on" assay, in which the signal correlates with the phosphorylation of the substrate. The assay was tested on three different proteins: Myelin Basic Protein (MBP), Histone H1 and Phosphorylated heat- and acid-stable protein (PHAS-1). Phosphorylation of the proteins was detected by Protein Kinase Calpha (PKCalpha) and by the Interleukin -1 Receptor-associated Kinase 4 (IRAK4). Enzyme inhibition yielded IC50 values that were comparable to those obtained using peptide substrates. Statistical parameters that

  16. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated...

  17. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide...... the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. Availability and implementation: We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template...
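
    The general approach described here — predicting expression outcome from aromaticity, hydropathy and isoelectric point — can be sketched as follows. This is not the authors' published workflow (which is hosted at https://github.com/SBRG/Protein_ML); the sequences, labels and classifier below are toy placeholders.

    ```python
    # Hedged sketch of the general idea: featurise protein fragments by
    # aromaticity, hydropathy (GRAVY) and isoelectric point, then train a
    # classifier to predict whether a fragment expresses well. Labels are fake.
    from Bio.SeqUtils.ProtParam import ProteinAnalysis
    from sklearn.ensemble import RandomForestClassifier


    def features(seq: str):
        pa = ProteinAnalysis(seq)
        return [pa.aromaticity(), pa.gravy(), pa.isoelectric_point()]


    # Toy data: (sequence, expressed_well) pairs; real training data would
    # come from high-throughput expression outcomes.
    fragments = [
        ("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", 1),
        ("WWFWYLLLFFVVWWLLYYFWLVFL",          0),
        ("MDEKTTGWRGGHVVEGLAGELEQLRARLEHHPQ", 1),
        ("FFYYWWLLVVIIFFWWYYLLVVFF",          0),
    ]
    X = [features(seq) for seq, _ in fragments]
    y = [label for _, label in fragments]

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    print(clf.predict([features("MSKGEELFTGVVPILVELDGDVNGHKFSVSG")]))
    ```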

  18. Quantitative in vitro-to-in vivo extrapolation in a high-throughput environment

    International Nuclear Information System (INIS)

    Wetmore, Barbara A.

    2015-01-01

    High-throughput in vitro toxicity screening provides an efficient way to identify potential biological targets for environmental and industrial chemicals while conserving limited testing resources. However, reliance on the nominal chemical concentrations in these in vitro assays as an indicator of bioactivity may misrepresent potential in vivo effects of these chemicals due to differences in clearance, protein binding, bioavailability, and other pharmacokinetic factors. Development of high-throughput in vitro hepatic clearance and protein binding assays and refinement of quantitative in vitro-to-in vivo extrapolation (QIVIVE) methods have provided key tools to predict xenobiotic steady state pharmacokinetics. Using a process known as reverse dosimetry, knowledge of the chemical steady state behavior can be incorporated with HTS data to determine the external in vivo oral exposure needed to achieve internal blood concentrations equivalent to those eliciting bioactivity in the assays. These daily oral doses, known as oral equivalents, can be compared to chronic human exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. This review will describe the use of QIVIVE methods in a high-throughput environment and the promise they hold in shaping chemical testing priorities and, potentially, high-throughput risk assessment strategies
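
    Under the assumption of linear, steady-state toxicokinetics described in this record, the reverse-dosimetry arithmetic reduces to a single scaling step; the sketch below shows that calculation with hypothetical numbers (the function name and example values are illustrative, not taken from the review).

    ```python
    # Reverse dosimetry in its simplest form: scale the in vitro bioactive
    # concentration by the steady-state plasma concentration predicted for a
    # unit oral dose. A sketch assuming linear kinetics; values are made up.
    def oral_equivalent_dose(ac50_uM: float, css_uM_per_unit_dose: float) -> float:
        """Daily oral dose (mg/kg/day) predicted to give a steady-state plasma
        concentration equal to the in vitro bioactive concentration.

        ac50_uM              -- concentration eliciting bioactivity in the assay
        css_uM_per_unit_dose -- predicted Css (uM) for a 1 mg/kg/day exposure,
                                e.g. derived from hepatic clearance and plasma
                                protein binding measurements
        """
        return ac50_uM / css_uM_per_unit_dose


    # Example: an AC50 of 3 uM and a predicted Css of 1.5 uM per 1 mg/kg/day
    # give an oral equivalent of 2 mg/kg/day, which can then be compared with
    # chronic human exposure estimates.
    print(oral_equivalent_dose(3.0, 1.5))
    ```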

  19. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, with their own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed at a rate of roughly 3500 fragments per day. Thirty hits were defined, which subsequently entered a secondary screening using an active site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (KD) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore it promises to be a valuable method for fragment screening.

  20. High yield fabrication of fluorescent nanodiamonds

    International Nuclear Information System (INIS)

    Boudou, Jean-Paul; Curmi, Patrick A; Jelezko, Fedor; Wrachtrup, Joerg; Balasubramanian, Gopalakrischnan; Reuter, Rolf; Aubert, Pascal; Sennour, Mohamed; Thorel, Alain; Gaffet, Eric

    2009-01-01

    A new fabrication method to produce homogeneously fluorescent nanodiamonds with high yields is described. The powder obtained by high energy ball milling of fluorescent high pressure, high temperature diamond microcrystals was converted into a pure concentrated aqueous colloidal dispersion of highly crystalline ultrasmall nanoparticles with a mean size less than or equal to 10 nm. The overall fabrication yield of colloidal quasi-spherical nanodiamonds was several orders of magnitude higher than those previously reported starting from microdiamonds. The results open up avenues for the industrial cost-effective production of fluorescent nanodiamonds with well-controlled properties.

  1. High yield fabrication of fluorescent nanodiamonds

    Energy Technology Data Exchange (ETDEWEB)

    Boudou, Jean-Paul; Curmi, Patrick A [Structure and Activity of Normal and Pathological Biomolecules-INSERM/UEVE U829, Universite d' Evry-Val d' Essonne, Batiment Maupertuis, Rue du pere Andre Jarlan, F-91025 Evry (France); Jelezko, Fedor; Wrachtrup, Joerg; Balasubramanian, Gopalakrischnan; Reuter, Rolf [3.Physikalisches Institut, University of Stuttgart, Pfaffenwaldring 57, D-70550 Stuttgart (Germany); Aubert, Pascal [Nanometric Media Laboratory, Universite d' Evry-Val d' Essonne, Batiment Maupertuis, Rue du pere Andre Jarlan, F-91025 Evry (France); Sennour, Mohamed; Thorel, Alain [Centre des Materiaux, Mines Paris, ParisTech, BP 87, F-91000 Evry (France); Gaffet, Eric [Nanomaterials Research Group-UMR 5060, CNRS, UTBM, Site de Sevenans, F-90010 Belfort (France)], E-mail: jpb.cnrs@free.fr, E-mail: pcurmi@univ-evry.fr, E-mail: f.jelezko@physik.uni-stuttgart.de

    2009-06-10

    A new fabrication method to produce homogeneously fluorescent nanodiamonds with high yields is described. The powder obtained by high energy ball milling of fluorescent high pressure, high temperature diamond microcrystals was converted into a pure concentrated aqueous colloidal dispersion of highly crystalline ultrasmall nanoparticles with a mean size less than or equal to 10 nm. The overall fabrication yield of colloidal quasi-spherical nanodiamonds was several orders of magnitude higher than those previously reported starting from microdiamonds. The results open up avenues for the industrial cost-effective production of fluorescent nanodiamonds with well-controlled properties.

  2. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
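
    A minimal sketch of the underlying measurement — background-subtracted integrated intensity of a focus — is shown below; it is not the FociQuant code itself, and the percentile background estimate and synthetic spot are assumptions made for illustration.

    ```python
    # Conceptual sketch (not the FociQuant tool): estimate the integrated
    # fluorescence of a single kinetochore-like focus in a small image crop
    # by summing pixel intensities above a local background estimate.
    import numpy as np


    def focus_intensity(crop: np.ndarray, background_percentile: float = 50.0) -> float:
        """Background-subtracted integrated intensity of the brightest focus
        in a 2D crop centred on the spot."""
        background = np.percentile(crop, background_percentile)
        signal = np.clip(crop - background, 0, None)
        return float(signal.sum())


    # Synthetic example: a Gaussian spot on a noisy background.
    yy, xx = np.mgrid[0:21, 0:21]
    spot = 500.0 * np.exp(-((yy - 10) ** 2 + (xx - 10) ** 2) / (2 * 2.0 ** 2))
    crop = spot + np.random.default_rng(0).normal(100, 5, size=spot.shape)
    print(round(focus_intensity(crop), 1))
    ```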

  3. High-throughput preparation and testing of ion-exchanged zeolites

    International Nuclear Information System (INIS)

    Janssen, K.P.F.; Paul, J.S.; Sels, B.F.; Jacobs, P.A.

    2007-01-01

    A high-throughput research platform was developed for the preparation and subsequent catalytic liquid-phase screening of ion-exchanged zeolites, for instance with regard to their use as heterogeneous catalysts. In this system aqueous solutions and other liquid as well as solid reagents are employed as starting materials and 24 samples are prepared on a library plate with a 4 x 6 layout. Volumetric dispensing of metal precursor solutions, weighing of zeolite and subsequent mixing/washing cycles of the starting materials and distributing reaction mixtures to the library plate are automatically performed by liquid and solid handlers controlled by a single common and easy-to-use programming software interface. The prepared materials are then automatically contacted with reagent solutions, heated, stirred and sampled continuously using a modified liquid-handling system. The high-throughput platform is highly promising for accelerating both catalyst synthesis and catalyst screening. In this paper the preparation of lanthanum-exchanged NaY zeolites (LaNaY) on the platform is reported, along with their use as catalysts for the conversion of renewables

  4. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  5. Identification of QTL Associated with Nitrogen Uptake and Nitrogen Use Efficiency Using High Throughput Genotyped CSSLs in Rice (Oryza sativa L.)

    Directory of Open Access Journals (Sweden)

    Yong Zhou

    2017-07-01

    Full Text Available Nitrogen (N) availability is a major factor limiting crop growth and development. Identification of quantitative trait loci (QTL) for N uptake (NUP) and N use efficiency (NUE) can provide useful information regarding the genetic basis of these traits and their associated effects on yield production. In this study, a set of high throughput genotyped chromosome segment substitution lines (CSSLs) derived from a cross between recipient 9311 and donor Nipponbare were used to identify QTL for rice NUP and NUE. Using high throughput sequencing, each CSSL was genotyped and an ultra-high-quality physical map was constructed. A total of 13 QTL, seven for NUP and six for NUE, were identified in plants under hydroponic culture with all nutrients supplied in sufficient quantities. The proportion of phenotypic variation explained by these QTL for NUP and NUE ranged from 3.16–13.99% and 3.76–12.34%, respectively. We also identified several QTL for biomass yield (BY) and grain yield (GY), which were responsible for 3.21–45.54% and 6.28–7.31%, respectively, of observed phenotypic variation. GY was significantly positively correlated with NUP and NUE, with NUP more closely correlated than NUE. Our results contribute information to the improvement of NUP and NUE in rice.

  6. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequences analysis.

  7. Optimization of a PCRAM Chip for high-speed read and highly reliable reset operations

    Science.gov (United States)

    Li, Xiaoyun; Chen, Houpeng; Li, Xi; Wang, Qian; Fan, Xi; Hu, Jiajun; Lei, Yu; Zhang, Qi; Tian, Zhen; Song, Zhitang

    2016-10-01

    Widely used traditional Flash memory suffers from performance limits such as serious crosstalk problems and the increasing complexity of floating-gate scaling. Phase change random access memory (PCRAM) has become one of the most promising nonvolatile memories among the new memory technologies. In this paper, a 1M-bit PCRAM chip is designed based on the SMIC 40 nm CMOS technology. Focusing on read and write performance, two new circuits are proposed: one for high-speed read operation and one for highly reliable reset operation. The high-speed read circuit effectively reduces the read time from 74 ns to 40 ns, and the double-mode reset circuit improves the chip yield. This 1M-bit PCRAM chip has been simulated in Cadence. After layout design is completed, the chip will be taped out for testing.

  8. High-throughput, high-resolution X-ray phase contrast tomographic microscopy for visualisation of soft tissue

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, S A; Marone, F; Hintermueller, C; Stampanoni, M [Swiss Light Source, Paul Scherrer Institut, 5232 Villigen PSI (Switzerland); Bensadoun, J-C; Aebischer, P, E-mail: samuel.mcdonald@psi.c [EPFL, School of Life Sciences, Station 15, 1015 Lausanne (Switzerland)

    2009-09-01

    The use of conventional absorption based X-ray microtomography can become limited for samples showing only very weak absorption contrast. However, a wide range of samples studied in biology and materials science can produce significant phase shifts of the X-ray beam, and thus the use of the phase signal can provide substantially increased contrast and therefore new and otherwise inaccessible information. The application of two approaches for high-throughput, high-resolution X-ray phase contrast tomography, both available on the TOMCAT beamline of the SLS, is illustrated. Differential Phase Contrast (DPC) imaging uses a grating interferometer and a phase-stepping technique. It has been integrated into the beamline environment on TOMCAT in terms of the fast acquisition and reconstruction of data and the availability to scan samples within an aqueous environment. The second phase contrast approach is a modified transfer of intensity approach that can yield the 3D distribution of the phase (refractive index) of a weakly absorbing object from a single tomographic dataset. These methods are being used for the evaluation of cell integrity in 3D, with the specific aim of following and analyzing progressive cell degeneration to increase knowledge of the mechanistic events of neurodegenerative disorders such as Parkinson's disease.

  9. High-throughput characterization for solar fuels materials discovery

    Science.gov (United States)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts or light absorbers for the PEC do not exist. The mission of HTE is to provide the accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). In this talk I present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  10. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques, and most studies still use manual counting or classification. Hence they are limited to counting a low number of foci per cell (about 5 foci per nucleus) because the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
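
    A rough sketch of extended-maxima-based focus counting is given below; it is not the paper's pipeline, and the use of scikit-image's h_maxima, the chosen h value and the synthetic image stack are illustrative assumptions.

    ```python
    # Rough sketch of foci counting with an extended-maxima-style transform
    # (not the published pipeline). h_maxima suppresses local maxima shallower
    # than `h`; each remaining maximum region inside the nucleus mask is
    # counted as one focus.
    import numpy as np
    from skimage.morphology import h_maxima
    from skimage.measure import label


    def count_foci_3d(stack: np.ndarray, nucleus_mask: np.ndarray, h: float) -> int:
        maxima = h_maxima(stack, h) & nucleus_mask
        return int(label(maxima).max())


    # Synthetic 3D "nucleus" containing three bright foci.
    z, y, x = np.mgrid[0:16, 0:64, 0:64]
    stack = np.zeros((16, 64, 64))
    for cz, cy, cx in [(8, 20, 20), (8, 40, 30), (6, 30, 50)]:
        stack += 200.0 * np.exp(-(((z - cz) ** 2) / 4 + ((y - cy) ** 2 + (x - cx) ** 2) / 8))
    nucleus = np.ones_like(stack, dtype=bool)
    print(count_foci_3d(stack + 10, nucleus, h=50))   # expected: 3
    ```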

  11. High throughput screening of ligand binding to macromolecules using high resolution powder diffraction

    Science.gov (United States)

    Von Dreele, Robert B.; D'Amico, Kevin

    2006-10-31

    A process is provided for the high throughput screening of binding of ligands to macromolecules using high resolution powder diffraction data including producing a first sample slurry of a selected polycrystalline macromolecule material and a solvent, producing a second sample slurry of a selected polycrystalline macromolecule material, one or more ligands and the solvent, obtaining a high resolution powder diffraction pattern on each of said first sample slurry and the second sample slurry, and, comparing the high resolution powder diffraction pattern of the first sample slurry and the high resolution powder diffraction pattern of the second sample slurry whereby a difference in the high resolution powder diffraction patterns of the first sample slurry and the second sample slurry provides a positive indication for the formation of a complex between the selected polycrystalline macromolecule material and at least one of the one or more ligands.

  12. Quality control methodology for high-throughput protein-protein interaction screening.

    Science.gov (United States)

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, or its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basics of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments, and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherently imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interaction screens.
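
    One common way to quantify an error rate of the kind discussed in this record is to retest a random subset of reported interactions in an orthogonal assay and place a confidence interval on the validation rate; the sketch below does this with a Wilson interval and made-up numbers, and is not the chapter's specific protocol.

    ```python
    # Hedged sketch of one possible QC step: estimate the precision of a
    # high-throughput interaction screen by retesting a random subset of
    # reported pairs orthogonally and computing a Wilson confidence interval
    # on the validation rate. Numbers below are hypothetical.
    from math import sqrt
    from scipy.stats import norm


    def wilson_interval(successes: int, trials: int, confidence: float = 0.95):
        """Wilson score interval for a binomial proportion."""
        z = norm.ppf(1 - (1 - confidence) / 2)
        p = successes / trials
        denom = 1 + z ** 2 / trials
        centre = (p + z ** 2 / (2 * trials)) / denom
        half = z * sqrt(p * (1 - p) / trials + z ** 2 / (4 * trials ** 2)) / denom
        return centre - half, centre + half


    # Example: 42 of 60 randomly sampled screen positives validate orthogonally.
    low, high = wilson_interval(42, 60)
    print(f"estimated precision: {42 / 60:.2f} (95% CI {low:.2f}-{high:.2f})")
    ```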

  13. A high-throughput screening method of bisphenols, bisphenol diglycidyl ethers and their derivatives in dairy products by ultra-high performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Cheng, Yan; Nie, Xue-Mei; Wu, Han-Qiu; Hong, Yun-He; Yang, Bing-Cheng; Liu, Tong; Zhao, Dan; Wang, Jian-Feng; Yao, Gui-Hong; Zhang, Feng

    2017-01-15

    A simple and universal analytical method based on ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for high throughput screening of 21 bisphenols, bisphenol diglycidyl ethers and their derivatives in dairy products was developed. Response Surface Methodology (RSM) was used to optimize sample preparation conditions based on a quick, easy, cheap, effective, rugged and safe (QuEChERS) method. The analytes were extracted by using 15 mL acetonitrile with 1% acetic acid, and the extracts were further purified by using 190 mg of C18 and 390 mg of PSA. The extracts were analyzed by UHPLC-MS/MS with an electrospray ionization (ESI) source. Linearity was assessed by using matrix-matched standard calibration and good correlation coefficients (r² > 0.99) were obtained. The limits of quantitation (LOQs) for the analytes ranged from 0.02 to 5 μg kg⁻¹. The extraction recoveries were in a range of 88.2%-108.2%. Good method reproducibility in terms of intra- and inter-day precision was observed, yielding relative standard deviations (RSDs) less than 8.9% and 9.9%, respectively. The method validation results revealed that the proposed method was sensitive and reliable. Finally, this method was successfully applied to dairy product analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review the recent advances in immune repertoire studies of infectious diseases achieved by both traditional techniques and high-throughput sequencing techniques. High-throughput sequencing techniques enable the determination of complementary regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for further development of novel diagnostic markers, immunotherapies and vaccines.

  15. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from imaging or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality-control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. Ease of use will be important, as these tools will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
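
    For arrayed-sample screens, a typical first analysis step is per-plate normalization so that hits can be compared across plates. The fragment below is a minimal, generic illustration of one widely used option, the robust Z-score (median/MAD); the simulated 384-well plate and the hit threshold are arbitrary assumptions, not recommendations from the article.

        # Minimal sketch of per-plate robust Z-score normalization for arrayed HTS data.
        import numpy as np

        def robust_zscore(plate):
            """Median/MAD normalization of a 2-D array of raw well readouts."""
            med = np.median(plate)
            mad = np.median(np.abs(plate - med)) * 1.4826   # scale MAD to approximate a SD
            return (plate - med) / mad

        rng = np.random.default_rng(0)
        plate = rng.normal(loc=1000, scale=50, size=(16, 24))   # one simulated 384-well plate
        plate[0, 0] = 1600                                      # plant an artificial strong hit
        z = robust_zscore(plate)
        print(np.argwhere(np.abs(z) > 3))                       # wells flagged beyond 3 robust SDs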

  16. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses

  17. High-throughput full-automatic synchrotron-based tomographic microscopy

    International Nuclear Information System (INIS)

    Mader, Kevin; Marone, Federica; Hintermueller, Christoph; Mikuljan, Gordan; Isenegger, Andreas; Stampanoni, Marco

    2011-01-01

    At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the Swiss Light Source with an energy range of 8-45 keV and voxel size from 0.37 µm to 7.4 µm, full tomographic datasets are typically acquired in 5 to 10 min. To exploit the speed of the system and enable high-throughput studies to be performed in a fully automatic manner, a package of automation tools has been developed. The samples are automatically exchanged, aligned, moved to the correct region of interest, and scanned. This task is accomplished through the coordination of Python scripts, a robot-based sample-exchange system, sample positioning motors and a CCD camera. The tools are suited for any samples that can be mounted on a standard SEM stub, and require no specific environmental conditions. Up to 60 samples can be analyzed at a time without user intervention. The throughput of the system is dependent on resolution, energy and sample size, but rates of four samples per hour have been achieved with 0.74 µm voxel size at 17.5 keV. The maximum intervention-free scanning time is theoretically unlimited, and in practice experiments have been running unattended as long as 53 h (the average beam time allocation at TOMCAT is 48 h per user). The system is the first fully automated high-throughput tomography station: mounting samples, finding regions of interest, scanning and reconstructing can be performed without user intervention. The system also includes many features which accelerate and simplify the process of tomographic microscopy.

  18. A high throughput array microscope for the mechanical characterization of biomaterials

    Science.gov (United States)

    Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard

    2015-02-01

    In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue and whose expression has been implicated in cancer progression.
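
    The passive-microbead approach referred to above rests on relating tracer diffusion to viscosity through the Stokes-Einstein equation. The sketch below shows that calculation for a single lag time; the bead radius, temperature and mean-squared displacement are illustrative assumptions rather than values from the instrument paper.

        # Hedged sketch of passive-microrheology arithmetic: apparent viscosity from the
        # mean-squared displacement (MSD) of freely diffusing tracer beads.
        import numpy as np

        KB = 1.380649e-23      # Boltzmann constant, J/K
        TEMP = 298.15          # K (assumed room temperature)
        BEAD_RADIUS = 0.5e-6   # m, a hypothetical 1 µm diameter tracer

        def apparent_viscosity(msd_m2, lag_s, dims=2):
            """For pure diffusion MSD = 2*dims*D*t; Stokes-Einstein gives eta = kB*T/(6*pi*r*D)."""
            diffusivity = msd_m2 / (2 * dims * lag_s)
            return KB * TEMP / (6 * np.pi * BEAD_RADIUS * diffusivity)

        # e.g. a 2-D MSD of 0.8 µm^2 at a 1 s lag time gives roughly twice the viscosity of water
        print(apparent_viscosity(msd_m2=0.8e-12, lag_s=1.0))   # Pa·s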

  19. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Directory of Open Access Journals (Sweden)

    Marcin Słomka

    2017-11-01

    Full Text Available High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  20. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Science.gov (United States)

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz

    2017-01-01

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup. PMID:29099791

  1. High Resolution Melting (HRM) for High-Throughput Genotyping-Limitations and Caveats in Practical Case Studies.

    Science.gov (United States)

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  2. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    Science.gov (United States)

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots.

  3. A gas trapping method for high-throughput metabolic experiments.

    Science.gov (United States)

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  4. High-Throughput Next-Generation Sequencing of Polioviruses

    Science.gov (United States)

    Montmayeur, Anna M.; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J.; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L.; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A.; Oberste, M. Steven; Burns, Cara C.

    2016-01-01

    The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially for those in vaccine-derived polioviruses (VDPV), circulating VDPV, or immunodeficiency-related VDPV. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will aid in facilitating global poliovirus surveillance. PMID:27927929

  5. High-throughput purification of recombinant proteins using self-cleaving intein tags.

    Science.gov (United States)

    Coolbaugh, M J; Shakalli Tang, M J; Wood, D W

    2017-01-01

    High throughput methods for recombinant protein production using E. coli typically involve the use of affinity tags for simple purification of the protein of interest. One drawback of these techniques is the occasional need for tag removal before study, which can be hard to predict. In this work, we demonstrate two high throughput purification methods for untagged protein targets based on simple and cost-effective self-cleaving intein tags. Two model proteins, E. coli beta-galactosidase (βGal) and superfolder green fluorescent protein (sfGFP), were purified using self-cleaving versions of the conventional chitin-binding domain (CBD) affinity tag and the nonchromatographic elastin-like-polypeptide (ELP) precipitation tag in a 96-well filter plate format. Initial tests with shake flask cultures confirmed that the intein purification scheme could be scaled down, with >90% pure product generated in a single step using both methods. The scheme was then validated in a high throughput expression platform using 24-well plate cultures followed by purification in 96-well plates. For both tags and with both target proteins, the purified product was consistently obtained in a single-step, with low well-to-well and plate-to-plate variability. This simple method thus allows the reproducible production of highly pure untagged recombinant proteins in a convenient microtiter plate format.

  6. 3D material cytometry (3DMaC): a very high-replicate, high-throughput analytical method using microfabricated, shape-specific, cell-material niches.

    Science.gov (United States)

    Parratt, Kirsten; Jeong, Jenny; Qiu, Peng; Roy, Krishnendu

    2017-08-08

    Studying cell behavior within 3D material niches is key to understanding cell biology in health and diseases, and developing biomaterials for regenerative medicine applications. Current approaches to studying these cell-material niches have low throughput and can only analyze a few replicates per experiment resulting in reduced measurement assurance and analytical power. Here, we report 3D material cytometry (3DMaC), a novel high-throughput method based on microfabricated, shape-specific 3D cell-material niches and imaging cytometry. 3DMaC achieves rapid and highly multiplexed analyses of very high replicate numbers ("n" of 10⁴-10⁶) of 3D biomaterial constructs. 3DMaC overcomes current limitations of low "n", low-throughput, and "noisy" assays, to provide rapid and simultaneous analyses of potentially hundreds of parameters in 3D biomaterial cultures. The method is demonstrated here for a set of 85 000 events containing twelve distinct cell-biomaterial micro-niches along with robust, customized computational methods for high-throughput analytics with potentially unprecedented statistical power.

  7. High Throughput WAN Data Transfer with Hadoop-based Storage

    Science.gov (United States)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  8. High Throughput WAN Data Transfer with Hadoop-based Storage

    International Nuclear Information System (INIS)

    Amin, A; Thomas, M; Bockelman, B; Letts, J; Martin, T; Pi, H; Sfiligoi, I; Wüerthwein, F; Levshina, T

    2011-01-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  9. Use of 15N dilution method for screening soybean lines with high yield and high nitrogen fixation ability

    International Nuclear Information System (INIS)

    Li Haixian; Li Xinmin; Danso, S.K.A.

    1998-01-01

    The ¹⁵N dilution method was used for screening soybean lines with high nitrogen-fixation ability. The screened lines 1005, 8502, 2096, 943, 1454 and Dongnong-42 have high nitrogen-fixation ability, with %Ndfa of about 70%. Lines 1454 and 1555 combine high yield with high nitrogen fixation. Nitrogen-fixation ability was not related to yield, but it was related to maturing time: cultivars with different maturing times have different levels of nitrogen-fixation ability, and the longer the maturing period, the greater the fixation ability. Ten cultivars or lines were used in the tests of 1992 and 1994. Although the weather conditions differed greatly between the two years, the results for seven cultivars or lines were the same, indicating that the nitrogen-fixation ability of soybean is stable across years. Using the ¹⁵N dilution method to estimate the nitrogen-fixation ability of soybean is therefore reliable; however, the %Ndfa of lines 8502 and 2096 increased by 19% in 1994, a rainy year, indicating that a change in %Ndfa for a few varieties may be caused by weather
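
    The %Ndfa values quoted above come from the standard isotope-dilution relation, in which the ¹⁵N enrichment of the fixing crop is compared with that of a non-fixing reference plant grown on the same labelled soil. The small sketch below encodes that relation; the atom % excess values are invented for illustration and are not data from the study.

        def ndfa_percent(atom_excess_fixing, atom_excess_reference):
            """%Ndfa from the 15N isotope-dilution method."""
            return 100.0 * (1.0 - atom_excess_fixing / atom_excess_reference)

        # e.g. a soybean line at 0.12 atom% 15N excess against a non-fixing reference at 0.40
        print(ndfa_percent(0.12, 0.40))   # 70.0, in line with the ~70% Ndfa reported above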

  10. Affinity selection-mass spectrometry and its emerging application to the high throughput screening of G protein-coupled receptors.

    Science.gov (United States)

    Whitehurst, Charles E; Annis, D Allen

    2008-07-01

    Advances in combinatorial chemistry and genomics have inspired the development of novel affinity selection-based screening techniques that rely on mass spectrometry to identify compounds that preferentially bind to a protein target. Of the many affinity selection-mass spectrometry techniques so far documented, only a few solution-based implementations that separate target-ligand complexes away from unbound ligands persist today as routine high throughput screening platforms. Because affinity selection-mass spectrometry techniques do not rely on radioactive or fluorescent reporters or enzyme activities, they can complement traditional biochemical and cell-based screening assays and enable scientists to screen targets that may not be easily amenable to other methods. In addition, by employing mass spectrometry for ligand detection, these techniques enable high throughput screening of massive library collections of pooled compound mixtures, vastly increasing the chemical space that a target can encounter during screening. Of all drug targets, G protein coupled receptors yield the highest percentage of therapeutically effective drugs. In this manuscript, we present the emerging application of affinity selection-mass spectrometry to the high throughput screening of G protein coupled receptors. We also review how affinity selection-mass spectrometry can be used as an analytical tool to guide receptor purification, and further used after screening to characterize target-ligand binding interactions, enabling the classification of orthosteric and allosteric binders.

  11. A Self-Reporting Photocatalyst for Online Fluorescence Monitoring of High Throughput RAFT Polymerization.

    Science.gov (United States)

    Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie

    2018-04-25

    Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates.

  12. High yield neutron generators using the DD reaction

    Energy Technology Data Exchange (ETDEWEB)

    Vainionpaa, J. H.; Harris, J. L.; Piestrup, M. A.; Gary, C. K.; Williams, D. L.; Apodaca, M. D.; Cremer, J. T. [Adelphi technology, 2003 E. Bayshore Rd. 94061, Redwood City, CA (United States); Ji, Qing; Ludewigt, B. A. [Lawrence Berkeley National Lab, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Jones, G. [G and J Enterprise, 1258 Quary Ln, Suite F, Pleasanton California 94566 (United States)

    2013-04-19

    A product line of high yield neutron generators has been developed at Adelphi Technology Inc. The generators use the D-D fusion reaction and are driven by an ion beam supplied by a microwave ion source. Yields of up to 5 × 10⁹ n/s have been achieved, which are comparable to those obtained using the more efficient D-T reaction. The microwave-driven plasma uses the electron cyclotron resonance (ECR) to produce a high plasma density for high current and high atomic ion species. These generators have an actively pumped vacuum system that allows operation at reduced pressure in the target chamber, increasing the overall system reliability. Since no radioactive tritium is used, the generators can be easily serviced, and components can be easily replaced, providing essentially an unlimited lifetime. Fast neutron source size can be adjusted by selecting the aperture and target geometries according to customer specifications. Pulsed and continuous operation has been demonstrated. Minimum pulse lengths of 50 µs have been achieved. Since the generators are easily serviceable, they offer a long lifetime neutron generator for laboratories and commercial systems requiring continuous operation. Several of the generators have been enclosed in radiation shielding/moderator structures designed for customer specifications. These generators have been proven to be useful for prompt gamma neutron activation analysis (PGNAA), neutron activation analysis (NAA) and fast neutron radiography. Thus these generators make excellent fast, epithermal and thermal neutron sources for laboratories and industrial applications that require neutrons with safe operation, small footprint, low cost and small regulatory burden.

  13. High pressure inertial focusing for separating and concentrating bacteria at high throughput

    Science.gov (United States)

    Cruz, J.; Hooshmand Zadeh, S.; Graells, T.; Andersson, M.; Malmström, J.; Wu, Z. G.; Hjort, K.

    2017-08-01

    Inertial focusing is a promising microfluidic technology for concentration and separation of particles by size. However, there is a strong correlation of increased pressure with decreased particle size. Theory and experimental results for larger particles were used to scale down the phenomenon and find the conditions that focus 1 µm particles. High pressure experiments in robust glass chips were used to demonstrate the alignment. We show how the technique works for 1 µm spherical polystyrene particles and for Escherichia coli, without being harmful to the bacteria at 50 µl min⁻¹. The potential to focus bacteria, simplicity of use and high throughput make this technology interesting for healthcare applications, where concentration and purification of a sample may be required as an initial step.
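
    The pressure penalty for small particles mentioned above can be rationalized with the usual inertial-focusing scaling argument, in which focusing requires a particle Reynolds number of order one. The sketch below is a rough, back-of-the-envelope illustration of that scaling for a water-like fluid in a hypothetical 20 µm channel; it is not taken from the paper's own model.

        def particle_reynolds(velocity, hydraulic_diam, particle_diam,
                              density=1000.0, viscosity=1e-3):
            """Rp = Re_channel * (a / Dh)^2, the common inertial-focusing scaling parameter."""
            re_channel = density * velocity * hydraulic_diam / viscosity
            return re_channel * (particle_diam / hydraulic_diam) ** 2

        # hypothetical 20 µm channel at 1 m/s: Rp drops 100-fold going from 10 µm to 1 µm
        # particles, so much higher velocities (hence pressures) are needed to keep Rp ~ 1
        for diameter in (10e-6, 1e-6):
            print(diameter, particle_reynolds(velocity=1.0, hydraulic_diam=20e-6,
                                              particle_diam=diameter))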

  14. Use of ultra-high pressure liquid chromatography coupled to high resolution mass spectrometry for fast screening in high throughput doping control.

    Science.gov (United States)

    Musenga, Alessandro; Cowan, David A

    2013-05-03

    We describe a sensitive, comprehensive and fast screening method based on liquid chromatography-high resolution mass spectrometry for the detection of a large number of analytes in sports samples. UHPLC coupled to high resolution mass spectrometry with polarity switching capability is applied for the rapid screening of a large number of analytes in human urine samples. Full scan data are acquired alternating both positive and negative ionisation. Collision-induced dissociation with positive ionisation is also performed to produce fragment ions to improve selectivity for some analytes. Data are reviewed as extracted ion chromatograms based on narrow mass/charge windows (±5 ppm). A simple sample preparation method was developed, using direct enzymatic hydrolysis of glucuronide conjugates, followed by solid phase extraction with mixed mode ion-exchange cartridges. Within a 10 min run time (including re-equilibration) the method presented allows for the analysis of a large number of analytes from most of the classes in the World Anti-Doping Agency (WADA) Prohibited List, including anabolic agents, β2-agonists, hormone antagonists and modulators, diuretics, stimulants, narcotics, glucocorticoids and β-blockers, and does so while meeting the WADA sensitivity requirements. The high throughput of the method and the fast sample pre-treatment reduces analysis cost and increases productivity. The method presented has been used for the analysis of over 5000 samples in about one month and proved to be reliable.
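
    The narrow extraction windows mentioned above are simply symmetric ±ppm intervals around each analyte's theoretical mass-to-charge ratio. The fragment below shows that small calculation; the example m/z value is arbitrary and not taken from the method.

        def ppm_window(mz, ppm=5.0):
            """Return the (low, high) m/z bounds of a symmetric ±ppm extraction window."""
            delta = mz * ppm / 1e6
            return mz - delta, mz + delta

        low, high = ppm_window(304.2000)   # a hypothetical [M+H]+ value
        print(f"extract m/z {low:.4f}-{high:.4f}")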

  15. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  16. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires a non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow developers to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...

  17. High-Throughput Printing Process for Flexible Electronics

    Science.gov (United States)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  18. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    . A cornerstone in current drug discovery is high throughput screening assays which allow examination of the activity of specific ion channels though only to a limited extent. Conventional patch clamp remains the sole technique with sufficiently high time resolution and sensitivity required for precise and direct....... The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery....

  19. Blood group genotyping: from patient to high-throughput donor screening.

    Science.gov (United States)

    Veldhuisen, B; van der Schoot, C E; de Haas, M

    2009-10-01

    Blood group antigens, present on the cell membrane of red blood cells and platelets, can be defined either serologically or predicted based on the genotypes of genes encoding for blood group antigens. At present, the molecular basis of many antigens of the 30 blood group systems and 17 human platelet antigens is known. In many laboratories, blood group genotyping assays are routinely used for diagnostics in cases where patient red cells cannot be used for serological typing due to the presence of auto-antibodies or after recent transfusions. In addition, DNA genotyping is used to support (un)-expected serological findings. Fetal genotyping is routinely performed when there is a risk of alloimmune-mediated red cell or platelet destruction. In case of patient blood group antigen typing, it is important that a genotyping result is quickly available to support the selection of donor blood, and high-throughput of the genotyping method is not a prerequisite. In addition, genotyping of blood donors will be extremely useful to obtain donor blood with rare phenotypes, for example lacking a high-frequency antigen, and to obtain a fully typed donor database to be used for a better matching between recipient and donor to prevent adverse transfusion reactions. Serological typing of large cohorts of donors is a labour-intensive and expensive exercise and hampered by the lack of sufficient amounts of approved typing reagents for all blood group systems of interest. Currently, high-throughput genotyping based on DNA micro-arrays is a very feasible method to obtain a large pool of well-typed blood donors. Several systems for high-throughput blood group genotyping are developed and will be discussed in this review.

  20. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows... the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review... we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  1. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    Science.gov (United States)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10⁴-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.

  2. The application of the high throughput sequencing technology in the transposable elements.

    Science.gov (United States)

    Liu, Zhen; Xu, Jian-hong

    2015-09-01

    High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing, and decreased the costs to a great extent. Meanwhile, this technology usually has advantages of better specificity, higher sensitivity and accuracy. Therefore, it has been applied to the research on genetic variations, transcriptomics and epigenomics. Recently, this technology has been widely employed in the studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology in the fields of transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers as well as transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high throughput sequencing technology, especially the third generation sequencing technology, and its application in transposon studies in the future, hopefully providing a comprehensive understanding and reference for related scientific researchers.

  3. High-Throughput Dissection of AAV-Host Interactions: The Fast and the Curious.

    Science.gov (United States)

    Herrmann, Anne-Kathrin; Grimm, Dirk

    2018-05-18

    Over fifty years after its initial description, Adeno-associated virus (AAV) remains a most exciting but also most elusive study object in basic or applied virology. On the one hand, its simple structure not only facilitates investigations into virus biology, but combined with the availability of numerous natural AAV variants with distinct infection efficiency and specificity also makes AAV a preferred substrate for engineering of gene delivery vectors. On the other hand, it is striking to witness a recent flurry of reports that highlight and partially close persistent gaps in our understanding of AAV virus and vector biology. This is all the more perplexing considering that recombinant AAVs have already been used in >160 clinical trials and recently been commercialized as gene therapeutics. Here, we discuss a reason for these advances in AAV research, namely, the advent and application of powerful high-throughput technology for dissection of AAV-host interactions and optimization of AAV gene therapy vectors. As relevant examples, we focus on the discovery of (i) a "new" cellular AAV receptor, AAVR, (ii) host restriction factors for AAV entry, and (iii) AAV capsid determinants that mediate trafficking through the blood-brain barrier. While (i)/(ii) are prototypes of extra- or intracellular AAV host factors that were identified via high-throughput screenings, (iii) exemplifies the power of molecular evolution to investigate the virus itself. In the future, we anticipate that these and other key technologies will continue to accelerate the dissection of AAV biology and will yield a wealth of new designer viruses for clinical use.

  4. The JCSG high-throughput structural biology pipeline

    International Nuclear Information System (INIS)

    Elsliger, Marc-André; Deacon, Ashley M.; Godzik, Adam; Lesley, Scott A.; Wooley, John; Wüthrich, Kurt; Wilson, Ian A.

    2010-01-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  5. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into the relationships between genotype and phenotype. This need is due to the advent of high throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands to tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  6. High-Reliability Health Care: Getting There from Here

    Science.gov (United States)

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context: Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods: We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings: We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions: Hospitals can make substantial progress toward high reliability by undertaking several specific

  7. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. Contact: ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online.
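
    The general idea of predicting expressibility from a few sequence-derived properties can be illustrated with a very small sketch, shown below. It is not the authors' published workflow (their notebooks are at https://github.com/SBRG/Protein_ML); the training sequences and labels are toy inventions, and Biopython's ProtParam is used here simply as one convenient way to compute aromaticity, GRAVY hydropathy and isoelectric point.

        from Bio.SeqUtils.ProtParam import ProteinAnalysis
        from sklearn.linear_model import LogisticRegression

        def sequence_features(seq):
            """Aromaticity, GRAVY hydropathy and isoelectric point for one sequence."""
            analysis = ProteinAnalysis(seq)
            return [analysis.aromaticity(), analysis.gravy(), analysis.isoelectric_point()]

        # toy training set: short sequences with invented expression outcomes (1 = expressed)
        train_seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                      "MWWFWWPLLLLAVVAAIIFWWF",
                      "MSTNPKPQRKTKRNTNRRPQDVKFPGG",
                      "MFFYLLVVLLPLLAAAFFWWYY"]
        train_labels = [1, 0, 1, 0]

        model = LogisticRegression().fit([sequence_features(s) for s in train_seqs], train_labels)
        print(model.predict([sequence_features("MKKLVLSLSLVLAFSSATAAF")]))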

  8. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  9. High-throughput screening of animal urine samples: It is fast but is it also reliable?

    Science.gov (United States)

    Kaufmann, Anton

    2016-05-01

    Advanced analytical technologies like ultra-high-performance liquid chromatography coupled to high resolution mass spectrometry can be used for veterinary drug screening of animal urine. The technique is sufficiently robust and reliable to detect veterinary drugs in urine samples of animals where the maximum residue limit of these compounds in organs like muscle, kidney, or liver has been exceeded. The limitations and possibilities of the technique are discussed. The most critical point is the variability of the drug concentration ratio between the tissue and urine. Ways to manage false positives and false negatives are discussed. The capability to confirm findings and the possibility of semi-targeted analysis are also addressed.

  10. High power klystrons for efficient reliable high power amplifiers

    Science.gov (United States)

    Levin, M.

    1980-11-01

    This report covers the design of reliable high efficiency, high power klystrons which may be used in both existing and proposed troposcatter radio systems. High Power (10 kW) klystron designs were generated in C-band (4.4 GHz to 5.0 GHz), S-band (2.5 GHz to 2.7 GHz), and L-band or UHF frequencies (755 MHz to 985 MHz). The tubes were designed for power supply compatibility and use with a vapor/liquid phase heat exchanger. Four (4) S-band tubes were developed in the course of this program along with two (2) matching focusing solenoids and two (2) heat exchangers. These tubes use five (5) tuners with counters which are attached to the focusing solenoids. A reliability mathematical model of the tube and heat exchanger system was also generated.

  11. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce
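
    As a toy illustration of why the transformation parameter matters (and not as a reproduction of the maximum-likelihood criteria developed in the paper), the sketch below applies the hyperbolic-arcsine family with different cofactors to simulated fluorescence data and scores each cofactor by how consistently a bright population lands across samples. The simulated data, the gating threshold and the scoring rule are all invented assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_sample(scale):
            # a dim population near zero plus a bright population whose scale varies slightly
            dim = rng.normal(0, 30, 3000)
            bright = rng.lognormal(mean=np.log(2000 * scale), sigma=0.3, size=2000)
            return np.concatenate([dim, bright])

        samples = [simulate_sample(s) for s in (0.9, 1.0, 1.1)]

        def bright_location_cv(cofactor):
            """Spread of the bright-population median across samples after arcsinh."""
            medians = [np.median(np.arcsinh(s[s > 500] / cofactor)) for s in samples]
            return np.std(medians) / np.mean(medians)

        for cofactor in (5, 50, 150, 500):
            print(cofactor, round(bright_location_cv(cofactor), 4))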

  12. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter

  13. High-throughput anisotropic plasma etching of polyimide for MEMS

    International Nuclear Information System (INIS)

    Bliznetsov, Vladimir; Manickam, Anbumalar; Ranganathan, Nagarajan; Chen, Junwei

    2011-01-01

    This note describes a new high-throughput process of polyimide etching for the fabrication of MEMS devices with an organic sacrificial layer approach. Using dual frequency superimposed capacitively coupled plasma we achieved a vertical profile of polyimide with an etching rate as high as 3.5 µm min⁻¹. After the fabrication of vertical structures in a polyimide material, additional steps were performed to fabricate structural elements of MEMS by deposition of a SiO₂ layer and performing release etching of polyimide. (technical note)

  14. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    Science.gov (United States)

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J; Cox, David D

    2009-11-01

    While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.

  15. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    Directory of Open Access Journals (Sweden)

    Nicolas Pinto

    2009-11-01

    Full Text Available While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
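
    The screening loop itself is conceptually simple. The sketch below (Python, with a placeholder scoring function and invented parameter ranges; it is not the authors' GPU/Cell code) shows the pattern of sampling many model instantiations, scoring each, and keeping the top candidates for follow-up.

      import random

      SEARCH_SPACE = {
          "units_per_layer": [64, 128, 256, 512],
          "pool_size":       [3, 5, 7, 9],
          "norm_exponent":   [1.0, 1.5, 2.0],
          "learning_rate":   [1e-4, 1e-3, 1e-2],
      }

      def sample_configuration(rng):
          return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

      def evaluate_model(config):
          # Placeholder scoring function; in the real screen this would build the
          # model on stream-processing hardware and return recognition accuracy.
          return random.random()

      def screen(n_candidates=1000, keep=10, seed=0):
          rng = random.Random(seed)
          scored = [(evaluate_model(c), c) for c in
                    (sample_configuration(rng) for _ in range(n_candidates))]
          scored.sort(key=lambda pair: pair[0], reverse=True)
          return scored[:keep]          # top-scoring instantiations for follow-up

      for score, config in screen(n_candidates=200, keep=5):
          print(f"{score:.3f}  {config}")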

  16. High throughput soft embossing process for micro-patterning of PEDOT thin films

    DEFF Research Database (Denmark)

    Fanzio, Paola; Cagliani, Alberto; Peterffy, Kristof G.

    2017-01-01

    The patterning of conductive polymers is a major challenge in the implementation of these materials in several research and industrial applications, spanning from photovoltaics to biosensors. Within this context, we have developed a reliable technique to pattern a thin layer of the conductive...... polymer poly(3,4-ethylenedioxythiophene) (PEDOT) by means of a low cost and high throughput soft embossing process. We were able to reproduce a functional conductive pattern with a minimum dimension of 1 μm and to fabricate electrically decoupled electrodes. Moreover, the conductivity of the PEDOT films...... has been characterized, finding that a post-processing treatment with Ethylene Glycol allows an increase in conductivity and a decrease in water solubility of the PEDOT film. Finally, cyclic voltammetry demonstrates that the post-treatment also ensures the electrochemical activity of the film. Our...

  17. A high-throughput surface plasmon resonance biosensor based on differential interferometric imaging

    International Nuclear Information System (INIS)

    Wang, Daqian; Ding, Lili; Zhang, Wei; Zhang, Enyao; Yu, Xinglong; Luo, Zhaofeng; Ou, Huichao

    2012-01-01

    A new high-throughput surface plasmon resonance (SPR) biosensor based on differential interferometric imaging is reported. The two SPR interferograms of the sensing surface are imaged on two CCD cameras. The phase difference between the two interferograms is 180°. The refractive index related factor (RIRF) of the sensing surface is calculated from the two simultaneously acquired interferograms. The simulation results indicate that the RIRF exhibits a linear relationship with the refractive index of the sensing surface and is unaffected by the noise, drift and intensity distribution of the light source. The affinity and kinetic information can be extracted in real time from continuously acquired RIRF distributions. The results of refractometry experiments show that the dynamic detection range of SPR differential interferometric imaging system can be over 0.015 refractive index unit (RIU). High refractive index resolution is down to 0.45 RU (1 RU = 1 × 10⁻⁶ RIU). Imaging and protein microarray experiments demonstrate the ability of high-throughput detection. The aptamer experiments demonstrate that the SPR sensor based on differential interferometric imaging has a great capability to be implemented for high-throughput aptamer kinetic evaluation. These results suggest that this biosensor has the potential to be utilized in proteomics and drug discovery after further improvement. (paper)

  18. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystem's SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
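
    As an illustration of the kind of variant filtering described above, the following sketch applies thresholds on call consistency, allele frequency, quality, and coverage. The field names and threshold values are hypothetical and do not reflect the Alpheus schema.

      def passes_filters(variant,
                         min_consistency=0.9,    # fraction of reads agreeing with the call
                         min_allele_freq=0.2,
                         min_mean_quality=20.0,
                         min_coverage=10):
          return (variant["consistency"] >= min_consistency and
                  variant["allele_freq"] >= min_allele_freq and
                  variant["mean_quality"] >= min_mean_quality and
                  variant["coverage"] >= min_coverage)

      calls = [
          {"pos": 10142, "type": "SNP",   "consistency": 0.97, "allele_freq": 0.48,
           "mean_quality": 31.0, "coverage": 42},
          {"pos": 20991, "type": "indel", "consistency": 0.55, "allele_freq": 0.08,
           "mean_quality": 18.0, "coverage": 6},
      ]
      kept = [v for v in calls if passes_filters(v)]
      print(f"kept {len(kept)} of {len(calls)} variant calls")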

  19. High-throughput screening with micro-x-ray fluorescence

    International Nuclear Information System (INIS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-01-01

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity

  20. A high-throughput pipeline for the design of real-time PCR signatures

    Directory of Open Access Journals (Sweden)

    Reifman Jaques

    2010-06-01

    Full Text Available Abstract Background Pathogen diagnostic assays based on polymerase chain reaction (PCR) technology provide high sensitivity and specificity. However, the design of these diagnostic assays is computationally intensive, requiring high-throughput methods to identify unique PCR signatures in the presence of an ever increasing availability of sequenced genomes. Results We present the Tool for PCR Signature Identification (TOPSI), a high-performance computing pipeline for the design of PCR-based pathogen diagnostic assays. The TOPSI pipeline efficiently designs PCR signatures common to multiple bacterial genomes by obtaining the shared regions through pairwise alignments between the input genomes. TOPSI successfully designed PCR signatures common to 18 Staphylococcus aureus genomes in less than 14 hours using 98 cores on a high-performance computing system. Conclusions TOPSI is a computationally efficient, fully integrated tool for high-throughput design of PCR signatures common to multiple bacterial genomes. TOPSI is freely available for download at http://www.bhsai.org/downloads/topsi.tar.gz.

  1. High-Throughput Assay for Enantiomeric Excess Determination in 1,2- and 1,3-Diols and Direct Asymmetric Reaction Screening.

    Science.gov (United States)

    Shcherbakova, Elena G; Brega, Valentina; Lynch, Vincent M; James, Tony D; Anzenbacher, Pavel

    2017-07-26

    A simple and efficient method for determination of the yield, enantiomeric/diastereomeric excess (ee/de), and absolute configuration of crude chiral diols without the need for work-up and product isolation in a high throughput setting is described. This approach utilizes a self-assembled iminoboronate ester formed as a product by dynamic covalent self-assembly of a chiral diol with an enantiopure fluorescent amine such as tryptophan methyl ester or tryptophanol and 2-formylphenylboronic acid. The resulting diastereomeric boronates display different photophysical properties and allow for fluorescence-based ee determination of molecules containing a 1,2- or 1,3-diol moiety. This method has been utilized for the screening of ee in a number of chiral diols including atorvastatin, a statin used for the treatment of hypercholesterolemia. Noyori asymmetric hydrogenation of benzil was performed in a highly parallel fashion, and the crude products from the parallel asymmetric synthesis were analyzed in real time and in a high-throughput screening (HTS) fashion. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Simultaneous measurements of auto-immune and infectious disease specific antibodies using a high throughput multiplexing tool.

    Directory of Open Access Journals (Sweden)

    Atul Asati

    Full Text Available Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bioplex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided an added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for the assay of both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders.

  3. High-throughput assessment of context-dependent effects of chromatin proteins

    NARCIS (Netherlands)

    Brueckner, L. (Laura); Van Arensbergen, J. (Joris); Akhtar, W. (Waseem); L. Pagie (Ludo); B. van Steensel (Bas)

    2016-01-01

    textabstractBackground: Chromatin proteins control gene activity in a concerted manner. We developed a high-throughput assay to study the effects of the local chromatin environment on the regulatory activity of a protein of interest. The assay combines a previously reported multiplexing strategy

  4. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly

  5. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    Science.gov (United States)

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPAToxCast program requires an efficient, transparent, and reproducible data pipeline.Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...

  6. Reliable Detection of Herpes Simplex Virus Sequence Variation by High-Throughput Resequencing.

    Science.gov (United States)

    Morse, Alison M; Calabro, Kaitlyn R; Fear, Justin M; Bloom, David C; McIntyre, Lauren M

    2017-08-16

    High-throughput sequencing (HTS) has resulted in data for a number of herpes simplex virus (HSV) laboratory strains and clinical isolates. The knowledge of these sequences has been critical for investigating viral pathogenicity. However, the assembly of complete herpesviral genomes, including HSV, is complicated due to the existence of large repeat regions and arrays of smaller reiterated sequences that are commonly found in these genomes. In addition, the inherent genetic variation in populations of isolates for viruses and other microorganisms presents an additional challenge to many existing HTS sequence assembly pipelines. Here, we evaluate two approaches for the identification of genetic variants in HSV1 strains using Illumina short read sequencing data. The first, a reference-based approach, identifies variants from reads aligned to a reference sequence and the second, a de novo assembly approach, identifies variants from reads aligned to de novo assembled consensus sequences. Of critical importance for both approaches is the reduction in the number of low complexity regions through the construction of a non-redundant reference genome. We compared variants identified in the two methods. Our results indicate that approximately 85% of variants are identified regardless of the approach. The reference-based approach to variant discovery captures an additional 15% representing variants divergent from the HSV1 reference possibly due to viral passage. Reference-based approaches are significantly less labor-intensive and identify variants across the genome where de novo assembly-based approaches are limited to regions where contigs have been successfully assembled. In addition, regions of poor quality assembly can lead to false variant identification in de novo consensus sequences. For viruses with a well-assembled reference genome, a reference-based approach is recommended.
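
    The comparison of the two approaches can be pictured as a set operation over variant calls keyed by position and alleles, as in the sketch below (the variant tuples are invented, for illustration only).

      # Compare variant calls from the reference-based and de novo assembly
      # approaches as sets keyed by (position, ref, alt), and report the
      # fraction found by both (the abstract reports roughly 85%).
      ref_based = {(152, "A", "G"), (931, "C", "T"), (2044, "G", "A"), (5180, "T", "C")}
      de_novo   = {(152, "A", "G"), (931, "C", "T"), (2044, "G", "A")}

      shared = ref_based & de_novo
      union = ref_based | de_novo
      print(f"shared: {len(shared)}/{len(union)} = {len(shared)/len(union):.0%}")
      print(f"reference-only calls: {sorted(ref_based - de_novo)}")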

  7. High-Throughput Accurate Single-Cell Screening of Euglena gracilis with Fluorescence-Assisted Optofluidic Time-Stretch Microscopy.

    Directory of Open Access Journals (Sweden)

    Baoshan Guo

    Full Text Available The development of reliable, sustainable, and economical sources of alternative fuels is an important, but challenging goal for the world. As an alternative to liquid fossil fuels, algal biofuel is expected to play a key role in alleviating global warming since algae absorb atmospheric CO2 via photosynthesis. Among various algae for fuel production, Euglena gracilis is an attractive microalgal species as it is known to produce wax ester (good for biodiesel and aviation fuel) within lipid droplets. To date, while there exist many techniques for inducing microalgal cells to produce and accumulate lipid with high efficiency, few analytical methods are available for characterizing a population of such lipid-accumulated microalgae including E. gracilis with high throughput, high accuracy, and single-cell resolution simultaneously. Here we demonstrate high-throughput, high-accuracy, single-cell screening of E. gracilis with fluorescence-assisted optofluidic time-stretch microscopy, a method that combines the strengths of microfluidic cell focusing, optical time-stretch microscopy, and fluorescence detection used in conventional flow cytometry. Specifically, our fluorescence-assisted optofluidic time-stretch microscope consists of an optical time-stretch microscope and a fluorescence analyzer on top of a hydrodynamically focusing microfluidic device and can detect fluorescence from every E. gracilis cell in a population and simultaneously obtain its image with a high throughput of 10,000 cells/s. With the multi-dimensional information acquired by the system, we classify nitrogen-sufficient (ordinary) and nitrogen-deficient (lipid-accumulated) E. gracilis cells with a low false positive rate of 1.0%. This method holds promise for evaluating cultivation techniques and selective breeding for microalgae-based biofuel production.

  8. High-reliability health care: getting there from here.

    Science.gov (United States)

    Chassin, Mark R; Loeb, Jerod M

    2013-09-01

    Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer "project fatigue" because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals' readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research

  9. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    Science.gov (United States)

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on-time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 % to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on-time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established
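
    A gravimetric calibration of the kind described reduces, in the simplest case, to a linear fit between fluorescence and extracted lipid. The sketch below uses invented numbers and is not the authors' protocol.

      import numpy as np

      # paired calibration samples: fluorescence (a.u.) vs. extracted lipid (g/L)
      fluorescence = np.array([120.0, 260.0, 410.0, 800.0, 1180.0])
      lipid_g_per_L = np.array([0.05, 0.11, 0.18, 0.35, 0.52])

      slope, intercept = np.polyfit(fluorescence, lipid_g_per_L, deg=1)

      def fluorescence_to_lipid(signal):
          """Convert a Nile red signal (a.u.) to lipid concentration (g/L)."""
          return slope * np.asarray(signal) + intercept

      plate_readings = [300.0, 950.0]
      print(np.round(fluorescence_to_lipid(plate_readings), 3))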

  10. Virtual high screening throughput and design of 14α-lanosterol ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-07-06

    Jul 6, 2009 ... Virtual high screening throughput and design of 14α-lanosterol demethylase inhibitors against Mycobacterium tuberculosis. Hildebert B. Maurice1*, Esther Tuarira1 and Kennedy Mwambete2. 1School of Pharmaceutical Sciences, Institute of Allied Health Sciences, Muhimbili University of Health and.

  11. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  12. Validation of a high-throughput fermentation system based on online monitoring of biomass and fluorescence in continuously shaken microtiter plates

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-06-01

    Full Text Available Abstract Background An advanced version of a recently reported high-throughput fermentation system with online measurement, called BioLector, and its validation is presented. The technology combines high-throughput screening and high-information content by applying online monitoring of scattered light and fluorescence intensities in continuously shaken microtiter plates. Various examples in calibration of the optical measurements, clone and media screening and promoter characterization are given. Results Bacterial and yeast biomass concentrations of up to 50 g/L cell dry weight could be linearly correlated to scattered light intensities. In media screening, the BioLector could clearly demonstrate its potential for detecting different biomass and product yields and deducing specific growth rates for quantitatively evaluating media and nutrients. Growth inhibition due to inappropriate buffer conditions could be detected by reduced growth rates and a temporary increase in NADH fluorescence. GFP served very well as reporter protein for investigating the promoter regulation under different carbon sources in yeast strains. A clone screening of 90 different GFP-expressing Hansenula polymorpha clones depicted the broad distribution of growth behavior and an even stronger distribution in GFP expression. The importance of mass transfer conditions could be demonstrated by varying filling volumes of an E. coli culture in 96 well MTP. The different filling volumes cause a deviation in the culture growth and acidification both monitored via scattered light intensities and the fluorescence of a pH indicator, respectively. Conclusion The BioLector technology is a very useful tool to perform quantitative microfermentations under engineered reaction conditions. With this technique, specific yields and rates can be directly deduced from online biomass and product concentrations, which is superior to existing technologies such as microplate readers or optode
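
    The quantitative use of scattered light described above amounts to a linear biomass calibration plus a log-linear fit for the specific growth rate. The sketch below illustrates this with assumed calibration constants and invented readings; it is not the BioLector software.

      import numpy as np

      CAL_SLOPE = 0.012      # g/L cell dry weight per scattered-light unit (assumed)
      CAL_OFFSET = 0.05      # g/L blank offset (assumed)

      def biomass_from_scattered_light(signal):
          return CAL_SLOPE * np.asarray(signal) + CAL_OFFSET

      def specific_growth_rate(times_h, biomass_g_per_L):
          """Fit ln(X) = ln(X0) + mu*t over an exponential-phase window."""
          mu, _ = np.polyfit(times_h, np.log(biomass_g_per_L), deg=1)
          return mu           # 1/h

      t = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
      signal = np.array([40.0, 65.0, 105.0, 170.0, 275.0])
      x = biomass_from_scattered_light(signal)
      print(f"mu = {specific_growth_rate(t, x):.2f} 1/h")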

  13. High-throughput screening of carbohydrate-degrading enzymes using novel insoluble chromogenic substrate assay kits

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Willats, William George Tycho

    2016-01-01

    for this is that advances in genome and transcriptome sequencing, together with associated bioinformatics tools allow for rapid identification of candidate CAZymes, but technology for determining an enzyme's biochemical characteristics has advanced more slowly. To address this technology gap, a novel high-throughput assay...... CPH and ICB substrates are provided in a 96-well high-throughput assay system. The CPH substrates can be made in four different colors, enabling them to be mixed together and thus increasing assay throughput. The protocol describes a 96-well plate assay and illustrates how this assay can be used...... for screening the activities of enzymes, enzyme cocktails, and broths....

  14. The High-Throughput Analyses Era: Are We Ready for the Data Struggle?

    Science.gov (United States)

    D'Argenio, Valeria

    2018-03-02

    Recent and rapid technological advances in molecular sciences have dramatically increased the ability to carry out high-throughput studies characterized by big data production. This, in turn, has exposed a gap between the rate at which data are produced and the capacity to analyze them. Indeed, big data management is becoming an increasingly important aspect of many fields of molecular research including the study of human diseases. Now, the challenge is to identify, within the huge amount of data obtained, that which is of clinical relevance. In this context, issues related to data interpretation, sharing and storage need to be assessed and standardized. Once this is achieved, the integration of data from different -omic approaches will improve the diagnosis, monitoring and therapy of diseases by allowing the identification of novel, potentially actionable biomarkers in view of personalized medicine.

  15. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
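
    The core cross-referencing step can be sketched as a set intersection between the user's gene list and each stored dataset, as below. The dataset names and contents are hypothetical; this is not the CrossCheck implementation.

      published_datasets = {
          "RNAi screen, autophagy (hypothetical)":     {"ATG5", "ATG7", "ULK1", "MTOR"},
          "CRISPR screen, necroptosis (hypothetical)": {"RIPK1", "RIPK3", "MLKL", "TNF"},
          "Kinase interactome (hypothetical)":         {"RIPK1", "MTOR", "AKT1"},
      }

      def cross_check(user_genes, datasets):
          user = {g.upper() for g in user_genes}
          results = []
          for name, genes in datasets.items():
              overlap = sorted(user & genes)
              if overlap:
                  results.append((name, overlap))
          return sorted(results, key=lambda item: len(item[1]), reverse=True)

      for name, overlap in cross_check(["ripk1", "mlkl", "mtor"], published_datasets):
          print(f"{name}: {', '.join(overlap)}")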

  16. MGI-oriented High-throughput Measurement of Interdiffusion Coefficient Matrices in Ni-based Superalloys

    Directory of Open Access Journals (Sweden)

    TANG Ying

    2017-01-01

    our research group. Up to now, the interdiffusivity matrices in γ and γ' phases of the core ternary systems including Ni-Al-X (X=Rh, Ta, W, Re, Os and Ir) have been efficiently measured, and their reliability has also been carefully validated. Based on the experimental results, the interdiffusivities for different elements in Ni-based superalloys are carefully compared, from which the potential substitutional alloying elements for Re in Ni-based superalloys as well as the points for alloy composition design are proposed. Finally, the next steps in the measurement of interdiffusivity matrices in Ni-based superalloys, as well as the development trends of high-throughput measurement of interdiffusivities in our research group, are pointed out.
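
    For reference, the quantity being measured enters the standard Fick-Onsager description of multicomponent diffusion (a textbook relation, not taken from the article above): the interdiffusion flux of each independent component i in an n-component alloy, with component n taken as the solvent, is

      \tilde{J}_i = -\sum_{j=1}^{n-1} \tilde{D}^{\,n}_{ij}\,\frac{\partial C_j}{\partial x}, \qquad i = 1, \dots, n-1

    so a full interdiffusivity matrix \tilde{D}^{\,n}_{ij} must be determined at each composition of interest.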

  17. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp).
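
    For readers who prefer a scriptable route, the same class of merge/filter task can be sketched with pandas rather than HTDP itself; the column names and values below are invented.

      import pandas as pd

      arrays = pd.DataFrame({"gene": ["NF1", "TP53", "EGFR"],
                             "fold_change": [2.4, 0.5, 3.1]})
      variants = pd.DataFrame({"gene": ["NF1", "EGFR", "KRAS"],
                               "coverage": [44, 12, 61],
                               "allele_freq": [0.47, 0.08, 0.33]})

      # Keep well-supported sequencing variants, then intersect with the array hits.
      supported = variants[(variants["coverage"] >= 20) & (variants["allele_freq"] >= 0.25)]
      merged = arrays.merge(supported, on="gene", how="inner")

      print(merged.to_csv(sep="\t", index=False))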

  18. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.
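
    Dimensionality reduction of the extracted phenotypes is the step that makes the cell-cycle trajectory visible. A minimal sketch with scikit-learn's t-SNE on stand-in data is shown below; the feature matrix is random, not time-stretch measurements.

      import numpy as np
      from sklearn.manifold import TSNE

      rng = np.random.default_rng(1)
      features = rng.normal(size=(1000, 24))     # 1000 cells x 24 biophysical phenotypes

      embedding = TSNE(n_components=2, perplexity=30, init="pca",
                       random_state=1).fit_transform(features)
      print(embedding.shape)                     # (1000, 2) coordinates for plotting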

  19. Fluorescence-based high-throughput screening of dicer cleavage activity

    Czech Academy of Sciences Publication Activity Database

    Podolská, Kateřina; Sedlák, David; Bartůněk, Petr; Svoboda, Petr

    2014-01-01

    Vol. 19, No. 3 (2014), pp. 417-426. ISSN 1087-0571. R&D Projects: GA ČR GA13-29531S; GA MŠk(CZ) LC06077; GA MŠk LM2011022. Grant - others: EMBO(DE) 1483. Institutional support: RVO:68378050. Keywords: Dicer; siRNA; high-throughput screening. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 2.423, year: 2014

  20. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free...

  1. High-throughput transcriptome analysis of barley (Hordeum vulgare) exposed to excessive boron.

    Science.gov (United States)

    Tombuloglu, Guzin; Tombuloglu, Huseyin; Sakcali, M Serdal; Unver, Turgay

    2015-02-15

    Boron (B) is an essential micronutrient for optimum plant growth. However, above certain threshold B is toxic and causes yield loss in agricultural lands. While a number of studies were conducted to understand B tolerance mechanism, a transcriptome-wide approach for B tolerant barley is performed here for the first time. A high-throughput RNA-Seq (cDNA) sequencing technology (Illumina) was used with barley (Hordeum vulgare), yielding 208 million clean reads. In total, 256,874 unigenes were generated and assigned to known peptide databases: Gene Ontology (GO) (99,043), Swiss-Prot (38,266), Clusters of Orthologous Groups (COG) (26,250), and the Kyoto Encyclopedia of Genes and Genomes (KEGG) (36,860), as determined by BLASTx search. According to the digital gene expression (DGE) analyses, 16% and 17% of the transcripts were found to be differentially regulated in root and leaf tissues, respectively. Most of them were involved in cell wall, stress response, membrane, protein kinase and transporter mechanisms. Some of the genes detected as highly expressed in root tissue are phospholipases, predicted divalent heavy-metal cation transporters, formin-like proteins and calmodulin/Ca(2+)-binding proteins. In addition, chitin-binding lectin precursor, ubiquitin carboxyl-terminal hydrolase, and serine/threonine-protein kinase AFC2 genes were indicated to be highly regulated in leaf tissue upon excess B treatment. Some pathways, such as the Ca(2+)-calmodulin system, are activated in response to B toxicity. The differential regulation of 10 transcripts was confirmed by qRT-PCR, revealing the tissue-specific responses against B toxicity and their putative function in B-tolerance mechanisms. Copyright © 2014. Published by Elsevier B.V.

  2. Crystal Symmetry Algorithms in a High-Throughput Framework for Materials

    Science.gov (United States)

    Taylor, Richard

    The high-throughput framework AFLOW that has been developed and used successfully over the last decade is improved to include fully-integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform with the conventions and prescriptions given in the International Tables of Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined and their labels and site symmetry are provided. The symmetry code makes no assumptions on the input cell orientation, origin, or reduction and has been integrated in the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and examination of the algorithms scaling with cell size and symmetry is also reported.
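
    The AFLOW symmetry code itself is written in C++. Purely as an indication of the kind of characterization involved, the sketch below uses the separate spglib library (not AFLOW) to determine the space group of a simple fcc structure.

      import numpy as np
      import spglib

      # Conventional cell of fcc aluminium: lattice vectors, fractional positions,
      # and atomic numbers.
      lattice = 4.05 * np.eye(3)
      positions = [[0.0, 0.0, 0.0], [0.0, 0.5, 0.5],
                   [0.5, 0.0, 0.5], [0.5, 0.5, 0.0]]
      numbers = [13, 13, 13, 13]

      cell = (lattice, positions, numbers)
      print(spglib.get_spacegroup(cell, symprec=1e-5))   # e.g. 'Fm-3m (225)'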

  3. Development of Control Applications for High-Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yurii A.; Matsugaki, Naohiro; Honda, Nobuo; Sasajima, Kumiko; Igarashi, Noriyuki; Hiraki, Masahiko; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

    An integrated client-server control system (PCCS) with a unified relational database (PCDB) has been developed for high-throughput protein crystallography experiments on synchrotron beamlines. The major steps in protein crystallographic experiments (purification, crystallization, crystal harvesting, data collection, and data processing) are integrated into the software. All information necessary for performing protein crystallography experiments is stored in the PCDB database (except raw X-ray diffraction data, which is stored in the Network File Server). To allow all members of a protein crystallography group to participate in experiments, the system was developed as a multi-user system with secure network access based on TCP/IP secure UNIX sockets. Secure remote access to the system is possible from any operating system with X-terminal and SSH/X11 (Secure Shell with graphical user interface) support. Currently, the system covers the high-throughput X-ray data collection stages and is being commissioned at BL5A and NW12A (PF, PF-AR, KEK, Tsukuba, Japan)

  4. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are delivered forward the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative, reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the samples motion inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain, and with intrinsic useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently from the magnification factor. The throughput of the imaging system is dramatically increased as STDH provides unlimited FoV, refocusable imaging of samples inside the liquid volume with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged and tracked in 3D with high-throughput red blood cells moving inside the channel volume under non ideal flow conditions.

  5. An RNA-Based Fluorescent Biosensor for High-Throughput Analysis of the cGAS-cGAMP-STING Pathway.

    Science.gov (United States)

    Bose, Debojit; Su, Yichi; Marcus, Assaf; Raulet, David H; Hammond, Ming C

    2016-12-22

    In mammalian cells, the second messenger (2'-5',3'-5') cyclic guanosine monophosphate-adenosine monophosphate (2',3'-cGAMP), is produced by the cytosolic DNA sensor cGAMP synthase (cGAS), and subsequently bound by the stimulator of interferon genes (STING) to trigger interferon response. Thus, the cGAS-cGAMP-STING pathway plays a critical role in pathogen detection, as well as pathophysiological conditions including cancer and autoimmune disorders. However, studying and targeting this immune signaling pathway has been challenging due to the absence of tools for high-throughput analysis. We have engineered an RNA-based fluorescent biosensor that responds to 2',3'-cGAMP. The resulting "mix-and-go" cGAS activity assay shows excellent statistical reliability as a high-throughput screening (HTS) assay and distinguishes between direct and indirect cGAS inhibitors. Furthermore, the biosensor enables quantitation of 2',3'-cGAMP in mammalian cell lysates. We envision this biosensor-based assay as a resource to study the cGAS-cGAMP-STING pathway in the context of infectious diseases, cancer immunotherapy, and autoimmune diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.
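
    A common way to express the statistical reliability of an HTS assay is the Z'-factor computed from control wells; the metric is not named in the abstract, so the sketch below is offered only as the usual convention, with invented control values.

      import numpy as np

      def z_prime(positive, negative):
          positive, negative = np.asarray(positive), np.asarray(negative)
          return 1.0 - 3.0 * (positive.std(ddof=1) + negative.std(ddof=1)) \
                       / abs(positive.mean() - negative.mean())

      pos = [980, 1010, 995, 1005, 990]       # e.g. biosensor signal with 2',3'-cGAMP
      neg = [110, 130, 120, 125, 115]         # e.g. biosensor signal without ligand
      print(f"Z' = {z_prime(pos, neg):.2f}")  # > 0.5 is generally considered excellent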

  6. Robust high-yield ~1 TBq production of cyclotron based sodium [99mTc]pertechnetate.

    Science.gov (United States)

    Andersson, J D; Thomas, B; Selivanova, S V; Berthelette, E; Wilson, J S; McEwan, A J B; Gagnon, K

    2018-03-02

    This paper presents the irradiation and processing of high-current 100Mo targets at the University of Alberta (UofA) in a GMP compliant setting. For purposes of comparison with a second production facility, additional studies at Centre Hospitalier Universitaire de Sherbrooke (CHUS) are also described. More than 70% of today's diagnostic radiopharmaceuticals are based on 99mTc; however, the conventional supply chain for obtaining 99mTc is fragile. The aim of this work was to demonstrate reliable high yield production and processing of 99mTc with medium-energy, high-current cyclotrons. We used two cyclotrons (TR-24, Advanced Cyclotron Systems, Inc) for irradiations with 22 MeV or 24 MeV incident energy and 400 μA current up to a maximum of 6 h. The irradiated 100Mo was dissolved using peroxide, basified using ammonium carbonate, and purified using a PEG-based solid phase extraction technique. High-yield productions with 22 MeV (400 μA, 6 h) yielded an average isolated [99mTc]TcO4- yield of 878 GBq ± 99 GBq (23.7 Ci ± 2.7 Ci) decay corrected to EOB, n = 8 (isolated saturation yield: 4.36 ± 0.49 GBq/μA). Irradiations with 24 MeV (400 μA, 6 h) resulted in an average isolated [99mTc]TcO4- yield of 993 GBq ± 100 GBq (26.8 Ci ± 2.7 Ci) decay corrected to EOB, n = 7 (isolated saturation yield: 4.97 ± 0.50 GBq/μA). These yields correspond to 600-700 GBq (16-19 Ci) of [99mTc]TcO4- at release (i.e. 3 hours post-EOB). For all tested batches, the QC results were within the recently published specifications in the European Pharmacopoeia. Reliable near-TBq production yields for 99mTc can be obtained using medium-energy cyclotrons. This work presents evidence that medium-energy high-current cyclotrons can provide high yields of [99mTc]TcO4- with radionuclidic impurity levels within the specifications of the existing European Pharmacopoeia monograph, indicating that this
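
    The quoted saturation yields follow from the EOB activity, beam current, and irradiation time through the standard activation relation; a quick consistency check using the nominal 99mTc half-life is shown below.

      # A_EOB = Y_sat * I * (1 - exp(-lambda * t_irr))
      import math

      HALF_LIFE_H = 6.01                      # 99mTc half-life in hours
      lam = math.log(2) / HALF_LIFE_H

      A_eob_GBq = 878.0                       # isolated activity, decay-corrected to EOB
      current_uA = 400.0
      t_irr_h = 6.0

      y_sat = A_eob_GBq / (current_uA * (1 - math.exp(-lam * t_irr_h)))
      print(f"saturation yield ~ {y_sat:.2f} GBq/uA")   # close to the quoted 4.36 GBq/uA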

  7. Fundamental Tradeoffs among Reliability, Latency and Throughput in Cellular Networks

    DEFF Research Database (Denmark)

    Soret, Beatriz; Mogensen, Preben; Pedersen, Klaus I.

    2014-01-01

    We address the fundamental tradeoffs among latency, reliability and throughput in a cellular network. The most important elements influencing the KPIs in a 4G network are identified, and the inter-relationships among them is discussed. We use the effective bandwidth and the effective capacity......, in which latency and reliability will be two of the principal KPIs....

  8. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    Science.gov (United States)

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. On contrary, microfluidic 3D culture can allow precise control of cell microenvironments, and provide higher throughput by orders of magnitude. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  9. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xianfei; Yang, Haori

    2015-06-01

    A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces signal-to-noise ratio and causes degradation in energy resolution. In this work, a pulse pile-up recovery algorithm based on template-matching was proved to be an effective approach to achieve high-throughput gamma ray spectroscopy. First, a discussion of the algorithm was given in detail. Second, the algorithm was then successfully utilized to process simulated piled-up pulses from a scintillator detector. Third, the algorithm was implemented to analyze high rate data from a NaI detector, a silicon drift detector and a HPGe detector. The promising results demonstrated the capability of this algorithm to achieve high-throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods. - Highlights: • A detailed discussion on the template-matching algorithm was given. • The algorithm was tested on data from a NaI and a Si detector. • The algorithm was successfully implemented on high rate data from a HPGe detector. • The performance of the algorithm was compared with traditional shaping methods. • The advantage of the algorithm in active interrogation was discussed.
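
    Template matching for pile-up recovery can be sketched as a linear least-squares fit of time-shifted copies of a known pulse template to the recorded waveform; the pulse shape, arrival times, and noise level below are invented for illustration and do not reproduce the authors' algorithm.

      import numpy as np

      def template(n, t0, rise=3.0, decay=25.0):
          t = np.arange(n) - t0
          pulse = np.where(t >= 0, (1 - np.exp(-t / rise)) * np.exp(-t / decay), 0.0)
          return pulse / pulse.max()

      n = 300
      true_amps, arrivals = (1.0, 0.6), (50, 70)          # two overlapping pulses
      rng = np.random.default_rng(2)
      waveform = sum(a * template(n, t0) for a, t0 in zip(true_amps, arrivals))
      waveform += rng.normal(scale=0.01, size=n)          # add noise

      # Design matrix: one shifted template per detected arrival time; the fitted
      # amplitudes are proportional to the deposited energies.
      A = np.column_stack([template(n, t0) for t0 in arrivals])
      amps, *_ = np.linalg.lstsq(A, waveform, rcond=None)
      print(np.round(amps, 3))                            # close to (1.0, 0.6)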

  10. An automated high throughput screening-compatible assay to identify regulators of stem cell neural differentiation.

    Science.gov (United States)

    Casalino, Laura; Magnani, Dario; De Falco, Sandro; Filosa, Stefania; Minchiotti, Gabriella; Patriarca, Eduardo J; De Cesare, Dario

    2012-03-01

    The use of Embryonic Stem Cells (ESCs) holds considerable promise both for drug discovery programs and the treatment of degenerative disorders in regenerative medicine approaches. Nevertheless, the successful use of ESCs is still limited by the lack of efficient control of ESC self-renewal and differentiation capabilities. In this context, the possibility to modulate ESC biological properties and to obtain homogenous populations of correctly specified cells will help developing physiologically relevant screens, designed for the identification of stem cell modulators. Here, we developed a high throughput screening-suitable ESC neural differentiation assay by exploiting the Cell(maker) robotic platform and demonstrated that neural progenies can be generated from ESCs in complete automation, with high standards of accuracy and reliability. Moreover, we performed a pilot screening providing proof of concept that this assay allows the identification of regulators of ESC neural differentiation in full automation.

  11. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    Science.gov (United States)

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  12. High-Throughput Phenotyping and QTL Mapping Reveals the Genetic Architecture of Maize Plant Growth.

    Science.gov (United States)

    Zhang, Xuehai; Huang, Chenglong; Wu, Di; Qiao, Feng; Li, Wenqiang; Duan, Lingfeng; Wang, Ke; Xiao, Yingjie; Chen, Guoxing; Liu, Qian; Xiong, Lizhong; Yang, Wanneng; Yan, Jianbing

    2017-03-01

    With increasing demand for novel traits in crop breeding, the plant research community faces the challenge of quantitatively analyzing the structure and function of large numbers of plants. A clear goal of high-throughput phenotyping is to bridge the gap between genomics and phenomics. In this study, we quantified 106 traits from a maize ( Zea mays ) recombinant inbred line population ( n = 167) across 16 developmental stages using the automatic phenotyping platform. Quantitative trait locus (QTL) mapping with a high-density genetic linkage map, including 2,496 recombinant bins, was used to uncover the genetic basis of these complex agronomic traits, and 988 QTLs have been identified for all investigated traits, including three QTL hotspots. Biomass accumulation and final yield were predicted using a combination of dissected traits in the early growth stage. These results reveal the dynamic genetic architecture of maize plant growth and enhance ideotype-based maize breeding and prediction. © 2017 American Society of Plant Biologists. All Rights Reserved.

  13. A Functional High-Throughput Assay of Myelination in Vitro

    Science.gov (United States)

    2014-07-01

    Keywords: human induced pluripotent stem cells, hydrogels, 3D culture, electrophysiology, high-throughput assay. ...image the 3D rat dorsal root ganglion (DRG) cultures with sufficiently low background as to detect electrically-evoked depolarization events, as...of voltage-sensitive dyes. We have made substantial progress in Task 4.1. We have fabricated neural fiber tracts from DRG explants and

  14. Printing Proteins as Microarrays for High-Throughput Function Determination

    Science.gov (United States)

    MacBeath, Gavin; Schreiber, Stuart L.

    2000-09-01

    Systematic efforts are currently under way to construct defined sets of cloned genes for high-throughput expression and purification of recombinant proteins. To facilitate subsequent studies of protein function, we have developed miniaturized assays that accommodate extremely low sample volumes and enable the rapid, simultaneous processing of thousands of proteins. A high-precision robot designed to manufacture complementary DNA microarrays was used to spot proteins onto chemically derivatized glass slides at extremely high spatial densities. The proteins attached covalently to the slide surface yet retained their ability to interact specifically with other proteins, or with small molecules, in solution. Three applications for protein microarrays were demonstrated: screening for protein-protein interactions, identifying the substrates of protein kinases, and identifying the protein targets of small molecules.

  15. Modeling Disordered Materials with a High Throughput ab-initio Approach

    Science.gov (United States)

    2015-11-13

    Modeling Disordered Materials with a High-Throughput ab initio Approach. Kesong Yang, Corey Oses, and Stefano Curtarolo. Only a record fragment is available; it cites J. Furthmüller, Efficient iterative schemes for ab initio total-energy calculations using a plane-wave basis set, Phys. Rev. B 54, 11169–11186 (1996).

  16. Novel high-throughput cell-based hybridoma screening methodology using the Celigo Image Cytometer.

    Science.gov (United States)

    Zhang, Haohai; Chan, Leo Li-Ying; Rice, William; Kassam, Nasim; Longhi, Maria Serena; Zhao, Haitao; Robson, Simon C; Gao, Wenda; Wu, Yan

    2017-08-01

    Hybridoma screening is a critical step for antibody discovery, which necessitates prompt identification of potential clones from hundreds to thousands of hybridoma cultures against the desired immunogen. Technical issues associated with ELISA- and flow cytometry-based screening limit accuracy and diminish high-throughput capability, increasing time and cost. Conventional ELISA screening with coated antigen is also impractical for difficult-to-express hydrophobic membrane antigens or multi-chain protein complexes. Here, we demonstrate novel high-throughput screening methodology employing the Celigo Image Cytometer, which avoids nonspecific signals by contrasting antibody binding signals directly on living cells, with and without recombinant antigen expression. The image cytometry-based high-throughput screening method was optimized by detecting the binding of hybridoma supernatants to the recombinant antigen CD39 expressed on Chinese hamster ovary (CHO) cells. Next, the sensitivity of the image cytometer was demonstrated by serial dilution of purified CD39 antibody. Celigo was used to measure antibody affinities of commercial and in-house antibodies to membrane-bound CD39. This cell-based screening procedure can be completely accomplished within one day, significantly improving throughput and efficiency of hybridoma screening. Furthermore, measuring direct antibody binding to living cells eliminated both false positive and false negative hits. The image cytometry method was highly sensitive and versatile, and could detect positive antibody in supernatants at concentrations as low as ~5 ng/mL, with concurrent Kd binding affinity coefficient determination. We propose that this screening method will greatly facilitate antibody discovery and screening technologies. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Screening for Antifibrotic Compounds Using High Throughput System Based on Fluorescence Polarization

    Directory of Open Access Journals (Sweden)

    Branko Stefanovic

    2014-04-01

    Full Text Available Fibroproliferative diseases are one of the leading causes of death worldwide. They are characterized by reactive fibrosis caused by uncontrolled synthesis of type I collagen. There is no cure for fibrosis, and development of therapeutics that can inhibit collagen synthesis is urgently needed. Collagen α1(I) mRNA and α2(I) mRNA encode type I collagen and have a unique 5' stem-loop structure in their 5' untranslated regions (5'SL). Collagen 5'SL binds the protein LARP6 with high affinity and specificity. The interaction between LARP6 and the 5'SL is critical for biosynthesis of type I collagen and development of fibrosis in vivo. Therefore, this interaction represents an ideal target for developing antifibrotic drugs. A high throughput system to screen for chemical compounds that can dissociate LARP6 from the 5'SL has been developed. It is based on fluorescence polarization and can be adapted to screen for inhibitors of other protein-RNA interactions. Screening of 50,000 chemical compounds yielded a lead compound that can inhibit type I collagen synthesis at nanomolar concentrations. The development, characteristics, and critical appraisal of this assay are presented.

  18. Development of a highly reliable CRT processor

    International Nuclear Information System (INIS)

    Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya

    1996-01-01

    Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)

  19. High throughput diffractive multi-beam femtosecond laser processing using a spatial light modulator

    Energy Technology Data Exchange (ETDEWEB)

    Kuang Zheng [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom)], E-mail: z.kuang@liv.ac.uk; Perrie, Walter [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom); Leach, Jonathan [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom); Sharp, Martin; Edwardson, Stuart P. [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom); Padgett, Miles [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom); Dearden, Geoff; Watkins, Ken G. [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom)

    2008-12-30

    High throughput femtosecond laser processing is demonstrated by creating multiple beams using a spatial light modulator (SLM). The diffractive multi-beam patterns are modulated in real time by computer generated holograms (CGHs), which can be calculated by appropriate algorithms. An interactive LabVIEW program is adopted to generate the relevant CGHs. Optical efficiency at this stage is shown to be ~50% into first order beams and real time processing has been carried out at 50 Hz refresh rate. Results obtained demonstrate high precision surface micro-structuring on silicon and Ti6Al4V with throughput gain >1 order of magnitude.

  20. High-throughput and automatic typing via human papillomavirus identification map for cervical cancer screening and prognosis.

    Science.gov (United States)

    Yi, Linglu; Xu, Xueqin; Lin, Xuexia; Li, Haifang; Ma, Yuan; Lin, Jin-Ming

    2014-07-07

    A novel human papillomavirus (HPV) typing assay for cervical cancer screening and prognosis was developed by combining restriction fragment length polymorphism (RFLP) and microchip electrophoresis (MCE) to achieve higher levels of sensitivity and throughput. The PCR-RFLP-MCE method achieved a detection limit of 2 × 10² copies with high sensitivity and typing accuracy, detecting a 4-fold higher infection rate than cytologic tests. From clinical samples, eleven HPV types were identified with a concordance of over 90%. The described method showed good reliability in clinical samples and provided a promising alternative for pathological studies at the molecular level.

  1. High-throughput, temperature-controlled microchannel acoustophoresis device made with rapid prototyping

    DEFF Research Database (Denmark)

    Adams, Jonathan D; Ebbesen, Christian L.; Barnkob, Rune

    2012-01-01

    -slide format using low-cost, rapid-prototyping techniques. This high-throughput acoustophoresis chip (HTAC) utilizes a temperature-stabilized, standing ultrasonic wave, which imposes differential acoustic radiation forces that can separate particles according to size, density and compressibility. The device...

  2. PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.

    Science.gov (United States)

    Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R

    2017-10-01

    High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old-type specimens, which yield only small amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries on the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.

  3. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing High Throughput Computing applications common in High Energy Physics work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.
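
    The second part's scaling of cloud workers to the load of a batch queue can be illustrated with a short sketch. This is not the thesis code; the function name, jobs-per-worker ratio, and worker bounds below are illustrative assumptions.

    ```python
    # Illustrative sketch only (not the thesis implementation): choose how many
    # cloud workers to run from the current batch-queue load, within fixed bounds.
    import math

    def desired_workers(queued_jobs, running_jobs, jobs_per_worker=8,
                        min_workers=2, max_workers=200):
        """Scale the worker pool to the pending load, bounded above and below."""
        demand = math.ceil((queued_jobs + running_jobs) / jobs_per_worker)
        return max(min_workers, min(max_workers, demand))

    print(desired_workers(queued_jobs=340, running_jobs=60))  # -> 50
    ```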

  4. High-throughput approach to the catalytic combustion of diesel soot

    Energy Technology Data Exchange (ETDEWEB)

    Iojoiu, Eduard Emil; Bassou, Badr; Guilhaume, Nolven; Farrusseng, David; Desmartin-Chomel, Arnold; Bianchi, Daniel; Mirodatos, Claude [Institut de recherches sur la catalyse et l' environnement de Lyon IRCELYON, UMR5256 CNRS Universite Lyon 1, 2 avenue Albert Einstein, F-69626 Villeurbanne Cedex (France); Lombaert, Karine [Renault, Diesel Innovative Catalytic Materials, Direction de l' Ingenierie Materiaux, 1 Allee Cornuel, 91510 Lardy (France)

    2008-08-30

    A methodology for the evaluation of diesel soot oxidation catalysts by high-throughput (HT) screening was developed. The optimal experimental conditions (soot amount, catalyst/soot ratio, type of contact, composition and flow rate of gas reactants) ensuring reliable and reproducible detection of light-off temperatures in a 16-parallel-channel reactor were established. The temperature profile measured in the catalyst/soot bed under TPO conditions, when the exothermic combustion of soot takes place, was shown to provide an accurate measurement of ignition; its reproducibility and relevance were checked. The results obtained with a reference noble-metal-free catalyst (La0.8Cr0.8Li0.2O3 perovskite) agree very well with literature data. Qualitative mechanistic features could be derived from these experiments, pointing to oxygen transfer from the catalyst surface to the soot particulates as the likely rate-limiting step in igniting soot combustion. The ceria material was shown to be more suitable than the perovskite. From an HT screening of a large, diverse library (over 100 mixed oxide catalysts) under optimized conditions, about 10 new formulations were found to perform better than selected noble-metal-free reference materials. (author)

  5. Assessing the utility and limitations of high throughput virtual screening

    Directory of Open Access Journals (Sweden)

    Paul Daniel Phillips

    2016-05-01

    Full Text Available Due to low cost, speed, and unmatched ability to explore large numbers of compounds, high throughput virtual screening and molecular docking engines have become widely utilized by computational scientists. It is generally accepted that docking engines, such as AutoDock, produce reliable qualitative results for ligand-macromolecular receptor binding, and molecular docking results are commonly reported in literature in the absence of complementary wet lab experimental data. In this investigation, three variants of the sixteen amino acid peptide, α-conotoxin MII, were docked to a homology model of the α3β2-nicotinic acetylcholine receptor. DockoMatic version 2.0 was used to perform a virtual screen of each peptide ligand to the receptor for ten docking trials consisting of 100 AutoDock cycles per trial. The results were analyzed for both variation in the calculated binding energy obtained from AutoDock, and the orientation of bound peptide within the receptor. The results show that, while no clear correlation exists between consistent ligand binding pose and the calculated binding energy, AutoDock is able to determine a consistent positioning of bound peptide in the majority of trials when at least ten trials were evaluated.
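
    As an aside on the trial-aggregation step described above, the following sketch shows one way the lowest binding energies from repeated docking trials could be summarized. The ligand labels and energy values are hypothetical; this is not DockoMatic or AutoDock code.

    ```python
    # Hypothetical example: summarize the lowest binding energy found in each of
    # ten independent docking trials per peptide ligand (values are invented).
    import statistics

    trial_best = {
        "MII_variant_A": [-8.1, -7.9, -8.4, -8.0, -7.8, -8.2, -8.3, -7.7, -8.1, -8.0],
        "MII_variant_B": [-6.9, -7.4, -6.5, -7.1, -7.0, -6.8, -7.2, -6.6, -7.3, -6.7],
    }

    for ligand, energies in trial_best.items():
        mean = statistics.fmean(energies)        # average best energy (kcal/mol)
        spread = statistics.stdev(energies)      # trial-to-trial variation
        print(f"{ligand}: {mean:.2f} +/- {spread:.2f} kcal/mol "
              f"over {len(energies)} trials")
    ```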

  6. Simple and rapid preparation of [11C]DASB with high quality and reliability for routine applications

    International Nuclear Information System (INIS)

    Haeusler, D.; Mien, L.-K.; Nics, L.; Ungersboeck, J.; Philippe, C.; Lanzenberger, R.R.; Kletter, K.; Dudczak, R.; Mitterhauser, M.; Wadsak, W.

    2009-01-01

    [11C]DASB combines all major prerequisites for a successful SERT ligand, providing excellent biological properties and in vivo behaviour. Thus, we aimed to establish a fully automated procedure for the synthesis and purification of [11C]DASB with a high degree of reliability, reducing the overall synthesis time while conserving high yields and purity. The optimized [11C]DASB synthesis was applied in more than 60 preparations with a very low failure rate (3.2%). We obtained yields up to 8.9 GBq (average 5.3±1.6 GBq). Radiochemical yields based on [11C]CH3I (corrected for decay) were 66.3±6.9%, with a specific radioactivity (As) of 86.8±24.3 GBq/μmol (both at the end of synthesis, EOS). Time consumption was kept to a minimum, resulting in 43 min from end of bombardment to release of the product after quality control. From our data, it is evident that the presented method can be implemented for routine preparations of [11C]DASB with high reliability.

  7. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  8. High-throughput single-molecule force spectroscopy for membrane proteins

    Science.gov (United States)

    Bosshart, Patrick D.; Casagrande, Fabio; Frederix, Patrick L. T. M.; Ratera, Merce; Bippes, Christian A.; Müller, Daniel J.; Palacin, Manuel; Engel, Andreas; Fotiadis, Dimitrios

    2008-09-01

    Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ~400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ~200 (AdiC) and ~400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications.
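
    The record does not spell out the coarse filter's criteria; the sketch below only illustrates the general idea of such a filter on force-distance curves, with the tether-length window and peak-count threshold chosen as placeholders.

    ```python
    # Hedged sketch of a coarse filter for SMFS force-distance curves; each curve
    # is assumed to be given as arrays of tip-sample distance (nm) and force (pN).
    import numpy as np

    def keep_curve(dist_nm, force_pN, min_len=60.0, max_len=95.0,
                   peak_thresh=50.0, min_peaks=3):
        """Keep curves whose final detachment falls in the expected length window
        and that show at least a minimum number of force peaks."""
        above = force_pN > peak_thresh
        if not above.any():
            return False
        last_peak_dist = dist_nm[above][-1]                      # final adhesion event
        n_peaks = int(np.sum(np.diff(above.astype(int)) == 1))   # crude peak count
        return (min_len <= last_peak_dist <= max_len) and (n_peaks >= min_peaks)

    # Synthetic example: three sawtooth-like peaks, the last one ending near 80 nm.
    d = np.linspace(0, 100, 1000)
    f = 60.0 * ((d > 20) & (d < 25)) + 70.0 * ((d > 50) & (d < 55)) \
        + 80.0 * ((d > 75) & (d < 80))
    print(keep_curve(d, f))  # True
    ```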

  9. High-throughput single-molecule force spectroscopy for membrane proteins

    International Nuclear Information System (INIS)

    Bosshart, Patrick D; Casagrande, Fabio; Frederix, Patrick L T M; Engel, Andreas; Fotiadis, Dimitrios; Ratera, Merce; Palacin, Manuel; Bippes, Christian A; Mueller, Daniel J

    2008-01-01

    Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ∼400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ∼200 (AdiC) and ∼400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications

  10. High-throughput single-molecule force spectroscopy for membrane proteins

    Energy Technology Data Exchange (ETDEWEB)

    Bosshart, Patrick D; Casagrande, Fabio; Frederix, Patrick L T M; Engel, Andreas; Fotiadis, Dimitrios [M E Mueller Institute for Structural Biology, Biozentrum of the University of Basel, CH-4056 Basel (Switzerland); Ratera, Merce; Palacin, Manuel [Institute for Research in Biomedicine, Barcelona Science Park, Department of Biochemistry and Molecular Biology, Faculty of Biology, University of Barcelona and Centro de Investigacion Biomedica en Red de Enfermedades Raras, E-08028 Barcelona (Spain); Bippes, Christian A; Mueller, Daniel J [BioTechnology Center, Technical University, Tatzberg 47, D-01307 Dresden (Germany)], E-mail: andreas.engel@unibas.ch, E-mail: dimitrios.fotiadis@mci.unibe.ch

    2008-09-24

    Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ∼400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ∼200 (AdiC) and ∼400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications.

  11. Alignment of time-resolved data from high throughput experiments.

    Science.gov (United States)

    Abidi, Nada; Franke, Raimo; Findeisen, Peter; Klawonn, Frank

    2016-12-01

    To better understand the dynamics of the underlying processes in cells, it is necessary to take measurements over a time course. Modern high-throughput technologies are often used for this purpose to measure the behavior of cell products like metabolites, peptides, proteins, [Formula: see text]RNA or mRNA at different points in time. Compared to classical time series, the number of time points is usually very limited and the measurements are taken at irregular time intervals. The main reasons for this are the costs of the experiments and the fact that the dynamic behavior usually shows a strong reaction and fast changes shortly after a stimulus and then slowly converges to a certain stable state. Another reason might simply be missing values. It is common to repeat the experiments and to have replicates in order to carry out a more reliable analysis. The ideal assumptions that the initial stimulus really started exactly at the same time for all replicates and that the replicates are perfectly synchronized are seldom satisfied. Therefore, there is a need to first adjust or align the time-resolved data before further analysis is carried out. Dynamic time warping (DTW) is considered one of the common alignment techniques for time series data with equidistant time points. In this paper, we modified the DTW algorithm so that it can align sequences with measurements at different, non-equidistant time points with large gaps in between. This type of data is usually known as time-resolved data, characterized by irregular time intervals between measurements as well as non-identical time points for different replicates. This new algorithm can be easily used to align time-resolved data from high-throughput experiments and to cope with problems such as the scarcity of time points and noise in the measurements. We propose a modified DTW method that adapts to the requirements imposed by time-resolved data through the use of monotone cubic interpolation splines. Our presented approach
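
    As a concrete illustration of the alignment idea (not the authors' implementation), the sketch below resamples two irregularly sampled replicates onto a common grid with monotone cubic (PCHIP) splines and then computes a classic DTW distance; the time points and values are hypothetical.

    ```python
    # Minimal sketch: PCHIP resampling of irregular time courses, then plain DTW.
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    def resample(t, y, grid):
        """Monotone cubic interpolation of one replicate onto a common time grid."""
        return PchipInterpolator(t, y)(grid)

    def dtw(a, b):
        """Classic dynamic time warping distance between two 1-D sequences."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Two hypothetical replicates measured at different, irregular time points.
    t1, y1 = np.array([0, 5, 10, 30, 60, 120]), np.array([0.1, 2.0, 1.6, 1.1, 0.9, 0.8])
    t2, y2 = np.array([0, 8, 15, 45, 90, 120]), np.array([0.2, 1.9, 1.5, 1.0, 0.9, 0.8])
    grid = np.linspace(0, 120, 61)            # common grid spanning both series
    print(dtw(resample(t1, y1, grid), resample(t2, y2, grid)))
    ```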

  12. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic, because computer accuracy is limited. Inaccuracy can arise in different ways. For example, an error may be made when subtracting two numbers that are very close to each other, or in the process of summing many very different numbers, etc. The basic objective of this paper is to find a procedure which eliminates errors made by a PC when calculations close to an error limit are executed. The highly reliable system is represented by a directed acyclic graph composed of terminal nodes (i.e. highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes builds on the merits of MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to a graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure benefits from a special number system with the base represented by the value 2^32. Computational efficiency of the new computing methodology is compared with advanced simulation software. Various calculations on systems from the references are performed to emphasize the merits of the methodology.
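
    The summation pitfall described above is easy to reproduce. The following sketch is not the paper's base-2^32 scheme; it only demonstrates how naive floating-point summation swallows small terms and how an error-free float summation recovers them.

    ```python
    # Demonstration of precision loss when summing many very different numbers.
    import math

    # One large term plus a million tiny ones, as a combinatorial expansion might produce.
    terms = [1.0] + [1e-16] * 1_000_000

    naive = sum(terms)        # each tiny term is rounded away against 1.0
    exact = math.fsum(terms)  # error-free summation of floats

    print(naive)   # 1.0 (the accumulated 1e-10 contribution is lost)
    print(exact)   # 1.0000000001
    ```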

  13. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    Science.gov (United States)

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable, requiring only minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.
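
    For orientation, the shape of such a cluster analysis can be sketched as below; the random presence/absence matrix and the Hamming distance stand in for the RAF marker scores and the genetic-difference measure of the study, so this is illustrative rather than the published analysis.

    ```python
    # Illustrative sketch: average-linkage clustering of individuals scored for
    # presence/absence of 49 markers (data here are random, not real RAF scores).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    markers = rng.integers(0, 2, size=(12, 49))          # 12 individuals x 49 markers
    dist = pdist(markers, metric="hamming")              # fraction of mismatched markers
    tree = linkage(dist, method="average")               # average-linkage dendrogram
    clades = fcluster(tree, t=3, criterion="maxclust")   # cut into three clades
    print(clades)
    ```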

  14. Patterning cell using Si-stencil for high-throughput assay

    KAUST Repository

    Wu, Jinbo

    2011-01-01

    In this communication, we report a newly developed cell patterning methodology using a silicon-based stencil, which exhibited advantages such as easy handling, reusability, a hydrophilic surface and mature fabrication technologies. Cell arrays obtained by this method were used to investigate cell growth under a temperature gradient, which demonstrated the possibility of studying cell behavior in a high-throughput assay. This journal is © The Royal Society of Chemistry 2011.

  15. Fabrication of combinatorial nm-planar electrode array for high throughput evaluation of organic semiconductors

    International Nuclear Information System (INIS)

    Haemori, M.; Edura, T.; Tsutsui, K.; Itaka, K.; Wada, Y.; Koinuma, H.

    2006-01-01

    We have fabricated a combinatorial nm-planar electrode array by using photolithography and chemical mechanical polishing processes for high throughput electrical evaluation of organic devices. Sub-nm precision was achieved with respect to the average level difference between each pair of electrodes and a dielectric layer. The insulating property between the electrodes is high enough to measure I-V characteristics of organic semiconductors. Bottom-contact field-effect-transistors (FETs) of pentacene were fabricated on this electrode array by use of molecular beam epitaxy. It was demonstrated that the array could be used as a pre-patterned device substrate for high throughput screening of the electrical properties of organic semiconductors

  16. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Full Text Available Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in glycosylation regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, chromatographic monoliths enable a very fast analysis cycle, thus significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation and increased the sensitivity approximately three-fold compared to the initial experimental conditions, with very good reproducibility. This work is licensed under a Creative Commons Attribution 4.0 International License.

  17. An UPLC-MS/MS method for highly sensitive high-throughput analysis of phytohormones in plant tissues

    Directory of Open Access Journals (Sweden)

    Balcke Gerd Ulrich

    2012-11-01

    Full Text Available Abstract Background Phytohormones are the key metabolites participating in the regulation of multiple functions of the plant organism. Among them, jasmonates, as well as abscisic and salicylic acids, are responsible for triggering and modulating plant reactions targeted against pathogens and herbivores, as well as resistance to abiotic stress (drought, UV-irradiation and mechanical wounding). These factors induce dramatic changes in phytohormone biosynthesis and transport leading to rapid local and systemic stress responses. Understanding of the underlying mechanisms is of principal interest for scientists working in various areas of plant biology. However, highly sensitive, precise and high-throughput methods for quantification of these phytohormones in small samples of plant tissues are still missing. Results Here we present an LC-MS/MS method for fast and highly sensitive determination of jasmonates, abscisic and salicylic acids. A single-step sample preparation procedure based on mixed-mode solid phase extraction was efficiently combined with essential improvements in mobile phase composition yielding higher efficiency of chromatographic separation and MS-sensitivity. This strategy resulted in a dramatic increase in overall sensitivity, allowing successful determination of phytohormones in small (less than 50 mg of fresh weight) tissue samples. The method was completely validated in terms of analyte recovery, sensitivity, linearity and precision. Additionally, it was cross-validated with a well-established GC-MS-based procedure and its applicability to a variety of plant species and organs was verified. Conclusion The method can be applied for the analyses of target phytohormones in small tissue samples obtained from any plant species and/or plant part relying on any commercially available (even less sensitive) tandem mass spectrometry instrumentation.

  18. An Embedded System for Safe, Secure and Reliable Execution of High Consequence Software

    Energy Technology Data Exchange (ETDEWEB)

    MCCOY,JAMES A.

    2000-08-29

    As more complex and functionally diverse requirements are placed on high consequence embedded applications, ensuring safe and secure operation requires an execution environment that is ultra reliable from a system viewpoint. In many cases the safety and security of the system depends upon the reliable cooperation between the hardware and the software to meet real-time system throughput requirements. The selection of a microprocessor and its associated development environment for an embedded application has the most far-reaching effects on the development and production of the system than any other element in the design. The effects of this choice ripple through the remainder of the hardware design and profoundly affect the entire software development process. While state-of-the-art software engineering principles indicate that an object oriented (OO) methodology provides a superior development environment, traditional programming languages available for microprocessors targeted for deeply embedded applications do not directly support OO techniques. Furthermore, the microprocessors themselves do not typically support nor do they enforce an OO environment. This paper describes a system level approach for the design of a microprocessor intended for use in deeply embedded high consequence applications that both supports and enforces an OO execution environment.

  19. Caveats and limitations of plate reader-based high-throughput kinetic measurements of intracellular calcium levels

    International Nuclear Information System (INIS)

    Heusinkveld, Harm J.; Westerink, Remco H.S.

    2011-01-01

    Calcium plays a crucial role in virtually all cellular processes, including neurotransmission. The intracellular Ca2+ concentration ([Ca2+]i) is therefore an important readout in neurotoxicological and neuropharmacological studies. Consequently, there is an increasing demand for high-throughput measurements of [Ca2+]i, e.g. using multi-well microplate readers, in hazard characterization, human risk assessment and drug development. However, changes in [Ca2+]i are highly dynamic, thereby creating challenges for high-throughput measurements. Nonetheless, several protocols are now available for real-time kinetic measurement of [Ca2+]i in plate reader systems, though the results of such plate reader-based measurements have been questioned. In view of the increasing use of plate reader systems for measurements of [Ca2+]i, a careful evaluation of current technologies is warranted. We therefore performed an extensive set of experiments, using two cell lines (PC12 and B35) and two fluorescent calcium-sensitive dyes (Fluo-4 and Fura-2), for comparison of a linear plate reader system with single cell fluorescence microscopy. Our data demonstrate that the use of plate reader systems for high-throughput real-time kinetic measurements of [Ca2+]i is associated with many pitfalls and limitations, including erroneous sustained increases in fluorescence, limited sensitivity and lack of single cell resolution. Additionally, our data demonstrate that probenecid, which is often used to prevent dye leakage, effectively inhibits the depolarization-evoked increase in [Ca2+]i. Overall, the data indicate that the use of current plate reader-based strategies for high-throughput real-time kinetic measurements of [Ca2+]i is associated with caveats and limitations that require further investigation. - Research highlights: → The use of plate readers for high-throughput screening of intracellular Ca2+ is associated with many pitfalls and limitations. → Single cell

  20. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative with regard to biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort of maintaining high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, helped attract new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  1. A high content, high throughput cellular thermal stability assay for measuring drug-target engagement in living cells.

    Science.gov (United States)

    Massey, Andrew J

    2018-01-01

    Determining and understanding drug target engagement is critical for drug discovery. This can be challenging within living cells as selective readouts are often unavailable. Here we describe a novel method for measuring target engagement in living cells based on the principle of altered protein thermal stabilization/destabilization in response to ligand binding. This assay (HCIF-CETSA) utilizes high content, high throughput single cell immunofluorescent detection to determine target protein levels following heating of adherent cells in a 96 well plate format. We have used target engagement of Chk1 by potent small molecule inhibitors to validate the assay. Target engagement measured by this method was subsequently compared to target engagement measured by two alternative methods (autophosphorylation and CETSA). The HCIF-CETSA method appeared robust, and a good correlation was observed between target engagement measured by this method and by CETSA for the selective Chk1 inhibitor V158411. However, these EC50 values were 23- and 12-fold greater than the autophosphorylation IC50. The described method is therefore a valuable advance on the CETSA method, allowing the high throughput determination of target engagement in adherent cells.
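
    As an illustration of how EC50 values like those quoted above can be extracted from concentration-response data, the sketch below fits a four-parameter logistic curve; the concentrations and responses are hypothetical and this is not the assay's analysis code.

    ```python
    # Hedged sketch: four-parameter logistic fit of fraction-stabilized target
    # versus inhibitor concentration, yielding an EC50 estimate.
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, bottom, top, ec50, hill):
        return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

    conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])      # uM, hypothetical
    resp = np.array([0.05, 0.08, 0.22, 0.48, 0.75, 0.92, 0.97])  # fraction stabilized

    popt, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 1.0, 0.3, 1.0])
    print(f"EC50 ~ {popt[2]:.2f} uM")
    ```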

  2. Transferring Aviation Practices into Clinical Medicine for the Promotion of High Reliability.

    Science.gov (United States)

    Powell-Dunford, Nicole; McPherson, Mark K; Pina, Joseph S; Gaydos, Steven J

    2017-05-01

    Aviation is a classic example of a high reliability organization (HRO), an organization in which catastrophic events are expected to occur without control measures. As health care systems transition toward high reliability, aviation practices are increasingly transferred for clinical implementation. A PubMed search using the terms aviation, crew resource management, and patient safety was undertaken. Manuscripts authored by physician pilots and accident investigation regulations were analyzed. Subject matter experts involved in adoption of aviation practices into the medical field were interviewed. A PubMed search yielded 621 results with 22 relevant for inclusion. Improved clinical outcomes were noted in five research trials in which aviation practices were adopted, particularly with regard to checklist usage and crew resource-management training. Effectiveness of interventions was influenced by intensity of application, leadership involvement, and provision of staff training. The usefulness of incorporating mishap investigation techniques has not been established. Whereas aviation accident investigation is highly standardized, the investigation of medical error is characterized by variation. The adoption of aviation practices into clinical medicine facilitates an evolution toward high reliability. Evidence for the efficacy of the checklist and crew resource-management training is robust. Transference of aviation accident investigation practices is preliminary. A standardized, independent investigation process could facilitate the development of a safety culture commensurate with that achieved in the aviation industry. Powell-Dunford N, McPherson MK, Pina JS, Gaydos SJ. Transferring aviation practices into clinical medicine for the promotion of high reliability. Aerosp Med Hum Perform. 2017; 88(5):487-491.

  3. High-throughput investigation of polymerization kinetics by online monitoring of GPC and GC

    NARCIS (Netherlands)

    Hoogenboom, R.; Fijten, M.W.M.; Abeln, C.H.; Schubert, U.S.

    2004-01-01

    Gel permeation chromatography (GPC) and gas chromatography (GC) were successfully introduced into a high-throughput workflow. The feasibility and limitations of online GPC with a high-speed column were evaluated by measuring polystyrene standards and comparing the results with regular offline GPC

  4. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  5. High-throughput diagnosis of potato cyst nematodes in soil samples.

    Science.gov (United States)

    Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon

    2015-01-01

    Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.

  6. A multilayer microdevice for cell-based high-throughput drug screening

    International Nuclear Information System (INIS)

    Liu, Chong; Wang, Lei; Li, Jingmin; Ding, Xiping; Chunyu, Li; Xu, Zheng; Wang, Qi

    2012-01-01

    A multilayer polydimethylsiloxane microdevice for cell-based high-throughput drug screening is described in this paper. This microdevice was based on a modularization method and integrated a drug/medium concentration gradient generator (CGG), pneumatic microvalves and a cell culture microchamber array. The CGG was able to generate five steps of linear concentrations with the same outlet flow rate. The medium/drug flowed through the CGG and then vertically into the pear-shaped cell culture microchambers. This vertical perfusion mode was used to reduce the impact on cell physiology of the shear stress induced by fluid flow in the microchambers. Pear-shaped microchambers with two arrays of micropillars at each outlet were adopted in this microdevice, which were beneficial to cell distribution. Apoptosis experiments in which the Cisplatin-resistant cell line A549/DDP was treated with the chemotherapeutic Cisplatin (DDP) were performed successfully on this platform. The results showed that this novel microdevice could not only provide well-defined and stable conditions for cell culture, but was also useful for cell-based high-throughput drug screening with lower reagent and time consumption. (paper)
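
    Assuming ideal mixing and equal outlet flow rates (an assumption for illustration, not a model of this particular device), the five linear concentration steps such a gradient generator produces can be written down directly:

    ```python
    # Minimal sketch: linearly spaced outlet concentrations of an ideal CGG.
    import numpy as np

    def cgg_outlet_concentrations(c0, n_outlets=5):
        """Linear concentration steps from 0 to the stock concentration c0."""
        return np.linspace(0.0, c0, n_outlets)

    print(cgg_outlet_concentrations(100.0))   # [  0.  25.  50.  75. 100.]
    ```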

  7. SERS Substrates by the Assembly of Silver Nanocubes: High-Throughput and Enhancement Reliability Considerations

    International Nuclear Information System (INIS)

    Rabin, O.; Lee, S.Y.; Rabin, O.

    2012-01-01

    Small clusters of nanoparticles are ideal substrates for SERS measurements, but the SERS signal enhancement by a particular cluster is strongly dependent on its structural characteristics and the measurement conditions. Two methods for high-throughput assembly of silver nanocubes into small clusters at predetermined locations on a substrate are presented. These fabrication techniques make it possible to study both the structure and the plasmonic properties of hundreds of nanoparticle clusters. The variations in SERS enhancement factors from cluster to cluster were analyzed and correlated with cluster size and configuration, and laser frequency and polarization. Using Raman instruments with 633 nm and 785 nm lasers and linear clusters of nanocubes, an increase in the reproducibility of the enhancement and an increase in the average enhancement values were achieved by increasing the number of nanocubes in the cluster, up to 4 nanocubes per cluster. By examining the effect of cluster configuration, it is shown that linear clusters with nanocubes attached in a face-to-face configuration are not as effective SERS substrates as linear clusters in which nanocubes are attached along an edge.

  8. Correction of Microplate Data from High-Throughput Screening.

    Science.gov (United States)

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noise in the microplate-formatted data from HTS have unique characteristics, and they can be generally grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noise, based mainly on pattern recognition and digital signal processing techniques.
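
    As one hedged example of plate-wise correction (the chapter's actual pipeline, based on pattern recognition and digital signal processing, is not reproduced here), a simple two-way median polish can strip additive row and column background patterns from a single plate:

    ```python
    # Hedged sketch: remove row/column background patterns from a 96-well plate.
    import numpy as np

    def median_polish(plate, n_iter=10):
        """Iteratively subtract row and column medians, returning residuals."""
        resid = plate.astype(float).copy()
        for _ in range(n_iter):
            resid -= np.median(resid, axis=1, keepdims=True)   # row effects
            resid -= np.median(resid, axis=0, keepdims=True)   # column effects
        return resid

    rng = np.random.default_rng(0)
    plate = rng.normal(100, 5, size=(8, 12))      # simulated raw 96-well signals
    plate[:, 0] += 30                             # simulated edge/column artifact
    corrected = median_polish(plate)
    print(round(float(corrected[:, 0].mean()), 2))   # column bias largely removed
    ```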

  9. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

    Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save lives. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for the medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed to demonstrate high-throughput microfluidic micro-incubation, supporting the logistics of transporting samples in miniaturized incubators from the accident site to analytical laboratories. Efforts have been made both to develop concepts and advanced systems for higher-throughput sample processing and to implement better, more efficient logistics leading to lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation and inclusion of multi-parametric biomarker approaches will provide the first generation of high-throughput platform systems for effective medical management, particularly during radiation mass casualty events.

  10. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, establishing it as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have developed a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
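
    To sketch what the orientation-quantification step could look like (purely illustrative, not the authors' image-analysis code), body-axis angles can be estimated from the second moments of each detected fish's binary mask and compared with the flow direction; the 30-degree tolerance and the neglect of head/tail polarity are simplifying assumptions.

    ```python
    # Hypothetical sketch: body-axis orientation from a binary mask, and a
    # rheotaxis score as the fraction of fish aligned with the flow axis.
    import numpy as np

    def orientation_deg(mask):
        """Major-axis orientation (degrees, 0-180) of a binary object mask."""
        ys, xs = np.nonzero(mask)
        x, y = xs - xs.mean(), ys - ys.mean()
        cov = np.cov(np.stack([x, y]))
        evals, evecs = np.linalg.eigh(cov)
        major = evecs[:, np.argmax(evals)]         # direction of largest variance
        return np.degrees(np.arctan2(major[1], major[0])) % 180.0

    def rheotaxis_fraction(masks, flow_angle_deg=0.0, tol_deg=30.0):
        """Fraction of fish whose body axis lies within tol of the flow axis."""
        angles = np.array([orientation_deg(m) for m in masks])
        diff = np.abs((angles - flow_angle_deg + 90.0) % 180.0 - 90.0)
        return float(np.mean(diff <= tol_deg))

    # Example: a horizontal bar-shaped "fish" aligned with flow along x.
    mask = np.zeros((20, 60), dtype=bool)
    mask[9:11, 10:50] = True
    print(orientation_deg(mask), rheotaxis_fraction([mask]))   # ~0.0 degrees, 1.0
    ```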

  11. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  12. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    Science.gov (United States)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  13. High-throughput machining using a high-average power ultrashort pulse laser and high-speed polygon scanner

    Science.gov (United States)

    Schille, Joerg; Schneider, Lutz; Streek, André; Kloetzer, Sascha; Loeschner, Udo

    2016-09-01

    High-throughput ultrashort pulse laser machining is investigated on various industrial grade metals (aluminum, copper, and stainless steel) and Al2O3 ceramic at unprecedented processing speeds. This is achieved by using a high-average power picosecond laser in conjunction with a unique, in-house developed polygon mirror-based biaxial scanning system. To this end, different concepts of polygon scanners are engineered and tested to find the best architecture for high-speed and precision laser beam scanning. In order to identify the optimum conditions for efficient processing when using high average laser powers, the depths of cavities made in the samples by varying the processing parameter settings are analyzed and, from the results obtained, the characteristic removal values are specified. For overlapping pulses of optimum fluence and a laser beam with 187 W average power, the removal rate is as high as 27.8 mm3/min for aluminum, 21.4 mm3/min for copper, 15.3 mm3/min for stainless steel, and 129.1 mm3/min for Al2O3. On stainless steel, it is demonstrated that the removal rate increases to 23.3 mm3/min when the laser beam is moved very fast. This is due to the low pulse overlap achieved at a beam deflection speed of 800 m/s; thus, shielding of the laser beam can be avoided even when irradiating highly repetitive 20-MHz pulses.

  14. High-yield pulping effluent treatment technologies

    International Nuclear Information System (INIS)

    Su, W.X.; Hsieh, J.S.

    1993-03-01

    The objective of this report is to examine the high-yield (mechanical) pulp processes with respect to environmental issues affected by the discharge of their waste streams. Various statistics are given that support the view that high-yield pulping processes will have major growth in the US regions where pulp mills are located, and sites for projects in the development phase are indicated. Conventional and innovative effluent-treatment technologies applicable to these processes are reviewed. The different types of mechanical pulping or high-yield processes are explained, and the chemical additives are discussed. The important relationship between pulp yield and measure of BOD in the effluent is graphically presented. Effluent contaminants are identified, along with other important characteristics of the streams. Current and proposed environmental limitations specifically related to mechanical pulp production are reviewed. Conventional and innovative effluent-treatment technologies are discussed, along with their principle applications, uses, advantages, and disadvantages. Sludge management and disposal techniques become an intimate part of the treatment of waste streams. The conclusion is made that conventional technologies can successfully treat effluent streams under current waste-water discharge limitations, but these systems may not be adequate when stricter standards are imposed. At present, the most important issue in the treatment of pulp-mill waste is the management and disposal of the resultant sludge

  15. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Here we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  16. High pressure, high current, low inductance, high reliability sealed terminals

    Science.gov (United States)

    Hsu, John S [Oak Ridge, TN; McKeever, John W [Oak Ridge, TN

    2010-03-23

    The invention is a terminal assembly having a casing with at least one delivery tapered-cone conductor and at least one return tapered-cone conductor routed there-through. The delivery and return tapered-cone conductors are electrically isolated from each other and positioned in the annuluses of ordered concentric cones at an off-normal angle. The tapered-cone conductors can serve as AC phase conductors and DC link conductors. The center core has at least one service conduit of gate signal leads, diagnostic signal wires, and refrigerant tubing routed there-through. A seal material is in direct contact with the casing inner surface, the tapered-cone conductors, and the service conduits, thereby hermetically filling the interstitial space in the casing interior core and center core. The assembly provides simultaneous high-current, high-pressure, low-inductance, and high-reliability service.

  17. 40 CFR Table 3 to Subpart Eeee of... - Operating Limits-High Throughput Transfer Racks

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Operating Limits-High Throughput Transfer Racks 3 Table 3 to Subpart EEEE of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION... Throughput Transfer Racks As stated in § 63.2346(e), you must comply with the operating limits for existing...

  18. Luminex and other multiplex high throughput technologies for the identification of, and host response to, environmental triggers of type 1 diabetes.

    Science.gov (United States)

    Purohit, Sharad; Sharma, Ashok; She, Jin-Xiong

    2015-01-01

    Complex interactions between a series of environmental factors and genes result in progression to clinical type 1 diabetes in genetically susceptible individuals. Despite several decades of research in the area, these interactions remain poorly understood. Several studies have yielded associations of certain foods, infections, and immunizations with the onset and progression of diabetes autoimmunity, but most findings are still inconclusive. Environmental triggers are difficult to identify mainly due to (i) the large number and complex nature of environmental exposures, including bacteria, viruses, dietary factors, and environmental pollutants, (ii) reliance on low-throughput technology, (iii) limited effort in quantifying the host response, (iv) the long silent period between exposure and clinical onset of T1D, which may lead to loss of the exposure fingerprints, and (v) limited sample sets. Recent developments in multiplex technologies have enabled systematic evaluation of different classes of molecules or macroparticles in a high-throughput manner. However, the use of multiplex assays in type 1 diabetes research has been limited to cytokine assays. In this review, we discuss the potential use of multiplex high-throughput technologies in the identification of environmental triggers and the host response in type 1 diabetes.

  19. Multiplex and high-throughput DNA detection using surface plasmon mediated fluorescence

    Science.gov (United States)

    Mei, Zhong

    The overall objective of this research project was to develop a user-friendly and sensitive biosensor for nucleic acid aptamers with multiplexing and high-throughput capability. The sensing was based on the fluorescence signals emitted by fluorophores coupling with plasmonic nanoparticles (gold nanorods) deposited on a patterned substrate. Gold nanorods (GNRs) were synthesized using a binary mixture of hexadecyltrimethylammonium bromide (CTAB) and sodium oleate (NaOL) in a seed-mediated growth method. Polytetrafluoroethylene (PTFE) printed glass slides were selectively coated with a gold thin-film to define hydrophilic areas for GNR deposition. Due to the wettability contrast, GNR solution dropped on the slide was induced to assemble exclusively in the hydrophilic spots. By controlling the temperature and humidity of the evaporation process, vertically-standing GNR arrays were achieved on the patterned slide. Fluorophores were conjugated to the GNR surface via DNA double strands of tunable length. Theoretical simulation predicted a flat layer (about 30 nm thick) of uniform "hot spots" present on the GNR tips, which could modify nearby fluorescence. Experimentally, the vertical GNR arrays yielded a metal-enhanced fluorescence (MEF) effect, which was dependent on the spectral overlap and the GNR-fluorophore distance. Specifically, the maximum enhancement of Quasar 670 and Alexa 750 was observed when each was coupled with GNR664 (plasmonic wavelength 664 nm) and GNR778, respectively, at a distance of 16 nm, while carboxyfluorescein (FAM) was at maximal intensity when attached to gold nanosphere520. This offers an opportunity for multiplexed DNA sensing. Based on this, we developed a novel GNR-mediated fluorescence biosensor for DNA detection. Fluorescence-labeled hairpin-DNA probes were introduced to designated spots of the GNR array with matching LSPR wavelengths on the substrate. The fluorescence was originally quenched because of the Förster resonance energy transfer (FRET) effect

  20. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze large sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be accurate and rapid. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative and reproducible manner. The study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates reproducibility on a real sample. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The results show that the online system, capable of handling samples in a high-throughput 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear, more linear increase in peptide quantities with increasing concentration compared to the off-line method. Hence, inclusion of this online system in a proteomics pipeline should be effective for protein quantification in comparative proteomics, where quantification is crucial.

  1. Application of high-throughput sequencing in understanding human oral microbiome related with health and disease

    OpenAIRE

    Chen, Hui; Jiang, Wen

    2014-01-01

    The oral microbiome is one of the most diverse habitats in the human body and is closely related to oral health and disease. As sequencing techniques have developed, high-throughput sequencing has become a popular approach for oral microbial analysis. Oral bacterial profiles have been studied to explore the relationship between microbial diversity and oral diseases such as caries and periodontal disease. This review describes the application of high-throughput sequencing for characterizati...

  2. Rapid and high-throughput detection of highly pathogenic bacteria by Ibis PLEX-ID technology.

    Directory of Open Access Journals (Sweden)

    Daniela Jacob

    Full Text Available In this manuscript, we describe the identification of highly pathogenic bacteria using an assay coupling biothreat group-specific PCR with electrospray ionization mass spectrometry (PCR/ESI-MS run on an Ibis PLEX-ID high-throughput platform. The biothreat cluster assay identifies most of the potential bioterrorism-relevant microorganisms including Bacillus anthracis, Francisella tularensis, Yersinia pestis, Burkholderia mallei and pseudomallei, Brucella species, and Coxiella burnetii. DNA from 45 different reference materials with different formulations and different concentrations were chosen and sent to a service screening laboratory that uses the PCR/ESI-MS platform to provide a microbial identification service. The standard reference materials were produced out of a repository built up in the framework of the EU funded project "Establishment of Quality Assurances for Detection of Highly Pathogenic Bacteria of Potential Bioterrorism Risk" (EQADeBa. All samples were correctly identified at least to the genus level.

  3. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
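
    The stability, robustness and maintainability concerns highlighted above are typically handled by wrapping each workflow step in retry and logging logic. The Python sketch below illustrates that idea only; the step names and commands are placeholders, not part of the published exome workflow.

```python
import logging
import subprocess
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")


def run_step(name, cmd, max_retries=3, wait_seconds=60):
    """Run one workflow step as a shell command, retrying on transient failures."""
    for attempt in range(1, max_retries + 1):
        logging.info("step %s: attempt %d/%d", name, attempt, max_retries)
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            logging.info("step %s: finished successfully", name)
            return result.stdout
        logging.warning("step %s failed (rc=%d): %s", name, result.returncode, result.stderr.strip())
        time.sleep(wait_seconds)  # back off before retrying, e.g. after a transient filesystem hiccup
    raise RuntimeError(f"step {name} failed after {max_retries} attempts")


if __name__ == "__main__":
    # Placeholder commands standing in for alignment/variant-calling steps of an exome pipeline.
    run_step("align_sample", ["echo", "aligning reads"])
    run_step("call_variants", ["echo", "calling variants"])
```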

  4. High-throughput screening of tick-borne pathogens in Europe

    DEFF Research Database (Denmark)

    Michelet, Lorraine; Delannoy, Sabine; Devillers, Elodie

    2014-01-01

    was conducted on 7050 Ixodes ricinus nymphs collected from France, Denmark, and the Netherlands using a powerful new high-throughput approach. This advanced methodology permitted the simultaneous detection of 25 bacterial and 12 parasitic species (including the Borrelia, Anaplasma, Ehrlichia, Rickettsia..., Bartonella, Candidatus Neoehrlichia, Coxiella, Francisella, Babesia, and Theileria genera) across 94 samples. We successfully determined the prevalence of expected (Borrelia burgdorferi sensu lato, Anaplasma phagocytophilum, Rickettsia helvetica, Candidatus Neoehrlichia mikurensis, Babesia divergens, Babesia...

  5. High Throughput Single-cell and Multiple-cell Micro-encapsulation

    OpenAIRE

    Lagus, Todd P.; Edd, Jon F.

    2012-01-01

    Microfluidic encapsulation methods have been previously utilized to capture cells in picoliter-scale aqueous, monodisperse drops, providing confinement from a bulk fluid environment with applications in high throughput screening, cytometry, and mass spectrometry. We describe a method to not only encapsulate single cells, but to repeatedly capture a set number of cells (here we demonstrate one- and two-cell encapsulation) to study both isolation and the interactions between cells in groups of ...

  6. Contribution to high voltage matrix switches reliability

    International Nuclear Information System (INIS)

    Lausenaz, Yvan

    2000-01-01

    Requirements on power electronic equipment are now demanding in terms of performance, quality and reliability, while costs must be reduced to remain competitive. To deliver low cost, reliability and performance, many standard components are mass-produced. The construction of specific products can then be approached in two ways: either produce custom components, with attendant delays, cost overruns, and potential quality and reliability problems, or use standard components in adapted topologies. The CEA at Pierrelatte has adopted the latter approach for the development of its high voltage pulsed power converters. The technique consists of using standard components and associating them in series and in parallel. The matrix constitutes a high voltage macro-switch in which the electrical parameters are distributed among the synchronized components. This study deals with the reliability of these structures and brings out the high reliability of MOSFET matrix associations. Several in-house test facilities provided extensive data on the components used. Understanding the defect propagation mechanisms in matrix structures allowed us to establish the need for a robust drive system, suitable clamping voltage protection, and careful geometrical construction. These reliability considerations led to the construction of a new matrix structure combining all of the solutions that ensure reliability. Reliable and robust, this product has already reached the industrial stage. (author) [fr

  7. The search for new amphiphiles: synthesis of a modular, high-throughput library

    Directory of Open Access Journals (Sweden)

    George C. Feast

    2014-07-01

    Full Text Available Amphiphilic compounds are used in a variety of applications due to their lyotropic liquid-crystalline phase formation, however only a limited number of compounds, in a potentially limitless field, are currently in use. A library of organic amphiphilic compounds was synthesised consisting of glucose, galactose, lactose, xylose and mannose head groups and double and triple-chain hydrophobic tails. A modular, high-throughput approach was developed, whereby head and tail components were conjugated using the copper-catalysed azide–alkyne cycloaddition (CuAAC) reaction. The tails were synthesised from two core alkyne-tethered intermediates, which were subsequently functionalised with hydrocarbon chains varying in length and degree of unsaturation and branching, while the five sugar head groups were selected with ranging substitution patterns and anomeric linkages. A library of 80 amphiphiles was subsequently produced, using a 24-vial array, with the majority formed in very good to excellent yields. A preliminary assessment of the liquid-crystalline phase behaviour is also presented.

  8. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
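
    As a sketch of the batch-processing and pipelining idea described above, several traits can be evaluated in parallel with a process pool so that total wall-clock time approaches that of the slowest single trait. The trait names, toy ridge-regression predictor and random data below are illustrative assumptions, not the authors' code.

```python
from concurrent.futures import ProcessPoolExecutor

import numpy as np


def evaluate_trait(args):
    """Toy genomic prediction for one trait: ridge-regression-like estimate of marker effects."""
    trait, genotypes, phenotypes, lam = args
    # Closed-form ridge solution: beta = (X'X + lambda * I)^-1 X'y
    xtx = genotypes.T @ genotypes
    beta = np.linalg.solve(xtx + lam * np.eye(xtx.shape[0]), genotypes.T @ phenotypes)
    breeding_values = genotypes @ beta
    return trait, breeding_values


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    genotypes = rng.integers(0, 3, size=(500, 1000)).astype(float)  # 500 individuals x 1000 markers
    traits = {t: rng.normal(size=500) for t in ["milk_yield", "fertility", "longevity"]}

    jobs = [(t, genotypes, y, 10.0) for t, y in traits.items()]
    with ProcessPoolExecutor() as pool:  # one worker per CPU core by default
        for trait, gebv in pool.map(evaluate_trait, jobs):
            print(trait, gebv[:3])
```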

  9. A cell-based high-throughput screening assay for radiation susceptibility using automated cell counting

    International Nuclear Information System (INIS)

    Hodzic, Jasmina; Dingjan, Ilse; Maas, Mariëlle JP; Meulen-Muileman, Ida H van der; Menezes, Renee X de; Heukelom, Stan; Verheij, Marcel; Gerritsen, Winald R; Geldof, Albert A; Triest, Baukelien van; Beusechem, Victor W van

    2015-01-01

    Radiotherapy is one of the mainstays in the treatment for cancer, but its success can be limited due to inherent or acquired resistance. Mechanisms underlying radioresistance in various cancers are poorly understood and available radiosensitizers have shown only modest clinical benefit. There is thus a need to identify new targets and drugs for more effective sensitization of cancer cells to irradiation. Compound and RNA interference high-throughput screening technologies allow comprehensive enterprises to identify new agents and targets for radiosensitization. However, the gold standard assay to investigate radiosensitivity of cancer cells in vitro, the colony formation assay (CFA), is unsuitable for high-throughput screening. We developed a new high-throughput screening method for determining radiation susceptibility. Fast and uniform irradiation of batches up to 30 microplates was achieved using a Perspex container and a clinically employed linear accelerator. The readout was done by automated counting of fluorescently stained nuclei using the Acumen eX3 laser scanning cytometer. Assay performance was compared to that of the CFA and the CellTiter-Blue homogeneous uniform-well cell viability assay. The assay was validated in a whole-genome siRNA library screening setting using PC-3 prostate cancer cells. On 4 different cancer cell lines, the automated cell counting assay produced radiation dose response curves that followed a linear-quadratic equation and that exhibited a better correlation to the results of the CFA than did the cell viability assay. Moreover, the cell counting assay could be used to detect radiosensitization by silencing DNA-PKcs or by adding caffeine. In a high-throughput screening setting, using 4 Gy irradiated and control PC-3 cells, the effects of DNA-PKcs siRNA and non-targeting control siRNA could be clearly discriminated. We developed a simple assay for radiation susceptibility that can be used for high-throughput screening. This will aid
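
    For reference, the linear-quadratic equation mentioned above has the standard form shown below (written here for surviving fraction; in this assay it is fitted to relative cell number rather than clonogenic survival).

```latex
% Standard linear-quadratic dose-response model
\[
  SF(D) = \exp\!\left(-\alpha D - \beta D^{2}\right)
\]
% D is the radiation dose in Gy; alpha and beta are fitted radiosensitivity parameters,
% and the alpha/beta ratio describes the curvature of the dose-response curve.
```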

  10. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
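
    A minimal sketch of the superpixel-plus-Random-Forest segmentation idea described above, assuming scikit-image and scikit-learn and a deliberately simple feature set (mean colour per superpixel); the published pipeline's actual features, training data and parameters are not reproduced here. Plant area over time can then be tracked as the pixel count of the predicted mask per image.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier


def superpixel_features(image, segments):
    """Mean RGB colour of each superpixel (a deliberately simple feature set)."""
    labels = np.unique(segments)
    feats = np.array([image[segments == lab].mean(axis=0) for lab in labels])
    return labels, feats


def train_plant_classifier(image, segments, plant_mask):
    """Label each superpixel as plant/background from a hand-made mask and fit a Random Forest."""
    labels, feats = superpixel_features(image, segments)
    target = np.array([plant_mask[segments == lab].mean() > 0.5 for lab in labels])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(feats, target)
    return clf


def segment_plant(image, clf, n_segments=400):
    """Predict a plant/background mask for a new image, superpixel by superpixel."""
    segments = slic(image, n_segments=n_segments, compactness=10)
    labels, feats = superpixel_features(image, segments)
    predictions = clf.predict(feats)
    mask = np.zeros(segments.shape, dtype=bool)
    for lab, is_plant in zip(labels, predictions):
        mask[segments == lab] = is_plant
    return mask
```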

  11. High throughput nanoimprint lithography for semiconductor memory applications

    Science.gov (United States)

    Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun

    2017-03-01

    Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low viscosity resist on a field by field basis using jetting technology. A patterned mask is lowered into the resist fluid which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using a similar approach to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four station cluster system designed for high volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure time and mask/wafer separation are well understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to only 1.1 seconds. There are several parameters that can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and underlying adhesion layer). In addition, it is mandatory to maintain fast filling, even for edge field imprinting. In this paper, we address the improvements made in all of these parameters to first enable a 1.20 second filling process for a device-like pattern and have demonstrated this capability for both full fields and edge fields. Non
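
    The throughput targets above can be checked against a simple time budget; the sketch below is hypothetical in that the number of fields per wafer and the per-wafer overhead are not given in the record.

```latex
% Single-station time budget (sketch; N_fields and t_overhead are not specified in the record)
\[
  t_{\mathrm{wafer}} = t_{\mathrm{overhead}} + N_{\mathrm{fields}}\,
    \bigl(t_{\mathrm{dispense}} + t_{\mathrm{fill}} + t_{\mathrm{expose}} + t_{\mathrm{separate}}\bigr),
  \qquad
  \mathrm{throughput} = \frac{3600\ \mathrm{s/h}}{t_{\mathrm{wafer}}}
\]
% 17 wph corresponds to t_wafer of about 212 s and 20 wph to 180 s, which is why trimming
% roughly 0.1 s of fill time per field matters at typical field counts.
```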

  12. Approaches to achieve high grain yield and high resource use efficiency in rice

    Directory of Open Access Journals (Sweden)

    Jianchang YANG

    2015-06-01

    Full Text Available This article discusses approaches to simultaneously increase grain yield and resource use efficiency in rice. Breeding nitrogen efficient cultivars without sacrificing rice yield potential, improving grain fill in later-flowering inferior spikelets and enhancing harvest index are three important approaches to achieving the dual goal of high grain yield and high resource use efficiency. Deeper root distribution and higher leaf photosynthetic N use efficiency at lower N rates could be used as selection criteria to develop N-efficient cultivars. Enhancing sink activity through increasing sugar-spikelet ratio at the heading time and enhancing the conversion efficiency from sucrose to starch though increasing the ratio of abscisic acid to ethylene in grains during grain fill could effectively improve grain fill in inferior spikelets. Several practices, such as post-anthesis controlled soil drying, an alternate wetting and moderate soil drying regime during the whole growing season, and non-flooded straw mulching cultivation, could substantially increase grain yield and water use efficiency, mainly via enhanced remobilization of stored carbon from vegetative tissues to grains and improved harvest index. Further research is needed to understand synergistic interaction between water and N on crop and soil and the mechanism underlying high resource use efficiency in high-yielding rice.

  13. High-throughput selection for cellulase catalysts using chemical complementation.

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  14. A family of E. coli expression vectors for laboratory scale and high throughput soluble protein production

    Directory of Open Access Journals (Sweden)

    Bottomley Stephen P

    2006-03-01

    Full Text Available Abstract Background In the past few years, both automated and manual high-throughput protein expression and purification has become an accessible means to rapidly screen and produce soluble proteins for structural and functional studies. However, many of the commercial vectors encoding different solubility tags require different cloning and purification steps for each vector, considerably slowing down expression screening. We have developed a set of E. coli expression vectors with different solubility tags that allow for parallel cloning from a single PCR product and can be purified using the same protocol. Results The set of E. coli expression vectors encodes either a hexa-histidine tag or one of the three most commonly used solubility tags (GST, MBP, NusA), all with an N-terminal hexa-histidine sequence. The result is two-fold: the His-tag facilitates purification by immobilised metal affinity chromatography, whilst the fusion domains act primarily as solubility aids during expression, in addition to providing an optional purification step. We have also incorporated a TEV recognition sequence following the solubility tag domain, which allows for highly specific cleavage (using TEV protease) of the fusion protein to yield native protein. These vectors are also designed for ligation-independent cloning and they possess a high-level expressing T7 promoter, which is suitable for auto-induction. To validate our vector system, we have cloned four different genes, and also cloned one gene into all four vectors, and used small-scale expression and purification techniques. We demonstrate that the vectors are capable of high levels of expression and that efficient screening of new proteins can be readily achieved at the laboratory level. Conclusion The result is a set of four rationally designed vectors, which can be used for streamlined cloning, expression and purification of target proteins in the laboratory and have the potential for being adaptable to a high-throughput

  15. An Automated High Performance Capillary Liquid Chromatography Fourier Transform Ion Cyclotron Resonance Mass Spectrometer for High-Throughput Proteomics

    International Nuclear Information System (INIS)

    Belov, Mikhail E.; Anderson, Gordon A.; Wingerd, Mark A.; Udseth, Harold R.; Tang, Keqi; Prior, David C.; Swanson, Kenneth R.; Buschbach, Michael A.; Strittmatter, Eric F.; Moore, Ronald J.; Smith, Richard D.

    2004-01-01

    We report on a fully automated 9.4 tesla Fourier transform ion cyclotron resonance (FTICR) mass spectrometer coupled to reverse-phase chromatography for high-throughput proteomic studies. Modifications made to the front-end of a commercial FTICR instrument--a dual-ESI-emitter ion source; dual-channel electrodynamic ion funnel; and collisional-cooling, selection and accumulation quadrupoles--significantly improved the sensitivity, dynamic range and mass measurement accuracy of the mass spectrometer. A high-pressure capillary liquid chromatography (LC) system was incorporated with an autosampler that enabled 24 h/day operation. A novel method for accumulating ions in the ICR cell was also developed. Unattended operation of the instrument revealed the exceptional reproducibility (1-5% deviation in elution times for peptides from a bacterial proteome), repeatability (10-20% deviation in detected abundances for peptides from the same aliquot analyzed a few weeks apart) and robustness (high-throughput operation for 5 months without downtime) of the LC/FTICR system. When combined with modulated-ion-energy gated trapping, the internal calibration of FTICR mass spectra decreased dispersion of mass measurement errors for peptide identifications in conjunction with high resolution capillary LC separations to < 5 ppm over a dynamic range of 10^3 for each spectrum
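
    The mass measurement accuracy quoted above is expressed in parts per million; for a given peptide the error is computed as shown below.

```latex
% Mass measurement error in parts per million
\[
  \Delta m\,[\mathrm{ppm}] = \frac{m_{\mathrm{measured}} - m_{\mathrm{theoretical}}}{m_{\mathrm{theoretical}}} \times 10^{6}
\]
% For example, a 5 ppm tolerance on a 2000 Da peptide corresponds to an absolute error of 0.01 Da.
```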

  16. High-biomass C4 grasses-Filling the yield gap.

    Science.gov (United States)

    Mullet, John E

    2017-08-01

    A significant increase in agricultural productivity will be required by 2050 to meet the needs of an expanding and rapidly developing world population, without allocating more land and water resources to agriculture, and despite slowing rates of grain yield improvement. This review examines the proposition that high-biomass C4 grasses could help fill the yield gap. High-biomass C4 grasses exhibit high yield due to C4 photosynthesis, long growth duration, and efficient capture and utilization of light, water, and nutrients. These C4 grasses exhibit high levels of drought tolerance during their long vegetative growth phase, ideal for crops grown in water-limited regions of agricultural production. The stems of some high-biomass C4 grasses can accumulate high levels of non-structural carbohydrates that could be engineered to enhance biomass yield and utility as feedstocks for animals and biofuels production. The regulatory pathway that delays flowering of high-biomass C4 grasses in long days has been elucidated, enabling production and deployment of hybrids. Crop and landscape-scale modeling predict that utilization of high-biomass C4 grass crops on land and in regions where water resources limit grain crop yield could increase agricultural productivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    Full Text Available BACKGROUND: Bacterial viruses (phages play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes, and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phage. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl purified phage particles, and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL or of a whole genome shotgun library (WGSL, or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  18. A high throughput platform for understanding the influence of excipients on physical and chemical stability

    DEFF Research Database (Denmark)

    Raijada, Dhara; Cornett, Claus; Rantanen, Jukka

    2013-01-01

    The present study puts forward a miniaturized high-throughput platform to understand the influence of excipient selection and processing on the stability of a given drug compound. Four model drugs (sodium naproxen, theophylline, amlodipine besylate and nitrofurantoin) and ten different excipients were... for chemical degradation. The proposed high-throughput platform can be used during early drug development to simulate typical processing-induced stress at a small scale and to understand possible phase transformation behaviour and the influence of excipients on this.

  19. High throughput octal alpha/gamma spectrometer for low level bioassay estimations

    International Nuclear Information System (INIS)

    Bhasin, B.D.; Shirke, S.H.; Suri, M.M.; Vaidya, P.P.; Ghodgaonkar, M.D.

    1995-01-01

    The present paper describes the development of a high-throughput octal alpha spectrometry system specially developed for the estimation of low levels of actinides in bioassay and environmental samples. The system simultaneously processes the outputs coming from eight independent detectors. It can be configured to simultaneously record low-level alpha and gamma spectra. The high throughput is achieved by using a prioritised multiplexer router. The prioritised multiplexing and routing, coupled with a fast 8K ADC (conversion time 20 μsec), allow simultaneous acquisition of multiple spectra without any significant loss in counts. The dual-port (8K, 24-bit) memory facilitates easy online viewing of spectrum buildup. Menu-driven, user-friendly software makes the system convenient to use. Specially developed software provides built-in routines for processing the spectra and estimating the isotopic activity. The interactive mode of the software provides easy identification of isotopes compatible with the separation chemistry of different actinides. (author). 6 refs., 2 figs

  20. Design of a High-Throughput Biological Crystallography Beamline for Superconducting Wiggler

    International Nuclear Information System (INIS)

    Tseng, P.C.; Chang, C.H.; Fung, H.S.; Ma, C.I.; Huang, L.J.; Jean, Y.C.; Song, Y.F.; Huang, Y.S.; Tsang, K.L.; Chen, C.T.

    2004-01-01

    We are constructing a high-throughput biological crystallography beamline BL13B, which utilizes the radiation generated from a 3.2 Tesla, 32-pole superconducting multipole wiggler, for multi-wavelength anomalous diffraction (MAD), single-wavelength anomalous diffraction (SAD), and other related experiments. This beamline is a standard double crystal monochromator (DCM) x-ray beamline equipped with a collimating mirror (CM) and a focusing mirror (FM). Both the CM and FM are one meter long and made of Si substrate, and the CM is side-cooled by water. Based on detailed thermal analysis, liquid nitrogen (LN2) cooling for both crystals of the DCM has been adopted to optimize the energy resolution and photon beam throughput. This beamline will deliver, through a 100 μm diameter pinhole, photon flux of greater than 10^11 photons/sec in the energy range from 6.5 keV to 19 keV, which is comparable to existing protein crystallography beamlines from bending magnet sources at high-energy storage rings

  1. Peptide Binding to HLA Class I Molecules: Homogenous, High-Throughput Screening, and Affinity Assays

    DEFF Research Database (Denmark)

    Harndahl, Mikkel; Justesen, Sune Frederik Lamdahl; Lamberth, Kasper

    2009-01-01

    , better signal-to-background ratios, and a higher capacity. They also describe an efficient approach to screen peptides for binding to HLA molecules. For the occasional user, this will serve as a robust, simple peptide-HLA binding assay. For the more dedicated user, it can easily be performed in a high-throughput... the luminescent oxygen channeling immunoassay technology (abbreviated LOCI and commercialized as AlphaScreen (TM)). Compared with an enzyme-linked immunosorbent assay-based peptide-HLA class I binding assay, the LOCI assay yields virtually identical affinity measurements, although having a broader dynamic range... screening mode using standard liquid handling robotics and 384-well plates. We have successfully applied this assay to more than 60 different HLA molecules, leading to more than 2 million measurements. (Journal of Biomolecular Screening 2009: 173-180)...

  2. A comparison of high-throughput techniques for assaying circadian rhythms in plants.

    Science.gov (United States)

    Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony

    2015-01-01

    Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.

  3. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

    Full Text Available The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing additional methods for the generation of additional hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors and a conserved involvement of a pathway based around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  4. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  5. Micellar Surfactant Association in the Presence of a Glucoside-based Amphiphile Detected via High-Throughput Small Angle X-ray Scattering

    Energy Technology Data Exchange (ETDEWEB)

    Stanic, Vesna [Brazilian Synchrotron Light Source, Campinas (Brazil); Broadbent, Charlotte [Columbia Univ., New York, NY (United States). Engineering Dept.; DiMasi, Elaine [Brookhaven National Lab. (BNL), Upton, NY (United States). Photon Sciences Division; Galleguillos, Ramiro [Lubrizol Advanced Materials, Cleveland, OH (United States); Woodward, Valerie [Lubrizol Advanced Materials, Cleveland, OH (United States)

    2016-11-14

    The interactions of mixtures of anionic and amphoteric surfactants with sugar amphiphiles were studied via high-throughput small angle x-ray scattering (SAXS). The sugar amphiphile was composed of a caprate, caprylate, and oleate mixed ester of methyl glucoside, MeGCCO. Surfactant combinations with desirable physical properties are sought, and they must be identified in a cost-effective manner that can access the large phase space of possible molecular combinations. X-ray scattering patterns obtained via high-throughput SAXS can probe a combinatorial sample space and reveal the incorporation of MeGCCO into the micelles and the molecular associations between surfactant molecules. Such data make it possible to efficiently assess the effects of the new amphiphiles in the formulation. A specific finding of this study is that formulations containing comparatively monodisperse and homogeneous surfactant mixtures can be reliably tuned by addition of NaCl, which swells the surfactant micelles with a monotonic dependence on salt concentration. In contrast, the presence of multiple different surfactants destroys clear correlations with NaCl concentration, even in otherwise similar series of formulations.

  6. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development. Graphical Abstract.

  7. Simple and rapid preparation of [{sup 11}C]DASB with high quality and reliability for routine applications

    Energy Technology Data Exchange (ETDEWEB)

    Haeusler, D.; Mien, L.-K. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Pharmaceutical Technology and Biopharmaceutics, University of Vienna, A-1090 Vienna (Austria); Nics, L. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Nutritional Sciences, University of Vienna, A-1090 Vienna (Austria); Ungersboeck, J. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Inorganic Chemistry, University of Vienna, A-1090 Vienna (Austria); Philippe, C. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Pharmaceutical Technology and Biopharmaceutics, University of Vienna, A-1090 Vienna (Austria); Lanzenberger, R.R. [Department of Psychiatry and Psychotherapy, Medical University of Vienna, A-1090 Vienna (Austria); Kletter, K.; Dudczak, R. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Mitterhauser, M. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Pharmaceutical Technology and Biopharmaceutics, University of Vienna, A-1090 Vienna (Austria); Hospital Pharmacy of the General Hospital of Vienna, A-1090 Vienna (Austria); Wadsak, W. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Inorganic Chemistry, University of Vienna, A-1090 Vienna (Austria)], E-mail: wolfgang.wadsak@meduniwien.ac.at

    2009-09-15

    [{sup 11}C]DASB combines all major prerequisites for a successful SERT-ligand, providing excellent biological properties and in-vivo behaviour. Thus, we aimed to establish a fully automated procedure for the synthesis and purification of [{sup 11}C]DASB with a high degree of reliability, reducing the overall synthesis time while conserving high yields and purity. The optimized [{sup 11}C]DASB synthesis was applied in more than 60 applications with a very low failure rate (3.2%). We obtained yields up to 8.9 GBq (average 5.3{+-}1.6 GBq). Radiochemical yields based on [{sup 11}C]CH{sub 3}I (corrected for decay) were 66.3{+-}6.9% with a specific radioactivity (A{sub s}) of 86.8{+-}24.3 GBq/{mu}mol (both at the end of synthesis, EOS). Time consumption was kept to a minimum, resulting in 43 min from end of bombardment to release of the product after quality control. From our data, it is evident that the presented method can be implemented for routine preparations of [{sup 11}C]DASB with high reliability.

  8. Multiplexing spheroid volume, resazurin and acid phosphatase viability assays for high-throughput screening of tumour spheroids and stem cell neurospheres.

    Directory of Open Access Journals (Sweden)

    Delyan P Ivanov

    Full Text Available Three-dimensional cell culture has many advantages over monolayer cultures, and spheroids have been hailed as the best current representation of small avascular tumours in vitro. However their adoption in regular screening programs has been hindered by uneven culture growth, poor reproducibility and lack of high-throughput analysis methods for 3D. The objective of this study was to develop a method for a quick and reliable anticancer drug screen in 3D for tumour and human foetal brain tissue in order to investigate drug effectiveness and selective cytotoxic effects. Commercially available ultra-low attachment 96-well round-bottom plates were employed to culture spheroids in a rapid, reproducible manner amenable to automation. A set of three mechanistically different methods for spheroid health assessment (Spheroid volume, metabolic activity and acid phosphatase enzyme activity were validated against cell numbers in healthy and drug-treated spheroids. An automated open-source ImageJ macro was developed to enable high-throughput volume measurements. Although spheroid volume determination was superior to the other assays, multiplexing it with resazurin reduction and phosphatase activity produced a richer picture of spheroid condition. The ability to distinguish between effects on malignant and the proliferating component of normal brain was tested using etoposide on UW228-3 medulloblastoma cell line and human neural stem cells. At levels below 10 µM etoposide exhibited higher toxicity towards proliferating stem cells, whereas at concentrations above 10 µM the tumour spheroids were affected to a greater extent. The high-throughput assay procedures use ready-made plates, open-source software and are compatible with standard plate readers, therefore offering high predictive power with substantial savings in time and money.
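
    The volume readout described above is commonly derived from the projected area of the segmented spheroid under a sphericity assumption. The Python sketch below mirrors that logic only; it is not the authors' ImageJ macro, and the Otsu-threshold segmentation and dark-spheroid assumption are simplifications.

```python
import math

import numpy as np
from skimage import filters, measure


def spheroid_volume(image, pixel_size_um):
    """Estimate spheroid volume (um^3) from a 2D image, assuming a roughly spherical shape."""
    threshold = filters.threshold_otsu(image)   # global Otsu threshold to separate spheroid from background
    mask = image < threshold                    # spheroid assumed darker than background
    labels = measure.label(mask)
    regions = measure.regionprops(labels)
    if not regions:
        return 0.0
    largest = max(regions, key=lambda r: r.area)        # keep only the largest object (the spheroid)
    area_um2 = largest.area * pixel_size_um ** 2
    radius_um = math.sqrt(area_um2 / math.pi)           # equivalent-circle radius from projected area
    return (4.0 / 3.0) * math.pi * radius_um ** 3


# Usage sketch: volume = spheroid_volume(np.asarray(img, dtype=float), pixel_size_um=3.25)
```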

  9. 40 CFR Table 9 to Subpart Eeee of... - Continuous Compliance With Operating Limits-High Throughput Transfer Racks

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Continuous Compliance With Operating Limits-High Throughput Transfer Racks 9 Table 9 to Subpart EEEE of Part 63 Protection of Environment...—Continuous Compliance With Operating Limits—High Throughput Transfer Racks As stated in §§ 63.2378(a) and (b...

  10. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  11. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease specific biomarkers has become a major undertaking in the biomedical research field as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker as an indicator to a specific biological or pathological process is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to its easy accessibility, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. The current clinical protein biomarker measurement primarily relies on antibody based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of high quality protein antibodies is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  12. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    Science.gov (United States)

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980

  13. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    Science.gov (United States)

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing for the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
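
    To make the listed constraints concrete, the sketch below implements a naïve rejection-sampling filter in Python for GC content, homopolymer length, minimum pairwise Hamming distance and blacklisted subsequences. This is the slow baseline that such frameworks improve upon, not the authors' algorithm, and all thresholds and the blacklist entry are illustrative assumptions.

```python
# Naive barcode generator illustrating the constraints above (GC content,
# homopolymer length, minimum pairwise Hamming distance, blacklisted
# subsequences). Thresholds and the blacklist entry are illustrative only.
import itertools
import random

def gc_fraction(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

def max_homopolymer(seq):
    return max(len(list(g)) for _, g in itertools.groupby(seq))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def passes_filters(seq, accepted, blacklist,
                   gc_min=0.4, gc_max=0.6, max_homo=2, min_dist=3):
    if not gc_min <= gc_fraction(seq) <= gc_max:
        return False
    if max_homopolymer(seq) > max_homo:
        return False
    if any(b in seq for b in blacklist):
        return False
    # Pairwise Hamming check against every accepted barcode (slow for large libraries)
    return all(hamming(seq, other) >= min_dist for other in accepted)

def generate_barcodes(n, length=12, blacklist=("GGTCTC",), seed=0):
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n:
        seq = "".join(rng.choice("ACGT") for _ in range(length))
        if passes_filters(seq, accepted, blacklist):
            accepted.append(seq)
    return accepted

print(generate_barcodes(10))
```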

  14. Assessment of microelectronics packaging for high temperature, high reliability applications

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, F.

    1997-04-01

    This report details characterization and development activities in electronic packaging for high temperature applications. This project was conducted through a Department of Energy sponsored Cooperative Research and Development Agreement between Sandia National Laboratories and General Motors. Even though the target application of this collaborative effort is an automotive electronic throttle control system which would be located in the engine compartment, results of this work are directly applicable to Sandia's national security mission. The component count associated with the throttle control dictates the use of high density packaging not offered by conventional surface mount. An enabling packaging technology was selected and thermal models defined which characterized the thermal and mechanical response of the throttle control module. These models were used to optimize thick film multichip module design, characterize the thermal signatures of the electronic components inside the module, and to determine the temperature field and resulting thermal stresses under conditions that may be encountered during the operational life of the throttle control module. Because the need to use unpackaged devices limits the level of testing that can be performed either at the wafer level or as individual dice, an approach to assure a high level of reliability of the unpackaged components was formulated. Component assembly and interconnect technologies were also evaluated and characterized for high temperature applications. Electrical, mechanical and chemical characterizations of enabling die and component attach technologies were performed. Additionally, studies were conducted to assess the performance and reliability of gold and aluminum wire bonding to thick film conductor inks. Kinetic models were developed and validated to estimate wire bond reliability.

  15. A reactor for high-throughput high-pressure nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Beach, N. J.; Knapp, S. M. M.; Landis, C. R., E-mail: landis@chem.wisc.edu [Department of Chemistry, University of Wisconsin-Madison, Madison, Wisconsin 53719 (United States)

    2015-10-15

    The design of a reactor for operando nuclear magnetic resonance (NMR) monitoring of high-pressure gas-liquid reactions is described. The Wisconsin High Pressure NMR Reactor (WiHP-NMRR) design comprises four modules: a sapphire NMR tube with titanium tube holder rated for pressures as high as 1000 psig (68 atm) and temperatures ranging from −90 to 90 °C, a gas circulation system that maintains equilibrium concentrations of dissolved gases during gas-consuming or gas-releasing reactions, a liquid injection apparatus that is capable of adding measured amounts of solutions to the reactor under high pressure conditions, and a rapid wash system that enables the reactor to be cleaned without removal from the NMR instrument. The WiHP-NMRR is compatible with commercial 10 mm NMR probes. Reactions performed in the WiHP-NMRR yield high quality, information-rich, and multinuclear NMR data over the entire reaction time course with rapid experimental turnaround.

  16. High-throughput screening of small-molecule adsorption in MOF-74

    Science.gov (United States)

    Thonhauser, T.; Canepa, P.

    2014-03-01

    Using high-throughput screening coupled with state-of-the-art van der Waals density functional theory, we investigate the adsorption properties of four important molecules, H2, CO2, CH4, and H2O, in MOF-74-M with M = Be, Mg, Al, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr, Zr, Nb, Ru, Rh, Pd, La, W, Os, Ir, and Pt. We show that high-throughput techniques can aid in speeding up the development and refinement of effective materials for hydrogen storage, carbon capture, and gas separation. The exploration of the configurational adsorption space allows us to extract crucial information concerning, for example, the competition of water with CO2 for the adsorption binding sites. We find that only a few noble metals (Rh, Pd, Os, Ir, and Pt) favor the adsorption of CO2 and hence are potential candidates for effective carbon-capture materials. Our findings further reveal significant differences in the binding characteristics of H2, CO2, CH4, and H2O within the MOF structure, indicating that molecular blends can be successfully separated by these nano-porous materials. Supported by DOE DE-FG02-08ER46491.
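
    The carbon-capture criterion described above amounts to selecting metals for which CO2 binds more strongly than water at the open metal site. The sketch below shows that selection step only; the binding energies are placeholders, not the published vdW-DF values.

```python
# Minimal sketch of the selection step: flag MOF-74 variants whose CO2 binding
# is stronger than their H2O binding. The numbers below are placeholders,
# NOT the published vdW-DF binding energies.
binding_energy_eV = {   # {metal: {"CO2": E_bind, "H2O": E_bind}}, more negative = stronger binding
    "Mg": {"CO2": -0.45, "H2O": -0.80},
    "Rh": {"CO2": -0.60, "H2O": -0.40},
    "Pd": {"CO2": -0.55, "H2O": -0.38},
    "Pt": {"CO2": -0.58, "H2O": -0.42},
}

carbon_capture_candidates = [
    metal for metal, e in binding_energy_eV.items()
    if e["CO2"] < e["H2O"]          # CO2 preferred over water at the open metal site
]
print(carbon_capture_candidates)    # ['Rh', 'Pd', 'Pt'] with these placeholder values
```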

  17. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with rapid, high-sample-throughput analysis.

  18. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with rapid, high-sample-throughput analysis.

  19. PREOVULATORY FOLLICLE DEVELOPMENT IN HIGH YIELDING COWS

    Directory of Open Access Journals (Sweden)

    Radovan Tomášek

    2013-06-01

    Full Text Available The aim of the study was to examine the development of preovulatory follicles in pregnant and non-pregnant high yielding cows. Treatment with supergestran and oestrophan was used to synchronize the estrous cycle. Ovaries were monitored by transrectal ultrasonography. A linear increase of preovulatory follicles was observed in pregnant (P < 0.001) and non-pregnant (P < 0.001) cows during the 8 days before ovulation. In conclusion, preovulatory follicles in pregnant and non-pregnant high yielding cows developed similarly.

  20. Achieving High Reliability with People, Processes, and Technology.

    Science.gov (United States)

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  1. High-throughput screening of ionic conductivity in polymer membranes

    International Nuclear Information System (INIS)

    Zapata, Pedro; Basak, Pratyay; Carson Meredith, J.

    2009-01-01

    Combinatorial and high-throughput techniques have been successfully used for efficient and rapid property screening in multiple fields. The use of these techniques can be an advantageous new approach to assay ionic conductivity and accelerate the development of novel materials in research areas such as fuel cells. A high-throughput ionic conductivity (HTC) apparatus is described and applied to screening candidate polymer electrolyte membranes for fuel cell applications. The device uses a miniature four-point probe for rapid, automated point-to-point AC electrochemical impedance measurements in both liquid and humid air environments. The conductivity of Nafion 112 HTC validation standards was within 1.8% of the manufacturer's specification. HTC screening of 40 novel Kynar poly(vinylidene fluoride) (PVDF)/acrylic polyelectrolyte (PE) membranes focused on varying the Kynar type (5x) and PE composition (8x) using reduced sample sizes. Two factors were found to be significant in determining the proton conducting capacity: (1) Kynar PVDF series: membranes containing a particular Kynar PVDF type exhibited statistically identical mean conductivity as other membranes containing different Kynar PVDF types that belong to the same series or family. (2) Maximum effective amount of polyelectrolyte: increments in polyelectrolyte content from 55 wt% to 60 wt% showed no statistically significant effect in increasing conductivity. In fact, some membranes experienced a reduction in conductivity.

  2. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit.

    Science.gov (United States)

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-04-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. GROMACS is open-source, free software available from http://www.gromacs.org. Supplementary data are available at Bioinformatics online.

  3. A high throughput architecture for a low complexity soft-output demapping algorithm

    Science.gov (United States)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high-order modulation systems, and therefore low complexity demapping algorithms are indispensable in low power receivers. In the presence of multiple wireless communication standards where each standard defines multiple modulation schemes, there is a need to have an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in double iterative systems, for instance, MIMO and Code-Aided Synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high throughput, flexible, and area efficient architecture. We describe architectures to execute the investigated algorithms. We implement these architectures on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture based on the best-performing low-complexity algorithm, delivering a high throughput of 166 Msymbols/second for Gray-mapped 16-QAM modulation on Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs and 2 DSP48Es.
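
    For reference, the sketch below implements the exact max-log soft demapper for Gray-mapped 16-QAM that low-complexity architectures such as the one above approximate. The bit labeling is one common Gray mapping and the sign convention (positive LLR favors bit 0) is an assumption, not necessarily the one used in the paper.

```python
# Exact max-log soft demapper for Gray-mapped 16-QAM; a brute-force reference,
# not the paper's low-complexity architecture.
import numpy as np

GRAY_2BIT = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

# Build the 16-point constellation: bits (b0 b1) -> I level, (b2 b3) -> Q level
SYMBOLS, LABELS = [], []
for b0 in (0, 1):
    for b1 in (0, 1):
        for b2 in (0, 1):
            for b3 in (0, 1):
                s = (GRAY_2BIT[(b0, b1)] + 1j * GRAY_2BIT[(b2, b3)]) / np.sqrt(10)
                SYMBOLS.append(s)
                LABELS.append((b0, b1, b2, b3))
SYMBOLS = np.array(SYMBOLS)
LABELS = np.array(LABELS)

def maxlog_llr(y, noise_var):
    """LLRs for one received complex symbol y; positive LLR favors bit = 0."""
    d2 = np.abs(y - SYMBOLS) ** 2           # squared distances to all 16 symbols
    llrs = []
    for bit in range(4):
        d0 = d2[LABELS[:, bit] == 0].min()  # closest symbol with this bit = 0
        d1 = d2[LABELS[:, bit] == 1].min()  # closest symbol with this bit = 1
        llrs.append((d1 - d0) / noise_var)
    return np.array(llrs)

# Example: a noisy observation near the symbol labeled (0, 1, 1, 0)
print(maxlog_llr(-0.28 + 0.93j, noise_var=0.1))
```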

  4. Multilayer Porous Crucibles for the High Throughput Salt Separation from Uranium Deposits

    International Nuclear Information System (INIS)

    Kwon, S. W.; Park, K. M.; Kim, J. G.; Kim, I. T.; Seo, B. K.; Moon, J. G.

    2013-01-01

    Solid cathode processing is necessary to separate the salt from the cathode since the uranium deposit in a solid cathode contains electrolyte salt. A physical separation process, such as a distillation separation, is more attractive than a chemical or dissolution process because physical processes generate much less secondary waste. A distillation process was employed for the cathode processing due to the advantages of minimal generation of secondary waste, a compact unit process, and simple, low-cost equipment. The basis for vacuum distillation separation is the difference in vapor pressures between salt and uranium. A solid cathode deposit is heated in a heating region and salt vaporizes, while nonvolatile uranium remains behind. It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites. The evaporation rate of the LiCl-KCl eutectic salt in a vacuum distiller is not high enough to keep up with the generation capacity of uranium dendrites in an electro-refiner. Therefore, a wide evaporation area or a high distillation temperature is necessary for successful salt separation. In this study, it was attempted to enlarge the throughput of the salt distiller with multilayer porous crucibles for the separation of adhered salt from the uranium deposits generated by the electrorefiner. The feasibility of the porous crucibles was tested by salt distillation experiments. In this study, the salt distiller with multilayer porous crucibles was proposed and the feasibility of liquid salt separation was examined to increase the throughput. It was found that effective separation of salt from uranium deposits was possible with the multilayer porous crucibles.

  5. Multilayer Porous Crucibles for the High Throughput Salt Separation from Uranium Deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S. W.; Park, K. M.; Kim, J. G.; Kim, I. T.; Seo, B. K.; Moon, J. G. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    Solid cathode processing is necessary to separate the salt from the cathode since the uranium deposit in a solid cathode contains electrolyte salt. A physical separation process, such as a distillation separation, is more attractive than a chemical or dissolution process because physical processes generate much less secondary waste. A distillation process was employed for the cathode processing due to the advantages of minimal generation of secondary waste, a compact unit process, and simple, low-cost equipment. The basis for vacuum distillation separation is the difference in vapor pressures between salt and uranium. A solid cathode deposit is heated in a heating region and salt vaporizes, while nonvolatile uranium remains behind. It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites. The evaporation rate of the LiCl-KCl eutectic salt in a vacuum distiller is not high enough to keep up with the generation capacity of uranium dendrites in an electro-refiner. Therefore, a wide evaporation area or a high distillation temperature is necessary for successful salt separation. In this study, it was attempted to enlarge the throughput of the salt distiller with multilayer porous crucibles for the separation of adhered salt from the uranium deposits generated by the electrorefiner. The feasibility of the porous crucibles was tested by salt distillation experiments. In this study, the salt distiller with multilayer porous crucibles was proposed and the feasibility of liquid salt separation was examined to increase the throughput. It was found that effective separation of salt from uranium deposits was possible with the multilayer porous crucibles.

  6. AELAS: Automatic ELAStic property derivations via high-throughput first-principles computation

    Science.gov (United States)

    Zhang, S. H.; Zhang, R. F.

    2017-11-01

    The elastic properties are fundamental and important for crystalline materials as they relate to other mechanical properties, various thermodynamic qualities as well as some critical physical properties. However, a complete set of experimentally determined elastic properties is only available for a small subset of known materials, and an automatic scheme for the derivation of elastic properties that is adapted to high-throughput computation is much in demand. In this paper, we present the AELAS code, an automated program for calculating second-order elastic constants of both two-dimensional and three-dimensional single crystal materials with any symmetry, which is designed mainly for high-throughput first-principles computation. Other derivations of general elastic properties such as Young's, bulk and shear moduli as well as Poisson's ratio of polycrystal materials, Pugh ratio, Cauchy pressure, elastic anisotropy and elastic stability criterion, are also implemented in this code. The implementation of the code has been critically validated by extensive evaluations and tests on a broad class of materials including two-dimensional and three-dimensional materials, demonstrating its efficiency and capability for high-throughput screening of specific materials with targeted mechanical properties. Program Files doi:http://dx.doi.org/10.17632/f8fwg4j9tw.1 Licensing provisions: BSD 3-Clause Programming language: Fortran Nature of problem: To automate the calculations of second-order elastic constants and the derivations of other elastic properties for two-dimensional and three-dimensional materials with any symmetry via high-throughput first-principles computation. Solution method: The space-group number is firstly determined by the SPGLIB code [1] and the structure is then redefined to a unit cell with IEEE-format [2]. Secondly, based on the determined space group number, a set of distortion modes is automatically specified and the distorted structure files are generated
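
    The core numerical step behind such codes is fitting the total energy as a function of a small distortion and converting the curvature into an elastic constant. The sketch below shows that energy-strain route with placeholder energies standing in for first-principles results; it is not AELAS itself.

```python
# Minimal sketch of the energy-strain route to a second-order elastic constant:
# apply a one-parameter distortion, fit E(delta) to a parabola, and convert the
# curvature to GPa. The energies below are placeholders standing in for the
# first-principles total energies that a code such as AELAS would obtain.
import numpy as np

EV_PER_A3_TO_GPA = 160.21766208   # 1 eV/Angstrom^3 expressed in GPa

def elastic_constant_from_energies(strains, energies_eV, volume_A3):
    """Fit E(d) = E0 + 0.5 * V0 * C * d^2 and return C in GPa."""
    coeffs = np.polyfit(strains, energies_eV, 2)   # quadratic fit: a*d^2 + b*d + c
    curvature = 2.0 * coeffs[0]                    # d^2E/d(delta)^2 in eV
    return curvature / volume_A3 * EV_PER_A3_TO_GPA

strains = np.linspace(-0.01, 0.01, 7)
# Placeholder energies with quadratic behaviour (target C ~ 40 GPa, V0 = 20 A^3)
energies = -10.0 + 0.5 * (40.0 / EV_PER_A3_TO_GPA) * 20.0 * strains ** 2
print(elastic_constant_from_energies(strains, energies, volume_A3=20.0))  # ~40 GPa
```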

  7. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  8. Applications of high-throughput sequencing to chromatin structure and function in mammals

    OpenAIRE

    Dunham, Ian

    2009-01-01

    High-throughput DNA sequencing approaches have enabled direct interrogation of chromatin samples from mammalian cells. We are beginning to develop a genome-wide description of nuclear function during development, but further data collection, refinement, and integration are needed.

  9. From Classical to High Throughput Screening Methods for Feruloyl Esterases: A Review.

    Science.gov (United States)

    Ramírez-Velasco, Lorena; Armendáriz-Ruiz, Mariana; Rodríguez-González, Jorge Alberto; Müller-Santos, Marcelo; Asaff-Torres, Ali; Mateos-Díaz, Juan Carlos

    2016-01-01

    Feruloyl esterases (FAEs) are a diverse group of hydrolases widely distributed in plants and microorganisms which catalyzes the cleavage and formation of ester bonds between plant cell wall polysaccharides and phenolic acids. FAEs have gained importance in biofuel, medicine and food industries due to their capability of acting on a large range of substrates for cleaving ester bonds and synthesizing high-added-value molecules through esterification and transesterification reactions. During the past two decades extensive studies have been carried out on the production, characterization and classification of FAEs, however only a few suitable High Throughput Screening assays for this kind of enzyme have been reported. This review is focused on a concise but complete revision of classical to High Throughput Screening methods for FAEs, highlighting their advantages and disadvantages, and finally suggesting future perspectives for this important research field.

  10. Label-free detection of cellular drug responses by high-throughput bright-field imaging and machine learning.

    Science.gov (United States)

    Kobayashi, Hirofumi; Lei, Cheng; Wu, Yi; Mao, Ailin; Jiang, Yiyue; Guo, Baoshan; Ozeki, Yasuyuki; Goda, Keisuke

    2017-09-29

    In the last decade, high-content screening based on multivariate single-cell imaging has been proven effective in drug discovery to evaluate drug-induced phenotypic variations. Unfortunately, this method inherently requires fluorescent labeling which has several drawbacks. Here we present a label-free method for evaluating cellular drug responses only by high-throughput bright-field imaging with the aid of machine learning algorithms. Specifically, we performed high-throughput bright-field imaging of numerous drug-treated and -untreated cells (N = ~240,000) by optofluidic time-stretch microscopy with high throughput up to 10,000 cells/s and applied machine learning to the cell images to identify their morphological variations which are too subtle for human eyes to detect. Consequently, we achieved a high accuracy of 92% in distinguishing drug-treated and -untreated cells without the need for labeling. Furthermore, we also demonstrated that dose-dependent, drug-induced morphological change from different experiments can be inferred from the classification accuracy of a single classification model. Our work lays the groundwork for label-free drug screening in pharmaceutical science and industry.
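
    As a rough illustration of the approach (not the authors' pipeline, which operates on optofluidic time-stretch images at far larger scale), the sketch below extracts a few simple morphological features from bright-field images and trains a scikit-learn classifier to separate treated from untreated cells. The feature set, segmentation and synthetic demo data are illustrative assumptions.

```python
# Illustrative sketch: simple morphological features + an SVM to separate
# drug-treated from untreated cells. Not the authors' pipeline.
import numpy as np
from skimage import filters, measure
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def morphology_features(image):
    """Area, eccentricity and mean intensity of the largest object in the image."""
    mask = image > filters.threshold_otsu(image)
    regions = measure.regionprops(measure.label(mask), intensity_image=image)
    if not regions:
        return [0.0, 0.0, 0.0]
    r = max(regions, key=lambda reg: reg.area)
    return [r.area, r.eccentricity, r.mean_intensity]

def train_drug_response_classifier(images, labels):
    """images: list of 2-D arrays; labels: 1 = drug-treated, 0 = untreated."""
    X = np.array([morphology_features(img) for img in images])
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
    return clf, accuracy_score(y_test, clf.predict(X_test))

# Tiny synthetic demo: bright circular "cells" whose size depends on the label
rng = np.random.default_rng(1)
imgs, labels = [], []
for k in range(40):
    img = rng.normal(0.1, 0.02, (64, 64))
    r = 8 if k % 2 else 12                 # treated cells (label 1) drawn smaller here
    yy, xx = np.ogrid[:64, :64]
    img[(yy - 32) ** 2 + (xx - 32) ** 2 < r ** 2] += 0.5
    imgs.append(img)
    labels.append(k % 2)
clf, acc = train_drug_response_classifier(imgs, np.array(labels))
print("held-out accuracy:", acc)
```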

  11. Low Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI

    Science.gov (United States)

    2017-10-01

    greater gas polarizations and production amounts/throughputs, benefiting in particular from the advent of compact, high-power, relatively low-cost ... Award Number: W81XWH-15-1-0271. Title: Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI.

  12. Life in the fast lane: high-throughput chemistry for lead generation and optimisation.

    Science.gov (United States)

    Hunter, D

    2001-01-01

    The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.

  13. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H

    2017-01-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analy...

  14. A high throughput biochemical fluorometric method for measuring lipid peroxidation in HDL.

    Directory of Open Access Journals (Sweden)

    Theodoros Kelesidis

    Full Text Available Current cell-based assays for determining the functional properties of high-density lipoproteins (HDL) have limitations. We report here the development of a new, robust fluorometric cell-free biochemical assay that measures HDL lipid peroxidation (HDLox) based on the oxidation of the fluorochrome Amplex Red. HDLox correlated with previously validated cell-based (r = 0.47, p<0.001) and cell-free assays (r = 0.46, p<0.001). HDLox distinguished dysfunctional HDL in established animal models of atherosclerosis and Human Immunodeficiency Virus (HIV) patients. Using an immunoaffinity method for capturing HDL, we demonstrate the utility of this novel assay for measuring HDLox in a high throughput format. Furthermore, HDLox correlated significantly with measures of cardiovascular diseases including carotid intima media thickness (r = 0.35, p<0.01) and subendocardial viability ratio (r = -0.21, p = 0.05) and physiological parameters such as metabolic and anthropometric parameters (p<0.05). In conclusion, we report the development of a new fluorometric method that offers a reproducible and rapid means for determining HDL function/quality that is suitable for high throughput implementation.

  15. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  16. A versatile, high through-put, bead-based phagocytosis assay for Plasmodium falciparum

    DEFF Research Database (Denmark)

    Lloyd, Yukie M.; Ngati, Elise P.; Salanti, Ali

    2017-01-01

    Antibody-mediated phagocytosis is an important immune effector mechanism against Plasmodium falciparum-infected erythrocytes (IE); however, current phagocytosis assays use IE collected from infected individuals or from in vitro cultures of P. falciparum, making them prone to high variation....... A simple, high-throughput flow cytometric assay was developed that uses THP-1 cells and fluorescent beads covalently-coupled with the malarial antigen VAR2CSA. The assay is highly repeatable, provides both the overall percent phagocytosis and semi-quantitates the number of antigen-coupled beads...

  17. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.

  18. MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra

    Science.gov (United States)

    Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.

    2018-04-01

    The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness but was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large data sets. This new module, MetaUniDec, centers on the hierarchical data format 5 (HDF5) for storing datasets, which significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.
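
    A minimal sketch of the storage pattern described above, using h5py to write many spectra and their acquisition metadata into a single HDF5 file, is shown below. The group and attribute layout is an illustrative assumption, not MetaUniDec's actual schema.

```python
# Minimal h5py sketch of the general pattern: store many native mass spectra in
# one HDF5 file, one group per spectrum with metadata attributes. The layout is
# an illustrative assumption, not MetaUniDec's actual schema.
import numpy as np
import h5py

def write_dataset(path, spectra):
    """spectra: list of dicts with 'mz', 'intensity' arrays and 'collision_voltage'."""
    with h5py.File(path, "w") as f:
        for i, spec in enumerate(spectra):
            grp = f.create_group(f"spectrum_{i:04d}")
            grp.create_dataset("mz", data=spec["mz"], compression="gzip")
            grp.create_dataset("intensity", data=spec["intensity"], compression="gzip")
            grp.attrs["collision_voltage"] = spec["collision_voltage"]

# Example with synthetic data: a ramp of collision voltages
spectra = [{"mz": np.linspace(2000, 12000, 5000),
            "intensity": np.random.rand(5000),
            "collision_voltage": v} for v in range(0, 101, 10)]
write_dataset("ramp.hdf5", spectra)
```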

  19. Tiered High-Throughput Screening Approach to Identify ...

    Science.gov (United States)

    High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprised of 1,074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single concentration screen were retested in concentration-response. Due to high false positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using

  20. High Throughput Preparation of Aligned Nanofibers Using an Improved Bubble-Electrospinning

    Directory of Open Access Journals (Sweden)

    Liang Yu

    2017-11-01

    Full Text Available An improved bubble-electrospinning setup, consisting of a cone-shaped air nozzle, a copper solution reservoir connected directly to the power generator, and a high-speed rotating copper wire drum as a collector, was successfully presented to obtain high-throughput preparation of aligned nanofibers. The influence of drum rotation speed on the morphology and properties of the obtained nanofibers was explored. The results showed that the alignment degree, diameter distribution, and properties of the nanofibers improved as the drum rotation speed increased.

  1. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    Science.gov (United States)

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.
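
    As a much-simplified illustration of the coarse-to-fine idea only, the sketch below aligns two gel images by translation using phase correlation at successively finer scales. RAIN itself fits a volume-invariant third-order B-spline deformation and corrects expression inhomogeneity, which this sketch does not attempt to reproduce.

```python
# Much-simplified coarse-to-fine alignment of two 2-D gel images by translation
# only, using phase correlation at successively finer scales. Not the RAIN
# B-spline model; shown only to illustrate the multi-resolution idea.
import numpy as np
from skimage.transform import rescale
from skimage.registration import phase_cross_correlation
from scipy.ndimage import shift as nd_shift

def coarse_to_fine_align(reference, moving, scales=(0.25, 0.5, 1.0)):
    total_shift = np.zeros(2)
    for s in scales:
        ref_s = rescale(reference, s, anti_aliasing=True)
        mov_s = rescale(nd_shift(moving, total_shift), s, anti_aliasing=True)
        shift_s, _, _ = phase_cross_correlation(ref_s, mov_s, upsample_factor=10)
        total_shift += shift_s / s        # convert the shift back to full resolution
    return nd_shift(moving, total_shift), total_shift
```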

  2. OptoDyCE: Automated system for high-throughput all-optical dynamic cardiac electrophysiology

    Science.gov (United States)

    Klimas, Aleksandra; Yu, Jinzhu; Ambrosi, Christina M.; Williams, John C.; Bien, Harold; Entcheva, Emilia

    2016-02-01

    In the last two decades, a number of drug withdrawals from the market were due to cardiac toxicity, where unintended interactions with ion channels disrupt the heart's normal electrical function. Consequently, all new drugs must undergo preclinical testing for cardiac liability, adding to an already expensive and lengthy process. Recognition that proarrhythmic effects often result from drug action on multiple ion channels demonstrates a need for integrative and comprehensive measurements. Additionally, patient-specific therapies relying on emerging technologies employing stem-cell derived cardiomyocytes (e.g. induced pluripotent stem-cell-derived cardiomyocytes, iPSC-CMs) require better screening methods to become practical. However, a high-throughput, cost-effective approach for cellular cardiac electrophysiology has not been feasible. Optical techniques for manipulation and recording provide a contactless means of dynamic, high-throughput testing of cells and tissues. Here, we consider the requirements for all-optical electrophysiology for drug testing, and we implement and validate OptoDyCE, a fully automated system for all-optical cardiac electrophysiology. We demonstrate the high-throughput capabilities using multicellular samples in 96-well format by combining optogenetic actuation with simultaneous fast high-resolution optical sensing of voltage or intracellular calcium. The system can also be implemented using iPSC-CMs and other cell-types by delivery of optogenetic drivers, or through the modular use of dedicated light-sensitive somatic cells in conjunction with non-modified cells. OptoDyCE provides a truly modular and dynamic screening system, capable of fully-automated acquisition of high-content information integral for improved discovery and development of new drugs and biologics, as well as providing a means of better understanding of electrical disturbances in the heart.

  3. High-throughput SNP genotyping in the highly heterozygous genome of Eucalyptus: assay success, polymorphism and transferability across species

    Science.gov (United States)

    2011-01-01

    Background High-throughput SNP genotyping has become an essential requirement for molecular breeding and population genomics studies in plant species. Large scale SNP developments have been reported for several mainstream crops. A growing interest now exists to expand the speed and resolution of genetic analysis to outbred species with highly heterozygous genomes. When nucleotide diversity is high, a refined diagnosis of the target SNP sequence context is needed to convert queried SNPs into high-quality genotypes using the Golden Gate Genotyping Technology (GGGT). This issue becomes exacerbated when attempting to transfer SNPs across species, a scarcely explored topic in plants, and likely to become significant for population genomics and inter specific breeding applications in less domesticated and less funded plant genera. Results We have successfully developed the first set of 768 SNPs assayed by the GGGT for the highly heterozygous genome of Eucalyptus from a mixed Sanger/454 database with 1,164,695 ESTs and the preliminary 4.5X draft genome sequence for E. grandis. A systematic assessment of in silico SNP filtering requirements showed that stringent constraints on the SNP surrounding sequences have a significant impact on SNP genotyping performance and polymorphism. SNP assay success was high for the 288 SNPs selected with more rigorous in silico constraints; 93% of them provided high quality genotype calls and 71% of them were polymorphic in a diverse panel of 96 individuals of five different species. SNP reliability was high across nine Eucalyptus species belonging to three sections within subgenus Symphomyrtus and still satisfactory across species of two additional subgenera, although polymorphism declined as phylogenetic distance increased. Conclusions This study indicates that the GGGT performs well both within and across species of Eucalyptus notwithstanding its nucleotide diversity ≥2%. The development of a much larger array of informative SNPs across

  4. Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI

    Science.gov (United States)

    2017-10-01

    low-cost and high-throughput was a key element proposed for this project, which we believe will be of significant benefit to the patients suffering ... Award Number: W81XWH-15-1-0272. Title: Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI.

  5. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    OpenAIRE

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-01-01

    Abstract The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated p...

  6. A High-Throughput Antibody-Based Microarray Typing Platform

    Directory of Open Access Journals (Sweden)

    Ashan Perera

    2013-05-01

    Full Text Available Many rapid methods have been developed for screening foods for the presence of pathogenic microorganisms. Rapid methods that have the additional ability to identify microorganisms via multiplexed immunological recognition have the potential for classification or typing of microbial contaminants, thus facilitating epidemiological investigations that aim to identify outbreaks and trace back the contamination to its source. This manuscript introduces a novel, high throughput typing platform that employs microarrayed multiwell plate substrates and laser-induced fluorescence of the nucleic acid intercalating dye/stain SYBR Gold for detection of antibody-captured bacteria. The aim of this study was to use this platform for comparison of different sets of antibodies raised against the same pathogens as well as demonstrate its potential effectiveness for serotyping. To that end, two sets of antibodies raised against each of the “Big Six” non-O157 Shiga toxin-producing E. coli (STEC) as well as E. coli O157:H7 were array-printed into microtiter plates, and serial dilutions of the bacteria were added and subsequently detected. Though antibody specificity was not sufficient for the development of an STEC serotyping method, the STEC antibody sets performed reasonably well, showing that specificity increased at lower capture antibody concentrations or, conversely, at lower bacterial target concentrations. The favorable results indicated that with sufficiently selective and ideally concentrated sets of biorecognition elements (e.g., antibodies or aptamers), this high-throughput platform can be used to rapidly type microbial isolates derived from food samples within ca. 80 min of total assay time. It can also potentially be used to detect the pathogens from food enrichments and at least serve as a platform for testing antibodies.

  7. Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.

    Science.gov (United States)

    Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin

    2016-02-01

    High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.

  8. High-throughput sockets over RDMA for the Intel Xeon Phi coprocessor

    CERN Document Server

    Santogidis, Aram

    2017-01-01

    In this paper we describe the design, implementation and performance of Trans4SCIF, a user-level socket-like transport library for the Intel Xeon Phi coprocessor. Trans4SCIF library is primarily intended for high-throughput applications. It uses RDMA transfers over the native SCIF support, in a way that is transparent for the application, which has the illusion of using conventional stream sockets. We also discuss the integration of Trans4SCIF with the ZeroMQ messaging library, used extensively by several applications running at CERN. We show that this can lead to a substantial, up to 3x, increase of application throughput compared to the default TCP/IP transport option.

  9. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured at an efficiency of 91 % and 54 % at flow rates of 1 mL/min and 4 mL/min, respectively. Integration of soft magnetic elements in the chip leads to a slightly higher capturing efficiency and a more uniform distribution of captured beads over the separation chamber than the system without soft magnetic elements.

  10. Retrofit Strategies for Incorporating Xenobiotic Metabolism into High Throughput Screening Assays (EMGS)

    Science.gov (United States)

    The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracterization...

  11. Recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers

    DEFF Research Database (Denmark)

    Andersen, Lars Dyrskjøt; Zieger, Karsten; Ørntoft, Torben Falck

    2007-01-01

    Bladder cancer is the fifth most common neoplasm in industrialized countries. Due to frequent recurrences of the superficial form of this disease, bladder cancer ranks as one of the most common cancers. Despite the description of a large number of tumor markers for bladder cancers, none have individually contributed to the management of the disease. However, the development of high-throughput techniques for simultaneous assessment of a large number of markers has allowed classification of tumors into clinically relevant molecular subgroups beyond those possible by pathological classification. Here, we review the recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers.

  12. Development of rapid high throughput biodosimetry tools for radiological triage

    International Nuclear Information System (INIS)

    Balajee, Adayabalam S.; Escalona, Maria; Smith, Tammy; Ryan, Terri; Dainiak, Nicholas

    2018-01-01

    Accidental or intentional radiological or nuclear (R/N) disasters constitute a major threat around the globe that can affect several tens, hundreds and thousands of humans. Currently available cytogenetic biodosimeters are time consuming and laborious to perform making them impractical for triage scenarios. Therefore, it is imperative to develop high throughput techniques which will enable timely assessment of personalized dose for making an appropriate 'life-saving' clinical decision

  13. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  14. A multichannel smartphone optical biosensor for high-throughput point-of-care diagnostics.

    Science.gov (United States)

    Wang, Li-Ju; Chang, Yu-Chung; Sun, Rongrong; Li, Lei

    2017-01-15

    Currently reported smartphone spectrometers are only used to monitor or measure one sample at a time. For the first time, we demonstrate a multichannel smartphone spectrometer (MSS) as an optical biosensor that can simultaneously optically sense multiple samples. In this work, we developed a novel method to achieve multichannel optical spectral sensing with nanometer resolution on a smartphone. A 3D printed cradle held the smartphone integrated with optical components. This optical sensor performed accurate and reliable spectral measurements via optical intensity changes at specific wavelengths or optical spectral shifts. A custom smartphone multi-view App was developed to control the optical sensing parameters and to align each sample to the corresponding channel. The captured images were converted to transmission spectra in the visible wavelength range from 400 nm to 700 nm with a high resolution of 0.2521 nm per pixel. We validated the performance of this MSS by measuring protein concentrations and immunoassaying a type of human cancer biomarker. Compared to a standard laboratory instrument, the results showed that this MSS can achieve comparable detection limits, accuracy and sensitivity. We envision that this multichannel smartphone optical biosensor will be useful in high-throughput point-of-care diagnostics given its compact size, light weight, low cost and data transmission capability. Copyright © 2016 Elsevier B.V. All rights reserved.
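
    The conversion described above implies two steps: a linear pixel-to-wavelength calibration and a per-channel transmission ratio against a reference. The sketch below shows both, using the 0.2521 nm-per-pixel dispersion quoted in the abstract; the 400 nm start wavelength and the prior extraction of intensity profiles from the camera image are assumptions.

```python
# Minimal sketch of the two conversion steps implied above: a linear
# pixel-to-wavelength calibration and a per-channel transmission spectrum
# T(lambda) = I_sample / I_reference. Extraction of the 1-D intensity profiles
# from the camera image is assumed to have been done already.
import numpy as np

NM_PER_PIXEL = 0.2521      # dispersion reported in the abstract
LAMBDA_START = 400.0       # nm, assumed start of the usable range

def pixel_to_wavelength(pixel_index):
    return LAMBDA_START + NM_PER_PIXEL * np.asarray(pixel_index, dtype=float)

def transmission_spectrum(sample_profile, reference_profile, eps=1e-9):
    """Both profiles: 1-D intensity arrays along the dispersion axis of one channel."""
    T = np.asarray(sample_profile) / (np.asarray(reference_profile) + eps)
    wavelengths = pixel_to_wavelength(np.arange(len(T)))
    keep = wavelengths <= 700.0          # restrict to the 400-700 nm range
    return wavelengths[keep], T[keep]
```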

  15. Identifying Inhibitors of Inflammation: A Novel High-Throughput MALDI-TOF Screening Assay for Salt-Inducible Kinases (SIKs).

    Science.gov (United States)

    Heap, Rachel E; Hope, Anthony G; Pearson, Lesley-Anne; Reyskens, Kathleen M S E; McElroy, Stuart P; Hastie, C James; Porter, David W; Arthur, J Simon C; Gray, David W; Trost, Matthias

    2017-12-01

    Matrix-assisted laser desorption/ionization time-of-flight (MALDI TOF) mass spectrometry has become a promising alternative for high-throughput drug discovery as new instruments offer high speed, flexibility and sensitivity, and the ability to measure physiological substrates label free. Here we developed and applied high-throughput MALDI TOF mass spectrometry to identify inhibitors of the salt-inducible kinase (SIK) family, which are interesting drug targets in the field of inflammatory disease as they control production of the anti-inflammatory cytokine interleukin-10 (IL-10) in macrophages. Using peptide substrates in in vitro kinase assays, we can show that hit identification of the MALDI TOF kinase assay correlates with indirect ADP-Hunter kinase assays. Moreover, we can show that both techniques generate comparable IC50 data for a number of hit compounds and known inhibitors of SIK kinases. We further take these inhibitors to a fluorescence-based cellular assay using the SIK activity-dependent translocation of CRTC3 into the nucleus, thereby providing a complete assay pipeline for the identification of SIK kinase inhibitors in vitro and in cells. Our data demonstrate that MALDI TOF mass spectrometry is fully applicable to high-throughput kinase screening, providing label-free data comparable to that of current high-throughput fluorescence assays.

  16. A continuous high-throughput bioparticle sorter based on 3D traveling-wave dielectrophoresis.

    Science.gov (United States)

    Cheng, I-Fang; Froude, Victoria E; Zhu, Yingxi; Chang, Hsueh-Chia; Chang, Hsien-Chang

    2009-11-21

    We present a high throughput (maximum flow rate approximately 10 microl/min or linear velocity approximately 3 mm/s) continuous bio-particle sorter based on 3D traveling-wave dielectrophoresis (twDEP) at an optimum AC frequency of 500 kHz. The high throughput sorting is achieved with a sustained twDEP particle force normal to the continuous through-flow, which is applied over the entire chip by a single 3D electrode array. The design allows continuous fractionation of micron-sized particles into different downstream sub-channels based on differences in their twDEP mobility on both sides of the cross-over. Conventional DEP is integrated upstream to focus the particles into a single levitated queue to allow twDEP sorting by mobility difference and to minimize sedimentation and field-induced lysis. The 3D electrode array design minimizes the offsetting effect of nDEP (negative DEP with particle force towards regions with weak fields) on twDEP such that both forces increase monotonically with voltage to further increase the throughput. Effective focusing and separation of red blood cells from debris-filled heterogeneous samples are demonstrated, as well as size-based separation of poly-dispersed liposome suspensions into two distinct bands at 2.3 to 4.6 microm and 1.5 to 2.7 microm, at the highest throughput recorded in hand-held chips of 6 microl/min.

  17. A quality assurance initiative for commercial-scale production in high-throughput cryopreservation of blue catfish sperm.

    Science.gov (United States)

    Hu, E; Liao, T W; Tiersch, T R

    2013-10-01

    Cryopreservation of fish sperm has been studied for decades at a laboratory (research) scale. However, high-throughput cryopreservation of fish sperm has recently been developed to enable industrial-scale production. This study treated high-throughput cryopreservation of blue catfish (Ictalurus furcatus) sperm as a manufacturing production line and initiated development of a quality assurance plan. The main objectives were to identify: (1) the main production quality characteristics; (2) the process features relevant to quality assurance; (3) the internal quality characteristics and their specification designs; (4) the quality control and process capability evaluation methods; and (5) directions for further improvement and application. The essential product quality characteristics were identified as fertility-related characteristics. Specification design, which establishes tolerance levels according to demand and process constraints, was performed for these quality characteristics. Meanwhile, to ensure integrity throughout the process, internal quality characteristics (characteristics at each quality control point within the process) that could affect the fertility-related quality characteristics were defined with specifications. Because the process relies on 100% inspection (quality inspection of every fish), cumulative sum (CUSUM) control charts were applied to monitor each quality characteristic. Process capability, an index of overall process performance, was analyzed from the in-control process and the designed specifications, further integrating the quality assurance plan. With the established quality assurance plan, the process can operate stably and product quality can be assured. Copyright © 2013 Elsevier Inc. All rights reserved.
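
    The CUSUM monitoring described above can be illustrated with a short, generic sketch. The target value, allowable slack and decision interval below are hypothetical placeholders for illustration only, not the values used in the study.

        # Minimal tabular CUSUM chart for one quality characteristic (e.g., post-thaw motility).
        # Target (mu0), slack (k) and decision interval (h) are hypothetical illustration values.

        def cusum(observations, mu0=40.0, k=2.0, h=8.0):
            """Return upper/lower CUSUM statistics and indices of out-of-control samples."""
            c_plus, c_minus = 0.0, 0.0
            history, alarms = [], []
            for i, x in enumerate(observations):
                c_plus = max(0.0, c_plus + (x - mu0 - k))    # accumulated drift above target
                c_minus = max(0.0, c_minus + (mu0 - x - k))  # accumulated drift below target
                history.append((c_plus, c_minus))
                if c_plus > h or c_minus > h:
                    alarms.append(i)
            return history, alarms

        if __name__ == "__main__":
            motility_percent = [41, 39, 42, 38, 40, 35, 33, 31, 30, 29]  # hypothetical readings
            _, alarms = cusum(motility_percent)
            print("Out-of-control at sample indices:", alarms)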

  18. Application of high-throughput DNA sequencing in phytopathology.

    Science.gov (United States)

    Studholme, David J; Glover, Rachel H; Boonham, Neil

    2011-01-01

    The new sequencing technologies are already making a big impact in academic research on medically important microbes and may soon revolutionize diagnostics, epidemiology, and infection control. Plant pathology also stands to gain from exploiting these opportunities. This manuscript reviews some applications of these high-throughput sequencing methods that are relevant to phytopathology, with emphasis on the associated computational and bioinformatics challenges and their solutions. Second-generation sequencing technologies have recently been exploited in genomics of both prokaryotic and eukaryotic plant pathogens. They are also proving to be useful in diagnostics, especially with respect to viruses. Copyright © 2011 by Annual Reviews. All rights reserved.

  19. High-throughput ab-initio dilute solute diffusion database.

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-19

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
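
    As a rough illustration of the error metric quoted above, a weighted root-mean-square error between calculated and experimental activation barriers could be computed as follows; the barrier values and weights are hypothetical, and the paper's actual weighting scheme is not reproduced here.

        import math

        def weighted_rmse(calculated, experimental, weights):
            """Weighted RMS error between calculated and experimental values (same units, e.g. eV)."""
            assert len(calculated) == len(experimental) == len(weights)
            num = sum(w * (c - e) ** 2 for c, e, w in zip(calculated, experimental, weights))
            return math.sqrt(num / sum(weights))

        # Hypothetical activation barriers (eV) for a handful of solute/host pairs.
        calc = [1.32, 0.95, 2.10, 1.75]
        expt = [1.25, 1.05, 2.30, 1.60]
        wts = [1.0, 1.0, 0.5, 2.0]
        print(f"weighted RMSE = {weighted_rmse(calc, expt, wts):.3f} eV")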

  20. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    Science.gov (United States)

    Alexander, Kristen; Hampton, Meredith; Lopez, Rene; Desimone, Joseph

    2009-03-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  1. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.

  2. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data-processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
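
    The batch-processing idea sketched in this primer can be illustrated with a minimal example that evaluates several traits in parallel worker processes. The trait names and the evaluate_trait function are placeholders for illustration, not part of any genomic-selection package.

        from concurrent.futures import ProcessPoolExecutor
        import time

        def evaluate_trait(trait):
            """Placeholder for a computationally expensive genomic prediction for one trait."""
            time.sleep(1)  # stands in for model training and prediction
            return f"model for {trait} finished"

        if __name__ == "__main__":
            traits = ["milk_yield", "fat_percent", "fertility", "longevity"]  # hypothetical traits
            # Sequential evaluation would take roughly len(traits) units of time; a pool of
            # workers processes traits concurrently, which is the essence of batch HTC.
            with ProcessPoolExecutor(max_workers=4) as pool:
                for message in pool.map(evaluate_trait, traits):
                    print(message)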

  3. High-throughput gene expression profiling of memory differentiation in primary human T cells

    Directory of Open Access Journals (Sweden)

    Russell Kate

    2008-08-01

    Full Text Available Abstract Background The differentiation of naive T and B cells into memory lymphocytes is essential for immunity to pathogens. Therapeutic manipulation of this cellular differentiation program could improve vaccine efficacy and the in vitro expansion of memory cells. However, chemical screens to identify compounds that induce memory differentiation have been limited by (1) the lack of reporter-gene or functional assays that can distinguish naive and memory-phenotype T cells at high throughput and (2) the lack of a suitable cell line representative of naive T cells. Results Here, we describe a method for gene-expression-based screening that allows primary naive and memory-phenotype lymphocytes to be discriminated based on complex gene signatures corresponding to these differentiation states. We used ligation-mediated amplification and a fluorescent, bead-based detection system to simultaneously quantify 55 transcripts representing naive and memory-phenotype signatures in purified populations of human T cells. The use of a multi-gene panel allowed better resolution than any constituent single gene. The method was precise, correlated well with Affymetrix microarray data, and could be easily scaled up for high throughput. Conclusion This method provides a generic solution for high-throughput differentiation screens in primary human T cells where no single-gene or functional assay is available. This screening platform will allow the identification of small molecules, genes or soluble factors that direct memory differentiation in naive human lymphocytes.

  4. The development of a high-throughput measurement method of octanol/water distribution coefficient based on hollow fiber membrane solvent microextraction technique.

    Science.gov (United States)

    Bao, James J; Liu, Xiaojing; Zhang, Yong; Li, Youxin

    2014-09-15

    This paper describes the development of a novel high-throughput hollow fiber membrane solvent microextraction technique for the simultaneous measurement of the octanol/water distribution coefficient (logD) of organic compounds such as drugs. The method is based on a purpose-built system consisting of a 96-well plate modified with 96 hollow fiber membrane tubes and a matching lid with 96 center holes and 96 side holes distributed over 96 grid positions. Each center hole is fitted with a hollow fiber membrane tube, sealed at one end, that separates the aqueous phase from the octanol phase. A needle, such as a microsyringe or automatic sampler, can be inserted directly into the membrane tube to deposit octanol as the acceptor phase or to withdraw the mixture of octanol and drug. Each side hole is filled with the aqueous donor phase, which can be freely added or removed from the outside of the hollow fiber membranes. The logD is calculated from the drug concentration measured in each phase after extraction equilibrium. After a comprehensive comparison, a polytetrafluoroethylene hollow fiber with a thickness of 210 μm, an extraction time of 300 min, a temperature of 25 °C and atmospheric pressure without stirring were selected for the high-throughput measurement. The correlation coefficient of the linear fit of the logD values of five drugs determined by our system against reference values is 0.9954, demonstrating good accuracy. The intra-day and inter-day precisions of logD for metronidazole were -8.9% and -4.4%, respectively, indicating good precision. In addition, the logD values of eight drugs were measured simultaneously and successfully, indicating that this 96-well high-throughput logD measurement method is accurate, precise, reliable and useful for high-throughput screening. Copyright © 2014 Elsevier B.V. All rights reserved.
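
    Once the equilibrium drug concentrations in the two phases are known, the distribution coefficient follows from a one-line calculation. A minimal sketch is shown below, assuming both concentrations are expressed in the same units; the numerical values are hypothetical.

        import math

        def log_d(conc_octanol, conc_aqueous):
            """Octanol/water distribution coefficient from equilibrium concentrations (same units)."""
            return math.log10(conc_octanol / conc_aqueous)

        # Hypothetical equilibrium concentrations (µg/mL) for a single well.
        print(f"logD = {log_d(conc_octanol=85.0, conc_aqueous=4.2):.2f}")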

  5. Machine Learning for High-Throughput Stress Phenotyping in Plants.

    Science.gov (United States)

    Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh Kumar; Sarkar, Soumik

    2016-02-01

    Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    OpenAIRE

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gen...

  7. High-throughput computational methods and software for quantitative trait locus (QTL) mapping

    NARCIS (Netherlands)

    Arends, Danny

    2014-01-01

    In recent years, many new technologies, such as tiling arrays and high-throughput DNA sequencing, have come to play an important role in the research field of systems genetics. For researchers, it is extremely important to understand that these methods will change their way of working

  8. Insights into Sonogashira cross-coupling by high-throughput kinetics and descriptor modeling

    NARCIS (Netherlands)

    an der Heiden, M.R.; Plenio, H.; Immel, S.; Burello, E.; Rothenberg, G.; Hoefsloot, H.C.J.

    2008-01-01

    A method is presented for the high-throughput monitoring of reaction kinetics in homogeneous catalysis, running up to 25 coupling reactions in a single reaction vessel. This method is demonstrated and validated on the Sonogashira reaction, analyzing the kinetics for almost 500 coupling reactions.

  9. High-throughput measurement of polymer film thickness using optical dyes

    Science.gov (United States)

    Grunlan, Jaime C.; Mehrabi, Ali R.; Ly, Tien

    2005-01-01

    Optical dyes were added to polymer solutions in an effort to create a technique for high-throughput screening of dry polymer film thickness. Arrays of polystyrene films, cast from a toluene solution and containing methyl red or solvent green, were used to demonstrate the feasibility of this technique. Measurements of the peak visible absorbance of each film were converted to thickness using the Beer-Lambert relationship. These absorbance-based thickness calculations agreed within 10% of the thickness measured using a micrometer for polystyrene films that were 10-50 µm thick. At these thicknesses, the absorbance-based values are believed to be the more accurate of the two. At least for this solvent-based system, thickness was shown to be accurately measured in a high-throughput manner that could potentially be applied to other equivalent systems. Similar water-based films made with poly(sodium 4-styrenesulfonate) dyed with malachite green oxalate or Congo red did not show the same level of agreement with the micrometer measurements. Extensive phase separation between polymer and dye resulted in inflated absorbance values and calculated thicknesses that were often more than 25% greater than those measured with the micrometer. Only at thicknesses below 15 µm could reasonable accuracy be achieved for the water-based films.
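
    The conversion from peak absorbance to film thickness rests on the Beer-Lambert relationship, A = ε·c·l, solved for the path length l. The sketch below illustrates the calculation; the molar absorptivity and dye concentration in the dry film are hypothetical placeholders.

        def thickness_um(absorbance, epsilon_l_per_mol_cm, dye_mol_per_l):
            """Film thickness (µm) from peak absorbance via Beer-Lambert: A = epsilon * c * l."""
            path_cm = absorbance / (epsilon_l_per_mol_cm * dye_mol_per_l)
            return path_cm * 1.0e4  # convert cm to µm

        # Hypothetical values: dye molar absorptivity and dye loading in the dry film.
        print(f"thickness = {thickness_um(absorbance=0.6, epsilon_l_per_mol_cm=2.0e4, dye_mol_per_l=1.0e-2):.1f} um")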

  10. Delivering high performance BWR fuel reliably

    International Nuclear Information System (INIS)

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  11. High-Throughput Phenotyping and QTL Mapping Reveals the Genetic Architecture of Maize Plant Growth

    Science.gov (United States)

    Huang, Chenglong; Wu, Di; Qiao, Feng; Li, Wenqiang; Duan, Lingfeng; Wang, Ke; Xiao, Yingjie; Chen, Guoxing; Liu, Qian; Yang, Wanneng

    2017-01-01

    With increasing demand for novel traits in crop breeding, the plant research community faces the challenge of quantitatively analyzing the structure and function of large numbers of plants. A clear goal of high-throughput phenotyping is to bridge the gap between genomics and phenomics. In this study, we quantified 106 traits from a maize (Zea mays) recombinant inbred line population (n = 167) across 16 developmental stages using the automatic phenotyping platform. Quantitative trait locus (QTL) mapping with a high-density genetic linkage map, including 2,496 recombinant bins, was used to uncover the genetic basis of these complex agronomic traits, and 988 QTLs have been identified for all investigated traits, including three QTL hotspots. Biomass accumulation and final yield were predicted using a combination of dissected traits in the early growth stage. These results reveal the dynamic genetic architecture of maize plant growth and enhance ideotype-based maize breeding and prediction. PMID:28153923

  12. X-ray phase microtomography with a single grating for high-throughput investigations of biological tissue.

    Science.gov (United States)

    Zdora, Marie-Christine; Vila-Comamala, Joan; Schulz, Georg; Khimchenko, Anna; Hipp, Alexander; Cook, Andrew C; Dilg, Daniel; David, Christian; Grünzweig, Christian; Rau, Christoph; Thibault, Pierre; Zanette, Irene

    2017-02-01

    The high-throughput 3D visualisation of biological specimens is essential for studying diseases and developmental disorders. It requires imaging methods that deliver high-contrast, high-resolution volumetric information at short sample preparation and acquisition times. Here we show that X-ray phase-contrast tomography using a single grating can provide a powerful alternative to commonly employed techniques, such as high-resolution episcopic microscopy (HREM). We present the phase tomography of a mouse embryo in paraffin obtained with an X-ray single-grating interferometer at I13-2 Beamline at Diamond Light Source and discuss the results in comparison with HREM measurements. The excellent contrast and quantitative density information achieved non-destructively and without staining using a simple, robust setup make X-ray single-grating interferometry an optimum candidate for high-throughput imaging of biological specimens as an alternative for existing methods like HREM.

  13. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought

    Directory of Open Access Journals (Sweden)

    Riccardo Ludovisi

    2017-09-01

    Full Text Available Poplars are fast-growing, high-yielding forest tree species, whose cultivation as second-generation biofuel crops is of increasing interest and can efficiently meet emission reduction goals. Yet, breeding elite poplar trees for drought resistance remains a major challenge. Worldwide breeding programs are largely focused on intra/interspecific hybridization, whereby Populus nigra L. is a fundamental parental pool. While high-throughput genotyping has resulted in unprecedented capabilities to rapidly decode complex genetic architecture of plant stress resistance, linking genomics to phenomics is hindered by technically challenging phenotyping. Relying on unmanned aerial vehicle (UAV)-based remote sensing and imaging techniques, high-throughput field phenotyping (HTFP) aims at enabling highly precise and efficient, non-destructive screening of genotype performance in large populations. To efficiently support forest-tree breeding programs, ground-truthing observations should be complemented with standardized HTFP. In this study, we develop a high-resolution (leaf level) HTFP approach to investigate the response to drought of a full-sib F2 partially inbred population (termed here ‘POP6’), whose F1 was obtained from an intraspecific P. nigra controlled cross between genotypes with highly divergent phenotypes. We assessed the effects of two water treatments (well-watered and moderate drought) on a population of 4603 trees (503 genotypes) hosted in two adjacent experimental plots (1.67 ha) by conducting low-elevation (25 m) flights with an aerial drone and capturing 7836 thermal infrared (TIR) images. TIR images were undistorted, georeferenced, and orthorectified to obtain radiometric mosaics. Canopy temperature (Tc) was extracted using two independent semi-automated segmentation techniques, eCognition- and Matlab-based, to avoid the mixed-pixel problem. Overall, results showed that the UAV platform-based thermal imaging enables to effectively assess genotype

  14. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought.

    Science.gov (United States)

    Ludovisi, Riccardo; Tauro, Flavia; Salvati, Riccardo; Khoury, Sacha; Mugnozza Scarascia, Giuseppe; Harfouche, Antoine

    2017-01-01

    Poplars are fast-growing, high-yielding forest tree species, whose cultivation as second-generation biofuel crops is of increasing interest and can efficiently meet emission reduction goals. Yet, breeding elite poplar trees for drought resistance remains a major challenge. Worldwide breeding programs are largely focused on intra/interspecific hybridization, whereby Populus nigra L. is a fundamental parental pool. While high-throughput genotyping has resulted in unprecedented capabilities to rapidly decode complex genetic architecture of plant stress resistance, linking genomics to phenomics is hindered by technically challenging phenotyping. Relying on unmanned aerial vehicle (UAV)-based remote sensing and imaging techniques, high-throughput field phenotyping (HTFP) aims at enabling highly precise and efficient, non-destructive screening of genotype performance in large populations. To efficiently support forest-tree breeding programs, ground-truthing observations should be complemented with standardized HTFP. In this study, we develop a high-resolution (leaf level) HTFP approach to investigate the response to drought of a full-sib F2 partially inbred population (termed here 'POP6'), whose F1 was obtained from an intraspecific P. nigra controlled cross between genotypes with highly divergent phenotypes. We assessed the effects of two water treatments (well-watered and moderate drought) on a population of 4603 trees (503 genotypes) hosted in two adjacent experimental plots (1.67 ha) by conducting low-elevation (25 m) flights with an aerial drone and capturing 7836 thermal infrared (TIR) images. TIR images were undistorted, georeferenced, and orthorectified to obtain radiometric mosaics. Canopy temperature (Tc) was extracted using two independent semi-automated segmentation techniques, eCognition- and Matlab-based, to avoid the mixed-pixel problem. Overall, results showed that the UAV platform-based thermal imaging enables to effectively assess genotype

  15. High-throughput design of low-activation, high-strength creep-resistant steels for nuclear-reactor applications

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qi; Zwaag, Sybrand van der [Novel Aerospace Materials Group, Faculty of Aerospace Engineering, Delft University of Technology, Kluyverweg 1, 2629 HS, Delft (Netherlands); Xu, Wei, E-mail: xuwei@ral.neu.edu.cn [State Key Laboratory of Rolling and Automation, Northeastern University, 110819, Shenyang (China); Novel Aerospace Materials Group, Faculty of Aerospace Engineering, Delft University of Technology, Kluyverweg 1, 2629 HS, Delft (Netherlands)

    2016-02-15

    Reduced-activation ferritic/martensitic steels are prime candidate materials for structural applications in nuclear power reactors. However, their creep strength is much lower than that of creep-resistant steels developed for conventional fossil-fired power plants, as alloying elements with a high neutron activation cannot be used. To improve the creep strength while maintaining a low activation, a high-throughput computational alloy design model coupling thermodynamics, precipitate-coarsening kinetics and an optimization genetic algorithm is developed. Twelve relevant alloying elements with either low or high activation are considered simultaneously. The activity levels at 0–10 years after the end of irradiation are taken as the optimization parameter. The creep-strength values (after exposure for 10 years at 650 °C) are estimated on the basis of the solid-solution strengthening and the precipitation hardening (taking into account precipitate coarsening). Potential alloy compositions leading to a high austenite fraction or a high percentage of undesirable second-phase particles are rejected automatically in the optimization cycle. The newly identified alloys have much higher precipitation hardening and solid-solution strengthening at the same activity level as existing reduced-activation ferritic/martensitic steels.
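
    The structure of such an optimization loop, in which a genetic algorithm proposes compositions and a set of property models scores them, can be illustrated schematically. The sketch below uses a toy objective in place of the thermodynamic, coarsening-kinetics and activity calculations, so the element set, composition bounds and scores are hypothetical and have no physical meaning.

        import random

        ELEMENTS = ["C", "Mn", "W", "V", "Ta", "N"]     # illustrative subset, not the 12 elements used
        BOUNDS = {e: (0.0, 2.0) for e in ELEMENTS}      # hypothetical wt.% ranges

        def random_alloy():
            return {e: random.uniform(*BOUNDS[e]) for e in ELEMENTS}

        def fitness(alloy):
            """Toy stand-in for a creep-strength estimate minus activity/phase-fraction penalties."""
            strength = 50 * alloy["W"] + 80 * alloy["V"] + 30 * alloy["Ta"]
            penalty = 200 * max(0.0, alloy["C"] - 0.15)  # e.g. reject excessive carbon
            return strength - penalty

        def crossover(a, b):
            return {e: random.choice((a[e], b[e])) for e in ELEMENTS}

        def mutate(alloy, rate=0.2):
            return {e: (random.uniform(*BOUNDS[e]) if random.random() < rate else v)
                    for e, v in alloy.items()}

        def evolve(pop_size=40, generations=50):
            population = [random_alloy() for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=fitness, reverse=True)
                parents = population[: pop_size // 4]    # simple truncation selection
                children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(pop_size - len(parents))]
                population = parents + children
            return max(population, key=fitness)

        if __name__ == "__main__":
            best = evolve()
            print({e: round(v, 3) for e, v in best.items()}, "score:", round(fitness(best), 1))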

  16. High-Throughput Computing on High-Performance Platforms: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Matteo, Turilli [Rutgers University; Angius, Alessio [Rutgers University; Oral, H Sarp [ORNL; De, K [University of Texas at Arlington; Klimentov, A [Brookhaven National Laboratory (BNL); Wells, Jack C. [ORNL; Jha, S [Rutgers University

    2017-10-01

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size resources. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan -- a DOE leadership facility -- in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next-generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  17. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  18. The Stanford Automated Mounter: Enabling High-Throughput Protein Crystal Screening at SSRL

    International Nuclear Information System (INIS)

    Smith, C.A.; Cohen, A.E.

    2009-01-01

    The macromolecular crystallography experiment lends itself perfectly to high-throughput technologies. The initial steps including the expression, purification, and crystallization of protein crystals, along with some of the later steps involving data processing and structure determination have all been automated to the point where some of the last remaining bottlenecks in the process have been crystal mounting, crystal screening, and data collection. At the Stanford Synchrotron Radiation Laboratory, a National User Facility that provides extremely brilliant X-ray photon beams for use in materials science, environmental science, and structural biology research, the incorporation of advanced robotics has enabled crystals to be screened in a true high-throughput fashion, thus dramatically accelerating the final steps. Up to 288 frozen crystals can be mounted by the beamline robot (the Stanford Auto-Mounting System) and screened for diffraction quality in a matter of hours without intervention. The best quality crystals can then be remounted for the collection of complete X-ray diffraction data sets. Furthermore, the entire screening and data collection experiment can be controlled from the experimenter's home laboratory by means of advanced software tools that enable network-based control of the highly automated beamlines.

  19. High-throughput simultaneous determination of plasma water deuterium and 18-oxygen enrichment using a high-temperature conversion elemental analyzer with isotope ratio mass spectrometry.

    Science.gov (United States)

    Richelle, M; Darimont, C; Piguet-Welsch, C; Fay, L B

    2004-01-01

    This paper presents a high-throughput method for the simultaneous determination of deuterium and oxygen-18 (18O) enrichment of water samples isolated from blood. The analytical method enables rapid and simple determination of these enrichments in microgram quantities of water. Water is converted into hydrogen and carbon monoxide gases by a high-temperature conversion elemental analyzer (TC-EA); these gases are then transferred on-line into the isotope ratio mass spectrometer. Accuracy, determined with Standard Light Antarctic Precipitation (SLAP) and Greenland Ice Sheet Precipitation (GISP), is reliable for deuterium and 18O enrichments. The range of linearity is from 0 up to 0.09 atom percent excess (APE, i.e. -78 up to 5725 delta per mil (dpm)) for deuterium enrichment and from 0 up to 0.17 APE (-11 up to 890 dpm) for 18O enrichment. Memory effects do exist but can be avoided by analyzing the biological samples in quintuplicate. This approach allows 1440 determinations per week, i.e. 288 biological samples per week. Copyright 2004 John Wiley & Sons, Ltd.
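
    The delta-per-mil and atom-percent-excess scales quoted above are related through the isotope ratios of the international VSMOW standard. A small conversion sketch is shown below; the post-dose and baseline delta values are hypothetical examples, not data from the paper.

        # Convert delta per mil (vs. VSMOW) to atom percent, then to atom percent excess (APE)
        # relative to a pre-dose baseline sample.

        VSMOW = {"2H": 155.76e-6, "18O": 2005.20e-6}  # D/H and 18O/16O ratios of VSMOW

        def atom_percent(delta_permil, isotope):
            ratio = VSMOW[isotope] * (delta_permil / 1000.0 + 1.0)
            return 100.0 * ratio / (1.0 + ratio)

        def ape(delta_sample, delta_baseline, isotope):
            return atom_percent(delta_sample, isotope) - atom_percent(delta_baseline, isotope)

        # Hypothetical post-dose and baseline plasma-water values for deuterium.
        print(f"APE(2H) = {ape(delta_sample=500.0, delta_baseline=-78.0, isotope='2H'):.4f}")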

  20. Risk-based high-throughput chemical screening and prioritization using exposure models and in vitro bioactivity assays

    DEFF Research Database (Denmark)

    Shin, Hyeong-Moo; Ernstoff, Alexi; Arnot, Jon

    2015-01-01

    We present a risk-based high-throughput screening (HTS) method to identify chemicals for potential health concerns or for which additional information is needed. The method is applied to 180 organic chemicals as a case study. We first obtain information on how the chemical is used and identify....../oral contact, or dermal exposure. The method provides high-throughput estimates of exposure and important input for decision makers to identify chemicals of concern for further evaluation with additional information or more refined models....

  1. High-Throughput and Low-Latency Network Communication with NetIO

    Science.gov (United States)

    Schumacher, Jörn; Plessl, Christian; Vandelli, Wainer

    2017-10-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS exclusively target the HPC community and are not well suited to DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They make it possible to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath, this approach may not be very efficient compared to a direct use of native APIs. NetIO is a simple, novel asynchronous message service that can operate on Ethernet, Infiniband and similar network fabrics. In this paper the design and implementation of NetIO is presented and described, and its use is evaluated in comparison to other approaches. NetIO supports different high-level programming models and typical workloads of HEP applications. The ATLAS FELIX project [1] successfully uses NetIO as its central communication platform. The architecture of NetIO is described in this paper, including the user-level API and the internal data-flow design. The paper includes a performance evaluation of NetIO including throughput and latency measurements. The performance is compared against the state-of-the-art ZeroMQ message service. Performance measurements are performed in a lab environment with Ethernet and FDR Infiniband networks.

  2. High-throughput mutational analysis of TOR1A in primary dystonia

    Directory of Open Access Journals (Sweden)

    Truong Daniel D

    2009-03-01

    Full Text Available Abstract Background Although the c.904_906delGAG mutation in Exon 5 of TOR1A typically manifests as early-onset generalized dystonia, DYT1 dystonia is genetically and clinically heterogeneous. Recently, another Exon 5 mutation (c.863G>A) has been associated with early-onset generalized dystonia and some ΔGAG mutation carriers present with late-onset focal dystonia. The aim of this study was to identify TOR1A Exon 5 mutations in a large cohort of subjects with mainly non-generalized primary dystonia. Methods High resolution melting (HRM) was used to examine the entire TOR1A Exon 5 coding sequence in 1014 subjects with primary dystonia (422 spasmodic dysphonia, 285 cervical dystonia, 67 blepharospasm, 41 writer's cramp, 16 oromandibular dystonia, 38 other primary focal dystonia, 112 segmental dystonia, 16 multifocal dystonia, and 17 generalized dystonia) and 250 controls (150 neurologically normal and 100 with other movement disorders). Diagnostic sensitivity and specificity were evaluated in an additional 8 subjects with known ΔGAG DYT1 dystonia and 88 subjects with ΔGAG-negative dystonia. Results HRM of TOR1A Exon 5 showed high (100%) diagnostic sensitivity and specificity. HRM was rapid and economical. HRM reliably differentiated the TOR1A ΔGAG and c.863G>A mutations. Melting curves were normal in 250/250 controls and 1012/1014 subjects with primary dystonia. The two subjects with shifted melting curves were found to harbor the classic ΔGAG deletion: (1) a non-Jewish Caucasian female with childhood-onset multifocal dystonia and (2) an Ashkenazi Jewish female with adolescent-onset spasmodic dysphonia. Conclusion First, HRM is an inexpensive, diagnostically sensitive and specific, high-throughput method for mutation discovery. Second, Exon 5 mutations in TOR1A are rarely associated with non-generalized primary dystonia.
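
    Diagnostic sensitivity and specificity in a validation panel of this kind reduce to simple ratios of correct calls. The sketch below reproduces the calculation generically; the counts shown are an illustrative reading of the validation panel (8 known carriers, 88 mutation-negative subjects, all classified correctly), consistent with the 100% figures reported.

        def sensitivity(true_pos, false_neg):
            return true_pos / (true_pos + false_neg)

        def specificity(true_neg, false_pos):
            return true_neg / (true_neg + false_pos)

        # Illustrative input: all 8 known carriers flagged by HRM, all 88 negatives with
        # normal melting curves.
        print(f"sensitivity = {sensitivity(true_pos=8, false_neg=0):.0%}")
        print(f"specificity = {specificity(true_neg=88, false_pos=0):.0%}")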

  3. Novel method for the high-throughput processing of slides for the comet assay.

    Science.gov (United States)

    Karbaschi, Mahsa; Cooke, Marcus S

    2014-11-26

    Single cell gel electrophoresis (the comet assay) continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors to its application. "Scoring", or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides can be manipulated and undergo electrophoresis in batches of 25 rather than individually; importantly, the method retains the use of standard microscope comet slides, which are the assay convention. This decreases assay time by 60% and benefits from an electrophoresis tank with a substantially smaller footprint and a more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed and decreases assay time, the number of individual slide manipulations, reagent requirements and the risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure.

  4. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  5. RSCA genotyping of MHC for high-throughput evolutionary studies in the model organism three-spined stickleback Gasterosteus aculeatus

    Science.gov (United States)

    Lenz, Tobias L; Eizaguirre, Christophe; Becker, Sven; Reusch, Thorsten BH

    2009-01-01

    Background In all jawed vertebrates, highly polymorphic genes of the major histocompatibility complex (MHC) encode antigen presenting molecules that play a key role in the adaptive immune response. Their polymorphism is composed of multiple copies of recently duplicated genes, each possessing many alleles within populations, as well as high nucleotide divergence between alleles of the same species. Experimental evidence is accumulating that MHC polymorphism is a result of balancing selection by parasites and pathogens. In order to describe MHC diversity and analyse the underlying mechanisms that maintain it, a reliable genotyping technique is required that is suitable for such highly variable genes. Results We present a genotyping protocol that uses Reference Strand-mediated Conformation Analysis (RSCA), optimised for recently duplicated MHC class IIB genes that are typical for many fish and bird species, including the three-spined stickleback, Gasterosteus aculeatus. In addition we use a comprehensive plasmid library of MHC class IIB alleles to determine the nucleotide sequence of alleles represented by RSCA allele peaks. Verification of the RSCA typing by cloning and sequencing demonstrates high congruency between both methods and provides new insight into the polymorphism of classical stickleback MHC genes. Analysis of the plasmid library additionally reveals the high resolution and reproducibility of the RSCA technique. Conclusion This new RSCA genotyping protocol offers a fast, but sensitive and reliable way to determine the MHC allele repertoire of three-spined sticklebacks. It therefore provides a valuable tool to employ this highly polymorphic and adaptive marker in future high-throughput studies of host-parasite co-evolution and ecological speciation in this emerging model organism. PMID:19291291

  6. RSCA genotyping of MHC for high-throughput evolutionary studies in the model organism three-spined stickleback Gasterosteus aculeatus

    Directory of Open Access Journals (Sweden)

    Becker Sven

    2009-03-01

    Full Text Available Abstract Background In all jawed vertebrates, highly polymorphic genes of the major histocompatibility complex (MHC) encode antigen presenting molecules that play a key role in the adaptive immune response. Their polymorphism is composed of multiple copies of recently duplicated genes, each possessing many alleles within populations, as well as high nucleotide divergence between alleles of the same species. Experimental evidence is accumulating that MHC polymorphism is a result of balancing selection by parasites and pathogens. In order to describe MHC diversity and analyse the underlying mechanisms that maintain it, a reliable genotyping technique is required that is suitable for such highly variable genes. Results We present a genotyping protocol that uses Reference Strand-mediated Conformation Analysis (RSCA), optimised for recently duplicated MHC class IIB genes that are typical for many fish and bird species, including the three-spined stickleback, Gasterosteus aculeatus. In addition we use a comprehensive plasmid library of MHC class IIB alleles to determine the nucleotide sequence of alleles represented by RSCA allele peaks. Verification of the RSCA typing by cloning and sequencing demonstrates high congruency between both methods and provides new insight into the polymorphism of classical stickleback MHC genes. Analysis of the plasmid library additionally reveals the high resolution and reproducibility of the RSCA technique. Conclusion This new RSCA genotyping protocol offers a fast, but sensitive and reliable way to determine the MHC allele repertoire of three-spined sticklebacks. It therefore provides a valuable tool to employ this highly polymorphic and adaptive marker in future high-throughput studies of host-parasite co-evolution and ecological speciation in this emerging model organism.

  7. The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).

    Science.gov (United States)

    Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi

    2018-01-01

    Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes.
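
    One common way to obtain an intra-class correlation of the kind reported above is the one-way random-effects estimator based on between- and within-fish mean squares. The sketch below shows that calculation on hypothetical repeated swim-speed trials; the study's exact ICC formulation may differ.

        def icc_oneway(measurements):
            """ICC(1,1) from per-subject lists of repeated measurements (equal group sizes)."""
            k = len(measurements[0])                    # trials per fish
            n = len(measurements)                       # number of fish
            grand = sum(sum(m) for m in measurements) / (n * k)
            subject_means = [sum(m) / k for m in measurements]
            ms_between = k * sum((sm - grand) ** 2 for sm in subject_means) / (n - 1)
            ms_within = sum((x - sm) ** 2
                            for m, sm in zip(measurements, subject_means)
                            for x in m) / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        # Hypothetical maximal swim speeds (cm/s), three trials per fish.
        trials = [[32.1, 30.8, 31.5], [27.4, 28.9, 26.7], [35.0, 33.2, 34.1], [29.8, 31.0, 30.2]]
        print(f"ICC = {icc_oneway(trials):.2f}")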

  8. High-throughput phenotyping (HTP) identifies seedling root traits linked to variation in seed yield and nutrient capture in field-grown oilseed rape (Brassica napus L.).

    Science.gov (United States)

    Thomas, C L; Graham, N S; Hayden, R; Meacham, M C; Neugebauer, K; Nightingale, M; Dupuy, L X; Hammond, J P; White, P J; Broadley, M R

    2016-04-06

    Root traits can be selected for crop improvement. Techniques such as soil excavations can be used to screen root traits in the field, but are limited to genotypes that are well-adapted to field conditions. The aim of this study was to compare a low-cost, high-throughput root phenotyping (HTP) technique in a controlled environment with field performance, using oilseed rape (OSR; Brassica napus) varieties. Primary root length (PRL), lateral root length and lateral root density (LRD) were measured on 14-d-old seedlings of elite OSR varieties (n = 32) using a 'pouch and wick' HTP system (∼40 replicates). Six field experiments were conducted using the same varieties at two UK sites each year for 3 years. Plants were excavated at the 6- to 8-leaf stage for general vigour assessments of roots and shoots in all six experiments, and final seed yield was determined. Leaves were sampled for mineral composition from one of the field experiments. Seedling PRL in the HTP system correlated with seed yield in four out of six field experiments (r = 0·50, 0·50, 0·33, 0·49; P < 0·05) and with emergence in three out of five (r = 0·59, 0·22, 0·49; P < 0·05), whereas LRD did not correlate with emergence, general early vigour or yield in the field. Associations between PRL and field performance are generally related to early vigour. These root traits might therefore be of limited additional selection value, given that vigour can be measured easily on shoots/canopies. In contrast, LRD cannot be assessed easily in the field and, if LRD can improve nutrient uptake, then it may be possible to use HTP systems to screen this trait in both elite and more genetically diverse, non-field-adapted OSR. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany Company.

  9. High throughput single-cell and multiple-cell micro-encapsulation.

    Science.gov (United States)

    Lagus, Todd P; Edd, Jon F

    2012-06-15

    Microfluidic encapsulation methods have been previously utilized to capture cells in picoliter-scale aqueous, monodisperse drops, providing confinement from a bulk fluid environment with applications in high-throughput screening, cytometry, and mass spectrometry. We describe a method to not only encapsulate single cells, but to repeatedly capture a set number of cells (here we demonstrate one- and two-cell encapsulation) to study both isolation and the interactions between cells in groups of controlled sizes. By combining drop generation techniques with cell and particle ordering, we demonstrate controlled encapsulation of cell-sized particles for efficient, continuous encapsulation. Using an aqueous particle suspension and immiscible fluorocarbon oil, we generate aqueous drops in oil with a flow-focusing nozzle. The aqueous flow rate is sufficiently high to create ordering of particles, which reach the nozzle at integer multiples of the drop generation frequency, encapsulating a controlled number of cells in each drop. For representative results, 9.9 μm polystyrene particles are used as cell surrogates. This study shows a single-particle encapsulation efficiency P(k=1) of 83.7% and a double-particle encapsulation efficiency P(k=2) of 79.5%, compared with their respective Poisson efficiencies of 39.3% and 33.3%. The effect of consistent cell and particle concentration is demonstrated to be of major importance for efficient encapsulation, and dripping-to-jetting transitions are also addressed. Aqueous cell suspensions in continuous media share a common fluid environment, which allows cells to interact in parallel and also homogenizes the effects of specific cells in measurements from the media. High-throughput encapsulation of cells into picoliter-scale drops confines the samples to protect drops from cross-contamination, enable a measure of cellular diversity within samples, prevent dilution of reagents and expressed biomarkers, and amplify
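
    For comparison with the ordered-encapsulation efficiencies above, the random-loading limit follows directly from the Poisson probability mass function for a given mean number of particles per drop. A small sketch is shown below; the mean occupancy is a hypothetical value, and published definitions of encapsulation efficiency vary.

        import math

        def poisson_pmf(k, lam):
            """Probability that a drop contains exactly k particles when loading is random."""
            return math.exp(-lam) * lam ** k / math.factorial(k)

        # Hypothetical mean occupancy (particles per drop); at lam = 1 the single-particle
        # fraction peaks near e**-1, well below what ordered encapsulation can achieve.
        lam = 1.0
        for k in (0, 1, 2):
            print(f"P(k={k}) = {poisson_pmf(k, lam):.3f}")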

  10. High-throughput combinatorial chemical bath deposition: The case of doping Cu (In, Ga) Se film with antimony

    Science.gov (United States)

    Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong

    2018-01-01

    Conventional wet-process methods for designing and preparing thin films remain a bottleneck because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin-film preparation based on chemical bath deposition (CBD). The method is well suited to preparing combinatorial libraries of materials that have low decomposition temperatures and high water or oxygen sensitivity at relatively high temperatures. To validate the system, a Cu(In, Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% antimony (Sb) was taken as an example to systematically evaluate the effect of Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with energy-dispersive spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS), automated X-ray diffraction (XRD) for rapid screening and localized electrochemical impedance spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can systematically identify the composition with the optimal grain orientation, microstructure and electrical properties by accurately monitoring the doping content and material composition. Based on the characterization results, a model of Sb2Se3 quasi-liquid-phase-promoted CIGS film growth is put forward. In addition to the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin-film material systems.

  11. Delivering high performance BWR fuel reliably

    Energy Technology Data Exchange (ETDEWEB)

    Schardt, J.F. [GE Nuclear Energy, Wilmington, NC (United States)

    1998-07-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  12. Vector soup: high-throughput identification of Neotropical phlebotomine sand flies using metabarcoding.

    Science.gov (United States)

    Kocher, Arthur; Gantier, Jean-Charles; Gaborit, Pascal; Zinger, Lucie; Holota, Helene; Valiere, Sophie; Dusfour, Isabelle; Girod, Romain; Bañuls, Anne-Laure; Murienne, Jerome

    2017-03-01

    Phlebotomine sand flies are haematophagous dipterans of primary medical importance. They represent the only proven vectors of leishmaniasis worldwide and are involved in the transmission of various other pathogens. Studying the ecology of sand flies is crucial to understand the epidemiology of leishmaniasis and further control this disease. A major limitation in this regard is that traditional morphological-based methods for sand fly species identifications are time-consuming and require taxonomic expertise. DNA metabarcoding holds great promise in overcoming this issue by allowing the identification of multiple species from a single bulk sample. Here, we assessed the reliability of a short insect metabarcode located in the mitochondrial 16S rRNA for the identification of Neotropical sand flies, and constructed a reference database for 40 species found in French Guiana. Then, we conducted a metabarcoding experiment on sand flies mixtures of known content and showed that the method allows an accurate identification of specimens in pools. Finally, we applied metabarcoding to field samples caught in a 1-ha forest plot in French Guiana. Besides providing reliable molecular data for species-level assignations of phlebotomine sand flies, our study proves the efficiency of metabarcoding based on the mitochondrial 16S rRNA for studying sand fly diversity from bulk samples. The application of this high-throughput identification procedure to field samples can provide great opportunities for vector monitoring and eco-epidemiological studies. © 2016 John Wiley & Sons Ltd.

  13. PRIMEGENSw3: a web-based tool for high-throughput primer and probe design.

    Science.gov (United States)

    Kushwaha, Garima; Srivastava, Gyan Prakash; Xu, Dong

    2015-01-01

    Highly specific and efficient primer and probe design has been a major hurdle in many high-throughput techniques. Successful implementation of any PCR or probe hybridization technique depends on the quality of the primers and probes used, in terms of their specificity and cross-hybridization. Here we describe PRIMEGENSw3, a set of web-based utilities for high-throughput primer and probe design. These utilities allow users to select genomic regions and to design primers/probes for the selected regions in an interactive, user-friendly, and automatic fashion. The system runs the PRIMEGENS algorithm in the back-end on a high-performance server, using the stored genomic database or a user-provided custom database for cross-hybridization checks. Cross-hybridization is checked not only with BLAST but also by examining mismatch positions and calculating the hybridization energy of potential hits. The results can be visualized online and can also be downloaded. The average success rate of primer design using PRIMEGENSw3 is ~90%. The web server also supports primer design for methylated sequences, which is used in epigenetic studies. A stand-alone version of the software is also available for download from the website.

  14. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    Science.gov (United States)

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the amount of template before performing the dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  15. Introduction of a high-throughput double-stent animal model for the evaluation of biodegradable vascular stents.

    Science.gov (United States)

    Borinski, Mauricio; Flege, Christian; Schreiber, Fabian; Krott, Nicole; Gries, Thomas; Liehn, Elisa; Blindt, Rüdiger; Marx, Nikolaus; Vogt, Felix

    2012-11-01

    Current stent system efficacy for the treatment of coronary artery disease is hampered by in-stent restenosis (ISR) rates of up to 20% in certain high-risk settings and by the risk of stent thrombosis, which is characterized by a high mortality rate. In theory, biodegradable vascular devices exhibit crucial advantages. Most absorbable implant materials are based on poly-L-lactic acid (PLLA) owing to its mechanical properties; however, PLLA might induce an inflammatory reaction in the vessel wall. Evaluation of biodegradable implant efficacy includes a long-term examination of tissue response; therefore, a simple in vivo tool for thorough biocompatibility and biodegradation evaluation would facilitate future stent system development. Rats have been used for the study of in vivo degradation processes, and stent implantation into the abdominal aorta of rats is a proven model for stent evaluation. Here, we report the transformation of the porcine double-stent animal model into the high-throughput rat abdominal aorta model. As genetic manipulation of rats was introduced recently, this novel method presents a powerful tool for future in vivo biodegradable candidate stent biocompatibility and biodegradation characterization in a reliable simple model of coronary ISR. Copyright © 2012 Wiley Periodicals, Inc.

  16. A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.

    2010-01-01

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer-activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and a decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because selections can test far larger numbers of enzyme variants than existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries created by mutagenesis or obtained from natural biodiversity. PMID:19053460

  17. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen) genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. The approach first selects sequences containing the SNPs present in all alleles of the locus of interest. It then calculates, for each candidate probe sequence, the number of base changes necessary to convert it into the closest subsequence within the set of sequences likely to be present in the sample, including the remainder of the human genome, in order to identify those candidate probes that are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, preliminary steps such as PCR amplification may no longer be necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest-resolution typing can be achieved at the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
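
    The "ultraspecificity" criterion described above boils down to computing, for each candidate probe, the minimum number of base changes separating it from any same-length subsequence of the off-target background. A minimal sketch of that scoring step is given below; the brute-force search, function names and threshold are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation): score candidate HLA
# probes by their minimum Hamming distance to any same-length window of an
# off-target background sequence. Brute force for clarity; a real pipeline
# would index the background (e.g., the rest of the human genome).

def min_mismatches_to_background(probe: str, background: str) -> int:
    """Fewest base changes converting `probe` into any equal-length
    subsequence of `background`."""
    k = len(probe)
    best = k
    for i in range(len(background) - k + 1):
        window = background[i:i + k]
        mismatches = sum(1 for a, b in zip(probe, window) if a != b)
        best = min(best, mismatches)
        if best == 0:
            break
    return best

def is_ultraspecific(probe: str, background: str, min_distance: int = 3) -> bool:
    """Keep a probe only if every off-target hit needs >= min_distance changes."""
    return min_mismatches_to_background(probe, background) >= min_distance

if __name__ == "__main__":
    background = "ACGTACGTTGCAACGGATCCTTAGC"
    print(is_ultraspecific("TTGCAACGGAT", background))  # False: exact off-target match
    print(is_ultraspecific("AAAAAAAAAAA", background))  # True: far from every window
```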

  18. Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.

    Science.gov (United States)

    Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai

    2018-03-01

    With a growing population and shrinking arable land, breeding is considered an effective way to address the food crisis. As an important part of breeding, high-throughput phenotyping can effectively accelerate the breeding process. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately, and has great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrates a LiDAR sensor, a high-resolution camera, a thermal camera and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data over the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we describe the design, functions and testing results of the Crop 3D platform, and briefly discuss the potential applications and future development of the platform in phenotyping. We conclude that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.

  19. In-situ nanoelectrospray for high-throughput screening of enzymes and real-time monitoring of reactions.

    Science.gov (United States)

    Yang, Yuhan; Han, Feifei; Ouyang, Jin; Zhao, Yunling; Han, Juan; Na, Na

    2016-01-01

    The in-situ, high-throughput evaluation of enzymes and real-time monitoring of enzyme-catalyzed reactions in the liquid phase are of great significance to the catalysis industry. In-situ nanoelectrospray, a direct sampling and ionization method for mass spectrometry, has been applied to the high-throughput evaluation of enzymes as well as the on-line monitoring of reactions. By simply inserting a capillary into a liquid system and applying a high voltage, analytes in the liquid reaction system can be ionized directly at the capillary tip with very small volume consumption. With no sample pre-treatment or injection procedure, different analytes such as saccharides, amino acids, alkaloids, peptides and proteins can be rapidly and directly extracted from the liquid phase and ionized at the capillary tip. Taking the irreversible transesterification of vinyl acetate and ethanol as an example, this technique has been used for the high-throughput evaluation of enzymes, fast optimization, and real-time monitoring of reactions catalyzed by different enzymes. In addition, it is even softer than traditional electrospray ionization. The present method can also be used for monitoring other homogeneous and heterogeneous reactions in liquid phases, showing potential for the catalysis industry. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    Science.gov (United States)

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
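
    To illustrate the kind of structured export GlycoExtractor performs, the sketch below writes a set of processed glycan-profile records (peak number, peak area, glucose-unit value) to JSON and CSV in one pass; the field names and values are hypothetical, not the tool's actual schema.

```python
# Illustrative sketch only: serialize processed HPLC glycan-profile data
# (peak number, peak area, glucose-unit value) to JSON and CSV in one pass.
# Field names and values are hypothetical, not GlycoExtractor's real schema.
import csv
import json

profiles = [
    {"sample": "S1", "peak": 1, "area": 12345.6, "gu": 5.42},
    {"sample": "S1", "peak": 2, "area": 9876.1,  "gu": 6.10},
    {"sample": "S2", "peak": 1, "area": 11002.3, "gu": 5.40},
]

# One structured file per format, rather than a collection of disconnected files.
with open("profiles.json", "w") as fh:
    json.dump(profiles, fh, indent=2)

with open("profiles.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["sample", "peak", "area", "gu"])
    writer.writeheader()
    writer.writerows(profiles)
```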

  1. High-throughput computational search for strengthening precipitates in alloys

    International Nuclear Information System (INIS)

    Kirklin, S.; Saal, James E.; Hegde, Vinay I.; Wolverton, C.

    2016-01-01

    The search for high-strength alloys and precipitation hardened systems has largely been accomplished through Edisonian trial and error experimentation. Here, we present a novel strategy using high-throughput computational approaches to search for promising precipitate/alloy systems. We perform density functional theory (DFT) calculations of an extremely large space of ∼200,000 potential compounds in search of effective strengthening precipitates for a variety of different alloy matrices, e.g., Fe, Al, Mg, Ni, Co, and Ti. Our search strategy involves screening phases that are likely to produce coherent precipitates (based on small lattice mismatch) and are composed of relatively common alloying elements. When combined with the Open Quantum Materials Database (OQMD), we can computationally screen for precipitates that either have a stable two-phase equilibrium with the host matrix, or are likely to precipitate as metastable phases. Our search produces (for the structure types considered) nearly all currently known high-strength precipitates in a variety of fcc, bcc, and hcp matrices, thus giving us confidence in the strategy. In addition, we predict a number of new, currently-unknown precipitate systems that should be explored experimentally as promising high-strength alloy chemistries.
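
    The screening logic described above can be pictured as a short filter over a table of candidate phases: keep only phases built from common alloying elements whose lattice parameter lies close to the host matrix's. The sketch below is illustrative only; the mismatch threshold, data fields and toy candidate list are assumptions, not the values or workflow used with the OQMD.

```python
# Illustrative sketch: screen candidate precipitate phases for an fcc-Al matrix
# by (i) small lattice mismatch against the host and (ii) composition limited to
# common alloying elements. Threshold and toy candidate data are assumptions.

COMMON_ELEMENTS = {"Al", "Cu", "Mg", "Si", "Zn", "Zr", "Sc", "Li", "Ni", "Fe", "Ti"}
HOST_LATTICE_A = 4.046            # fcc Al lattice parameter, angstroms

candidates = [
    {"formula": "Al3Sc", "elements": {"Al", "Sc"}, "lattice_a": 4.103},
    {"formula": "Al3Zr", "elements": {"Al", "Zr"}, "lattice_a": 4.093},
    {"formula": "XyZ99", "elements": {"Xy", "Z"},  "lattice_a": 4.050},  # exotic elements, rejected
]

def lattice_mismatch(a_precipitate: float, a_host: float) -> float:
    return abs(a_precipitate - a_host) / a_host

hits = [
    c for c in candidates
    if c["elements"] <= COMMON_ELEMENTS                                # common alloying elements only
    and lattice_mismatch(c["lattice_a"], HOST_LATTICE_A) < 0.05        # likely coherent with the host
]
print([c["formula"] for c in hits])   # ['Al3Sc', 'Al3Zr']
```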

  2. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    Science.gov (United States)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A mathematical reliability model for the high-temperature, high-pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of the key critical components and the fatigue sensitivity curve of each component are calculated and analyzed by combining a fatigue-life analysis of the control valve with the reliability model, and the contribution of each component to fatigue failure of the control valve system is obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life expectancy of the main pressure-bearing parts meets the technical requirements, that the valve body and the sleeve have an obvious influence on control system reliability, and that stress concentration in key parts of the control valve can be reduced during the design process by improving the structure.
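
    For readers unfamiliar with stress-strength interference, the core quantity is the probability that strength exceeds stress; for independent, normally distributed stress and strength this has a closed form, and a temperature correction coefficient can be applied to the fatigue limit before evaluating it. The sketch below is a generic illustration with made-up numbers, not the paper's model of the valve.

```python
# Illustrative sketch of stress-strength interference reliability:
# R = P(strength > stress). For independent normal stress s ~ N(mu_s, sd_s)
# and strength S ~ N(mu_S, sd_S):  R = Phi((mu_S - mu_s) / sqrt(sd_S**2 + sd_s**2)).
# A temperature correction factor k_T derates the fatigue limit (strength mean)
# before the calculation. All numbers below are made up for illustration.
from math import sqrt
from scipy.stats import norm

def reliability(mu_strength, sd_strength, mu_stress, sd_stress, k_temperature=1.0):
    mu_corrected = k_temperature * mu_strength        # derated fatigue limit at temperature
    z = (mu_corrected - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return norm.cdf(z)

# Example: a 420 MPa fatigue limit derated by 0.85 at operating temperature,
# against a 250 MPa mean operating stress.
print(round(reliability(420.0, 35.0, 250.0, 30.0, k_temperature=0.85), 4))
```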

  3. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work aims at delivering a high simulation throughput and, at the same time, guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator can attain a simulation speed within a factor of 35 of actual hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve high accuracy after hardware-based calibration. Experimental results on a set of mobile applications show that the difference between the simulated and measured timing performance is within 10%, an accuracy that previously could only be attained by cycle-accurate models.

  4. High-throughput machining using high average power ultrashort pulse lasers and ultrafast polygon scanner

    Science.gov (United States)

    Schille, Joerg; Schneider, Lutz; Streek, André; Kloetzer, Sascha; Loeschner, Udo

    2016-03-01

    In this paper, high-throughput ultrashort pulse laser machining is investigated on various industrial-grade metals (Aluminium, Copper, Stainless steel) and Al2O3 ceramic at unprecedented processing speeds. This is achieved by using a high pulse repetition frequency picosecond laser with a maximum average output power of 270 W in conjunction with a unique, in-house developed two-axis polygon scanner. Initially, different concepts of polygon scanners are engineered and tested to find the optimal architecture for ultrafast and precise laser beam scanning. A remarkable scan speed of 1,000 m/s is achieved on the substrate, and, thanks to the resulting low pulse overlap, thermal accumulation and plasma absorption effects are avoided at pulse repetition frequencies of up to 20 MHz. In order to identify optimum processing conditions for efficient high-average-power laser machining, the depths of cavities produced under varied parameter settings are analyzed and, from the results obtained, the characteristic removal values are specified. Maximum removal rates as high as 27.8 mm3/min for Aluminium, 21.4 mm3/min for Copper, 15.3 mm3/min for Stainless steel and 129.1 mm3/min for Al2O3 are achieved when the full available laser power is applied at the optimum pulse repetition frequency.

  5. Breeding high yielding, high protein spring wheats: Problems, progress and approaches to further advances

    International Nuclear Information System (INIS)

    Konzak, C.F.; Rubenthaler, G.L.

    1984-01-01

    Preliminary data offer promise that advances have been made in breeding hard red spring wheat selections with a yielding capacity about equal to current cultivars and with an increased capacity for producing high protein grain. The most promising new selections are derivatives of Magnif 41M1, CI17689, a semi-dwarf mutant of an Argentinian high protein cultivar. Rapid changes in disease and pest problems also required immediate attention and a reorientation of breeding materials and goals. Selection procedures suggested as promising include early generation (F2 and F3) screening for disease resistance and agronomic type, with screening for protein content delayed until F4 or F5. Cultural conditions conducive for expressing the highest yield capacity are proposed as optimum for identifying those selections also able to produce high protein grain. A goal of routine production of 14.5% (or higher) protein grain is considered necessary and achievable under fertility management conditions required for maximum yield expression of agronomically competitive cultivars. Agronomically improved sources of high protein genes, an increasing number of induced high protein mutants, and numerous high protein crossbred derivatives of T. dicoccoides and Aegilops species have recently become available. These new or improved germplasm sources as well as a considerable reserve of yet untapped germplasm variability in other accessions of wild T. dicoccoides offer increased optimism that further, rapid advances in the breeding of adapted high yielding, high protein wheats are achievable. Improved breeding schemes, using induced male sterility mutants either to aid in crossing or to develop male sterile facilitated recurrent selection (MSFRS) populations, should contribute towards an earlier achievement of the desired goal while providing the basis for buffering against rapid changes in disease and pest problems

  6. Column Grid Array Rework for High Reliability

    Science.gov (United States)

    Mehta, Atul C.; Bodie, Charles C.

    2008-01-01

    Due to requirements for reduced size and weight, use of grid array packages in space applications has become commonplace. To meet the requirements of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next Mars Rover mission (specifically high-density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed to be very high because of last-minute discoveries in final test that dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high-reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package, and how many times can the same board be reworked without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle, the PWB was modified to provide a daisy chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test vehicle boards were assembled, reworked and then thermally cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and the results of thermal cycling are presented in this paper.

  7. High yielding mutants of blackgram variety 'PH-25'

    International Nuclear Information System (INIS)

    Misra, R.C.; Mohapatra, B.D.; Panda, B.S.

    2001-01-01

    Seeds of blackgram (Vigna mungo L.) variety 'PH-5' were treated with the chemical mutagens ethyl methanesulfonate (EMS), nitrosoguanidine (NG), maleic hydrazide (MH) and sodium azide (NaN3), each at 3 different concentrations. Thirty-six mutant lines developed from the mutagenic treatments were tested, along with the parent varieties, in the M4 generation. The mutants showed wide variation in most of the traits, and multivariate D2 analysis showed genetic divergence among them. Twenty of the thirty mutants showed genetic divergence from the parent. Ten selected high yielding mutants were tested in M5. Yield and other productive traits of five high yielding mutants in M4 and M5 are presented

  8. High lysine and high yielding mutants in wheat (Triticum aestivum) L

    International Nuclear Information System (INIS)

    Mohammad, T.; Mahmood, F.; Ahmad, A.; Sattar, A.; Khan, I.

    1988-01-01

    The dry seeds of the variety Lu-26 were irradiated with 20 krad of gamma rays. In M2 about 300 mutant plants were selected for short stature, rust resistance and other desirable traits. As a result of further selection, in M6, eight superior lines were finally identified. The agronomic characteristics of these mutants, the parent variety (Lu-26) and a standard check variety (Pak-81) are shown. The selected mutants and commercially grown cultivars (Lu-26 and Pak-81) were studied for total protein content and amino acid pattern. The mutants WM-89-1, WM-6-17 and WM-81-2 showing high yield also contained comparatively high amounts of methionine and lysine. The lysine contents were 565, 410, and 370 mg/100g in the mutants WM-89-1, WM-6-17 and WM-81-2, respectively against a range value of 210-370 mg/100g in other mutants and 250-320 in the commercial cultivars. The mutant WM-81-2 was comparable to WM-56-1-5 in lysine content. The results of these experiments show a possibility of developing varieties having high yield and high amounts of essential amino acids such as lysine and methionine

  9. EMBRYONIC VASCULAR DISRUPTION ADVERSE OUTCOMES: LINKING HIGH THROUGHPUT SIGNALING SIGNATURES WITH FUNCTIONAL CONSEQUENCES

    Science.gov (United States)

    Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...

  10. Bacterial Pathogens and Community Composition in Advanced Sewage Treatment Systems Revealed by Metagenomics Analysis Based on High-Throughput Sequencing

    Science.gov (United States)

    Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying

    2015-01-01

    This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the Arcobacter genus occupied over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real-time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency mainly depended on the oxidation ditch. Compared with sand filtration, magnetic resin seemed to achieve higher removal of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves concern. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence. PMID:25938416

  11. Quantification of rapid Myosin regulatory light chain phosphorylation using high-throughput in-cell Western assays: comparison to Western immunoblots.

    Directory of Open Access Journals (Sweden)

    Hector N Aguilar

    2010-04-01

    Full Text Available Quantification of phospho-proteins (PPs) is crucial when studying cellular signaling pathways. Western immunoblotting (WB) is commonly used for the measurement of relative levels of signaling intermediates in experimental samples. However, WB is in general a labour-intensive and low-throughput technique. Because of variability in protein yield and phospho-signal preservation during protein harvesting, and potential loss of antigen during protein transfer, WB provides only semi-quantitative data. By comparison, the "in-cell western" (ICW) technique has high-throughput capacity and requires less extensive sample preparation. Thus, we compared the ICW technique to WB for measuring phosphorylated myosin regulatory light chain (PMLC20) in primary cultures of uterine myocytes to assess their relative specificity, sensitivity, precision, and quantification of biologically relevant responses. ICWs are cell-based microplate assays for quantification of protein targets in their cellular context. ICWs utilize a two-channel infrared (IR) scanner (Odyssey®) to quantify signals arising from near-infrared (NIR) fluorophores conjugated to secondary antibodies. One channel is dedicated to measuring the protein of interest and the second is used for data normalization of the signal in each well of the microplate. Using uterine myocytes, we assessed oxytocin (OT)-stimulated MLC20 phosphorylation measured by ICW and WB, both using NIR fluorescence. ICW and WB data were comparable regarding signal linearity, signal specificity, and time course of phosphorylation response to OT. ICW and WB yield comparable biological data. The advantages of ICW over WB are its high-throughput capacity, improved precision, and reduced sample preparation requirements. ICW might provide better sensitivity and precision with low-quantity samples or for protocols requiring large numbers of samples. These features make the ICW technique an excellent tool for the study of phosphorylation endpoints.
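
    The per-well normalization step described above (target phospho-signal divided by the normalization-channel signal, then expressed relative to untreated controls) can be sketched as follows; the well labels and intensities are hypothetical, not data from the study.

```python
# Illustrative sketch of in-cell western (ICW) quantification: the target-channel
# signal in each well is divided by the normalization-channel signal, then
# expressed as fold change over the mean of untreated control wells.
# Well labels and intensities are hypothetical.

wells = {
    "control_1": (1520.0, 9800.0),   # (target channel, normalization channel)
    "control_2": (1490.0, 9650.0),
    "OT_1":      (4150.0, 9900.0),
    "OT_2":      (4020.0, 9500.0),
}

normalized = {well: target / norm for well, (target, norm) in wells.items()}
controls = [v for well, v in normalized.items() if well.startswith("control")]
control_mean = sum(controls) / len(controls)

for well, value in normalized.items():
    print(f"{well}: {value / control_mean:.2f}-fold over control")
```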

  12. A method for high throughput bioelectrochemical research based on small scale microbial electrolysis cells

    KAUST Repository

    Call, Douglas F.; Logan, Bruce E.

    2011-01-01

    There is great interest in studying exoelectrogenic microorganisms, but existing methods can require expensive electrochemical equipment and specialized reactors. We developed a simple system for conducting high throughput bioelectrochemical

  13. On the optimal trimming of high-throughput mRNA sequence data

    Directory of Open Access Journals (Sweden)

    Matthew D MacManes

    2014-01-01

    Full Text Available The widespread and rapid adoption of high-throughput sequencing technologies has afforded researchers the opportunity to gain a deep understanding of genome level processes that underlie evolutionary change, and perhaps more importantly, the links between genotype and phenotype. In particular, researchers interested in functional biology and adaptation have used these technologies to sequence mRNA transcriptomes of specific tissues, which in turn are often compared to other tissues, or other individuals with different phenotypes. While these techniques are extremely powerful, careful attention to data quality is required. In particular, because high-throughput sequencing is more error-prone than traditional Sanger sequencing, quality trimming of sequence reads should be an important step in all data processing pipelines. While several software packages for quality trimming exist, no general guidelines for the specifics of trimming have been developed. Here, using empirically derived sequence data, I provide general recommendations regarding the optimal strength of trimming, specifically in mRNA-Seq studies. Although very aggressive quality trimming is common, this study suggests that a more gentle trimming, specifically of those nucleotides whose Phred score < 2 or < 5, is optimal for most studies across a wide variety of metrics.
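
    A minimal sketch of the gentle 3'-end quality trimming recommended above (removing trailing bases whose Phred score falls below a low threshold such as 5), assuming standard Phred+33 FASTQ encoding; this is an illustration, not the specific trimming tool evaluated in the study.

```python
# Illustrative sketch: gentle 3'-end quality trimming of a FASTQ record.
# Trailing bases with Phred quality below the threshold (here Q5) are removed.
# Assumes Phred+33 ASCII encoding; not the specific tool evaluated in the study.

def trim_3prime(seq: str, qual: str, min_q: int = 5) -> tuple:
    phred = [ord(c) - 33 for c in qual]
    end = len(seq)
    while end > 0 and phred[end - 1] < min_q:
        end -= 1
    return seq[:end], qual[:end]

if __name__ == "__main__":
    seq = "ACGTACGTACGTAATT"
    qual = "IIIIIIIIIIIIII#!"          # last two bases are Q2 and Q0
    print(trim_3prime(seq, qual, min_q=5))   # ('ACGTACGTACGTAA', 'IIIIIIIIIIIIII')
```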

  14. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  15. Combining Amplification Typing of L1 Active Subfamilies (ATLAS) with High-Throughput Sequencing.

    Science.gov (United States)

    Rahbari, Raheleh; Badge, Richard M

    2016-01-01

    With the advent of new generations of high-throughput sequencing technologies, the catalog of human genome variants created by retrotransposon activity is expanding rapidly. However, despite these advances in describing L1 diversity, and the fact that L1 must retrotranspose in the germline or prior to germline partitioning to be evolutionarily successful, direct assessment of de novo L1 retrotransposition in the germline or early embryogenesis has not been achieved for endogenous L1 elements. A direct study of de novo L1 retrotransposition into susceptible loci within sperm DNA (Freeman et al., Hum Mutat 32(8):978-988, 2011) suggested that the rate of L1 retrotransposition in the germline is much lower than previously estimated. The amplification typing of L1 active subfamilies (ATLAS) L1 display technique (Badge et al., Am J Hum Genet 72(4):823-838, 2003) can be used to investigate de novo L1 retrotransposition in human genomes. In this chapter, we describe how we combined a high-coverage ATLAS variant with high-throughput sequencing, achieving 11-25× sequence depth per single amplicon, to study L1 retrotransposition in whole genome amplified (WGA) DNAs.

  16. Induction of high yielding and high protein containing chickpea mutant variety through gamma radiation

    International Nuclear Information System (INIS)

    Hassan, S.; Javed, M.A.; Khan, A.J.; Tariq, M.

    1997-01-01

    Pure seeds of a blight-susceptible but high yielding chickpea variety, 6153, were irradiated with a 20 kR (0.2 kGy) dose of gamma radiation, and the mutant line CMN-446-4 was selected in the M3 generation on the basis of high yield and disease resistance. After confirmation of its resistance to blight in M4 and M5, the mutant line CMN-446-4 was evaluated, along with other promising chickpea mutants, in various yield trials at different locations. The mutant line CMN-446-4 was also evaluated in the chickpea national uniform yield trial conducted at two locations in the country during 1993-94. On average, the mutant line ranked third, producing a significantly higher yield of 1528 kg/ha compared with the two check varieties Punjab-91 and Paidar-91, which yielded 1316 and 1391 kg/ha, respectively. The mutant CMN-446-4 also has a significantly higher protein content (25.22%) than its parental variety (20.12%). (author)

  17. High-Throughput Screening Using Mass Spectrometry within Drug Discovery.

    Science.gov (United States)

    Rohman, Mattias; Wingfield, Jonathan

    2016-01-01

    In order to detect a biochemical analyte with a mass spectrometer (MS), it is necessary to ionize the analyte of interest. The analyte can be ionized by a number of different mechanisms; however, one common method is electrospray ionization (ESI). Droplets of analyte are sprayed through a highly charged field, the droplets pick up charge, and this is transferred to the analyte. High levels of salt in the assay buffer will potentially steal charge from the analyte and suppress the MS signal. In order to avoid this suppression of signal, salt is often removed from the sample prior to injection into the MS. Traditional ESI MS relies on liquid chromatography (LC) to remove the salt and reduce matrix effects; however, this is a lengthy process. Here we describe the use of RapidFire™ coupled to a triple-quadrupole MS for high-throughput screening. This system uses solid-phase extraction to de-salt samples prior to injection, reducing processing time such that a sample is injected into the MS approximately every 10 s.

  18. High-throughput liquid-absorption air-sampling apparatus and methods

    Science.gov (United States)

    Zaromb, Solomon

    2000-01-01

    A portable high-throughput liquid-absorption air sampler [PHTLAAS] has an asymmetric air inlet through which air is drawn upward by a small and light-weight centrifugal fan driven by a direct current motor that can be powered by a battery. The air inlet is so configured as to impart both rotational and downward components of motion to the sampled air near said inlet. The PHTLAAS comprises a glass tube of relatively small size through which air passes at a high rate in a swirling, highly turbulent motion, which facilitates rapid transfer of vapors and particulates to a liquid film covering the inner walls of the tube. The pressure drop through the glass tube is 20% for vapors or airborne particulates in the 2-3 μm range and >50% for particles larger than 4 μm. In conjunction with various analyzers, the PHTLAAS can serve to monitor a variety of hazardous or illicit airborne substances, such as lead-containing particulates, tritiated water vapor, biological aerosols, or traces of concealed drugs or explosives.

  19. High-throughput liquid-absorption air-sampling apparatus and methods

    International Nuclear Information System (INIS)

    2000-01-01

    A portable high-throughput liquid-absorption air sampler [PHTLAAS] has an asymmetric air inlet through which air is drawn upward by a small and light-weight centrifugal fan driven by a direct current motor that can be powered by a battery. The air inlet is so configured as to impart both rotational and downward components of motion to the sampled air near said inlet. The PHTLAAS comprises a glass tube of relatively small size through which air passes at a high rate in a swirling, highly turbulent motion, which facilitates rapid transfer of vapors and particulates to a liquid film covering the inner walls of the tube. The pressure drop through the glass tube is 20% for vapors or airborne particulates in the 2-3 micron range and >50% for particles larger than 4 microns. In conjunction with various analyzers, the PHTLAAS can serve to monitor a variety of hazardous or illicit airborne substances, such as lead-containing particulates, tritiated water vapor, biological aerosols, or traces of concealed drugs or explosives

  20. Development of high-throughput analysis system using highly-functional organic polymer monoliths

    International Nuclear Information System (INIS)

    Umemura, Tomonari; Kojima, Norihisa; Ueki, Yuji

    2008-01-01

    The growing demand for high-throughput analysis in the current competitive life sciences and industries has promoted the development of high-speed HPLC techniques and tools. As one such tool, monolithic columns have attracted increasing attention and interest in the last decade due to their low flow resistance and excellent mass transfer, allowing for rapid separations and reactions at high flow rates with minimal loss of column efficiency. Monolithic materials are classified into two main groups: silica- and organic polymer-based monoliths, each with their own advantages and disadvantages. Organic polymer monoliths have several distinct advantages in life-science research, including wide pH stability, less irreversible adsorption, and facile preparation and modification. Thus, we have so far tried to develop organic polymer monoliths for various chemical operations, such as separation, extraction, preconcentration, and reaction. In the present paper, recent progress in the development of organic polymer monoliths is discussed. In particular, the procedure for the preparation of methacrylate-based monoliths with various functional groups is described, where the influence of different compositional and processing parameters on the monolithic structure is also addressed. Furthermore, the performance of the produced monoliths is demonstrated through the results for (1) rapid separations of alkylbenzenes at high flow rates, (2) flow-through enzymatic digestion of cytochrome c on a trypsin-immobilized monolithic column, and (3) separation of the tryptic digest on a reversed-phase monolithic column. The flexibility and versatility of organic polymer monoliths will be beneficial for further enhancing analytical performance, and will open the way for new applications and opportunities both in scientific and industrial research. (author)

  1. Operational evaluation of high-throughput community-based mass prophylaxis using Just-in-time training.

    Science.gov (United States)

    Spitzer, James D; Hupert, Nathaniel; Duckart, Jonathan; Xiong, Wei

    2007-01-01

    Community-based mass prophylaxis is a core public health operational competency, but staffing needs may overwhelm the local trained health workforce. Just-in-time (JIT) training of emergency staff and computer modeling of workforce requirements represent two complementary approaches to address this logistical problem. Multnomah County, Oregon, conducted a high-throughput point of dispensing (POD) exercise to test JIT training and computer modeling to validate POD staffing estimates. The POD had 84% non-health-care worker staff and processed 500 patients per hour. Post-exercise modeling replicated observed staff utilization levels and queue formation, including development and amelioration of a large medical evaluation queue caused by lengthy processing times and understaffing in the first half-hour of the exercise. The exercise confirmed the feasibility of using JIT training for high-throughput antibiotic dispensing clinics staffed largely by nonmedical professionals. Patient processing times varied over the course of the exercise, with important implications for both staff reallocation and future POD modeling efforts. Overall underutilization of staff revealed the opportunity for greater efficiencies and even higher future throughputs.
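
    The kind of capacity modeling referred to above can be illustrated with simple steady-state arithmetic: the staff load at each POD station is the arrival rate multiplied by the per-patient processing time, and a queue grows whenever utilization exceeds one. The station names, rates and service times below are hypothetical, not data from the exercise.

```python
# Illustrative sketch of point-of-dispensing (POD) staffing arithmetic:
# at steady state a station needs at least (arrival rate x service time) staff,
# and a queue grows whenever utilization exceeds 1. Station names, rates and
# service times are hypothetical, not the Multnomah County exercise data.

ARRIVALS_PER_HOUR = 500.0

stations = {                       # mean minutes of staff time per patient
    "greeting/triage":    0.5,
    "medical evaluation": 2.0,
    "drug dispensing":    1.0,
}

for name, minutes in stations.items():
    required = ARRIVALS_PER_HOUR * minutes / 60.0   # offered load in staff-equivalents
    staffed = round(required)                       # e.g., staffing to the average demand
    utilization = required / staffed if staffed else float("inf")
    print(f"{name:20s} load={required:5.2f} staff, "
          f"utilization with {staffed} staff = {utilization:.2f}")
```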

  2. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection tool. The spot detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.

  3. Multiple and high-throughput droplet reactions via combination of microsampling technique and microfluidic chip

    KAUST Repository

    Wu, Jinbo

    2012-11-20

    Microdroplets offer unique compartments for accommodating a large number of chemical and biological reactions in tiny volume with precise control. A major concern in droplet-based microfluidics is the difficulty to address droplets individually and achieve high throughput at the same time. Here, we have combined an improved cartridge sampling technique with a microfluidic chip to perform droplet screenings and aggressive reaction with minimal (nanoliter-scale) reagent consumption. The droplet composition, distance, volume (nanoliter to subnanoliter scale), number, and sequence could be precisely and digitally programmed through the improved sampling technique, while sample evaporation and cross-contamination are effectively eliminated. Our combined device provides a simple model to utilize multiple droplets for various reactions with low reagent consumption and high throughput. © 2012 American Chemical Society.

  4. Development of Microfluidic Systems Enabling High-Throughput Single-Cell Protein Characterization

    OpenAIRE

    Fan, Beiyuan; Li, Xiufeng; Chen, Deyong; Peng, Hongshang; Wang, Junbo; Chen, Jian

    2016-01-01

    This article reviews recent developments in microfluidic systems enabling high-throughput characterization of single-cell proteins. Four key perspectives of microfluidic platforms are included in this review: (1) microfluidic fluorescent flow cytometry; (2) droplet based microfluidic flow cytometry; (3) large-array micro wells (microengraving); and (4) large-array micro chambers (barcode microchips). We examine the advantages and limitations of each technique and discuss future research oppor...

  5. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte-scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  6. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and total volume of data continues increasing. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte-scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
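
    A back-of-the-envelope version of the dimensioning problem described in this and the preceding record is estimating the aggregate storage throughput needed to keep a given population of job slots from starving on input data; the job-profile numbers below are assumptions for illustration only.

```python
# Illustrative sketch: estimate the aggregate disk throughput a storage service
# must sustain so that an HTC farm's job slots are not starved of input data.
# The job profile (slot count, per-job I/O rate, efficiency target) is assumed.

JOB_SLOTS = 10_000                 # concurrently running batch jobs
MB_PER_SEC_PER_JOB = 2.5           # average input read rate of one job
CPU_EFFICIENCY_TARGET = 0.85       # fraction of wall time jobs should spend on CPU

# If storage cannot deliver the aggregate rate, jobs wait on I/O and efficiency drops.
aggregate_need = JOB_SLOTS * MB_PER_SEC_PER_JOB          # MB/s at steady state
with_headroom = aggregate_need / CPU_EFFICIENCY_TARGET   # allowance for bursts and overheads

print(f"steady-state demand : {aggregate_need / 1000:.1f} GB/s")
print(f"provisioned target  : {with_headroom / 1000:.1f} GB/s")
```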

  7. Frequency of cardiac arrhythmias in high and low- yielding dairy cows

    Directory of Open Access Journals (Sweden)

    Afshin Jafari Dehkordi

    2014-06-01

    Full Text Available Electrocardiography (ECG) may be used to recognize cardiac disorders. Levels of milk production may change the serum electrolytes, whose imbalance plays a role in cardiac arrhythmia. Fifty high-yielding and fifty low-yielding Holstein dairy cows were used in this study. Electrocardiograms were recorded with a base-apex lead, and blood samples were collected from the jugular vein for measurement of serum elements such as sodium, potassium, calcium, phosphorus, iron and magnesium. Cardiac dysrhythmias were detected more frequently in low-yielding Holstein cows (62.00%) than in high-yielding Holstein cows (46.00%). The cardiac dysrhythmias observed in low-yielding Holstein cows included sinus arrhythmia (34.70%), wandering pacemaker (22.45%), bradycardia (18.37%), tachycardia (10.20%), atrial premature beat (2.04%), sinoatrial block (2.04%), atrial fibrillation (8.16%) and atrial tachycardia (2.04%). The cardiac dysrhythmias observed in high-yielding Holstein cows included sinus arrhythmia (86.95%) and wandering pacemaker (13.05%). Also, a notched P wave was observed in 30% and 14% of high- and low-yielding Holstein cows, respectively. The serum calcium concentration of low-yielding Holstein cows was significantly lower than that of high-yielding Holstein cows. There was no detectable significant difference in the other serum elements between high- and low-yielding Holstein cows. Based on the results of the present study, it could be concluded that the low serum concentration of calcium results in more frequent dysrhythmias in low-yielding Holstein cows.

  8. High-throughput mouse genotyping using robotics automation.

    Science.gov (United States)

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.

  9. High throughput screening and profiling of high-value carotenoids from a wide diversity of bacteria in surface seawater.

    Science.gov (United States)

    Asker, Dalal

    2018-09-30

    Carotenoids are valuable natural colorants that exhibit numerous health-promoting properties, and thus are widely used in the food, feed, pharmaceutical and nutraceutical industries. In this study, we isolated and identified novel microbial sources that produce high-value carotenoids using high-throughput screening (HTS). A library of 701 pigmented microbial strains, including marine bacteria and red yeasts, was constructed. Carotenoid profiling using HPLC-DAD-MS methods identified 88 marine bacterial strains with potential for the production of high-value carotenoids, including astaxanthin (28 strains), zeaxanthin (21 strains), lutein (1 strain) and canthaxanthin (2 strains). A comprehensive 16S rRNA gene-based phylogenetic analysis revealed that these strains can be classified into 30 species belonging to five bacterial classes (Flavobacteriia, α-Proteobacteria, γ-Proteobacteria, Actinobacteria and Bacilli). Importantly, we discovered novel producers of zeaxanthin and lutein, and a high diversity of both carotenoids and producing microbial strains, which are promising and highly selective biotechnological sources of high-value carotenoids. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    International Nuclear Information System (INIS)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan; Durmus, Naside Gozde

    2011-01-01

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  11. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    Energy Technology Data Exchange (ETDEWEB)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan [Department of Medicine, Demirci Bio-Acoustic-MEMS in Medicine (BAMM) Laboratory, Center for Biomedical Engineering, Brigham and Women's Hospital, Harvard Medical School, Boston, MA (United States); Durmus, Naside Gozde, E-mail: udemirci@rics.bwh.harvard.edu [School of Engineering and Division of Biology and Medicine, Brown University, Providence, RI (United States)

    2011-09-15

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  12. A DVD-ROM based high-throughput cantilever sensing platform

    DEFF Research Database (Denmark)

    Bosco, Filippo

    and October 2011. The project was part of the Xsense research network, funded by the Strategic Danish Research Council, and supervised by Prof. Anja Boisen. The goal of the Xsense project is to design and fabricate a compact and cheap device for explosive sensing in air and liquid. Four different technologies...... of a high-throughput label-free sensor platform utilizing cantilever based sensors. These sensors have often been acclaimed to facilitate highly parallelized operation. Unfortunately, so far no concept has been presented which offers large data sets as well as easy liquid sample handling. We use optics...... and mechanics from a DVD player to handle liquid samples and to read-out cantilever deflection and resonant frequency. In a few minutes, several liquid samples can be analyzed in parallel, measuring over several hundreds of individual cantilevers. Three generations of systems have been developed and tested...

  13. Solid-phase cloning for high-throughput assembly of single and multiple DNA parts

    DEFF Research Database (Denmark)

    Lundqvist, Magnus; Edfors, Fredrik; Sivertsson, Åsa

    2015-01-01

    We describe solid-phase cloning (SPC) for high-throughput assembly of expression plasmids. Our method allows PCR products to be put directly into a liquid handler for capture and purification using paramagnetic streptavidin beads and conversion into constructs by subsequent cloning reactions. We ...

  14. A high-throughput method for GMO multi-detection using a microfluidic dynamic array

    NARCIS (Netherlands)

    Brod, F.C.A.; Dijk, van J.P.; Voorhuijzen, M.M.; Dinon, A.Z.; Guimarães, L.H.S.; Scholtens, I.M.J.; Arisi, A.C.M.; Kok, E.J.

    2014-01-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organisms (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the

  15. Defining the taxonomic domain of applicability for mammalian-based high-throughput screening assays

    Science.gov (United States)

    Cell-based high throughput screening (HTS) technologies are becoming mainstream in chemical safety evaluations. The US Environmental Protection Agency (EPA) Toxicity Forecaster (ToxCast™) and the multi-agency Tox21 Programs have been at the forefront in advancing this science, m...

  16. Molecular classification of fatty liver by high-throughput profiling of protein post-translational modifications.

    Science.gov (United States)

    Urasaki, Yasuyo; Fiscus, Ronald R; Le, Thuc T

    2016-04-01

    We describe an alternative approach to classifying fatty liver by profiling protein post-translational modifications (PTMs) with high-throughput capillary isoelectric focusing (cIEF) immunoassays. Four strains of mice were studied, with fatty livers induced by different causes, such as ageing, genetic mutation, acute drug usage, and high-fat diet. Nutrient-sensitive PTMs of a panel of 12 liver metabolic and signalling proteins were simultaneously evaluated with cIEF immunoassays, using nanograms of total cellular protein per assay. Changes to liver protein acetylation, phosphorylation, and O-N-acetylglucosamine glycosylation were quantified and compared between normal and diseased states. Fatty liver tissues could be distinguished from one another by distinctive protein PTM profiles. Fatty liver is currently classified by morphological assessment of lipid droplets, without identifying the underlying molecular causes. In contrast, high-throughput profiling of protein PTMs has the potential to provide molecular classification of fatty liver. Copyright © 2016 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.

  17. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Tanel Pärnamaa

    2017-05-01

    Full Text Available High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy.

  18. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning.

    Science.gov (United States)

    Pärnamaa, Tanel; Parts, Leopold

    2017-05-05

    High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy. Copyright © 2017 Parnamaa and Parts.
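
    To make the "network as a feature calculator" idea above concrete, the following is a minimal, hypothetical Python sketch (synthetic tabular data standing in for images, scikit-learn instead of the authors' 11-layer network): a small network is trained once, its last hidden layer is reused as a feature extractor, and a standard logistic-regression classifier is then trained on only a few labelled examples. All names and parameters are illustrative assumptions, and unlike the study the "few-shot" classes here are not genuinely unseen compartments.
    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Synthetic stand-in for single-cell images: each row is a flattened "image".
    X, y = make_classification(n_samples=2000, n_features=100, n_informative=30,
                               n_classes=4, n_clusters_per_class=1, random_state=0)
    X_base, X_new, y_base, y_new = train_test_split(X, y, test_size=0.3,
                                                    stratify=y, random_state=0)

    # Step 1: train a small network on the "base" localization classes.
    net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    net.fit(X_base, y_base)

    def hidden_features(model, X):
        """Forward-pass X through the hidden layers (ReLU) and return the
        activations of the last hidden layer, used here as learned features."""
        a = X
        for W, b in zip(model.coefs_[:-1], model.intercepts_[:-1]):
            a = np.maximum(a @ W + b, 0.0)
        return a

    # Step 2: treat the network as a feature calculator and train a standard
    # classifier on only a handful of labelled examples.
    few, rest = X_new[:80], X_new[80:]
    clf = LogisticRegression(max_iter=1000).fit(hidden_features(net, few), y_new[:80])
    print("accuracy on held-out cells: %.2f" % clf.score(hidden_features(net, rest), y_new[80:]))
    ```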

  19. A high-throughput and quantitative method to assess the mutagenic potential of translesion DNA synthesis

    Science.gov (United States)

    Taggart, David J.; Camerlengo, Terry L.; Harrison, Jason K.; Sherrer, Shanen M.; Kshetry, Ajay K.; Taylor, John-Stephen; Huang, Kun; Suo, Zucai

    2013-01-01

    Cellular genomes are constantly damaged by endogenous and exogenous agents that covalently and structurally modify DNA to produce DNA lesions. Although most lesions are mended by various DNA repair pathways in vivo, a significant number of damage sites persist during genomic replication. Our understanding of the mutagenic outcomes derived from these unrepaired DNA lesions has been hindered by the low throughput of existing sequencing methods. Therefore, we have developed a cost-effective high-throughput short oligonucleotide sequencing assay that uses next-generation DNA sequencing technology for the assessment of the mutagenic profiles of translesion DNA synthesis catalyzed by any error-prone DNA polymerase. The vast amount of sequencing data produced were aligned and quantified by using our novel software. As an example, the high-throughput short oligonucleotide sequencing assay was used to analyze the types and frequencies of mutations upstream, downstream and at a site-specifically placed cis–syn thymidine–thymidine dimer generated individually by three lesion-bypass human Y-family DNA polymerases. PMID:23470999

  20. High-Throughput Screening Using Fourier-Transform Infrared Imaging

    Directory of Open Access Journals (Sweden)

    Erdem Sasmaz

    2015-06-01

    Full Text Available Efficient parallel screening of combinatorial libraries is one of the most challenging aspects of the high-throughput (HT) heterogeneous catalysis workflow. Today, a number of methods have been used in HT catalyst studies, including various optical, mass-spectrometry, and gas-chromatography techniques. Of these, rapid-scanning Fourier-transform infrared (FTIR) imaging is one of the fastest and most versatile screening techniques. Here, the new design of the 16-channel HT reactor is presented and test results for its accuracy and reproducibility are shown. The performance of the system was evaluated through the oxidation of CO over commercial Pd/Al2O3 and cobalt oxide nanoparticles synthesized with different reducer-reductant molar ratios, surfactant types, metal and surfactant concentrations, synthesis temperatures, and ramp rates.

  1. High Reliability Oscillators for Terahertz Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — To develop reliable THz sources with high power and high DC-RF efficiency, Virginia Diodes, Inc. will develop a thorough understanding of the complex interactions...

  2. High Throughput Line-of-Sight MIMO Systems for Next Generation Backhaul Applications

    Science.gov (United States)

    Song, Xiaohang; Cvetkovski, Darko; Hälsig, Tim; Rave, Wolfgang; Fettweis, Gerhard; Grass, Eckhard; Lankl, Berthold

    2017-09-01

    The evolution to ultra-dense next generation networks requires a massive increase in throughput and deployment flexibility. Therefore, novel wireless backhaul solutions that can support these demands are needed. In this work we present an approach for a millimeter wave line-of-sight MIMO backhaul design, targeting transmission rates in the order of 100 Gbit/s. We provide theoretical foundations for the concept showcasing its potential, which are confirmed through channel measurements. Furthermore, we provide insights into the system design with respect to antenna array setup, baseband processing, synchronization, and channel equalization. Implementation in a 60 GHz demonstrator setup proves the feasibility of the system concept for high throughput backhauling in next generation networks.

  3. SINA: accurate high-throughput multiple sequence alignment of ribosomal RNA genes.

    Science.gov (United States)

    Pruesse, Elmar; Peplies, Jörg; Glöckner, Frank Oliver

    2012-07-15

    In the analysis of homologous sequences, computation of multiple sequence alignments (MSAs) has become a bottleneck. This is especially troublesome for marker genes like the ribosomal RNA (rRNA) where already millions of sequences are publicly available and individual studies can easily produce hundreds of thousands of new sequences. Methods have been developed to cope with such numbers, but further improvements are needed to meet accuracy requirements. In this study, we present the SILVA Incremental Aligner (SINA) used to align the rRNA gene databases provided by the SILVA ribosomal RNA project. SINA uses a combination of k-mer searching and partial order alignment (POA) to maintain very high alignment accuracy while satisfying high throughput performance demands. SINA was evaluated in comparison with the commonly used high throughput MSA programs PyNAST and mothur. The three BRAliBase III benchmark MSAs could be reproduced with 99.3%, 97.6% and 96.1% accuracy. A larger benchmark MSA comprising 38 772 sequences could be reproduced with 98.9% and 99.3% accuracy using reference MSAs comprising 1000 and 5000 sequences. SINA was able to achieve higher accuracy than PyNAST and mothur in all performed benchmarks. Alignment of up to 500 sequences using the latest SILVA SSU/LSU Ref datasets as reference MSA is offered at http://www.arb-silva.de/aligner. This page also links to Linux binaries, user manual and tutorial. SINA is made available under a personal use license.
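
    SINA's speed rests in part on k-mer searching to shortlist similar reference sequences before the more expensive partial order alignment. The sketch below is a toy, hypothetical illustration of that shortlisting step only (it is not SINA's implementation, and the sequences are made up): references are ranked by the number of k-mers they share with the query.
    ```python
    from collections import Counter

    def kmers(seq, k=8):
        """Return the multiset of overlapping k-mers in a sequence."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def best_references(query, references, k=8, top_n=3):
        """Rank reference sequences by the number of shared k-mers with the query.
        SINA-like aligners use such a ranking to pick references before the
        (more expensive) alignment step."""
        q = kmers(query, k)
        scores = []
        for name, ref in references.items():
            shared = sum((q & kmers(ref, k)).values())
            scores.append((shared, name))
        return sorted(scores, reverse=True)[:top_n]

    # Toy rRNA-like fragments (hypothetical sequences, for illustration only).
    refs = {
        "ref_A": "ACGTACGGTAGCTAGCTAGGCTAACGGTTACGATCGATCGGATC",
        "ref_B": "TTGACCGGTTAACCGGATCGATCGTTAGCATCGATTACGGCTAA",
        "ref_C": "ACGTACGGTAGCTAGCTTGGCTAACGGTTACGTTCGATCGGATC",
    }
    query = "ACGTACGGTAGCTAGCTAGGCTAACGGTTACG"
    print(best_references(query, refs, k=8))
    ```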

  4. Morphology control in polymer blend fibers—a high throughput computing approach

    Science.gov (United States)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent-based fabrication. We focus on a PS:PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for a specific morphology class. We finally demonstrate the potential for time-dependent processing conditions to obtain desired morphological features. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.

  5. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  6. Development on the High-throughput Vol-oxidizer for Decladding and Voloxidation of Spent Fuel Rod-cuts

    International Nuclear Information System (INIS)

    Kim, Young Hwang; Jung, Jae Hoo; Kim, Ki Ho; Park, Byung Buk; Lee, Hyo Jik; Kim, Sung Hyun; Park, Hee Sung; Lee, Jong Kwang; Kim, Ho Dong

    2009-12-01

    A high-throughput vol-oxidizer capable of handling several tens of kg HM per batch is being developed to supply U3O8 powder to an electrolytic reduction reactor in pyro-processing. In the first year (2007), to improve the oxidation and recovery rates, we analyzed mechanical and chemical methods and devised the main mechanism, combining a ball-drop method with a rotary-kiln design. The main devices for oxidation and recovery of rod-cuts were designed with the SolidWorks and COSMOS tools and manufactured after thermal/mechanical analysis. To verify these devices, simulated fuels (90% W + 10% SiO2), whose expansion ratio is similar to that of U3O8 (2.7), were manufactured and used to test the oxidation and recovery rates. In the second year (2008), using the constant ratio between rod-cut volume and the U3O8 expansion ratio (2.7), we derived a theoretical equation that estimates rod-cut volume from variations in weight and length. After considering various materials such as ceramics and Ni-Cr, the APM alloy, which withstands high temperature (1,200 °C) and vacuum (1 torr), was selected and a vol-oxidizer was designed. In the third year (2009), to manufacture the high-throughput vol-oxidizer, we analyzed it for remote operability and maintainability, and assessed the remote assembly and disassembly of the selected modules in terms of visibility, interference, approach, weight, and so on. We presented the final modular design, manufactured the high-throughput vol-oxidizer, and conducted blank, heating (over 500 °C) and hull separation tests (capacity: 50 kg HM/batch, hull length 50 mm) on it. These design technologies will be utilized in the development of a more efficient vol-oxidizer with higher

  7. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types.

    Directory of Open Access Journals (Sweden)

    Aaron T L Lun

    2018-05-01

    Full Text Available Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set.

  8. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types.

    Science.gov (United States)

    Lun, Aaron T L; Pagès, Hervé; Smith, Mike L

    2018-05-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set.

  9. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types

    Science.gov (United States)

    Pagès, Hervé

    2018-01-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set. PMID:29723188
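
    beachmat itself is a C++/R interface, but the memory/speed trade-off the authors describe can be illustrated with a small, hypothetical Python sketch comparing dense and sparse in-memory representations of an scRNA-seq-like count matrix (the matrix size, density and access pattern are made-up assumptions, not taken from the paper):
    ```python
    import time

    import numpy as np
    import scipy.sparse as sp

    rng = np.random.default_rng(0)

    # Simulate an scRNA-seq-like count matrix: 10,000 genes x 1,000 cells, ~95% zeros.
    genes, cells, density = 10_000, 1_000, 0.05
    sparse_counts = sp.random(genes, cells, density=density, format="csc",
                              random_state=0, data_rvs=lambda n: rng.poisson(3, n) + 1)
    dense_counts = sparse_counts.toarray()

    print("dense  MB:", dense_counts.nbytes / 1e6)
    print("sparse MB:", (sparse_counts.data.nbytes + sparse_counts.indices.nbytes
                         + sparse_counts.indptr.nbytes) / 1e6)

    # Per-cell (column) sums: a typical access pattern for downstream C++ code.
    t0 = time.perf_counter(); dense_counts.sum(axis=0);  t1 = time.perf_counter()
    sparse_counts.sum(axis=0);                           t2 = time.perf_counter()
    print("dense  column-sum seconds: %.4f" % (t1 - t0))
    print("sparse column-sum seconds: %.4f" % (t2 - t1))
    ```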

  10. Generalized schemes for high throughput manipulation of the Desulfovibrio vulgaris Hildenborough genome

    Energy Technology Data Exchange (ETDEWEB)

    Chhabra, S.R.; Butland, G.; Elias, D.; Chandonia, J.-M.; Fok, V.; Juba, T.; Gorur, A.; Allen, S.; Leung, C.-M.; Keller, K.; Reveco, S.; Zane, G.; Semkiw, E.; Prathapam, R.; Gold, B.; Singer, M.; Ouellet, M.; Sazakal, E.; Jorgens, D.; Price, M.; Witkowska, E.; Beller, H.; Hazen, T.C.; Biggin, M.; Auer, M.; Wall, J.; Keasling, J.

    2011-07-15

    The ability to conduct advanced functional genomic studies of the thousands of sequenced bacteria has been hampered by the lack of available tools for making high-throughput chromosomal manipulations in a systematic manner that can be applied across diverse species. In this work, we highlight the use of synthetic biological tools to assemble custom suicide vectors with reusable and interchangeable DNA “parts” to facilitate chromosomal modification at designated loci. These constructs enable an array of downstream applications including gene replacement and creation of gene fusions with affinity purification or localization tags. We employed this approach to engineer chromosomal modifications in a bacterium that has previously proven difficult to manipulate genetically, Desulfovibrio vulgaris Hildenborough, to generate a library of over 700 strains. Furthermore, we demonstrate how these modifications can be used for examining metabolic pathways, protein-protein interactions, and protein localization. The ubiquity of suicide constructs in gene replacement throughout biology suggests that this approach can be applied to engineer a broad range of species for a diverse array of systems biological applications and is amenable to high-throughput implementation.

  11. Genecentric: a package to uncover graph-theoretic structure in high-throughput epistasis data.

    Science.gov (United States)

    Gallant, Andrew; Leiserson, Mark D M; Kachalov, Maxim; Cowen, Lenore J; Hescott, Benjamin J

    2013-01-18

    New technology has resulted in high-throughput screens for pairwise genetic interactions in yeast and other model organisms. For each pair in a collection of non-essential genes, an epistasis score is obtained, representing how much sicker (or healthier) the double-knockout organism will be compared to what would be expected from the sickness of the component single knockouts. Recent algorithmic work has identified graph-theoretic patterns in this data that can indicate functional modules, and even sets of genes that may occur in compensatory pathways, such as a BPM-type schema first introduced by Kelley and Ideker. However, to date, any algorithms for finding such patterns in the data were implemented internally, with no software being made publicly available. Genecentric is a new package that implements a parallelized version of the Leiserson et al. algorithm (J Comput Biol 18:1399-1409, 2011) for generating generalized BPMs from high-throughput genetic interaction data. Given a matrix of weighted epistasis values for a set of double knock-outs, Genecentric returns a list of generalized BPMs that may represent compensatory pathways. Genecentric also has an extension, GenecentricGO, to query FuncAssociate (Bioinformatics 25:3043-3044, 2009) to retrieve GO enrichment statistics on generated BPMs. Python is the only dependency, and our web site provides working examples and documentation. We find that Genecentric can be used to find coherent functional and perhaps compensatory gene sets from high throughput genetic interaction data. Genecentric is made freely available for download under the GPLv2 from http://bcb.cs.tufts.edu/genecentric.

  12. High throughput automated microbial bioreactor system used for clone selection and rapid scale-down process optimization.

    Science.gov (United States)

    Velez-Suberbie, M Lourdes; Betts, John P J; Walker, Kelly L; Robinson, Colin; Zoro, Barney; Keshavarz-Moore, Eli

    2018-01-01

    High throughput automated fermentation systems have become a useful tool in early bioprocess development. In this study, we investigated a 24 x 15 mL single-use microbioreactor system, ambr 15f, designed for microbial culture. We compared the fed-batch growth and production capabilities of this system for two Escherichia coli strains, BL21 (DE3) and MC4100, and two industrially relevant molecules, hGH and scFv. In addition, different carbon sources were tested using bolus, linear or exponential feeding strategies, showing the capacity of the ambr 15f system to handle automated feeding. We used power per unit volume (P/V) as a scale criterion to compare the ambr 15f with 1 L stirred bioreactors which were previously scaled-up to 20 L with a different biological system, thus showing a potential 1,300-fold scale comparability in terms of both growth and product yield. By exposing the cells grown in the ambr 15f system to a level of shear expected in an industrial centrifuge, we determined that the cells are as robust as those from a bench scale bioreactor. These results provide evidence that the ambr 15f system is an efficient high throughput microbial system that can be used for strain and molecule selection as well as rapid scale-up. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 34:58-68, 2018.

  13. Yield strength of molybdenum, tantalum and tungsten at high strain rates and very high temperatures

    International Nuclear Information System (INIS)

    Škoro, G.P.; Bennett, J.R.J.; Edgecock, T.R.; Booth, C.N.

    2012-01-01

    Highlights: ► New experimental data on the yield strength of molybdenum, tantalum and tungsten. ► High strain rate effects at record high temperatures (up to 2700 K). ► Test of the consistency of the Zerilli–Armstrong model at very high temperatures. - Abstract: Recently reported results of the high strain rate, high temperature measurements of the yield strength of tantalum and tungsten have been analyzed along with new experimental results on the yield strength of molybdenum. Thin wires are subjected to high stress by passing a short, fast, high current pulse through a thin wire; the amplitude of the current governs the stress and the repetition rate of the pulses determines the temperature of the wire. The highest temperatures reached in the experiments were 2100 °C (for molybdenum), 2250 °C (for tantalum) and 2450 °C (for tungsten). The strain rates in the tests were in the range from 500 to 1500 s⁻¹. The parameters for the constitutive equation developed by Zerilli and Armstrong have been determined from the experimental data and the results have been compared with the data obtained at lower temperatures. An exceptionally good fit is obtained for the deformation of tungsten.
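
    For reference, the Zerilli–Armstrong constitutive relation for bcc metals such as molybdenum, tantalum and tungsten is commonly written (in one common parameterization; the paper's exact coefficient naming may differ) as
    ```latex
    \sigma = C_0 + C_1 \exp\!\left(-C_3\,T + C_4\,T \ln\dot{\varepsilon}\right) + C_5\,\varepsilon^{\,n}
    ```
    where \(\sigma\) is the flow (yield) stress, \(T\) the absolute temperature, \(\dot{\varepsilon}\) the strain rate, \(\varepsilon\) the plastic strain, and \(C_0, C_1, C_3, C_4, C_5\) and \(n\) are the fitted material parameters referred to in the abstract.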

  14. DRABAL: novel method to mine large high-throughput screening assays using Bayesian active learning

    KAUST Repository

    Soufan, Othman; Ba Alawi, Wail; Afeef, Moataz A.; Essack, Magbubah; Kalnis, Panos; Bajic, Vladimir B.

    2016-01-01

    Mining high-throughput screening (HTS) assays is key for enhancing decisions in the area of drug repositioning and drug discovery. However, many challenges are encountered in the process of developing suitable and accurate methods

  15. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an

  16. Discovery of viruses and virus-like pathogens in pistachio using high-throughput sequencing

    Science.gov (United States)

    Pistachio (Pistacia vera L.) trees from the National Clonal Germplasm Repository (NCGR) and orchards in California were surveyed for viruses and virus-like agents by high-throughput sequencing (HTS). Analyses of 60 trees including clonal UCB-1 hybrid rootstock (P. atlantica × P. integerrima) identif...

  17. Real-time, high-throughput measurements of peptide-MHC-I dissociation using a scintillation proximity assay

    DEFF Research Database (Denmark)

    Harndahl, Mikkel; Rasmussen, Michael; Røder, Gustav Andreas

    2011-01-01

    Efficient presentation of peptide-MHC class I complexes to immune T cells depends upon stable peptide-MHC class I interactions. Theoretically, determining the rate of dissociation of a peptide-MHC class I complex is straightforward; in practical terms, however, generating the accurate and closely timed data needed to determine the rate of dissociation is not simple. Ideally, one should use a homogenous assay involving an inexhaustible and label-free assay principle. Here, we present a homogenous, high-throughput peptide-MHC class I dissociation assay which by and large fulfills these ideals and is well suited for high-throughput screening. To exemplify this, we screened a panel of 384 high-affinity peptides binding to the MHC class I molecule, HLA-A*02:01, and observed rates of dissociation that ranged from 0.1 h to 46 h depending on the peptide used.
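
    Dissociation in such an assay is typically treated as first-order decay of the signal, from which the off-rate and half-life follow. The sketch below uses synthetic, hypothetical readings (it is not the authors' analysis code), fits k_off with SciPy, and converts it to a half-life:
    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, s0, k_off):
        """First-order dissociation: signal(t) = s0 * exp(-k_off * t)."""
        return s0 * np.exp(-k_off * t)

    # Synthetic scintillation-proximity readings (hypothetical values, time in hours).
    t = np.array([0, 1, 2, 4, 8, 16, 24, 48], dtype=float)
    rng = np.random.default_rng(1)
    signal = decay(t, 10_000, 0.15) + rng.normal(0, 150, t.size)

    (s0_fit, k_fit), _ = curve_fit(decay, t, signal, p0=(signal[0], 0.1))
    half_life = np.log(2) / k_fit
    print(f"k_off = {k_fit:.3f} per hour, half-life = {half_life:.1f} h")
    ```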

  18. Meeting Report: High-Throughput Technologies for In Vivo Imaging Agents

    Directory of Open Access Journals (Sweden)

    Robert J. Gillies

    2005-04-01

    Full Text Available Combinatorial chemistry and high-throughput screening have become standard tools for discovering new drug candidates with suitable pharmacological properties. Now, those same technologies are starting to be applied to the problem of discovering novel in vivo imaging agents. Important differences in the biological and pharmacological properties needed for imaging agents, compared to those for a therapeutic agent, require new screening methods that emphasize those characteristics, such as optimized residence time and tissue specificity, that make for a good imaging agent candidate.

  19. High-throughput screening assay of hepatitis C virus helicase inhibitors using fluorescence-quenching phenomenon

    International Nuclear Information System (INIS)

    Tani, Hidenori; Akimitsu, Nobuyoshi; Fujita, Osamu; Matsuda, Yasuyoshi; Miyata, Ryo; Tsuneda, Satoshi; Igarashi, Masayuki; Sekiguchi, Yuji; Noda, Naohiro

    2009-01-01

    We have developed a novel high-throughput screening assay of hepatitis C virus (HCV) nonstructural protein 3 (NS3) helicase inhibitors using the fluorescence-quenching phenomenon via photoinduced electron transfer between fluorescent dyes and guanine bases. We prepared double-stranded DNA (dsDNA) with a 5'-fluorescent-dye (BODIPY FL)-labeled strand hybridized with a complementary strand, the 3'-end of which has guanine bases. When dsDNA is unwound by helicase, the dye emits fluorescence owing to its release from the guanine bases. Our results demonstrate that this assay is suitable for quantitative assay of HCV NS3 helicase activity and useful for high-throughput screening for inhibitors. Furthermore, we applied this assay to the screening for NS3 helicase inhibitors from cell extracts of microorganisms, and found several cell extracts containing potential inhibitors.

  20. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    BACKGROUND: Unbiased flow cytometry-based methods have become the technique of choice in many laboratories for high-throughput, accurate assessments of malaria parasites in bioassays. A method to quantify live parasites based on mitotracker red CMXRos was recently described, but consistent distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique for enumerating live parasites in bioassays was developed, based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45. The technique was applied to estimate the specific growth inhibition index (SGI) in the antibody-dependent cellular inhibition (ADCI) assay and compared to parasite quantification by microscopy and mitotracker red staining. The Bland-Altman analysis...

  1. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    Directory of Open Access Journals (Sweden)

    Prasanth VP

    2006-08-01

    Full Text Available Abstract Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is modular in design and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles; thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is presently being used for the capture of high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping

  2. Evaluation of Simple and Inexpensive High-Throughput Methods for Phytic Acid Determination

    DEFF Research Database (Denmark)

    Raboy, Victor; Johnson, Amy; Bilyeu, Kristin

    2017-01-01

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal Biochem...... and legume flours regardless of endogenous phytic acid levels or matrix constituents....

  3. High-throughput phenotyping of plant resistance to aphids by automated video tracking.

    Science.gov (United States)

    Kloth, Karen J; Ten Broeke, Cindy Jm; Thoen, Manus Pm; Hanhart-van den Brink, Marianne; Wiegers, Gerrie L; Krips, Olga E; Noldus, Lucas Pjj; Dicke, Marcel; Jongsma, Maarten A

    2015-01-01

    Piercing-sucking insects are major vectors of plant viruses causing significant yield losses in crops. Functional genomics of plant resistance to these insects would greatly benefit from the availability of high-throughput, quantitative phenotyping methods. We have developed an automated video tracking platform that quantifies aphid feeding behaviour on leaf discs to assess the level of plant resistance. Through the analysis of aphid movement, the start and duration of plant penetrations by aphids were estimated. As a case study, video tracking confirmed the near-complete resistance of lettuce cultivar 'Corbana' against Nasonovia ribisnigri (Mosely), biotype Nr:0, and revealed quantitative resistance in Arabidopsis accession Co-2 against Myzus persicae (Sulzer). The video tracking platform was benchmarked against Electrical Penetration Graph (EPG) recordings and aphid population development assays. The use of leaf discs instead of intact plants reduced the intensity of the resistance effect in video tracking, but sufficiently replicated experiments resulted in similar conclusions as EPG recordings and aphid population assays. One video tracking platform could screen 100 samples in parallel. Automated video tracking can be used to screen large plant populations for resistance to aphids and other piercing-sucking insects.
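
    As a rough, hypothetical illustration of how penetration events can be inferred from movement alone (this is not the tracking algorithm used in the study), the sketch below flags sustained low-speed intervals in an aphid track as putative probing bouts; the frame rate, threshold and minimum duration are made-up parameters:
    ```python
    import numpy as np

    def probing_bouts(x, y, fps=25, speed_thresh=0.1, min_duration_s=30):
        """Flag intervals where an aphid's centroid speed stays below a threshold
        (mm/s) for at least `min_duration_s`; such sustained immobility is used
        here as a crude proxy for plant penetration (probing)."""
        speed = np.hypot(np.diff(x), np.diff(y)) * fps            # mm per second
        still = speed < speed_thresh
        bouts, start = [], None
        for i, s in enumerate(np.append(still, False)):           # sentinel closes the last bout
            if s and start is None:
                start = i
            elif not s and start is not None:
                if (i - start) / fps >= min_duration_s:
                    bouts.append((start / fps, (i - start) / fps))  # (start s, duration s)
                start = None
        return bouts

    # Hypothetical 5-minute track: mostly walking, with one long still period.
    rng = np.random.default_rng(0)
    n = 25 * 300
    x = np.cumsum(rng.normal(0, 0.02, n)); y = np.cumsum(rng.normal(0, 0.02, n))
    x[2000:4000] = x[2000]; y[2000:4000] = y[2000]                # simulated probing bout
    print(probing_bouts(x, y))
    ```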

  4. In-Field High-Throughput Phenotyping of Cotton Plant Height Using LiDAR

    OpenAIRE

    Shangpeng Sun; Changying Li; Andrew H. Paterson

    2017-01-01

    A LiDAR-based high-throughput phenotyping (HTP) system was developed for cotton plant phenotyping in the field. The HTP system consists of a 2D LiDAR and an RTK-GPS mounted on a high clearance tractor. The LiDAR scanned three rows of cotton plots simultaneously from the top and the RTK-GPS was used to provide the spatial coordinates of the point cloud during data collection. Configuration parameters of the system were optimized to ensure the best data quality. A height profile for each plot w...
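
    A common way to turn such LiDAR returns into a per-plot height estimate is to subtract a ground level (a low percentile of z) from the canopy level (a high percentile of z). The following is a minimal, hypothetical sketch with synthetic points and made-up percentile choices, not the authors' processing pipeline:
    ```python
    import numpy as np

    def plot_heights(points, plot_ids, canopy_pct=95, ground_pct=5):
        """Estimate plant height per plot from LiDAR returns.
        points   : (N, 3) array of x, y, z coordinates (z up, metres)
        plot_ids : length-N array assigning each return to a plot
        Height = high percentile of z (canopy) minus low percentile of z (ground)."""
        heights = {}
        for pid in np.unique(plot_ids):
            z = points[plot_ids == pid, 2]
            heights[pid] = np.percentile(z, canopy_pct) - np.percentile(z, ground_pct)
        return heights

    # Synthetic example: two plots with different canopy heights.
    rng = np.random.default_rng(0)
    ground = rng.normal(0.0, 0.02, (1000, 1))
    canopy1 = rng.normal(0.9, 0.05, (1000, 1))    # ~0.9 m plants
    canopy2 = rng.normal(1.2, 0.05, (1000, 1))    # ~1.2 m plants
    xy = rng.uniform(0, 1, (4000, 2))
    z = np.vstack([ground, canopy1, ground, canopy2])
    points = np.hstack([xy, z])
    plot_ids = np.repeat([1, 1, 2, 2], 1000)
    print(plot_heights(points, plot_ids))
    ```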

  5. High-throughput and low-latency network communication with NetIO

    CERN Document Server

    AUTHOR|(CDS)2088631; The ATLAS collaboration

    2017-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They allow building distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath...

  6. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

    Full Text Available Research laboratories working in organic synthesis and natural-product extraction generate, every day, large numbers of compounds that may possess biological activity. In vitro assays are therefore needed to provide reliable information before further evaluation in in vivo systems, and in recent years the use of high-throughput screening assays has intensified. Such assays must be optimized and validated to give accurate and precise, i.e. reliable, results. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as the detection system. It focuses in particular on method selection, optimization to determine the best experimental conditions, validation, application of the optimized and validated method to real samples, and finally maintenance and possible transfer to a new laboratory.
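
    The core calculation behind such UV-Visible inhibition assays is a percent-inhibition value referenced to uninhibited and blank controls, often followed by a dose-response (IC50) fit. The sketch below uses made-up absorbance values and a standard four-parameter logistic model; it is illustrative only, not a method from the review.
    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def percent_inhibition(a_sample, a_control, a_blank):
        """Percent inhibition from UV-Vis absorbances:
        100 * (1 - (A_sample - A_blank) / (A_control - A_blank))."""
        return 100.0 * (1.0 - (a_sample - a_blank) / (a_control - a_blank))

    def four_pl(conc, bottom, top, ic50, hill):
        """Four-parameter logistic dose-response curve (inhibition rises with dose)."""
        return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

    # Hypothetical dose-response: product absorbance at 8 inhibitor doses (µM).
    conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])
    a_control, a_blank = 1.20, 0.05
    a_sample = np.array([1.18, 1.15, 1.05, 0.85, 0.55, 0.30, 0.15, 0.09])
    inhib = percent_inhibition(a_sample, a_control, a_blank)

    params, _ = curve_fit(four_pl, conc, inhib, p0=(0, 100, 1.0, 1.0))
    print("IC50 ≈ %.2f µM" % params[2])
    ```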

  7. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  8. Complexity optimization and high-throughput low-latency hardware implementation of a multi-electrode spike-sorting algorithm.

    Science.gov (United States)

    Dragas, Jelena; Jackel, David; Hierlemann, Andreas; Franke, Felix

    2015-03-01

    Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction.

  9. Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck

    Science.gov (United States)

    Jones, Harry W.

    2015-01-01

    NASA emphasizes crew safety and system reliability but several unfortunate failures have occurred. The Apollo 1 fire was mistakenly unanticipated. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management underestimated the high risk of shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to deliberately dismantling traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.

  10. Sources of PCR-induced distortions in high-throughput sequencing data sets

    Science.gov (United States)

    Kebschull, Justus M.; Zador, Anthony M.

    2015-01-01

    PCR permits the exponential and sequence-specific amplification of DNA, even from minute starting quantities. PCR is a fundamental step in preparing DNA samples for high-throughput sequencing. However, there are errors associated with PCR-mediated amplification. Here we examine the effects of four important sources of error—bias, stochasticity, template switches and polymerase errors—on sequence representation in low-input next-generation sequencing libraries. We designed a pool of diverse PCR amplicons with a defined structure, and then used Illumina sequencing to search for signatures of each process. We further developed quantitative models for each process, and compared predictions of these models to our experimental data. We find that PCR stochasticity is the major force skewing sequence representation after amplification of a pool of unique DNA amplicons. Polymerase errors become very common in later cycles of PCR but have little impact on the overall sequence distribution as they are confined to small copy numbers. PCR template switches are rare and confined to low copy numbers. Our results provide a theoretical basis for removing distortions from high-throughput sequencing data. In addition, our findings on PCR stochasticity will have particular relevance to quantification of results from single cell sequencing, in which sequences are represented by only one or a few molecules. PMID:26187991
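
    The stochasticity the authors identify can be reproduced with a very small branching-process model of PCR, in which each molecule is copied with a probability equal to the per-cycle efficiency. The sketch below (made-up efficiency and cycle count) shows how a pool of amplicons that all start from a single molecule ends up unevenly represented:
    ```python
    import numpy as np

    def amplify(start_copies, cycles=20, efficiency=0.8, seed=0):
        """Galton-Watson-style PCR model: in each cycle every molecule is copied
        with probability `efficiency`, so final counts are stochastic."""
        rng = np.random.default_rng(seed)
        copies = np.array(start_copies, dtype=np.int64)
        for _ in range(cycles):
            copies = copies + rng.binomial(copies, efficiency)
        return copies

    # 1,000 unique amplicons, each starting from a single template molecule.
    final = amplify(np.ones(1000, dtype=int), cycles=20, efficiency=0.8)
    frac = final / final.sum()
    print("fold range of final representation: %.1f" % (frac.max() / frac.min()))
    print("coefficient of variation: %.2f" % (final.std() / final.mean()))
    ```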

  11. High-throughput genotyping of single nucleotide polymorphisms with rolling circle amplification

    Directory of Open Access Journals (Sweden)

    Sun Zhenyu

    2001-08-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) are the foundation of powerful complex trait and pharmacogenomic analyses. The availability of large SNP databases, however, has emphasized a need for inexpensive SNP genotyping methods of commensurate simplicity, robustness, and scalability. We describe a solution-based, microtiter plate method for SNP genotyping of human genomic DNA. The method is based upon allele discrimination by ligation of open circle probes followed by rolling circle amplification of the signal using fluorescent primers. Only the probe with a 3' base complementary to the SNP is circularized by ligation. Results SNP scoring by ligation was optimized to a 100,000-fold discrimination against probe mismatched to the SNP. The assay was used to genotype 10 SNPs from a set of 192 genomic DNA samples in a high-throughput format. Assaying directly from genomic DNA eliminates the need to preamplify the target as done for many other genotyping methods. The sensitivity of the assay was demonstrated by genotyping from 1 ng of genomic DNA. We demonstrate that the assay can detect a single molecule of the circularized probe. Conclusions Compatibility with homogeneous formats and the ability to assay small amounts of genomic DNA meet the exacting requirements of automated, high-throughput SNP scoring.

  12. The Protein Maker: an automated system for high-throughput parallel purification

    International Nuclear Information System (INIS)

    Smith, Eric R.; Begley, Darren W.; Anderson, Vanessa; Raymond, Amy C.; Haffner, Taryn E.; Robinson, John I.; Edwards, Thomas E.; Duncan, Natalie; Gerdts, Cory J.; Mixon, Mark B.; Nollert, Peter; Staker, Bart L.; Stewart, Lance J.

    2011-01-01

    The Protein Maker instrument addresses a critical bottleneck in structural genomics by allowing automated purification and buffer testing of multiple protein targets in parallel with a single instrument. Here, the use of this instrument to (i) purify multiple influenza-virus proteins in parallel for crystallization trials and (ii) identify optimal lysis-buffer conditions prior to large-scale protein purification is described. The Protein Maker is an automated purification system developed by Emerald BioSystems for high-throughput parallel purification of proteins and antibodies. This instrument allows multiple load, wash and elution buffers to be used in parallel along independent lines for up to 24 individual samples. To demonstrate its utility, its use in the purification of five recombinant PB2 C-terminal domains from various subtypes of the influenza A virus is described. Three of these constructs crystallized and one diffracted X-rays to sufficient resolution for structure determination and deposition in the Protein Data Bank. Methods for screening lysis buffers for a cytochrome P450 from a pathogenic fungus prior to upscaling expression and purification are also described. The Protein Maker has become a valuable asset within the Seattle Structural Genomics Center for Infectious Disease (SSGCID) and hence is a potentially valuable tool for a variety of high-throughput protein-purification applications

  13. HTP-OligoDesigner: An Online Primer Design Tool for High-Throughput Gene Cloning and Site-Directed Mutagenesis.

    Science.gov (United States)

    Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor

    2016-01-01

    Following burgeoning genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain a large number of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-prone step, particularly when working with hundreds of targets, automation of the primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis, as well as a Tm calculator for quick queries.
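
    For quick queries like those a Tm calculator answers, a very simple estimate for short oligos is the Wallace rule, Tm = 2·(A+T) + 4·(G+C); production tools generally use nearest-neighbour thermodynamics instead. The sketch below is a hypothetical illustration, not HTP-OligoDesigner's actual method, and the primer sequences are made up.
    ```python
    def wallace_tm(primer):
        """Wallace-rule melting temperature estimate (degrees C) for short oligos:
        Tm = 2*(A+T) + 4*(G+C)."""
        p = primer.upper()
        at = p.count("A") + p.count("T")
        gc = p.count("G") + p.count("C")
        return 2 * at + 4 * gc

    def gc_content(primer):
        """GC content of the primer as a percentage."""
        p = primer.upper()
        return 100.0 * (p.count("G") + p.count("C")) / len(p)

    for primer in ["ATGGCTAGCAAGGAGAAG", "TTACTTGTCGTCGTCATC"]:
        print(primer, "Tm ≈ %d C, GC = %.0f%%" % (wallace_tm(primer), gc_content(primer)))
    ```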

  14. High Throughput System for Plant Height and Hyperspectral Measurement

    Science.gov (United States)

    Zhao, H.; Xu, L.; Jiang, H.; Shi, S.; Chen, D.

    2018-04-01

    Hyperspectral and three-dimensional measurement can obtain the intrinsic physicochemical properties and external geometrical characteristics of objects, respectively. Currently, a variety of sensors are integrated into a system to collect spectral and morphological information in agriculture. However, previous experiments were usually performed with several commercial devices on a single platform. Inadequate registration and synchronization among instruments often resulted in mismatches between the spectral and 3D information of the same target. In addition, a narrow field of view (FOV) extends working hours in the field. Therefore, we propose a high throughput prototype that combines stereo vision and grating dispersion to simultaneously acquire hyperspectral and 3D information.

  15. HIGH THROUGHPUT SYSTEM FOR PLANT HEIGHT AND HYPERSPECTRAL MEASUREMENT

    Directory of Open Access Journals (Sweden)

    H. Zhao

    2018-04-01

    Full Text Available Hyperspectral and three-dimensional measurement can obtain the intrinsic physicochemical properties and external geometrical characteristics of objects, respectively. Currently, a variety of sensors are integrated into a system to collect spectral and morphological information in agriculture. However, previous experiments were usually performed with several commercial devices on a single platform. Inadequate registration and synchronization among instruments often resulted in mismatches between the spectral and 3D information of the same target. In addition, a narrow field of view (FOV) extends working hours in the field. Therefore, we propose a high throughput prototype that combines stereo vision and grating dispersion to simultaneously acquire hyperspectral and 3D information.

  16. Developing a novel fiber optic fluorescence device for multiplexed high-throughput cytotoxic screening.

    Science.gov (United States)

    Lee, Dennis; Barnes, Stephen

    2010-01-01

    The need for new pharmacological agents is unending. Yet the drug discovery process has changed substantially over the past decade and continues to evolve in response to new technologies. There is presently a high demand to reduce discovery time by improving specific lab disciplines and developing new technology platforms in the area of cell-based assay screening. Here we present the developmental concept and early stage testing of the Ab-Sniffer, a novel fiber optic fluorescence device for high-throughput cytotoxicity screening using an immobilized whole cell approach. The fused silica fibers are chemically functionalized with biotin to provide interaction with fluorescently labeled, streptavidin functionalized alginate-chitosan microspheres. The microspheres are also functionalized with Concanavalin A to facilitate binding to living cells. By using lymphoma cells and rituximab in an adaptation of a well-known cytotoxicity protocol we demonstrate the utility of the Ab-Sniffer for functional screening of potential drug compounds rather than indirect, non-functional screening via binding assay. The platform can be extended to any assay capable of being tied to a fluorescence response including multiple target cells in each well of a multi-well plate for high-throughput screening.

  17. A high-throughput fluorescence resonance energy transfer (FRET)-based endothelial cell apoptosis assay and its application for screening vascular disrupting agents

    International Nuclear Information System (INIS)

    Zhu, Xiaoming; Fu, Afu; Luo, Kathy Qian

    2012-01-01

    Highlights: ► An endothelial cell apoptosis assay using a FRET-based biosensor was developed. ► The fluorescence of the cells changed from green to blue during apoptosis. ► The method was developed into a high-throughput assay in 96-well plates. ► The assay was applied to screen vascular disrupting agents. -- Abstract: In this study, we developed a high-throughput endothelial cell apoptosis assay using a fluorescence resonance energy transfer (FRET)-based biosensor. After exposure to the apoptotic inducer UV irradiation or anticancer drugs such as paclitaxel, the fluorescence of the cells changed from green to blue. We developed this method into a high-throughput assay in 96-well plates by measuring the emission ratio of yellow fluorescent protein (YFP) to cyan fluorescent protein (CFP) to monitor the activation of a key protease, caspase-3, during apoptosis. The Z′ factor for this assay was above 0.5, indicating that it is suitable for high-throughput analysis. Finally, we applied this functional high-throughput assay to screen for vascular disrupting agents (VDAs), which can induce endothelial cell apoptosis, from our in-house compound library, and dioscin was identified as a hit. Because this assay allows real-time and sensitive detection of cell apoptosis, it will be a useful tool for monitoring endothelial cell apoptosis in living cells and for identifying new VDA candidates via high-throughput screening.
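
    The Z′ factor cited above has a standard definition (Zhang et al.). As a minimal illustration, not the authors' code, the sketch below computes it from positive- and negative-control wells using hypothetical YFP/CFP emission ratios rather than data from the paper.

```python
# Z' factor from plate controls; all ratio values are made up.
import numpy as np

def z_prime(pos_controls, neg_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    pos = np.asarray(pos_controls, dtype=float)
    neg = np.asarray(neg_controls, dtype=float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Hypothetical YFP/CFP ratios: apoptotic (caspase-3 active) vs. untreated wells
apoptotic = [1.10, 1.20, 1.15, 1.05, 1.18, 1.12]
untreated = [3.00, 2.90, 3.10, 2.95, 3.05, 2.98]
print("Z' = %.2f" % z_prime(apoptotic, untreated))   # > 0.5 indicates a robust assay
```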

  18. “httk”: EPA’s Tool for High Throughput Toxicokinetics (CompTox CoP)

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concentr...
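
    For illustration only, and not the API of the R "httk" package itself, the sketch below shows the kind of calculation such toxicokinetic tools automate: a generic one-compartment model predicting a plasma concentration time course from dose, volume of distribution, and clearance. All parameter values are hypothetical.

```python
# Generic one-compartment toxicokinetic model (IV bolus), illustrative only.
import numpy as np

def one_compartment_conc(t_h, dose_mg_per_kg, vd_l_per_kg, cl_l_per_h_per_kg):
    """C(t) = (Dose / Vd) * exp(-k * t), with elimination rate k = CL / Vd."""
    k_elim = cl_l_per_h_per_kg / vd_l_per_kg       # 1/h
    c0 = dose_mg_per_kg / vd_l_per_kg              # mg/L at t = 0
    return c0 * np.exp(-k_elim * np.asarray(t_h, dtype=float))

hours = np.arange(0, 25, 6)
print(one_compartment_conc(hours, dose_mg_per_kg=1.0,
                           vd_l_per_kg=5.0, cl_l_per_h_per_kg=0.5))
```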

  19. Versatile High-Throughput Fluorescence Assay for Monitoring Cas9 Activity.

    Science.gov (United States)

    Seamon, Kyle J; Light, Yooli K; Saada, Edwin A; Schoeniger, Joseph S; Harmon, Brooke

    2018-06-05

    The RNA-guided DNA nuclease Cas9 is now widely used for the targeted modification of the genomes of human cells and various organisms. Despite the extensive use of Clustered Regularly Interspaced Palindromic Repeats (CRISPR) systems for genome engineering and the rapid discovery and engineering of new CRISPR-associated nucleases, there are no high-throughput assays for measuring enzymatic activity. The current laboratory and future therapeutic uses of CRISPR technology carry a significant risk of accidental exposure or clinical off-target effects, underscoring the need for therapeutically effective inhibitors of Cas9. Here, we develop a fluorescence assay for monitoring Cas9 nuclease activity and demonstrate its utility with S. pyogenes (Spy), S. aureus (Sau), and C. jejuni (Cje) Cas9. The assay was validated by quantitatively profiling the species specificity of published anti-CRISPR (Acr) proteins, confirming the reported inhibition of Spy Cas9 by AcrIIA4 and of Cje Cas9 by AcrIIC1, and the lack of inhibition of Sau Cas9 by either anti-CRISPR. To identify drug-like inhibitors, we performed a screen of 189,606 small molecules for inhibition of Spy Cas9. Of 437 hits (0.2% hit rate), six were confirmed as Cas9 inhibitors in a direct gel electrophoresis secondary assay. The high-throughput nature of this assay makes it broadly applicable for the discovery of additional Cas9 inhibitors or the characterization of Cas9 enzyme variants.
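
    As a simplified illustration of hit calling in a screen of this type (not the published assay's analysis pipeline), the sketch below rescales raw fluorescence to percent inhibition between uninhibited and no-enzyme control wells and applies a threshold; the signal values and the 30% cutoff are assumptions.

```python
# Hypothetical hit calling for a fluorescence-based Cas9 inhibition screen.
# ctrl_0 anchors 0% inhibition (uninhibited Cas9); ctrl_100 anchors 100%
# inhibition (no active Cas9). All numbers are made up.
import numpy as np

def percent_inhibition(signal, ctrl_0, ctrl_100):
    """Linearly rescale raw signal so ctrl_0 -> 0% and ctrl_100 -> 100% inhibition."""
    return 100.0 * (signal - ctrl_0) / (ctrl_100 - ctrl_0)

ctrl_0_mean = 950.0                       # mean signal of uninhibited control wells
ctrl_100_mean = 120.0                     # mean signal of no-enzyme control wells
compound_signals = np.array([930.0, 520.0, 150.0, 880.0])

inhib = percent_inhibition(compound_signals, ctrl_0_mean, ctrl_100_mean)
hits = inhib >= 30.0                      # arbitrary primary-screen threshold
print("percent inhibition:", np.round(inhib, 1))
print("hit rate: %.1f%%" % (100.0 * hits.mean()))
```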

  20. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina Lundgaard; Login, Frédéric H.; Jensen, Helene Halkjær

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion, and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacteria...
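
    A minimal sketch of that kind of spot detection, assuming a single-channel fluorescence image and scikit-image's Laplacian-of-Gaussian blob detector; the file name and detection parameters are placeholders to tune per data set, and this is not the published pipeline.

```python
# Hypothetical spot-detection sketch: count and size bright fluorescent spots
# (e.g., intracellular bacterial colonies) in one image with scikit-image.
import numpy as np
from skimage import io, img_as_float
from skimage.feature import blob_log

image = img_as_float(io.imread("field_of_view.tif", as_gray=True))  # placeholder file
blobs = blob_log(image, min_sigma=2, max_sigma=15, num_sigma=10, threshold=0.05)

radii_px = blobs[:, 2] * np.sqrt(2)       # LoG sigma -> approximate 2D spot radius
print("spots detected:", len(blobs))
if len(blobs):
    print("mean spot radius (px): %.1f" % radii_px.mean())
```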