WorldWideScience

Sample records for high throughput crystallography

  1. High-throughput methods for electron crystallography.

    Science.gov (United States)

    Stokes, David L; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas

    2013-01-01

    Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing a native lipid environment for these proteins. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, electron microscopy can be used to collect images and diffraction patterns, and the corresponding data can be combined to produce a three-dimensional reconstruction, which under favorable conditions can extend to atomic resolution. As in X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on titration of cyclodextrin as a chelating agent for detergent; a specialized pipetting robot has been designed not only to add cyclodextrin in a systematic way, but also to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described.

  2. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  3. Fluorescent Approaches to High Throughput Crystallography

    Science.gov (United States)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    We have shown that by covalently modifying a subpopulation (less than or equal to 1%) of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high-throughput crystallizations. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white-light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, as only the protein or protein-containing structures show up. Fluorescence intensity is a faster search parameter, whether assessed visually or by automated methods, than looking for crystalline features. We are now testing the use of high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider
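
    As a rough illustration of the automated-analysis idea described above (and not the authors' software), the sketch below flags bright, contiguous regions in a fluorescence image of a crystallization drop by simple thresholding; the function name find_bright_regions and the threshold and size parameters are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def find_bright_regions(img, percentile=99.5, min_pixels=20):
    """Flag candidate crystals / ordered regions in a fluorescence drop image.

    img : 2-D NumPy array of fluorescence intensities.
    Returns (label, size_in_pixels, centroid) for each bright contiguous region.
    """
    threshold = np.percentile(img, percentile)      # adaptive cutoff per image
    mask = img > threshold                          # bright pixels only
    labels, n = ndimage.label(mask)                 # connected-component labelling
    hits = []
    for lab in range(1, n + 1):
        size = int((labels == lab).sum())
        if size >= min_pixels:                      # discard single-pixel noise
            cy, cx = ndimage.center_of_mass(labels == lab)
            hits.append((lab, size, (cy, cx)))
    return hits

# Synthetic example: a dim noisy background with one bright fluorescent object.
rng = np.random.default_rng(0)
drop = rng.normal(100, 5, (512, 512))
drop[200:230, 300:340] += 400
print(find_bright_regions(drop))
```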

  4. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    Science.gov (United States)

    Gaponov, Yu. A.; Igarashi, N.; Hiraki, M.; Sasajima, K.; Matsugaki, N.; Suzuki, M.; Kosuge, T.; Wakatsuki, S.

    2004-05-01

    An integrated controlling system and a unified database for high-throughput protein crystallography experiments have been developed. The main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored in a MySQL relational database (except the raw X-ray data, which are stored on a central data server). The database contains four mutually linked hierarchical trees describing protein crystals, data collection from protein crystals and experimental data processing. A database editor was designed and developed. The editor supports the basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: a direct search of necessary information in the database and an object-oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communication between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with a secure connection) and an indirect method (using an SSL connection with secure X11 support from any operating system with an X-terminal and SSH support). A part of the system has been implemented on a new MAD beamline, NW12, at the Photon Factory Advanced Ring for general user experiments.
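
    A minimal sketch of the kind of mutually linked tables such a unified database implies. The record above describes a MySQL database; sqlite3 is used here purely as a stand-in, and all table names, columns and example values (e.g. 'HKL2000') are illustrative assumptions, not the actual schema.

```python
import sqlite3

# Illustrative stand-in for the linked crystal -> data collection -> processing
# tables; raw images are kept only as path references to an external server.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE crystal (
    crystal_id   INTEGER PRIMARY KEY,
    protein_name TEXT,
    space_group  TEXT
);
CREATE TABLE data_collection (
    collection_id INTEGER PRIMARY KEY,
    crystal_id    INTEGER REFERENCES crystal(crystal_id),
    beamline      TEXT,
    raw_data_path TEXT          -- raw images live on the central data server
);
CREATE TABLE processing_run (
    run_id        INTEGER PRIMARY KEY,
    collection_id INTEGER REFERENCES data_collection(collection_id),
    software      TEXT,
    resolution_A  REAL
);
""")

con.execute("INSERT INTO crystal VALUES (1, 'lysozyme', 'P43212')")
con.execute("INSERT INTO data_collection VALUES (1, 1, 'NW12', '/data/nw12/lyso_001/')")
con.execute("INSERT INTO processing_run VALUES (1, 1, 'HKL2000', 1.7)")

# "Direct search": every processing run recorded for a given protein.
for row in con.execute("""
    SELECT c.protein_name, p.software, p.resolution_A
    FROM crystal c
    JOIN data_collection d ON d.crystal_id = c.crystal_id
    JOIN processing_run  p ON p.collection_id = d.collection_id
    WHERE c.protein_name = 'lysozyme'"""):
    print(row)
```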

  5. A Compact X-Ray System for Support of High Throughput Crystallography

    Science.gov (United States)

    Ciszak, Ewa; Gubarev, Mikhail; Gibson, Walter M.; Joy, Marshall K.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Standard X-ray systems for crystallography rely on massive generators coupled with optics that guide X-ray beams onto the crystal sample. Optics for single-crystal diffractometry include total-reflection mirrors, polycapillary optics or graded multilayer monochromators. The benefit of using a polycapillary optic is that it collects X-rays over the greatest solid angle and thus most efficiently utilizes the largest portion of the X-rays emitted from the source. Because the X-ray generator then only needs a small anode spot, its size and power requirements can be substantially reduced. We present the design and results from the first high-flux X-ray system for crystallography that combines a microfocus X-ray generator (40 μm FWHM spot size at a power of 45 W) and a collimating polycapillary optic. Diffraction data collected from small test crystals with cell dimensions up to 160 Å (lysozyme and thaumatin) are of high quality. For example, diffraction data collected from a lysozyme crystal at room temperature yielded R = 5.0% for data extending to 1.70 Å. We compare these results with measurements taken from standard crystallographic systems. Our current microfocus X-ray diffraction system is attractive for supporting crystal growth research in the standard crystallography laboratory as well as in remote, automated crystal growth laboratories. Its small volume, light weight, and low power requirements allow it to be installed in unique environments, e.g. on board the International Space Station.
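
    For reference, and assuming the quoted R = 5.0% refers to the usual merging residual used to judge data quality (the abstract does not specify which R statistic is meant), the standard definition is:

```latex
R_{\mathrm{merge}} \;=\;
  \frac{\sum_{hkl}\sum_{i}\bigl|I_i(hkl)-\langle I(hkl)\rangle\bigr|}
       {\sum_{hkl}\sum_{i} I_i(hkl)}
```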

  6. Integrated crystal mounting and alignment system for high-throughput biological crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William; Yegian, Derek; Earnest, Thomas N.; Jaklevic, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.

    2005-07-19

    A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron-generated X-ray diffraction analysis. The protein samples are maintained at liquid-nitrogen temperatures at all times: during shipment, before mounting, during mounting and alignment, during data acquisition, and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes, each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.

  7. Integrated crystal mounting and alignment system for high-throughput biological crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Nordmeyer, Robert A. (San Leandro, CA); Snell, Gyorgy P. (Richmond, CA); Cornell, Earl W. (Antioch, CA); Kolbe, William F. (Moraga, CA); Yegian, Derek T. (Oakland, CA); Earnest, Thomas N. (Berkeley, CA); Jaklevich, Joseph M. (Lafayette, CA); Cork, Carl W. (Walnut Creek, CA); Santarsiero, Bernard D. (Chicago, IL); Stevens, Raymond C. (La Jolla, CA)

    2007-09-25

    A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron-generated X-ray diffraction analysis. The protein samples are maintained at liquid-nitrogen temperatures at all times: during shipment, before mounting, during mounting and alignment, during data acquisition, and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes, each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.

  8. High-throughput plasmid construction using homologous recombination in yeast: its mechanisms and application to protein production for X-ray crystallography.

    Science.gov (United States)

    Mizutani, Kimihiko

    2015-01-01

    Homologous recombination is a system for repairing the broken genomes of living organisms by connecting two DNA strands at their homologous sequences. Today, homologous recombination in yeast is used for plasmid construction as a substitute for traditional methods using restriction enzymes and ligases. This method has various advantages over the traditional method, including flexibility in the position of DNA insertion and ease of manipulation. Recently, the author of this review reported the construction of plasmids by homologous recombination in the methanol-utilizing yeast Pichia pastoris, which is known to be an excellent expression host for secretory proteins and membrane proteins. The method enabled high-throughput construction of expression systems of proteins using P. pastoris; the constructed expression systems were used to investigate the expression conditions of membrane proteins and to perform X-ray crystallography of secretory proteins. This review discusses the mechanisms and applications of homologous recombination, including the production of proteins for X-ray crystallography.
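
    A minimal sketch of the primer-design arithmetic behind this kind of recombination-based cloning (not the author's protocol): each PCR primer carries a 5' tail homologous to one end of the linearized vector, so the amplified insert ends up flanked by the two homology arms. The function names and the 30-nt arm / 20-nt annealing lengths are illustrative assumptions.

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = str.maketrans("ACGTacgt", "TGCAtgca")
    return seq.translate(comp)[::-1]

def recombination_primers(vector_up, vector_down, gene, arm=30, anneal=20):
    """Forward/reverse primers whose 5' tails overlap the linearized vector ends.

    vector_up / vector_down : sequence immediately up-/downstream of the cut site
    gene                    : ORF to be inserted (sense strand, 5'->3')
    The resulting PCR product is vector_up[-arm:] + gene + vector_down[:arm].
    """
    fwd = vector_up[-arm:] + gene[:anneal]
    rev = revcomp(vector_down[:arm]) + revcomp(gene[-anneal:])
    return fwd, rev

# Illustrative sequences only.
fwd, rev = recombination_primers("ACGT" * 20, "TTGCA" * 8, "ATGAAAGGTTCC" * 8)
print(fwd)
print(rev)
```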

  9. 2dx_automator: implementation of a semiautomatic high-throughput high-resolution cryo-electron crystallography pipeline.

    Science.gov (United States)

    Scherer, Sebastian; Kowal, Julia; Chami, Mohamed; Dandey, Venkata; Arheit, Marcel; Ringler, Philippe; Stahlberg, Henning

    2014-05-01

    The introduction of direct electron detectors (DEDs) to cryo-electron microscopy has tremendously increased the signal-to-noise ratio (SNR) and quality of the recorded images. We discuss the optimal use of DEDs for cryo-electron crystallography, introduce a new automatic image processing pipeline, and demonstrate the vast improvement in resolution achieved by using both together, especially for highly tilted samples. The new processing pipeline (now included in the software package 2dx) exploits the high SNR and frame readout frequency of DEDs to automatically correct for beam-induced sample movement, and reliably processes individual crystal images without human interaction as data are being acquired. A new graphical user interface (GUI) condenses all information required for quality assessment in one window, allowing the imaging conditions to be verified and adjusted during the data collection session. With this new pipeline, an automatically generated unit-cell projection map of each recorded 2D crystal is available less than 5 min after the image was recorded. The entire processing procedure yielded a three-dimensional reconstruction of the 2D-crystallized ion-channel membrane protein MloK1 with a much-improved resolution of 5 Å in-plane and 7 Å in the z-direction, within 2 days of data acquisition and simultaneous processing. The results obtained are superior to those delivered by the conventional photographic-film-based methodology applied to the same sample, and demonstrate the importance of drift correction.
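
    The beam-induced-movement correction mentioned above amounts to aligning successive detector frames before summing them. A generic sketch of whole-pixel frame alignment by FFT cross-correlation is shown below; this is not the 2dx code, and real pipelines add sub-pixel interpolation and dose weighting.

```python
import numpy as np

def align_frames(frames):
    """Shift-align a stack of detector frames to the first frame and sum them.

    frames : array of shape (n_frames, ny, nx).
    Whole-pixel shifts are estimated by FFT cross-correlation.
    """
    ref = np.fft.fft2(frames[0])
    total = frames[0].astype(float).copy()
    ny, nx = frames[0].shape
    for frame in frames[1:]:
        cc = np.fft.ifft2(ref * np.conj(np.fft.fft2(frame)))
        dy, dx = np.unravel_index(np.argmax(np.abs(cc)), cc.shape)
        dy = dy - ny if dy > ny // 2 else dy      # wrap shifts into +/- range
        dx = dx - nx if dx > nx // 2 else dx
        total += np.roll(frame, (dy, dx), axis=(0, 1))
    return total

# Tiny self-check: the second frame is the first one circularly shifted by (3, -2).
rng = np.random.default_rng(1)
base = rng.normal(size=(64, 64))
stack = np.stack([base, np.roll(base, (3, -2), axis=(0, 1))])
assert np.allclose(align_frames(stack), 2 * base)
```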

  10. Discovery of novel BRD4 inhibitors by high-throughput screening, crystallography, and cell-based assays.

    Science.gov (United States)

    Sun, Zhongya; Zhang, Hao; Chen, Zhifeng; Xie, Yiqian; Jiang, Hao; Chen, Limin; Ding, Hong; Zhang, Yuanyuan; Jiang, Hualiang; Zheng, Mingyue; Luo, Cheng

    2017-03-09

    As an epigenetic reader, BRD4 regulates the transcription of important downstream genes that are essential for the survival of tumor cells. Small-molecule inhibitors targeting the first bromodomain of BRD4 (BRD4-BD1) have shown promising potential in therapies for BRD4-related cancers. Through an AlphaScreen-based high-throughput screening assay, a novel small-molecule inhibitor, named DCBD-005, was identified, which inhibited the binding between BRD4-BD1 and acetylated lysines with an IC50 value of 0.81 ± 0.03 μM. The compound DCBD-005 effectively inhibited viability, caused cell-cycle arrest, and induced apoptosis in human leukemia MV4-11 cells. Moreover, the crystal structure of compound DCBD-005 in complex with BRD4-BD1 was determined at 1.72 Å resolution, which revealed the binding mechanism of the lead compound and also provided a solid basis for further structure-based optimization. These results indicate that the novel BRD4-BD1 inhibitor DCBD-005 is a promising candidate for development into a drug for the treatment of BRD4-related diseases.
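
    For context, an IC50 such as the quoted 0.81 ± 0.03 μM is typically obtained by fitting a four-parameter logistic (Hill) model to concentration-response data. The sketch below uses synthetic numbers, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic (Hill) model for % signal vs inhibitor concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Synthetic AlphaScreen-style data (concentrations in uM); NOT the paper's data.
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
signal = np.array([98, 95, 88, 70, 45, 22, 10, 5], dtype=float)

popt, pcov = curve_fit(four_pl, conc, signal, p0=[0.0, 100.0, 1.0, 1.0])
bottom, top, ic50, hill = popt
err = np.sqrt(np.diag(pcov))
print(f"IC50 = {ic50:.2f} +/- {err[2]:.2f} uM (Hill slope {hill:.2f})")
```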

  11. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  12. High throughput drug profiling

    OpenAIRE

    Entzeroth, Michael; Chapelain, Béatrice; Guilbert, Jacques; Hamon, Valérie

    2000-01-01

    High throughput screening has significantly contributed to advances in drug discovery. The great increase in the number of samples screened has been accompanied by increases in costs and in the data required for the investigated compounds. High throughput profiling addresses the issues of compound selectivity and specificity. It combines conventional screening with data mining technologies to give a full set of data, enabling development candidates to be more fully compared.

  13. A high-pressure MWPC detector for crystallography

    DEFF Research Database (Denmark)

    Ortuno-Prados, F.; Bazzano, A.; Berry, A.;

    1999-01-01

    The application of the Multi-Wire Proportional Counter (MWPC) as a potential detector for protein crystallography and other wide-angle diffraction experiments is presented. Electrostatic problems found with our large area MWPC when operated at high pressure are discussed. We suggest that a solution...

  14. Functional Sub-states by High-pressure Macromolecular Crystallography.

    Science.gov (United States)

    Dhaussy, Anne-Claire; Girard, Eric

    2015-01-01

    At the molecular level, high-pressure perturbation is of particular interest for biological studies as it allows the trapping of conformational substates. Moreover, within the context of the high-pressure adaptation of deep-sea organisms, it makes it possible to decipher the molecular determinants of piezophily. To provide an accurate description of the structural changes produced by pressure in a macromolecular system, developments have been made to adapt macromolecular crystallography to high-pressure studies. The present chapter is an overview of results obtained so far using high-pressure macromolecular techniques, from nucleic acids to virus capsids through monomeric as well as multimeric proteins.

  15. High-resolution protein structure determination by serial femtosecond crystallography.

    Science.gov (United States)

    Boutet, Sébastien; Lomb, Lukas; Williams, Garth J; Barends, Thomas R M; Aquila, Andrew; Doak, R Bruce; Weierstall, Uwe; DePonte, Daniel P; Steinbrener, Jan; Shoeman, Robert L; Messerschmidt, Marc; Barty, Anton; White, Thomas A; Kassemeyer, Stephan; Kirian, Richard A; Seibert, M Marvin; Montanez, Paul A; Kenney, Chris; Herbst, Ryan; Hart, Philip; Pines, Jack; Haller, Gunther; Gruner, Sol M; Philipp, Hugh T; Tate, Mark W; Hromalik, Marianne; Koerner, Lucas J; van Bakel, Niels; Morse, John; Ghonsalves, Wilfred; Arnlund, David; Bogan, Michael J; Caleman, Carl; Fromme, Raimund; Hampton, Christina Y; Hunter, Mark S; Johansson, Linda C; Katona, Gergely; Kupitz, Christopher; Liang, Mengning; Martin, Andrew V; Nass, Karol; Redecke, Lars; Stellato, Francesco; Timneanu, Nicusor; Wang, Dingjie; Zatsepin, Nadia A; Schafer, Donald; Defever, James; Neutze, Richard; Fromme, Petra; Spence, John C H; Chapman, Henry N; Schlichting, Ilme

    2012-07-20

    Structure determination of proteins and other macromolecules has historically required the growth of high-quality crystals sufficiently large to diffract x-rays efficiently while withstanding radiation damage. We applied serial femtosecond crystallography (SFX) using an x-ray free-electron laser (XFEL) to obtain high-resolution structural information from microcrystals (less than 1 micrometer by 1 micrometer by 3 micrometers) of the well-characterized model protein lysozyme. The agreement with synchrotron data demonstrates the immediate relevance of SFX for analyzing the structure of the large group of difficult-to-crystallize molecules.

  16. High-pressure crystallography of periodic and aperiodic crystals.

    Science.gov (United States)

    Hejny, Clivia; Minkov, Vasily S

    2015-03-01

    More than five decades have passed since the first single-crystal X-ray diffraction experiments at high pressure were performed. These studies were applied historically to geochemical processes occurring in the Earth and other planets, but high-pressure crystallography has spread across different fields of science including chemistry, physics, biology, materials science and pharmacy. With each passing year, high-pressure studies have become more precise and comprehensive because of the development of instrumentation and software, and the systems investigated have also become more complicated. Starting with crystals of simple minerals and inorganic compounds, the interests of researchers have shifted to complicated metal-organic frameworks, aperiodic crystals and quasicrystals, molecular crystals, and even proteins and viruses. Inspired by contributions to the microsymposium 'High-Pressure Crystallography of Periodic and Aperiodic Crystals' presented at the 23rd IUCr Congress and General Assembly, the authors have tried to summarize certain recent results of single-crystal studies of molecular and aperiodic structures under high pressure. While the selected contributions do not cover the whole spectrum of high-pressure research, they demonstrate the broad diversity of novel and fascinating results and may awaken the reader's interest in this topic.

  17. High-pressure crystallography of periodic and aperiodic crystals

    Directory of Open Access Journals (Sweden)

    Clivia Hejny

    2015-03-01

    Full Text Available More than five decades have passed since the first single-crystal X-ray diffraction experiments at high pressure were performed. These studies were applied historically to geochemical processes occurring in the Earth and other planets, but high-pressure crystallography has spread across different fields of science including chemistry, physics, biology, materials science and pharmacy. With each passing year, high-pressure studies have become more precise and comprehensive because of the development of instrumentation and software, and the systems investigated have also become more complicated. Starting with crystals of simple minerals and inorganic compounds, the interests of researchers have shifted to complicated metal–organic frameworks, aperiodic crystals and quasicrystals, molecular crystals, and even proteins and viruses. Inspired by contributions to the microsymposium `High-Pressure Crystallography of Periodic and Aperiodic Crystals' presented at the 23rd IUCr Congress and General Assembly, the authors have tried to summarize certain recent results of single-crystal studies of molecular and aperiodic structures under high pressure. While the selected contributions do not cover the whole spectrum of high-pressure research, they demonstrate the broad diversity of novel and fascinating results and may awaken the reader's interest in this topic.

  18. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale-up. In this work, we investigate a potentially scalable, high-throughput plasma water reactor that utilizes a packed-bed dielectric-barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).
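
    Decolorization data of the kind described (methylene blue concentration versus treatment time) are commonly summarized with a pseudo-first-order rate constant; the sketch below uses purely illustrative numbers, not the authors' data.

```python
import numpy as np

# Synthetic methylene-blue concentrations (mg/L) vs plasma treatment time (min);
# illustrative numbers only, not data from the work described above.
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
c = np.array([10.0, 7.4, 5.5, 4.1, 3.0, 2.2])

# Pseudo-first-order model: C(t) = C0 * exp(-k*t), i.e. ln C is linear in t.
slope, intercept = np.polyfit(t, np.log(c), 1)
k = -slope                                   # apparent rate constant, 1/min
print(f"k ~ {k:.3f} 1/min, half-life ~ {np.log(2) / k:.1f} min")
```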

  19. High-efficiency screening of monoclonal antibodies for membrane protein crystallography.

    Directory of Open Access Journals (Sweden)

    Hyun-Ho Lim

    Full Text Available Determination of crystal structures of membrane proteins is often limited by difficulties in obtaining crystals diffracting to high resolution. Co-crystallization with Fab fragments of monoclonal antibodies has been reported to improve the diffraction of membrane protein crystals. However, it is not simple to generate useful monoclonal antibodies for membrane protein crystallography. In this report, we present an optimized process for efficient screening, from immunization to final validation of monoclonal antibodies, for membrane protein crystallography.

  20. New methodologies at PF AR-NW12A: the implementation of high-pressure macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Chavas, Leonard Michel Gabriel, E-mail: leonard.chavas@kek.jp [PF/IMSS/KEK, 1-1 Oho, Tsukuba, Ibaraki 300-0801 (Japan); Nagae, Tadayuki [Nagoya University, Nagoya, Aichi 464-8603 (Japan); Nagoya University, Nagoya, Aichi 464-8603 (Japan); Yamada, Hiroyuki [Nagoya University, Nagoya, Aichi 464-8603 (Japan); Watanabe, Nobuhisa [Nagoya University, Nagoya, Aichi 464-8603 (Japan); Nagoya University, Furo-cho Chikusa-ku, Nagoya, Aichi 464-8603 (Japan); Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro [PF/IMSS/KEK, 1-1 Oho, Tsukuba, Ibaraki 300-0801 (Japan)

    2013-11-01

    The evolution of AR-NW12A into a multi-purpose end-station with optional high-pressure crystallography is described. The macromolecular crystallography (MX) beamline AR-NW12A is evolving from its original design of high-throughput crystallography to a multi-purpose end-station. Among the various options to be implemented, great efforts were made in making high-pressure MX (HPMX) available at the beamline. High-pressure molecular biophysics is a developing field that attracts the interest of a constantly growing scientific community. A plethora of activities can benefit from high pressure, and investigations have been performed on its applicability to study multimeric complex assemblies, the compressibility of proteins and their crystals, macromolecules originating from extremophiles, or even the trapping of higher-energy conformers for molecules of biological interest. Recent studies using HPMX showed structural hydrostatic-pressure-induced changes in proteins. The conformational modifications could explain the differences in enzymatic mechanism between proteins of the same family living at different environmental pressures, as well as the initial steps in the pressure-denaturation process, which have been attributed to water penetration into the protein interior. To facilitate further HPMX, while allowing access to various individualized set-ups and experiments, the AR-NW12A sample environment has been revisited. Altogether, the newly added implementations will breathe new life into AR-NW12A and allow the MX community to experiment in a larger set of fields related to structural biology.

  1. High-throughput crystallization screening.

    Science.gov (United States)

    Skarina, Tatiana; Xu, Xiaohui; Evdokimova, Elena; Savchenko, Alexei

    2014-01-01

    Protein structure determination by X-ray crystallography is dependent on obtaining a single protein crystal suitable for diffraction data collection. Due to this requirement, protein crystallization represents a key step in protein structure determination. The conditions for protein crystallization have to be determined empirically for each protein, making this step also a bottleneck in the structure determination process. Typical protein crystallization practice involves parallel setup and monitoring of a considerable number of individual protein crystallization experiments (also called crystallization trials). In these trials, aliquots of purified protein are mixed with a range of solutions composed of a precipitating agent, buffer, and sometimes an additive, that have previously been successful in prompting protein crystallization. The individual chemical conditions in which a particular protein shows signs of crystallization are used as a starting point for further crystallization experiments. The goal is optimizing the formation of individual protein crystals of sufficient size and quality to make them suitable for diffraction data collection. Thus the composition of the primary crystallization screen is critical for successful crystallization. Systematic analysis of crystallization experiments carried out on several hundred proteins as part of large-scale structural genomics efforts allowed the optimization of the protein crystallization protocol and the identification of a minimal set of 96 crystallization solutions (the "TRAP" screen) that, in our experience, led to crystallization of the maximum number of proteins.
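
    As an illustration of how a 96-condition screen can be laid out programmatically for a liquid-handling robot: the chemistry below is an arbitrary example grid, not the "TRAP" screen itself.

```python
from itertools import product

# Illustrative coarse grid of 96 conditions (precipitant x concentration x pH);
# this is NOT the "TRAP" screen mentioned above, just a layout example.
precipitants = ["PEG 3350", "PEG 8000", "ammonium sulfate", "MPD"]
concentrations = ["10%", "20%", "30%"]            # w/v or v/v, per precipitant
phs = [4.5, 5.5, 6.5, 7.5, 8.5, 9.0, 9.5, 10.0]   # buffer pH values

conditions = list(product(precipitants, concentrations, phs))  # 4 * 3 * 8 = 96
rows, cols = "ABCDEFGH", range(1, 13)
wells = [f"{r}{c}" for r in rows for c in cols]                # A1 ... H12

screen = dict(zip(wells, conditions))              # well -> condition mapping
print(screen["A1"], screen["H12"])
```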

  2. Celebrating the International Year of Crystallography with a Wisconsin High School Crystal Growing Competition

    Science.gov (United States)

    Guzei, Ilia A.

    2014-01-01

    In honor of the 2014 International Year of Crystallography, the first Wisconsin Crystal Growing Competition was successfully organized and conducted. High school students from 26 schools across the state competed for prizes by growing large crystals of CuSO[subscript4]·5(H[subscript2]O). This paper describes how the event was planned and carried…

  4. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High-throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for a further increase of throughput appears to be the scoring.

  5. Data Management for High-Throughput Genomics

    CERN Document Server

    Roehm, Uwe

    2009-01-01

    Today's sequencing technology allows sequencing an individual genome within a few weeks for a fraction of the costs of the original Human Genome project. Genomics labs are faced with dozens of TB of data per week that have to be automatically processed and made available to scientists for further analysis. This paper explores the potential and the limitations of using relational database systems as the data processing platform for high-throughput genomics. In particular, we are interested in the storage management for high-throughput sequence data and in leveraging SQL and user-defined functions for data analysis inside a database system. We give an overview of a database design for high-throughput genomics, how we used a SQL Server database in some unconventional ways to prototype this scenario, and we will discuss some initial findings about the scalability and performance of such a more database-centric approach.
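
    A minimal sketch of the "analysis inside the database" idea: the paper used SQL Server, while sqlite3 is used here only as a stand-in, and the reads table and gc_content UDF are illustrative assumptions rather than the paper's design.

```python
import sqlite3

def gc_content(seq):
    """GC fraction of a read; registered below as an in-database UDF."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq) if seq else 0.0

con = sqlite3.connect(":memory:")
con.create_function("gc_content", 1, gc_content)   # UDF callable from SQL

con.execute("CREATE TABLE reads (read_id TEXT PRIMARY KEY, sequence TEXT)")
con.executemany("INSERT INTO reads VALUES (?, ?)", [
    ("r1", "ACGTACGTGGCC"),
    ("r2", "ATATATATATAT"),
    ("r3", "GGGCCCGGGCCC"),
])

# The filtering happens inside the database engine rather than in client code.
for row in con.execute(
        "SELECT read_id, gc_content(sequence) FROM reads "
        "WHERE gc_content(sequence) > 0.4"):
    print(row)
```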

  6. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    Science.gov (United States)

    JR, Luft; EH, Snell; GT, DeTitta

    2011-01-01

    Introduction: X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered: The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-user, multiuser-laboratory or industrial scale. Expert opinion: High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  7. High-Throughput Contact Flow Lithography.

    Science.gov (United States)

    Le Goff, Gaelle C; Lee, Jiseok; Gupta, Ankur; Hill, William Adam; Doyle, Patrick S

    2015-10-01

    High-throughput fabrication of graphically encoded hydrogel microparticles is achieved by combining flow contact lithography in a multichannel microfluidic device and a high capacity 25 mm LED UV source. Production rates of chemically homogeneous particles are improved by two orders of magnitude. Additionally, the custom-built contact lithography instrument provides an affordable solution for patterning complex microstructures on surfaces.

  8. High-throughput computing in the sciences.

    Science.gov (United States)

    Morgan, Mark; Grimshaw, Andrew

    2009-01-01

    While it is true that the modern computer is many orders of magnitude faster than that of yesteryear, this tremendous growth in CPU clock rates is now over. Unfortunately, however, the growth in demand for computational power has not abated; whereas researchers a decade ago could simply wait for computers to get faster, today the only solution to the growing need for more powerful computational resources lies in the exploitation of parallelism. Software parallelization falls generally into two broad categories: "true parallel" and high-throughput computing. This chapter focuses on the latter of these two types of parallelism. With high-throughput computing, users can run many copies of their software at the same time across many different computers. This technique for achieving parallelism is powerful in its ability to provide high degrees of parallelism, yet simple in its conceptual implementation. This chapter covers various patterns of high-throughput computing usage and the skills and techniques necessary to take full advantage of them. By utilizing numerous examples and sample codes and scripts, we hope to provide the reader not only with a deeper understanding of the principles behind high-throughput computing, but also with a set of tools and references that will prove invaluable as she explores software parallelism with her own software applications and research.
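
    A toy sketch of the "many independent copies" pattern, using a local process pool as a stand-in for a cluster or grid scheduler; simulate is a placeholder for the user's program and the parameter sweep is arbitrary.

```python
from concurrent.futures import ProcessPoolExecutor
import math

def simulate(param):
    """Placeholder for one independent copy of the user's program."""
    # Each task is embarrassingly parallel: no communication with other tasks.
    return param, sum(math.sin(i * param) for i in range(100_000))

if __name__ == "__main__":
    params = [0.1 * i for i in range(1, 65)]       # 64 independent work units
    with ProcessPoolExecutor() as pool:            # stand-in for a grid/cluster
        for param, result in pool.map(simulate, params):
            print(f"param={param:.1f} -> {result:.3f}")
```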

  9. INTRODUCTION OF THE HIGH THROUGHPUT SCREENING SYSTEM

    Institute of Scientific and Technical Information of China (English)

    李元

    2001-01-01

    In this article, we introduce the high-throughput screening (HTS) system. Its role in new drug research and its current development are described. The relationship between research achievements in genome studies and new types of screening models for new drugs is emphasized. Personal opinions on current problems in HTS research in China are presented.

  11. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Romao, Joana; Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for lar

  12. High-throughput diffusion multiples

    Directory of Open Access Journals (Sweden)

    J.-C. Zhao

    2005-10-01

    Full Text Available A diffusion multiple is an assembly of three or more different metal blocks, in intimate interfacial contact, subjected to high temperature to allow thermal interdiffusion to create solid-solution compositions and intermetallic compounds. Using microscale probes, composition-structure-phase-property relationships can be established with an efficiency orders of magnitude higher than conventional one-composition-at-a-time practice. For structural materials, such relationships include phase diagrams, diffusion coefficients, precipitation kinetics, solution strengthening effects, and precipitation strengthening effects. Many microscale probes can also be used to study several materials phenomena. For instance, microscale thermal conductivity measurements can be used to study order-disordering transformation, site preference in intermetallic compounds, solid-solution effect on conductivity, and compositional point defect propensity. This article will use a few examples to illustrate the capabilities and developmental needs of this approach.

  13. Crystallography of decahedral and icosahedral particles. II - High symmetry orientations

    Science.gov (United States)

    Yang, C. Y.; Yacaman, M. J.; Heinemann, K.

    1979-01-01

    Based on the exact crystal structure of decahedral and icosahedral particles, high energy electron diffraction patterns and image profiles have been derived for various high symmetry orientations of the particles with respect to the incident beam. These results form a basis for the identification of small metal particle structures with advanced methods of transmission electron microscopy.

  14. High throughput-per-footprint inertial focusing.

    Science.gov (United States)

    Ciftlik, Ata Tuna; Ettori, Maxime; Gijs, Martin A M

    2013-08-26

    Matching the scale of microfluidic flow systems with that of microelectronic chips for realizing monolithically integrated systems still needs to be accomplished. However, this is appealing only if such re-scaling does not compromise the fluidic throughput. This is related to the fact that the cost of microelectronic circuits primarily depends on the layout footprint, while the performance of many microfluidic systems, like flow cytometers, is measured by the throughput. The simple operation of inertial particle focusing makes it a promising technique for use in such integrated flow cytometer applications, however, microfluidic footprints demonstrated so far preclude monolithic integration. Here, the scaling limits of throughput-per-footprint (TPFP) in using inertial focusing are explored by studying the interplay between theory, the effect of channel Reynolds numbers up to 1500 on focusing, the entry length for the laminar flow to develop, and pressure resistance of the microchannels. Inertial particle focusing is demonstrated with a TPFP up to 0.3 L/(min cm²) in high aspect-ratio rectangular microfluidic channels that are readily fabricated with a post-CMOS integratable process, suggesting at least a 100-fold improvement compared to previously demonstrated techniques. Not only can this be an enabling technology for realizing cost-effective monolithically integrated flow cytometry devices, but the methodology represented here can also open perspectives for miniaturization of many biomedical microfluidic applications requiring monolithic integration with microelectronics without compromising the throughput.
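
    A back-of-envelope version of the numbers that govern this regime: hydraulic diameter, channel Reynolds number and laminar entry length for a high-aspect-ratio rectangular channel. The dimensions and flow rate below are illustrative assumptions, not the paper's geometry.

```python
# Back-of-envelope for inertial focusing in a high-aspect-ratio rectangular
# channel. Dimensions and flow rate are illustrative, not the paper's values.
rho, mu = 1000.0, 1.0e-3            # water: kg/m^3, Pa*s
w, h = 40e-6, 200e-6                # channel width and height (m)
Q = 1.0e-6 / 60                     # 1 mL/min in m^3/s

A = w * h
D_h = 2 * w * h / (w + h)           # hydraulic diameter of a rectangular duct
U = Q / A                           # mean velocity
Re = rho * U * D_h / mu             # channel Reynolds number
L_e = 0.06 * Re * D_h               # common laminar entry-length estimate

print(f"U = {U:.2f} m/s, Re = {Re:.0f}, entry length ~ {L_e * 1e3:.1f} mm")
```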

  15. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high-dimensional neuroinformatic representations, an index containing O(1E3-1E4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  16. Advances in structural and functional analysis of membrane proteins by electron crystallography.

    Science.gov (United States)

    Wisedchaisri, Goragot; Reichow, Steve L; Gonen, Tamir

    2011-10-12

    Electron crystallography is a powerful technique for the study of membrane protein structure and function in the lipid environment. When well-ordered two-dimensional crystals are obtained the structure of both protein and lipid can be determined and lipid-protein interactions analyzed. Protons and ionic charges can be visualized by electron crystallography and the protein of interest can be captured for structural analysis in a variety of physiologically distinct states. This review highlights the strengths of electron crystallography and the momentum that is building up in automation and the development of high throughput tools and methods for structural and functional analysis of membrane proteins by electron crystallography.

  17. Affinity Crystallography: A New Approach to Extracting High-Affinity Enzyme Inhibitors from Natural Extracts.

    Science.gov (United States)

    Aguda, Adeleke H; Lavallee, Vincent; Cheng, Ping; Bott, Tina M; Meimetis, Labros G; Law, Simon; Nguyen, Nham T; Williams, David E; Kaleta, Jadwiga; Villanueva, Ivan; Davies, Julian; Andersen, Raymond J; Brayer, Gary D; Brömme, Dieter

    2016-08-26

    Natural products are an important source of novel drug scaffolds. The highly variable and unpredictable timelines associated with isolating novel compounds and elucidating their structures have led to the demise of exploring natural product extract libraries in drug discovery programs. Here we introduce affinity crystallography as a new methodology that significantly shortens the time of the hit to active structure cycle in bioactive natural product discovery research. This affinity crystallography approach is illustrated by using semipure fractions of an actinomycetes culture extract to isolate and identify a cathepsin K inhibitor and to compare the outcome with the traditional assay-guided purification/structural analysis approach. The traditional approach resulted in the identification of the known inhibitor antipain (1) and its new but lower potency dehydration product 2, while the affinity crystallography approach led to the identification of a new high-affinity inhibitor named lichostatinal (3). The structure and potency of lichostatinal (3) was verified by total synthesis and kinetic characterization. To the best of our knowledge, this is the first example of isolating and characterizing a potent enzyme inhibitor from a partially purified crude natural product extract using a protein crystallographic approach.

  18. Microfabricated high-throughput electronic particle detector

    Science.gov (United States)

    Wood, D. K.; Requa, M. V.; Cleland, A. N.

    2007-10-01

    We describe the design, fabrication, and use of a radio-frequency reflectometer integrated with a microfluidic system, applied to the very high-throughput measurement of micron-scale particles passing through the sensor region in a microfluidic channel. The device operates as a microfabricated Coulter counter [U.S. Patent No. 2656508 (1953)], similar to a design we have described previously, but here with significantly improved electrode geometry as well as electronic tuning of the reflectometer; together the two improvements yield more than a factor of 10 gain in the signal to noise and in the diametric discrimination of single particles. We demonstrate the high-throughput discrimination of polystyrene beads with diameters in the 4-10 μm range, achieving diametric resolutions comparable to the intrinsic spread of diameters in the bead distribution, at rates in excess of 15 × 10^6 beads/h.
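
    The d-cubed scaling behind such diametric discrimination comes from the textbook small-particle (Maxwell/DeBlois-Bean) approximation for a Coulter aperture, ΔR ≈ 4ρd³/(πD⁴) for d ≪ D; the sketch below evaluates it with illustrative numbers that are not taken from the device described above.

```python
import math

# Textbook small-particle approximation for the Coulter resistance pulse:
# dR ~ 4 * rho * d^3 / (pi * D^4), valid for particle diameter d << aperture D.
# Numbers are illustrative, not taken from the device described above.
rho = 0.7          # electrolyte resistivity, ohm*m (roughly PBS-like)
D = 30e-6          # effective sensing-aperture diameter, m

for d_um in (4, 6, 8, 10):
    d = d_um * 1e-6
    dR = 4 * rho * d ** 3 / (math.pi * D ** 4)
    print(f"d = {d_um:2d} um  ->  dR ~ {dR:.1f} ohm")
```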

  19. HIGH THROUGHPUT DRILLING OF TITANIUM ALLOYS

    Institute of Scientific and Technical Information of China (English)

    LI Rui; SHIH Albert Jau-Min

    2007-01-01

    Experiments on high-throughput drilling of Ti-6Al-4V at a cutting speed of 183 m/min and a material removal rate of 156 mm³/s using a 4 mm diameter WC-Co spiral point drill are conducted. At this material removal rate, it took only 0.57 s to drill a hole in a 6.35 mm thick Ti plate. Supplying the cutting fluid via through-the-drill holes and the balance of cutting speed and feed have proven to be critical for drill life. An inverse heat transfer model is developed to predict the heat flux and the drill temperature distribution in drilling. Three-dimensional finite element modeling of drilling is conducted to predict the thrust force and torque. Experimental results demonstrate that, using proper machining process parameters, tool geometry, and fine-grained WC-Co tool material, high-throughput machining of Ti alloys is technically feasible.
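
    A quick consistency check of the quoted figures (hole volume, time at full diameter, spindle speed and feed per revolution); the small gap between the ~0.5 s computed here and the reported 0.57 s is plausibly the drill-point entry and breakthrough, which this simple cylinder estimate ignores.

```python
import math

# Cross-check of the quoted drilling parameters.
D = 4.0            # drill diameter, mm
t = 6.35           # plate thickness, mm
MRR = 156.0        # material removal rate, mm^3/s
v_c = 183.0        # cutting speed, m/min

hole_volume = math.pi * (D / 2) ** 2 * t                 # ~79.8 mm^3 (cylinder)
time_full_dia = hole_volume / MRR                        # ~0.51 s at full diameter
spindle_rpm = v_c * 1000 / (math.pi * D)                 # ~14,600 rev/min
feed_per_rev = MRR / (math.pi * (D / 2) ** 2) / (spindle_rpm / 60)  # mm/rev

print(f"hole volume ~ {hole_volume:.1f} mm^3")
print(f"time at full diameter ~ {time_full_dia:.2f} s")
print(f"spindle speed ~ {spindle_rpm:.0f} rpm, feed ~ {feed_per_rev:.3f} mm/rev")
```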

  20. Weissenberg reflection high-energy electron diffraction for surface crystallography.

    Science.gov (United States)

    Abukawa, Tadashi; Yamazaki, Tomoyuki; Yajima, Kentaro; Yoshimura, Koji

    2006-12-15

    The principle of a Weissenberg camera is applied to surface crystallographic analysis by reflection high-energy electron diffraction. By removing inelastic electrons and measuring hundreds of patterns as a function of the sample rotation angle φ, kinematical analysis can be performed over a large volume of reciprocal space. The data set is equivalent to a three-dimensional stack of Weissenberg photographs. The method is applied to the analysis of a Si(111)-√3 × √3-Ag surface, and the structural data obtained are in excellent agreement with the known atomic structure.

  1. High-Pressure Crystallography From Fundamental Phenomena to Technological Applications

    CERN Document Server

    Boldyreva, Elena

    2010-01-01

    This book is devoted to the theme of crystallographic studies at high pressure, with emphasis on the phenomena characteristic to the compressed state of matter, as well as experimental and theoretical techniques used to study these phenomena. As a thermodynamic parameter, pressure is remarkable in many ways. In the visible universe its value spans over sixty orders of magnitude, from the non-equilibrium pressure of hydrogen in intergalactic space, to the kind of pressure encountered within neutron stars. In the laboratory, it provides the unique possibility to control the structure and properties of materials, to dramatically alter electronic properties, and to break existing, or form new chemical bonds. This agenda naturally encompasses elements of physics (properties, structure and transformations), chemistry (reactions, transport), materials science (new materials) and engineering (mechanical properties); in addition it has direct applications and implications for geology (minerals in deep Earth environmen...

  2. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  3. In situ macromolecular crystallography using microbeams.

    Science.gov (United States)

    Axford, Danny; Owen, Robin L; Aishima, Jun; Foadi, James; Morgan, Ann W; Robinson, James I; Nettleship, Joanne E; Owens, Raymond J; Moraes, Isabel; Fry, Elizabeth E; Grimes, Jonathan M; Harlos, Karl; Kotecha, Abhay; Ren, Jingshan; Sutton, Geoff; Walter, Thomas S; Stuart, David I; Evans, Gwyndaf

    2012-05-01

    Despite significant progress in high-throughput methods in macromolecular crystallography, the production of diffraction-quality crystals remains a major bottleneck. By recording diffraction in situ from crystals in their crystallization plates at room temperature, a number of problems associated with crystal handling and cryoprotection can be side-stepped. Using a dedicated goniometer installed on the microfocus macromolecular crystallography beamline I24 at Diamond Light Source, crystals have been studied in situ with an intense and flexible microfocus beam, allowing weakly diffracting samples to be assessed without a manual crystal-handling step but with good signal to noise, despite the background scatter from the plate. A number of case studies are reported: the structure solution of bovine enterovirus 2, crystallization screening of membrane proteins and complexes, and structure solution from crystallization hits produced via a high-throughput pipeline. These demonstrate the potential for in situ data collection and structure solution with microbeams.

  4. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fenglei [Iowa State Univ., Ames, IA (United States)

    2006-08-09

    The purposes of our research were: (1) to develop an economical, easy-to-use, automated, high-throughput system for large-scale protein crystallization screening; (2) to develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with the high-throughput screening system; (3) to determine the structure of lactate dehydrogenase complexed with NADH by X-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated that large-scale protein crystallization screening can be performed in a high-throughput manner with low cost and easy operation. The overall system integrates liquid dispensing, crystallization and detection and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants at the nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible-light detector for detecting protein crystallization screening results. This detection scheme has the capability of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high-throughput and non-destructive manner. The entire system, from liquid dispensing and crystallization to crystal detection, is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas-permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to the nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for a given amount of protein. In addition

  5. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  6. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality-control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  7. Achieving High Data Throughput in Research Networks

    Institute of Scientific and Technical Information of China (English)

    WarrenMatthews; LesCottrell

    2001-01-01

After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165 TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40 TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3, so within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to the infrastructure are essential, and a detailed understanding of performance issues and of the requirements for reliable high throughput transfers is critical. In this talk, results from active and passive monitoring and direct measurements of throughput will be reviewed, and methods for achieving the ambitious requirements will be discussed.
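To make the bandwidth concern concrete, a back-of-envelope calculation (not taken from the talk) shows how long the quoted 20 TB export would take on a fully dedicated OC3 link, and what sustained rate a yearly export of a third of a 165 TB volume would require; real transfers would be slower because of protocol overhead and link sharing.

```python
# Back-of-envelope check of the BaBar export figures (illustrative only).
TB = 1e12 * 8                      # bits per (decimal) terabyte
oc3_bps = 155e6                    # OC3 line rate

# Time to move the 20 TB already exported over a fully dedicated OC3 link.
days = 20 * TB / oc3_bps / 86400
print(f"20 TB at full OC3 rate: ~{days:.0f} days")          # ~12 days

# Average rate needed to export a third of a 165 TB yearly volume
# (assumes the quoted database size as a stand-in for one year of data).
yearly_export_bps = (165 / 3) * TB / (365 * 86400)
print(f"sustained export rate: ~{yearly_export_bps / 1e6:.0f} Mbps")
# Doubling the data volume each year quickly pushes this toward 155 Mbps.
```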

  8. High-throughput rod-induced electrospinning

    Science.gov (United States)

    Wu, Dezhi; Xiao, Zhiming; Teh, Kwok Siong; Han, Zhibin; Luo, Guoxi; Shi, Chuan; Sun, Daoheng; Zhao, Jinbao; Lin, Liwei

    2016-09-01

A high throughput electrospinning process, driven directly from a flat polymer solution surface by a moving insulating rod, has been proposed and demonstrated. Rods made of either phenolic resin or paper, with a diameter of 1-3 cm and a resistance of about 100-500 MΩ, have been successfully utilized in the process. The rod is placed approximately 10 mm above the flat polymer solution surface and moved at a speed of 0.005-0.4 m s^-1; this causes the solution to generate multiple liquid jets under an applied voltage of 15-60 kV for the tip-less electrospinning process. The local electric field induced by the rod boosts the electrohydrodynamic instability that generates Taylor cones and liquid jets. Experimentally, it is found that a large rod diameter and a small solution-to-rod distance enhance the local electric field and reduce the magnitude of the applied voltage required. In a prototype setup with poly(ethylene oxide) polymer solution, an area of 5 cm × 10 cm and an applied voltage of 60 kV, the maximum throughput of nanofibers is recorded to be approximately 144 g m^-2 h^-1.
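For a sense of scale, the reported areal throughput can be converted into an absolute fiber production rate for the stated 5 cm × 10 cm prototype area; this is a simple unit conversion, not a figure reported in the paper.

```python
# Convert the reported areal throughput into an absolute production rate.
areal_throughput_g_m2_h = 144.0        # g m^-2 h^-1, reported maximum
area_m2 = 0.05 * 0.10                  # 5 cm x 10 cm prototype area in m^2

print(f"{areal_throughput_g_m2_h * area_m2:.2f} g of nanofibers per hour")  # 0.72
```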

  9. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; in order to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need to have optical properties different from those of the substrate. UV absorption detection provides nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of an immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on a spot gives information about the reaction rate. The same microarray can be used many times; thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  10. High Throughput Screening for Neurodegeneration and Complex Disease Phenotypes

    OpenAIRE

    Varma, Hemant; Lo, Donald C.; Stockwell, Brent R.

    2008-01-01

    High throughput screening (HTS) for complex diseases is challenging. This stems from the fact that complex phenotypes are difficult to adapt to rapid, high throughput assays. We describe the recent development of high throughput and high-content screens (HCS) for neurodegenerative diseases, with a focus on inherited neurodegenerative disorders, such as Huntington's disease. We describe, among others, HTS assays based on protein aggregation, neuronal death, caspase activation and mutant protei...

  11. A high-throughput neutron spectrometer

    Science.gov (United States)

    Stampfl, Anton; Noakes, Terry; Bartsch, Friedl; Bertinshaw, Joel; Veliscek-Carolan, Jessica; Nateghi, Ebrahim; Raeside, Tyler; Yethiraj, Mohana; Danilkin, Sergey; Kearley, Gordon

    2010-03-01

    A cross-disciplinary high-throughput neutron spectrometer is currently under construction at OPAL, ANSTO's open pool light-water research reactor. The spectrometer is based on the design of a Be-filter spectrometer (FANS) that is operating at the National Institute of Standards research reactor in the USA. The ANSTO filter-spectrometer will be switched in and out with another neutron spectrometer, the triple-axis spectrometer, Taipan. Thus two distinct types of neutron spectrometers will be accessible: one specialised to perform phonon dispersion analysis and the other, the filter-spectrometer, designed specifically to measure vibrational density of states. A summary of the design will be given along with a detailed ray-tracing analysis. Some preliminary results will be presented from the spectrometer.

  12. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come......-sequencing, a study of the effects on alternative RNA splicing of KO of the nonsense mediated RNA decay system in Mus, using digital gene expression and a custom-built exon-exon junction mapping pipeline is presented (article I). Evolved from this work, a Bioconductor package, spliceR, for classifying alternative...... splicing events and coding potential of isoforms from full isoform deconvolution software, such as Cufflinks (article II), is presented. Finally, a study using 5’-end RNA-seq for alternative promoter detection between healthy patients and patients with acute promyelocytic leukemia is presented (article III...

  13. C. elegans in high-throughput drug discovery

    OpenAIRE

    O’Reilly, Linda P.; Cliff J Luke; Perlmutter, David H.; Silverman, Gary A.; Pak, Stephen C.

    2013-01-01

    C. elegans has proven to be a useful model organism for investigating molecular and cellular aspects of numerous human diseases. More recently, investigators have explored the use of this organism as a tool for drug discovery. Although earlier drug screens were labor-intensive and low in throughput, recent advances in high-throughput liquid workflows, imaging platforms and data analysis software have made C. elegans a viable option for automated high-throughput drug screens. This review will ...

  14. High Throughput Direct Detection Doppler Lidar Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Lite Cycles, Inc. (LCI) proposes to develop a direct-detection Doppler lidar (D3L) technology called ELITE that improves the system optical throughput by more than...

  15. High Volume Throughput Computing: Identifying and Characterizing Throughput Oriented Workloads in Data Centers

    CERN Document Server

    Zhan, Jianfeng; Sun, Ninghui; Wang, Lei; Jia, Zhen; Luo, Chunjie

    2012-01-01

For the first time, this paper systematically identifies three categories of throughput-oriented workloads in data centers: services, data processing applications, and interactive real-time applications, whose targets are to increase the volume of throughput in terms of processed requests, processed data, or the maximum number of simultaneous subscribers supported, respectively. We coin the term high volume throughput computing (in short, HVC) to describe these workloads and the data center systems designed for them. We characterize and compare HVC with other computing paradigms, e.g., high throughput computing, warehouse-scale computing, and cloud computing, in terms of levels, workloads, metrics, coupling degree, data scales, and number of jobs or service instances. We also preliminarily report our ongoing work on the metrics and benchmarks for HVC systems, which is the foundation of designing innovative data center systems for HVC workloads.

  16. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen) genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. This approach first selects sequences containing the SNPs present in all alleles of the locus of interest. It then calculates, for each candidate probe sequence, the number of base changes necessary to convert it to the closest subsequence within the set of sequences likely to be present in the sample, including the remainder of the human genome, in order to identify those candidate probes that are "ultraspecific" for the allele of interest; due to the high specificity of these sequences, preliminary steps such as PCR amplification may no longer be necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest resolution typing can be achieved for the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
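The key quantity in the second step is, for each candidate probe, the minimum number of base changes needed to turn it into any equal-length subsequence of the off-target background; probes whose minimum distance is large are "ultraspecific". A naive sketch of that computation is shown below; the actual tool, distance cut-off, and background handling are not given in the abstract, so this is purely illustrative.

```python
def min_base_changes(probe: str, background: str) -> int:
    """Minimum Hamming distance between `probe` and any equal-length window
    of `background`. This naive O(len(probe) * len(background)) scan stands
    in for the indexed genome-scale search a real design pipeline would use."""
    k = len(probe)
    best = k
    for i in range(len(background) - k + 1):
        window = background[i:i + k]
        dist = sum(a != b for a, b in zip(probe, window))
        best = min(best, dist)
    return best

# Toy example: a candidate probe scored against a short background sequence.
probe = "ACGTACGTAC"
background = "TTTTACGAACGTTTTTTT"
print(min_base_changes(probe, background))  # larger values => more specific probe
```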

  17. Orthogonal NGS for High Throughput Clinical Diagnostics.

    Science.gov (United States)

    Chennagiri, Niru; White, Eric J; Frieden, Alexander; Lopez, Edgardo; Lieber, Daniel S; Nikiforov, Anastasia; Ross, Tristen; Batorsky, Rebecca; Hansen, Sherry; Lip, Va; Luquette, Lovelace J; Mauceli, Evan; Margulies, David; Milos, Patrice M; Napolitano, Nichole; Nizzari, Marcia M; Yu, Timothy; Thompson, John F

    2016-04-19

    Next generation sequencing is a transformative technology for discovering and diagnosing genetic disorders. However, high-throughput sequencing remains error-prone, necessitating variant confirmation in order to meet the exacting demands of clinical diagnostic sequencing. To address this, we devised an orthogonal, dual platform approach employing complementary target capture and sequencing chemistries to improve speed and accuracy of variant calls at a genomic scale. We combined DNA selection by bait-based hybridization followed by Illumina NextSeq reversible terminator sequencing with DNA selection by amplification followed by Ion Proton semiconductor sequencing. This approach yields genomic scale orthogonal confirmation of ~95% of exome variants. Overall variant sensitivity improves as each method covers thousands of coding exons missed by the other. We conclude that orthogonal NGS offers improvements in variant calling sensitivity when two platforms are used, better specificity for variants identified on both platforms, and greatly reduces the time and expense of Sanger follow-up, thus enabling physicians to act on genomic results more quickly.
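The core of the orthogonal-confirmation idea is a set operation over variant calls from the two platforms. The sketch below is schematic only: variants are reduced to (chrom, pos, ref, alt) tuples with invented example values, and it is not the authors' pipeline.

```python
# Schematic orthogonal confirmation of variant calls from two platforms
# (illustrative data; a real pipeline would also normalize representations).
illumina_calls = {("chr1", 1000, "A", "G"),
                  ("chr2", 5000, "C", "T"),
                  ("chr3", 750,  "G", "A")}
proton_calls   = {("chr1", 1000, "A", "G"),
                  ("chr3", 750,  "G", "A"),
                  ("chr4", 42,   "T", "C")}

confirmed  = illumina_calls & proton_calls    # called on both platforms
discordant = illumina_calls ^ proton_calls    # platform-specific calls

print(f"orthogonally confirmed: {len(confirmed)}")
print(f"needing review or Sanger follow-up: {len(discordant)}")
```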

  18. Multigrain crystallography

    DEFF Research Database (Denmark)

    Sørensen, Henning Osholm; Schmidt, Søren; Wright, Jonathan P.;

    2012-01-01

We summarize exploratory work on multigrain crystallography. The experimental arrangement comprises a monochromatic beam, a fully illuminated sample with up to several hundred grains in transmission geometry on a rotary table and a 2D detector. Novel algorithms are presented for indexing, integra ... of the methodology in terms of number of grains, size of unit cell and direct space resolution. First experimental results in the fields of chemistry, structural biology and time-resolved studies in photochemistry are presented. As an outlook, the concept of TotalCrystallography is introduced, defined

  19. Preparation of 2D crystals of membrane proteins for high-resolution electron crystallography data collection.

    Science.gov (United States)

    Abeyrathne, Priyanka D; Chami, Mohamed; Pantelic, Radosav S; Goldie, Kenneth N; Stahlberg, Henning

    2010-01-01

    Electron crystallography is a powerful technique for the structure determination of membrane proteins as well as soluble proteins. Sample preparation for 2D membrane protein crystals is a crucial step, as proteins have to be prepared for electron microscopy at close to native conditions. In this review, we discuss the factors of sample preparation that are key to elucidating the atomic structure of membrane proteins using electron crystallography.

  20. High Throughput Bent-Pipe Processor Demonstrator

    Science.gov (United States)

    Tabacco, P.; Vernucci, A.; Russo, L.; Cangini, P.; Botticchio, T.; Angeletti, P.

    2008-08-01

The work associated with this article is a study initiative sponsored by ESA/ESTEC that responds to the crucial need to develop new satellite payloads capable of rapid progress in handling large amounts of data at a price competitive with terrestrial telecommunication networks. Considering the quite limited band allocated to space communications at Ka band, reusing the same band in a large number of beams is mandatory; beam-forming is therefore the right technological answer. Technological progress, mainly in the digital domain, also helps greatly in increasing satellite capacity. Targets for the next generation of satellite payloads are set in the throughput range of 50 Gbps. Although the implementation of a wideband transparent processor for a high capacity communication payload is a very challenging task, the Space Engineering team, in the frame of this ESA study, proposed an intermediate development step: a scalable unit able to demonstrate both the capacity and flexibility objectives for different types of wideband beamforming antenna designs. To this end, the article describes the features of the wideband HW (analog and digital) platform purposely developed by Space Engineering in the frame of this ESA/ESTEC contract (the "WDBFN" contract), together with some preliminary system test results. The same platform and part of the associated SW will be used in the development and demonstration of the real payload digital front-end mux and demux algorithms, as well as the beam forming and on-board channel switching in the frequency domain. At the time of writing, although new FPGAs and new ADC and DAC converters have become available as choices for wideband system implementation, the two HW platforms developed by Space Engineering, namely the WDBFN ADC and DAC boards, still represent the best-performing units in terms of analog bandwidth, processing capability (FPGA module density), SERDES (SERializer/DESerializer) external link density, integration form

  1. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  2. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  3. Large area high-resolution CCD-based X-ray detector for macromolecular crystallography

    CERN Document Server

    Pokric, M; Jorden, A R; Cox, M P; Marshall, A; Long, P G; Moon, K; Jerram, P A; Pool, P; Nave, C; Derbyshire, G E; Helliwell, J R

    2002-01-01

An X-ray detector system for macromolecular crystallography based on a large area charge-coupled device (CCD) sensor has been developed as part of a large research and development programme for advanced X-ray sensor technology, funded by industry and the Particle Physics and Astronomy Research Council (PPARC) in the UK. The prototype detector consists of two large area three-sides buttable charge-coupled devices (CCD 46-62 EEV), where the single CCD area is 55.3 mm x 41.5 mm. The overall detector imaging area is easily extendable to 85 mm x 110 mm. The detector consists of an optically coupled X-ray sensitive phosphor, skewed fibre-optic studs and CCDs. The crystallographic measurement requirements at synchrotron sources are met through a high spatial resolution (2048 x 1536 pixel array), high dynamic range (approx. 10^5), a fast readout (approx. 1 s), low noise (<10 e^-) and much reduced parallax error. Additionally, the prototype detector system has been optimised by increasing its efficiency at low X-ray ene...

  4. Resolution- and throughput-enhanced spectroscopy using high-throughput computational slit

    CERN Document Server

    Kazemzadeh, Farnoud

    2016-01-01

There exists a fundamental tradeoff between spectral resolution and efficiency, or throughput, for all optical spectrometers. The primary factors affecting the spectral resolution and throughput of an optical spectrometer are the size of the entrance aperture and the optical power of the focusing element, and jointly optimizing both has thus far proven difficult. Here, we introduce the concept of high-throughput computational slits (HTCS) for improving both the effective spectral resolution and the efficiency of a spectrometer. The proposed HTCS approach was experimentally validated using an optical spectrometer configured with a 200 um entrance aperture (test) and a 50 um entrance aperture (control), demonstrating an improvement in spectral resolution of ~20% over the control and an efficiency more than 2 times that of the largest entrance aperture used in the study, while producing highly accurate spectra.

  5. HIGH THROUGHPUT OF MAP PROCESSOR USING PIPELINE WINDOW DECODING

    Directory of Open Access Journals (Sweden)

    P. Nithya

    2012-11-01

Full Text Available Turbo codes are among the most efficient error-correcting codes, approaching the Shannon limit. High throughput in a turbo decoder can be achieved by parallelizing several Soft Input Soft Output (SISO) units. In this way, multiple SISO decoders work on the same data frame at the same time, and the soft outputs they deliver can be split into three terms: the soft channel input, the a priori input and the extrinsic value. The extrinsic value is used for the next iteration. This work presents a high-throughput Max-Log-MAP processor that supports both single-binary (SB) and double-binary (DB) convolutional turbo codes. Decoding these codes, however, is an iterative process that requires a high computation rate and introduces latency; high throughput and reduced latency are therefore pursued using serial processing techniques. Pipeline window (PW) decoding is introduced to support arbitrary frame sizes with high throughput.
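The "Max-Log" in Max-Log-MAP refers to approximating the log-sum-exp (Jacobian logarithm) used in the exact MAP metric recursions by a simple maximum, which is what makes high-throughput hardware implementations practical. The short numerical illustration below shows only this approximation, not the decoder described in the record.

```python
import math

def jacobian_log(a: float, b: float) -> float:
    """Exact log-sum-exp used in full Log-MAP decoding."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_log(a: float, b: float) -> float:
    """Max-Log-MAP approximation: keep the maximum, drop the correction term."""
    return max(a, b)

for a, b in [(2.0, 1.5), (3.0, -1.0), (0.2, 0.1)]:
    print(f"exact={jacobian_log(a, b):.3f}  max-log={max_log(a, b):.3f}")
```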

  6. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek Co. Inc. proposes to develop a high throughput, nominal 100 W Hall Effect Thruster (HET). This HET will be sized for small spacecraft (< 180 kg), including...

  7. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek is developing a high throughput nominal 100-W Hall Effect Thruster. This device is well sized for spacecraft ranging in size from several tens of kilograms to...

  8. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    Science.gov (United States)

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  9. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  10. Room-temperature serial crystallography at synchrotron X-ray sources using slowly flowing free-standing high-viscosity microstreams.

    Science.gov (United States)

    Botha, Sabine; Nass, Karol; Barends, Thomas R M; Kabsch, Wolfgang; Latz, Beatrice; Dworkowski, Florian; Foucar, Lutz; Panepucci, Ezequiel; Wang, Meitian; Shoeman, Robert L; Schlichting, Ilme; Doak, R Bruce

    2015-02-01

    Recent advances in synchrotron sources, beamline optics and detectors are driving a renaissance in room-temperature data collection. The underlying impetus is the recognition that conformational differences are observed in functionally important regions of structures determined using crystals kept at ambient as opposed to cryogenic temperature during data collection. In addition, room-temperature measurements enable time-resolved studies and eliminate the need to find suitable cryoprotectants. Since radiation damage limits the high-resolution data that can be obtained from a single crystal, especially at room temperature, data are typically collected in a serial fashion using a number of crystals to spread the total dose over the entire ensemble. Several approaches have been developed over the years to efficiently exchange crystals for room-temperature data collection. These include in situ collection in trays, chips and capillary mounts. Here, the use of a slowly flowing microscopic stream for crystal delivery is demonstrated, resulting in extremely high-throughput delivery of crystals into the X-ray beam. This free-stream technology, which was originally developed for serial femtosecond crystallography at X-ray free-electron lasers, is here adapted to serial crystallography at synchrotrons. By embedding the crystals in a high-viscosity carrier stream, high-resolution room-temperature studies can be conducted at atmospheric pressure using the unattenuated X-ray beam, thus permitting the analysis of small or weakly scattering crystals. The high-viscosity extrusion injector is described, as is its use to collect high-resolution serial data from native and heavy-atom-derivatized lysozyme crystals at the Swiss Light Source using less than half a milligram of protein crystals. The room-temperature serial data allow de novo structure determination. The crystal size used in this proof-of-principle experiment was dictated by the available flux density. However, upcoming

  11. Status of the crystallography beamlines at PETRA III

    Science.gov (United States)

    Burkhardt, Anja; Pakendorf, Tim; Reime, Bernd; Meyer, Jan; Fischer, Pontus; Stübe, Nicolas; Panneerselvam, Saravanan; Lorbeer, Olga; Stachnik, Karolina; Warmer, Martin; Rödig, Philip; Göries, Dennis; Meents, Alke

    2016-03-01

Since 2013, three beamlines for macromolecular crystallography are available to users at the third-generation synchrotron PETRA III in Hamburg: P11, P13 and P14, the latter two operated by EMBL. Beamline P11 is operated by DESY and is equipped with a Pilatus 6M detector. Together with the photon flux of 2 × 10^13 ph/s provided by the very brilliant X-ray source of PETRA III, a full data set can typically be collected in less than 2 min. P11 provides state-of-the-art microfocusing capabilities with beam sizes down to 1 × 1 μm^2, which makes the beamline ideally suited for investigation of microcrystals and serial crystallography experiments. An automatic sample changer allows fast sample exchange in less than 20 s, which enables high-throughput crystallography and fast crystal screening. For sample preparation, an S2 biosafety laboratory is available in close proximity to the beamline.
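Combining the quoted timings gives a rough upper bound on unattended sample throughput at P11; this is a back-of-envelope estimate, not a figure stated in the record.

```python
# Rough P11 throughput estimate from the quoted timings (upper bound).
collection_s = 120   # "a full data set ... in less than 2 min"
exchange_s   = 20    # "sample exchange in less than 20 s"

samples_per_hour = 3600 / (collection_s + exchange_s)
print(f"~{samples_per_hour:.0f} samples per hour as an upper bound")  # ~26
```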

  12. Electron crystallography--the waking beauty of structural biology.

    Science.gov (United States)

    Pope, Christopher R; Unger, Vinzenz M

    2012-08-01

Since its debut in the mid-1970s, electron crystallography has been a valuable alternative in the structure determination of biological macromolecules. Its reliance on single-layered or double-layered two-dimensionally ordered arrays and its ability to obtain structural information from small and disordered crystals make this approach particularly useful for the study of membrane proteins in a lipid bilayer environment. Despite these unique advantages, technological hurdles have kept electron crystallography from reaching its full potential. Addressing these issues, recent initiatives have developed high-throughput pipelines for crystallization and screening. Together with progress in automating data collection, image analysis and phase extension methods, electron crystallography is poised to raise its profile and may lead the way in exploring the structural biology of macromolecular complexes.

  13. FLASH Assembly of TALENs Enables High-Throughput Genome Editing

    OpenAIRE

    Reyon, Deepak; Tsai, Shengdar Q.; Khayter, Cyd; Foden, Jennifer A.; Sander, Jeffry D.; Joung, J. Keith

    2012-01-01

    Engineered transcription activator-like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the Fast Ligation-based Automatable Solid-phase High-throughput (FLASH) platform, a rapid and cost-effective method we developed to enable ...

  14. Inferential literacy for experimental high-throughput biology.

    Science.gov (United States)

    Miron, Mathieu; Nadon, Robert

    2006-02-01

    Many biologists believe that data analysis expertise lags behind the capacity for producing high-throughput data. One view within the bioinformatics community is that biological scientists need to develop algorithmic skills to meet the demands of the new technologies. In this article, we argue that the broader concept of inferential literacy, which includes understanding of data characteristics, experimental design and statistical analysis, in addition to computation, more adequately encompasses what is needed for efficient progress in high-throughput biology.

  15. Virtual high throughput screening (vHTS) - A perspective

    OpenAIRE

    Subramaniam, Sangeetha; Mehrotra, Monica; Gupta, Dinesh,

    2008-01-01

    With the exponential rise in the number of viable novel drug targets, computational methods are being increasingly applied to accelerate the drug discovery process. Virtual High Throughput Screening (vHTS) is one such established methodology to identify drug candidates from large collection of compound libraries. Although it complements the expensive and time consuming High Throughput Screening (HTS) of compound libraries, vHTS possess inherent challenges. The successful vHTS requires the car...

  16. Fixed target matrix for femtosecond time-resolved and in situ serial micro-crystallography

    Directory of Open Access Journals (Sweden)

    C. Mueller

    2015-09-01

Full Text Available We present a crystallography chip enabling in situ room temperature crystallography at microfocus synchrotron beamlines and X-ray free-electron laser (X-FEL) sources. Compared to other in situ approaches, we observe extremely low background and high diffraction data quality. The chip design is robust and allows fast and efficient loading of thousands of small crystals. The ability to load a large number of protein crystals, at room temperature and with high efficiency, into prescribed positions enables high throughput automated serial crystallography with microfocus synchrotron beamlines. In addition, we demonstrate the application of this chip for femtosecond time-resolved serial crystallography at the Linac Coherent Light Source (LCLS, Menlo Park, California, USA). The chip concept enables multiple images to be acquired from each crystal, allowing differential detection of changes in diffraction intensities in order to obtain high signal-to-noise and fully exploit the time resolution capabilities of XFELs.

  17. Introduction to electron crystallography.

    Science.gov (United States)

    Kühlbrandt, Werner

    2013-01-01

From the earliest work on regular arrays in negative stain, electron crystallography has contributed greatly to our understanding of the structure and function of biological macromolecules. The development of electron cryo-microscopy (cryo-EM) then led to the first groundbreaking atomic models of the membrane proteins bacteriorhodopsin and light harvesting complex II within lipid bilayers. Key contributions towards cryo-EM and electron crystallography methods included specimen preparation and vitrification, liquid-helium cooling, data collection, and image processing. These methods are now applied almost routinely to both membrane and soluble proteins. Here we outline the advances and the breakthroughs that paved the way towards high-resolution structures by electron crystallography, both in terms of methods development and biological milestones.

  18. Optimized beamline design for macromolecular crystallography at the Cornell High Energy Synchrotron Source (CHESS) (abstract)

    Science.gov (United States)

    Schildkamp, Wilfried; Bilderback, Donald; Moffat, Keith

    1989-07-01

The A1 station on the CHESS wiggler beamline has been the workhorse for most macromolecular crystallographic experiments. This station is equipped with a fixed-energy focusing germanium (111) monochromator and a focusing total reflection mirror. Our macromolecular crystallographers made full use of the high flux of more than 10^12 photons/s/mm^2 and the stable beam conditions, both in position and energy resolution. As a result, the A1 station was heavily oversubscribed. CHESS is presently expanding its capabilities and a new diffraction station for macromolecular crystallography is under construction. This beamline will be powered by a 24-pole hybrid permanent magnet wiggler with a critical energy of 25 keV. A focusing monochromator, which handles a specific heat load of 10 W/mm^2, will have a range of tunability which covers all relevant absorption edges from 7 to 15 keV using a Ge(111) crystal. The energy resolution and the focusing properties remain constant within a factor of 2 over the entire tunability range. We expect a brilliance of about 10^13 photons/s/mm^2/mrad^2/0.1% bandpass. The diffraction station will be equipped with an oscillation camera which can be used with X-ray film of 5×5 or 8×10 in. size or alternatively with Kodak storage phosphors. A wide variety of clamp-on accessories, such as crystal coolers, fast shutters, helium pathways and a polarimeter, are available. The station will contain a beampipe system which can also be used for small-angle scattering experiments with sample-to-detector distances of up to 3000 mm. The entire diffraction station, its control area, a biological preparation area and a darkroom are to be embedded in a biological safety containment of level BL3. This will allow diffraction studies of virulent strains of viruses and other biohazards which could not previously be studied at synchrotron radiation sources without causing major disruption to normal laboratory procedure.

  19. High energy x-ray diffraction/x-ray fluorescence spectroscopy for high-throughput analysis of composition spread thin films.

    Science.gov (United States)

    Gregoire, John M; Dale, Darren; Kazimirov, Alexander; DiSalvo, Francis J; van Dover, R Bruce

    2009-12-01

    High-throughput crystallography is an important tool in materials research, particularly for the rapid assessment of structure-property relationships. We present a technique for simultaneous acquisition of diffraction images and fluorescence spectra on a continuous composition spread thin film using a 60 keV x-ray source. Subsequent noninteractive data processing provides maps of the diffraction profiles, thin film fiber texture, and composition. Even for highly textured films, our diffraction technique provides detection of diffraction from each family of Bragg reflections, which affords direct comparison of the measured profiles with powder patterns of known phases. These techniques are important for high throughput combinatorial studies as they provide structure and composition maps which may be correlated with performance trends within an inorganic library.

  20. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  1. A system for performing high throughput assays of synaptic function.

    Directory of Open Access Journals (Sweden)

    Chris M Hempel

    Full Text Available Unbiased, high-throughput screening has proven invaluable for dissecting complex biological processes. Application of this general approach to synaptic function would have a major impact on neuroscience research and drug discovery. However, existing techniques for studying synaptic physiology are labor intensive and low-throughput. Here, we describe a new high-throughput technology for performing assays of synaptic function in primary neurons cultured in microtiter plates. We show that this system can perform 96 synaptic vesicle cycling assays in parallel with high sensitivity, precision, uniformity, and reproducibility and can detect modulators of presynaptic function. By screening libraries of pharmacologically defined compounds on rat forebrain cultures, we have used this system to identify novel effects of compounds on specific aspects of presynaptic function. As a system for unbiased compound as well as genomic screening, this technology has significant applications for basic neuroscience research and for the discovery of novel, mechanism-based treatments for central nervous system disorders.

  2. Combinatorial and high-throughput screening approaches for strain engineering.

    Science.gov (United States)

    Liu, Wenshan; Jiang, Rongrong

    2015-03-01

    Microbes have long been used in the industry to produce valuable biochemicals. Combinatorial engineering approaches, new strain engineering tools derived from inverse metabolic engineering, have started to attract attention in recent years, including genome shuffling, error-prone DNA polymerase, global transcription machinery engineering (gTME), random knockout/overexpression libraries, ribosome engineering, multiplex automated genome engineering (MAGE), customized optimization of metabolic pathways by combinatorial transcriptional engineering (COMPACTER), and library construction of "tunable intergenic regions" (TIGR). Since combinatorial approaches and high-throughput screening methods are fundamentally interconnected, color/fluorescence-based, growth-based, and biosensor-based high-throughput screening methods have been reviewed. We believe that with the help of metabolic engineering tools and new combinatorial approaches, plus effective high-throughput screening methods, researchers will be able to achieve better results on improving microorganism performance under stress or enhancing biochemical yield.

  3. Droplet microfluidics for high-throughput biological assays.

    Science.gov (United States)

    Guo, Mira T; Rotem, Assaf; Heyman, John A; Weitz, David A

    2012-06-21

Droplet microfluidics offers significant advantages for performing high-throughput screens and sensitive assays. Droplets allow sample volumes to be significantly reduced, leading to concomitant reductions in cost. Manipulation and measurement at kilohertz speeds enable up to 10^8 samples to be screened in one day. Compartmentalization in droplets increases assay sensitivity by increasing the effective concentration of rare species and decreasing the time required to reach detection thresholds. Droplet microfluidics combines these powerful features to enable currently inaccessible high-throughput screening applications, including single-cell and single-molecule assays.
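The quoted figure of up to 10^8 samples per day follows directly from kilohertz droplet handling rates; the quick sanity check below is illustrative arithmetic only.

```python
# Sanity check: kilohertz droplet rates versus samples per day.
for rate_hz in (1e3, 1e4):                 # 1 kHz and 10 kHz handling rates
    per_day = rate_hz * 86_400             # seconds in a day
    print(f"{rate_hz:.0e} Hz -> {per_day:.1e} droplets/day")
# 1e3 Hz gives ~8.6e7/day and 1e4 Hz ~8.6e8/day, consistent with "up to 10^8".
```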

  4. High-throughput Binary Vectors for Plant Gene Function Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhi-Yong Lei; Ping Zhao; Min-Jie Cao; Rong Cui; Xi Chen; Li-Zhong Xiong; Qi-Fa Zhang; David J. Oliver; Cheng-Bin Xiang

    2007-01-01

A series of high-throughput binary cloning vectors were constructed to facilitate gene function analysis in higher plants. This vector series consists of plasmids designed for plant expression, promoter analysis, gene silencing, and green fluorescent protein fusions for protein localization. These vectors provide for high-throughput and efficient cloning utilizing sites for λ phage integrase/excisionase. In addition, unique restriction sites are incorporated in a multiple cloning site and enable promoter replacement. The entire vector series is available with complete sequence information and detailed annotations and is freely distributed to the scientific community for non-commercial uses.

  5. Screening and synthesis: high throughput technologies applied to parasitology.

    Science.gov (United States)

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  6. Perspective: Data infrastructure for high throughput materials discovery

    Science.gov (United States)

    Pfeif, E. A.; Kroenlein, K.

    2016-05-01

    Computational capability has enabled materials design to evolve from trial-and-error towards more informed methodologies that require large amounts of data. Expert-designed tools and their underlying databases facilitate modern-day high throughput computational methods. Standard data formats and communication standards increase the impact of traditional data, and applying these technologies to a high throughput experimental design provides dense, targeted materials data that are valuable for material discovery. Integrated computational materials engineering requires both experimentally and computationally derived data. Harvesting these comprehensively requires different methods of varying degrees of automation to accommodate variety and volume. Issues of data quality persist independent of type.

  7. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.;

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification....... Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated...

  8. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

Full Text Available The development of next generation batteries requires a breakthrough in materials. The traditional one-by-one method, which is suited to synthesizing a large number of single-composition materials, is time-consuming and costly. High throughput and combinatorial experimentation is an effective way to synthesize and characterize a huge number of materials over a broad compositional region in a short time, which greatly speeds up the discovery and optimization of materials at lower cost. In this work, high throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  9. High throughput recombinant protein production of fungal secreted proteins

    DEFF Research Database (Denmark)

    Vala, Andrea Lages Lino; Roth, Doris; Grell, Morten Nedergaard

    2011-01-01

a high-throughput protein production system with a special focus on fungal secreted proteins. We use ligation-independent cloning to clone target genes into expression vectors for E. coli and P. pastoris, and a small-scale test expression to identify constructs producing soluble protein. Expressed ... interaction), between fungi of the order Entomophthorales and aphids (pathogenic interaction), and in the mycoparasitic interaction between the oomycetes Pythium oligandrum and P. ultimum. In general, the high-throughput protein production system can lead to a better understanding of fungal/host interactions...

  10. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  11. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen;

    2016-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such scr...

  12. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning (EF

  13. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  14. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen

    2017-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis...

  16. A high-throughput label-free nanoparticle analyser

    Science.gov (United States)

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M.; Ruoslahti, Erkki; Cleland, Andrew N.

    2011-05-01

Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10^-6 l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  17. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses a

  18. Algorithms for mapping high-throughput DNA sequences

    DEFF Research Database (Denmark)

    Frellsen, Jes; Menzel, Peter; Krogh, Anders

    2014-01-01

    Abstract High-throughput sequencing (HTS) technologies revolutionized the field of molecular biology by enabling large scale whole genome sequencing as well as a broad range of experiments for studying the cell's inner workings directly on DNA or RNA level. Given the dramatically increased rate...

  19. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    2007-01-01

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an organism

  20. Chemometric Optimization Studies in Catalysis Employing High-Throughput Experimentation

    NARCIS (Netherlands)

    Pereira, S.R.M.

    2008-01-01

    The main topic of this thesis is the investigation of the synergies between High-Throughput Experimentation (HTE) and Chemometric Optimization methodologies in Catalysis research and of the use of such methodologies to maximize the advantages of using HTE methods. Several case studies were analysed

  1. High-Throughput, Large-Scale SNP Genotyping: Bioinformatics Considerations

    OpenAIRE

    Margetic, Nino

    2004-01-01

    In order to provide a high-throughput, large-scale genotyping facility at the national level we have developed a set of inter-dependent information systems. A combination of commercial, publicly-available and in-house developed tools links a series of data repositories based both on flat files and relational databases providing an almost complete semi-automated pipeline.

  2. An improved high throughput sequencing method for studying oomycete communities

    DEFF Research Database (Denmark)

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-01-01

Culture-independent studies using next generation sequencing have revolutionized microbial ecology; however, oomycete ecology in soils is severely lagging behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomyce...

  3. Automatic Spot Identification for High Throughput Microarray Analysis

    Science.gov (United States)

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noise. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
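The record describes spot identification built on correlation and convolution operations. The sketch below shows a generic version of that idea, matching a Gaussian spot template against an image by sliding-window cross-correlation; it assumes numpy is available and is not the published auto-spot algorithm.

```python
import numpy as np

def gaussian_spot(size: int = 9, sigma: float = 2.0) -> np.ndarray:
    """Template for a single array spot."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return g / g.sum()

def correlate(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Valid-mode cross-correlation via an explicit sliding window."""
    th, tw = template.shape
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + th, j:j + tw] * template)
    return out

# Toy image: background noise plus one bright spot centred near pixel (22, 22).
rng = np.random.default_rng(0)
image = rng.normal(0.0, 0.1, (40, 40))
image[18:27, 18:27] += 5 * gaussian_spot()

score = correlate(image, gaussian_spot())
peak = np.unravel_index(np.argmax(score), score.shape)
print("spot centre near pixel:", tuple(int(p) + 4 for p in peak))  # +4 = template radius
```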

  4. MIPHENO: data normalization for high throughput metabolite analysis

    Directory of Open Access Journals (Sweden)

    Bell Shannon M

    2012-01-01

Full Text Available Abstract Background High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course of months and years, often without the controls needed to compare directly across the dataset. Few methods are available to facilitate comparisons of high throughput metabolic data generated in batches where explicit in-group controls for normalization are lacking. Results Here we describe MIPHENO (Mutant Identification by Probabilistic High throughput-Enabled Normalization), an approach for post-hoc normalization of quantitative first-pass screening data in the absence of explicit in-group controls. This approach includes a quality control step and facilitates cross-experiment comparisons that decrease the false non-discovery rates, while maintaining the high accuracy needed to limit false positives in first-pass screening. Results from simulation show an improvement in both accuracy and false non-discovery rate over a range of population parameters (p -16) and a modest but significant (p -16) improvement in area under the receiver operator characteristic curve of 0.955 for MIPHENO vs 0.923 for a group-based statistic (z-score). Analysis of the high throughput phenotypic data from the Arabidopsis Chloroplast 2010 Project (http://www.plastid.msu.edu/) showed a ~4-fold increase in the ability to detect previously described or expected phenotypes over the group-based statistic. Conclusions Results demonstrate MIPHENO offers substantial benefit in improving the ability to detect putative mutant phenotypes from post-hoc analysis of large data sets. Additionally, it facilitates data interpretation and permits cross-dataset comparison where group-based controls are missing. MIPHENO is applicable to a wide range of high throughput screenings and the code is
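For context on the comparison reported above, the group-based baseline named in the abstract is a plate-wise z-score. The sketch below shows only that baseline and a simple 3-sigma hit call on toy data; it is not the MIPHENO method, whose details are not given in this record.

```python
import numpy as np

def plate_zscore(values: np.ndarray) -> np.ndarray:
    """Group-based baseline: z-score each readout against its own plate."""
    return (values - values.mean()) / values.std(ddof=1)

# Toy plate of 96 readouts with one strong putative mutant phenotype.
rng = np.random.default_rng(1)
plate = rng.normal(100.0, 10.0, 96)
plate[42] = 160.0                     # outlier well

z = plate_zscore(plate)
hits = np.where(np.abs(z) > 3)[0]     # simple 3-sigma hit call
print("candidate wells:", hits.tolist())
```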

  5. Acoustic transfer of protein crystals from agarose pedestals to micromeshes for high-throughput screening

    Energy Technology Data Exchange (ETDEWEB)

    Cuttitta, Christina M. [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); The City University of New York, 2800 Victory Boulevard, Staten Island, NY 10314 (United States); Ericson, Daniel L. [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); University at Buffalo, SUNY, 12 Capen Hall, Buffalo, NY 14260 (United States); Scalia, Alexander [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Binghamton University, 4400 Vestal Parkway East, Binghamton, NY 11973-5000 (United States); Roessler, Christian G. [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Teplitsky, Ella [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Stony Brook University, Stony Brook, NY 11794-5215 (United States); Joshi, Karan [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); PEC University of Technology, Chandigarh (India); Campos, Olven [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Florida Atlantic University, 777 Glades Road, Boca Raton, FL 33414 (United States); Agarwal, Rakhi; Allaire, Marc [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Orville, Allen M. [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Sweet, Robert M.; Soares, Alexei S., E-mail: soares@bnl.gov [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2015-01-01

    An acoustic high-throughput screening method is described for harvesting protein crystals and combining the protein crystals with chemicals such as a fragment library. Acoustic droplet ejection (ADE) is an emerging technology with broad applications in serial crystallography such as growing, improving and manipulating protein crystals. One application of this technology is to gently transfer crystals onto MiTeGen micromeshes with minimal solvent. Once mounted on a micromesh, each crystal can be combined with different chemicals such as crystal-improving additives or a fragment library. Acoustic crystal mounting is fast (2.33 transfers s⁻¹) and all transfers occur in a sealed environment that is in vapor equilibrium with the mother liquor. Here, a system is presented to retain crystals near the ejection point and away from the inaccessible dead volume at the bottom of the well by placing the crystals on a concave agarose pedestal (CAP) with the same chemical composition as the crystal mother liquor. The bowl-shaped CAP is impenetrable to crystals. Consequently, gravity will gently move the crystals into the optimal location for acoustic ejection. It is demonstrated that an agarose pedestal of this type is compatible with most commercially available crystallization conditions and that protein crystals are readily transferred from the agarose pedestal onto micromeshes with no loss in diffraction quality. It is also shown that crystals can be grown directly on CAPs, which avoids the need to transfer the crystals from the hanging drop to a CAP. This technology has been used to combine thermolysin and lysozyme crystals with an assortment of anomalously scattering heavy atoms. The results point towards a fast nanolitre method for crystal mounting and high-throughput screening.

  6. High-throughput theoretical design of lithium battery materials

    Science.gov (United States)

    Shi-Gang, Ling; Jian, Gao; Rui-Juan, Xiao; Li-Quan, Chen

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving better performance, and representative screening criteria include higher energy density, better safety, and faster charge/discharge rates. Project supported by the National Natural Science Foundation of China (Grant Nos. 11234013 and 51172274) and the National High Technology Research and Development Program of China (Grant No. 2015AA034201).
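
    The screening-by-criteria workflow mentioned above can be pictured as a simple filter over a tabulated candidate set. The property names, units and cut-offs below are invented for illustration and are not criteria taken from the review.

        # Toy illustration of criteria-based screening over a materials table
        # (property names, values and thresholds are invented placeholders).
        import pandas as pd

        candidates = pd.DataFrame({
            "formula":            ["A", "B", "C", "D"],
            "capacity_mAh_per_g": [140, 220, 180, 90],
            "avg_voltage_V":      [3.4, 3.9, 4.4, 3.2],
            "volume_change_pct":  [2.1, 6.5, 1.2, 0.8],
        })

        # Keep high-capacity, low-strain candidates within a safe voltage window.
        hits = candidates.query(
            "capacity_mAh_per_g > 150 and volume_change_pct < 5 "
            "and avg_voltage_V > 3.0 and avg_voltage_V < 4.5"
        )
        print(hits)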

  7. Racemic protein crystallography.

    Science.gov (United States)

    Yeates, Todd O; Kent, Stephen B H

    2012-01-01

    Although natural proteins are chiral and are all of one "handedness," their mirror image forms can be prepared by chemical synthesis. This opens up new opportunities for protein crystallography. A racemic mixture of the enantiomeric forms of a protein molecule can crystallize in ways that natural proteins cannot. Recent experimental data support a theoretical prediction that this should make racemic protein mixtures highly amenable to crystallization. Crystals obtained from racemic mixtures also offer advantages in structure determination strategies. The relevance of these potential advantages is heightened by advances in synthetic methods, which are extending the size limit for proteins that can be prepared by chemical synthesis. Recent ideas and results in the area of racemic protein crystallography are reviewed.

  8. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    Directory of Open Access Journals (Sweden)

    Hossein Pourmodheji

    2016-06-01

    Full Text Available Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery.

  9. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy.

    Science.gov (United States)

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-06-09

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery.

  10. Multiple column high-throughput e-beam inspection (EBI)

    Science.gov (United States)

    Lam, David K.; Monahan, Kevin M.; Liu, Enden D.; Tran, Cong; Prescop, Ted

    2012-03-01

    Single-column e-beam systems are used in production for the detection of electrical defects, but are too slow to be used for the detection of small physical defects, and cannot meet future inspection requirements. This paper presents a multiple-column e-beam technology for high throughput wafer inspection. Multibeam has developed all-electrostatic columns for high-resolution imaging. The elimination of magnetic coils enables the columns to be small; e-beam deflection is faster in the absence of magnetic hysteresis. Multiple miniature columns are assembled in an array. An array of 100 columns covers the entire surface of a 300mm wafer, affording simultaneous cross-wafer sampling. Column performance simulations and system architecture are presented. Also provided are examples of high throughput, more efficient, multiple-column wafer inspection.

  11. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") covering taxon-specific endogenous reference genes, GMO constructs, screening targets, construct-specific and event-specific targets, and donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  12. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
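
    The core task described above, matching many query reads against one reference, can be illustrated conceptually with a CPU-side exact-seed lookup. This sketch is only an illustration of exact substring matching with a k-mer index; MUMmerGPU itself uses a suffix tree traversed in parallel by CUDA threads, and none of the names below come from its code.

        # CPU-side sketch of exact seed matching against a reference with a k-mer index
        # (conceptual illustration only; not MUMmerGPU's suffix-tree GPU kernel).
        from collections import defaultdict

        def build_index(reference, k):
            # Hash every k-mer of the reference to its start positions.
            index = defaultdict(list)
            for i in range(len(reference) - k + 1):
                index[reference[i:i + k]].append(i)
            return index

        def match_queries(queries, reference, k=4):
            # Report (query offset, reference position) pairs for every exact k-mer match.
            index = build_index(reference, k)
            return {name: [(j, pos)
                           for j in range(len(q) - k + 1)
                           for pos in index.get(q[j:j + k], [])]
                    for name, q in queries.items()}

        reference = "ACGTACGTGGACGT"
        queries = {"read1": "TACGTG", "read2": "GGACGT"}
        print(match_queries(queries, reference))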

  13. High-throughput screening of cell responses to biomaterials.

    Science.gov (United States)

    Yliperttula, Marjo; Chung, Bong Geun; Navaladi, Akshay; Manbachi, Amir; Urtti, Arto

    2008-10-02

    Biomaterials have emerged as powerful regulators of the cellular microenvironment for drug discovery, tissue engineering research and chemical testing. Although biomaterial-based matrices control the cellular behavior, these matrices are still far from being optimal. In principle, the efficacy of biomaterial development for cell cultures can be improved by using high-throughput techniques that allow screening of a large number of materials and manipulation of microenvironments in a controlled manner. Several cell responses such as toxicity, proliferation, and differentiation have been used to evaluate the biomaterials, thus providing a basis for further selection of lead biomimetic materials or microenvironments. Although high-throughput techniques provide an initial screening of the desired properties, more detailed follow-up studies of the selected materials are required to understand the true value of a 'positive hit'. High-throughput methods may become important tools in the future development of biomaterials-based cell cultures that will enable more realistic pre-clinical prediction of pharmacokinetics, pharmacodynamics, and toxicity. This is highly important, because predictive pre-clinical methods are needed to reduce the high attrition rate of drug candidates during clinical testing.

  14. A high-throughput strategy to screen 2D crystallization trials of membrane proteins.

    Science.gov (United States)

    Vink, Martin; Derr, Kd; Love, James; Stokes, David L; Ubarretxena-Belandia, Iban

    2007-12-01

    Electron microscopy of two-dimensional (2D) crystals has demonstrated potential for structure determination of membrane proteins. Technical limitations in large-scale crystallization screens have, however, prevented a major breakthrough in the routine application of this technology. Dialysis is generally used for detergent removal and reconstitution of the protein into a lipid bilayer, and devices for testing numerous conditions in parallel are not readily available. Furthermore, the small size of resulting 2D crystals requires electron microscopy to evaluate the results and automation of the necessary steps is essential to achieve a reasonable throughput. We have designed a crystallization block, using standard microplate dimensions, by which 96 unique samples can be dialyzed simultaneously against 96 different buffers and have demonstrated that the rate of detergent dialysis is comparable to those obtained with conventional dialysis devices. A liquid-handling robot was employed to set up 2D crystallization trials with the membrane proteins CopA from Archaeoglobus fulgidus and light-harvesting complex II (LH2) from Rhodobacter sphaeroides. For CopA, 1 week of dialysis yielded tubular crystals and, for LH2, large and well-ordered vesicular 2D crystals were obtained after 24 h, illustrating the feasibility of this approach. Combined with a high-throughput procedure for preparation of EM-grids and automation of the subsequent negative staining step, the crystallization block offers a novel pipeline that promises to speed up large-scale screening of 2D crystallization and to increase the likelihood of producing well-ordered crystals for analysis by electron crystallography.

  15. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    Energy Technology Data Exchange (ETDEWEB)

    Hiraki, Masahiko, E-mail: masahiko.hiraki@kek.jp; Yamada, Yusuke; Chavas, Leonard M. G. [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Wakatsuki, Soichi [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); SLAC National Accelerator Laboratory, 2575 Sand Hill Road, MS 69, Menlo Park, CA 94025-7015 (United States); Stanford University, Beckman Center B105, Stanford, CA 94305-5126 (United States); Matsugaki, Naohiro [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2013-11-01

    A special liquid-nitrogen Dewar with double capacity for the sample-exchange robot has been created at AR-NE3A at the Photon Factory, allowing continuous fully automated data collection. In this work, this new system is described and the stability of its calibration is discussed. Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes. Therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. An investigation showed that four or more cassettes were indeed being used at AR-NE3A alone. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. In order to check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. The calibration parameters of the new system are shown to be as stable as those of the current system.

  16. DNAshape: a method for the high-throughput prediction of DNA structural features on a genomic scale

    Science.gov (United States)

    Zhou, Tianyin; Yang, Lin; Lu, Yan; Dror, Iris; Dantas Machado, Ana Carolina; Ghane, Tahereh; Di Felice, Rosa; Rohs, Remo

    2013-01-01

    We present a method and web server for predicting DNA structural features in a high-throughput (HT) manner for massive sequence data. This approach provides the framework for the integration of DNA sequence and shape analyses in genome-wide studies. The HT methodology uses a sliding-window approach to mine DNA structural information obtained from Monte Carlo simulations. It requires only nucleotide sequence as input and instantly predicts multiple structural features of DNA (minor groove width, roll, propeller twist and helix twist). The results of rigorous validations of the HT predictions based on DNA structures solved by X-ray crystallography and NMR spectroscopy, hydroxyl radical cleavage data, statistical analysis and cross-validation, and molecular dynamics simulations provide strong confidence in this approach. The DNAshape web server is freely available at http://rohslab.cmb.usc.edu/DNAshape/. PMID:23703209
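
    The sliding-window idea described above can be sketched as a lookup of short sequence windows in a precomputed table, one value per central base. The pentamer-to-feature values below are fabricated placeholders, not the Monte Carlo-derived values used by the DNAshape server.

        # Minimal sketch of sliding-window structural-feature prediction from sequence.
        def predict_feature(sequence, table, window=5, default=5.0):
            # One value per window, assigned to the window's central base.
            return [table.get(sequence[i:i + window], default)
                    for i in range(len(sequence) - window + 1)]

        # Placeholder pentamer -> minor-groove-width values (not DNAshape's table).
        toy_table = {"AAAAA": 3.0, "AATTT": 3.5, "GCGCG": 6.0}
        print(predict_feature("AAAAATTTGCGCG", toy_table))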

  17. Human transcriptome array for high-throughput clinical studies.

    Science.gov (United States)

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N; Schweitzer, Anthony C; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D; Moldawer, Lyle L; Maier, Ronald V; Tompkins, Ronald G; Wong, Wing Hung; Davis, Ronald W; Xiao, Wenzhong

    2011-03-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq at sufficient depth to discover transcriptome elements relevant to the disease process, followed by high-throughput and reliable screening of these elements in thousands of patient samples using custom-designed arrays.

  18. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires advances in both hardware and analytical tools. The historical perspective of flow cytometry operation, how the field has changed and what the key changes have been are discussed. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  19. High throughput electrophysiology: new perspectives for ion channel drug discovery.

    Science.gov (United States)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter; Jensen, Bo Skaaning; Korsgaard, Mads P G; Christophersen, Palle

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone of current drug discovery is the high-throughput screening assay, which allows examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of the high throughput philosophy are emerging and are predicted to make a number of ion channel targets accessible for drug screening. Specifically, genuine HTS parallel processing techniques based on arrays of planar silicon chips are being developed, but lower throughput sequential techniques may also be of value in compound screening, lead optimization, and safety screening. The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  20. Fluorescent biosensors for high throughput screening of protein kinase inhibitors.

    Science.gov (United States)

    Prével, Camille; Pellerano, Morgan; Van, Thi Nhu Ngoc; Morris, May C

    2014-02-01

    High throughput screening assays aim to identify small molecules that interfere with protein function, activity, or conformation, which can serve as effective tools for chemical biology studies of targets involved in physiological processes or pathways of interest or disease models, as well as templates for development of therapeutics in medicinal chemistry. Fluorescent biosensors constitute attractive and powerful tools for drug discovery programs, from high throughput screening assays, to postscreen characterization of hits, optimization of lead compounds, and preclinical evaluation of candidate drugs. They provide a means of screening for inhibitors that selectively target enzymatic activity, conformation, and/or function in vitro. Moreover, fluorescent biosensors constitute useful tools for cell- and image-based, multiplex and multiparametric, high-content screening. Application of fluorescence-based sensors to screen large and complex libraries of compounds in vitro, in cell-based formats or whole organisms requires several levels of optimization to establish robust and reproducible assays. In this review, we describe the different fluorescent biosensor technologies which have been applied to high throughput screens, and discuss the prerequisite criteria underlying their successful application. Special emphasis is placed on protein kinase biosensors, since these enzymes constitute one of the most important classes of therapeutic targets in drug discovery.

  1. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

    Full Text Available Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA) and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas9)-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss of function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  2. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated … maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content … and branching degrees could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Typically, extreme branching or linearity were detected less than normal starch structures. The method offers…

  3. Sensitivity study of reliable, high-throughput resolution metricsfor photoresists

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Christopher N.; Naulleau, Patrick P.

    2007-07-30

    The resolution of chemically amplified resists is becoming an increasing concern, especially for lithography in the extreme ultraviolet (EUV) regime. Large-scale screening and performance-based down-selection is currently underway to identify resist platforms that can support shrinking feature sizes. Resist screening efforts, however, are hampered by the absence of reliable resolution metrics that can objectively quantify resist resolution in a high-throughput fashion. Here we examine two high-throughput metrics for resist resolution determination. After summarizing their details and justifying their utility, we characterize the sensitivity of both metrics to two of the main experimental uncertainties associated with lithographic exposure tools, namely: limited focus control and limited knowledge of optical aberrations. For an implementation at EUV wavelengths, we report aberration and focus limited error bars in extracted resolution of approximately 1.25 nm RMS for both metrics, making them attractive candidates for future screening and down-selection efforts.

  4. Galaxy High Throughput Genotyping Pipeline for GeneTitan.

    Science.gov (United States)

    Karpenko, Oleksiy; Bahroos, Neil; Chukhman, Morris; Dong, Xiao; Kanabar, Pinal; Arbieva, Zarema; Jackson, Tommie; Hendrickson, William

    2013-01-01

    The latest genotyping solutions allow rapid testing of more than two million markers in one experiment. Fully automated instruments such as the Affymetrix GeneTitan enable processing of large numbers of samples in a truly high-throughput manner. In concert with solutions like Axiom, fully customizable array plates can now utilize automated workflows that leverage multi-channel instrumentation like the GeneTitan. With the growing size of raw data output, the serial computational architecture of the software, typically distributed by the vendors on turnkey desktop solutions for quality control and genotype calling, becomes a legacy burden rather than an advantage. Advanced software techniques provide power and flexibility and can be deployed in an HPC environment, but they can be technically inconvenient for biologists to use. Here we present a pipeline that uses Galaxy as a mechanism to lower the barrier for complex analysis, and increase efficiency by leveraging high-throughput computing.

  5. High-throughput screening in the C. elegans nervous system.

    Science.gov (United States)

    Kinser, Holly E; Pincus, Zachary

    2016-06-03

    The nematode Caenorhabditis elegans is widely used as a model organism in the field of neurobiology. The wiring of the C. elegans nervous system has been entirely mapped, and the animal's optical transparency allows for in vivo observation of neuronal activity. The nematode is also small in size, self-fertilizing, and inexpensive to cultivate and maintain, greatly lending to its utility as a whole-animal model for high-throughput screening (HTS) in the nervous system. However, the use of this organism in large-scale screens presents unique technical challenges, including reversible immobilization of the animal, parallel single-animal culture and containment, automation of laser surgery, and high-throughput image acquisition and phenotyping. These obstacles require significant modification of existing techniques and the creation of new C. elegans-based HTS platforms. In this review, we outline these challenges in detail and survey the novel technologies and methods that have been developed to address them.

  6. High-throughput screening for modulators of cellular contractile force

    CERN Document Server

    Park, Chan Young; Tambe, Dhananjay; Chen, Bohao; Lavoie, Tera; Dowell, Maria; Simeonov, Anton; Maloney, David J; Marinkovic, Aleksandar; Tschumperlin, Daniel J; Burger, Stephanie; Frykenberg, Matthew; Butler, James P; Stamer, W Daniel; Johnson, Mark; Solway, Julian; Fredberg, Jeffrey J; Krishnan, Ramaswamy

    2014-01-01

    When cellular contractile forces are central to pathophysiology, these forces comprise a logical target of therapy. Nevertheless, existing high-throughput screens are limited to upstream signaling intermediates with poorly defined relationship to such a physiological endpoint. Using cellular force as the target, here we screened libraries to identify novel drug candidates in the case of human airway smooth muscle cells in the context of asthma, and also in the case of Schlemm's canal endothelial cells in the context of glaucoma. This approach identified several drug candidates for both asthma and glaucoma. We attained rates of 1000 compounds per screening day, thus establishing a force-based cellular platform for high-throughput drug discovery.

  7. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for identification of drug targets and drug discovery, and it has recently found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are widely used for quantification of protein markers. We reasoned that RPPAs also can be utilized … beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of si … a robotic screening platform. Furthermore, we automated sample tracking and data analysis by developing a bundled bioinformatics tool named "MIRACLE". Automation and RPPA-based viability/toxicity readouts enable rapid testing of large sample numbers, while granting the possibility for flexible consecutive…

  8. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582
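
    The colony-size readout described above can be sketched as a simple threshold-and-label measurement over a plate image. This is a generic illustration, not Spotsizer's segmentation pipeline; the thresholding rule and function names are assumptions.

        # Minimal sketch of colony-size measurement by thresholding and labelling
        # (illustrative only; not Spotsizer's actual algorithm).
        import numpy as np
        from scipy import ndimage

        def colony_areas(image, threshold=None):
            # Threshold the image, label connected bright regions, return their pixel areas.
            if threshold is None:
                threshold = image.mean() + 2 * image.std()   # crude global threshold
            mask = image > threshold
            labels, n = ndimage.label(mask)
            return ndimage.sum(mask, labels, index=range(1, n + 1))

        img = np.zeros((60, 60))
        img[10:20, 10:20] = 1.0      # a 100-pixel "colony"
        img[35:40, 30:38] = 1.0      # a 40-pixel "colony"
        print(colony_areas(img))     # areas of the two colonies: 100 and 40 pixels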

  9. The future of crystallography in drug discovery

    Science.gov (United States)

    Zheng, Heping; Hou, Jing; Zimmerman, Matthew D; Wlodawer, Alexander; Minor, Wladek

    2014-01-01

    Introduction X-ray crystallography plays an important role in structure-based drug design (SBDD), and accurate analysis of crystal structures of target macromolecules and macromolecule–ligand complexes is critical at all stages. However, whereas there has been significant progress in improving methods of structural biology, particularly in X-ray crystallography, corresponding progress in the development of computational methods (such as in silico high-throughput screening) is still on the horizon. Crystal structures can be overinterpreted and thus bias hypotheses and follow-up experiments. As in any experimental science, the models of macromolecular structures derived from X-ray diffraction data have their limitations, which need to be critically evaluated and well understood for structure-based drug discovery. Areas covered This review describes how the validity, accuracy and precision of a protein or nucleic acid structure determined by X-ray crystallography can be evaluated from three different perspectives: i) the nature of the diffraction experiment; ii) the interpretation of an electron density map; and iii) the interpretation of the structural model in terms of function and mechanism. The strategies to optimally exploit a macromolecular structure are also discussed in the context of ‘Big Data’ analysis, biochemical experimental design and structure-based drug discovery. Expert opinion Although X-ray crystallography is one of the most detailed ‘microscopes’ available today for examining macromolecular structures, the authors would like to re-emphasize that such structures are only simplified models of the target macromolecules. The authors also wish to reinforce the idea that a structure should not be thought of as a set of precise coordinates but rather as a framework for generating hypotheses to be explored. Numerous biochemical and biophysical experiments, including new diffraction experiments, can and should be performed to verify or falsify

  10. The Joint Structural Biology Group beam lines at the ESRF: Modern macromolecular crystallography

    CERN Document Server

    Mitchell, E P

    2001-01-01

    Macromolecular crystallography has evolved considerably over the last decade. Data sets can now be collected in under an hour on high-throughput beam lines, leading to electron density maps and, possibly, initial models calculated on-site. There are five beam lines currently dedicated to macromolecular crystallography: the ID14 complex and BM-14 (soon to be superseded by ID-29). These lines handle over five hundred projects every six months and demand is increasing. Automated sample handling, alignment and data management protocols will be required to work efficiently with this demanding load. Projects developing these themes are underway within the JSBG.

  11. Ink-jet printer heads for ultra-small-drop protein crystallography.

    Science.gov (United States)

    Howard, E I; Cachau, R E

    2002-12-01

    Mass-produced automated piezoelectric driven picoliter delivery systems (printer heads) are fast, inexpensive, and reliable devices that are capable of delivering a very large range of volumes and are ideally suited for high-throughput protein crystallography studies. We used this technology to set up under-oil crystallization experiments with drop sizes from the 200-nL to 3-microL volume range, commonly used in protein crystallography, and show its application in setting ultra-small (2 nL) drops, the smallest drop volume reported to date for this type of assay.

  12. High throughput screening operations at the University of Kansas.

    Science.gov (United States)

    Roy, Anuradha

    2014-05-01

    The High Throughput Screening Laboratory at University of Kansas plays a critical role in advancing academic interest in the identification of chemical probes as tools to better understand the biological and biochemical basis of new therapeutic targets. The HTS laboratory has an open service policy and collaborates with internal and external academia as well as for-profit organizations to execute projects requiring HTS-compatible assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization.

  13. Systematic error detection in experimental high-throughput screening

    OpenAIRE

    2011-01-01

    Abstract Background High-throughput screening (HTS) is a key part of the drug discovery process during which thousands of chemical compounds are screened and their activity levels measured in order to identify potential drug candidates (i.e., hits). Many technical, procedural or environmental factors can cause systematic measurement error or inequalities in the conditions in which the measurements are taken. Such systematic error has the potential to critically affect the hit selection process...

  14. Targeted high-throughput sequencing of tagged nucleic acid samples

    OpenAIRE

    Meyer, M.; Stenzel, U.; Myles, S.; Prüfer, K.; Hofreiter, M.

    2007-01-01

    High-throughput 454 DNA sequencing technology allows much faster and more cost-effective sequencing than traditional Sanger sequencing. However, the technology imposes inherent limitations on the number of samples that can be processed in parallel. Here we introduce parallel tagged sequencing (PTS), a simple, inexpensive and flexible barcoding technique that can be used for parallel sequencing any number and type of double-stranded nucleic acid samples. We demonstrate that PTS is particularly...

  15. Mass spectrometry for high-throughput metabolomics analysis of urine

    OpenAIRE

    Abdelrazig, Salah M.A.

    2015-01-01

    Direct electrospray ionisation-mass spectrometry (direct ESI-MS), by omitting the chromatographic step, has great potential for application as a high-throughput approach for untargeted urine metabolomics analysis compared to liquid chromatography-mass spectrometry (LC-MS). The rapid development and technical innovations revealed in the field of ambient ionisation MS such as nanoelectrospray ionisation (nanoESI) chip-based infusion and liquid extraction surface analysis mass spectrometry (LESA...

  16. Generating barcoded libraries for multiplex high-throughput sequencing.

    Science.gov (United States)

    Knapp, Michael; Stiller, Mathias; Meyer, Matthias

    2012-01-01

    Molecular barcoding is an essential tool to use the high throughput of next generation sequencing platforms optimally in studies involving more than one sample. Various barcoding strategies allow for the incorporation of short recognition sequences (barcodes) into sequencing libraries, either by ligation or polymerase chain reaction (PCR). Here, we present two approaches optimized for generating barcoded sequencing libraries from low copy number extracts and amplification products typical of ancient DNA studies.

  17. Condor-COPASI: high-throughput computing for biochemical networks

    OpenAIRE

    Kent Edward; Hoops Stefan; Mendes Pedro

    2012-01-01

    Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise...

  18. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Document Server

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  19. FLASH assembly of TALENs for high-throughput genome editing.

    Science.gov (United States)

    Reyon, Deepak; Tsai, Shengdar Q; Khayter, Cyd; Foden, Jennifer A; Sander, Jeffry D; Joung, J Keith

    2012-05-01

    Engineered transcription activator–like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published, and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the fast ligation-based automatable solid-phase high-throughput (FLASH) system, a rapid and cost-effective method for large-scale assembly of TALENs. We tested 48 FLASH-assembled TALEN pairs in a human cell–based EGFP reporter system and found that all 48 possessed efficient gene-modification activities. We also used FLASH to assemble TALENs for 96 endogenous human genes implicated in cancer and/or epigenetic regulation and found that 84 pairs were able to efficiently introduce targeted alterations. Our results establish the robustness of TALEN technology and demonstrate that FLASH facilitates high-throughput genome editing at a scale not currently possible with other genome modification technologies.

  20. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

    Full Text Available Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.

  1. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.

  2. High throughput biotechnology in traditional fermented food industry.

    Science.gov (United States)

    Yang, Yong; Xu, Rong-man; Song, Jia; Wang, Wei-min

    2010-11-01

    Traditional fermented foods are not only staple foods in most developing countries but also key health foods in developed countries. As the health-promoting functions of these foods are gradually discovered, more and more high-throughput biotechnologies are being used to advance this old and new industry. As a result, the microflora, manufacturing processes and health functions of these foods have been studied in greater depth and breadth. The application and progress of high-throughput biotechnologies differ among traditional fermented food industries, and are reviewed here by category: fermented milk products (yogurt, cheese), fermented sausages, fermented vegetables (kimchi, sauerkraut), fermented cereals (sourdough) and fermented beans (tempeh, natto). With further support from high-throughput biotechnologies, the mid- and downstream processes of traditional fermented foods can be optimized, and the industrialization of local traditional fermented foods that contain many functional factors but are produced in small quantities can be accelerated. The article also presents some promising patents in the traditional fermented food industry.

  3. A microdroplet dilutor for high-throughput screening

    Science.gov (United States)

    Niu, Xize; Gielen, Fabrice; Edel, Joshua B.; Demello, Andrew J.

    2011-06-01

    Pipetting and dilution are universal operations used in chemical and biological laboratories for assays and experiments. In microfluidics such operations are equally in demand, but difficult to implement. Recently, droplet-based microfluidics has emerged as an exciting new platform for high-throughput experimentation. However, it is challenging to vary the concentration of droplets rapidly and controllably. To this end, we developed a dilution module for high-throughput screening using droplet-based microfluidics. Briefly, a nanolitre-sized sample droplet of defined concentration is trapped within a microfluidic chamber. Through a process of droplet merging, mixing and re-splitting, this droplet is combined with a series of smaller buffer droplets to generate a sequence of output droplets that define a digital concentration gradient. Importantly, the formed droplets can be merged with other reagent droplets to enable rapid chemical and biological screens. As a proof of concept, we used the dilutor to perform a high-throughput homogeneous DNA-binding assay using only nanolitres of sample.
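
    Under the idealized assumptions of complete mixing and a fixed trapped volume, the digital concentration gradient produced by the merge/mix/re-split cycle described above follows a simple geometric series. The volumes and step count below are illustrative, not values from the paper.

        # Idealized model of the merge/mix/re-split dilution sequence: each buffer
        # droplet mixes into the trapped sample volume and the same volume is split
        # off, so the trapped concentration decays geometrically.
        def dilution_series(c0, v_sample_nl, v_buffer_nl, steps):
            ratio = v_sample_nl / (v_sample_nl + v_buffer_nl)
            return [c0 * ratio**n for n in range(steps + 1)]

        # 10 nL trapped sample combined with 2 nL buffer droplets over eight cycles.
        for n, c in enumerate(dilution_series(1.0, 10.0, 2.0, 8)):
            print(f"cycle {n}: {c:.3f} x initial concentration")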

  4. NCBI GEO: archive for high-throughput functional genomic data.

    Science.gov (United States)

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron

    2009-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.

  5. High-throughput electrical characterization for robust overlay lithography control

    Science.gov (United States)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput and robust overlay measurement is a challenge at the current 14 nm node and in upcoming advanced nodes, with the transition to 300 mm and eventually 450 mm semiconductor manufacturing, where slight deviations in overlay have a significant impact on reliability and yield1). The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE processes requires even tighter overlay specifications2). Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements, owing to sensitivity, throughput and low contrast3). We demonstrate a new electrical-measurement-based technique in which resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance, where minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss the challenges of integrating an electrical-measurement-based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing overlay as well as process window and margins simultaneously from a robust, high-throughput electrical measurement approach.
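
    The parabolic-fit readout described above amounts to fitting resistance versus programmed misalignment and reporting the vertex of the parabola as the overlay error. The sketch below uses synthetic numbers, not measurements from the paper.

        # Sketch of extracting overlay from resistance vs. programmed misalignment
        # by parabolic fitting; the data are synthetic (true overlay error: 3.5 nm).
        import numpy as np

        offsets_nm = np.array([-30, -20, -10, 0, 10, 20, 30], dtype=float)
        resistance = 0.004 * (offsets_nm - 3.5)**2 + 50.0
        resistance += np.random.default_rng(0).normal(0.0, 0.05, offsets_nm.size)

        a, b, c = np.polyfit(offsets_nm, resistance, deg=2)   # R ~ a*x**2 + b*x + c
        overlay_nm = -b / (2.0 * a)                           # x-coordinate of the minimum
        print(f"estimated overlay error: {overlay_nm:.2f} nm")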

  6. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput.

  7. High-throughput mass spectrometric cytochrome P450 inhibition screening.

    Science.gov (United States)

    Lim, Kheng B; Ozbal, Can C; Kassel, Daniel B

    2013-01-01

    We describe here a high-throughput assay to support rapid evaluation of drug discovery compounds for possible drug-drug interaction (DDI). Each compound is evaluated for its DDI potential by incubation over a range of eight concentrations against a panel of six cytochrome P450 (CYP) enzymes: 1A2, 2C8, 2C9, 2C19, 2D6, and 3A4. The method utilizes automated liquid handling for sample preparation, and online solid-phase extraction/tandem mass spectrometry (SPE/MS/MS) for sample analyses. The system is capable of generating two 96-well assay plates in 30 min, and completes the data acquisition and analysis of both plates in about 30 min. Many laboratories that perform CYP inhibition screening automate only part of the process, leaving a throughput bottleneck within the workflow. The protocols described in this chapter aim to streamline the entire process from assay to data acquisition and processing by incorporating automation and utilizing high-precision instruments to maximize throughput and minimize bottlenecks.

  8. Discovery of novel targets with high throughput RNA interference screening.

    Science.gov (United States)

    Kassner, Paul D

    2008-03-01

    High throughput technologies have the potential to affect all aspects of drug discovery. Considerable attention is paid to high throughput screening (HTS) for small molecule lead compounds. The identification of the targets that enter those HTS campaigns had been driven by basic research until the advent of genomics-level data acquisition such as sequencing and gene expression microarrays. Large-scale profiling approaches (e.g., microarrays, protein analysis by mass spectrometry, and metabolite profiling) can yield vast quantities of data and important information. However, these approaches usually require painstaking in silico analysis and low-throughput basic wet-lab research to identify the function of a gene and validate the gene product as a potential therapeutic drug target. Functional genomic screening offers the promise of direct identification of genes involved in phenotypes of interest. In this review, RNA interference (RNAi)-mediated loss-of-function screens are discussed, as well as their utility in target identification. Some of the genes identified in these screens should produce similar phenotypes if their gene products are antagonized with drugs. With a carefully chosen phenotype, an understanding of the biology of RNAi and an appreciation of the limitations of RNAi screening, there is great potential for the discovery of new drug targets.

  9. Fundamentals of crystallography

    CERN Document Server

    2011-01-01

    Crystallography is a basic tool for scientists in many diverse disciplines. This text offers a clear description of fundamentals and of modern applications. It supports curricula in crystallography at undergraduate level.

  10. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
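
    The closed-loop idea sketched in this record can be illustrated with a toy proportional controller: an in-situ measurement is compared with a target each cycle and the process setpoint is corrected. The plant response, bias and gain below are invented for illustration and do not come from the paper.

        def run_control(target_nm=100.0, setpoint=1.0, gain=0.005, cycles=10):
            """Toy closed-loop control: measure, compare with target, correct."""
            for k in range(cycles):
                measured = 80.0 * setpoint + 5.0      # assumed process response plus bias
                error = target_nm - measured
                setpoint += gain * error              # proportional correction
                print(f"cycle {k}: measured {measured:.1f} nm, new setpoint {setpoint:.3f}")

        run_control()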

  11. HTRF(®): pioneering technology for high-throughput screening.

    Science.gov (United States)

    Degorce, François

    2006-12-01

    Cisbio international pioneered the field of homogeneous fluorescence methodologies, and time-resolved fluorescence resonance energy transfer (TR-FRET) in particular, through its proprietary technology, HTRF(®). The development was based on Prof. Jean-Marie Lehn's research on rare earth fluorescence properties (for which he was awarded the Nobel Prize in Chemistry in 1987) and on Cisbio's expertise in homogeneous time-resolved fluorescence (HTRF). The technology is used in assay development and drug screening, most notably in high-throughput screening applications. This highly powerful technology is particularly applied to the areas of G-protein-coupled receptor and kinase screening, as well as a series of targets related to inflammation, metabolic diseases and CNS disorders.

  12. SSFinder: High Throughput CRISPR-Cas Target Sites Prediction Tool

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Upadhyay

    2014-01-01

    The clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein (Cas) system facilitates targeted genome editing in organisms. Despite the high demand for this system, finding a reliable tool for the determination of specific target sites in large genomic datasets has remained challenging. Here, we report SSFinder, a python script to perform high throughput detection of specific target sites in large nucleotide datasets. SSFinder is a user-friendly tool, compatible with Windows, Mac OS, and Linux operating systems, and freely available online.
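
    The kind of pattern matching such a tool performs can be sketched in a few lines: scan a sequence for 20-nt protospacers followed by an NGG PAM on the forward strand. This is only an illustration of the general idea, not the SSFinder script itself, and it ignores the reverse strand and off-target scoring.

        import re

        def find_cas9_sites(seq):
            """Return (position, protospacer, PAM) for forward-strand N20-NGG sites."""
            seq = seq.upper()
            # A lookahead allows overlapping hits; group 1 is the protospacer, group 2 the PAM.
            pattern = re.compile(r"(?=([ACGT]{20})([ACGT]GG))")
            return [(m.start(), m.group(1), m.group(2)) for m in pattern.finditer(seq)]

        demo = "TTGACTGACTGACTGACTGACTGAGGTTTACGTACGTACGTACGTACGTACGG"
        for pos, protospacer, pam in find_cas9_sites(demo):
            print(pos, protospacer, pam)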

  13. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S r...... to the presence of filamentous microorganisms was monitored weekly over 4 months. Microthrix was identified as a causative filament and suitable control measures were introduced. The level of Microthrix was reduced after 1-2 months but a number of other filamentous species were still present, with most of them...

  14. High-throughput screening to enhance oncolytic virus immunotherapy

    Directory of Open Access Journals (Sweden)

    Allan KJ

    2016-04-01

    KJ Allan, David F Stojdl, SL Swift (Children's Hospital of Eastern Ontario (CHEO) Research Institute; Department of Biology, Microbiology and Immunology; Department of Pediatrics, University of Ottawa, Ottawa, ON, Canada). Abstract: High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. Keywords: oncolytic, virus, screen, high-throughput, cancer, chemical, genomic, immunotherapy

  15. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
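
    The general idea described in this report can be sketched as follows: an incoming observation is kept only if it is sufficiently dissimilar to what has already been retained. The choice of cosine similarity and the threshold below are illustrative assumptions; the report does not prescribe them.

        import numpy as np

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def adaptive_sample(stream, threshold=0.5):
            """Keep a point only if its maximum similarity to the retained set is below threshold."""
            kept = []
            for x in stream:
                if not kept or max(cosine(x, k) for k in kept) < threshold:
                    kept.append(x)
            return kept

        rng = np.random.default_rng(0)
        stream = rng.normal(size=(1000, 8))          # synthetic high-throughput stream
        print("kept", len(adaptive_sample(stream)), "of 1000 observations")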

  16. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  17. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individua...... toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families....

  18. High-throughput DNA sequencing: a genomic data manufacturing process.

    Science.gov (United States)

    Huang, G M

    1999-01-01

    The progress trends in automated DNA sequencing operation are reviewed. Technological development in sequencing instruments, enzymatic chemistry and robotic stations has resulted in ever-increasing capacity of sequence data production. This progress leads to a higher demand on laboratory information management and data quality assessment. High-throughput laboratories face the challenge of organizational management, as well as technology management. Engineering principles of process control should be adopted in this biological data manufacturing procedure. While various systems attempt to provide solutions to automate different parts of, or even the entire process, new technical advances will continue to change the paradigm and provide new challenges.

  19. High-throughput synthesis and analysis of acylated cyanohydrins.

    Science.gov (United States)

    Hamberg, Anders; Lundgren, Stina; Wingstrand, Erica; Moberg, Christina; Hult, Karl

    2007-01-01

    The yields and optical purities of products obtained from chiral Lewis acid/Lewis base-catalysed additions of alpha-ketonitriles to prochiral aldehydes could be accurately determined by an enzymatic method. The amount of remaining aldehyde was determined after its reduction to an alcohol, whilst the two product enantiomers were analysed after subsequent hydrolysis first by the (S)-selective Candida antarctica lipase B and then by the unselective pig liver esterase. The method could be used for analysis of products obtained from a number of aromatic aldehydes and aliphatic ketonitriles. Microreactor technology was successfully combined with high-throughput analysis for efficient catalyst optimization.
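
    For orientation, the quantities reported by such an assay reduce to simple arithmetic: chemical yield from product versus remaining aldehyde, and enantiomeric excess from the two enantiomer amounts. The amounts below are hypothetical; only the standard formulas are shown.

        def yield_and_ee(n_R, n_S, n_aldehyde_left):
            """Chemical yield (%) and enantiomeric excess (%) from assayed amounts."""
            n_product = n_R + n_S
            chem_yield = 100.0 * n_product / (n_product + n_aldehyde_left)
            ee = 100.0 * abs(n_R - n_S) / n_product
            return chem_yield, ee

        y, ee = yield_and_ee(n_R=0.42, n_S=0.05, n_aldehyde_left=0.10)
        print(f"yield = {y:.1f}%  ee = {ee:.1f}%")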

  20. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured at an efficiency of 91% and 54% at flow rates of 1 mL/min and 4 mL/min, respectively. Integration...... of soft magnetic elements in the chip leads to a slightly higher capturing efficiency and a more uniform distribution of captured beads over the separation chamber than the system without soft magnetic elements....

  1. Computational Proteomics: High-throughput Analysis for Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers the global profiling of proteins from a biological system. The HTP technological advances are fueling a revolution in biology, enabling analyses at the scales of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems level investigations are relying more and more on computational analyses, especially in the field of proteomics, which generates large-scale global data.

  2. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.
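
    Spectrophotometric assays of this kind ultimately convert an absorbance-versus-time trace into a rate via the Beer-Lambert law (rate = slope / (epsilon x path length)). The sketch below assumes a chromophore extinction coefficient typical of nitrophenolate and an invented linear trace; these numbers are illustrative only.

        import numpy as np

        def initial_rate_uM_per_min(t_min, absorbance, epsilon_M_cm=18000.0, path_cm=1.0):
            """Initial rate from the slope dA/dt, converted from M/min to uM/min."""
            slope_per_min = np.polyfit(t_min, absorbance, 1)[0]
            return slope_per_min / (epsilon_M_cm * path_cm) * 1e6

        t = np.linspace(0, 10, 11)          # minutes
        a = 0.05 + 0.018 * t                # hypothetical absorbance trace
        print(f"{initial_rate_uM_per_min(t, a):.1f} uM/min")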

  3. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
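
    The batch-processing idea in this record can be illustrated by evaluating several traits in parallel worker processes instead of sequentially. The trait names and the placeholder workload below are hypothetical; a real run would call a genomic-prediction model instead.

        from multiprocessing import Pool

        def evaluate_trait(trait):
            # Placeholder for a computationally heavy genomic-prediction run.
            return trait, sum(hash(trait + str(i)) % 97 for i in range(100_000))

        if __name__ == "__main__":
            traits = ["milk_yield", "fat_pct", "protein_pct", "fertility"]
            with Pool(processes=4) as pool:
                for trait, score in pool.map(evaluate_trait, traits):
                    print(trait, score)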

  4. New high-throughput methods of investigating polymer electrolytes

    Science.gov (United States)

    Alcock, Hannah J.; White, Oliver C.; Jegelevicius, Grazvydas; Roberts, Matthew R.; Owen, John R.

    2011-03-01

    Polymer electrolyte films have been prepared by solution casting techniques from precursor solutions of a poly(vinylidene fluoride-co-hexafluoropropylene) (PVdF-HFP), lithium-bis(trifluoromethane) sulfonimide (LiTFSI), and propylene carbonate (PC). Arrays of graded composition were characterised by electrochemical impedance spectroscopy (EIS), differential scanning calorimetry (DSC) and X-ray diffraction (XRD) using high throughput techniques. Impedance analysis showed the resistance of the films as a function of LiTFSI, PC and polymer content. The ternary plot of conductivity shows an area that combines a solid-like mechanical stability with high conductivity, 1 × 10⁻⁵ S cm⁻¹ at the composition 0.55/0.15/0.30 (by weight) PVdF-HFP/LiTFSI/PC, increasing with PC content. In regions with less than a 50 wt% fraction of PVdF-HFP the films were too soft to give meaningful results by this method. The DSC measurements on solvent free, salt-doped polymers show a reduced crystallinity, and high throughput XRD patterns show that non-polar crystalline phases are suppressed by the presence of LiTFSI and PC.
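
    The conductivity values behind such a ternary map follow from the bulk resistance extracted from each impedance spectrum together with the film geometry, sigma = thickness / (resistance x area). The numbers below are made up solely to show the arithmetic.

        def conductivity_S_per_cm(R_bulk_ohm, thickness_cm, area_cm2):
            """Ionic conductivity of a film from its bulk resistance and geometry."""
            return thickness_cm / (R_bulk_ohm * area_cm2)

        sigma = conductivity_S_per_cm(R_bulk_ohm=2.0e3, thickness_cm=0.02, area_cm2=1.0)
        print(f"{sigma:.1e} S/cm")   # 1.0e-05 S/cm, comparable to the value quoted above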

  5. High-throughput protein analysis integrating bioinformatics and experimental assays.

    Science.gov (United States)

    del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan

    2004-01-01

    The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis such as similarity searches, protein domain architecture determination and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISSPROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL-Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administrating relevant biological data from high-throughput investigations of cDNAs in order to systematically identify and characterize novel genes, as well as to comprehensively describe the function of the encoded proteins.

  6. Probabilistic Assessment of High-Throughput Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Robin E. Kim

    2016-05-01

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieve a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess the network reliability. However, because the WSS networks for SHM purpose often require high data throughput, i.e., a larger number of packets are delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily-available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small-scale and a full-bridge wireless network. By performing the proposed analysis in complex sensor networks, an optimized sensor topology can be achieved.
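
    The Monte Carlo step described here can be sketched generically: sample the communication quality from some distribution, evaluate a limit-state function g, and estimate the failure probability as the fraction of samples with g < 0. The beta distribution and the 0.95 delivery requirement below are assumptions for illustration, not the paper's empirical limit-state model.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        packet_success = rng.beta(a=50, b=2, size=n)   # assumed per-transfer delivery ratio
        required_ratio = 0.95                          # assumed service requirement
        g = packet_success - required_ratio            # limit state: failure when g < 0
        print(f"P(communication failure) ~ {np.mean(g < 0):.3f}")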

  7. High throughput discovery of new fouling-resistant surfaces

    Science.gov (United States)

    Zhou, Mingyan; Liu, Hongwei; Venkiteshwaran, Adith; Kilduff, James; Anderson, Daniel G.; Langer, Robert; Belfort, Georges

    2017-01-01

    A novel high throughput method for synthesis and screening of customized protein-resistant surfaces was developed. This method is an inexpensive, fast, reproducible and scalable approach to synthesize and screen protein-resistant surfaces appropriate for a specific feed. The method is illustrated here by combining a high-throughput platform (HTP) approach with our patented photo-induced graft polymerization (PGP) method developed for facile modification of commercial poly(aryl sulfone) membranes. We demonstrate that the HTP–PGP approach to synthesize and screen fouling-resistant surfaces is general, and thus provides the capability to develop surfaces optimized for specific feeds. Surfaces were prepared via graft polymerization onto poly(ether sulfone) (PES) membranes and were evaluated using a protein adsorption assay followed by pressure-driven filtration. We have employed the HTP–PGP approach to confirm previously reported successful monomers and to develop new antifouling surfaces from a library of 66 monomers for four different challenges of interest to the biotechnology community: hen egg-white lysozyme, supernatant from Chinese Hamster Ovary (CHO) cells in phosphate buffered saline (PBS) solution as a model cell suspension, and immunoglobulin G (IgG) precipitated in the absence and presence of bovine serum albumin (BSA) in high salt solution as a model precipitation process.

  8. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, each with their own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed at high throughput (more than 3500 fragments per day). Thirty hits were defined, which subsequently entered a secondary screening using an active site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (KD) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore it promises to be a valuable method for fragment screening.

  9. Probabilistic Assessment of High-Throughput Wireless Sensor Networks.

    Science.gov (United States)

    Kim, Robin E; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F; Song, Junho

    2016-05-31

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieve a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess the network reliability. However, because the WSS networks for SHM purpose often require high data throughput, i.e., a larger number of packets are delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily-available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small-scale and a full-bridge wireless network. By performing the proposed analysis in complex sensor networks, an optimized sensor topology can be achieved.

  10. Benchmarking procedures for high-throughput context specific reconstruction algorithms

    Directory of Open Access Journals (Sweden)

    Maria ePires Pacheco

    2016-01-01

    Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context specific reconstruction based on generic genome scale models like ReconX (Duarte et al., 2007; Thiele et al., 2013) or HMR (Agren et al., 2013) has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context specific reconstruction algorithms have been published in the last ten years, only a fraction of them are suitable for model building based on human high-throughput data. Among other reasons, this might be due to problems arising from the limitation to only one metabolic target function or arbitrary thresholding. This review describes and analyses common validation methods used for testing model building algorithms. Two major methods can be distinguished: consistency testing and comparison-based testing. The former includes methods like cross validation or testing with artificial networks. The latter covers methods comparing sets of functionalities, comparison with existing networks or additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms.

  11. Evaluation of a high throughput starch analysis optimised for wood.

    Directory of Open Access Journals (Sweden)

    Chandra Bellasio

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  12. Evaluation of a high throughput starch analysis optimised for wood.

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  13. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  14. A high-throughput cidality screen for Mycobacterium tuberculosis.

    Directory of Open Access Journals (Sweden)

    Parvinder Kaur

    Exposure to Mycobacterium tuberculosis (Mtb) aerosols is a major threat to tuberculosis (TB) researchers, even in bio-safety level-3 (BSL-3) facilities. Automation and high-throughput screens (HTS) in BSL-3 facilities are essential for minimizing manual aerosol-generating interventions and facilitating TB research. In the present study, we report the development and validation of a high-throughput, 24-well 'spot-assay' for selecting bactericidal compounds against Mtb. The bactericidal screen concept was first validated in the fast-growing surrogate Mycobacterium smegmatis (Msm) and subsequently confirmed in Mtb using the following reference anti-tubercular drugs: rifampicin, isoniazid, ofloxacin and ethambutol (RIOE), acting on different targets. The potential use of the spot-assay to select bactericidal compounds from a large library was confirmed by screening on Mtb, with parallel plating by the conventional gold standard method (correlation, r² = 0.808). An automated spot-assay further enabled an MBC90 determination on resistant and sensitive Mtb clinical isolates. The implementation of the spot-assay in kinetic screens to enumerate residual Mtb after either genetic silencing (anti-sense RNA, AS-RNA) or chemical inhibition corroborated its ability to detect cidality. This relatively simple, economical and quantitative HTS considerably minimized the bio-hazard risk and enabled the selection of novel vulnerable Mtb targets and mycobactericidal compounds. Thus, spot-assays have great potential to impact the TB drug discovery process.

  15. A scalable approach for high throughput branch flow filtration.

    Science.gov (United States)

    Inglis, David W; Herman, Nick

    2013-05-07

    Microfluidic continuous flow filtration methods have the potential for very high size resolution using minimum feature sizes that are larger than the separation size, thereby circumventing the problem of clogging. Branch flow filtration is particularly promising because it has an unlimited dynamic range (ratio of largest passable particle to the smallest separated particle) but suffers from very poor volume throughput because when many branches are used, they cannot be identical if each is to have the same size cut-off. We describe a new iterative approach to the design of branch filtration devices able to overcome this limitation without large dead volumes. This is demonstrated by numerical modelling, fabrication and testing of devices with 20 branches, with dynamic ranges up to 6.9, and high filtration ratios (14-29%) on beads and fungal spores. The filters have a sharp size cutoff (10× depletion for 12% size difference), with large particle rejection equivalent to a 20th order Butterworth low pass filter. The devices are fully scalable, enabling higher throughput and smaller cutoff sizes and they are compatible with ultra low cost fabrication.
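
    The Butterworth analogy quoted in this record can be made concrete: if passage probability is modelled as a 20th-order Butterworth low-pass response in particle size, a 12% size difference gives roughly a tenfold depletion, consistent with the figures above. The model and the size ratios below are only an illustration of that analogy, not the authors' design equations.

        def butterworth_passage(d, dc, n=20):
            """Passage probability of a particle of diameter d for cutoff dc and order n."""
            return 1.0 / (1.0 + (d / dc) ** (2 * n)) ** 0.5

        for ratio in (0.9, 1.0, 1.12, 1.5):
            print(f"d/dc = {ratio:4.2f} -> passage = {butterworth_passage(ratio, 1.0):.3f}")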

  16. New technologies for ultra-high throughput genotyping in plants.

    Science.gov (United States)

    Appleby, Nikki; Edwards, David; Batley, Jacqueline

    2009-01-01

    Molecular genetic markers represent one of the most powerful tools for the analysis of plant genomes and the association of heritable traits with underlying genetic variation. Molecular marker technology has developed rapidly over the last decade, with the development of high-throughput genotyping methods. Two forms of sequence-based marker, simple sequence repeats (SSRs, also known as microsatellites) and single nucleotide polymorphisms (SNPs), now predominate applications in modern plant genetic analysis, alongside anonymous marker systems such as amplified fragment length polymorphisms (AFLPs) and diversity array technology (DArT). The reducing cost of DNA sequencing and increasing availability of large sequence data sets permit the mining of these data for large numbers of SSRs and SNPs. These may then be used in applications such as genetic linkage analysis and trait mapping, diversity analysis, association studies and marker-assisted selection. Here, we describe automated methods for the discovery of molecular markers and new technologies for high-throughput, low-cost molecular marker genotyping. Genotyping examples include multiplexing of SSRs using Multiplex-Ready marker technology (MRT); DArT genotyping; SNP genotyping using the Invader assay, the single base extension (SBE), the oligonucleotide ligation assay (OLA) SNPlex system, and Illumina GoldenGate and Infinium methods.

  17. High throughput instruments, methods, and informatics for systems biology.

    Energy Technology Data Exchange (ETDEWEB)

    Sinclair, Michael B.; Cowie, Jim R. (New Mexico State University, Las Cruces, NM); Van Benthem, Mark Hilary; Wylie, Brian Neil; Davidson, George S.; Haaland, David Michael; Timlin, Jerilyn Ann; Aragon, Anthony D. (University of New Mexico, Albuquerque, NM); Keenan, Michael Robert; Boyack, Kevin W.; Thomas, Edward Victor; Werner-Washburne, Margaret C. (University of New Mexico, Albuquerque, NM); Mosquera-Caro, Monica P. (University of New Mexico, Albuquerque, NM); Martinez, M. Juanita (University of New Mexico, Albuquerque, NM); Martin, Shawn Bryan; Willman, Cheryl L. (University of New Mexico, Albuquerque, NM)

    2003-12-01

    High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes, or, for instance, in responses to therapies. Microarrays represent one such high throughput method, which continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data to dramatically improve the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.

  18. High-throughput screening to enhance oncolytic virus immunotherapy

    Science.gov (United States)

    Allan, KJ; Stojdl, David F; Swift, SL

    2016-01-01

    High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. PMID:27579293

  19. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into the relationships between genotype and phenotype. This need is due to the advent of high throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or even tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch and the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between the genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  20. High-throughput nanoparticle catalysis: partial oxidation of propylene.

    Science.gov (United States)

    Duan, Shici; Kahn, Michael; Senkan, Selim

    2007-02-01

    Partial oxidation of propylene was investigated at 1 atm pressure over Rh/TiO₂ catalysts as a function of reaction temperature, metal loading and particle size using high-throughput methods. Catalysts were prepared by ablating thin sheets of pure rhodium metal using an excimer laser and by collecting the nanoparticles created on the external surfaces of TiO₂ pellets that were placed inside the ablation plume. Rh nanoparticles before the experiments were characterized by transmission electron microscopy (TEM) by collecting them on carbon film. Catalyst evaluations were performed using a high-throughput array channel microreactor system coupled to quadrupole mass spectrometry (MS) and gas chromatography (GC). The reaction conditions were 23% C₃H₆, 20% O₂ and the balance helium in the feed, 20,000 h⁻¹ GHSV and a temperature range of 250–325 °C. The reaction products included primarily acetone (AT) and to a lesser degree propionaldehyde (PaL) as the C₃ products, together with deep oxidation products COₓ.
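
    The figures of merit behind such catalyst screening reduce to simple mole bookkeeping: conversion from feed versus residual propylene, and selectivity from the distribution of C₃-equivalent products. The mole numbers below are invented for illustration and are carbon-balance consistent, but they are not data from the paper.

        def conversion_and_selectivity(c3h6_in, c3h6_out, acetone, propionaldehyde, cox):
            """Propylene conversion (%) and acetone selectivity (%) on a C3 basis."""
            converted = c3h6_in - c3h6_out
            conversion = 100.0 * converted / c3h6_in
            c3_basis = acetone + propionaldehyde + cox / 3.0   # COx counts 1/3 per carbon
            selectivity_at = 100.0 * acetone / c3_basis
            return conversion, selectivity_at

        conv, sel = conversion_and_selectivity(
            c3h6_in=1.00, c3h6_out=0.90, acetone=0.06, propionaldehyde=0.01, cox=0.09)
        print(f"conversion = {conv:.1f}%  acetone selectivity = {sel:.1f}%")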

  1. Compression of structured high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Fabien Campagne

    Large biological datasets are being produced at a rapid pace and create substantial storage challenges, particularly in the domain of high-throughput sequencing (HTS). Most approaches currently used to store HTS data are either unable to quickly adapt to the requirements of new sequencing or analysis methods (because they do not support schema evolution), or fail to provide state-of-the-art compression of the datasets. We have devised new approaches to store HTS data that support seamless data schema evolution and compress datasets substantially better than existing approaches. Building on these new approaches, we discuss and demonstrate how a multi-tier data organization can dramatically reduce the storage, computational and network burden of collecting, analyzing, and archiving large sequencing datasets. For instance, we show that spliced RNA-Seq alignments can be stored in less than 4% the size of a BAM file with perfect data fidelity. Compared to the previous compression state of the art, these methods reduce dataset size by more than 40% when storing exome, gene expression or DNA methylation datasets. The approaches have been integrated in a comprehensive suite of software tools (http://goby.campagnelab.org) that support common analyses for a range of high-throughput sequencing assays.
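
    For readers unfamiliar with how such size comparisons are reported, the arithmetic is simply compressed size over raw size. The sketch below uses gzip on a synthetic FASTQ-like byte string purely as a stand-in; it does not reproduce the Goby codecs or formats described in the record.

        import gzip

        raw = ("@read1\nACGTACGTACGTACGTACGT\n+\nIIIIIIIIIIIIIIIIIIII\n" * 5000).encode()
        compressed = gzip.compress(raw)
        print(f"compressed size = {100 * len(compressed) / len(raw):.1f}% of the raw bytes")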

  2. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
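
    The core quantitation step of foci analysis can be sketched in a few lines: threshold the image, label connected bright regions, and report the integrated intensity of each focus. This is only an illustration of the idea on a synthetic image, not the FociQuant tool itself, and the threshold is arbitrary.

        import numpy as np
        from scipy import ndimage

        def quantify_foci(image, threshold):
            """Integrated intensity of each connected above-threshold region."""
            mask = image > threshold
            labels, n = ndimage.label(mask)
            intensities = ndimage.sum(image, labels, index=range(1, n + 1))
            return np.asarray(intensities)

        # Synthetic image with two bright foci on a dim background.
        img = np.full((64, 64), 10.0)
        img[10:13, 10:13] += 200.0
        img[40:44, 50:54] += 150.0
        print(quantify_foci(img, threshold=50.0))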

  3. Field high-throughput phenotyping: the new crop breeding frontier.

    Science.gov (United States)

    Araus, José Luis; Cairns, Jill E

    2014-01-01

    Constraints in field phenotyping capability limit our ability to dissect the genetics of quantitative traits, particularly those related to yield and stress tolerance (e.g., yield potential as well as increased drought, heat tolerance, and nutrient efficiency, etc.). The development of effective field-based high-throughput phenotyping platforms (HTPPs) remains a bottleneck for future breeding advances. However, progress in sensors, aeronautics, and high-performance computing is paving the way. Here, we review recent advances in field HTPPs, which should combine, at an affordable cost, a high capacity for data recording, scoring and processing, non-invasive remote sensing methods, and automated environmental data collection. Laboratory analyses of key plant parts may complement direct phenotyping under field conditions. Improvements in user-friendly data management together with a more powerful interpretation of results should increase the use of field HTPPs, therefore increasing the efficiency of crop genetic improvement to meet the needs of future generations.

  4. A high-throughput Raman notch filter set

    Science.gov (United States)

    Puppels, G. J.; Huizinga, A.; Krabbe, H. W.; de Boer, H. A.; Gijsbers, G.; de Mul, F. F. M.

    1990-12-01

    A chevron-type Raman notch filter (RNF) set is described. It combines a high signal throughput (up to 90% around 1600 cm⁻¹ and ≳80% between 700 and 2700 cm⁻¹) with a laser line suppression of 10⁸–10⁹. The filter set can be used to replace the first two dispersion stages in triple-stage Raman monochromators commonly employed in multichannel detection systems. This yields a gain in intensity of the detected Raman signal of a factor of 4. It is shown that in Raman spectrometers with a backscatter geometry, the filter set can also be used to optically couple the microscope and the spectrometer. This leads to a further increase in signal intensity of a factor of 3-4 as compared to the situation where a beam splitter is used. Additional advantages of the RNF set are the fact that signal throughput is almost polarization independent over a large spectral interval and that it offers the possibility to simultaneously record Stokes and anti-Stokes spectra.

  5. High-throughput comet assay using 96 minigels.

    Science.gov (United States)

    Gutzkow, Kristine B; Langleite, Torgrim M; Meier, Silja; Graupner, Anne; Collins, Andrew R; Brunborg, Gunnar

    2013-05-01

    The single-cell gel electrophoresis--the comet assay--has proved to be a sensitive and relatively simple method that is much used in research for the analysis of specific types of DNA damage, and its use in genotoxicity testing is increasing. The efficiency of the comet assay, in terms of number of samples processed per experiment, has been rather poor, and both research and toxicological testing should profit from an increased throughput. We have designed and validated a format involving 96 agarose minigels supported by a hydrophilic polyester film. Using simple technology, hundreds of samples may be processed in one experiment by one person, with less time needed for processing, less use of chemicals and requiring fewer cells per sample. Controlled electrophoresis, including circulation of the electrophoresis solution, improves the homogeneity between replicate samples in the 96-minigel format. The high-throughput method described in this paper should greatly increase the overall capacity, versatility and robustness of the comet assay.

  6. Adaptation to high throughput batch chromatography enhances multivariate screening.

    Science.gov (United States)

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions is assessed in combination with the other factors explored.
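
    Treating every well as its own experiment amounts to laying a full-factorial design over the plate. The sketch below maps hypothetical load, wash pH and elution salt levels onto 96 wells; the factor levels are invented examples, not the conditions used in the study.

        from itertools import product
        from string import ascii_uppercase

        loads_mg_per_mL = [10, 20, 30, 40]          # protein load per mL resin (assumed levels)
        wash_pH = [5.0, 6.0, 7.0, 8.0]
        elution_salt_mM = [50, 150, 250, 350, 450, 550]

        wells = [f"{row}{col}" for row in ascii_uppercase[:8] for col in range(1, 13)]
        design = list(product(loads_mg_per_mL, wash_pH, elution_salt_mM))   # 96 combinations
        assert len(design) <= len(wells)

        for well, (load, pH, salt) in zip(wells, design):
            print(well, load, pH, salt)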

  7. Towards high-throughput microfluidic Raman-activated cell sorting.

    Science.gov (United States)

    Zhang, Qiang; Zhang, Peiran; Gou, Honglei; Mou, Chunbo; Huang, Wei E; Yang, Menglong; Xu, Jian; Ma, Bo

    2015-09-21

    Raman-activated cell sorting (RACS) is a promising single-cell analysis technology that is able to identify and isolate individual cells of targeted type, state or environment from an isogenic population or complex consortium of cells, in a label-free and non-invasive manner. However, compared with those widely used yet labeling-required or staining-dependent cell sorting technologies such as FACS and MACS, the weak Raman signal greatly limits the further development of the existing RACS systems to achieve higher throughput. Strategies that can tackle this bottleneck include, first, improvement of Raman-acquisition efficiency and quality based on advanced Raman spectrometers and enhanced Raman techniques; second, development of novel microfluidic devices for cell sorting followed by integration into a complete RACS system. Exploiting these strategies, prototypes for a new generation of RACS have been demonstrated, such as flow-based OT-RACS, DEP-RACS, and SERS/CARS flow cytometry. Such high-throughput microfluidic RACS can provide biologists with a powerful single-cell analysis tool to explore the scientific questions or applications that have been beyond the reach of FACS and MACS.

  8. A high-throughput screen for antibiotic drug discovery.

    Science.gov (United States)

    Scanlon, Thomas C; Dostal, Sarah M; Griswold, Karl E

    2014-02-01

    We describe an ultra-high-throughput screening platform enabling discovery and/or engineering of natural product antibiotics. The methodology involves creation of hydrogel-in-oil emulsions in which recombinant microorganisms are co-emulsified with bacterial pathogens; antibiotic activity is assayed by use of a fluorescent viability dye. We have successfully utilized both bulk emulsification and microfluidic technology for the generation of hydrogel microdroplets that are size-compatible with conventional flow cytometry. Hydrogel droplets are ∼25 pL in volume, and can be synthesized and sorted at rates exceeding 3,000 drops/s. Using this technique, we have achieved screening throughputs exceeding 5 million clones/day. Proof-of-concept experiments demonstrate efficient selection of antibiotic-secreting yeast from a vast excess of negative controls. In addition, we have successfully used this technique to screen a metagenomic library for secreted antibiotics that kill the human pathogen Staphylococcus aureus. Our results establish the practical utility of the screening platform, and we anticipate that the accessible nature of our methods will enable others seeking to identify and engineer the next generation of antibacterial biomolecules.

  9. A robust robotic high-throughput antibody purification platform.

    Science.gov (United States)

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market, with annual sales of more than US$40 billion in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns® from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO® from Tecan) has been a successful approach to establishing high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated two-step purification (Protein A followed by desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at the sub-milligram level, with yields of up to 2 mg from 4 mL of cell-free culture supernatant.

  10. Acoustic transfer of protein crystals from agarose pedestals to micromeshes for high-throughput screening.

    Science.gov (United States)

    Cuttitta, Christina M; Ericson, Daniel L; Scalia, Alexander; Roessler, Christian G; Teplitsky, Ella; Joshi, Karan; Campos, Olven; Agarwal, Rakhi; Allaire, Marc; Orville, Allen M; Sweet, Robert M; Soares, Alexei S

    2015-01-01

    Acoustic droplet ejection (ADE) is an emerging technology with broad applications in serial crystallography such as growing, improving and manipulating protein crystals. One application of this technology is to gently transfer crystals onto MiTeGen micromeshes with minimal solvent. Once mounted on a micromesh, each crystal can be combined with different chemicals such as crystal-improving additives or a fragment library. Acoustic crystal mounting is fast (2.33 transfers s⁻¹) and all transfers occur in a sealed environment that is in vapor equilibrium with the mother liquor. Here, a system is presented to retain crystals near the ejection point and away from the inaccessible dead volume at the bottom of the well by placing the crystals on a concave agarose pedestal (CAP) with the same chemical composition as the crystal mother liquor. The bowl-shaped CAP is impenetrable to crystals. Consequently, gravity will gently move the crystals into the optimal location for acoustic ejection. It is demonstrated that an agarose pedestal of this type is compatible with most commercially available crystallization conditions and that protein crystals are readily transferred from the agarose pedestal onto micromeshes with no loss in diffraction quality. It is also shown that crystals can be grown directly on CAPs, which avoids the need to transfer the crystals from the hanging drop to a CAP. This technology has been used to combine thermolysin and lysozyme crystals with an assortment of anomalously scattering heavy atoms. The results point towards a fast nanolitre method for crystal mounting and high-throughput screening.

  11. High-throughput microcavitation bubble induced cellular mechanotransduction

    Science.gov (United States)

    Compton, Jonathan Lee

    …inhibitor of IP3-induced Ca2+ release. This capability opens the way to a high-throughput screening platform for molecules that modulate cellular mechanotransduction. We have applied this approach to screen the effects of a small set of small molecules in a 96-well plate in less than an hour. These detailed studies offer a basis for the design, development, and implementation of a novel assay to rapidly screen the effects of small molecules on cellular mechanotransduction at high throughput.

  12. High-Throughput Tools for Characterization of Antibody Epitopes

    DEFF Research Database (Denmark)

    Christiansen, Anders

    …it is important to characterize antibodies thoroughly. In parallel to the characterization of antibodies, it is also important to characterize the binding area that is recognized by the antibody, known as an epitope. With the development of new technologies, such as high-throughput sequencing (HTS)… In this study, these improvements were utilized to characterize epitopes at high resolution, i.e. to determine the importance of each residue for antibody binding, for all major peanut allergens. Epitope reactivity among patients often converged on known epitope hotspots; however, the binding patterns were somewhat… multiple years. Taken together, the presented studies demonstrate new applications for the investigated techniques, focusing on their utilization in epitope mapping. In the process, new insights were obtained into how antibodies recognize their targets in a major disease, i.e. food allergy.

  13. Single-platelet nanomechanics measured by high-throughput cytometry

    Science.gov (United States)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2016-10-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  14. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    Science.gov (United States)

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard Henry

    2017-08-18

    A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods with a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparisons against high-level theoretical calculations show that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will be further improved by accounting for internal rotor contributions and by better methods for determining molecular symmetry.
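
    For background, the canonical transition state theory rate expression that such a kinetics calculator evaluates for a bimolecular reaction A + B is the standard textbook form (shown here as general context, not as the specific AutoTST implementation):

    ```latex
    k(T) \;=\; \kappa(T)\,\frac{k_{\mathrm{B}}T}{h}\,
    \frac{Q^{\ddagger}(T)}{Q_{\mathrm{A}}(T)\,Q_{\mathrm{B}}(T)}\,
    \exp\!\left(-\frac{E_{0}}{k_{\mathrm{B}}T}\right)
    ```

    where κ(T) is a tunnelling correction, the Q are partition functions per unit volume, and E₀ is the zero-point-corrected barrier height.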

  15. High-throughput antibody development and retrospective epitope mapping

    DEFF Research Database (Denmark)

    Rydahl, Maja Gro

    Plant cell walls are composed of an interlinked network of polysaccharides, glycoproteins and phenolic polymers. When addressing the diverse polysaccharides in green plants, including land plants and the ancestral green algae, there are significant overlaps in the cell wall structures. Yet… of green algae, during the development into land plants. Hence, there is a pressing need to rethink the glycomic toolbox by developing new, high-throughput (HTP) technologies in order to acquire information on the location and relative abundance of diverse cell wall polymers. In this dissertation…, there are noteworthy differences in the less evolved species of algae as compared to land plants. The dynamic process orchestrating the deposition of these biopolymers, both in algae and higher plants, is complex and highly heterogeneous, yet immensely important for the development and differentiation of the cell…

  16. Numerical techniques for high-throughput reflectance interference biosensing

    Science.gov (United States)

    Sevenler, Derin; Ünlü, M. Selim

    2016-06-01

    We have developed a robust and rapid computational method for processing the raw spectral data collected from thin-film optical interference biosensors. We have applied this method to Interference Reflectance Imaging Sensor (IRIS) measurements and observed a 10,000-fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. An additional benefit is that the LUT method can be used with a wider range of interference-layer thicknesses and experimental configurations, including those that are incompatible with methods requiring a fit of the spectral response.
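
    A minimal sketch of the lookup-table idea described above (not the authors' IRIS code; the optical model, wavelength range and function names are illustrative): precompute reflectance spectra over a grid of film thicknesses once, then match each measured spectrum to its nearest table entry instead of running a nonlinear fit per pixel.

    ```python
    # Minimal LUT sketch for thin-film thickness estimation from reflectance spectra.
    # The interference model below is a stand-in; a real sensor would use its own model.
    import numpy as np

    wavelengths_nm = np.linspace(450, 750, 121)        # spectral sampling
    thickness_grid_nm = np.arange(0.0, 500.0, 0.5)     # LUT resolution: 0.5 nm

    def model_spectrum(thickness_nm, n_film=1.45):
        """Illustrative two-beam interference model: reflectance vs optical path."""
        phase = 4.0 * np.pi * n_film * thickness_nm / wavelengths_nm
        return 0.5 + 0.5 * np.cos(phase)

    # Build the lookup table once (rows: thickness, columns: wavelength).
    lut = np.stack([model_spectrum(t) for t in thickness_grid_nm])

    def estimate_thickness(measured):
        """Return the LUT thickness whose spectrum best matches the measurement."""
        errors = np.sum((lut - measured) ** 2, axis=1)   # SSE against every entry
        return thickness_grid_nm[np.argmin(errors)]

    # Example: recover a known thickness from a noisy synthetic spectrum.
    truth = 312.5
    noisy = model_spectrum(truth) + np.random.normal(0, 0.01, wavelengths_nm.size)
    print(estimate_thickness(noisy))   # prints 312.5 (or a neighbouring grid value)
    ```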

  17. The Principles and Practice of Distributed High Throughput Computing

    CERN Document Server

    CERN. Geneva

    2016-01-01

    The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity was recognized and formalized in the 1970s. For more than three decades these principles of Distributed Computing have guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor is based on novel distributed computing technologies and is driven by the evolving needs of High Throughput scientific applications. We will review the principles that underpin our work, the distributed computing frameworks and technologies we developed, and the lessons we learned from delivering effective and dependable software tools in an ever-changing landscape of computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds.

  18. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels… characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of the high-throughput philosophy are emerging and are predicted to make a number of ion… The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  19. Electron crystallography and aquaporins.

    Science.gov (United States)

    Schenk, Andreas D; Hite, Richard K; Engel, Andreas; Fujiyoshi, Yoshinori; Walz, Thomas

    2010-01-01

    Electron crystallography of two-dimensional (2D) crystals can provide information on the structure of membrane proteins at near-atomic resolution. Originally developed and used to determine the structure of bacteriorhodopsin (bR), electron crystallography has recently been applied to elucidate the structure of aquaporins (AQPs), a family of membrane proteins that form pores mostly for water but also other solutes. While electron crystallography has made major contributions to our understanding of the structure and function of AQPs, structural studies on AQPs, in turn, have fostered a number of technical developments in electron crystallography. In this contribution, we summarize the insights electron crystallography has provided into the biology of AQPs, and describe technical advancements in electron crystallography that were driven by structural studies on AQP 2D crystals. In addition, we discuss some of the lessons that were learned from electron crystallographic work on AQPs.

  20. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Background: Preparation of large quantities of high-quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesions IN Genomes), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available; however, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results: We developed a high throughput DNA isolation method by combining a high-yield CTAB extraction method with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high-quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with four-fold lower consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion: A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample for magnetic beads, plus other consumables that other methods also require.

  1. High-throughput DNA droplet assays using picoliter reactor volumes.

    Science.gov (United States)

    Srisa-Art, Monpichar; deMello, Andrew J; Edel, Joshua B

    2007-09-01

    The online characterization and detection of individual droplets at high speeds, low analyte concentrations, and perfect detection efficiencies is a significant challenge underpinning the application of microfluidic droplet reactors to high-throughput chemistry and biology. Herein, we describe the integration of confocal fluorescence spectroscopy as a high-efficiency detection method for droplet-based microfluidics. Issues such as surface contamination, rapid mixing, and rapid detection, as well as low detection limits, are addressed with the described approach when compared to conventional laminar-flow fluidics. Using such a system, droplet size, droplet shape, droplet formation frequencies, and droplet compositions can be measured accurately and precisely at kilohertz frequencies. Taking advantage of this approach, we demonstrate a high-throughput biological assay based on fluorescence resonance energy transfer (FRET). By attaching a FRET donor (Alexa Fluor 488) to streptavidin and labeling a FRET acceptor (Alexa Fluor 647) on one DNA strand and biotin on the complementary strand, donor and acceptor molecules are brought into proximity through streptavidin-biotin binding, resulting in FRET. Fluorescence bursts of the donor and acceptor from each droplet can be monitored simultaneously using separate avalanche photodiode detectors operating in single-photon counting mode. Binding assays were investigated and compared between fixed streptavidin and DNA concentrations. Binding curves fit well to Hill-Waud models, and the binding ratio between streptavidin and biotin was found to agree with the number of biotin-binding sites on streptavidin. The FRET efficiency of this FRET pair was also investigated from the binding results; the efficiency data show that this detection system can precisely measure FRET even at low FRET efficiencies.
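
    As an illustration of the curve-fitting step mentioned above (a sketch only, not the authors' analysis code; the data points are synthetic), a Hill-type binding model can be fitted to droplet-averaged FRET readouts with scipy, and a simple proximity-ratio FRET efficiency computed from donor/acceptor burst intensities.

    ```python
    # Sketch: fit a Hill binding model to synthetic droplet FRET data and compute
    # a proximity-ratio style FRET efficiency from donor/acceptor intensities.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, bmax, kd, n):
        """Hill equation: fraction bound as a function of ligand concentration."""
        return bmax * conc**n / (kd**n + conc**n)

    # Synthetic titration (nM) and measured FRET response (arbitrary units).
    conc = np.array([0.5, 1, 2, 5, 10, 20, 50, 100], dtype=float)
    signal = hill(conc, bmax=1.0, kd=8.0, n=1.5) + np.random.normal(0, 0.02, conc.size)

    popt, _ = curve_fit(hill, conc, signal, p0=[1.0, 10.0, 1.0])
    print("Bmax=%.2f  Kd=%.1f nM  n=%.2f" % tuple(popt))

    def fret_efficiency(i_acceptor, i_donor):
        """Simple proximity ratio E = I_A / (I_A + I_D) per droplet burst."""
        return i_acceptor / (i_acceptor + i_donor)

    print(fret_efficiency(i_acceptor=3200.0, i_donor=1800.0))
    ```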

  2. High-Throughput Genomics Enhances Tomato Breeding Efficiency

    Science.gov (United States)

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-01-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper and eggplant, since it exhibits a relatively small genome (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are currently available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assists the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers tightly linked to resistance genes, as well as cloned resistance genes, are available and could be used for high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of the genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are exploited to track favourable allele combinations simultaneously in breeding programs using high-throughput genomic technologies. The aim is to pyramid, in the genetic background of commercial cultivars, alleles that increase their performance. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes, even for complex traits. PMID:19721805

  3. The macromolecular crystallography facility at the advanced light source

    Science.gov (United States)

    Earnest, Thomas; Padmore, Howard; Cork, Carl; Behrsing, Rolf; Kim, Sung-Hou

    1996-10-01

    Synchrotron radiation offers several advantages over rotating-anode sources for biological crystallography, allowing the collection of higher-resolution data, substantially more rapid data collection, phasing by multiwavelength anomalous diffraction (MAD) techniques, and time-resolved experiments using polychromatic radiation (Laue diffraction). The use of synchrotron radiation is often necessary to record useful data from crystals that diffract weakly or have very large unit cells. The high brightness and stability of the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory, together with the low emittance and the long straight sections that accommodate insertion devices in third-generation synchrotrons like the ALS, provide several advantages for macromolecular crystallography. We are presently constructing a macromolecular crystallography facility at the ALS that is optimized for user-friendliness and high-throughput data collection, with advanced capabilities for MAD and Laue experiments. The X-rays will be directed to three branchlines. A well-equipped support laboratory will be available for biochemistry, crystal mounting and sample storage; computer hardware and software, along with staff support, will allow complete processing of data on site.

  4. Atom probe crystallography

    National Research Council Canada - National Science Library

    Gault, Baptiste; Moody, Michael P; Cairney, Julie M; Ringer, Simon P

    2012-01-01

    This review addresses new developments in the emerging area of "atom probe crystallography", a materials characterization tool with the unique capacity to reveal both composition and crystallographic...

  5. Racemic DNA crystallography.

    Science.gov (United States)

    Mandal, Pradeep K; Collie, Gavin W; Kauffmann, Brice; Huc, Ivan

    2014-12-22

    Racemates increase the chances of crystallization by allowing molecular contacts to be formed in a greater number of ways. With the advent of protein synthesis, the production of protein racemates and racemic-protein crystallography are now possible. Curiously, racemic DNA crystallography had not been investigated despite the commercial availability of L- and D-deoxyribo-oligonucleotides. Here, we report a study into racemic DNA crystallography showing the strong propensity of racemic DNA mixtures to form racemic crystals. We describe racemic crystal structures of various DNA sequences and folded conformations, including duplexes, quadruplexes, and a four-way junction, showing that the advantages of racemic crystallography should extend to DNA.

  6. Present and future of membrane protein structure determination by electron crystallography.

    Science.gov (United States)

    Ubarretxena-Belandia, Iban; Stokes, David L

    2010-01-01

    Membrane proteins are critical to cell physiology, playing roles in signaling, trafficking, transport, adhesion, and recognition. Despite their relative abundance in the proteome and their prevalence as targets of therapeutic drugs, structural information about membrane proteins is in short supply. This chapter describes the use of electron crystallography as a tool for determining membrane protein structures. Electron crystallography offers distinct advantages relative to the alternatives of X-ray crystallography and NMR spectroscopy. Namely, membrane proteins are placed in their native membranous environment, which is likely to favor a native conformation and allow changes in conformation in response to physiological ligands. Nevertheless, there are significant logistical challenges in finding appropriate conditions for inducing membrane proteins to form two-dimensional arrays within the membrane and in using electron cryo-microscopy to collect the data required for structure determination. A number of developments are described for high-throughput screening of crystallization trials and for automated imaging of crystals with the electron microscope. These tools are critical for exploring the necessary range of factors governing the crystallization process. There have also been recent software developments to facilitate the process of structure determination. However, further innovations in the algorithms used for processing images and electron diffraction are necessary to improve throughput and to make electron crystallography truly viable as a method for determining atomic structures of membrane proteins.

  7. A high-throughput chemically induced inflammation assay in zebrafish

    Directory of Open Access Journals (Sweden)

    Liebel Urban

    2010-12-01

    Background: Studies on innate immunity have benefited from the introduction of zebrafish as a model system. Transgenic fish expressing fluorescent proteins in leukocyte populations allow direct, quantitative visualization of an inflammatory response in vivo. It has been proposed that this animal model can be used for high-throughput screens aimed at the identification of novel immunomodulatory lead compounds. However, current assays require invasive manipulation of fish individually, thus preventing high-content screening. Results: Here we show that specific, noninvasive damage to lateral line neuromast cells can induce a robust acute inflammatory response. Exposure of fish larvae to sublethal concentrations of copper sulfate selectively damages the sensory hair cell population, inducing infiltration of leukocytes to neuromasts within 20 minutes. Inflammation can be assayed in real time using transgenic fish expressing fluorescent proteins in leukocytes or by histochemical assays in fixed larvae. We demonstrate the usefulness of this method for chemical and genetic screens to detect the effect of immunomodulatory compounds and mutations affecting the leukocyte response. Moreover, we transformed the assay into a high-throughput screening method by using a customized automated imaging and processing system that quantifies the magnitude of the inflammatory reaction. Conclusions: This approach allows rapid screening of thousands of compounds or mutagenized zebrafish for effects on inflammation and enables the identification of novel players in the regulation of innate immunity and potential lead compounds toward new immunomodulatory therapies. We have called this method the chemically induced inflammation assay, or ChIn assay. See Commentary article: http://www.biomedcentral.com/1741-7007/8/148.

  8. Interpretation of mass spectrometry data for high-throughput proteomics.

    Science.gov (United States)

    Chamrad, Daniel C; Koerting, Gerhard; Gobom, Johan; Thiele, Herbert; Klose, Joachim; Meyer, Helmut E; Blueggel, Martin

    2003-08-01

    Recent developments in proteomics have revealed a bottleneck in bioinformatics: high-quality interpretation of acquired MS data. The ability to generate thousands of MS spectra per day, and the demand for this, makes manual methods inadequate for analysis and underlines the need to transfer the advanced capabilities of an expert human user into sophisticated MS interpretation algorithms. The identification rate in current high-throughput proteomics studies is not only a matter of instrumentation. We present software for high-throughput PMF identification, which enables robust and confident protein identification at higher rates. This has been achieved by automated calibration, peak rejection, and use of a meta search approach which employs various PMF search engines. The automatic calibration consists of a dynamic, spectral information-dependent algorithm, which combines various known calibration methods and iteratively establishes an optimised calibration. The peak rejection algorithm filters signals that are unrelated to the analysed protein by use of automatically generated and dataset-dependent exclusion lists. In the "meta search" several known PMF search engines are triggered and their results are merged by use of a meta score. The significance of the meta score was assessed by simulation of PMF identification with 10,000 artificial spectra resembling a data situation close to the measured dataset. By means of this simulation the meta score is linked to expectation values as a statistical measure. The presented software is part of the proteome database ProteinScape which links the information derived from MS data to other relevant proteomics data. We demonstrate the performance of the presented system with MS data from 1891 PMF spectra. As a result of automatic calibration and peak rejection the identification rate increased from 6% to 44%.
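
    The abstract does not give the form of the meta score, so the following is only a schematic sketch of the general idea: merge results from several PMF search engines by normalizing each engine's score and summing per candidate protein. Engine names, weights and scores here are invented.

    ```python
    # Schematic meta-search sketch: combine normalized scores from several PMF
    # search engines into a single ranking (engines, weights and scores invented).
    from collections import defaultdict

    engine_results = {
        "engine_A": {"P12345": 85.0, "P67890": 40.0, "Q11111": 20.0},
        "engine_B": {"P12345": 60.0, "Q22222": 55.0},
        "engine_C": {"P12345": 0.9, "P67890": 0.7},   # different score scale
    }
    engine_weights = {"engine_A": 1.0, "engine_B": 1.0, "engine_C": 1.0}

    meta_scores = defaultdict(float)
    for engine, hits in engine_results.items():
        top = max(hits.values())                  # normalize to the engine's best hit
        for protein, score in hits.items():
            meta_scores[protein] += engine_weights[engine] * score / top

    ranking = sorted(meta_scores.items(), key=lambda kv: kv[1], reverse=True)
    print(ranking)   # proteins found consistently across engines rise to the top
    ```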

  9. Surrogate-assisted feature extraction for high-throughput phenotyping.

    Science.gov (United States)

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision (ICD-9) codes and natural language processing (NLP) counts, acting as noisy surrogates for the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature-extraction procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were markedly higher than those from the other two approaches, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the number of gold-standard labels needed. SAFE also potentially identifies important features missed by automated feature extraction or by experts.
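
    A highly simplified sketch of the surrogate-based selection step (not the published SAFE implementation; the data, thresholds and feature counts below are invented): use ICD-9 and NLP mention counts to create silver-standard labels for the extremes of the cohort, then keep the candidate features most associated with those labels.

    ```python
    # Toy sketch of surrogate-assisted feature selection: ICD-9/NLP counts act as
    # noisy surrogates, extreme counts define silver-standard labels, and features
    # correlated with those labels are retained. All data here are simulated.
    import numpy as np

    rng = np.random.default_rng(0)
    n_patients, n_features = 5000, 200

    surrogate = rng.poisson(2.0, n_patients)            # ICD-9 + NLP mention count
    features = rng.poisson(1.0, (n_patients, n_features)).astype(float)
    # make the first 10 features genuinely related to the surrogate
    features[:, :10] += surrogate[:, None] * 0.5

    # Silver-standard labels from the surrogate extremes (thresholds are arbitrary).
    silver_pos = surrogate >= np.quantile(surrogate, 0.9)
    silver_neg = surrogate == 0
    mask = silver_pos | silver_neg
    y_silver = silver_pos[mask].astype(float)
    X = features[mask]

    # Rank features by absolute correlation with the silver labels.
    corr = np.array([np.corrcoef(X[:, j], y_silver)[0, 1] for j in range(n_features)])
    selected = np.argsort(-np.abs(corr))[:20]
    print("selected feature indices:", selected)
    ```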

  10. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data-processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison; these can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
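
    As a small illustration of the batch/pipelining idea described above (a sketch only; the trait names and the evaluation function are placeholders, and a real HTC deployment would submit these jobs to a cluster scheduler such as HTCondor rather than to local processes):

    ```python
    # Sketch: evaluate several traits in parallel instead of sequentially.
    # On a cluster, each call would typically become an independent batch job.
    from concurrent.futures import ProcessPoolExecutor
    import time

    def run_genomic_evaluation(trait):
        """Placeholder for a computationally heavy single-trait model fit."""
        time.sleep(1.0)                    # stands in for hours of real computation
        return trait, f"breeding values for {trait} ready"

    traits = ["milk_yield", "fertility", "somatic_cell_score", "longevity"]

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=4) as pool:
            for trait, message in pool.map(run_genomic_evaluation, traits):
                print(trait, "->", message)
    ```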

  11. Microfluidic cell chips for high-throughput drug screening.

    Science.gov (United States)

    Chi, Chun-Wei; Ahmed, Ah Rezwanuddin; Dereli-Korkut, Zeynep; Wang, Sihong

    2016-05-01

    The current state of screening methods for drug discovery is still riddled with several inefficiencies. Although some widely used high-throughput screening platforms may enhance the drug screening process, their cost and oversimplification of cell-drug interactions pose a translational difficulty. Microfluidic cell-chips resolve many issues found in conventional HTS technology, providing benefits such as reduced sample quantity and integration of 3D cell culture physically more representative of the physiological/pathological microenvironment. In this review, we introduce the advantages of microfluidic devices in drug screening, and outline the critical factors which influence device design, highlighting recent innovations and advances in the field including a summary of commercialization efforts on microfluidic cell chips. Future perspectives of microfluidic cell devices are also provided based on considerations of present technological limitations and translational barriers.

  12. High-throughput Identification of Phage-derived Imaging Agents

    Directory of Open Access Journals (Sweden)

    Kimberly A. Kelly

    2006-01-01

    The use of phage-displayed peptide libraries is a powerful method for selecting peptides with desired binding properties. However, the validation and prioritization of “hits” obtained from this screening approach remains challenging. Here, we describe the development and testing of a new analysis method to identify and display hits from phage-display experiments and high-throughput enzyme-linked immunosorbent assay screens. We test the method using a phage screen against activated macrophages to develop imaging agents with higher specificity for active disease processes. The new methodology should be useful in identifying phage hits and is extendable to other library screening methods such as small-molecule and nanoparticle libraries.

  13. High-throughput ab-initio dilute solute diffusion database

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
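
    For reference, the activation barriers compiled in such a database enter the standard Arrhenius form of the dilute-solute diffusivity (shown here as general background rather than the paper's specific multi-frequency model):

    ```latex
    D(T) \;=\; D_{0}\,\exp\!\left(-\frac{Q}{k_{\mathrm{B}}T}\right)
    ```

    where D₀ is the pre-exponential factor and Q the activation barrier (expressed in eV, as for the 0.176 eV RMS error quoted above).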

  14. UAV-based high-throughput phenotyping in legume crops

    Science.gov (United States)

    Sankaran, Sindhuja; Khot, Lav R.; Quirós, Juan; Vandemark, George J.; McGee, Rebecca J.

    2016-05-01

    In plant breeding, one of the biggest obstacles in genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. Therefore, the major objective of this research was to evaluate the feasibility of utilizing high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices are strongly correlated (p<0.05) with seed yield of legume crops. The results endorse the potential of UAS-based sensing technology to rapidly measure these phenotyping traits.
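
    The abstract does not state which vegetation indices were used, so the sketch below simply illustrates one common choice, NDVI, computed from co-registered multispectral bands and correlated with plot-level yield; the band values and yields are synthetic placeholders.

    ```python
    # Sketch: compute per-plot NDVI from UAV multispectral imagery and correlate
    # with seed yield. Band reflectances and yields below are synthetic.
    import numpy as np
    from scipy.stats import pearsonr

    n_plots = 40
    rng = np.random.default_rng(1)

    # Mean reflectance per breeding plot in the red and near-infrared bands.
    red = rng.uniform(0.05, 0.20, n_plots)
    nir = rng.uniform(0.30, 0.60, n_plots)

    ndvi = (nir - red) / (nir + red)                  # NDVI per plot

    # Synthetic yields loosely tied to NDVI to mimic the reported correlation.
    yield_kg_ha = 800 + 2000 * ndvi + rng.normal(0, 100, n_plots)

    r, p = pearsonr(ndvi, yield_kg_ha)
    print(f"Pearson r = {r:.2f}, p = {p:.3g}")        # expect p < 0.05 here
    ```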

  15. High-throughput profiling in the hematopoietic system.

    Science.gov (United States)

    Fabbri, Muller; Spizzo, Riccardo; Calin, George A

    2010-01-01

    The expression profile of microRNAs varies significantly in physiological and pathological conditions. Increasing evidence from the literature shows that abnormalities of the miRNome (defined as the full spectrum of miRNAs expressed in a genome) occur in almost all human diseases and have important pathogenetic, prognostic, and therapeutic implications. The study of miRNome aberrancies has become possible through the development of high-throughput profiling techniques that allow the simultaneous detection of differences in miRNA expression between normal and pathological tissues, or simply between tissues at different stages of differentiation. These techniques provide the basis for further investigations focused on the miRNAs that are most frequently and widely differentially expressed under the different investigated conditions.

  16. Interactive Visual Analysis of High Throughput Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL]; Potok, Thomas E [ORNL]; Patton, Robert M [ORNL]; Goodall, John R [ORNL]; Maness, Christopher S [ORNL]; Senter, James K [ORNL]

    2012-01-01

    The scale, velocity, and dynamic nature of large-scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large-scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest, which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  17. High-Throughput Screening Using Fourier-Transform Infrared Imaging

    Directory of Open Access Journals (Sweden)

    Erdem Sasmaz

    2015-06-01

    Efficient parallel screening of combinatorial libraries is one of the most challenging aspects of the high-throughput (HT) heterogeneous catalysis workflow. Today, a number of methods are used in HT catalyst studies, including various optical, mass-spectrometry, and gas-chromatography techniques. Of these, rapid-scanning Fourier-transform infrared (FTIR) imaging is one of the fastest and most versatile screening techniques. Here, the new design of a 16-channel HT reactor is presented and test results for its accuracy and reproducibility are shown. The performance of the system was evaluated through the oxidation of CO over commercial Pd/Al2O3 and cobalt oxide nanoparticles synthesized with different reducer-reductant molar ratios, surfactant types, metal and surfactant concentrations, synthesis temperatures, and ramp rates.

  18. High-throughput sequencing in veterinary infection biology and diagnostics.

    Science.gov (United States)

    Belák, S; Karlsson, O E; Leijon, M; Granberg, F

    2013-12-01

    Sequencing methods have improved rapidly since the first versions of the Sanger techniques, facilitating the development of very powerful tools for detecting and identifying various pathogens, such as viruses, bacteria and other microbes. The ongoing development of high-throughput sequencing (HTS; also known as next-generation sequencing) technologies has resulted in a dramatic reduction in DNA sequencing costs, making the technology more accessible to the average laboratory. In this White Paper of the World Organisation for Animal Health (OIE) Collaborating Centre for the Biotechnology-based Diagnosis of Infectious Diseases in Veterinary Medicine (Uppsala, Sweden), several approaches and examples of HTS are summarised, and their diagnostic applicability is briefly discussed. Selected future aspects of HTS are outlined, including the need for bioinformatic resources, with a focus on improving the diagnosis and control of infectious diseases in veterinary medicine.

  19. Applications of High-Throughput Nucleotide Sequencing (PhD)

    DEFF Research Database (Denmark)

    Waage, Johannes

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come… -sequencing, a study of the effects on alternative RNA splicing of knockout (KO) of the nonsense-mediated RNA decay system in Mus, using digital gene expression and a custom-built exon-exon junction mapping pipeline, is presented (article I). Evolved from this work, a Bioconductor package, spliceR, for classifying alternative… splicing events and the coding potential of isoforms from full-isoform deconvolution software, such as Cufflinks (article II), is presented. Finally, a study using 5'-end RNA-seq for alternative promoter detection between healthy patients and patients with acute promyelocytic leukemia is presented (article III…

  20. High throughput sequencing reveals a novel fabavirus infecting sweet cherry.

    Science.gov (United States)

    Villamor, D E V; Pillai, S S; Eastwell, K C

    2017-03-01

    The genus Fabavirus currently consists of five species represented by viruses that infect a wide range of hosts, none of which has been reported from temperate-climate fruit trees. A virus with genomic features resembling fabaviruses (tentatively named Prunus virus F, PrVF) was revealed by high throughput sequencing of extracts from a sweet cherry tree (Prunus avium). PrVF was subsequently shown to be graft-transmissible and was further identified in three other non-symptomatic Prunus spp. from different geographical locations. Two genetic variants each of RNA1 and RNA2 coexisted in the same samples; the RNA1 variants consisted of 6,165 and 6,163 nucleotides, and the RNA2 variants of 3,622 and 3,468 nucleotides.

  1. EDITORIAL: Combinatorial and High-Throughput Materials Research

    Science.gov (United States)

    Potyrailo, Radislav A.; Takeuchi, Ichiro

    2005-01-01

    The success of combinatorial and high-throughput methodologies relies greatly on the availability of various characterization tools with new and improved capabilities [1]. Indeed, how useful can a combinatorial library of 250, 400, 25 000 or 2 000 000 compounds be [2-5] if one is unable to characterize its properties of interest fairly quickly? How useful can a set of thousands of spectra or chromatograms be if one is unable to analyse them in a timely manner? For these reasons, the development of new approaches for materials characterization is one of the most active areas in combinatorial materials science. The importance of this aspect of research in the field has been discussed in numerous conferences including the Pittsburgh Conferences, the American Chemical Society Meetings, the American Physical Society Meetings, the Materials Research Society Symposia and various Gordon Research Conferences. Naturally, the development of new measurement instrumentation attracts the attention not only of practitioners of combinatorial materials science but also of those who design new software for data manipulation and mining. Experimental designs of combinatorial libraries are pursued with available and realistic synthetic and characterization capabilities in mind. It is becoming increasingly critical to link the design of new equipment for high-throughput parallel materials synthesis with integrated measurement tools in order to enhance the efficacy of the overall experimental strategy. We have received an overwhelming response to our proposal and call for papers for this Special Issue on Combinatorial Materials Science. The papers in this issue of Measurement Science and Technology are a very timely collection that captures the state of modern combinatorial materials science. They demonstrate the significant advances that are taking place in the field. In some cases, characterization tools are now being operated in the factory mode. At the same time, major challenges

  2. High-throughput DNA extraction of forensic adhesive tapes.

    Science.gov (United States)

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has, since its introduction in the early 2000s, become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward, while the subsequent DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct-lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct-lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit for purpose in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct-lysis protocol also halved the amount of manual labour and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples.

  3. High throughput phenotyping for aphid resistance in large plant collections

    Directory of Open Access Journals (Sweden)

    Chen Xi

    2012-08-01

    Background: Phloem-feeding insects are among the most devastating pests worldwide. They not only cause damage by feeding from the phloem, thereby depleting the plant of photo-assimilates, but also by vectoring viruses. Until now, the main way to prevent such problems has been the frequent use of insecticides. Applying resistant varieties would be a more environmentally friendly and sustainable solution, but resistant sources first need to be identified. Up to now there were no methods suitable for high throughput phenotyping of plant germplasm to identify sources of resistance towards phloem-feeding insects. Results: In this paper we present a high throughput screening system to identify plants with increased resistance against aphids. Its versatility is demonstrated using an Arabidopsis thaliana activation tag mutant line collection. The system consists of the green peach aphid Myzus persicae (Sulzer) and the circulative virus Turnip yellows virus (TuYV). In an initial screening, with one plant representing one mutant line, 13 virus-free mutant lines were identified by ELISA. Using seeds produced from these lines, the putative candidates were re-evaluated and characterized, resulting in nine lines with increased resistance towards the aphid. Conclusions: This M. persicae-TuYV screening system is an efficient, reliable and quick procedure to identify, among thousands of mutated lines, those resistant to aphids. In our study, nine mutant lines with increased resistance against the aphid were selected from 5,160 mutant lines in just 5 months by one person. The system can be extended to other phloem-feeding insects and circulative viruses to identify insect-resistant sources from various collections, including genebanks and artificially prepared mutant collections.

  4. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2016-10-14

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole-embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of the biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic-lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest.

  5. Emerging metrology for high-throughput nanomaterial genotoxicology.

    Science.gov (United States)

    Nelson, Bryant C; Wright, Christa W; Ibuki, Yuko; Moreno-Villanueva, Maria; Karlsson, Hanna L; Hendriks, Giel; Sims, Christopher M; Singh, Neenu; Doak, Shareen H

    2017-01-01

    The rapid development of the engineered nanomaterial (ENM) manufacturing industry has accelerated the incorporation of ENMs into a wide variety of consumer products across the globe. Unintentionally or not, some of these ENMs may be introduced into the environment or come into contact with humans or other organisms, resulting in unexpected biological effects. It is thus prudent to have rapid and robust analytical metrology in place that can be used to critically assess and/or predict the cytotoxicity, as well as the potential genotoxicity, of these ENMs. Many of the traditional genotoxicity test methods [e.g. the unscheduled DNA synthesis assay, the bacterial reverse mutation (Ames) test, etc.] for determining the DNA-damaging potential of chemical and biological compounds are not suitable for the evaluation of ENMs, due to a variety of methodological issues ranging from potential assay interferences to problems centered on low sample throughput. Recently, a number of sensitive, high-throughput genotoxicity assays/platforms (CometChip assay, flow cytometry/micronucleus assay, flow cytometry/γ-H2AX assay, automated 'Fluorimetric Detection of Alkaline DNA Unwinding' (FADU) assay, ToxTracker reporter assay) have been developed, based on substantial modifications and enhancements of traditional genotoxicity assays. These new assays have been used for the rapid measurement of DNA damage (strand breaks) and chromosomal damage (micronuclei), and for detecting upregulated DNA-damage signalling pathways resulting from ENM exposures. In this critical review, we describe and discuss the fundamental measurement principles and measurement endpoints of these new assays, as well as the modes of operation, analytical metrics and potential interferences, as applicable to ENM exposures. An unbiased discussion of the major technical advantages and limitations of each assay for evaluating and predicting the genotoxic potential of ENMs is also provided.

  6. High throughput optoelectronic smart pixel systems using diffractive optics

    Science.gov (United States)

    Chen, Chih-Hao

    1999-12-01

    Recent developments in digital video, multimedia technology and data networks have greatly increased the demand for high-bandwidth communication channels and high-throughput data processing. Electronics is particularly suited for switching, amplification and logic functions, while optics is more suitable for interconnections and communications with lower energy and crosstalk. In this research, we present the design, testing, integration and demonstration of several optoelectronic smart pixel devices and system architectures. These systems integrate electronic switching/processing capability with parallel optical interconnections to provide high throughput network communication and pipeline data processing. The Smart Pixel Array Cellular Logic processor (SPARCL) is designed in 0.8 μm CMOS and hybrid-integrated with Multiple-Quantum-Well (MQW) devices for pipeline image processing. The Smart Pixel Network Interface (SAPIENT) is designed in 0.6 μm GaAs and monolithically integrated with LEDs to implement a highly parallel optical interconnection network. The Translucent Smart Pixel Array (TRANSPAR) design is implemented in two different versions. The first version, TRANSPAR-MQW, is designed in 0.5 μm CMOS and flip-chip integrated with MQW devices to provide 2-D pipeline processing and translucent networking using the Carrier-Sense-Multiple-Access/Collision-Detection (CSMA/CD) protocol. The other version, TRANSPAR-VM, is designed in 1.2 μm CMOS and discretely integrated with VCSEL-MSM (Vertical-Cavity-Surface-Emitting-Laser and Metal-Semiconductor-Metal detector) chips and driver/receiver chips on a printed circuit board. The TRANSPAR-VM provides the option of using the token ring network protocol in addition to the embedded functions of TRANSPAR-MQW. These optoelectronic smart pixel systems also require micro-optics devices to provide high-resolution, high-quality optical interconnections and external source arrays. In this research, we describe an innovative

  7. Designing, Teaching, and Evaluating a Unit on Symmetry and Crystallography in the High School Classroom

    Science.gov (United States)

    Grove, Nathaniel P.; Collins, David J.; Lopez, Joseph J.; Bretz, Stacey Lowery; Zhou, Hong-Cai; Guerin, Nathan P.

    2009-01-01

    An innovative teaching and research partnership was developed in collaboration with public high school chemistry teachers from the Talawanda (Ohio) School District and faculty, staff, and students from Miami University. With the involvement of high school teachers, chemistry faculty, postdoctoral associates, and several graduate and undergraduate…

  8. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target the HPC community exclusively and are not well suited to DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires a non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow distributed applications to be built with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...
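
    As a concrete illustration of the high-level messaging style the abstract contrasts with low-level Verbs programming, the following is a minimal ZeroMQ (0MQ) push/pull sketch in Python (pyzmq). It is a generic TCP/IP example, not the NetIO API; the endpoint, message count and payload size are arbitrary assumptions.

```python
# Minimal sketch of the high-level, socket-like messaging style (0MQ) mentioned
# above; illustrative only -- not the NetIO API. Requires pyzmq.
import threading
import zmq

ENDPOINT = "tcp://127.0.0.1:5555"   # hypothetical local endpoint

def producer(n_messages=1000, payload_size=4096):
    """Push fixed-size binary 'event fragments' to a collector."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PUSH)
    sock.connect(ENDPOINT)
    payload = b"\x00" * payload_size
    for _ in range(n_messages):
        sock.send(payload)          # queued and delivered asynchronously by 0MQ
    sock.send(b"")                  # empty message signals end of stream
    sock.close()

def collector():
    """Pull messages until the empty end-of-stream marker arrives."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PULL)
    sock.bind(ENDPOINT)
    received = 0
    while True:
        msg = sock.recv()
        if not msg:
            break
        received += 1
    print(f"collector received {received} messages")
    sock.close()

if __name__ == "__main__":
    t = threading.Thread(target=collector)
    t.start()
    producer()
    t.join()
```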

  9. High throughput jet singlet oxygen generator for multi kilowatt SCOIL

    Science.gov (United States)

    Rajesh, R.; Singhal, Gaurav; Mainuddin; Tyagi, R. K.; Dawar, A. L.

    2010-06-01

    A jet flow singlet oxygen generator (JSOG) capable of handling chlorine flows of nearly 1.5 mol s−1 has been designed, developed, and tested. The generator is designed in a modular configuration taking into consideration the practical aspects of handling high-throughput flows without catastrophic BHP carry-over. While a cross-flow configuration has been reported for such high flow rates, the generator utilized in the present study is a counter-flow configuration. A near-vertical extraction of singlet oxygen is effected at the generator exit, followed by a 90° rotation of the flow, forming a novel verti-horizontal COIL scheme. This allows the COIL to be operated with a vertical-extraction SOG followed by the horizontal arrangement of subsequent COIL systems such as the supersonic nozzle, cavity, supersonic diffuser, etc. This enables a more uniform weight distribution from the point of view of mobile and other platform-mounted systems, which is highly relevant for large-scale systems. The present study discusses the design aspects of the jet singlet oxygen generator along with its test results for various operating ranges. Typically, for the intended design flow rates, the chlorine utilization and singlet oxygen yield have been observed to be ˜94% and ˜64%, respectively.

  10. High-Throughput Screening Using Mass Spectrometry within Drug Discovery.

    Science.gov (United States)

    Rohman, Mattias; Wingfield, Jonathan

    2016-01-01

    In order to detect a biochemical analyte with a mass spectrometer (MS) it is necessary to ionize the analyte of interest. The analyte can be ionized by a number of different mechanisms; one common method is electrospray ionization (ESI). Droplets of analyte are sprayed through a highly charged field, the droplets pick up charge, and this charge is transferred to the analyte. High levels of salt in the assay buffer will potentially steal charge from the analyte and suppress the MS signal. In order to avoid this suppression of signal, salt is often removed from the sample prior to injection into the MS. Traditional ESI MS relies on liquid chromatography (LC) to remove the salt and reduce matrix effects; however, this is a lengthy process. Here we describe the use of RapidFire™ coupled to a triple-quadrupole MS for high-throughput screening. This system uses solid-phase extraction to de-salt samples prior to injection, reducing processing time such that a sample is injected into the MS approximately every 10 s.

  11. Future directions of electron crystallography.

    Science.gov (United States)

    Fujiyoshi, Yoshinori

    2013-01-01

    In biological science, there are still many interesting and fundamental yet difficult questions, such as those in neuroscience, remaining to be answered. Structural and functional studies of membrane proteins, which are key molecules of signal transduction in neural and other cells, are essential for understanding the molecular mechanisms of many fundamental biological processes. Technological and instrumental advancements in electron microscopy have facilitated structural studies of biological components such as membrane proteins. While X-ray crystallography has been the main method of structure analysis of proteins, including membrane proteins, electron crystallography is now an established technique for analyzing the structures of membrane proteins in the lipid bilayer, which is close to their natural biological environment. By utilizing cryo-electron microscopes with helium-cooled specimen stages, structures of membrane proteins have been analyzed at resolutions better than 3 Å. Such high-resolution structural analysis of membrane proteins by electron crystallography opens up the new research field of structural physiology. Considering the fact that the structures of integral membrane proteins in their native membrane environment, without artifacts from crystal contacts, are critical for understanding their physiological functions, electron crystallography will continue to be an important technology for structural analysis. In this chapter, I will present several examples to highlight important advantages and to suggest future directions of this technique.

  12. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  13. High-pressure protein crystallography of hen egg-white lysozyme

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Hiroyuki; Nagae, Takayuki [Nagoya University, Chikusa, Nagoya, Aichi 464-8603 (Japan); Watanabe, Nobuhisa, E-mail: nobuhisa@nagoya-u.jp [Nagoya University, Chikusa, Nagoya, Aichi 464-8603 (Japan); Nagoya University, Chikusa, Nagoya, Aichi 464-8603 (Japan)

    2015-04-01

    The crystal structure of hen egg-white lysozyme (HEWL) was analyzed under pressures of up to 950 MPa. The high pressure modified the conformation of the molecule and induced a novel phase transition in the tetragonal crystal of HEWL. Crystal structures of hen egg-white lysozyme (HEWL) determined under pressures ranging from ambient pressure to 950 MPa are presented. From 0.1 to 710 MPa, the molecular and internal cavity volumes are monotonically compressed. However, from 710 to 890 MPa the internal cavity volume remains almost constant. Moreover, as the pressure increases to 950 MPa, the tetragonal crystal of HEWL undergoes a phase transition from P4₃2₁2 to P4₃. Under high pressure, the crystal structure of the enzyme undergoes several local and global changes accompanied by changes in hydration structure. For example, water molecules penetrate into an internal cavity neighbouring the active site and induce an alternate conformation of one of the catalytic residues, Glu35. These phenomena have not been detected by conventional X-ray crystal structure analysis and might play an important role in the catalytic activity of HEWL.

  14. PRISM: a data management system for high-throughput proteomics.

    Science.gov (United States)

    Kiebel, Gary R; Auberry, Ken J; Jaitly, Navdeep; Clark, David A; Monroe, Matthew E; Peterson, Elena S; Tolić, Nikola; Anderson, Gordon A; Smith, Richard D

    2006-03-01

    Advanced proteomic research efforts involving areas such as systems biology or biomarker discovery are enabled by the use of high level informatics tools that allow the effective analysis of large quantities of differing types of data originating from various studies. Performing such analyses on a large scale is not feasible without a computational platform that performs data processing and management tasks. Such a platform must be able to provide high-throughput operation while having sufficient flexibility to accommodate evolving data analysis tools and methodologies. The Proteomics Research Information Storage and Management system (PRISM) provides a platform that serves the needs of the accurate mass and time tag approach developed at Pacific Northwest National Laboratory. PRISM incorporates a diverse set of analysis tools and allows a wide range of operations to be incorporated by using a state machine that is accessible to independent, distributed computational nodes. The system has scaled well as data volume has increased over several years, while allowing adaptability for incorporating new and improved data analysis tools for more effective proteomics research.

  15. Hydrodynamic Cell Trapping for High Throughput Single-Cell Applications

    Directory of Open Access Journals (Sweden)

    Amin Abbaszadeh Banaeiyan

    2013-12-01

    Full Text Available The possibility to conduct complete cell assays under a precisely controlled environment while consuming minor amounts of chemicals and precious drugs has made microfluidics an interesting candidate for quantitative single-cell studies. Here, we present an application-specific microfluidic device, cellcomb, capable of conducting high-throughput single-cell experiments. The system employs pure hydrodynamic forces for easy cell trapping and is readily fabricated in polydimethylsiloxane (PDMS) using soft lithography techniques. The cell-trapping array consists of V-shaped pockets designed to accommodate up to six Saccharomyces cerevisiae (yeast) cells with an average diameter of 4 μm. We used this platform to monitor the impact of flow rate modulation on arsenite (As(III)) uptake in yeast. Redistribution of a green fluorescent protein (GFP)-tagged version of the heat shock protein Hsp104 was followed over time as the read-out. Results showed a clear inverse correlation between arsenite uptake and the three different flow rates tested (low = 25 nL min−1, moderate = 50 nL min−1, and high = 100 nL min−1). We consider the presented device the first building block of a future integrated application-specific cell-trapping array that can be used to conduct complete single-cell experiments on different cell types.

  16. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Barth

    2008-02-06

    AVA Solar has developed a very low cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct and test a production prototype system, fabricate PV modules and test the module performance. The original module manufacturing costs in the proposal were estimated at $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated and installed. Using this automated, high throughput system, high efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained two rounds of private equity funding, expanded to 50 people and initiated the development of a large-scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry-leading cost, which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The current market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40-$60 Billion by 2012. Currently, a crystalline silicon raw material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  17. PRISM: A Data Management System for High-Throughput Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Kiebel, Gary R.; Auberry, Kenneth J.; Jaitly, Navdeep; Clark, Dave; Monroe, Matthew E.; Peterson, Elena S.; Tolic, Nikola; Anderson, Gordon A.; Smith, Richard D.

    2006-03-01

    Advanced proteomic research efforts involving areas such as systems biology or biomarker discovery are enabled by the use of high level informatics tools that allow the effective analysis of large quantities of differing types of data originating from various studies. Performing such analyses on a large scale is not feasible without a computational platform that performs data processing and management tasks. Such a platform must be able to provide high-throughput operation while having sufficient flexibility to accommodate evolving data analysis tools and methodologies. The Proteomics Research Information Storage and Management System (PRISM) provides a platform that serves the needs of the accurate mass and time tag approach developed at PNNL. PRISM incorporates a diverse set of analysis tools and allows a wide range of operations to be incorporated by using a state machine that is accessible to independent, distributed computational nodes. The system has scaled well as data volume has increased over several years, while allowing adaptability for incorporating new and improved data analysis tools for more effective proteomics research.

  18. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    Science.gov (United States)

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-12-01

    The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti-T. cruzi drug entities in the near future, are reviewed here.

  19. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    Directory of Open Access Journals (Sweden)

    Julio Alonso-Padilla

    2014-12-01

    Full Text Available The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti-T. cruzi drug entities in the near future, are reviewed here.

  20. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  2. Automatic loop centring with a high-precision goniometer head at the SLS macromolecular crystallography beamlines

    Energy Technology Data Exchange (ETDEWEB)

    Pauluhn, Anuschka, E-mail: anuschka.pauluhn@psi.ch; Pradervand, Claude; Rossetti, Daniel; Salathe, Marco; Schulze-Briese, Clemens [Paul Scherrer Institut, CH-5232 Villigen (Switzerland)

    2011-07-01

    An automated loop-centring program and a high-precision goniometer head used at the Swiss Light Source are described. Automatic loop centring has been developed as part of the automation process in crystallographic data collection at the Swiss Light Source. The procedure described here consists of an optional set-up part, in which the background images are taken, and the actual centring part. The algorithm uses boundary and centre-of-mass detection at two different microscope image magnifications. Micromounts can be handled as well. Centring of the loops can be achieved in 15–26 s, depending on their initial position, and is as fast as manual centring. The alignment of the sample is carried out by means of a new flexural-hinge-based compact goniometer head. The device features an electromagnet for robotic wet mounting of samples. The circle of confusion was measured to be smaller than 1 µm (r.m.s.); its bidirectional backlash is below 2 µm.
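
    The centring algorithm is summarised as boundary and centre-of-mass detection at two magnifications; a minimal sketch of the centre-of-mass step on a background-subtracted microscope image might look as follows (NumPy only, with a synthetic image and an assumed threshold, not the beamline software itself).

```python
# Sketch of centre-of-mass detection on a background-subtracted microscope
# image, as used conceptually in automated loop centring. Illustrative only.
import numpy as np

def loop_centre_of_mass(image, background, threshold=25):
    """Return (row, col) centre of mass of pixels that differ from background."""
    diff = np.abs(image.astype(float) - background.astype(float))
    mask = diff > threshold                     # pixels belonging to the loop
    if not mask.any():
        raise ValueError("no object found above threshold")
    rows, cols = np.nonzero(mask)
    weights = diff[mask]
    # intensity-weighted centroid of the segmented object
    return (np.average(rows, weights=weights),
            np.average(cols, weights=weights))

if __name__ == "__main__":
    # synthetic example: dark background with a bright blob near (120, 200)
    rng = np.random.default_rng(0)
    background = rng.normal(10, 2, (480, 640))
    image = background.copy()
    image[100:140, 180:220] += 100.0
    print(loop_centre_of_mass(image, background))
```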

  3. Application of an active attachment model as a high-throughput demineralization biofilm model

    NARCIS (Netherlands)

    Silva, T.C.; Pereira, A.F.F.; Exterkate, R.A.M.; Bagnato, V.S.; Buzalaf, M.A.R.; de A.M. Machado, M.A.; ten Cate, J.M.; Crielaard, W.; Deng, D.M.

    2012-01-01

    Objectives To investigate the potential of an active attachment biofilm model as a high-throughput demineralization biofilm model for the evaluation of caries-preventive agents. Methods Streptococcus mutans UA159 biofilms were grown on bovine dentine discs in a high-throughput active attachment mode

  4. Recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers

    DEFF Research Database (Denmark)

    Andersen, Lars Dyrskjøt; Zieger, Karsten; Ørntoft, Torben Falck

    2007-01-01

    individually contributed to the management of the disease. However, the development of high-throughput techniques for simultaneous assessment of a large number of markers has allowed classification of tumors into clinically relevant molecular subgroups beyond those possible by pathological classification. Here......, we review the recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers....

  5. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  6. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  7. High throughput comet assay to study genotoxicity of nanomaterials

    Directory of Open Access Journals (Sweden)

    Naouale El Yamani

    2015-06-01

    Full Text Available The unique physicochemical properties of engineered nanomaterials (NMs) have accelerated their use in diverse industrial and domestic products. Although their presence in consumer products represents a major concern for public health safety, their potential impact on human health is poorly understood. There is therefore an urgent need to clarify the toxic effects of NMs and to elucidate the mechanisms involved. In view of the large number of NMs currently being used, high throughput (HTP) screening technologies are clearly needed for efficient assessment of toxicity. The comet assay is the most used method in nanogenotoxicity studies and has great potential for increasing throughput as it is fast, versatile and robust; simple technical modifications of the assay make it possible to test many compounds (NMs) in a single experiment. The standard gel of 70-100 μL contains thousands of cells, of which only a tiny fraction are actually scored. Reducing the gel to a volume of 5 μL, with just a few hundred cells, allows twelve gels to be set on a standard slide, or 96 as a standard 8x12 array. For the 12 gel format, standard slides precoated with agarose are placed on a metal template and gels are set on the positions marked on the template. The HTP comet assay, incorporating digestion of DNA with formamidopyrimidine DNA glycosylase (FPG) to detect oxidised purines, has recently been applied to study the potential induction of genotoxicity by NMs via reactive oxygen. In the NanoTEST project we investigated the genotoxic potential of several well-characterized metal and polymeric nanoparticles with the comet assay. All in vitro studies were harmonized; i.e. NMs were from the same batch, and identical dispersion protocols, exposure time, concentration range, culture conditions, and time-courses were used. As a kidney model, Cos-1 fibroblast-like kidney cells were treated with different concentrations of iron oxide NMs, and cells embedded in minigels (12

  8. High-throughput microfluidic line scan imaging for cytological characterization

    Science.gov (United States)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult: motion blur and data loss during frame readout cause discontinuities in data acquisition as cells move at relatively high speeds through the chamber. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidics chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X oil immersion, 1.4 NA objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidics chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell, which may be subsequently mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable to cell counting by flow cytometry, provided efficient examination including counting, classification, and differentiation of saliva, blood, and cultured human cancer cells.
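
    The statement that line rate and fluid velocity must be tightly matched can be made concrete with a simple relation: distortion-free sampling requires the sample to advance by exactly one object-plane pixel per line period. The sketch below illustrates this; the sensor pixel pitch and candidate flow speeds are assumptions for illustration, not values from the paper.

```python
# Matching line rate to flow velocity so that each line advances the sample by
# exactly one object-plane pixel (no stretch/compression). Numbers are
# illustrative assumptions, not values from the paper.

def required_line_rate_hz(flow_speed_um_s, pixel_pitch_um, magnification):
    """Line rate (Hz) at which one line period corresponds to one object pixel."""
    object_pixel_um = pixel_pitch_um / magnification   # pixel size at the sample
    return flow_speed_um_s / object_pixel_um

if __name__ == "__main__":
    pitch = 5.0          # assumed sensor pixel pitch, in micrometres
    mag = 40.0           # 40x objective, as in the abstract
    for speed_um_s in (1000.0, 2500.0, 5000.0):        # candidate flow speeds
        rate = required_line_rate_hz(speed_um_s, pitch, mag)
        ok = "within" if rate <= 40e3 else "exceeds"
        print(f"{speed_um_s:7.0f} um/s -> {rate/1e3:6.1f} kHz ({ok} the 40 kHz camera limit)")
```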

  9. High-Throughput Preparation of New Photoactive Nanocomposites.

    Science.gov (United States)

    Conterosito, Eleonora; Benesperi, Iacopo; Toson, Valentina; Saccone, Davide; Barbero, Nadia; Palin, Luca; Barolo, Claudia; Gianotti, Valentina; Milanesio, Marco

    2016-06-08

    New low-cost photoactive hybrid materials based on organic luminescent molecules inserted into hydrotalcite (layered double hydroxides; LDH) were produced by exploiting the high-throughput liquid-assisted grinding (LAG) method. These materials are conceived for applications in dye-sensitized solar cells (DSSCs) as co-absorbers and in silicon photovoltaic (PV) panels to improve their efficiency, as they are able to emit where PV modules show their maximum efficiency. A molecule that shows a large Stokes shift was designed, synthesized, and intercalated into LDH. Two dyes already used in DSSCs were also intercalated to produce two new nanocomposites. LDH intercalation allows the stability of organic dyes to be improved and their direct use in polymer melt blending. The prepared nanocomposites absorb sunlight from the UV to the visible and emit from blue to near-IR, and thus can be exploited for light-energy management. Finally, one nanocomposite was dispersed by melt blending into a poly(methyl methacrylate)-block-poly(n-butyl acrylate) copolymer to obtain a photoactive film.

  10. High throughput miniature drug-screening platform using bioprinting technology.

    Science.gov (United States)

    Rodríguez-Dévora, Jorge I; Zhang, Bimeng; Reyna, Daniel; Shi, Zhi-dong; Xu, Tao

    2012-09-01

    In the pharmaceutical industry, new drugs are tested to find appropriate compounds for therapeutic purposes for contemporary diseases. Unfortunately, novel compounds emerge at expensive prices and current target evaluation processes have limited throughput, thus leading to an increase of cost and time for drug development. This work shows the development of a novel inkjet-based deposition method for assembling a miniature drug-screening platform, which can realistically and inexpensively evaluate biochemical reactions in a picoliter-scale volume at a high speed rate. As proof of concept, applying a modified Hewlett Packard model 5360 compact disc printer, green fluorescent protein-expressing Escherichia coli cells along with alginate gel solution have been arrayed on a coverslip chip at a repeatable volume of 180 ± 26 picoliters per droplet; subsequently, different antibiotic droplets were patterned on the spots of cells to evaluate the inhibition of bacteria for antibiotic screening. The proposed platform was compared to the current screening process, validating its effectiveness. The viability and basic function of the printed cells were evaluated, resulting in cell viability above 98% and insignificant or no DNA damage to transfected human kidney cells. Based on the reduction of investment and compound volume used by this platform, this technique has the potential to improve the actual drug discovery process at its target evaluation stage.

  11. A fully automated high-throughput training system for rodents.

    Directory of Open Access Journals (Sweden)

    Rajesh Poddar

    Full Text Available Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal's home-cage our system dramatically reduces the efforts involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors.

  12. High Throughput T Epitope Mapping and Vaccine Development

    Directory of Open Access Journals (Sweden)

    Giuseppina Li Pira

    2010-01-01

    Full Text Available Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th and by cytolytic T lymphocytes (CTL is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost.

  13. An apparatus for high throughput nanomechanical muscle cell experimentation.

    Science.gov (United States)

    Garcia-Webb, M; Hunter, I; Taberner, A

    2004-01-01

    An array of independent muscle cell testing modules is being developed to explore the mechanics of cardiac myocytes. The instrument will be able to perform established physiological tests and utilize novel system identification techniques to measure the dynamic stiffness and stress frequency response of single cells, with possible applications in the pharmaceutical industry for high throughput screening. Currently, each module consists of two independently controlled Lorentz force actuators in the form of stainless steel cantilevers with dimensions 0.025 mm x 0.8 mm x 3 mm, 0.1 m/N compliance and 1.5 kHz resonant frequency. Confocal position sensors focused on each cantilever provide position and force resolution of 0.1 mm and forces > 0.1 mN. A custom Visual Basic.Net software interface to a National Instruments data acquisition card implements real-time digital control over 4 input channels and 2 output channels at 20 kHz. In addition, algorithms for both swept-sine and stochastic system identification have been written to probe mechanical systems. The device has been used to find the dynamic stiffness of a 5 μm diameter polymer fiber between 0 and 500 Hz.
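
    As an illustration of the stochastic system-identification idea (not the authors' Visual Basic .NET implementation), dynamic stiffness can be estimated as the cross-spectrum between displacement and force divided by the displacement auto-spectrum; the sketch below does this on synthetic data for an assumed purely elastic response.

```python
# Sketch of stochastic system identification for dynamic stiffness k(f):
# estimate the force/displacement transfer function from cross- and auto-spectra.
# Synthetic data stand in for the instrument's force and position channels.
import numpy as np
from scipy.signal import csd, welch

fs = 20_000                      # sampling rate (Hz), as quoted in the abstract
rng = np.random.default_rng(1)

# Synthetic "measurement": a purely elastic response x = F/k plus sensor noise.
k_true = 50.0                    # assumed stiffness, N/m (illustrative)
force = rng.normal(0.0, 1e-6, 200_000)                        # stochastic force input (N)
disp = force / k_true + rng.normal(0.0, 1e-10, force.size)    # displacement (m)

# Dynamic stiffness estimate: k(f) = S_xF(f) / S_xx(f)  (force over displacement)
f, S_xF = csd(disp, force, fs=fs, nperseg=4096)
_, S_xx = welch(disp, fs=fs, nperseg=4096)
k_est = np.abs(S_xF / S_xx)

band = (f > 0) & (f < 500)       # 0-500 Hz, the range used for the polymer fibre
print(f"median estimated stiffness in 0-500 Hz band: {np.median(k_est[band]):.1f} N/m")
```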

  14. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances on previous high-throughput screening of biomass recalcitrance methods have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have improved plate-to-plate variation of control biomass recalcitrance and improved confidence in sugar release differences between samples. With smaller errors, plant researchers can have a higher degree of assurance that more low-recalcitrance candidates can be identified. Significant changes in the plate reactor, control biomass preparation, pretreatment conditions and enzyme have significantly reduced sample and control replicate variability. Reactor plate metallurgy significantly impacts sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to specific biomass types, i.e. woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers improved conversion levels, likely due to water-activity impacts on enzyme structure and substrate interactions, but was not attempted here due to the need to continually desalt and validate precise enzyme concentration and activity.

  15. Assessing the utility and limitations of high throughput virtual screening

    Directory of Open Access Journals (Sweden)

    Paul Daniel Phillips

    2016-05-01

    Full Text Available Due to low cost, speed, and unmatched ability to explore large numbers of compounds, high throughput virtual screening and molecular docking engines have become widely utilized by computational scientists. It is generally accepted that docking engines, such as AutoDock, produce reliable qualitative results for ligand-macromolecular receptor binding, and molecular docking results are commonly reported in the literature in the absence of complementary wet lab experimental data. In this investigation, three variants of the sixteen amino acid peptide α-conotoxin MII were docked to a homology model of the α3β2-nicotinic acetylcholine receptor. DockoMatic version 2.0 was used to perform a virtual screen of each peptide ligand against the receptor for ten docking trials consisting of 100 AutoDock cycles per trial. The results were analyzed for both variation in the calculated binding energy obtained from AutoDock and the orientation of the bound peptide within the receptor. The results show that, while no clear correlation exists between consistent ligand binding pose and the calculated binding energy, AutoDock is able to determine a consistent positioning of the bound peptide in the majority of trials when at least ten trials are evaluated.
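
    A sketch of the kind of post-processing described, aggregating lowest binding energies across repeated trials and checking pose consistency via pairwise RMSD, is shown below; the arrays are synthetic placeholders for values parsed from docking output, not DockoMatic results, and the 2 Å consistency cutoff is an assumption.

```python
# Sketch of aggregating repeated docking trials: spread of binding energies and
# pairwise RMSD between best poses (assumed to share the receptor frame).
# Arrays below are synthetic placeholders, not real docking data.
import numpy as np

def pairwise_rmsd(poses):
    """RMSD between every pair of poses; poses has shape (n_trials, n_atoms, 3)."""
    n = len(poses)
    rmsd = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = poses[i] - poses[j]
            rmsd[i, j] = rmsd[j, i] = np.sqrt((d ** 2).sum(axis=1).mean())
    return rmsd

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    # 10 trials: lowest binding energy per trial (kcal/mol) and the matching pose
    energies = rng.normal(-8.5, 0.6, 10)
    poses = rng.normal(0.0, 0.3, (10, 16, 3)) + np.arange(16)[None, :, None]
    print(f"binding energy: {energies.mean():.2f} +/- {energies.std(ddof=1):.2f} kcal/mol")
    rmsd = pairwise_rmsd(poses)
    consistent = (rmsd[np.triu_indices(10, k=1)] < 2.0).mean()
    print(f"fraction of pose pairs within 2 A RMSD: {consistent:.0%}")
```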

  16. Comprehensive analysis of high-throughput screening data

    Science.gov (United States)

    Heyse, Stephan

    2002-06-01

    High-Throughput Screening (HTS) data in its entirety is a valuable raw material for the drug-discovery process. It provides the most complete information about the biological activity of a company's compounds. However, its quantity, complexity and heterogeneity require novel, sophisticated approaches in data analysis. At GeneData, we are developing methods for large-scale, synoptical mining of screening data in a five-step analysis: (1) Quality Assurance: Checking data for experimental artifacts and eliminating low quality data. (2) Biological Profiling: Clustering and ranking of compounds based on their biological activity, taking into account specific characteristics of HTS data. (3) Rule-based Classification: Applying user-defined rules to biological and chemical properties, and providing hypotheses on the biological mode-of-action of compounds. (4) Joint Biological-Chemical Analysis: Associating chemical compound data with HTS data, providing hypotheses for structure-activity relationships. (5) Integration with Genomic and Gene Expression Data: Linking into other components of GeneData's bioinformatics platform, and assessing the compounds' modes-of-action, toxicity, and metabolic properties. These analyses address issues that are crucial for a correct interpretation and full exploitation of screening data. They lead to a sound rating of assays and compounds at an early stage of the lead-finding process.
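
    As an illustration of the quality-assurance step (1), a common plate-level statistic is the Z'-factor computed from positive and negative control wells; the sketch below uses synthetic controls and an assumed 0.5 acceptance threshold, and is not GeneData's implementation.

```python
# Plate-level quality assurance sketch: Z'-factor from positive/negative controls.
# Synthetic control values; illustrative of step (1), not GeneData's software.
import numpy as np

def z_prime(pos_controls, neg_controls):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    pos = np.asarray(pos_controls, dtype=float)
    neg = np.asarray(neg_controls, dtype=float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    pos = rng.normal(100.0, 6.0, 16)   # e.g. full-effect control wells
    neg = rng.normal(10.0, 5.0, 16)    # e.g. no-effect control wells
    zp = z_prime(pos, neg)
    verdict = "acceptable (>0.5)" if zp > 0.5 else "flag plate for review"
    print(f"Z' = {zp:.2f} -> {verdict}")
```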

  17. Use of High Throughput Screening Data in IARC Monograph ...

    Science.gov (United States)

    Purpose: Evaluation of carcinogenic mechanisms serves a critical role in IARC monograph evaluations, and can lead to “upgrade” or “downgrade” of the carcinogenicity conclusions based on human and animal evidence alone. Three recent IARC monograph Working Groups (110, 112, and 113) pioneered analysis of high throughput in vitro screening data from the U.S. Environmental Protection Agency’s ToxCast program in evaluations of carcinogenic mechanisms. Methods: For monograph 110, ToxCast assay data across multiple nuclear receptors were used to test the hypothesis that PFOA acts exclusively through the PPAR family of receptors, with activity profiles compared to several prototypical nuclear receptor-activating compounds. For monographs 112 and 113, ToxCast assays were systematically evaluated and used as an additional data stream in the overall evaluation of the mechanistic evidence. Specifically, ToxCast assays were mapped to 10 “key characteristics of carcinogens” recently identified by an IARC expert group, and chemicals’ bioactivity profiles were evaluated both in absolute terms (number of relevant assays positive for bioactivity) and relative terms (ranking with respect to other compounds evaluated by IARC, using the ToxPi methodology). Results: PFOA activates multiple nuclear receptors in addition to the PPAR family in the ToxCast assays. ToxCast assays offered substantial coverage for 5 of the 10 “key characteristics,” with the greates

  18. High Throughput Multispectral Image Processing with Applications in Food Science.

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of the implementation of Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low-cost information extraction and faster quality assessment, without human intervention. The image processing outcome is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples.
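
    A minimal sketch of the Gaussian-mixture segmentation idea, clustering per-pixel spectra of a multispectral cube with scikit-learn, is given below; the synthetic cube, band count and number of classes are assumptions rather than the authors' pipeline.

```python
# Sketch of GMM-based segmentation of a multispectral image: cluster pixel
# spectra into a small number of classes. Synthetic cube; not the paper's code.
import numpy as np
from sklearn.mixture import GaussianMixture

def segment(cube, n_classes=3, random_state=0):
    """cube: (H, W, bands) array -> (H, W) integer label image."""
    h, w, bands = cube.shape
    pixels = cube.reshape(-1, bands)                 # one spectrum per row
    gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                          random_state=random_state)
    labels = gmm.fit_predict(pixels)
    return labels.reshape(h, w)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic 64x64 image with 8 spectral bands and two regions + background
    cube = rng.normal(0.2, 0.05, (64, 64, 8))
    cube[:32, :, :4] += 0.5        # "region A" brighter in the first four bands
    cube[32:, :, 4:] += 0.5        # "region B" brighter in the last four bands
    labels = segment(cube, n_classes=3)
    print("pixels per class:", np.bincount(labels.ravel()))
```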

  19. High-throughput translational medicine: challenges and solutions.

    Science.gov (United States)

    Sulakhe, Dinanath; Balasubramanian, Sandhya; Xie, Bingqing; Berrocal, Eduardo; Feng, Bo; Taylor, Andrew; Chitturi, Bhadrachalam; Dave, Utpal; Agam, Gady; Xu, Jinbo; Börnigen, Daniela; Dubchak, Inna; Gilliam, T Conrad; Maltsev, Natalia

    2014-01-01

    Recent technological advances in genomics now allow biological data to be produced at unprecedented tera- and petabyte scales. Yet, the extraction of useful knowledge from this voluminous data presents a significant challenge to the scientific community. Efficient mining of vast and complex data sets for the needs of biomedical research critically depends on seamless integration of clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships accumulated in a plethora of publicly available databases. Furthermore, such experimental data should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. Translational projects require sophisticated approaches that coordinate and perform various analytical steps involved in the extraction of useful knowledge from accumulated clinical and experimental data in an orderly semiautomated manner. This presents a number of challenges, such as (1) high-throughput data management involving data transfer, data storage, and access control; (2) scalable computational infrastructure; and (3) analysis of large-scale multidimensional data for the extraction of actionable knowledge. We present a scalable computational platform based on crosscutting requirements from multiple scientific groups for data integration, management, and analysis. The goal of this integrated platform is to address the challenges and to support the end-to-end analytical needs of various translational projects.

  20. High Throughput Interrogation of Behavioral Transitions in C. elegans

    Science.gov (United States)

    Liu, Mochi; Shaevitz, Joshua; Leifer, Andrew

    We present a high-throughput method to probe transformations from neural activity to behavior in Caenorhabditis elegans, to better understand how organisms change behavioral states. We optogenetically deliver white-noise stimuli to target sensory neurons or interneurons while simultaneously recording the movement of a population of worms. Using all the postural movement data collected, we computationally classify stereotyped behaviors in C. elegans by clustering based on the spectral properties of the instantaneous posture (Berman et al., 2014). Transitions between these behavioral clusters indicate discrete behavioral changes. To study the neural correlates dictating these transitions, we perform model-driven experiments and employ Linear-Nonlinear-Poisson cascades that take the white-noise stimulus as the input. The parameters of these models are fitted by reverse correlation from our measurements. The parameterized models of behavioral transitions predict the worm's response to novel stimuli and reveal the internal computations the animal makes before carrying out behavioral decisions. Preliminary results are shown that describe the neural-behavioral transformation between neural activity in mechanosensory neurons and reversal behavior.
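
    The reverse-correlation fit of the linear stage of a Linear-Nonlinear-Poisson cascade amounts to an event-triggered average of the white-noise stimulus; a minimal sketch on synthetic data (not the authors' analysis code, and with an assumed filter shape and nonlinearity) is shown below.

```python
# Sketch of reverse correlation: recover the linear filter of an LNP cascade as
# the event-triggered average of a white-noise stimulus. Synthetic data only.
import numpy as np

def event_triggered_average(stimulus, events, window):
    """Average the stimulus window ending at (and including) each event sample."""
    times = np.flatnonzero(events)
    snippets = [stimulus[t - window + 1:t + 1] for t in times if t >= window - 1]
    return np.mean(snippets, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n, window = 200_000, 50
    stimulus = rng.normal(size=n)                            # white-noise stimulus
    true_filter = np.exp(-np.arange(window)[::-1] / 10.0)    # assumed filter (recent lags last)
    drive = np.convolve(stimulus, true_filter[::-1], mode="full")[:n]
    rate = 0.02 * np.exp(0.5 * drive)                        # static nonlinearity
    events = rng.random(n) < np.clip(rate, 0, 1)             # Poisson-like event train
    estimate = event_triggered_average(stimulus, events, window)
    corr = np.corrcoef(estimate, true_filter)[0, 1]
    print(f"correlation between recovered and true filter: {corr:.2f}")
```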

  1. A microfluidic, high throughput protein crystal growth method for microgravity.

    Directory of Open Access Journals (Sweden)

    Carl W Carruthers

    Full Text Available The attenuation of sedimentation and convection in microgravity can sometimes decrease irregularities formed during macromolecular crystal growth. Current terrestrial protein crystal growth (PCG) capabilities are very different from those used during the Shuttle era and those currently on the International Space Station (ISS). The focus of this experiment was to demonstrate the use of a commercial off-the-shelf, high throughput PCG method in microgravity. Using Protein BioSolutions' microfluidic Plug Maker™/CrystalCard™ system, we tested the ability to grow crystals of the regulator of glucose metabolism and adipogenesis, peroxisome proliferator-activated receptor gamma (apo-hPPAR-γ LBD), as well as several PCG standards. Overall, we sent 25 CrystalCards™ to the ISS, containing ~10,000 individual microgravity PCG experiments in a 3U NanoRacks NanoLab (1U = 10³ cm³). After 70 days on the ISS, our samples were returned with 16 of 25 (64%) microgravity cards having crystals, compared to 12 of 25 (48%) of the ground controls. Encouragingly, there were more apo-hPPAR-γ LBD crystals in the microgravity PCG cards than in the 1g controls. These positive results should help introduce the use of the PCG standard of low sample volume and large experimental density to the microgravity environment and provide new opportunities for macromolecular samples that may crystallize poorly in standard laboratories.

  2. High Throughput Multispectral Image Processing with Applications in Food Science.

    Directory of Open Access Journals (Sweden)

    Panagiotis Tsakanikas

    Full Text Available Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of the implementation of Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low-cost information extraction and faster quality assessment, without human intervention. The image processing outcome is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples.

  3. Airborne TDMA for High Throughput and Fast Weather Conditions Notification

    Directory of Open Access Journals (Sweden)

    Hyungjun Jang

    2011-05-01

    Full Text Available As air traffic grows significantly, aircraft accidents increase. Many aviation accidents could be prevented if the precise aircraft positions and the weather conditions on the aircraft's route were known. Existing studies propose determining the precise aircraft positions via a VHF channel with an air-to-air radio relay system that is based on mobile ad-hoc networks. However, due to the long propagation delay, the existing TDMA MAC schemes underutilize the networks. The existing TDMA MAC sends data and receives the ACK in one time slot, which requires two guard times in one time slot. Since aeronautical communication spans a significant distance, the guard time occupies a significantly large portion of the slot. To solve this problem, we propose an ACK piggybacking mechanism. Our proposed MAC has one guard time in one time slot, which enables the transmission of more data. Using this additional capacity, we can send weather conditions that pertain to the aircraft's current position. Our analysis shows that this proposed MAC performs better than the existing MAC, since it offers better throughput and network utilization. In addition, our weather condition notification model achieves a much lower transmission delay than HF (high frequency) voice communication.
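
    The claimed gain can be illustrated with a simple slot-efficiency calculation: piggybacking the ACK leaves one guard time per slot instead of two, so a larger fraction of each slot carries data. The slot length and maximum range below are illustrative assumptions, not parameters from the paper.

```python
# Slot-efficiency sketch: piggybacking the ACK removes one of the two guard
# times per TDMA slot. Numbers are illustrative assumptions, not from the paper.
C = 299_792_458.0                       # speed of light, m/s

def guard_time(max_range_m):
    """Guard time must cover the worst-case one-way propagation delay."""
    return max_range_m / C

def data_fraction(slot_s, guards, guard_s):
    """Fraction of the slot left for data after the guard times."""
    return max(0.0, slot_s - guards * guard_s) / slot_s

if __name__ == "__main__":
    slot = 5e-3                          # assumed 5 ms TDMA slot
    guard = guard_time(370_000.0)        # ~370 km line-of-sight range -> ~1.23 ms
    legacy = data_fraction(slot, guards=2, guard_s=guard)    # data + ACK in one slot
    piggy = data_fraction(slot, guards=1, guard_s=guard)     # ACK piggybacked later
    print(f"guard time: {guard*1e3:.2f} ms")
    print(f"legacy MAC (2 guards): {legacy:.0%} of slot carries data")
    print(f"piggyback  (1 guard) : {piggy:.0%} of slot carries data")
```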

  4. High Throughput Multispectral Image Processing with Applications in Food Science

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of the implementation of Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low-cost information extraction and faster quality assessment, without human intervention. The image processing outcome is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples. PMID:26466349

  5. High-throughput screening technologies for drug glucuronidation profiling.

    Science.gov (United States)

    Trubetskoy, Olga; Finel, Moshe; Trubetskoy, Vladimir

    2008-08-01

    A significant number of endogenous and exogenous compounds, including many therapeutic agents, are metabolized in humans via glucuronidation, catalysed by uridine diphosphoglucuronosyltransferases (UGTs). The study of the UGTs is a growing field of research, with constantly accumulated and updated information regarding UGT structure, purification, substrate specificity and inhibition, including clinically relevant drug interactions. Development of reliable UGT assays for the assessment of individual isoform substrate specificity and for the discovery of novel isoform-specific substrates and inhibitors is crucial for understanding the function and regulation of the UGT enzyme family and its clinical and pharmacological relevance. High-throughput screening (HTS) is a powerful technology used to search for novel substrates and inhibitors for a wide variety of targets. However, application of HTS in the context of UGTs is complicated because of the poor stability, low levels of expression, low affinity and broad substrate specificity of the enzymes, combined with difficulties in obtaining individual UGT isoforms in purified format, and insufficient information regarding isoform-specific substrates and inhibitors. This review examines the current status of HTS assays used in the search for novel UGT substrates and inhibitors, emphasizing advancements and challenges in HTS technologies for drug glucuronidation profiling, and discusses possible avenues for future advancement of the field.

  6. Hypothesis testing in high-throughput screening for drug discovery.

    Science.gov (United States)

    Prummer, Michael

    2012-04-01

    Following the success of small-molecule high-throughput screening (HTS) in drug discovery, other large-scale screening techniques are currently revolutionizing the biological sciences. Powerful new statistical tools have been developed to analyze the vast amounts of data in DNA chip studies, but have not yet found their way into compound screening. In HTS, characterization of single-point hit lists is often done only in retrospect, after the results of confirmation experiments are available. However, for prioritization, for optimal use of resources, for quality control, and for comparison of screens, it would be extremely valuable to predict the rates of false positives and false negatives directly from the primary screening results. By making full use of the available information about compounds and controls contained in HTS results and replicated pilot runs, the Z score, and from it the p value, can be estimated for each measurement. Based on this consideration, we have applied the concept of p-value distribution analysis (PVDA), which was originally developed for gene expression studies, to HTS data. PVDA allowed prediction of all relevant error rates as well as the rate of true inactives, and excellent agreement with confirmation experiments was found.
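
    A sketch of the first step described, a Z score for each measurement derived from neutral-control statistics and converted to a one-sided p value, is given below on synthetic plate data; it is not the author's PVDA implementation, and the control layout and hit threshold are assumptions.

```python
# Sketch: per-compound Z scores from neutral-control statistics, converted to
# one-sided p values (the input to a p-value distribution analysis).
import numpy as np
from scipy.stats import norm

def z_and_p(signals, neutral_controls):
    """Z score of each compound well relative to the neutral controls."""
    mu, sd = np.mean(neutral_controls), np.std(neutral_controls, ddof=1)
    z = (np.asarray(signals, dtype=float) - mu) / sd
    p = norm.sf(z)                      # one-sided: large signal = potential hit
    return z, p

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    controls = rng.normal(100.0, 8.0, 32)          # neutral (inactive) control wells
    compounds = rng.normal(100.0, 8.0, 380)        # mostly inactive compounds...
    compounds[:12] += 40.0                         # ...plus a few genuine actives
    z, p = z_and_p(compounds, controls)
    print(f"wells with p < 0.001: {(p < 1e-3).sum()} (12 actives spiked in)")
```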

  7. High Throughput Profiling of Molecular Shapes in Crystals

    Science.gov (United States)

    Spackman, Peter R.; Thomas, Sajesh P.; Jayatilaka, Dylan

    2016-02-01

    Molecular shape is important in both crystallisation and supramolecular assembly, yet its role is not completely understood. We present a computationally efficient scheme to describe and classify the molecular shapes in crystals. The method involves a rotation-invariant description of Hirshfeld surfaces in terms of spherical harmonic functions. Hirshfeld surfaces represent the boundaries of a molecule in the crystalline environment, and are widely used to visualise and interpret crystalline interactions. The spherical harmonic descriptions of molecular shapes are compared and classified by means of principal component analysis and cluster analysis. When applied to a series of metals, the method results in a clear classification based on their lattice type. When applied to around 300 crystal structures comprising series of substituted benzenes, naphthalenes and phenylbenzamides, it shows the capacity to classify structures based on chemical scaffolds, chemical isosterism, and conformational similarity. The computational efficiency of the method is demonstrated with an application to over 14 thousand crystal structures. High throughput screening of molecular shapes and interaction surfaces in the Cambridge Structural Database (CSD) using this method has direct applications in drug discovery, supramolecular chemistry and materials design.
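
    The rotation-invariant step can be illustrated by collapsing spherical-harmonic coefficients c(l,m) into one magnitude per degree l (these magnitudes are unchanged by rotation) and feeding the resulting vectors to principal component analysis; the random coefficients below stand in for real Hirshfeld-surface expansions and are not the authors' data.

```python
# Sketch: rotation-invariant shape descriptors from spherical-harmonic
# coefficients c_{l,m} (one magnitude per degree l), followed by PCA.
# Random coefficients stand in for real Hirshfeld-surface expansions.
import numpy as np
from sklearn.decomposition import PCA

def invariants(coeffs, l_max):
    """coeffs: dict {(l, m): complex}. Returns [sqrt(sum_m |c_lm|^2) for each l]."""
    return np.array([
        np.sqrt(sum(abs(coeffs[(l, m)]) ** 2 for m in range(-l, l + 1)))
        for l in range(l_max + 1)
    ])

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    l_max, n_molecules = 10, 300
    descriptors = []
    for _ in range(n_molecules):
        coeffs = {(l, m): rng.normal() + 1j * rng.normal()
                  for l in range(l_max + 1) for m in range(-l, l + 1)}
        descriptors.append(invariants(coeffs, l_max))
    X = np.vstack(descriptors)
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)
    print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
    print("first molecule's 2-D shape coordinates:", np.round(scores[0], 2))
```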

  8. High throughput screening for drug discovery of autophagy modulators.

    Science.gov (United States)

    Shu, Chih-Wen; Liu, Pei-Feng; Huang, Chun-Ming

    2012-11-01

    Autophagy is an evolutionarily conserved process by which cells clear abnormal proteins and organelles in a lysosome-dependent manner. A growing number of studies have shown that defective or induced autophagy contributes to many diseases including aging, neurodegeneration, pathogen infection, and cancer. However, the precise involvement of autophagy in health and disease remains controversial because the theories are built on limited assays and chemical modulators, indicating that the role of autophagy in diseases may require further verification. Many Food and Drug Administration (FDA)-approved drugs modulate autophagy signaling, suggesting that modulation of autophagy with pharmacological agonists or antagonists provides a potential therapy for autophagy-related diseases. This raises an attractive issue for drug discovery: the exploration of chemical modulators of autophagy. High throughput screening (HTS) is becoming a powerful tool for drug discovery that may accelerate the screening of specific autophagy modulators and help clarify the role of autophagy in diseases. Herein, this review lays out current autophagy assays that specifically measure autophagy components such as LC3 (the mammalian homologue of yeast Atg8) and Atg4. These assays are feasible or successful for HTS with certain chemical libraries, and might be informative for this intensively growing field as research tools and, hopefully, for developing new drugs for autophagy-related diseases.

  9. The JCSG high-throughput structural biology pipeline.

    Science.gov (United States)

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  10. High-Throughput Neuroimaging-Genetics Computational Infrastructure

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2014-04-01

    Full Text Available Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate and disseminate novel scientific methods, computational resources and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval and aggregation. Computational processing involves the necessary software, hardware and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical and phenotypic data and meta-data. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services. These pipeline workflows are represented as portable XML objects which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer’s and Parkinson’s data, we provide several examples of translational applications using this infrastructure.

  11. Mining Chemical Activity Status from High-Throughput Screening Assays

    KAUST Repository

    Soufan, Othman

    2015-12-14

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict activity status of chemical compounds in HTP activity assays. For a class of HTP assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modification of a minority oversampling technique. To demonstrate that DRAMOTE is performing better than the other methods, we performed a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have high probability to interact with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE has performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.
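
    The abstract above describes DRAMOTE only at a high level (a modified minority oversampling technique), so the sketch below is a generic, SMOTE-style illustration rather than DRAMOTE itself: it synthesises additional "active" samples by interpolating between minority-class neighbours so that a classifier is not swamped by inactive compounds. The function names and the toy fingerprint matrix are hypothetical.

```python
# Generic SMOTE-style minority oversampling sketch (illustrative only; DRAMOTE's
# specific modification is not reproduced here). Synthesises new "active" samples
# by interpolating between a minority sample and one of its nearest minority
# neighbours, to balance active vs. inactive compounds before model training.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def oversample_minority(X, y, minority_label=1, k=5, random_state=0):
    rng = np.random.default_rng(random_state)
    X_min = X[y == minority_label]
    n_needed = int((y != minority_label).sum() - len(X_min))
    if n_needed <= 0:
        return X, y
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)               # idx[:, 0] is the point itself
    synthetic = []
    for _ in range(n_needed):
        i = rng.integers(len(X_min))
        j = idx[i, rng.integers(1, k + 1)]      # pick a random minority neighbour
        gap = rng.random()
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    X_new = np.vstack([X, np.array(synthetic)])
    y_new = np.concatenate([y, np.full(n_needed, minority_label)])
    return X_new, y_new

# Usage on a toy fingerprint matrix: 1000 inactives, 50 actives.
X = np.random.rand(1050, 64)
y = np.array([0] * 1000 + [1] * 50)
Xb, yb = oversample_minority(X, y)
```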

  12. Mining Chemical Activity Status from High-Throughput Screening Assays.

    Directory of Open Access Journals (Sweden)

    Othman Soufan

    Full Text Available High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict activity status of chemical compounds in HTP activity assays. For a class of HTP assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modification of a minority oversampling technique. To demonstrate that DRAMOTE is performing better than the other methods, we performed a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have high probability to interact with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE has performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.

  13. Mining Chemical Activity Status from High-Throughput Screening Assays.

    Science.gov (United States)

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B

    2015-01-01

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict activity status of chemical compounds in HTP activity assays. For a class of HTP assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modification of a minority oversampling technique. To demonstrate that DRAMOTE is performing better than the other methods, we performed a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have high probability to interact with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE has performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.

  14. High-throughput neuroimaging-genetics computational infrastructure.

    Science.gov (United States)

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D; Franco, Joseph; Toga, Arthur W

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation of findings and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services.

  15. High-throughput screening method for lipases/esterases.

    Science.gov (United States)

    Mateos-Díaz, Eduardo; Rodríguez, Jorge Alberto; de Los Ángeles Camacho-Ruiz, María; Mateos-Díaz, Juan Carlos

    2012-01-01

    High-throughput screening (HTS) methods for lipases and esterases are generally performed by using synthetic chromogenic substrates (e.g., p-nitrophenyl, resorufin, and umbelliferyl esters) which may be misleading since they are not their natural substrates (e.g., partially soluble or insoluble triglycerides). In previous works, we have shown that soluble nonchromogenic substrates and p-nitrophenol (as a pH indicator) can be used to quantify the hydrolysis and estimate the substrate selectivity of lipases and esterases from several sources. However, in order to implement a spectrophotometric HTS method using partially soluble or insoluble triglycerides, it is necessary to find particular conditions which allow a quantitative detection of the enzymatic activity. In this work, we used Triton X-100, CHAPS, and N-lauroyl sarcosine as emulsifiers, β-cyclodextrin as a fatty acid captor, and two substrate concentrations, 1 mM of tributyrin (TC4) and 5 mM of trioctanoin (TC8), to improve the test conditions. To demonstrate the utility of this method, we screened 12 enzymes (commercial preparations and culture broth extracts) for the hydrolysis of TC4 and TC8, which are both classical substrates for lipases and esterases (for esterases, only TC4 may be hydrolyzed). Subsequent pH-stat experiments were performed to confirm the preference of substrate hydrolysis with the hydrolases tested. We have shown that this method is very useful for screening a high number of lipases (hydrolysis of TC4 and TC8) or esterases (only hydrolysis of TC4) from wild isolates or variants generated by directed evolution using nonchromogenic triglycerides directly in the test.

  16. A perspective on high throughput analysis of pesticide residues in foods

    Institute of Scientific and Technical Information of China (English)

    Kai ZHANG; Jon W WONG; Perry G WANG

    2011-01-01

    The screening of pesticide residues plays a vital role in food safety. Applications of high throughput analytical procedures are desirable for screening a large number of pesticides and food samples in a time-efficient and cost-effective manner. This review discusses how sample throughput of pesticide analysis could be improved with an emphasis on sample preparation, instrumentation and data analysis.

  17. Scanning fluorescence detector for high-throughput DNA genotyping

    Science.gov (United States)

    Rusch, Terry L.; Petsinger, Jeremy; Christensen, Carl; Vaske, David A.; Brumley, Robert L., Jr.; Luckey, John A.; Weber, James L.

    1996-04-01

    A new scanning fluorescence detector (SCAFUD) was developed for high-throughput genotyping of short tandem repeat polymorphisms (STRPs). Fluorescent dyes are incorporated into relatively short DNA fragments via polymerase chain reaction (PCR) and are separated by electrophoresis in short, wide polyacrylamide gels (144 lanes with well-to-read distances of 14 cm). Excitation light from an argon laser with primary lines at 488 and 514 nm is introduced into the gel through a fiber optic cable, dichroic mirror, and 40X microscope objective. Emitted fluorescent light is collected confocally through a second fiber. The confocal head is translated across the bottom of the gel at 0.5 Hz. The detection unit utilizes dichroic mirrors and band pass filters to direct light with 10 - 20 nm bandwidths to four photomultiplier tubes (PMTs). PMT signals are independently amplified with variable gain and then sampled at a rate of 2500 points per scan using a computer-based A/D board. LabView software (National Instruments) is used for instrument operation. Currently, three fluorescent dyes (Fam, Hex and Rox) are simultaneously detected with peak detection wavelengths of 543, 567, and 613 nm, respectively. The detection limit for fluorescein-labeled primers is about 100 attomoles. Planned SCAFUD upgrades include rearrangement of laser head geometry, use of additional excitation lasers for simultaneous detection of more dyes, and the use of detector arrays instead of individual PMTs. Extensive software has been written for automatic analysis of SCAFUD images. The software enables background subtraction, band identification, multiple-dye signal resolution, lane finding, band sizing and allele calling. Whole genome screens are currently underway to search for loci influencing such complex diseases as diabetes, asthma, and hypertension. Seven production SCAFUDs are currently in operation. Genotyping output for the coming year is projected to be about one million total genotypes (DNA

  18. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    Energy Technology Data Exchange (ETDEWEB)

    Ni, Jing [Iowa State Univ., Ames, IA (United States)

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores surface-enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman bands, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, a simultaneous detection of a dual-analyte immunoassay was demonstrated. Using this scheme for quantitative analysis was also studied and preliminary dose-response curves from an immunoassay of a

  19. High Throughput Screening for Drugs that Modulate Intermediate Filament Proteins

    Science.gov (United States)

    Sun, Jingyuan; Groppi, Vincent E.; Gui, Honglian; Chen, Lu; Xie, Qing; Liu, Li

    2016-01-01

    Intermediate filament (IF) proteins have unique and complex cell and tissue distribution. Importantly, IF gene mutations cause or predispose to more than 80 human tissue-specific diseases (IF-pathies), with the most severe disease phenotypes being due to mutations at conserved residues that result in a disrupted IF network. A critical need for the entire IF-pathy field is the identification of drugs that can ameliorate or cure these diseases, particularly since all current therapies target the IF-pathy complication, such as diabetes or cardiovascular disease, rather than the mutant IF protein or gene. We describe a high-throughput approach to identify drugs that can normalize disrupted IF proteins. This approach utilizes transduction of lentivirus that expresses green-fluorescent-protein-tagged keratin 18 (K18) R90C in A549 cells. The readout is drug ‘hits’ that convert the dot-like keratin filament distribution, due to the R90C mutation, to a wildtype-like filamentous array. A similar strategy can be used to screen thousands of compounds and can be utilized for practically any IF protein with a filament-disrupting mutation, and could therefore potentially target many IF-pathies. ‘Hits’ of interest require validation in cell culture and then in in vivo experimental models. Approaches to study the mechanism of mutant-IF normalization by potential drugs of interest are also described. The ultimate goal of this drug screening approach is to identify effective and safe compounds that can potentially be tested for clinical efficacy in patients. PMID:26795471

  20. Hypoxia-sensitive reporter system for high-throughput screening.

    Science.gov (United States)

    Tsujita, Tadayuki; Kawaguchi, Shin-ichi; Dan, Takashi; Baird, Liam; Miyata, Toshio; Yamamoto, Masayuki

    2015-01-01

    The induction of anti-hypoxic stress enzymes and proteins has the potential to be a potent therapeutic strategy to prevent the progression of ischemic heart, kidney or brain diseases. To realize this idea, small chemical compounds, which mimic hypoxic conditions by activating the PHD-HIF-α system, have been developed. However, to date, none of these compounds were identified by monitoring the transcriptional activation of hypoxia-inducible factors (HIFs). Thus, to facilitate the discovery of potent inducers of HIF-α, we have developed an effective high-throughput screening (HTS) system to directly monitor the output of HIF-α transcription. We generated a HIF-α-dependent reporter system that responds to hypoxic stimuli in a concentration- and time-dependent manner. This system was developed through multiple optimization steps, resulting in the generation of a construct that consists of the secretion-type luciferase gene (Metridia luciferase, MLuc) under the transcriptional regulation of an enhancer containing 7 copies of 40-bp hypoxia responsive element (HRE) upstream of a mini-TATA promoter. This construct was stably integrated into the human neuroblastoma cell line, SK-N-BE(2)c, to generate a reporter system, named SKN:HRE-MLuc. To improve this system and to increase its suitability for the HTS platform, we incorporated the next generation luciferase, Nano luciferase (NLuc), whose longer half-life provides us with flexibility for the use of this reporter. We thus generated a stably transformed clone with NLuc, named SKN:HRE-NLuc, and found that it showed significantly improved reporter activity compared to SKN:HRE-MLuc. In this study, we have successfully developed the SKN:HRE-NLuc screening system as an efficient platform for future HTS.

  1. High-Throughput Screening by Nuclear Magnetic Resonance (HTS by NMR) for the Identification of PPIs Antagonists.

    Science.gov (United States)

    Wu, Bainan; Barile, Elisa; De, Surya K; Wei, Jun; Purves, Angela; Pellecchia, Maurizio

    2015-01-01

    In recent years the ever more complex field of drug discovery has embraced novel design strategies based on biophysical fragment screening (fragment-based drug design; FBDD) using nuclear magnetic resonance spectroscopy (NMR) and/or structure-guided approaches, most often using X-ray crystallography and computer modeling. Experience from recent years has shown that these methods are more effective and less prone to artifacts than biochemical high-throughput screening (HTS) of large collections of compounds for designing protein inhibitors. Hence these strategies are increasingly becoming the most utilized in the modern pharmaceutical industry. Nonetheless, there is still a pressing need to develop innovative and effective strategies to tackle other, more challenging targets such as those involving protein-protein interactions (PPIs). While HTS strategies notoriously fail to identify viable hits against such targets, only a few successful examples of PPI antagonists derived from FBDD strategies exist. Recently, we reported on a new strategy that combines some of the basic principles of fragment-based screening with combinatorial chemistry and NMR-based screening. The approach, termed HTS by NMR, combines the advantages of combinatorial chemistry and NMR-based screening to rapidly and unambiguously identify bona fide inhibitors of PPIs. This review reiterates the critical aspects of the approach with examples of possible applications.

  2. A virtual high-throughput screening approach to the discovery of novel inhibitors of the bacterial leucine transporter, LeuT.

    Science.gov (United States)

    Simmons, Katie J; Gotfryd, Kamil; Billesbølle, Christian B; Loland, Claus J; Gether, Ulrik; Fishwick, Colin W G; Johnson, A Peter

    2013-03-01

    Membrane proteins are intrinsically involved in both human and pathogen physiology, and are the target of 60% of all marketed drugs. During the past decade, advances in the studies of membrane proteins using X-ray crystallography, electron microscopy and NMR-based techniques led to the elucidation of over 250 unique membrane protein crystal structures. The aim of the European Drug Initiative for Channels and Transporter (EDICT) project is to use the structures of clinically significant membrane proteins for the development of lead molecules. One of the approaches used to achieve this is a virtual high-throughput screening (vHTS) technique initially developed for soluble proteins. This paper describes application of this technique to the discovery of inhibitors of the leucine transporter (LeuT), a member of the neurotransmitter:sodium symporter (NSS) family.

  3. A virtual high-throughput screening approach to the discovery of novel inhibitors of the bacterial leucine transporter, LeuT

    DEFF Research Database (Denmark)

    Simmons, Katie J; Gotfryd, Kamil; Billesbølle, Christian B

    2013-01-01

    Membrane proteins are intrinsically involved in both human and pathogen physiology, and are the target of 60% of all marketed drugs. During the past decade, advances in the studies of membrane proteins using X-ray crystallography, electron microscopy and NMR-based techniques led to the elucidation of over 250 unique membrane protein crystal structures. The aim of the European Drug Initiative for Channels and Transporter (EDICT) project is to use the structures of clinically significant membrane proteins for the development of lead molecules. One of the approaches used to achieve this is a virtual high-throughput screening (vHTS) technique initially developed for soluble proteins. This paper describes application of this technique to the discovery of inhibitors of the leucine transporter (LeuT), a member of the neurotransmitter:sodium symporter (NSS) family.

  4. Overview on the current status on virtual high-throughput screening and combinatorial chemistry approaches in multi-target anticancer drug discovery; Part II.

    Science.gov (United States)

    Geromichalos, George D; Alifieris, Constantinos E; Geromichalou, Elena G; Trafalis, Dimitrios T

    2016-01-01

    Conventional drug design embraces the "one gene, one drug, one disease" philosophy. Nowadays, a new generation of anticancer drugs, able to inhibit more than one pathway, is believed to play a major role in contemporary anticancer drug research. In this way, polypharmacology, focusing on multi-target drugs, has emerged as a new paradigm in drug discovery. A number of recent successful drugs have in part or in whole emerged from a structure-based research approach. Many advances including crystallography and informatics are behind these successes. In this part II we will review the role and methodology of ligand-, structure- and fragment-based computer-aided drug design (CADD), virtual high-throughput screening (vHTS), de novo drug design, fragment-based design and structure-based molecular docking, homology modeling, combinatorial chemistry and library design, pharmacophore model chemistry and informatics in modern drug discovery.

  5. Protein Crystallography in Vaccine Research and Development.

    Science.gov (United States)

    Malito, Enrico; Carfi, Andrea; Bottomley, Matthew J

    2015-06-09

    The use of protein X-ray crystallography for structure-based design of small-molecule drugs is well-documented and includes several notable success stories. However, it is less well-known that structural biology has emerged as a major tool for the design of novel vaccine antigens. Here, we review the important contributions that protein crystallography has made so far to vaccine research and development. We discuss several examples of the crystallographic characterization of vaccine antigen structures, alone or in complexes with ligands or receptors. We cover the critical role of high-resolution epitope mapping by reviewing structures of complexes between antigens and their cognate neutralizing, or protective, antibody fragments. Most importantly, we provide recent examples where structural insights obtained via protein crystallography have been used to design novel optimized vaccine antigens. This review aims to illustrate the value of protein crystallography in the emerging discipline of structural vaccinology and its impact on the rational design of vaccines.

  6. A High-Throughput Pipeline for the Design of Real-Time PCR Signatures

    Science.gov (United States)

    2010-06-23

    A high-throughput pipeline for the design of real-time PCR signatures. Ravi Vijaya Satya. BMC Bioinformatics 2010, 11:340. doi:10.1186/1471-2105-11-340.

  7. High-sensitivity high-throughput chip based biosensor array for multiplexed detection of heavy metals

    Science.gov (United States)

    Yan, Hai; Tang, Naimei; Jairo, Grace A.; Chakravarty, Swapnajit; Blake, Diane A.; Chen, Ray T.

    2016-03-01

    Heavy metal ions released into the environment from industrial processes lead to various health hazards. We propose an on-chip label-free detection approach that allows high-sensitivity and high-throughput detection of heavy metals. The sensing device consists of 2-dimensional photonic crystal microcavities that are combined by a multimode interferometer to form a sensor array. We experimentally demonstrate the detection of a cadmium-chelate conjugate at concentrations as low as 5 parts per billion (ppb).

  8. High-throughput saccharification assay for lignocellulosic materials.

    Science.gov (United States)

    Gomez, Leonardo D; Whitehead, Caragh; Roberts, Philip; McQueen-Mason, Simon J

    2011-07-03

    Polysaccharides that make up plant lignocellulosic biomass can be broken down to produce a range of sugars that subsequently can be used in establishing a biorefinery. These raw materials would constitute a new industrial platform, which is both sustainable and carbon neutral, to replace the current dependency on fossil fuel. The recalcitrance to deconstruction observed in lignocellulosic materials is produced by several intrinsic properties of plant cell walls. Crystalline cellulose is embedded in matrix polysaccharides such as xylans and arabinoxylans, and the whole structure is encased by the phenolic polymer lignin, which is also difficult to digest (1). In order to improve the digestibility of plant materials we need to discover the main bottlenecks for the saccharification of cell walls and also screen mutant and breeding populations to evaluate the variability in saccharification (2). These tasks require a high-throughput approach and here we present an analytical platform that can perform saccharification analysis in a 96-well plate format. This platform has been developed to allow the screening of lignocellulose digestibility of large populations from varied plant species. We have scaled down the reaction volumes for gentle pretreatment, partial enzymatic hydrolysis and sugar determination, to allow large numbers to be assessed rapidly in an automated system. This automated platform works with milligram amounts of biomass, performing ball milling under controlled conditions to reduce the plant materials to a standardised particle size in a reproducible manner. Once the samples are ground, the automated formatting robot dispenses specified and recorded amounts of material into the corresponding wells of a 96-deep-well plate (Figure 1). Normally, we dispense the same material into 4 wells to have 4 replicates for analysis. Once the plates are filled with the plant material in the desired layout, they are manually moved to a liquid handling station (Figure 2

  9. High throughput modular chambers for rapid evaluation of anesthetic sensitivity

    Directory of Open Access Journals (Sweden)

    Eckmann David M

    2006-11-01

    Full Text Available Abstract Background Anesthetic sensitivity is determined by the interaction of multiple genes. Hence, a dissection of genetic contributors would be aided by precise and high-throughput behavioral screens. Traditionally, anesthetic phenotyping has addressed only induction of anesthesia, evaluated with dose-response curves, while ignoring potentially important data on emergence from anesthesia. Methods We designed and built a controlled environment apparatus to permit rapid phenotyping of twenty-four mice simultaneously. We used the loss of righting reflex to indicate anesthetic-induced unconsciousness. After fitting the data to a sigmoidal dose-response curve with variable slope, we calculated the MACLORR (EC50), the Hill coefficient, and the 95% confidence intervals bracketing these values. Upon termination of the anesthetic, Emergence timeRR was determined and expressed as the mean ± standard error for each inhaled anesthetic. Results In agreement with several previously published reports we find that the MACLORR of halothane, isoflurane, and sevoflurane in 8–12 week old C57BL/6J mice is 0.79% (95% confidence interval = 0.78 – 0.79%), 0.91% (95% confidence interval = 0.90 – 0.93%), and 1.96% (95% confidence interval = 1.94 – 1.97%), respectively. Hill coefficients for halothane, isoflurane, and sevoflurane are 24.7 (95% confidence interval = 19.8 – 29.7), 19.2 (95% confidence interval = 14.0 – 24.3), and 33.1 (95% confidence interval = 27.3 – 38.8), respectively. After roughly 2.5 MACLORR • hr exposures, mice take 16.00 ± 1.07, 6.19 ± 0.32, and 2.15 ± 0.12 minutes to emerge from halothane, isoflurane, and sevoflurane, respectively. Conclusion This system enabled assessment of inhaled anesthetic responsiveness with higher precision than previously reported. It is broadly adaptable for delivering an inhaled therapeutic (or toxin) to a population while monitoring its vital signs, motor reflexes, and providing precise control
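
    The dose-response analysis described above (a sigmoidal fit with variable slope, yielding the MACLORR/EC50 and the Hill coefficient with confidence intervals) can be reproduced in outline with a standard curve fit. The sketch below is not the study's code, and the concentrations and response fractions are invented; it only illustrates the fitting step.

```python
# Hedged sketch: fit a variable-slope sigmoidal dose-response (Hill) curve to
# loss-of-righting-reflex data and report EC50 (the MAC-LORR) and the Hill
# coefficient. Concentrations and response fractions below are made up.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ec50, n_hill):
    """Fraction of mice with loss of righting reflex at anesthetic concentration conc (%)."""
    return conc ** n_hill / (ec50 ** n_hill + conc ** n_hill)

conc = np.array([0.6, 0.7, 0.75, 0.8, 0.85, 0.9, 1.0])      # % anesthetic (illustrative)
frac = np.array([0.0, 0.08, 0.25, 0.60, 0.90, 1.0, 1.0])    # fraction unconscious (illustrative)

popt, pcov = curve_fit(hill, conc, frac, p0=[0.8, 15.0], maxfev=10000)
ec50, n_hill = popt
se = np.sqrt(np.diag(pcov))
print(f"MAC-LORR (EC50) = {ec50:.3f}% "
      f"(approx. 95% CI {ec50 - 1.96 * se[0]:.3f}-{ec50 + 1.96 * se[0]:.3f}%)")
print(f"Hill coefficient = {n_hill:.1f}")
```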

  10. High throughput RNAi assay optimization using adherent cell cytometry

    Directory of Open Access Journals (Sweden)

    Pradhan Leena

    2011-04-01

    Full Text Available Abstract Background siRNA technology is a promising tool for gene therapy of vascular disease. Due to the multitude of reagents and cell types, RNAi experiment optimization can be time-consuming. In this study adherent cell cytometry was used to rapidly optimize siRNA transfection in human aortic vascular smooth muscle cells (AoSMC). Methods AoSMC were seeded at a density of 3000-8000 cells/well of a 96-well plate. 24 hours later AoSMC were transfected with either non-targeting unlabeled siRNA (50 nM), or non-targeting labeled siRNA, siGLO Red (5 or 50 nM), using no transfection reagent, HiPerfect or Lipofectamine RNAiMax. For counting cells, Hoechst nuclei stain or Cell Tracker green were used. For data analysis an adherent cell cytometer, Celigo®, was used. Data were normalized to the transfection reagent alone group and expressed as red pixel count/cell. Results After 24 hours, none of the transfection conditions led to cell loss. Red fluorescence counts were normalized to the AoSMC count. RNAiMax was more potent compared to HiPerfect or no transfection reagent at 5 nM siGLO Red (4.12 +/-1.04 vs. 0.70 +/-0.26 vs. 0.15 +/-0.13 red pixels/cell) and 50 nM siGLO Red (6.49 +/-1.81 vs. 2.52 +/-0.67 vs. 0.34 +/-0.19). Fluorescence expression results supported gene knockdown achieved by using MARCKS-targeting siRNA in AoSMCs. Conclusion This study underscores that RNAi delivery depends heavily on the choice of delivery method. Adherent cell cytometry can be used as a high-throughput screening tool for the optimization of RNAi assays. This technology can accelerate in vitro cell assays and thus save costs.
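
    The readout arithmetic described above (red fluorescence pixels divided by cell count, then expressed relative to the transfection-reagent-alone group) is simple enough to sketch. The well counts below are invented and the function is only an illustration of the normalization step, not the instrument's software.

```python
# Hedged sketch of the readout arithmetic: red fluorescence pixels per well are
# divided by the cell count, then expressed relative to the transfection-reagent-
# alone control group. The numbers below are illustrative, not the study's data.
import numpy as np

def red_pixels_per_cell(red_pixels, cell_counts):
    return np.asarray(red_pixels, dtype=float) / np.asarray(cell_counts, dtype=float)

# Four replicate wells per condition (made-up counts).
reagent_alone = red_pixels_per_cell([120, 110, 130, 100], [900, 950, 870, 910])
rnaimax_5nM   = red_pixels_per_cell([3800, 4100, 3900, 4000], [880, 940, 900, 920])

fold_over_control = rnaimax_5nM.mean() / reagent_alone.mean()
print(f"RNAiMax 5 nM: {rnaimax_5nM.mean():.2f} red pixels/cell "
      f"({fold_over_control:.1f}x over reagent-alone control)")
```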

  11. Emerging high throughput analyses of cyanobacterial toxins and toxic cyanobacteria.

    Science.gov (United States)

    Sivonen, Kaarina

    2008-01-01

    The common occurrence of toxic cyanobacteria causes problems for health of animals and human beings. More research and good monitoring systems are needed to protect water users. It is important to have rapid, reliable and accurate analyses, i.e., high-throughput methods, to identify the toxins as well as the toxin producers in the environment. Excellent methods, such as ELISA, already exist to analyse cyanobacterial hepatotoxins and saxitoxins, and PPIA for microcystins and nodularins. The LC/MS method can be fast in identifying the toxicants in the samples. Further development of this area should resolve the problems with sampling and sample preparation, which still are the bottlenecks of rapid analyses. In addition, the availability of reliable reference materials and standards should be addressed. Molecular detection methods are now routine in clinical and criminal laboratories and may also become important in environmental diagnostics. One prerequisite for the development of molecular analysis is that pure cultures of the producer organisms are available for identification of the biosynthetic genes responsible for toxin production and for proper testing of the diagnostic methods. Good methods are already available for the microcystin and nodularin-producing cyanobacteria such as conventional PCR, quantitative real-time PCR and microarrays/DNA chips. The DNA-chip technology offers an attractive monitoring system for toxic and non-toxic cyanobacteria. Only with these new technologies (PCR + DNA-chips) will we be able to study toxic cyanobacteria populations in situ and the effects of environmental factors on the occurrence and proliferation of especially toxic cyanobacteria. This is likely to yield important information for mitigation purposes. Further development of these methods should include all cyanobacterial biodiversity, including all toxin producers and primers/probes to detect producers of neurotoxins, cylindrospermopsins etc. (genes are unknown). The on

  12. High-throughput metal susceptibility testing of microbial biofilms

    Directory of Open Access Journals (Sweden)

    Turner Raymond J

    2005-10-01

    Full Text Available Abstract Background Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO32-) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic

  13. Raman crystallography of RNA.

    Science.gov (United States)

    Gong, Bo; Chen, Jui-Hui; Yajima, Rieko; Chen, Yuanyuan; Chase, Elaine; Chadalavada, Durga M; Golden, Barbara L; Carey, Paul R; Bevilacqua, Philip C

    2009-10-01

    Raman crystallography is the application of Raman spectroscopy to single crystals. This technique has been applied to a variety of protein molecules where it has provided unique information about biopolymer folding, substrate binding, and catalysis. Here, we describe the application of Raman crystallography to functional RNA molecules. RNA represents unique opportunities and challenges for Raman crystallography. One issue that confounds studies of RNA is its tendency to adopt multiple non-functional folds. Raman crystallography has the advantage that it isolates a single state of the RNA within the crystal and can evaluate its fold, metal ion binding properties (ligand identity, stoichiometry, and affinity), proton binding properties (identity, stoichiometry, and affinity), and catalytic potential. In particular, base-specific stretches can be identified and then associated with the binding of metal ions and protons. Because measurements are carried out in the hanging drop at ambient, rather than cryo, conditions and because RNA crystals tend to be approximately 70% solvent, RNA dynamics and conformational changes become experimentally accessible. This review focuses on experimental setup and procedures, acquisition and interpretation of Raman data, and determination of physicochemical properties of the RNA. Raman crystallographic and solution biochemical experiments on the HDV RNA enzyme are summarized and found to be in excellent agreement. Remarkably, characterization of the crystalline state has proven to help rather than hinder functional characterization of functional RNA, most likely because the tendency of RNA to fold heterogeneously is limited in a crystalline environment. Future applications of Raman crystallography to RNA are briefly discussed.

  14. Implementation of an Automated High-Throughput Plasmid DNA Production Pipeline.

    Science.gov (United States)

    Billeci, Karen; Suh, Christopher; Di Ioia, Tina; Singh, Lovejit; Abraham, Ryan; Baldwin, Anne; Monteclaro, Stephen

    2016-12-01

    Biologics sample management facilities are often responsible for a diversity of large-molecule reagent types, such as DNA, RNAi, and protein libraries. Historically, the management of large molecules was dispersed into multiple laboratories. As methodologies to support pathway discovery, antibody discovery, and protein production have become high throughput, the implementation of automation and centralized inventory management tools has become important. To this end, to improve sample tracking, throughput, and accuracy, we have implemented a module-based automation system integrated into inventory management software using multiple platforms (Hamilton, Hudson, Dynamic Devices, and Brooks). Here we describe the implementation of these systems with a focus on high-throughput plasmid DNA production management.

  15. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    Science.gov (United States)

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., prepare, process, and evaluate samples in parallel), and miniaturization (i.e., reduce sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, surface defects, among others have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  16. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3.04 "Propulsion Systems," Busek Co. Inc. will develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  17. Development of an optimized medium, strain and high-throughput culturing methods for Methylobacterium extorquens

    National Research Council Canada - National Science Library

    Delaney, Nigel F; Kaczmarek, Maria E; Ward, Lewis M; Swanson, Paige K; Lee, Ming-Chun; Marx, Christopher J

    2013-01-01

    .... Here we develop a new system for high-throughput batch culture of M. extorquens in microtiter plates by jointly optimizing the properties of the organism, the growth media and the culturing system...

  18. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3-04 "Propulsion Systems," Busek proposes to develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  19. EMBRYONIC VASCULAR DISRUPTION ADVERSE OUTCOMES: LINKING HIGH THROUGHPUT SIGNALING SIGNATURES WITH FUNCTIONAL CONSEQUENCES

    Science.gov (United States)

    Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...

  20. Self-encoding Functional Resin Applying for Combinatorial Chemistry and High Throughput Screening

    Institute of Scientific and Technical Information of China (English)

    DU Lei; CHEN Tong-sheng

    2004-01-01

    A novel solid-phase organic synthesis resin was synthesized for combinatorial high-throughput screening, based on FTIR-spectra self-encoding functional resin technology. A new deconvolution strategy, termed position-encoding deconvolution, was illustrated and compared with some popular combinatorial deconvolution strategies in terms of efficiency and information content. A mimic high-throughput screening of a hexapeptide library successfully demonstrated the application of the self-encoding functional resin technology and the position-encoding deconvolution strategy.

  1. Solid-phase cloning for high-throughput assembly of single and multiple DNA parts

    DEFF Research Database (Denmark)

    Lundqvist, Magnus; Edfors, Fredrik; Sivertsson, Åsa

    2015-01-01

    We describe solid-phase cloning (SPC) for high-throughput assembly of expression plasmids. Our method allows PCR products to be put directly into a liquid handler for capture and purification using paramagnetic streptavidin beads and conversion into constructs by subsequent cloning reactions. We...... at an average success rate above 80%. We report on several applications for SPC and we suggest it to be particularly suitable for high-throughput efforts using laboratory workstations....

  2. A multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans

    OpenAIRE

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-01-01

    The booming nanotech industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At t...

  3. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  4. Synchrotron radiation macromolecular crystallography: science and spin-offs

    Directory of Open Access Journals (Sweden)

    John R. Helliwell

    2015-03-01

    Full Text Available A current overview of synchrotron radiation (SR) in macromolecular crystallography (MX) instrumentation, methods and applications is presented. Automation has been and remains a central development in the last decade, as has the rise of remote access and of industrial service provision. Results include a high number of Protein Data Bank depositions, with an increasing emphasis on the successful use of microcrystals. One future emphasis involves pushing the frontiers of using higher and lower photon energies. With the advent of X-ray free-electron lasers, closely linked to SR developments, the use of ever smaller samples such as nanocrystals, nanoclusters and single molecules is anticipated, as well as the opening up of femtosecond time-resolved diffraction structural studies. At SR sources, a very high-throughput assessment for the best crystal samples and the ability to tackle just a few-micron and sub-micron crystals will become widespread. With higher speeds and larger detectors, diffraction data volumes are becoming long-term storage and archiving issues; the implications for today and the future are discussed. Together with the rise of the storage ring to its current pre-eminence in MX data provision, the growing tendency of central facility sites to offer other centralized facilities complementary to crystallography, such as cryo-electron microscopy and NMR, is a welcome development.

  5. High Throughput 600 Watt Hall Effect Thruster for Space Exploration

    Science.gov (United States)

    Szabo, James; Pote, Bruce; Tedrake, Rachel; Paintal, Surjeet; Byrne, Lawrence; Hruby, Vlad; Kamhawi, Hani; Smith, Tim

    2016-01-01

    A nominal 600-Watt Hall Effect Thruster was developed to propel unmanned space vehicles. Both xenon and iodine compatible versions were demonstrated. With xenon, peak measured thruster efficiency is 46-48% at 600-W, with specific impulse from 1400 s to 1700 s. Evolution of the thruster channel due to ion erosion was predicted through numerical models and calibrated with experimental measurements. Estimated xenon throughput is greater than 100 kg. The thruster is well sized for satellite station keeping and orbit maneuvering, either by itself or within a cluster.

  6. Advancing a distributed multi-scale computing framework for large-scale high-throughput discovery in materials science.

    Science.gov (United States)

    Knap, J; Spear, C E; Borodin, O; Leiter, K W

    2015-10-30

    We describe the development of a large-scale high-throughput application for discovery in materials science. Our point of departure is a computational framework for distributed multi-scale computation. We augment the original framework with a specialized module whose role is to route evaluation requests needed by the high-throughput application to a collection of available computational resources. We evaluate the feasibility and performance of the resulting high-throughput computational framework by carrying out a high-throughput study of battery solvents. Our results indicate that distributed multi-scale computing, by virtue of its adaptive nature, is particularly well-suited for building high-throughput applications.
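
    The routing module described above is presented only at the architectural level. As a rough illustration of the pattern (not the authors' framework), the sketch below places evaluation requests on a queue and lets a pool of worker threads, standing in for available computational resources, pull and evaluate them; all names are hypothetical.

```python
# Minimal, hypothetical sketch of the routing idea: evaluation requests from a
# high-throughput study are placed on a queue and dispatched to whichever
# computational resource (worker) becomes available. This is not the authors'
# framework, only an illustration of the dispatch pattern.
import queue
import threading

def evaluate(request):
    """Stand-in for one multi-scale evaluation (e.g., one candidate battery solvent)."""
    return {"request": request, "score": sum(ord(c) for c in request) % 100}

def worker(resource_name, requests, results):
    while True:
        req = requests.get()
        if req is None:                    # sentinel: no more work for this worker
            requests.task_done()
            break
        results.put((resource_name, evaluate(req)))
        requests.task_done()

requests, results = queue.Queue(), queue.Queue()
resources = [f"node-{i}" for i in range(4)]            # available compute resources
threads = [threading.Thread(target=worker, args=(r, requests, results)) for r in resources]
for t in threads:
    t.start()

for candidate in [f"solvent-{i}" for i in range(20)]:  # high-throughput evaluation requests
    requests.put(candidate)
for _ in resources:                                    # one stop sentinel per worker
    requests.put(None)
requests.join()
for t in threads:
    t.join()
print(f"collected {results.qsize()} results")
```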

  7. Neutron Nucleic Acid Crystallography.

    Science.gov (United States)

    Chatake, Toshiyuki

    2016-01-01

    The hydration shells surrounding nucleic acids and hydrogen-bonding networks involving water molecules and nucleic acids are essential interactions for the structural stability and function of nucleic acids. Water molecules in the hydration shells influence various conformations of DNA and RNA by specific hydrogen-bonding networks, which often contribute to the chemical reactivity and molecular recognition of nucleic acids. However, X-ray crystallography could not provide a complete description of structural information with respect to hydrogen bonds. Indeed, X-ray crystallography is a powerful tool for determining the locations of water molecules, i.e., the location of the oxygen atom of H2O; however, it is very difficult to determine the orientation of the water molecules, i.e., the orientation of the two hydrogen atoms of H2O, because X-ray scattering from the hydrogen atom is very small.Neutron crystallography is a specialized tool for determining the positions of hydrogen atoms. Neutrons are not diffracted by electrons, but are diffracted by atomic nuclei; accordingly, neutron scattering lengths of hydrogen and its isotopes are comparable to those of non-hydrogen atoms. Therefore, neutron crystallography can determine both of the locations and orientations of water molecules. This chapter describes the current status of neutron nucleic acid crystallographic research as well as the basic principles of neutron diffraction experiments performed on nucleic acid crystals: materials, crystallization, diffraction experiments, and structure determination.

  8. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    Science.gov (United States)

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  9. A Discrete Time Markov Chain Model for High Throughput Bidirectional Fano Decoders

    CERN Document Server

    Xu, Ran; Morris, Kevin; Kocak, Taskin

    2010-01-01

    The bidirectional Fano algorithm (BFA) can achieve at least twice the decoding throughput of the conventional unidirectional Fano algorithm (UFA). In this paper, bidirectional Fano decoding is examined from the queuing theory perspective. A Discrete Time Markov Chain (DTMC) is employed to model the BFA decoder with a finite input buffer. The relationship between the input data rate, the input buffer size and the clock speed of the BFA decoder is established. The DTMC-based modelling can be used in designing a high-throughput parallel BFA decoding system. It is shown that there is a tradeoff between the number of BFA decoders and the input buffer size, and an optimal input buffer size can be chosen to minimize the hardware complexity for a target decoding throughput.
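
    As a hedged illustration of the queueing-theory view described above (not the paper's exact DTMC), the sketch below models buffer occupancy as a discrete-time Markov chain in which a symbol arrives in a clock slot with probability p (set by the input data rate relative to the clock speed) and the decoder completes a symbol with probability q, then reads the probability of a full buffer off the stationary distribution for several buffer sizes; all parameter values are illustrative.

```python
# Hedged sketch of the queueing idea: model the decoder input buffer as a
# discrete-time Markov chain. In each clock slot a symbol arrives with
# probability p and the decoder finishes a symbol with probability q. The
# stationary distribution gives the probability that the buffer is full.
import numpy as np

def buffer_dtmc(p, q, B):
    """Transition matrix for buffer occupancy 0..B with Bernoulli arrival/service."""
    P = np.zeros((B + 1, B + 1))
    for s in range(B + 1):
        up = p * (1 - q) if s < B else 0.0          # arrival without a departure
        down = q * (1 - p) if s > 0 else 0.0        # departure without an arrival
        P[s, min(s + 1, B)] += up
        P[s, max(s - 1, 0)] += down
        P[s, s] += 1.0 - up - down                  # occupancy unchanged otherwise
    return P

def stationary(P):
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return pi / pi.sum()

for B in (4, 8, 16, 32):
    pi = stationary(buffer_dtmc(p=0.45, q=0.5, B=B))
    print(f"buffer size {B:2d}: P(buffer full) = {pi[-1]:.4f}")
```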

  10. Development and clinical performance of high throughput loop-mediated isothermal amplification for detection of malaria

    OpenAIRE

    Perera, Rushini S.; Ding, Xavier C; Tully, Frank; Oliver, James; Bright, Nigel; Bell, David; Chiodini, Peter L; Gonzalez, Iveth J.; Spencer D Polley

    2017-01-01

    Background Accurate and efficient detection of sub-microscopic malaria infections is crucial for enabling rapid treatment and interruption of transmission. Commercially available malaria LAMP kits have excellent diagnostic performance, though throughput is limited by the need to prepare samples individually. Here, we evaluate the clinical performance of a newly developed high throughput (HTP) sample processing system for use in conjunction with the Eiken malaria LAMP kit. Methods The HTP syst...

  12. Throughput Analysis for a High-Performance FPGA-Accelerated Real-Time Search Application

    Directory of Open Access Journals (Sweden)

    Wim Vanderbauwhede

    2012-01-01

    We propose an FPGA design for the relevancy computation part of a high-throughput real-time search application. The application matches terms in a stream of documents against a static profile, held in off-chip memory. We present a mathematical analysis of the throughput of the application and apply it to the problem of scaling the Bloom filter used to discard nonmatches.
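
    For readers unfamiliar with Bloom-filter scaling, the classical sizing formulas give a feel for the memory versus false-positive tradeoff involved; this is textbook material, not the paper's specific FPGA throughput analysis:

    ```python
    from math import exp, log, ceil

    def bloom_false_positive(n_items, m_bits, k_hashes):
        """Classical approximation of the Bloom-filter false-positive rate."""
        return (1.0 - exp(-k_hashes * n_items / m_bits)) ** k_hashes

    def bloom_size_for(n_items, target_fp):
        """Bits and hash count needed for a target false-positive rate."""
        m = ceil(-n_items * log(target_fp) / (log(2) ** 2))
        k = max(1, round(m / n_items * log(2)))
        return m, k

    # Example: one million profile terms, 0.1% false positives.
    m, k = bloom_size_for(1_000_000, 0.001)
    print(m, "bits (", m // 8 // 1024, "KiB ),", k, "hash functions")
    print("achieved false-positive rate ~", bloom_false_positive(1_000_000, m, k))
    ```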

  13. High-Throughput Atomic Force Microscopes Operating in Parallel

    CERN Document Server

    Sadeghian, H; Dekker, B; Winters, J; Bijnagte, T; Rijnbeek, R

    2016-01-01

    Atomic force microscopy (AFM) is an essential nanoinstrument technique for several applications such as cell biology and nanoelectronics metrology and inspection. The need for statistically significant sample sizes means that data collection can be an extremely lengthy process in AFM. A single AFM instrument is very slow and not suitable for scanning large areas, resulting in very low measurement throughput. We address this challenge by parallelizing AFM instruments. The parallelization is achieved by miniaturizing the AFM instrument and operating many of them simultaneously. This nanoinstrument has the advantages that each miniaturized AFM can be operated independently and that the advances in the field of AFM, both in terms of speed and imaging modalities, can be implemented more easily. Moreover, a parallel AFM instrument also allows one to measure several physical parameters simultaneously; while one instrument measures nano-scale topography, another instrument can meas...

  14. Achieving Timeliness and High Throughput Metrics in Dissemination Systems

    Directory of Open Access Journals (Sweden)

    MANNE ANUSHA

    2012-09-01

    Existing systems for information dissemination are inadequate and typically result in information gaps. The lack of a clear, concise system for information dissemination makes it difficult to determine the most efficient and effective way to pass information to the appropriate parties, especially in the fields of e-commerce and security alerting. These systems usually require that the desired information be matched between numerous sources and sinks based on established subscriptions. Timeliness and throughput are the performance metrics used for evaluation, and the existing systems fail to achieve a balance between the two. A system termed INFOD (INFOrmation Dissemination) was therefore proposed earlier that balances these performance metrics. We observed that the Integrated Control Loop used by the admission control scheme of INFOD employs PL/SQL stored procedures that impose a large computational overhead. We propose to replace them with Java stored procedures, which can substantially increase performance.

  15. High-speed CMOS image sensor for high-throughput lensless microfluidic imaging system

    Science.gov (United States)

    Yan, Mei; Huang, Xiwei; Jia, Qixiang; Nadipalli, Revanth; Wang, Tongxi; Shang, Yang; Yu, Hao; Je, Minkyu; Yeo, Kiatseng

    2012-03-01

    The integration of CMOS image sensors and microfluidics is becoming a promising technology for point-of-care (POC) diagnosis. However, commercial image sensors usually have limited speed and low-light sensitivity. A high-speed, high-sensitivity CMOS image sensor chip is introduced in this paper, targeted at high-throughput microfluidic imaging systems. Firstly, the high-speed image sensor architecture is introduced, with a design of a column-parallel single-slope analog-to-digital converter (ADC) with digital correlated double sampling (CDS). A frame rate of 2400 frames/second (fps) can be achieved at a resolution of 128×96 for high-throughput microfluidic imaging. Secondly, the designed system has superior low-light sensitivity, which is achieved by a large pixel size (10 μm×10 μm, 56% fill factor). The pixel peak signal-to-noise ratio (SNR) reaches 50 dB, a 10 dB improvement over a commercial pixel (2.2 μm×2.2 μm). The degradation of pixel resolution is compensated by a super-resolution image processing algorithm. By reconstructing a single image from multiple low-resolution frames, we can equivalently achieve 2 μm resolution with a physical 10 μm pixel. Thirdly, the system-on-chip (SoC) integration results in a real-time controlled intelligent imaging system without expensive data storage and time-consuming computer analysis. This initial sensor prototype with timing control makes it possible to develop a high-throughput lensless microfluidic imaging system for POC diagnosis.

  16. A high-throughput, high-quality plant genomic DNA extraction protocol.

    Science.gov (United States)

    Li, H; Li, J; Cong, X H; Duan, Y B; Li, L; Wei, P C; Lu, X Z; Yang, J B

    2013-10-15

    The isolation of high-quality genomic DNA (gDNA) is a crucial technique in plant molecular biology. The quality of gDNA determines the reliability of real-time polymerase chain reaction (PCR) analysis. In this paper, we reported a high-quality gDNA extraction protocol optimized for real-time PCR in a variety of plant species. Performed in a 96-well block, our protocol provides high throughput. Without the need for phenol-chloroform and liquid nitrogen or dry ice, our protocol is safer and more cost-efficient than traditional DNA extraction methods. The method takes 10 mg leaf tissue to yield 5-10 µg high-quality gDNA. Spectral measurement and electrophoresis were used to demonstrate gDNA purity. The extracted DNA was qualified in a restriction enzyme digestion assay and conventional PCR. The real-time PCR amplification was sufficiently sensitive to detect gDNA at very low concentrations (3 pg/µL). The standard curve of gDNA dilutions from our phenol-chloroform-free protocol showed better linearity (R(2) = 0.9967) than the phenol-chloroform protocol (R(2) = 0.9876). The results indicate that the gDNA was of high quality and fit for real-time PCR. This safe, high-throughput plant gDNA extraction protocol could be used to isolate high-quality gDNA for real-time PCR and other downstream molecular applications.
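
    As an illustration of the linearity check cited above, the usual qPCR standard-curve fit regresses Ct against log10 of the template concentration and reports R² and amplification efficiency. The sketch below uses made-up dilution data, not values from the paper:

    ```python
    import numpy as np

    # Hypothetical Ct values for a 10-fold gDNA dilution series (ng/uL).
    conc = np.array([30, 3, 0.3, 0.03, 0.003])
    ct   = np.array([18.1, 21.5, 24.9, 28.4, 31.8])

    slope, intercept = np.polyfit(np.log10(conc), ct, 1)
    pred = slope * np.log10(conc) + intercept
    r2 = 1 - np.sum((ct - pred) ** 2) / np.sum((ct - ct.mean()) ** 2)
    efficiency = 10 ** (-1 / slope) - 1     # ~1.0 means 100% amplification efficiency

    print(f"slope={slope:.3f}  R^2={r2:.4f}  efficiency={efficiency:.2%}")
    ```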

  17. 3D nanochannel electroporation for high-throughput cell transfection with high uniformity and dosage control

    Science.gov (United States)

    Chang, Lingqian; Bertani, Paul; Gallego-Perez, Daniel; Yang, Zhaogang; Chen, Feng; Chiang, Chiling; Malkoc, Veysi; Kuang, Tairong; Gao, Keliang; Lee, L. James; Lu, Wu

    2015-12-01

    Of great interest to modern medicine and biomedical research is the ability to inject individual target cells with the desired genes or drug molecules. Some advances in cell electroporation allow for high throughput, high cell viability, or excellent dosage control, yet no platform is available for the combination of all three. In an effort to solve this problem, here we show a "3D nano-channel electroporation (NEP) chip" on a silicon platform designed to meet these three criteria. This NEP chip can simultaneously deliver the desired molecules into 40 000 cells per cm2 on the top surface of the device. Each 650 nm pore aligns to a cell and can be used to deliver extremely small biological elements to very large plasmids (>10 kbp). When compared to conventional bulk electroporation (BEP), the NEP chip shows a 20 fold improvement in dosage control and uniformity, while still maintaining high cell viability (>90%) even in cells such as cardiac cells which are characteristically difficult to transfect. This high-throughput 3D NEP system provides an innovative and medically valuable platform with uniform and reliable cellular transfection, allowing for a steady supply of healthy, engineered cells.

  18. Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing.

    Science.gov (United States)

    Shafer, Aaron B A; Northrup, Joseph M; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B W

    2016-01-01

    Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations.

  19. Accelerator mass spectrometry targets of submilligram carbonaceous samples using the high-throughput Zn reduction method.

    Science.gov (United States)

    Kim, Seung-Hyun; Kelly, Peter B; Clifford, Andrew J

    2009-07-15

    The high-throughput Zn reduction method was developed and optimized for various biological/biomedical accelerator mass spectrometry (AMS) applications of mg of C size samples. However, high levels of background carbon from the high-throughput Zn reduction method were not suitable for sub-mg of C size samples in environmental, geochronology, and biological/biomedical AMS applications. This study investigated the effect of background carbon mass (mc) and background 14C level (Fc) from the high-throughput Zn reduction method. Background mc was 0.011 mg of C and background Fc was 1.5445. Background subtraction, two-component mixing, and expanded formulas were used for background correction. All three formulas accurately corrected for backgrounds to 0.025 mg of C in the aerosol standard (NIST SRM 1648a). Only the background subtraction and the two-component mixing formulas accurately corrected for backgrounds to 0.1 mg of C in the IAEA-C6 and -C7 standards. After the background corrections, our high-throughput Zn reduction method was suitable for biological (diet)/biomedical (drug) and environmental (fine particulate matter) applications of sub-mg of C samples (≥ 0.1 mg of C) in keeping with a balance between throughput (270 samples/day/analyst) and sensitivity/accuracy/precision of AMS measurement. The development of a high-throughput method for examination of ≥ 0.1 mg of C size samples opens up a range of applications for 14C AMS studies. While other methods do exist for ≥ 0.1 mg of C size samples, the low throughput has made them cost prohibitive for many applications.
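
    A minimal sketch of the simplest of the three corrections, mass-balance background subtraction, is given below using the background values quoted in the abstract; the paper's expanded formula and error propagation are more involved, so treat this only as an illustration of the idea:

    ```python
    def background_subtract(F_measured, m_total, F_bkg=1.5445, m_bkg=0.011):
        """Mass-balance background subtraction for a small AMS target.

        F_* are fraction-modern (14C) values, m_* are carbon masses in mg.
        m_total is the total carbon in the target (sample plus background).
        Defaults are the background values reported in the abstract; the exact
        form of the paper's formulas may differ from this sketch.
        """
        m_sample = m_total - m_bkg
        if m_sample <= 0:
            raise ValueError("sample mass must exceed the background carbon mass")
        return (F_measured * m_total - F_bkg * m_bkg) / m_sample

    # Example: a nominal 0.1 mg C target measured at fraction modern 1.10.
    print(background_subtract(F_measured=1.10, m_total=0.100))
    ```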

  20. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchanne...

  1. High-throughput atomic force microscopes operating in parallel

    Science.gov (United States)

    Sadeghian, Hamed; Herfst, Rodolf; Dekker, Bert; Winters, Jasper; Bijnagte, Tom; Rijnbeek, Ramon

    2017-03-01

    Atomic force microscopy (AFM) is an essential nanoinstrument technique for several applications such as cell biology and nanoelectronics metrology and inspection. The need for statistically significant sample sizes means that data collection can be an extremely lengthy process in AFM. A single AFM instrument is very slow and not suitable for scanning large areas, resulting in very low measurement throughput. We address this challenge by parallelizing AFM instruments. The parallelization is achieved by miniaturizing the AFM instrument and operating many of them simultaneously. This instrument has the advantages that each miniaturized AFM can be operated independently and that the advances in the field of AFM, both in terms of speed and imaging modalities, can be implemented more easily. Moreover, a parallel AFM instrument also allows one to measure several physical parameters simultaneously; while one instrument measures nano-scale topography, another instrument can measure mechanical, electrical, or thermal properties, making it a lab-on-an-instrument. In this paper, a proof of principle of such a parallel AFM instrument has been demonstrated by analyzing the topography of large samples such as semiconductor wafers. This nanoinstrument provides new research opportunities in the nanometrology of wafers and nanolithography masks by enabling real die-to-die and wafer-level measurements and in cell biology by measuring the nano-scale properties of a large number of cells.

  2. Chemically modified solid state nanopores for high throughput nanoparticle separation

    Energy Technology Data Exchange (ETDEWEB)

    Prabhu, Anmiv S; Kim, Min Jun [School of Biomedical Engineering and Health Science, Drexel University, Philadelphia, PA 19104 (United States); Jubery, Talukder Zaki N; Dutta, Prashanta [School of Mechanical and Materials Engineering, Washington State University, Pullman, WA 99164 (United States); Freedman, Kevin J; Mulero, Rafael, E-mail: mkim@coe.drexel.ed [Department of Mechanical Engineering and Mechanics, Drexel University, Philadelphia, PA 19104 (United States)

    2010-11-17

    The separation of biomolecules and other nanoparticles is a vital step in several analytical and diagnostic techniques. Towards this end we present a solid state nanopore-based set-up as an efficient separation platform. The translocation of charged particles through a nanopore was first modeled mathematically using the multi-ion model and the surface charge density of the nanopore membrane was identified as a critical parameter that determines the selectivity of the membrane and the throughput of the separation process. Drawing from these simulations a single 150 nm pore was fabricated in a 50 nm thick free-standing silicon nitride membrane by focused-ion-beam milling and was chemically modified with (3-aminopropyl)triethoxysilane to change its surface charge density. This chemically modified membrane was then used to separate 22 and 58 nm polystyrene nanoparticles in solution. Once optimized, this approach can readily be scaled up to nanopore arrays which would function as a key component of next-generation nanosieving systems.

  3. Software Switching for High Throughput Data Acquisition Networks

    CERN Document Server

    Jereczek, Grzegorz; Lehmann Miotto, Giovanna

    The bursty many-to-one communication pattern, typical for data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. The problem arising from this pattern is widely known in the literature as "incast" and can be observed as TCP throughput collapse. It is a result of overloading the switch buffers, when a specific node in a network requests data from multiple sources. This will become even more demanding for future upgrades of the experiments at the Large Hadron Collider at CERN. It is questionable whether commodity TCP/IP and Ethernet technologies in their current form will be still able to effectively adapt to bursty traffic without losing packets due to the scarcity of buffers in the networking hardware. This thesis provides an analysis of TCP/IP performance in data acquisition networks and presents a novel approach to incast congestion in these networks based on software-based packet forwarding. Our first contribution lies in confirming the strong analogies bet...

  4. High throughput chromatography strategies for potential use in the formal process characterization of a monoclonal antibody.

    Science.gov (United States)

    Petroff, Matthew G; Bao, Haiying; Welsh, John P; van Beuningen-de Vaan, Miranda; Pollard, Jennifer M; Roush, David J; Kandula, Sunitha; Machielsen, Peter; Tugcu, Nihal; Linden, Thomas O

    2016-06-01

    High throughput experimental strategies are central to the rapid optimization of biologics purification processes. In this work, we extend common high throughput technologies towards the characterization of a multi-column chromatography process for a monoclonal antibody (mAb). Scale-down strategies were first evaluated by comparing breakthrough, retention, and performance (yields and clearance of aggregates and host cell protein) across miniature and lab scale columns. The process operating space was then evaluated using several integrated formats, with batch experimentation to define process testing ranges, miniature columns to evaluate the operating space, and comparison to traditional scale columns to establish scale-up correlations and verify the determined operating space. When compared to an independent characterization study at traditional lab column scale, the high throughput approach identified the same control parameters and similar process sensitivity. Importantly, the high throughput approach significantly decreased time and material needs while improving prediction robustness. Miniature columns and manufacturing scale centerpoint data comparisons support the validity of this approach, making the high throughput strategy an attractive and appropriate scale-down tool for the formal characterization of biotherapeutic processes in the future if regulatory acceptance of the miniature column data can be achieved. Biotechnol. Bioeng. 2016;113: 1273-1283. © 2015 Wiley Periodicals, Inc.

  5. High-throughput Phenotyping and Genomic Selection: The Frontiers of Crop Breeding Converge

    Institute of Scientific and Technical Information of China (English)

    Llorenc Cabrera-Bosquet; José Crossa; Jarislav von Zitzewitz; Maria Dolors Serret; José Luis Araus

    2012-01-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach enabling breeders to use genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield.

  6. High-throughput, high-fidelity HLA genotyping with deep sequencing.

    Science.gov (United States)

    Wang, Chunlin; Krishnakumar, Sujatha; Wilhelmy, Julie; Babrzadeh, Farbod; Stepanyan, Lilit; Su, Laura F; Levinson, Douglas; Fernandez-Viña, Marcelo A; Davis, Ronald W; Davis, Mark M; Mindrinos, Michael

    2012-05-29

    Human leukocyte antigen (HLA) genes are the most polymorphic in the human genome. They play a pivotal role in the immune response and have been implicated in numerous human pathologies, especially autoimmunity and infectious diseases. Despite their importance, however, they are rarely characterized comprehensively because of the prohibitive cost of standard technologies and the technical challenges of accurately discriminating between these highly related genes and their many alleles. Here we demonstrate a high-resolution and cost-effective methodology to type HLA genes by sequencing, which combines the advantage of long-range amplification, the power of high-throughput sequencing platforms, and a unique genotyping algorithm. We calibrated our method for HLA-A, -B, -C, and -DRB1 genes with both reference cell lines and clinical samples and identified several previously undescribed alleles with mismatches, insertions, and deletions. We have further demonstrated the utility of this method in a clinical setting by typing five clinical samples in an Illumina MiSeq instrument with a 5-d turnaround. Overall, this technology has the capacity to deliver low-cost, high-throughput, and accurate HLA typing by multiplexing thousands of samples in a single sequencing run, which will enable comprehensive disease-association studies with large cohorts. Furthermore, this approach can also be extended to include other polymorphic genes.

  7. 3D nanochannel electroporation for high-throughput cell transfection with high uniformity and dosage control.

    Science.gov (United States)

    Chang, Lingqian; Bertani, Paul; Gallego-Perez, Daniel; Yang, Zhaogang; Chen, Feng; Chiang, Chiling; Malkoc, Veysi; Kuang, Tairong; Gao, Keliang; Lee, L James; Lu, Wu

    2016-01-01

    Of great interest to modern medicine and biomedical research is the ability to inject individual target cells with the desired genes or drug molecules. Some advances in cell electroporation allow for high throughput, high cell viability, or excellent dosage control, yet no platform is available for the combination of all three. In an effort to solve this problem, here we show a "3D nano-channel electroporation (NEP) chip" on a silicon platform designed to meet these three criteria. This NEP chip can simultaneously deliver the desired molecules into 40,000 cells per cm(2) on the top surface of the device. Each 650 nm pore aligns to a cell and can be used to deliver extremely small biological elements to very large plasmids (>10 kbp). When compared to conventional bulk electroporation (BEP), the NEP chip shows a 20 fold improvement in dosage control and uniformity, while still maintaining high cell viability (>90%) even in cells such as cardiac cells which are characteristically difficult to transfect. This high-throughput 3D NEP system provides an innovative and medically valuable platform with uniform and reliable cellular transfection, allowing for a steady supply of healthy, engineered cells.

  8. Recent Progress Using High-throughput Sequencing Technologies in Plant Molecular Breeding

    Institute of Scientific and Technical Information of China (English)

    Qiang Gao; Guidong Yue; Wenqi Li; Junyi Wang; Jiaohui Xu; Ye Yin

    2012-01-01

    High-throughput sequencing is a revolutionary technological innovation in DNA sequencing. This technology has an ultra-low cost per base of sequencing and an overwhelmingly high data output. High-throughput sequencing has brought novel research methods and solutions to the research fields of genomics and post-genomics. Furthermore, this technology is leading to a new molecular breeding revolution that has landmark significance for scientific research and enables us to launch multi-level, multifaceted, and multi-extent studies in the fields of crop genetics, genomics, and crop breeding. In this paper, we review progress in the application of high-throughput sequencing technologies to plant molecular breeding studies.

  9. Complementing high-throughput X-ray powder diffraction data with quantum-chemical calculations

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; van de Streek, Jacco; Rantanen, Jukka

    2012-01-01

    High-throughput crystallisation and characterisation platforms provide an efficient means to carry out solid-form screening during the pre-formulation phase. Determining the crystal structures of newly identified solid phases, however, usually requires independent crystallisation trials to produce single crystals or bulk samples of sufficient quantity to carry out high-quality X-ray diffraction measurements. This process could be made more efficient by a robust procedure for crystal structure determination directly from high-throughput X-ray powder diffraction (XRPD) data. Quantum-chemical calculations based on dispersion-corrected density functional theory (DFT-D) have now become feasible for typical small organic molecules used as active pharmaceutical ingredients. We demonstrate how these calculations can be applied to complement high-throughput XRPD data by determining the crystal structure.

  10. High-throughput imaging: Focusing in on drug discovery in 3D.

    Science.gov (United States)

    Li, Linfeng; Zhou, Qiong; Voss, Ty C; Quick, Kevin L; LaBarbera, Daniel V

    2016-03-01

    3D organotypic culture models such as organoids and multicellular tumor spheroids (MCTS) are becoming more widely used for drug discovery and toxicology screening. As a result, 3D culture technologies adapted for high-throughput screening formats are prevalent. While a multitude of assays have been reported and validated for high-throughput imaging (HTI) and high-content screening (HCS) for novel drug discovery and toxicology, limited HTI/HCS with large compound libraries have been reported. Nonetheless, 3D HTI instrumentation technology is advancing and this technology is now on the verge of allowing for 3D HCS of thousands of samples. This review focuses on the state-of-the-art high-throughput imaging systems, including hardware and software, and recent literature examples of 3D organotypic culture models employing this technology for drug discovery and toxicology screening.

  11. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Background Protein-protein interaction data used in the creation or prediction of molecular networks is usually obtained from large-scale or high-throughput experiments. This experimental data is liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the number of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68%, respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that include sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
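
    The scoring scheme amounts to multiplying per-feature likelihood ratios under a naive-Bayes assumption and comparing the resulting posterior odds with a cutoff. The sketch below uses placeholder likelihood ratios and a hypothetical prior, not the values estimated in the study:

    ```python
    from math import prod

    # Placeholder likelihood ratios P(feature | true PPI) / P(feature | false PPI);
    # the actual values in the study are estimated from gold-standard sets.
    FEATURE_LR = {"pfam_domain_pair": 8.0, "shared_go_term": 5.0, "homologous_interaction": 12.0}

    def interaction_likelihood_ratio(features_present):
        """Naive-Bayes combination: multiply the LRs of the observed features."""
        return prod(FEATURE_LR[f] for f in features_present)

    def is_reliable(features_present, prior_odds=1 / 100, cutoff_odds=1.0):
        """Posterior odds = prior odds x combined LR; flag as reliable above a cutoff."""
        return prior_odds * interaction_likelihood_ratio(features_present) >= cutoff_odds

    print(is_reliable({"pfam_domain_pair", "shared_go_term"}))        # two features: below cutoff
    print(is_reliable({"pfam_domain_pair", "shared_go_term", "homologous_interaction"}))  # passes
    ```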

  12. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, has attracted new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  13. High-precision, high-throughput stability determinations facilitated by robotics and a semiautomated titrating fluorometer.

    Science.gov (United States)

    Edgell, Marshall Hall; Sims, Dorothy A; Pielak, Gary J; Yi, Fang

    2003-06-24

    The use of statistical modeling to test hypotheses concerning the determinants of protein structure requires stability data (e.g., the free energy of denaturation in H2O, ΔG(HOH)) from hundreds of protein mutants. Fluorescence-monitored chemical denaturation provides a convenient method for high-precision, high-throughput ΔG(HOH) determination. For eglin c we find that a throughput of about 20 min per protein can be attained in a two-channel semiautomated titrating fluorometer. We find also that the use of robotics for protein purification and preparation of the solutions for chemical denaturation gives highly precise ΔG(HOH) values in which the standard deviation of values from multiple preparations (±0.051 kcal/mol) differs very little from multiple measurements from a single preparation (±0.040 kcal/mol). Since the variance introduced into model fitting by ΔG(HOH) increases as the square of measurement error, there is a premium on precision. In fact, the fraction of stability behavior explicable by otherwise perfect models goes from 98% to only 50% over the error range commonly reported for chemical denaturation measurements (0.1-0.6 kcal/mol). We have found that the precision of chemical denaturation ΔG(HOH) measurements depends most heavily on the precision of the instrument used, followed by protein purity and the capacity to precisely prepare the solutions used for titrations.
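
    The quoted drop from 98% to roughly 50% explainable behavior follows from the usual attenuation relation R²max = σ²true / (σ²true + σ²err), assuming (our assumption, not stated in the abstract) that the true mutant stabilities spread with a standard deviation of about 0.7 kcal/mol; the sketch reproduces the trend, not the paper's exact numbers:

    ```python
    def max_explained_fraction(sigma_true, sigma_err):
        """Upper bound on the variance a perfect model can explain when the
        measured values carry independent error of standard deviation sigma_err."""
        return sigma_true**2 / (sigma_true**2 + sigma_err**2)

    sigma_true = 0.7   # assumed spread of true mutant stabilities, kcal/mol
    for sigma_err in (0.04, 0.1, 0.3, 0.6):
        print(f"error {sigma_err:.2f} kcal/mol -> at most "
              f"{max_explained_fraction(sigma_true, sigma_err):.0%} explainable")
    ```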

  14. A novel imaging-based high-throughput screening approach to anti-angiogenic drug discovery.

    Science.gov (United States)

    Evensen, Lasse; Micklem, David R; Link, Wolfgang; Lorens, James B

    2010-01-01

    The successful progression to the clinic of angiogenesis inhibitors for cancer treatment has spurred interest in developing new classes of anti-angiogenic compounds. The resulting surge in available candidate therapeutics highlights the need for robust, high-throughput angiogenesis screening systems that adequately capture the complexity of new vessel formation while providing quantitative evaluation of the potency of these agents. Available in vitro angiogenesis assays are either cumbersome, impeding adaptation to high-throughput screening formats, or inadequately model the complex multistep process of new vessel formation. We therefore developed an organotypic endothelial-mural cell co-culture assay system that reflects several facets of angiogenesis while remaining compatible with high-throughput/high-content image screening. Co-culture of primary human endothelial cells (EC) and vascular smooth muscle cells (vSMC) results in assembly of a network of tubular endothelial structures enveloped with vascular basement membrane proteins, thus, comprising the three main components of blood vessels. Initially, EC are dependent on vSMC-derived VEGF and sensitive to clinical anti-angiogenic therapeutics. A subsequent phenotypic VEGF-switch renders EC networks resistant to anti-VEGF therapeutics, demarcating a mature vascular phenotype. Conversely, mature EC networks remain sensitive to vascular disrupting agents. Therefore, candidate anti-angiogenic compounds can be interrogated for their relative potency on immature and mature networks and classified as either vascular normalizing or vascular disrupting agents. Here, we demonstrate that the EC-vSMC co-culture assay represents a robust high-content imaging high-throughput screening system for identification of novel anti-angiogenic agents. A pilot high-throughput screening campaign was used to define informative imaging parameters and develop a follow-up dose-response scheme for hit characterization. High-throughput

  15. High throughput route selection in multi-rate wireless mesh networks

    Institute of Scientific and Technical Information of China (English)

    WEI Yi-fei; GUO Xiang-li; SONG Mei; SONG Jun-de

    2008-01-01

    Most existing Ad-hoc routing protocols use the shortest path algorithm with a hop count metric to select paths. It is appropriate in single-rate wireless networks, but has a tendency to select paths containing long-distance links that have low data rates and reduced reliability in multi-rate networks. This article introduces a high throughput routing algorithm utilizing the multi-rate capability and some mesh characteristics in wireless fidelity (WiFi) mesh networks. It uses the medium access control (MAC) transmission time as the routing metric, which is estimated by the information passed up from the physical layer. When the proposed algorithm is adopted, the Ad-hoc on-demand distance vector (AODV) routing can be improved as high throughput AODV (HT-AODV). Simulation results show that HT-AODV is capable of establishing a route that has high data-rate, short end-to-end delay and great network throughput.
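
    A minimal sketch of route selection under a transmission-time metric is shown below; it uses packet size divided by link rate as a stand-in for the MAC-layer time estimate that HT-AODV derives from physical-layer information, and Dijkstra in place of the on-demand AODV route discovery:

    ```python
    import heapq

    def best_route(links, src, dst, packet_bits=8 * 1500):
        """Dijkstra over per-link transmission time (packet size / PHY rate).
        `links` maps node -> list of (neighbour, rate_bps)."""
        dist = {src: 0.0}
        prev = {}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == dst:
                break
            if d > dist.get(u, float("inf")):
                continue
            for v, rate in links.get(u, []):
                nd = d + packet_bits / rate
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path, node = [], dst
        while node != src:
            path.append(node)
            node = prev[node]
        return [src] + path[::-1], dist[dst]

    # Two hops at 54 Mbit/s beat one long hop at 6 Mbit/s, unlike a hop-count metric.
    links = {"A": [("B", 54e6), ("C", 6e6)], "B": [("C", 54e6)]}
    print(best_route(links, "A", "C"))
    ```

    With a hop-count metric the single 6 Mbit/s link would win despite being several times slower end to end, which is exactly the behaviour the transmission-time metric is meant to avoid.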

  16. High-throughput parallel SPM for metrology, defect, and mask inspection

    Science.gov (United States)

    Sadeghian, H.; Herfst, R. W.; van den Dool, T. C.; Crowcombe, W. E.; Winters, J.; Kramer, G. F. I. J.

    2014-10-01

    Scanning probe microscopy (SPM) is a promising candidate for accurate assessment of metrology and defects on wafers and masks, however it has traditionally been too slow for high-throughput applications, although recent developments have significantly pushed the speed of SPM [1,2]. In this paper we present new results obtained with our previously presented high-throughput parallel SPM system [3,4] that showcase two key advances that are required for a successful deployment of SPM in high-throughput metrology, defect and mask inspection. The first is a very fast (up to 40 lines/s) image acquisition and a comparison of the image quality as function of speed. Secondly, a fast approach method: measurements of the scan-head approaching the sample from 0.2 and 1.0 mm distance in under 1.4 and 6 seconds respectively.

  17. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    Science.gov (United States)

    Green, M. L.; Choi, C. L.; Hattrick-Simpers, J. R.; Joshi, A. M.; Takeuchi, I.; Barron, S. C.; Campo, E.; Chiang, T.; Empedocles, S.; Gregoire, J. M.; Kusne, A. G.; Martin, J.; Mehta, A.; Persson, K.; Trautt, Z.; Van Duren, J.; Zakutayev, A.

    2017-03-01

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. A major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  18. Current impact and future directions of high throughput sequencing in plant virus diagnostics.

    Science.gov (United States)

    Massart, Sebastien; Olmos, Antonio; Jijakli, Haissam; Candresse, Thierry

    2014-08-08

    The ability to provide a fast, inexpensive and reliable diagnostic for any given viral infection is a key parameter in efforts to fight and control these ubiquitous pathogens. The recent developments of high-throughput sequencing (also called Next Generation Sequencing - NGS) technologies and bioinformatics have drastically changed research on viral pathogens and are now attracting growing interest for virus diagnostics. This review provides a snapshot of the current use and impact of high-throughput sequencing approaches in plant virus characterization. More specifically, this review highlights the potential of these new technologies and their interplay with current protocols in the future of molecular diagnostics of plant viruses. The current limitations that will need to be addressed for a wider adoption of high-throughput sequencing in plant virus diagnostics are thoroughly discussed.

  19. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    Science.gov (United States)

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  20. Microfluidics for cell-based high throughput screening platforms - A review.

    Science.gov (United States)

    Du, Guansheng; Fang, Qun; den Toonder, Jaap M J

    2016-01-15

    In the last decades, the basic techniques of microfluidics for the study of cells such as cell culture, cell separation, and cell lysis, have been well developed. Based on cell handling techniques, microfluidics has been widely applied in the field of PCR (Polymerase Chain Reaction), immunoassays, organ-on-chip, stem cell research, and analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics in cell-based high throughput screening. The screening methods mentioned in this paper include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidic based high throughput screening platform for drug discovery.

  1. Implementation of remote monitoring and diffraction evaluation systems at the Photon Factory macromolecular crystallography beamlines

    Science.gov (United States)

    Yamada, Yusuke; Honda, Nobuo; Matsugaki, Naohiro; Igarashi, Noriyuki; Hiraki, Masahiko; Wakatsuki, Soichi

    2008-01-01

    Owing to recent advances in high-throughput technology in macromolecular crystallography beamlines, such as high-brilliance X-ray sources, high-speed readout detectors and robotics, the number of samples that can be examined in a single visit to the beamline has increased dramatically. In order to make these experiments more efficient, two functions, remote monitoring and diffraction image evaluation, have been implemented in the macromolecular crystallography beamlines at the Photon Factory (PF). Remote monitoring allows scientists to participate in the experiment by watching from their laboratories, without having to come to the beamline. Diffraction image evaluation makes experiments easier, especially when using the sample exchange robot. To implement these two functions, two independent clients have been developed that work specifically for remote monitoring and diffraction image evaluation. In the macromolecular crystallography beamlines at PF, beamline control is performed using STARS (simple transmission and retrieval system). The system adopts a client–server style in which client programs communicate with each other through a server process using the STARS protocol. This is an advantage when extending the system: implementing these new functions required few modifications of the existing system. PMID:18421163

  2. Fixed target combined with spectral mapping: approaching 100% hit rates for serial crystallography.

    Science.gov (United States)

    Oghbaey, Saeed; Sarracini, Antoine; Ginn, Helen M; Pare-Labrosse, Olivier; Kuo, Anling; Marx, Alexander; Epp, Sascha W; Sherrell, Darren A; Eger, Bryan T; Zhong, Yinpeng; Loch, Rolf; Mariani, Valerio; Alonso-Mori, Roberto; Nelson, Silke; Lemke, Henrik T; Owen, Robin L; Pearson, Arwen R; Stuart, David I; Ernst, Oliver P; Mueller-Werkmeister, Henrike M; Miller, R J Dwayne

    2016-08-01

    The advent of ultrafast highly brilliant coherent X-ray free-electron laser sources has driven the development of novel structure-determination approaches for proteins, and promises visualization of protein dynamics on sub-picosecond timescales with full atomic resolution. Significant efforts are being applied to the development of sample-delivery systems that allow these unique sources to be most efficiently exploited for high-throughput serial femtosecond crystallography. Here, the next iteration of a fixed-target crystallography chip designed for rapid and reliable delivery of up to 11 259 protein crystals with high spatial precision is presented. An experimental scheme for predetermining the positions of crystals in the chip by means of in situ spectroscopy using a fiducial system for rapid, precise alignment and registration of the crystal positions is presented. This delivers unprecedented performance in serial crystallography experiments at room temperature under atmospheric pressure, giving a raw hit rate approaching 100% with an effective indexing rate of approximately 50%, increasing the efficiency of beam usage and allowing the method to be applied to systems where the number of crystals is limited.

  3. Applications of high-throughput clonogenic survival assays in high-LET particle microbeams

    Directory of Open Access Journals (Sweden)

    Antonios eGeorgantzoglou

    2016-01-01

    Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-LET particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation, where high levels of cell killing lead to a very low probability of maintaining cells’ clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron-field-size high-LET microbeam. Post-irradiation, time-lapse imaging follows the cells’ response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on a cell’s capacity to divide at least 4-5 times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.
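
    The frame-to-frame linking step described above can be sketched as a nearest-position assignment between detected centroids. The snippet below uses a globally optimal (Hungarian) assignment with a maximum-jump gate, which is one reasonable reading of "finding their nearest position", not necessarily the authors' exact implementation:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def link_frames(points_prev, points_next, max_jump=15.0):
        """Link detected cell centroids between consecutive frames.

        points_* are (N, 2) arrays of x-y positions. Returns a list of
        (index_prev, index_next) pairs; cells that move farther than
        `max_jump` pixels are left unlinked (lost, dead, or newly appeared).
        """
        d = np.linalg.norm(points_prev[:, None, :] - points_next[None, :, :], axis=2)
        rows, cols = linear_sum_assignment(d)      # globally optimal nearest matching
        return [(r, c) for r, c in zip(rows, cols) if d[r, c] <= max_jump]

    prev_frame = np.array([[10.0, 12.0], [40.0, 41.0], [80.0, 15.0]])
    next_frame = np.array([[11.5, 13.0], [42.0, 39.5], [200.0, 200.0]])
    print(link_frames(prev_frame, next_frame))     # [(0, 0), (1, 1)]; third cell unlinked
    ```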

  4. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Here we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  5. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  6. A Warm Near-Infrared High-Resolution Spectrograph with Very High Throughput (WINERED)

    CERN Document Server

    Kondo, Sohei; Kobayashi, Naoto; Yasui, Chikako; Mito, Hiroyuki; Fukue, Kei; Nakanishi, Kenshi; Kawanishi, Takafumi; Nakaoka, Tetsuya; Otsubo, Shogo; Kinoshita, Masaomi; Kitano, Ayaka; Hamano, Satoshi; Mizumoto, Misaki; Yamamoto, Ryo; Izumi, Natsuko; Matsunaga, Noriyuki; Kawakita, Hideyo

    2015-01-01

    WINERED is a newly built high-efficiency (throughput > 25-30%) and high-resolution spectrograph customized for the short NIR bands at 0.9-1.35 μm. WINERED is equipped with ambient-temperature optics and a cryogenic camera using a 1.7 μm cut-off HgCdTe HAWAII-2RG array detector. WINERED has two grating modes: one with a conventional reflective echelle grating (R~28,300), which covers 0.9-1.35 μm simultaneously, the other with a ZnSe or ZnS immersion grating (R~100,000). We have completed the development of WINERED except for the immersion grating, and started engineering and science observations at the Nasmyth platform of the 1.3 m Araki Telescope at Koyama Astronomical Observatory of Kyoto-Sangyo University in Japan. We confirmed that the spectral resolution (R~28,300) and the throughput (>40% without telescope/atmosphere/array QE) meet our specifications. We measured ambient thermal backgrounds (e.g., 0.06 e-/sec/pixel at 287 K), which are roughly consistent ...

  7. Continuous-flow high pressure hydrogenation reactor for optimization and high-throughput synthesis.

    Science.gov (United States)

    Jones, Richard V; Godorhazy, Lajos; Varga, Norbert; Szalay, Daniel; Urge, Laszlo; Darvas, Ferenc

    2006-01-01

    This paper reports on a novel continuous-flow hydrogenation reactor and its integration with a liquid handler to generate a fully automated high-throughput hydrogenation system for library synthesis. The reactor, named the H-Cube, combines endogenous hydrogen generation from the electrolysis of water with a continuous flow-through system. The system makes significant advances over current batch hydrogenation reactors in terms of safety, reaction validation efficiency, and rates of reaction. The hydrogenation process is described along with a detailed description of the device's main parts. The reduction of a series of functional groups, varying in difficulty up to 70 degrees C and 70 bar are also described. The paper concludes with the integration of the device into an automated liquid handler followed by the reduction of a nitro compound in a high throughput manner. The system is fully automated and can conduct 5 reactions in the time it takes to perform and workup one reaction manually on a standard batch reactor.

  8. Protocol: A high-throughput DNA extraction system suitable for conifers

    Directory of Open Access Journals (Sweden)

    Rajora Om P

    2008-08-01

    Background High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. Results We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve a change to the buffer system and improving the protocol so that it almost doubles the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP) and another for high-throughput (HTP) DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce and a pine species. Conclusion A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost-effective solution for processing large numbers of conifer samples.

  9. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide...... the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. Availability and implementation: We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template...
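
    As a rough illustration of the kind of feature-based prediction this record describes, the sketch below computes the three named sequence properties (aromaticity, hydropathy, isoelectric point) with Biopython and fits a toy scikit-learn classifier; it is not the published SBRG/Protein_ML workflow, and the sequences and labels are hypothetical placeholders.

        from Bio.SeqUtils.ProtParam import ProteinAnalysis
        from sklearn.linear_model import LogisticRegression

        def sequence_features(seq):
            """Aromaticity, GRAVY hydropathy and isoelectric point of a protein sequence."""
            analysis = ProteinAnalysis(seq)
            return [analysis.aromaticity(), analysis.gravy(), analysis.isoelectric_point()]

        # Hypothetical training data: 1 = expressed well, 0 = failed to express.
        train_seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                      "MFFFWWYYHHPPGGAAVVLLIIKKRR"]
        train_labels = [1, 0]

        model = LogisticRegression().fit([sequence_features(s) for s in train_seqs], train_labels)
        print(model.predict([sequence_features("MSTNPKPQRKTKRNTNRRPQDVKFPGG")]))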

  10. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    DEFF Research Database (Denmark)

    Csiszar, Susan A.; Ernstoff, Alexi; Fantke, Peter;

    2016-01-01

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or population per mass...... intakes were associated with body lotion. Bioactive doses derived from high-throughput in vitro toxicity data were combined with the estimated PiFs to demonstrate an approach to estimate bioactive equivalent chemical content and to screen chemicals for risk....
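
    A minimal sketch of the product intake fraction (PiF) bookkeeping described here, assuming a simple per-use dose calculation; all numerical values and the daily-dose conversion are hypothetical illustrations rather than values from the study.

        def daily_intake_mg_per_kg(pif, product_mass_g, chem_weight_fraction,
                                   uses_per_day, body_weight_kg=70.0):
            """Estimated intake of a product ingredient in mg per kg body weight per day."""
            chem_mass_mg = product_mass_g * 1000.0 * chem_weight_fraction  # mg chemical applied per use
            return pif * chem_mass_mg * uses_per_day / body_weight_kg

        # Hypothetical example: body lotion, 1% (w/w) ingredient, PiF of 0.05.
        print(daily_intake_mg_per_kg(pif=0.05, product_mass_g=4.0,
                                     chem_weight_fraction=0.01, uses_per_day=1))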

  11. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    Science.gov (United States)

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can extract cohorts efficiently. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  12. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
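
    For orientation, the following sketch shows the basic lifetime-extraction step behind such an assay: fitting a mono-exponential decay to a simulated TCSPC histogram. It is illustrative only and is not the authors' instrument or analysis software.

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, amplitude, tau, background):
            """Mono-exponential fluorescence decay on top of a constant background."""
            return amplitude * np.exp(-t / tau) + background

        t = np.linspace(0.0, 20e-9, 200)                       # TCSPC time bins, 0-20 ns
        expected = decay(t, amplitude=1000.0, tau=3.5e-9, background=5.0)
        counts = np.random.default_rng(0).poisson(expected)    # photon shot noise

        popt, _ = curve_fit(decay, t, counts, p0=(800.0, 2e-9, 1.0))
        print(f"fitted lifetime: {popt[1] * 1e9:.2f} ns")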

  13. High-throughput screening of small-molecule adsorption in MOF

    OpenAIRE

    Canepa, Pieremanuele; Arter, Calvin A.; Conwill, Eliot M.; Johnson, Daniel H.; Shoemaker, Brian A.; Soliman, Karim Z.; Thonhauser, T.

    2013-01-01

    Using high-throughput screening coupled with state-of-the-art van der Waals density functional theory, we investigate the adsorption properties of four important molecules, H_2, CO_2, CH_4, and H_2O in MOF-74-M with M = Be, Mg, Al, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr, Zr, Nb, Ru, Rh, Pd, La, W, Os, Ir, and Pt. We show that high-throughput techniques can aid in speeding up the development and refinement of effective materials for hydrogen storage, carbon capture, and gas separation. ...
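
    The workflow skeleton below illustrates the bookkeeping behind such a screen, looping over metals and molecules and applying the standard adsorption-energy definition; the total_energy() function is a hypothetical stand-in for a call into a van der Waals density functional (or any other electronic-structure) code, and only part of the record's metal list is shown.

        METALS = ["Mg", "Ca", "Sc", "Ti", "V", "Cr", "Mn", "Fe", "Co", "Ni", "Cu", "Zn"]
        MOLECULES = ["H2", "CO2", "CH4", "H2O"]

        def total_energy(system):
            """Hypothetical stand-in for a DFT total-energy call (eV); replace with a real calculator."""
            return 0.0

        def adsorption_energy(metal, molecule):
            """E_ads = E(MOF + molecule) - E(MOF) - E(molecule); negative values indicate binding."""
            return (total_energy(f"MOF-74-{metal}+{molecule}")
                    - total_energy(f"MOF-74-{metal}")
                    - total_energy(molecule))

        for metal in METALS:
            for molecule in MOLECULES:
                print(f"MOF-74-{metal} + {molecule}: E_ads = {adsorption_energy(metal, molecule):.2f} eV")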

  14. High-throughput analysis of total nitrogen content that replaces the classic Kjeldahl method.

    Science.gov (United States)

    Yasuhara, T; Nokihara, K

    2001-10-01

    A high-throughput method for the determination of total nitrogen content has been developed. The method involves decomposition of samples, followed by trapping and quantitative colorimetric determination of the resulting ammonia. The method is rapid, facile, economical, and environmentally friendly, and its higher efficiency for handling multiple samples allows it to replace the classic Kjeldahl method. Based on this method, a novel reactor was constructed to enable routine high-throughput analysis of multiple samples such as pharmaceutical materials, foods, and excrement.
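
    The quantitative colorimetric step can be illustrated with a simple linear calibration against ammonia standards, as sketched below; the absorbance values are hypothetical and the snippet is not taken from the published method.

        import numpy as np

        standards_ug_n = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # ug nitrogen per vessel (standards)
        absorbance = np.array([0.02, 0.11, 0.21, 0.40, 0.79])      # hypothetical readings at the assay wavelength

        slope, intercept = np.polyfit(standards_ug_n, absorbance, 1)   # linear (Beer-Lambert) calibration

        def nitrogen_from_absorbance(a):
            """Invert the calibration line to estimate ug nitrogen in a sample."""
            return (a - intercept) / slope

        print(f"sample contains {nitrogen_from_absorbance(0.33):.1f} ug N")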

  15. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    BACKGROUND: Unbiased flow cytometry-based methods have become the technique of choice in many laboratories for high-throughput, accurate assessments of malaria parasites in bioassays. A method to quantify live parasites based on mitotracker red CMXRos was recently described but consistent...... distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45...

  16. Accelerating Virtual High-Throughput Ligand Docking: current technology and case study on a petascale supercomputer.

    Science.gov (United States)

    Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome

    2014-04-25

    In this paper we review the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the pre-docking file preparation and post-docking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
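
    The published MPI version of Autodock4 is available at the URL above; purely as an illustration of the task-parallel idea, the sketch below distributes per-ligand docking jobs across MPI ranks with mpi4py. The file names, paths and round-robin scheme are assumptions, not the authors' implementation.

        import glob
        import subprocess

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Hypothetical input layout: one prepared .pdbqt ligand plus a matching .dpf
        # docking parameter file per compound.
        ligands = sorted(glob.glob("ligands/*.pdbqt"))

        # Simple static round-robin: rank r handles ligands r, r+size, r+2*size, ...
        for ligand in ligands[rank::size]:
            dpf = ligand.replace(".pdbqt", ".dpf")
            log = ligand.replace(".pdbqt", ".dlg")
            subprocess.run(["autodock4", "-p", dpf, "-l", log], check=False)

        comm.Barrier()      # wait for every rank before any post-docking analysis
        if rank == 0:
            print(f"dispatched {len(ligands)} docking jobs across {size} ranks")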

  17. Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Y; Grossman, JC

    2014-12-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
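
    A minimal sketch of the screening bookkeeping described here: given ground-state and metastable-isomer energies from ab initio calculations, compute the isomerization enthalpy and a gravimetric energy density. The numbers used are hypothetical (roughly azobenzene-like) and do not come from this study.

        EV_TO_KJ_PER_MOL = 96.485          # 1 eV per molecule is 96.485 kJ/mol

        def stf_metrics(e_ground_ev, e_metastable_ev, molar_mass_g_mol):
            """Return (isomerization enthalpy in kJ/mol, stored energy density in MJ/kg)."""
            delta_h = (e_metastable_ev - e_ground_ev) * EV_TO_KJ_PER_MOL
            return delta_h, delta_h / molar_mass_g_mol     # kJ/g equals MJ/kg

        # Hypothetical azobenzene-like candidate: 0.6 eV isomerization energy, M = 182.2 g/mol.
        print(stf_metrics(e_ground_ev=0.0, e_metastable_ev=0.6, molar_mass_g_mol=182.2))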

  18. GiNA, an efficient and high-throughput software for horticultural phenotyping

    Science.gov (United States)

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...

  19. Rapid and high-throughput detection of highly pathogenic bacteria by Ibis PLEX-ID technology.

    Directory of Open Access Journals (Sweden)

    Daniela Jacob

    Full Text Available In this manuscript, we describe the identification of highly pathogenic bacteria using an assay coupling biothreat group-specific PCR with electrospray ionization mass spectrometry (PCR/ESI-MS run on an Ibis PLEX-ID high-throughput platform. The biothreat cluster assay identifies most of the potential bioterrorism-relevant microorganisms including Bacillus anthracis, Francisella tularensis, Yersinia pestis, Burkholderia mallei and pseudomallei, Brucella species, and Coxiella burnetii. DNA from 45 different reference materials with different formulations and different concentrations were chosen and sent to a service screening laboratory that uses the PCR/ESI-MS platform to provide a microbial identification service. The standard reference materials were produced out of a repository built up in the framework of the EU funded project "Establishment of Quality Assurances for Detection of Highly Pathogenic Bacteria of Potential Bioterrorism Risk" (EQADeBa. All samples were correctly identified at least to the genus level.

  20. High resolution Si(Li) X-ray spectrometer with high throughput rate

    Energy Technology Data Exchange (ETDEWEB)

    Bacso, J.; Kalinka, G.; Kertesz, Zs.; Kovacs, P.; Lakatos, T. (Magyar Tudomanyos Akademia Atommag Kutato Intezete, Debrecen)

    1982-06-01

    The paper describes a modern Si(Li) X-ray spectrometer developed at ATOMKI. The Si(Li) detectors are single-grooved with an active area of 20-50 mm². The Be window is coated with a special protective layer against corrosion. A small getter-ion pump maintains the high vacuum in the cryostat chamber. The preamplifier employs pulsed drain feedback; selected, Teflon-encapsulated field-effect transistors are used in its first stage. The analogue signal processor is direct-coupled and employs time-variant pulse shaping. This construction provides high resolution (150-170 eV), a high throughput rate, excellent stability, effective pile-up elimination, accurate live-time correction and simplicity in application. The live-time correction is performed with a random pulse generator whose average frequency is stabilized; the corresponding peak appears at zero energy in the spectra.
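
    The pulser-based live-time correction mentioned at the end of the record can be sketched as follows; the counts are hypothetical and the snippet only illustrates the conventional reference-pulser bookkeeping, not the ATOMKI electronics.

        def corrected_counts(peak_counts, pulses_injected, pulser_peak_counts):
            """Scale a spectral peak area by the measured pulser survival fraction (dead-time/pile-up loss)."""
            survival = pulser_peak_counts / pulses_injected
            return peak_counts / survival

        # Hypothetical run: 50 Hz pulser over 1000 s, 46500 pulser events recovered in the zero-energy peak.
        print(corrected_counts(peak_counts=12000, pulses_injected=50 * 1000, pulser_peak_counts=46500))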

  1. HT-Paxos: High Throughput State-Machine Replication Protocol for Large Clustered Data Centers

    Directory of Open Access Journals (Sweden)

    Vinit Kumar

    2015-01-01

    Full Text Available Paxos is a prominent protocol for state-machine replication. Recent data-intensive systems that implement state-machine replication generally require high throughput. Earlier variants of Paxos, such as classical Paxos, Fast Paxos, and Generalized Paxos, focus mainly on fault tolerance and latency but fall short in throughput and scalability. A major reason for this is the heavyweight leader; by offloading the leader, the throughput of the system can be increased further. Ring Paxos, Multiring Paxos, and S-Paxos are prominent attempts in this direction for clustered data centers. In this paper, we propose HT-Paxos, a variant of Paxos that is well suited to large clustered data centers. HT-Paxos offloads the leader much more significantly and hence increases the throughput and scalability of the system, while at the same time, among high-throughput state-machine replication protocols, providing reasonably low latency and response time.

  2. Achieving high mass-throughput of therapeutic proteins through parvovirus retentive filters.

    Science.gov (United States)

    Bolton, Glen R; Basha, Jonida; Lacasse, Daniel P

    2010-01-01

    Parvovirus retentive filters that assure removal of viruses and virus-like particles during the production of therapeutic proteins significantly contribute to total manufacturing costs. Operational approaches that can increase throughput and reduce filtration area would result in significant cost savings. A combination of methods was used to achieve high throughputs of an antibody or therapeutic protein solution through three parvovirus retentive filters. These methods included evaluation of diatomaceous earth or size-based prefilters, the addition of additives, and the optimization of protein concentration, temperature, buffer composition, and solution pH. An optimum temperature of 35°C was found for maximizing throughput through the Virosart CPV and Viresolve Pro filters. Mass-throughput values of 7.3, 26.4, and 76.2 kg/m² were achieved through the Asahi Planova 20N, Virosart CPV, and Viresolve Pro filters, respectively, in 4 h of processing. Mass-throughput values of 73, 137, and 192 kg/m² were achieved through a Millipore Viresolve Pro filter in 4.0, 8.8, and 22.1 h of processing, respectively, during a single experiment. However, large-scale parvovirus filtration operations are typically controlled to limit volumetric throughput to below the level achieved during small-scale virus spiking experiments. The virus spike may cause significant filter plugging, limiting throughput. Therefore, newer parvovirus filter spiking strategies should be adopted that may lead to more representative viral clearance data and higher utilization of large-scale filter capacity. Copyright © 2010 American Institute of Chemical Engineers (AIChE).

  3. High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field.

    Science.gov (United States)

    Shakoor, Nadia; Lee, Scott; Mockler, Todd C

    2017-08-01

    Effective implementation of technology that facilitates accurate and high-throughput screening of thousands of field-grown lines is critical for accelerating crop improvement and breeding strategies for higher yield and disease tolerance. Progress in the development of field-based high throughput phenotyping methods has advanced considerably in the last 10 years through technological progress in sensor development and high-performance computing. Here, we review recent advances in high throughput field phenotyping technologies designed to inform the genetics of quantitative traits, including crop yield and disease tolerance. Successful application of phenotyping platforms to advance crop breeding and identify and monitor disease requires: (1) high resolution of imaging and environmental sensors; (2) quality data products that facilitate computer vision, machine learning and GIS; (3) capacity infrastructure for data management and analysis; and (4) automated environmental data collection. Accelerated breeding for agriculturally relevant crop traits is key to the development of improved varieties and is critically dependent on high-resolution, high-throughput field-scale phenotyping technologies that can efficiently discriminate better performing lines within a larger population and across multiple environments. Copyright © 2017. Published by Elsevier Ltd.

  4. Life in the fast lane for protein crystallization and X-ray crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Pusey, Marc L.; Liu, Zhi-Jie; Tempel, Wolfram; Praissman, Jeremy; Lin, Dawei; Wang, Bi-Cheng; Gavira, Jose A.; Ng, Joseph D. (UAH); (NASA); (Georgia)

    2010-07-20

    The common goal for structural genomic centers and consortiums is to decipher as quickly as possible the three-dimensional structures for a multitude of recombinant proteins derived from known genomic sequences. Since X-ray crystallography is the foremost method to acquire atomic resolution for macromolecules, the limiting step is obtaining protein crystals that can be useful for structure determination. High-throughput methods have been developed in recent years to clone, express, purify, crystallize and determine the three-dimensional structure of a protein gene product rapidly using automated devices, commercialized kits and consolidated protocols. However, the average number of protein structures obtained for most structural genomic groups has been very low compared to the total number of proteins purified. As more entire genomic sequences are obtained for different organisms from the three kingdoms of life, only the proteins that can be crystallized and whose structures can be obtained easily are studied. Consequently, an astonishing number of genomic proteins remain unexamined. In the era of high-throughput processes, traditional methods in molecular biology, protein chemistry and crystallization are eclipsed by automation and pipeline practices. The necessity for high-rate production of protein crystals and structures has prevented the usage of more intellectual strategies and creative approaches in experimental executions. Fundamental principles and personal experiences in protein chemistry and crystallization are minimally exploited only to obtain 'low-hanging fruit' protein structures. We review the practical aspects of today's high-throughput manipulations and discuss the challenges in fast-paced protein crystallization and tools for crystallography. Structural genomic pipelines can be improved with information gained from low-throughput tactics that may help us reach the higher-bearing fruits. Examples of recent developments in this area

  5. Life in the fast lane for protein crystallization and X-ray crystallography.

    Science.gov (United States)

    Pusey, Marc L; Liu, Zhi-Jie; Tempel, Wolfram; Praissman, Jeremy; Lin, Dawei; Wang, Bi-Cheng; Gavira, José A; Ng, Joseph D

    2005-07-01

    The common goal for structural genomic centers and consortiums is to decipher as quickly as possible the three-dimensional structures for a multitude of recombinant proteins derived from known genomic sequences. Since X-ray crystallography is the foremost method to acquire atomic resolution for macromolecules, the limiting step is obtaining protein crystals that can be useful for structure determination. High-throughput methods have been developed in recent years to clone, express, purify, crystallize and determine the three-dimensional structure of a protein gene product rapidly using automated devices, commercialized kits and consolidated protocols. However, the average number of protein structures obtained for most structural genomic groups has been very low compared to the total number of proteins purified. As more entire genomic sequences are obtained for different organisms from the three kingdoms of life, only the proteins that can be crystallized and whose structures can be obtained easily are studied. Consequently, an astonishing number of genomic proteins remain unexamined. In the era of high-throughput processes, traditional methods in molecular biology, protein chemistry and crystallization are eclipsed by automation and pipeline practices. The necessity for high-rate production of protein crystals and structures has prevented the usage of more intellectual strategies and creative approaches in experimental executions. Fundamental principles and personal experiences in protein chemistry and crystallization are minimally exploited only to obtain "low-hanging fruit" protein structures. We review the practical aspects of today's high-throughput manipulations and discuss the challenges in fast-paced protein crystallization and tools for crystallography. Structural genomic pipelines can be improved with information gained from low-throughput tactics that may help us reach the higher-bearing fruits. Examples of recent developments in this area are reported from

  7. High throughput screening of physicochemical properties and in vitro ADME profiling in drug discovery.

    Science.gov (United States)

    Wan, Hong; Holmén, Anders G

    2009-03-01

    Recent advances in robotic automated assays combined with highly selective and sensitive LC-MS enable high-speed screening of lead-series libraries in many in vitro assays. In this review, we summarize state-of-the-art high-throughput assays for screening key physicochemical properties such as solubility, lipophilicity, pKa, drug-plasma protein binding and brain tissue binding, as well as in vitro ADME profiling. We discuss two primary approaches for high-throughput solubility screening: an automated 96-well plate assay integrated with LC-MS and a rapid multi-wavelength UV plate reader. We address the advantages of newly developed miniaturized techniques for high-throughput pKa screening by capillary electrophoresis combined with mass spectrometry (CE-MS) with an automated data analysis workflow. Several lipophilicity approaches other than octanol-water partitioning are critically reviewed, including a rapid liquid chromatographic retention-based approach, immobilized artificial membrane (IAM) and liposome partitioning, and microemulsion electrokinetic chromatography (MEEKC) for accurate screening of logP. We highlight sample pooling (cassette dosing, all-in-one, cocktail) as an efficient approach for high-throughput screening of physicochemical properties and in vitro ADME profiling, with emphasis on the benefit of on-line quality control. The cassette dosing approach has been widely adopted in drug discovery for rapid screening of in vivo pharmacokinetic parameters, significantly increasing capacity and dramatically reducing animal usage.
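
    As a small illustration of how two of the screened quantities relate, the sketch below converts logP and pKa into a distribution coefficient logD at a given pH for simple monoprotic compounds; this is a generic textbook relationship, not code from the review, and the example values are hypothetical.

        import math

        def log_d(log_p, pka, ph, acid=True):
            """Distribution coefficient logD for a monoprotic acid (acid=True) or base (acid=False)."""
            exponent = (ph - pka) if acid else (pka - ph)
            return log_p - math.log10(1.0 + 10.0 ** exponent)

        # Hypothetical weak acid: logP 3.0, pKa 4.5, measured at physiological pH 7.4.
        print(f"logD(7.4) = {log_d(3.0, 4.5, 7.4, acid=True):.2f}")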

  8. Microfluidic-Enabled Print-to-Screen Platform for High-Throughput Screening of Combinatorial Chemotherapy.

    Science.gov (United States)

    Ding, Yuzhe; Li, Jiannan; Xiao, Wenwu; Xiao, Kai; Lee, Joyce; Bhardwaj, Urvashi; Zhu, Zijie; Digiglio, Philip; Yang, Gaomai; Lam, Kit S; Pan, Tingrui

    2015-10-20

    Since the 1960s, combination chemotherapy has been widely utilized as a standard method to treat cancer. However, because of the potentially enormous number of drug candidates and combinations, conventional methods for identifying effective drug combinations are usually associated with high operational costs, low screening throughput, laborious and time-consuming procedures, and ethical concerns. In this paper, we present a low-cost, high-efficiency microfluidic print-to-screen (P2S) platform, which integrates combinatorial screening with biomolecular printing for high-throughput screening of anticancer drug combinations. This P2S platform provides several distinct advantages and features, including automatic combinatorial printing, high-throughput parallel drug screening, modular disposable cartridges, and biocompatibility, which can potentially speed up the entire discovery cycle of potent drug combinations. Microfluidic impact printing utilizing plug-and-play microfluidic cartridges is experimentally characterized with controllable droplet volume and accurate positioning. Furthermore, the combinatorial print-to-screen assay is demonstrated in a proof-of-concept biological experiment which can identify the positive hits among the entire drug combination library in a parallel and rapid manner. Overall, this microfluidic print-to-screen platform offers a simple, low-cost, high-efficiency solution for high-throughput large-scale combinatorial screening and is applicable to various emerging applications in drug cocktail discovery.
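
    The combinatorial side of such a layout can be sketched in a few lines, enumerating pairwise drug combinations and assigning them to wells; the drug names and plate geometry below are hypothetical, and the microfluidic printing itself is of course hardware.

        from itertools import combinations
        from string import ascii_uppercase

        drugs = ["doxorubicin", "paclitaxel", "cisplatin", "gemcitabine", "bortezomib"]     # hypothetical library
        wells = [f"{row}{col:02d}" for row in ascii_uppercase[:16] for col in range(1, 25)]  # A01..P24

        # Assign each pairwise combination to the next free well of a 384-well plate.
        layout = dict(zip(wells, combinations(drugs, 2)))
        for well, (drug_a, drug_b) in layout.items():
            print(well, drug_a, "+", drug_b)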

  9. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    Science.gov (United States)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor-phase nano-coating process that deposits very uniform and conformal thin-film materials with sub-angstrom-level thickness control on various substrates. These unique properties have made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to other deposition techniques, which hinders its industrial adoption. In addition to the low throughput, process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs continuous) or process conditions (e.g., low temperature), absence of an appropriate reactor design for a specific substrate, and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high-pressure, low-temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate to achieve fast depositions at these challenging processing conditions. Also in this work, two unique high-throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head. Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD-based process called sequential vapor

  10. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing high-throughput computing applications common in High Energy Physics work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top, along with many others, in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  11. High pressure inertial focusing for separating and concentrating bacteria at high throughput

    Science.gov (United States)

    Cruz, J.; Hooshmand Zadeh, S.; Graells, T.; Andersson, M.; Malmström, J.; Wu, Z. G.; Hjort, K.

    2017-08-01

    Inertial focusing is a promising microfluidic technology for concentration and separation of particles by size. However, the pressure required rises strongly as particle size decreases. Theory and experimental results for larger particles were used to scale down the phenomenon and find the conditions that focus 1 µm particles. High-pressure experiments in robust glass chips were used to demonstrate the alignment. We show how the technique works for 1 µm spherical polystyrene particles and for Escherichia coli, without harming the bacteria at 50 µl min-1. The potential to focus bacteria, simplicity of use and high throughput make this technology interesting for healthcare applications, where concentration and purification of a sample may be required as an initial step.
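
    A back-of-envelope sketch of why pressure rises so steeply for small particles is given below; the a/Dh of roughly 0.07 focusing rule of thumb and the rectangular-channel pressure-drop approximation are standard microfluidics results rather than values taken from this record, and the channel dimensions and flow rate are hypothetical.

        def max_hydraulic_diameter(particle_diameter_m, ratio=0.07):
            """Rule of thumb for inertial focusing: particle diameter / hydraulic diameter of at least ~0.07."""
            return particle_diameter_m / ratio

        def pressure_drop_pa(q_m3_s, length_m, width_m, height_m, viscosity_pa_s=1e-3):
            """Laminar pressure drop in a shallow rectangular channel (height < width)."""
            return (12.0 * viscosity_pa_s * length_m * q_m3_s
                    / (width_m * height_m ** 3 * (1.0 - 0.63 * height_m / width_m)))

        print(f"Dh needed to focus 1 um particles: {max_hydraulic_diameter(1e-6) * 1e6:.0f} um")
        q = 50e-9 / 60.0                             # 50 ul/min expressed in m^3/s
        dp = pressure_drop_pa(q, length_m=0.02, width_m=30e-6, height_m=12e-6)
        print(f"pressure drop for a 2 cm long, 30 x 12 um channel: {dp / 1e5:.0f} bar")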

  12. Validation of a High-Throughput Multiplex Genetic Detection System for Helicobacter pylori Identification, Quantification, Virulence, and Resistance Analysis

    OpenAIRE

    Zhang, Yanmei; Zhao, Fuju; Kong, Mimi; Wang, Shiwen; Nan, Li; Hu, Binjie; Olszewski, Michal A.; Miao, Yingxin; Ji, Danian; Jiang, Wenrong; Fang, Yi; Zhang, Jinghao; Chen, Fei; Xiang, Ping; Wu, Yong

    2016-01-01

    Helicobacter pylori (H. pylori) infection is closely related to various gastroduodenal diseases. Virulence factors and bacterial load of H. pylori are associated with clinical outcomes, and drug-resistance severely impacts the clinical efficacy of eradication treatment. Existing detection methods are low-throughput, time-consuming and labor intensive. Therefore, a rapid and high-throughput method is needed for clinical diagnosis, treatment, and monitoring for H. pylori. High-throughput Multip...

  13. High-throughput data pipelines for metabolic flux analysis in plants.

    Science.gov (United States)

    Poskar, C Hart; Huege, Jan; Krach, Christian; Shachar-Hill, Yair; Junker, Björn H

    2014-01-01

    In this chapter we illustrate the methodology for high-throughput metabolic flux analysis. Central to this is developing an end-to-end data pipeline, crucial for integrating the wet-lab experiments and analytics, combining hardware and software automation, and standardizing data representation by providing importers and exporters to support third-party tools. Using existing software at the start (data extraction from the chromatogram) and at the end (MFA analysis) allows for the most flexibility in this workflow. Developing iMS2Flux provided a standard, extensible, platform-independent tool to act as the "glue" between these end points. Most importantly, this tool can be easily adapted to support different data formats, data verification and data correction steps, allowing it to be central to managing the data necessary for high-throughput MFA. An additional tool was needed to automate the MFA software and, in particular, to take advantage of the coarse-grained parallel nature of high-throughput analysis and available high-performance computing facilities. In combination, these methods show the development of high-throughput pipelines that allow metabolic flux analysis to join as a full member of the omics family.

  14. A targeted proteomics toolkit for high-throughput absolute quantification of Escherichia coli proteins.

    Science.gov (United States)

    Batth, Tanveer S; Singh, Pragya; Ramakrishnan, Vikram R; Sousa, Mirta M L; Chan, Leanne Jade G; Tran, Huu M; Luning, Eric G; Pan, Eva H Y; Vuu, Khanh M; Keasling, Jay D; Adams, Paul D; Petzold, Christopher J

    2014-11-01

    Transformation of engineered Escherichia coli into a robust microbial factory is contingent on precise control of metabolism. Yet, the throughput of omics technologies used to characterize cell components has lagged far behind our ability to engineer novel strains. To expand the utility of quantitative proteomics for metabolic engineering, we validated and optimized targeted proteomics methods for over 400 proteins from more than 20 major pathways in E. coli metabolism. Complementing these methods, we constructed a series of synthetic genes to produce concatenated peptides (QconCAT) for absolute quantification of the proteins and made them available through the Addgene plasmid repository (www.addgene.org). To facilitate high sample throughput, we developed a fast, analytical-flow chromatography method using a 5.5-min gradient (10 min total run time). Overall this toolkit provides an invaluable resource for metabolic engineering by increasing sample throughput, minimizing development time and providing peptide standards for absolute quantification of E. coli proteins.
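
    The absolute-quantification arithmetic behind labeled-standard (e.g. QconCAT-derived) peptides is simple and is sketched below; the peak areas and spike amount are hypothetical.

        def absolute_amount_fmol(light_area, heavy_area, heavy_spike_fmol):
            """Endogenous (light) peptide amount from the light/heavy peak-area ratio and the known heavy spike."""
            return (light_area / heavy_area) * heavy_spike_fmol

        # Hypothetical transition: light peak area 2.4e6, heavy peak area 1.6e6, 50 fmol heavy standard spiked.
        print(f"{absolute_amount_fmol(2.4e6, 1.6e6, 50.0):.1f} fmol on column")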

  15. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis

    Directory of Open Access Journals (Sweden)

    Na Wen

    2016-07-01

    Full Text Available This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on the key performance parameters of throughput, multifunctionality, and absolute quantification.

  16. Considerations for the design and reporting of enzyme assays in high-throughput screening applications

    Directory of Open Access Journals (Sweden)

    Michael G. Acker

    2014-05-01

    Full Text Available This review describes the key steps and methods which are used to develop enzyme assays suitable for high-throughput screening (HTS) applications. The goals of HTS enzyme assays are defined relative to lower-throughput bench-top assays, and the important aspects that go into constructing robust and sensitive enzyme assays are described. Methods that have been applied to common enzyme classes are reviewed and pitfalls related to assay artifacts are discussed. We also suggest a reporting format to describe the steps in HTS enzyme assays.
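
    One robustness metric commonly reported for HTS assays of this kind, although not named in this abstract, is the Z'-factor computed from positive and negative control wells; a minimal sketch with simulated control data follows.

        import numpy as np

        def z_prime(positive, negative):
            """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; values above ~0.5 are usually considered excellent."""
            positive = np.asarray(positive, dtype=float)
            negative = np.asarray(negative, dtype=float)
            return (1.0 - 3.0 * (positive.std(ddof=1) + negative.std(ddof=1))
                    / abs(positive.mean() - negative.mean()))

        rng = np.random.default_rng(0)
        pos = rng.normal(1000.0, 50.0, 32)    # e.g. uninhibited enzyme signal
        neg = rng.normal(100.0, 20.0, 32)     # e.g. fully inhibited / no-enzyme controls
        print(f"Z' = {z_prime(pos, neg):.2f}")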

  17. High-throughput, temperature-controlled microchannel acoustophoresis device made with rapid prototyping

    DEFF Research Database (Denmark)

    Adams, Jonathan D; Ebbesen, Christian L.; Barnkob, Rune

    2012-01-01

    We report a temperature-controlled microfluidic acoustophoresis device capable of separating particles and transferring blood cells from undiluted whole human blood at a volume throughput greater than 1 L h−1. The device is fabricated from glass substrates and polymer sheets in microscope......-slide format using low-cost, rapid-prototyping techniques. This high-throughput acoustophoresis chip (HTAC) utilizes a temperature-stabilized, standing ultrasonic wave, which imposes differential acoustic radiation forces that can separate particles according to size, density and compressibility. The device...

  18. Development of Microfluidic Systems Enabling High-Throughput Single-Cell Protein Characterization

    Science.gov (United States)

    Fan, Beiyuan; Li, Xiufeng; Chen, Deyong; Peng, Hongshang; Wang, Junbo; Chen, Jian

    2016-01-01

    This article reviews recent developments in microfluidic systems enabling high-throughput characterization of single-cell proteins. Four key perspectives of microfluidic platforms are included in this review: (1) microfluidic fluorescent flow cytometry; (2) droplet based microfluidic flow cytometry; (3) large-array micro wells (microengraving); and (4) large-array micro chambers (barcode microchips). We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on three key performance parameters (absolute quantification, sensitivity, and throughput). PMID:26891303

  19. Development of Microfluidic Systems Enabling High-Throughput Single-Cell Protein Characterization

    Directory of Open Access Journals (Sweden)

    Beiyuan Fan

    2016-02-01

    Full Text Available This article reviews recent developments in microfluidic systems enabling high-throughput characterization of single-cell proteins. Four key perspectives of microfluidic platforms are included in this review: (1) microfluidic fluorescent flow cytometry; (2) droplet-based microfluidic flow cytometry; (3) large-array micro wells (microengraving); and (4) large-array micro chambers (barcode microchips). We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on three key performance parameters (absolute quantification, sensitivity, and throughput).

  20. High-throughput screening: speeding up porous materials discovery.

    Science.gov (United States)

    Wollmann, Philipp; Leistner, Matthias; Stoeck, Ulrich; Grünker, Ronny; Gedrich, Kristina; Klein, Nicole; Throl, Oliver; Grählert, Wulf; Senkovska, Irena; Dreisbach, Frieder; Kaskel, Stefan

    2011-05-14

    A new tool (Infrasorb-12) for the screening of porosity is described, identifying high-surface-area materials in a very short time with high accuracy. Further, an example of the application of the tool to the discovery of new cobalt-based metal-organic frameworks is given. © The Royal Society of Chemistry 2011

  1. Applications of high throughput (combinatorial) methodologies to electronic, magnetic, optical, and energy-related materials

    Science.gov (United States)

    Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.

    2013-06-01

    High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome

  2. Novel High-Throughput Drug Screening Platform for Chemotherapy-Induced Axonal Neuropathy

    Science.gov (United States)

    2013-05-01

    Introduction: Taxol is an antineoplastic agent, which is used for the treatment of

  3. Roche genome sequencer FLX based high-throughput sequencing of ancient DNA

    DEFF Research Database (Denmark)

    Alquezar-Planas, David E; Fordyce, Sarah Louise

    2012-01-01

    Since the development of so-called "next generation" high-throughput sequencing in 2005, this technology has been applied to a variety of fields. Such applications include disease studies, evolutionary investigations, and ancient DNA. Each application requires a specialized protocol to ensure tha...

  4. Comprehensive analysis of high-throughput screens with HiTSeekR

    DEFF Research Database (Denmark)

    List, Markus; Schmidt, Steffen; Christiansen, Helle;

    2016-01-01

    High-throughput screening (HTS) is an indispensable tool for drug (target) discovery that currently lacks user-friendly software tools for the robust identification of putative hits from HTS experiments and for the interpretation of these findings in the context of systems biology. We developed H...
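
    HiTSeekR itself is not reproduced here; as a generic illustration of one common hit-calling step for HTS readouts, the sketch below computes robust (median/MAD-based) z-scores and flags wells beyond a threshold, using simulated data.

        import numpy as np

        def robust_z(values):
            """Median/MAD z-scores; 1.4826 rescales the MAD to a normal-equivalent sigma."""
            values = np.asarray(values, dtype=float)
            med = np.median(values)
            mad = np.median(np.abs(values - med))
            return (values - med) / (1.4826 * mad)

        rng = np.random.default_rng(1)
        readout = np.concatenate([rng.normal(1.0, 0.1, 380),   # bulk of the plate
                                  [0.2, 0.25, 1.9, 2.1]])      # injected "actives"
        hits = np.where(np.abs(robust_z(readout)) > 3.0)[0]
        print("candidate hit wells:", hits)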

  5. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  6. Evaluation of simple and inexpensive high-throughput methods for phytic acid determination

    Science.gov (United States)

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal. Biochem. ...

  7. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an

  8. A practical fluorogenic substrate for high-throughput screening of glutathione S-transferase inhibitors.

    Science.gov (United States)

    Fujikawa, Yuuta; Morisaki, Fumika; Ogura, Asami; Morohashi, Kana; Enya, Sora; Niwa, Ryusuke; Goto, Shinji; Kojima, Hirotatsu; Okabe, Takayoshi; Nagano, Tetsuo; Inoue, Hideshi

    2015-07-21

    We report a new fluorogenic substrate for glutathione S-transferase (GST), 3,4-DNADCF, enabling the assay with a low level of nonenzymatic background reaction. Inhibitors against Noppera-bo/GSTe14 from Drosophila melanogaster were identified by high throughput screening using 3,4-DNADCF, demonstrating the utility of this substrate.

  9. 3D high throughput screening and profiling of embryoid bodies in thermoformed microwell plates

    NARCIS (Netherlands)

    Vrij, E. J.; Espinoza, S.; Heilig, M.; Kolew, A.; Schneider, M.; Van Blitterswijk, C. A.; Truckenmüller, R. K.; Rivron, N. C.

    2016-01-01

    3D organoids derived from stem cells are now widely used to study development and disease. These models are powerful for mimicking in vivo situations but are currently associated with high variability and low throughput. For biomedical research, platforms are thus necessary to increase reproducibility and allow

  10. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based up

  11. A novel high-throughput irradiator for in vitro radiation sensitivity bioassays

    Science.gov (United States)

    Fowler, Tyler L.

    Given the emphasis on more personalized radiation therapy there is an ongoing and compelling need to develop high-throughput screening tools to further examine the biological effects of ionizing radiation on cells, tissues and organ systems in either the research or clinical setting. Conventional x-ray irradiators are designed to provide maximum versatility to radiobiology researchers, typically accommodating small animals, tissue or blood samples, and cellular applications. This added versatility often impedes the overall sensitivity and specificity of an experiment resulting in a trade-off between the number of absorbed doses (or dose rates) and biological endpoints that can be investigated in vitro in a reasonable amount of time. Therefore, modern irradiator designs are incompatible with current high-throughput bioassay technologies. Furthermore, important dosimetry and calibration characteristics (i.e. dose build-up region, beam attenuation, and beam scatter) of these irradiators are typically unknown to the end user, which can lead to significant deviation between delivered dose and intended dose to cells that adversely impact experimental results. Therefore, the overarching goal of this research is to design and develop a robust and fully automated high-throughput irradiator for in vitro radiation sensitivity investigations. Additionally, in vitro biological validation of this system was performed by assessing intracellular reactive oxygen species production, physical DNA double strand breaks, and activation of cellular DNA repair mechanisms. Finally, the high-throughput irradiator was used to investigate autophagic flux, a cellular adaptive response, as a potential biomarker of radiation sensitivity.

  12. Cancer panomics: computational methods and infrastructure for integrative analysis of cancer high-throughput "omics" data

    DEFF Research Database (Denmark)

    Brunak, Søren; De La Vega, Francisco M.; Rätsch, Gunnar

    2014-01-01

    Targeted cancer treatment is becoming the goal of newly developed oncology medicines and has already shown promise in some spectacular cases, such as BRAF kinase inhibitors in BRAF-mutant (e.g. V600E) melanoma. These developments are driven by the advent of high-throughput sequencing...

  13. A high-throughput method for GMO multi-detection using a microfluidic dynamic array

    NARCIS (Netherlands)

    Brod, F.C.A.; Dijk, van J.P.; Voorhuijzen, M.M.; Dinon, A.Z.; Guimarães, L.H.S.; Scholtens, I.M.J.; Arisi, A.C.M.; Kok, E.J.

    2014-01-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organisms (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the nu

  14. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    DEFF Research Database (Denmark)

    Csiszar, Susan A.; Ernstoff, Alexi; Fantke, Peter

    2016-01-01

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or population per mass...

  15. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis (Intestinal-

  16. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  17. Development of in-house methods for high-throughput DNA extraction

    Science.gov (United States)

    Given the high-throughput nature of many current biological studies, in particular field-based or applied environmental studies, optimization of cost-effective, efficient methods for molecular analysis of large numbers of samples is a critical first step. Existing methods are either based on costly ...

  18. HIGH-THROUGHPUT IDENTIFICATION OF CATALYTIC REDOX-ACTIVE CYSTEINE RESIDUES

    Science.gov (United States)

    Cysteine (Cys) residues often play critical roles in proteins; however, identification of their specific functions has been limited to case-by-case experimental approaches. We developed a procedure for high-throughput identification of catalytic redox-active Cys in proteins by se...

  19. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an infrared

  20. Investigation of non-halogenated solvent mixtures for high throughput fabrication of polymerfullerene solar cells

    NARCIS (Netherlands)

    Schmidt-Hansberg, B.; Sanyal, M.; Grossiord, N.; Galagan, Y.O.; Baunach, M.; Klein, M.F.G.; Colsmann, A.; Scharfer, P.; Lemmer, U.; Dosch, H.; Michels, J.J; Barrena, E.; Schabel, W.

    2012-01-01

    The rapidly increasing power conversion efficiencies of organic solar cells are an important prerequisite towards low-cost photovoltaics fabricated at high throughput. In this work we suggest indane as a non-halogenated replacement for the commonly used halogenated solvent o-dichlorobenzene. Indane w

  1. An industrial engineering approach to laboratory automation for high throughput screening

    OpenAIRE

    Menke, Karl C.

    2000-01-01

    Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation.

  2. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis

  3. An integrated framework for discovery and genotyping of genomic variants from high-throughput sequencing experiments

    NARCIS (Netherlands)

    Duitama, Jorge; Quintero, Juan Camilo; Cruz, Daniel Felipe; Quintero, Constanza; Hubmann, Georg; Foulquié-Moreno, Maria R.; Verstrepen, Kevin J.; Thevelein, Johan M.; Tohme, Joe

    2014-01-01

    Recent advances in high-throughput sequencing (HTS) technologies and computing capacity have produced unprecedented amounts of genomic data that have unraveled the genetics of phenotypic variability in several species. However, operating and integrating current software tools for data analysis still

  4. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly refle

  6. A high-throughput, precipitating colorimetric sandwich ELISA microarray for shiga toxins

    Science.gov (United States)

    Shiga toxins 1 and 2 (Stx1 and Stx2) from Shiga toxin-producing E. coli (STEC) bacteria were simultaneously detected with a newly developed, high-throughput antibody microarray platform. The proteinaceous toxins were immobilized and sandwiched between biorecognition elements (monoclonal antibodies)...

  7. High-throughput screening of carbohydrate-degrading enzymes using novel insoluble chromogenic substrate assay kits

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Willats, William George Tycho

    2016-01-01

    for this is that advances in genome and transcriptome sequencing, together with associated bioinformatics tools allow for rapid identification of candidate CAZymes, but technology for determining an enzyme's biochemical characteristics has advanced more slowly. To address this technology gap, a novel high-throughput assay...

  8. PLASMA PROTEIN PROFILING AS A HIGH THROUGHPUT TOOL FOR CHEMICAL SCREENING USING A SMALL FISH MODEL

    Science.gov (United States)

    Hudson, R. Tod, Michael J. Hemmer, Kimberly A. Salinas, Sherry S. Wilkinson, James Watts, James T. Winstead, Peggy S. Harris, Amy Kirkpatrick and Calvin C. Walker. In press. Plasma Protein Profiling as a High Throughput Tool for Chemical Screening Using a Small Fish Model (Abstra...

  9. High-throughput synthesis and characterization of nanocrystalline porphyrinic zirconium metal-organic frameworks.

    Science.gov (United States)

    Kelty, M L; Morris, W; Gallagher, A T; Anderson, J S; Brown, K A; Mirkin, C A; Harris, T D

    2016-06-14

    We describe and employ a high-throughput screening method to accelerate the synthesis and identification of pure-phase, nanocrystalline metal-organic frameworks (MOFs). We demonstrate the efficacy of this method through its application to a series of porphyrinic zirconium MOFs, resulting in the isolation of MOF-525, MOF-545, and PCN-223 on the nanoscale.

  10. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free of t...

  11. High-throughput genotoxicity assay identifies antioxidants as inducers of DNA damage response and cell death

    Science.gov (United States)

    Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA-damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...

  12. High-throughput semiquantitative analysis of insertional mutations in heterogeneous tumors

    NARCIS (Netherlands)

    Koudijs, M.J.; Klijn, C.; van der Weyden, L.; Kool, J.; ten Hoeve, J.; Sie, D.; Prasetyanti, P.R.; Schut, E.; Kas, S.; Whipp, T.; Cuppen, E.; Wessels, L.; Adams, D.J.; Jonkers, J.

    2011-01-01

    Retroviral and transposon-based insertional mutagenesis (IM) screens are widely used for cancer gene discovery in mice. Exploiting the full potential of IM screens requires methods for high-throughput sequencing and mapping of transposon and retroviral insertion sites. Current protocols are based on

  13. Investigation of non-halogenated solvent mixtures for high throughput fabrication of polymer-fullerene solar cells

    NARCIS (Netherlands)

    Schmidt-Hansberg, B.; Sanyal, M.; Grossiord, N.; Galagan, Y.O.; Baunach, M.; Klein, M.F.G.; Colsmann, A.; Scharfer, P.; Lemmer, U.; Dosch, H.; Michels, J.J; Barrena, E.; Schabel, W.

    2012-01-01

    The rapidly increasing power conversion efficiencies of organic solar cells are an important prerequisite for low-cost photovoltaics fabricated at high throughput. In this work we suggest indane as a non-halogenated replacement for the commonly used halogenated solvent o-dichlorobenzene. Indane w

  14. The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping

    Science.gov (United States)

    Wei, Weiqi

    2012-01-01

    Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…

  15. A High-Throughput MALDI-TOF Mass Spectrometry-Based Assay of Chitinase Activity

    Science.gov (United States)

    A high-throughput MALDI-TOF mass spectrometric assay is described for measuring chitinolytic enzyme activity. The assay uses unmodified chitin oligosaccharide substrates and is readily achievable on a microliter scale (2 µL total volume, containing 2 µg of substrate and 1 ng of protein). The speed a...

  16. Organic light-emitting diodes: High-throughput virtual screening

    Science.gov (United States)

    Hirata, Shuzo; Shizu, Katsuyuki

    2016-10-01

    Computer networks, trained with data from delayed-fluorescence materials that have been successfully used in organic light-emitting diodes, facilitate the high-speed prediction of good emitters for display and lighting applications.

  17. Holographic memory module with ultra-high capacity and throughput

    Energy Technology Data Exchange (ETDEWEB)

    Vladimir A. Markov, Ph.D.

    2000-06-04

    High-capacity, high-transfer-rate, random-access memory systems are needed to archive and distribute the tremendous volume of digital information being generated, for example, by human genome mapping and online libraries. The development of multi-gigabit-per-second networks underscores the need for next-generation archival memory systems. During Phase I we conducted the theoretical analysis and accomplished the experimental tests that validated the key aspects of an ultra-high-density holographic data storage module with a high transfer rate. We also inspected the secure nature of the encoding method and estimated the performance of a full-scale system. Two basic architectures were considered, allowing for a reversible, compact solid-state configuration with limited capacity, and a very-large-capacity write-once-read-many memory system.

  18. Crystallography: past and present

    Science.gov (United States)

    Hodeau, J.-L.; Guinebretiere, R.

    2007-12-01

    In the 19th century, crystallography referred to the study of crystal shapes. Such studies by Haüy and Bravais allowed the establishment of important hypotheses such as (i) the "integrant molecules", which were supposed to be the smallest solids that could be extracted from a mineral [1], (ii) the definition of the crystal lattice and (iii) the observation that a crystal can be cleaved parallel to two or three crystal forms [2]. This morphological crystallography defined a crystal as "a chemically homogeneous solid, wholly or partly bounded by natural planes that intersect at predetermined angles" [3]. It described the main symmetry elements and operations, the nomenclature of the different crystal forms and also the theory of twinning. A breakthrough came in 1912 with the use of X-rays by M. von Laue and W.H. and W.L. Bragg. This experimental development allowed the determination of the atomic content of each unit cell constituting the crystal and defined a crystal as "any solid in which an atomic pattern is repeated periodically in three dimensions, that is, any solid that "diffracts" an incident X-ray beam" [3]. Mathematical tools such as the Patterson method and the direct methods were developed. The way to solving crystal structures was opened, first for simple compounds, and at that time crystallography was associated mainly with perfect crystals. In the fifties, crystallographers already had most of the apparatus and fundamental methods at their disposal; however, the full use of these tools had to await the development of computers. Furthermore, the development of new sources of neutrons, electrons and synchrotron X-rays allowed the study of complex compounds such as large macromolecules in biology. Nowadays, one of the new frontiers for crystallographers is to relate the crystal structure to its physical, chemical and biological properties; this means that an accurate structural determination is needed to focus on a selective part of the

  19. Subtyping Animal Influenza Virus with General Multiplex RT-PCR and Liquichip High Throughput (GMPLex)

    Institute of Scientific and Technical Information of China (English)

    Zhi-feng Qin; Bing Cheng; Zhou-xi Ruan; Ying-zuo Bi; Joseph J Giambrone; Hong-zhuan Wu; Jie Sun; Ti-kang Lu; Shao-ling Zeng; Qun-yi Hua; Qing-yan Ling; Shu-kun Chen; Jian-qiang Lv; Cai-hong Zhang

    2012-01-01

    This study developed a multiplex RT-PCR integrated with Luminex technology to rapidly and simultaneously subtype multiple influenza viruses. Primers and probes were designed to amplify the NS and M genes of influenza A viruses, the HA gene of the H1, H3, H5, H7 and H9 subtypes, and the NA gene of the N1 and N2 subtypes. Universal super primers were introduced to establish a general multiplex RT-PCR (GM RT-PCR). It included three stages of RT-PCR amplification, and the RT-PCR products were then further tested with LiquiChip probes; combined, these give an influenza virus (IV) rapid high-throughput subtyping test designated GMPLex. The IV GMPLex rapid high-throughput subtyping test has the following features: high throughput, able to determine the subtypes of 9 target genes covering the H1, H3, H5, H7, H9, N1 and N2 subtypes of influenza A virus at one time; rapid, completing influenza subtyping within 6 hours; high specificity, ensured for the different subtypes by using two nested degenerate primers and one probe, with no cross-reaction between subtypes and no non-specific reactions with other pathogens; and high sensitivity. When used separately to detect the product of a single GM RT-PCR for a single H5 or N1 gene, the GMPLex test showed a sensitivity of 10^-5 (=280 ELD50) for both tests, and the Luminex qualitative ratio results were 3.08 and 3.12, respectively. When used to detect the product of GM RT-PCR for an H5N1 strain at the same time, both showed a sensitivity of 10^-4 (=2800 ELD50). The GMPLex rapid high-throughput subtyping test can satisfy the needs of rapid influenza testing.

  20. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
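
    The cross-referencing step itself reduces to set intersections between the user's gene list and each published hit list; the minimal Python sketch below uses hypothetical dataset contents and is not the CrossCheck database or API.

      # Hypothetical published hit lists; CrossCheck's real database holds 16,231 datasets.
      published = {
          "RNAi_screen_A": {"TP53", "AKT1", "MAPK1", "RIPK1"},
          "CRISPR_screen_B": {"RIPK1", "CASP8", "MAPK1"},
          "predicted_kinase_substrates": {"AKT1", "GSK3B", "RIPK1"},
      }

      def crosscheck(user_genes, datasets=published):
          # Overlap between the user's gene symbols and each dataset.
          user = set(user_genes)
          return {name: sorted(user & genes) for name, genes in datasets.items()}

      print(crosscheck(["RIPK1", "MAPK1", "BRCA1"]))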

  1. Embedded image enhancement for high-throughput cameras

    Science.gov (United States)

    Geerts, Stan J. C.; Cornelissen, Dion; de With, Peter H. N.

    2014-03-01

    This paper presents image enhancement for a novel Ultra-High-Definition (UHD) video camera offering 4K images and higher. Conventional image enhancement techniques need to be reconsidered for the high-resolution images and the low-light sensitivity of the new sensor. We study two image enhancement functions and evaluate and optimize the algorithms for embedded implementation in programmable logic (FPGA). The enhancement study involves high-quality Auto White Balancing (AWB) and Local Contrast Enhancement (LCE). We have compared multiple algorithms from literature, both with objective and subjective metrics. In order to objectively compare Local Contrast (LC), an existing LC metric is modified for LC measurement in UHD images. For AWB, we have found that color histogram stretching offers a subjective high image quality and it is among the algorithms with the lowest complexity, while giving only a small balancing error. We impose a color-to-color gain constraint, which improves robustness of low-light images. For local contrast enhancement, a combination of contrast preserving gamma and single-scale Retinex is selected. A modified bilateral filter is designed to prevent halo artifacts, while significantly reducing the complexity and simultaneously preserving quality. We show that by cascading contrast preserving gamma and single-scale Retinex, the visibility of details is improved towards the level appropriate for high-quality surveillance applications. The user is offered control over the amount of enhancement. Also, we discuss the mapping of those functions on a heterogeneous platform to come to an effective implementation while preserving quality and robustness.
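
    A minimal sketch of the color histogram stretching idea described above, assuming a floating-point RGB image and illustrative percentile limits and gain cap; it is not the authors' FPGA implementation.

      import numpy as np

      def white_balance_stretch(img, low_pct=1.0, high_pct=99.0, max_ratio=2.0):
          # img: RGB image as floats in [0, 1], shape (H, W, 3).
          # Percentile limits and the color-to-color gain cap are assumed values.
          lows, gains = [], []
          for c in range(3):
              lo, hi = np.percentile(img[..., c], [low_pct, high_pct])
              lows.append(lo)
              gains.append(1.0 / max(hi - lo, 1e-6))
          # Cap each channel gain relative to the smallest one, mimicking the
          # color-to-color gain constraint mentioned for low-light robustness.
          ref = min(gains)
          gains = [min(g, max_ratio * ref) for g in gains]
          out = np.empty_like(img)
          for c in range(3):
              out[..., c] = np.clip((img[..., c] - lows[c]) * gains[c], 0.0, 1.0)
          return out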

  2. High-throughput antibody development and retrospective epitope mapping

    DEFF Research Database (Denmark)

    Rydahl, Maja Gro

    the binding profile - in more or less high resolution - of two small molecular probes, 11 carbohydrate binding modules and 24 monoclonal antibodies. This was made possible by combining the HTP multiplexing capacity of carbohydrate microarrays with diverse glycomic tools, to downstream characterize......, there are noteworthy differences in the less evolved species of algae as compared to land plants. The dynamic process orchestrating the deposition of these biopolymers both in algae and higher plants, is complex and highly heterogeneous, yet immensely important for the development and differentiation of the cell....... However, our understanding of the evolutionary mechanisms, biosynthesis and remodelling is limited, especially due to a lack of sufficient glycomic tools for studying green plants. This poses a serious hindrance for understanding the fundamental processes behind terrestrialisation and vascularisation...

  3. A high-throughput bioinformatics distributed computing platform

    OpenAIRE

    Keane, Thomas M; Page, Andrew J.; McInerney, James O; Naughton, Thomas J.

    2005-01-01

    In the past number of years the demand for high performance computing has greatly increased in the area of bioinformatics. The huge increase in size of many genomic databases has meant that many common tasks in bioinformatics are not possible to complete in a reasonable amount of time on a single processor. Recently distributed computing has emerged as an inexpensive alternative to dedicated parallel computing. We have developed a general-purpose distributed computing platform ...

  4. The High Throughput X-ray Spectroscopy (HTXS) Mission

    Science.gov (United States)

    White, N. E.; Tananbaum, H.; Kahn, S. M.

    1997-01-01

    The HTXS mission concept combines large effective area (approximately 15,000 sq cm at 1 keV), high spectral resolution (E/Delta(E) approximately 300-3000), and broad energy bandpass (0.25-40 keV and possibly up to 100 keV) by using replicated optics together with a complement of spectroscopic instrumentation including reflection gratings read out by charge-coupled device detectors (CCDs), quantum microcalorimeters, and cadmium zinc telluride (CZT) or comparable high-energy detectors. An essential feature of this concept involves minimization of cost (approximately $350M for development and approximately $500-600M including launches) and risk by building six identical modest satellites to achieve the large area. Current mission and technology studies are targeted towards a new start in the 2002 timeframe, with first launch around 2005-2006. The HTXS mission represents a major advance, providing as much as a factor of 100 increase in sensitivity over currently planned high-resolution X-ray spectroscopy missions. HTXS will mark the start of a new era when high-quality X-ray spectra will be obtained for all classes of X-ray sources, over a wide range of luminosity and distance. With its increased capabilities, HTXS will address many fundamental astrophysics questions such as the origin and distribution of the elements from carbon to zinc, the formation and evolution of clusters of galaxies, the validity of general relativity in the strong-gravity limit, the evolution of supermassive black holes in active galactic nuclei, the details of supernova explosions and their aftermath, and the mechanisms involved in the heating of stellar coronae and the driving of stellar winds.

  5. Melter Throughput Enhancements for High-Iron HLW

    Energy Technology Data Exchange (ETDEWEB)

    Kruger, A. A. [Department of Energy, Office of River Protection, Richland, Washington (United States); Gan, Hoa [The Catholic University of America, Washington, DC (United States); Joseph, Innocent [The Catholic University of America, Washington, DC (United States); Pegg, Ian L. [The Catholic University of America, Washington, DC (United States); Matlack, Keith S. [The Catholic University of America, Washington, DC (United States); Chaudhuri, Malabika [The Catholic University of America, Washington, DC (United States); Kot, Wing [The Catholic University of America, Washington, DC (United States)

    2012-12-26

    This report describes work performed to develop and test new glass and feed formulations in order to increase glass melting rates in high waste loading glass formulations for HLW with high concentrations of iron. Testing was designed to identify glass and melter feed formulations that optimize waste loading and waste processing rate while meeting all processing and product quality requirements. The work included preparation and characterization of crucible melts to assess melt rate using a vertical gradient furnace system and to develop new formulations with enhanced melt rate. Testing evaluated the effects of waste loading on glass properties and the maximum waste loading that can be achieved. The results from crucible-scale testing supported subsequent DuraMelter 100 (DM100) tests designed to examine the effects of enhanced glass and feed formulations on waste processing rate and product quality. The DM100 was selected as the platform for these tests due to its extensive previous use in processing rate determination for various HLW streams and glass compositions.

  6. Using Mendelian inheritance to improve high-throughput SNP discovery.

    Science.gov (United States)

    Chen, Nancy; Van Hout, Cristopher V; Gottipati, Srikanth; Clark, Andrew G

    2014-11-01

    Restriction site-associated DNA sequencing or genotyping-by-sequencing (GBS) approaches allow for rapid and cost-effective discovery and genotyping of thousands of single-nucleotide polymorphisms (SNPs) in multiple individuals. However, rigorous quality control practices are needed to avoid high levels of error and bias with these reduced representation methods. We developed a formal statistical framework for filtering spurious loci, using Mendelian inheritance patterns in nuclear families, that accommodates variable-quality genotype calls and missing data--both rampant issues with GBS data--and for identifying sex-linked SNPs. Simulations predict excellent performance of both the Mendelian filter and the sex-linkage assignment under a variety of conditions. We further evaluate our method by applying it to real GBS data and validating a subset of high-quality SNPs. These results demonstrate that our metric of Mendelian inheritance is a powerful quality filter for GBS loci that is complementary to standard coverage and Hardy-Weinberg filters. The described method, implemented in the software MendelChecker, will improve quality control during SNP discovery in nonmodel as well as model organisms. Copyright © 2014 by the Genetics Society of America.
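
    The core of the Mendelian filter can be sketched as a trio-consistency test for biallelic genotypes coded as alternate-allele counts (0, 1, 2); the sketch below ignores genotype quality and missing data, which MendelChecker handles explicitly, so it illustrates the idea rather than the published method.

      def trio_consistent(mother, father, child):
          # True if the child genotype can arise from one gamete of each parent.
          gametes = {0: {0}, 1: {0, 1}, 2: {1}}
          return any(m + f == child
                     for m in gametes[mother] for f in gametes[father])

      def fraction_consistent(trios):
          # Fraction of (mother, father, child) trios at a locus passing the check;
          # loci with a low fraction would be flagged as spurious.
          ok = sum(trio_consistent(*t) for t in trios)
          return ok / len(trios) if trios else float("nan")

      # A homozygous-reference x homozygous-alternate cross must give
      # heterozygous offspring at a true biallelic SNP.
      assert trio_consistent(0, 2, 1) and not trio_consistent(0, 2, 2)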

  7. Multiplexed labeling system for high-throughput cell sorting.

    Science.gov (United States)

    Shin, Seung Won; Park, Kyung Soo; Song, In Hyun; Shin, Woo Jung; Kim, Byung Woo; Kim, Dong-Ik; Um, Soong Ho

    2016-09-01

    Flow cytometry and fluorescence activated cell sorting techniques were designed to realize configurable classification and separation of target cells. A number of cell phenotypes with different functionalities have recently been revealed. Before simultaneous selective capture of cells, it is desirable to label different samples with the corresponding dyes in a multiplexing manner to allow for a single analysis. However, few methods to obtain multiple fluorescent colors for various cell types have been developed. Even when restricted laser sources are employed, a small number of color codes can be expressed simultaneously. In this study, we demonstrate the ability to manifest DNA nanostructure-based multifluorescent colors formed by a complex of dyes. Highly precise self-assembly of fluorescent dye-conjugated oligonucleotides gives anisotropic DNA nanostructures, Y- and tree-shaped DNA (Y-DNA and T-DNA, respectively), which may be used as platforms for fluorescent codes. As a proof of concept, we have demonstrated seven different fluorescent codes with only two different fluorescent dyes using T-DNA. This method provides maximum efficiency for current flow cytometry. We are confident that this system will provide highly efficient multiplexed fluorescent detection for bioanalysis compared with one-to-one fluorescent correspondence for specific marker detection.

  8. A High Throughput Interferometeric Technique For Planet Detection

    Science.gov (United States)

    Mahadevan, Suvrath

    2007-12-01

    We have developed a novel instrument called the Exoplanet Tracker (ET) that can measure precise radial velocities. ET is installed at the Kitt Peak 2.1 m telescope and uses a Michelson interferometer in series with a medium-resolution spectrograph, enabling high light-collection efficiency. A survey for exoplanet systems using ET has led to the discovery of a hot-Jupiter planet around the star HD102195. We have also built a stable monolithic Michelson interferometer for ET, enabling higher stability and precision. The development of the fixed-delay interferometer technology with the ET prototype has led to the development of a large multi-object instrument capable of observing over 60 objects simultaneously in the 3 degree field of view of the Sloan telescope. An upgraded version of this instrument will be used to conduct a large-scale survey for exoplanetary systems starting July 2008.

  9. Versatile High Throughput Microarray Analysis for Marine Glycobiology

    DEFF Research Database (Denmark)

    Asunción Salmeán, Armando

    Algal cell walls are a type of extracellular matrix mainly made of polysaccharides, highly diverse, complex and heterogeneous. They possess unique and original polymers in their composition, including several polysaccharides with industrial relevance such as agar, agarose, carrageenans (red algae......) alginates and sulphated fucans (brown algae). Part of this work studied the polysaccharide composition of macroalgal cell walls, with special focus on the brown algal cell walls in a context of evolution, wall architecture and embryo development. Thus, we found evidence of the presence of (1-3),(1-4)-β-glucan, also known as mixed linkage glucan, probably present in brown algae through a convergent evolutionary process, which provides another piece of the cell wall puzzle. We also demonstrated the presence of chimeric arabinogalactan proteins (AGP) in the brown algal cell wall and confirmed their role...

  10. High-throughput discovery of novel developmental phenotypes.

    Science.gov (United States)

    Dickinson, Mary E; Flenniken, Ann M; Ji, Xiao; Teboul, Lydia; Wong, Michael D; White, Jacqueline K; Meehan, Terrence F; Weninger, Wolfgang J; Westerberg, Henrik; Adissu, Hibret; Baker, Candice N; Bower, Lynette; Brown, James M; Caddle, L Brianna; Chiani, Francesco; Clary, Dave; Cleak, James; Daly, Mark J; Denegre, James M; Doe, Brendan; Dolan, Mary E; Edie, Sarah M; Fuchs, Helmut; Gailus-Durner, Valerie; Galli, Antonella; Gambadoro, Alessia; Gallegos, Juan; Guo, Shiying; Horner, Neil R; Hsu, Chih-Wei; Johnson, Sara J; Kalaga, Sowmya; Keith, Lance C; Lanoue, Louise; Lawson, Thomas N; Lek, Monkol; Mark, Manuel; Marschall, Susan; Mason, Jeremy; McElwee, Melissa L; Newbigging, Susan; Nutter, Lauryl M J; Peterson, Kevin A; Ramirez-Solis, Ramiro; Rowland, Douglas J; Ryder, Edward; Samocha, Kaitlin E; Seavitt, John R; Selloum, Mohammed; Szoke-Kovacs, Zsombor; Tamura, Masaru; Trainor, Amanda G; Tudose, Ilinca; Wakana, Shigeharu; Warren, Jonathan; Wendling, Olivia; West, David B; Wong, Leeyean; Yoshiki, Atsushi; MacArthur, Daniel G; Tocchini-Valentini, Glauco P; Gao, Xiang; Flicek, Paul; Bradley, Allan; Skarnes, William C; Justice, Monica J; Parkinson, Helen E; Moore, Mark; Wells, Sara; Braun, Robert E; Svenson, Karen L; de Angelis, Martin Hrabe; Herault, Yann; Mohun, Tim; Mallon, Ann-Marie; Henkelman, R Mark; Brown, Steve D M; Adams, David J; Lloyd, K C Kent; McKerlie, Colin; Beaudet, Arthur L; Bućan, Maja; Murray, Stephen A

    2016-09-22

    Approximately one-third of all mammalian genes are essential for life. Phenotypes resulting from knockouts of these genes in mice have provided tremendous insight into gene function and congenital disorders. As part of the International Mouse Phenotyping Consortium effort to generate and phenotypically characterize 5,000 knockout mouse lines, here we identify 410 lethal genes during the production of the first 1,751 unique gene knockouts. Using a standardized phenotyping platform that incorporates high-resolution 3D imaging, we identify phenotypes at multiple time points for previously uncharacterized genes and additional phenotypes for genes with previously reported mutant phenotypes. Unexpectedly, our analysis reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background. In addition, we show that human disease genes are enriched for essential genes, thus providing a dataset that facilitates the prioritization and validation of mutations identified in clinical sequencing efforts.

  11. A High Throughput On-Demand Routing Protocol for Multirate Ad Hoc Wireless Networks

    Science.gov (United States)

    Rahman, Md. Mustafizur; Hong, Choong Seon; Lee, Sungwon

    Routing in wireless ad hoc networks is a challenging issue because it dynamically controls the network topology and determines the network performance. Most of the available protocols are based on single-rate radio networks and they use hop-count as the routing metric. There have been some efforts for multirate radios as well that use transmission-time of a packet as the routing metric. However, neither the hop-count nor the transmission-time may be a sufficient criterion for discovering a high-throughput path in a multirate wireless ad hoc network. Hop-count based routing metrics usually select a low-rate bound path whereas the transmission-time based metrics may select a path with a comparatively large number of hops. The trade-off between transmission time and effective transmission range of a data rate can be another key criterion for finding a high-throughput path in such environments. In this paper, we introduce a novel routing metric based on the efficiency of a data rate that balances the required time and covering distance by a transmission and results in increased throughput. Using the new metric, we propose an on-demand routing protocol for multirate wireless environment, dubbed MR-AODV, to discover high-throughput paths in the network. A key feature of MR-AODV is that it controls the data rate in transmitting both the data and control packets. Rate control during the route discovery phase minimizes the route request (RREQ) avalanche. We use simulations to evaluate the performance of the proposed MR-AODV protocol and results reveal significant improvements in end-to-end throughput and minimization of routing overhead.
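
    The trade-off described above can be made concrete with a toy rate-efficiency calculation; the rate/range table, packet size and the efficiency definition (coverage distance per unit airtime) are assumptions for illustration only, not the metric published for MR-AODV.

      # Hypothetical (data rate in Mbit/s, effective range in metres) pairs.
      RATES = [(1, 250), (2, 200), (5.5, 150), (11, 100), (54, 40)]

      def airtime_s(rate_mbps, packet_bits=12_000):
          # Seconds of channel occupancy to send one packet at this rate.
          return packet_bits / (rate_mbps * 1e6)

      def rate_efficiency(rate_mbps, range_m, packet_bits=12_000):
          # Metres of forward progress obtainable per second of airtime.
          return range_m / airtime_s(rate_mbps, packet_bits)

      best_rate, best_range = max(RATES, key=lambda r: rate_efficiency(*r))
      print("most efficient rate:", best_rate, "Mbit/s, range", best_range, "m")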

  12. Ultra-high-throughput Production of III-V/Si Wafer for Electronic and Photonic Applications.

    Science.gov (United States)

    Geum, Dae-Myeong; Park, Min-Su; Lim, Ju Young; Yang, Hyun-Duk; Song, Jin Dong; Kim, Chang Zoo; Yoon, Euijoon; Kim, SangHyeon; Choi, Won Jun

    2016-02-11

    Si-based integrated circuits have been intensively developed over the past several decades through ultimate device scaling. However, Si technology has reached the physical limitations of scaling. These limitations have fuelled the search for alternative active materials (for transistors) and the introduction of optical interconnects (called "Si photonics"). A series of attempts to circumvent the limits of Si technology are based on the use of III-V compound semiconductors due to their superior properties, such as high electron mobility and a direct bandgap. To use their physical properties on a Si platform, the formation of high-quality III-V films on Si (III-V/Si) is the basic technology; however, implementing this technology in a high-throughput process is not easy. Here, we report new concepts for the ultra-high-throughput heterogeneous integration of high-quality III-V films on Si using wafer bonding and the epitaxial lift-off (ELO) technique. We describe the ultra-fast ELO and also the re-use of the III-V donor wafer after III-V/Si formation. These approaches provide ultra-high-throughput fabrication of III-V/Si substrates with a high-quality film, which leads to a dramatic cost reduction. As proof-of-concept devices, this paper demonstrates GaAs-based high electron mobility transistors (HEMTs), solar cells, and hetero-junction phototransistors on Si substrates.

  13. Microfluidic Tools for Protein Crystallography

    Science.gov (United States)

    Abdallah, Bahige G.

    X-ray crystallography is the most widely used method to determine the structure of proteins, providing an understanding of their functions in all aspects of life to advance applications in fields such as drug development and renewable energy. New techniques, namely serial femtosecond crystallography (SFX), have unlocked the ability to unravel the structures of complex proteins with vital biological functions. A key step and major bottleneck of structure determination is protein crystallization, which is very arduous due to the complexity of proteins and their natural environments. Furthermore, crystal characteristics govern data quality, thus need to be optimized to attain the most accurate reconstruction of the protein structure. Crystal size is one such characteristic in which narrowed distributions with a small modal size can significantly reduce the amount of protein needed for SFX. A novel microfluidic sorting platform was developed to isolate viable ~200 nm -- ~600 nm photosystem I (PSI) membrane protein crystals from ~200 nm -- ~20 µm crystal samples using dielectrophoresis, as confirmed by fluorescence microscopy, second-order nonlinear imaging of chiral crystals (SONICC), and dynamic light scattering. The platform was scaled-up to rapidly provide 100s of microliters of sorted crystals necessary for SFX, in which similar crystal size distributions were attained. Transmission electron microscopy was used to view the PSI crystal lattice, which remained well-ordered postsorting, and SFX diffraction data was obtained, confirming a high-quality, viable crystal sample. Simulations indicated sorted samples provided accurate, complete SFX datasets with 3500-fold less protein than unsorted samples. Microfluidic devices were also developed for versatile, rapid protein crystallization screening using nanovolumes of sample. Concentration gradients of protein and precipitant were generated to crystallize PSI, phycocyanin, and lysozyme using modified counterdiffusion

  14. Label-free high-throughput imaging flow cytometry

    Science.gov (United States)

    Mahjoubfar, A.; Chen, C.; Niazi, K. R.; Rabizadeh, S.; Jalali, B.

    2014-03-01

    Flow cytometry is an optical method for studying cells based on their individual physical and chemical characteristics. It is widely used in clinical diagnosis, medical research, and biotechnology for analysis of blood cells and other cells in suspension. Conventional flow cytometers aim a laser beam at a stream of cells and measure the elastic scattering of light at forward and side angles. They also perform single-point measurements of fluorescent emissions from labeled cells. However, many reagents used in cell labeling reduce cellular viability or change the behavior of the target cells through the activation of undesired cellular processes or inhibition of normal cellular activity. Therefore, labeled cells are not completely representative of their unaltered form nor are they fully reliable for downstream studies. To remove the requirement of cell labeling in flow cytometry, while still meeting the classification sensitivity and specificity goals, measurement of additional biophysical parameters is essential. Here, we introduce an interferometric imaging flow cytometer based on the world's fastest continuous-time camera. Our system simultaneously measures cellular size, scattering, and protein concentration as supplementary biophysical parameters for label-free cell classification. It exploits the wide bandwidth of ultrafast laser pulses to perform blur-free quantitative phase and intensity imaging at flow speeds as high as 10 meters per second and achieves nanometer-scale optical path length resolution for precise measurements of cellular protein concentration.

  15. High-throughput search for new permanent magnet materials.

    Science.gov (United States)

    Goll, D; Loeffler, R; Herbst, J; Karimi, R; Schneider, G

    2014-02-12

    The currently highest-performance Fe-Nd-B magnets show limited cost-effectiveness and lifetime due to their rare-earth (RE) content. The demand for novel hard magnetic phases with more widely available RE metals, reduced RE content or, even better, completely free of RE metals is therefore tremendous. The chances are that such materials still exist given the large number of as yet unexplored alloy systems. To discover such phases, an elaborate concept is necessary which can restrict and prioritize the search field while making use of efficient synthesis and analysis methods. It is shown that an efficient synthesis of new phases using heterogeneous non-equilibrium diffusion couples and reaction sintering is possible. Quantitative microstructure analysis of the domain pattern of the hard magnetic phases can be used to estimate the intrinsic magnetic parameters (saturation polarization from the domain contrast, anisotropy constant from the domain width, Curie temperature from the temperature dependence of the domain contrast). The probability of detecting TM-rich phases for a given system is high, therefore the approach enables one to scan through even higher component systems with one single sample. The visualization of newly occurring hard magnetic phases via their typical domain structure and the correlation existing between domain structure and intrinsic magnetic properties allows an evaluation of the industrial relevance of these novel phases.

  16. High-throughput solution processing of large-scale graphene

    Science.gov (United States)

    Tung, Vincent C.; Allen, Matthew J.; Yang, Yang; Kaner, Richard B.

    2009-01-01

    The electronic properties of graphene, such as high charge carrier concentrations and mobilities, make it a promising candidate for next-generation nanoelectronic devices. In particular, electrons and holes can undergo ballistic transport on the sub-micrometre scale in graphene and do not suffer from the scale limitations of current MOSFET technologies. However, it is still difficult to produce single-layer samples of graphene and bulk processing has not yet been achieved, despite strenuous efforts to develop a scalable production method. Here, we report a versatile solution-based process for the large-scale production of single-layer chemically converted graphene over the entire area of a silicon/SiO2 wafer. By dispersing graphite oxide paper in pure hydrazine we were able to remove oxygen functionalities and restore the planar geometry of the single sheets. The chemically converted graphene sheets that were produced have the largest area reported to date (up to 20 × 40 µm), making them far easier to process. Field-effect devices have been fabricated by conventional photolithography, displaying currents that are three orders of magnitude higher than previously reported for chemically produced graphene. The size of these sheets enables a wide range of characterization techniques, including optical microscopy, scanning electron microscopy and atomic force microscopy, to be performed on the same specimen.

  17. Scrutinizing virus genome termini by high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Shasha Li

    Full Text Available Analysis of genomic terminal sequences has been a major step in studies of viral DNA replication and packaging mechanisms. However, traditional methods for studying genome termini are challenging because of their time-consuming protocols and their inefficiency, whereby critical details are easily lost. Recent advances in next-generation sequencing (NGS) have made it a powerful tool for studying genome termini. In this study, using NGS we sequenced one iridovirus genome and twenty phage genomes and confirmed for the first time that the high-frequency sequences (HFSs) found in the NGS reads are indeed the terminal sequences of the viral genomes. Further, we established a criterion to distinguish the type of termini and the viral packaging mode. We also obtained additional terminal details such as terminal repeats, multiple termini and asymmetric termini. With this approach, we were able to simultaneously detect details of the genome termini and obtain the complete sequences of bacteriophage genomes. Theoretically, this application can be further extended to analyze the larger and more complicated genomes of plant and animal viruses. This study proposed a novel and efficient method for research on viral replication, packaging, terminase activity, transcription regulation, and metabolism of the host cell.
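
    In essence, the approach looks for positions where an unusually large number of reads begin; a minimal sketch of that counting step is shown below, assuming mapped 5'-start coordinates are already available as integers (coverage normalisation and the paper's terminus-type criteria are omitted).

      from collections import Counter

      def candidate_termini(read_starts, top_n=3):
          # Return the most frequent read start positions; in the approach above,
          # such high-frequency positions mark putative genome termini.
          return Counter(read_starts).most_common(top_n)

      # Toy example: a pile-up of reads starting at position 1 (a putative terminus)
      # against a background of reads starting at unique positions.
      starts = [1] * 50 + list(range(2, 52))
      print(candidate_termini(starts))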

  18. High throughput interferometric Doppler technique for planet detection

    Science.gov (United States)

    Mahadevan, Suvrath

    We have developed a novel instrument called the Exoplanet Tracker (ET) that can measure precise differential radial velocities, as well as barycentric radial velocities. ET is installed at the Kitt Peak 2.1 meter telescope and uses a Michelson interferometer in series with a medium resolution spectrograph. This instrument allows stellar radial velocities to be measured precisely without the use of a high resolution spectrograph. This allows the instrument to be very efficient in collecting light from the telescope. ET can achieve a radial velocity precision of 5-10 m s-1 over a 10 day observing run. A survey for extrasolar planets using the ET instrument has led to the detection of radial velocity variability for the star HD102195. Using photometry, CaII HK measurements, and precision radial velocities we demonstrate that these radial velocity variations are caused by a giant planet in a 4.11 day orbit around HD102195. A prototype monolithic interferometer has also been built for the ET instrument and is capable of delivering precise radial velocities. A large multi-object radial velocity instrument based on the ET instrument has been built and installed at the wide field Sloan 2.5 m telescope. This instrument, called the W. M. Keck Exoplanet Tracker, is capable of obtaining precise radial velocities for 59 stars simultaneously. Over the next few years this multi-object instrument will be used to conduct an All Sky ExoPlanet Survey capable of efficiently searching thousands of stars for planets.

  19. High-Throughput Plasmid cDNA Library Screening

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Kenneth H.; Yu, Charles; George, Reed A.; Carlson, Joseph W.; Hoskins, Roger A.; Svirskas, Robert; Stapleton, Mark; Celniker, Susan E.

    2006-05-24

    Libraries of cDNA clones are valuable resources for analysing the expression, structure, and regulation of genes, as well as for studying protein functions and interactions. Full-length cDNA clones provide information about intron and exon structures, splice junctions and 5'- and 3'-untranslated regions (UTRs). Open reading frames (ORFs) derived from cDNA clones can be used to generate constructs allowing expression of native proteins and N- or C-terminally tagged proteins. Thus, obtaining full-length cDNA clones and sequences for most or all genes in an organism is critical for understanding genome functions. Expressed sequence tag (EST) sequencing samples cDNA libraries at random, which is most useful at the beginning of large-scale screening projects. However, as projects progress towards completion, the probability of identifying unique cDNAs via EST sequencing diminishes, resulting in poor recovery of rare transcripts. We describe an adapted, high-throughput protocol intended for recovery of specific, full-length clones from plasmid cDNA libraries in five days.

  20. High-throughput machining using high average power ultrashort pulse lasers and ultrafast polygon scanner

    Science.gov (United States)

    Schille, Joerg; Schneider, Lutz; Streek, André; Kloetzer, Sascha; Loeschner, Udo

    2016-03-01

    In this paper, high-throughput ultrashort pulse laser machining is investigated on various industrial-grade metals (aluminium, copper, stainless steel) and Al2O3 ceramic at unprecedented processing speeds. This is achieved by using a high pulse repetition frequency picosecond laser with a maximum average output power of 270 W in conjunction with a unique, in-house developed two-axis polygon scanner. Initially, different concepts of polygon scanners are engineered and tested to find the optimal architecture for ultrafast and precise laser beam scanning. A remarkable scan speed of 1,000 m/s is achieved on the substrate, and thanks to the resulting low pulse overlap, thermal accumulation and plasma absorption effects are avoided at pulse repetition frequencies of up to 20 MHz. In order to identify optimum processing conditions for efficient high-average-power laser machining, the depths of cavities produced under varied parameter settings are analyzed and, from the results obtained, the characteristic removal values are specified. The maximum removal rate is as high as 27.8 mm3/min for aluminium, 21.4 mm3/min for copper, 15.3 mm3/min for stainless steel and 129.1 mm3/min for Al2O3 when the full available laser power is applied at the optimum pulse repetition frequency.

  1. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  2. High-throughput determination of malondialdehyde in plant tissues.

    Science.gov (United States)

    Davey, M W; Stals, E; Panis, B; Keulemans, J; Swennen, R L

    2005-12-15

    Malondialdehyde (MDA) is a widely used marker of oxidative lipid injury whose concentration varies in response to biotic and abiotic stress. Commonly, MDA is quantified as a strong light-absorbing and fluorescing adduct following reaction with thiobarbituric acid (TBA). However, plant tissues in particular contain many compounds that potentially interfere with this reaction and whose concentrations also vary according to the tissue type and stress conditions. As part of our studies into the stress responses of plant tissues, we were interested in developing a rapid, accurate, and robust protocol for MDA analysis using reversed-phase HPLC to avoid these problems with reaction specificity. We demonstrate that a partitioning step into n-butanol during sample preparation is essential and that gradient HPLC analysis is necessary to prevent sample carryover between injections. Furthermore, the starting composition of the mobile phase must be sufficiently hydrophobic to allow direct injection of the n-butanol extracts without peak splitting, tailing, and other artifacts. To minimize analysis times, we used a short, so-called "Rocket" HPLC column and high flow rates. The optimized HPLC separation has a turnaround time of 2.5 min per sample. Butanolic extracts of MDA(TBA)2 were stable for at least 48 h, and recoveries were linear between 0.38 and 7.5 pmol MDA added. Importantly, this procedure proved to be compatible with existing extraction procedures for L-ascorbate and glutathione analysis in different plant species, allowing multiple "stress metabolite" analyses to be carried out on a single tissue extract.

  3. Development of a high-throughput Candida albicans biofilm chip.

    Directory of Open Access Journals (Sweden)

    Anand Srinivasan

    Full Text Available We have developed a high-density microarray platform consisting of nano-biofilms of Candida albicans. A robotic microarrayer was used to print yeast cells of C. albicans encapsulated in a collagen matrix at a volume as low as 50 nL onto surface-modified microscope slides. Upon incubation, the cells grow into fully formed "nano-biofilms". The morphological and architectural complexity of these biofilms were evaluated by scanning electron and confocal scanning laser microscopy. The extent of biofilm formation was determined using a microarray scanner from changes in fluorescence intensities due to FUN 1 metabolic processing. This staining technique was also adapted for antifungal susceptibility testing, which demonstrated that, similar to regular biofilms, cells within the on-chip biofilms displayed elevated levels of resistance against antifungal agents (fluconazole and amphotericin B). Thus, results from structural analyses and antifungal susceptibility testing indicated that despite miniaturization, these biofilms display the typical phenotypic properties associated with the biofilm mode of growth. In its final format, the C. albicans biofilm chip (CaBChip) is composed of 768 equivalent and spatially distinct nano-biofilms on a single slide; multiple chips can be printed and processed simultaneously. Compared to current methods for the formation of microbial biofilms, namely the 96-well microtiter plate model, this fungal biofilm chip has advantages in terms of miniaturization and automation, which combine to cut reagent use and analysis time, minimize labor intensive steps, and dramatically reduce assay costs. Such a chip should accelerate the antifungal drug discovery process by enabling rapid, convenient and inexpensive screening of hundreds-to-thousands of compounds simultaneously.

  4. Development of a high-throughput Candida albicans biofilm chip.

    Science.gov (United States)

    Srinivasan, Anand; Uppuluri, Priya; Lopez-Ribot, Jose; Ramasubramanian, Anand K

    2011-04-22

    We have developed a high-density microarray platform consisting of nano-biofilms of Candida albicans. A robotic microarrayer was used to print yeast cells of C. albicans encapsulated in a collagen matrix at a volume as low as 50 nL onto surface-modified microscope slides. Upon incubation, the cells grow into fully formed "nano-biofilms". The morphological and architectural complexity of these biofilms were evaluated by scanning electron and confocal scanning laser microscopy. The extent of biofilm formation was determined using a microarray scanner from changes in fluorescence intensities due to FUN 1 metabolic processing. This staining technique was also adapted for antifungal susceptibility testing, which demonstrated that, similar to regular biofilms, cells within the on-chip biofilms displayed elevated levels of resistance against antifungal agents (fluconazole and amphotericin B). Thus, results from structural analyses and antifungal susceptibility testing indicated that despite miniaturization, these biofilms display the typical phenotypic properties associated with the biofilm mode of growth. In its final format, the C. albicans biofilm chip (CaBChip) is composed of 768 equivalent and spatially distinct nano-biofilms on a single slide; multiple chips can be printed and processed simultaneously. Compared to current methods for the formation of microbial biofilms, namely the 96-well microtiter plate model, this fungal biofilm chip has advantages in terms of miniaturization and automation, which combine to cut reagent use and analysis time, minimize labor intensive steps, and dramatically reduce assay costs. Such a chip should accelerate the antifungal drug discovery process by enabling rapid, convenient and inexpensive screening of hundreds-to-thousands of compounds simultaneously.

  5. Optically encoded microspheres for high-throughput analysis of genes and proteins

    Science.gov (United States)

    Gao, Xiaohu; Han, Mingyong; Nie, Shuming

    2002-06-01

    We have developed a novel optical coding technology for massively parallel and high-throughput analysis of biological molecules. Its unprecedented multiplexing capability is based on the unique optical properties of semiconductor quantum dots (QDs) and the ability to incorporate multicolor QDs into small polymer beads at precisely controlled ratios. The use of 10 intensity levels and 6 colors could theoretically code one million nucleic acid or protein sequences. Imaging and spectroscopic studies indicate that the QD-tagged beads are highly uniform and reproducible, yielding bead identification accuracies as high as 99.99 percent under favorable conditions. DNA hybridization results demonstrate that the coding and target signals can be simultaneously read at the single-bead level. This spectral coding technology is expected to open new opportunities in gene expression studies, high-throughput screening, and medical diagnosis.
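
    The multiplexing capacity quoted above follows from simple combinatorics. With n colours and m distinguishable intensity levels per colour, the number of distinct bead codes (excluding the all-off combination, an assumption stated here rather than in the record) is

        N = m^n - 1 = 10^6 - 1 ≈ 1,000,000,

    which is the "one million sequences" figure given in the record.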

  6. Evaluation of Compatibility of ToxCast High-Throughput/High-Content Screening Assays with Engineered Nanomaterials

    Science.gov (United States)

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  7. Antileishmanial High-Throughput Drug Screening Reveals Drug Candidates with New Scaffolds

    OpenAIRE

    Siqueira-Neto, Jair L; Ok-Ryul Song; Hyunrim Oh; Jeong-Hun Sohn; Gyongseon Yang; Jiyoun Nam; Jiyeon Jang; Jonathan Cechetto; Chang Bok Lee; Seunghyun Moon; Auguste Genovesio; Eric Chatelain; Thierry Christophe; Freitas-Junior, Lucio H.

    2010-01-01

    International audience; Drugs currently available for leishmaniasis treatment often show parasite resistance, highly toxic side effects and prohibitive costs commonly incompatible with patients from the tropical endemic countries. In this sense, there is an urgent need for new drugs as a treatment solution for this neglected disease. Here we show the development and implementation of an automated high-throughput viability screening assay for the discovery of new drugs against Leishmania. Assa...

  8. Developing highER-throughput zebrafish screens for in-vivo CNS drug discovery

    OpenAIRE

    Adam Michael Stewart; Robert eGerlai; Kalueff, Allan V.

    2015-01-01

    The high prevalence of brain disorders and the lack of their efficient treatments necessitate improved in-vivo pre-clinical models and tests. The zebrafish (Danio rerio), a vertebrate species with high genetic and physiological homology to humans, is an excellent organism for innovative central nervous system (CNS) drug discovery and small molecule screening. Here, we outline new strategies for developing higher-throughput zebrafish screens to test neuroactive drugs and predict their pharmaco...

  9. A High-Throughput, Adaptive FFT Architecture for FPGA-Based Space-Borne Data Processors

    Science.gov (United States)

    Nguyen, Kayla; Zheng, Jason; He, Yutao; Shah, Biren

    2010-01-01

    Historically, computationally intensive data processing for space-borne instruments has relied heavily on ground-based computing resources. But with recent advances in the functional densities of Field-Programmable Gate Arrays (FPGAs), there has been an increasing desire to shift more processing on-board, thereby relaxing the downlink data bandwidth requirements. Fast Fourier Transforms (FFTs) are commonly used building blocks for data processing applications, with a growing need to increase the FFT block size. Many existing FFT architectures have mainly emphasized low power consumption or resource usage; but as the block size of the FFT grows, the throughput is often compromised first. In addition to power and resource constraints, space-borne digital systems are also limited to a small set of space-qualified memory elements, which typically lag behind their commercially available counterparts in capacity and bandwidth. The bandwidth limitation of the external memory creates a bottleneck for a large, high-throughput FFT design with a large block size. In this paper, we present the Multi-Pass Wide Kernel FFT (MPWK-FFT) architecture for a moderately large block size (32K) with consideration of power consumption and resource usage, as well as throughput. We also show that the architecture can be easily adapted for different FFT block sizes with different throughput and power requirements. The design is completely contained within an FPGA without relying on external memories. Implementation results are summarized.
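
    The "multi-pass" principle, computing one large FFT as several passes of much smaller FFTs with a twiddle-factor multiplication in between, can be illustrated numerically with the classic Cooley-Tukey split below (32,768 points as 128 x 256). This is a NumPy illustration of the decomposition only, not the MPWK-FFT hardware design, whose kernel width and memory organisation are not reproduced here.

      import numpy as np

      def multipass_fft(x, n1=128, n2=256):
          # Two-pass FFT for N = n1 * n2 points (here 32768 = 128 x 256).
          n = n1 * n2
          assert x.size == n
          a = x.reshape(n1, n2)                     # a[p, q] = x[p*n2 + q]
          a = np.fft.fft(a, axis=0)                 # pass 1: n2 FFTs of length n1
          k1 = np.arange(n1)[:, None]
          q = np.arange(n2)[None, :]
          a = a * np.exp(-2j * np.pi * k1 * q / n)  # inter-pass twiddle factors
          a = np.fft.fft(a, axis=1)                 # pass 2: n1 FFTs of length n2
          return a.T.reshape(n)                     # natural output order X[k2*n1 + k1]

      x = np.random.randn(32768) + 1j * np.random.randn(32768)
      assert np.allclose(multipass_fft(x), np.fft.fft(x))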

  10. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work aims to deliver a high simulation throughput while, at the same time, guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator can attain a simulation speed that is slower than hardware execution by a factor of no more than 35 on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve high accuracy after hardware-based calibration. Experimental results on a set of mobile applications show that the difference between the simulated and measured timing performance is within 10%, which in the past could only be attained by cycle-accurate models.

  11. Adaptation and validation of DNA synthesis detection by fluorescent dye derivatization for high-throughput screening.

    Science.gov (United States)

    Ranall, Max V; Gabrielli, Brian G; Gonda, Thomas J

    2010-05-01

    Cellular proliferation is fundamental to organism development, tissue renewal, and diverse disease states such as cancer. In vitro measurement of proliferation by high-throughput screening allows rapid characterization of the effects of small-molecule or genetic treatments on primary and established cell lines. Current assays that directly measure the cell cycle are not amenable to high-throughput processing and analysis. Here we report the adaptation of the chemical method for detecting DNA synthesis by 5-ethynyl-2'-deoxyuridine (EdU) incorporation into both high-throughput liquid handling and high-content imaging analysis. We demonstrate that chemical detection of EdU incorporation is effective for high-resolution analysis and quantitation of DNA synthesis by high-content imaging. To validate this assay platform we used treatments of MCF10A cells with media supplements and pharmacological inhibitors that are known to affect cell proliferation. Treatments with specific kinase inhibitors indicate that EGF and serum stimulation employs both the mitogen extracellular kinase (MEK)/extracellular-regulated kinase (ERK) and phosphoinositol-3 kinase (PI3K)/AKT signaling networks. As described here, this method is fast, reliable, and inexpensive and yields robust data that can be easily interpreted.
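
    As an illustration of how such high-content readouts are commonly scored, the sketch below computes the fraction of EdU-positive nuclei from per-nucleus mean intensities. The inputs are assumed to come from an upstream image-segmentation step, and the median-plus-MAD threshold against untreated-control nuclei is an illustrative gating rule, not necessarily the authors' exact analysis.

```python
import numpy as np

def percent_edu_positive(nuclear_edu_intensity, control_intensity, n_mad=3.0):
    """Score each nucleus as EdU-positive (i.e. synthesizing DNA) if its mean
    EdU signal exceeds a threshold set from control nuclei: median + n_mad * MAD."""
    ctrl = np.asarray(control_intensity, dtype=float)
    mad = np.median(np.abs(ctrl - np.median(ctrl)))
    threshold = np.median(ctrl) + n_mad * mad
    positive = np.asarray(nuclear_edu_intensity, dtype=float) > threshold
    return 100.0 * positive.mean()
```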

  12. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because blood is easily accessible, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are "masked" by many non-significant species present at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  13. Development of a high-throughput replicon assay for the identification of respiratory syncytial virus inhibitors.

    Science.gov (United States)

    Tiong-Yip, Choi-Lai; Plant, Helen; Sharpe, Paul; Fan, Jun; Rich, Kirsty; Gorseth, Elise; Yu, Qin

    2014-01-01

    Respiratory syncytial virus (RSV) drug discovery has been hindered by the lack of good chemistry starting points and would benefit from robust and convenient assays for high-throughput screening (HTS). In this paper, we present the development and optimization of a 384-well RSV replicon assay that enabled HTS for RSV replication inhibitors with a low bio-containment requirement. The established replicon assay was successfully implemented for high-throughput screening. A validation screen was performed which demonstrated high assay performance and reproducibility. Assay quality was further confirmed via demonstration of appropriate pharmacology for different classes of RSV replication tool inhibitors. RSV replicon and cytotoxicity assays were further developed into a multiplexed format that measured both inhibition of viral replication and cytotoxicity from the same well. This provided a time and cost efficient approach to support lead optimization. In summary, we have developed a robust RSV replicon assay to help expedite the discovery of novel RSV therapeutics.
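
    The abstract does not state which statistic was used to judge assay performance; a common choice for validating an HTS assay of this kind is the Z'-factor computed from plate positive and negative controls, sketched below as an assumption rather than the paper's documented analysis.

```python
import numpy as np

def z_prime(positive_controls, negative_controls):
    """Z'-factor (Zhang et al., 1999), the standard HTS assay-quality metric;
    values above roughly 0.5 are usually considered screen-ready."""
    pos = np.asarray(positive_controls, dtype=float)
    neg = np.asarray(negative_controls, dtype=float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())
```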

  14. High-Throughput Continuous Flow Production of Nanoscale Liposomes by Microfluidic Vertical Flow Focusing.

    Science.gov (United States)

    Hood, Renee R; DeVoe, Don L

    2015-11-18

    Liposomes represent a leading class of nanoparticles for drug delivery. While a variety of techniques for liposome synthesis have been reported that take advantage of microfluidic flow elements to achieve precise control over the size and polydispersity of nanoscale liposomes, with important implications for nanomedicine applications, these methods suffer from extremely limited throughput, making them impractical for large-scale nanoparticle synthesis. High aspect ratio microfluidic vertical flow focusing is investigated here as a new approach to overcoming the throughput limits of established microfluidic nanoparticle synthesis techniques. Here the vertical flow focusing technique is utilized to generate populations of small, unilamellar, and nearly monodisperse liposomal nanoparticles with exceptionally high production rates and remarkable sample homogeneity. By leveraging this platform, liposomes with modal diameters ranging from 80 to 200 nm are prepared at production rates as high as 1.6 mg min(-1) in a simple flow-through process.

  15. High throughput label-free platform for statistical bio-molecular sensing.

    Science.gov (United States)

    Bosco, Filippo G; Hwu, En-Te; Chen, Ching-Hsiu; Keller, Stephan; Bache, Michael; Jakobsen, Mogens H; Hwang, Ing-Shouh; Boisen, Anja

    2011-07-21

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report the demonstration of a high-throughput label-free sensor platform utilizing cantilever based sensors, which have often been acclaimed to facilitate highly parallelized operation. Unfortunately, so far no concept has been presented which offers large datasets as well as easy liquid sample handling. We use optics and mechanics from a DVD player to handle liquid samples and to read out cantilever deflection and resonant frequency. Surface roughness is also measured; when combined with cantilever deflection, roughness is found to hold valuable additional information on specific and unspecific binding events. In a few minutes, 30 liquid samples can be analyzed in parallel, each by 24 cantilever-based sensors. The approach was used to detect the binding of streptavidin and antibodies.

  16. Fully automatized high-throughput enzyme library screening using a robotic platform.

    Science.gov (United States)

    Dörr, Mark; Fibinger, Michael P C; Last, Daniel; Schmidt, Sandy; Santos-Aberturas, Javier; Böttcher, Dominique; Hummel, Anke; Vickers, Clare; Voss, Moritz; Bornscheuer, Uwe T

    2016-07-01

    A fully automatized robotic platform has been established to facilitate high-throughput screening for protein engineering purposes. This platform enables proper monitoring and control of growth conditions in the microtiter plate format to ensure precise enzyme production for the interrogation of enzyme mutant libraries, protein stability tests and multiple assay screenings. The performance of this system has been exemplified for four enzyme classes important for biocatalysis such as Baeyer-Villiger monooxygenase, transaminase, dehalogenase and acylase in the high-throughput screening of various mutant libraries. This allowed the identification of novel enzyme variants in a sophisticated and highly reliable manner. Furthermore, the detailed optimization protocols should enable other researchers to adapt and improve their methods. Biotechnol. Bioeng. 2016;113: 1421-1432. © 2016 Wiley Periodicals, Inc.

  17. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions.

  18. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Tanel Pärnamaa

    2017-05-01

    Full Text Available High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy.
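
    The paper's 11-layer network is not reproduced here, but the flavor of such a classifier can be sketched in a few lines of PyTorch. The two-channel 64 x 64 single-cell crops and the 12 compartment classes below are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class LocalizationCNN(nn.Module):
    """Minimal convolutional classifier that maps single-cell fluorescence
    crops to a fixed set of subcellular-compartment classes."""
    def __init__(self, n_classes=12):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 16 -> 8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(128 * 8 * 8, 256), nn.ReLU(), nn.Linear(256, n_classes)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = LocalizationCNN()
logits = model(torch.randn(4, 2, 64, 64))    # a batch of 4 two-channel cell crops
probabilities = torch.softmax(logits, dim=1)
```

    The abstract's follow-on step, using the trained network as a feature calculator for classifiers that handle previously unseen compartments, would correspond to reusing the convolutional stack (here `model.features`) as a fixed feature extractor.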

  19. A High Throughput Biochemical Fluorometric Method for Measuring Lipid Peroxidation in HDL

    Science.gov (United States)

    Kelesidis, Theodoros; Roberts, Christian K.; Huynh, Diana; Martínez-Maza, Otoniel; Currier, Judith S.; Reddy, Srinivasa T.; Yang, Otto O.

    2014-01-01

    Current cell-based assays for determining the functional properties of high-density lipoproteins (HDL) have limitations. We report here the development of a new, robust fluorometric cell-free biochemical assay that measures HDL lipid peroxidation (HDLox) based on the oxidation of the fluorochrome Amplex Red. HDLox correlated with previously validated cell-based assays (r = 0.47) in established animal models of atherosclerosis and in Human Immunodeficiency Virus (HIV) patients. Using an immunoaffinity method for capturing HDL, we demonstrate the utility of this novel assay for measuring HDLox in a high throughput format. Furthermore, HDLox correlated significantly with measures of cardiovascular disease, including carotid intima media thickness (r = 0.35), supporting its use as a measure of HDL function/quality that is suitable for high throughput implementation. PMID:25368900

  20. Turning tumor-promoting copper into an anti-cancer weapon via high-throughput chemistry.

    Science.gov (United States)

    Wang, F; Jiao, P; Qi, M; Frezza, M; Dou, Q P; Yan, B

    2010-01-01

    Copper is an essential element for multiple biological processes. Its concentration is elevated to very high levels in cancer tissues, where it promotes cancer development through processes such as angiogenesis. Organic chelators of copper can passively reduce cellular copper and thereby serve as inhibitors of angiogenesis. However, they can also actively attack cellular targets such as the proteasome, which plays a critical role in cancer development and survival. The discovery of such molecules initially relied on step-by-step synthesis followed by biological assays. Today, high-throughput chemistry and high-throughput screening have significantly expedited the discovery of copper-binding molecules that turn "cancer-promoting" copper into anti-cancer agents.

  1. A high-throughput UPLC method for the characterization of chemical modifications in monoclonal antibody molecules.

    Science.gov (United States)

    Stackhouse, Nicole; Miller, Amanda K; Gadgil, Himanshu S

    2011-12-01

    Development of high-throughput release and characterization assays is critical for the effective support of the rapidly growing biologics pipeline for biotherapeutics. Clipping of polypeptide chains is commonly monitored during process optimization, formulation development, and stability studies. A reduced capillary electrophoresis-sodium dodecyl sulfate (rCE-SDS) method is often used as a purity release assay for monitoring clips in monoclonal antibodies (mAbs); however, it has a cycle time of approximately 40 min, which is not suited for high-throughput screening. Additionally, the characterization of clips and variants from electropherograms is not straightforward and takes significant time. Reduced reversed-phase (RP) chromatography has been a popular assay for the characterization and identification of clips and variants because it can be directly coupled with online mass spectrometric analysis. However, the high column temperature and low pH required for RP assays can induce on-column cleavage and therefore skew the results. To minimize on-column degradation, we have developed a high-throughput method with a significantly shorter cycle time of 5 min. The short cycle time was achieved using an ultra-high-pressure liquid chromatography (UPLC) system with a 1.7 μm phenyl column. This UPLC method allowed quantitation of hinge clipping in an IgG1 molecule and acid-induced aspartic acid/proline (D/P) clipping in an IgG2 molecule. The results from the UPLC method were comparable to those obtained with rCE-SDS. Additionally, the phenyl column offered partial resolution of oxidation and other chemical modifications, making this technique an attractive assay for high-throughput process characterization and formulation screens. Copyright © 2011 Wiley-Liss, Inc.

  2. High throughput heme assay by detection of chemiluminescence of reconstituted horseradish peroxidase.

    Science.gov (United States)

    Takahashi, Shigekazu; Masuda, Tatsuru

    2009-06-01

    In living organisms, heme is an essential molecule for various biological functions. Recent studies also suggest that heme functions as an organelle-derived signal that regulates fundamental cell processes. Furthermore, estimation of heme is widely used for studying various blood disorders. In this regard, the development of a rapid, sensitive, and high throughput heme assay has been sought. The most frequently used method of measuring heme, pyridine hemochrome, is time, labor, and material intensive, and therefore of limited utility for large scale, high throughput analysis. Recently, we reported an alternative method that is sensitive and specific to heme, based on the ability of the horseradish peroxidase (HRP) apo-enzyme to reconstitute with heme to form an active holo-enzyme. Here, we developed a high throughput heme assay by performing reactions on a multi-well plate with highly sensitive chemiluminescence detection reagents. Detection of chemiluminescence in a charge-coupled device (CCD)-based gel doc apparatus enables simultaneous measurement of multiple samples. Furthermore, the high sensitivity of this assay allowed direct measurement of heme in solvent extracts after dilution. This assay is sensitive, quick, provides a large dynamic range, and is well suited for large-scale analysis of heme extracted from minute amounts of sample.
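
    Converting such plate-reader chemiluminescence readings into heme concentrations is typically done by interpolating against a standard curve; the sketch below assumes a linear response over the working range, which is an illustrative simplification rather than the published calibration procedure.

```python
import numpy as np

def heme_from_luminescence(sample_signal, standard_conc, standard_signal):
    """Fit a linear standard curve (signal vs. heme concentration) and invert
    it to estimate the heme concentration of unknown samples."""
    slope, intercept = np.polyfit(standard_conc, standard_signal, 1)
    return (np.asarray(sample_signal, dtype=float) - intercept) / slope
```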

  3. High-throughput synchrotron X-ray diffraction for combinatorial phase mapping.

    Science.gov (United States)

    Gregoire, J M; Van Campen, D G; Miller, C E; Jones, R J R; Suram, S K; Mehta, A

    2014-11-01

    Discovery of new materials drives the deployment of new technologies. Complex technological requirements demand precisely tailored material functionalities, and materials scientists are driven to search for these new materials in compositionally complex and often non-equilibrium spaces containing three, four or more elements. The phase behavior of these high-order composition spaces is mostly unknown and unexplored. High-throughput methods can offer strategies for efficiently searching complex and multi-dimensional material genomes for these much needed new materials and can also suggest a processing pathway for synthesizing them. However, high-throughput structural characterization is still relatively under-developed for rapid material discovery. Here, a synchrotron X-ray diffraction and fluorescence experiment for rapid measurement of both X-ray powder patterns and compositions for an array of samples in a material library is presented. The experiment is capable of measuring more than 5000 samples per day, as demonstrated by the acquisition of high-quality powder patterns in a bismuth-vanadium-iron oxide composition library. A detailed discussion of the scattering geometry and its ability to be tailored for different material systems is provided, with specific attention given to the characterization of fiber textured thin films. The described prototype facility is capable of meeting the structural characterization needs for the first generation of high-throughput material genomic searches.

  4. Automated disposable small scale reactor for high throughput bioprocess development: a proof of concept study.

    Science.gov (United States)

    Bareither, Rachel; Bargh, Neil; Oakeshott, Robert; Watts, Kathryn; Pollard, David

    2013-12-01

    The acceleration of bioprocess development for biologics and vaccines can be enabled by automated high throughput technologies. This would alleviate the significant resource burden of the multi-factorial statistical experimentation required for controlling the product quality attributes of complex biologics. Recent technology advances have improved clone evaluation and screening, but have struggled to combine the scale-down criteria required for both high cell density cell culture and microbial processes with sufficient automation and disposable technologies to accelerate process development. This article describes proof of concept evaluations of an automated disposable small scale reactor for high throughput upstream process development. Characterization studies established that the disposable 250 mL small-scale stirred-tank reactor behaves similarly to lab- and pilot-scale reactors. The reactor generated equivalent process performance for industrial biologics processes for therapeutic protein and monoclonal antibody production using CHO cell culture, Pichia pastoris and E. coli, including similar growth, cell viability, product titer, and product quality. The technology was shown to be robust across multiple runs and met the requirements for running high cell density processes (>400 g/L wet cell weight) with exponential feeds and sophisticated event-triggered processes. Combining this reactor into an automated array of reactors will ultimately be part of a high throughput process development strategy that combines upstream and small scale purification with rapid analytics, dramatically shortening the timelines and costs of developing biological processes.

  5. Spectral Unmixing Plate Reader: High-Throughput, High-Precision FRET Assays in Living Cells.

    Science.gov (United States)

    Schaaf, Tory M; Peterson, Kurt C; Grant, Benjamin D; Thomas, David D; Gillispie, Gregory D

    2017-03-01

    We have developed a microplate reader that records a complete high-quality fluorescence emission spectrum on a well-by-well basis under true high-throughput screening (HTS) conditions. The read time for an entire 384-well plate is less than 3 min. This instrument is particularly well suited for assays based on fluorescence resonance energy transfer (FRET). Intramolecular protein biosensors with genetically encoded green fluorescent protein (GFP) donor and red fluorescent protein (RFP) acceptor tags at positions sensitive to structural changes were stably expressed and studied in living HEK cells. Accurate quantitation of FRET was achieved by decomposing each observed spectrum into a linear combination of four component (basis) spectra (GFP emission, RFP emission, water Raman, and cell autofluorescence). Excitation and detection are both conducted from the top, allowing for thermoelectric control of the sample temperature from below. This spectral unmixing plate reader (SUPR) delivers an unprecedented combination of speed, precision, and accuracy for studying ensemble-averaged FRET in living cells. It complements our previously reported fluorescence lifetime plate reader, which offers the feature of resolving multiple FRET populations within the ensemble. The combination of these two direct waveform-recording technologies greatly enhances the precision and information content for HTS in drug discovery.
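
    At its core, the per-well spectral decomposition is a linear least-squares fit of the observed emission spectrum to the four reference spectra; the sketch below shows that step with ordinary least squares, noting that the instrument's actual fitting routine is not specified in the abstract.

```python
import numpy as np

def unmix(spectrum, basis):
    """Fit an observed emission spectrum as a linear combination of reference
    (basis) spectra, e.g. columns for GFP emission, RFP emission, water Raman
    and cell autofluorescence, using ordinary least squares."""
    coefficients, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
    return coefficients

# basis: (n_wavelengths, 4) array of component spectra; spectrum: (n_wavelengths,)
# A per-well FRET readout can then be formed from the RFP/GFP coefficient ratio.
```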

  6. Adapting human pluripotent stem cells to high-throughput and high-content screening.

    Science.gov (United States)

    Desbordes, Sabrina C; Studer, Lorenz

    2013-01-01

    The increasing use of human pluripotent stem cells (hPSCs) as a source of cells for drug discovery, cytotoxicity assessment and disease modeling requires their adaptation to large-scale culture conditions and screening formats. Here, we describe a simple and robust protocol for the adaptation of human embryonic stem cells (hESCs) to high-throughput screening (HTS). This protocol can also be adapted to human induced pluripotent stem cells (hiPSCs) and high-content screening (HCS). We also describe a 7-d assay to identify compounds with an effect on hESC self-renewal and differentiation. This assay can be adapted to a variety of applications. The procedure involves the culture expansion of hESCs, their adaptation to 384-well plates, the addition of small molecules or other factors, and finally data acquisition and processing. In this protocol, the optimal number of hESCs plated in 384-well plates has been adapted to HTS/HCS assays of 7 d.

  7. Overview on the current status of virtual high-throughput screening and combinatorial chemistry approaches in multi-target anticancer drug discovery; Part I.

    Science.gov (United States)

    Geromichalos, George D; Alifieris, Constantinos E; Geromichalou, Elena G; Trafalis, Dimitrios T

    2016-01-01

    Conventional drug design embraces the "one gene, one drug, one disease" philosophy. Nowadays, a new generation of anticancer drugs, able to inhibit more than one pathway, is believed to play a major role in contemporary anticancer drug research. In this way, polypharmacology, focusing on multi-target drugs, has emerged as a new paradigm in drug discovery. A number of recent successful drugs have in part or in whole emerged from a structure-based research approach; many advances, including crystallography and informatics, are behind these successes. Increasing insight into the genetics and molecular biology of cancer has resulted in the identification of an increasing number of potential molecular targets for anticancer drug discovery and development. These targets can be approached through exploitation of emerging structural biology, "rational" drug design, screening of chemical libraries, or a combination of these methods, resulting in the rapid discovery of new anticancer drugs. In this article we discuss the application of molecular modeling, molecular docking and virtual high-throughput screening to multi-targeted anticancer drug discovery. Efforts have been made to employ in silico methods to facilitate the search for and design of selective multi-target agents. These computer-aided molecular design methods have shown promising potential in facilitating drug discovery directed at multiple selective targets and are expected to contribute to the intelligent design of lead anticancer drugs.

  8. High throughput and multiplex localization of proteins and cells for in situ micropatterning using pneumatic microfluidics.

    Science.gov (United States)

    Wang, Jian-Chun; Liu, Wenming; Tu, Qin; Ma, Chao; Zhao, Lei; Wang, Yaolei; Ouyang, Jia; Pang, Long; Wang, Jinyi

    2015-02-07

    Micropatterning technologies are emerging as an enabling tool for various microfluidic-based applications in life sciences. However, the high throughput and multiplex localization of multiple bio-components in a microfluidic device has not yet been well established. In this paper, we describe a simple and in situ micropatterning method using an integrated microfluidic device with pneumatic microstructures (PμSs) for highly controllable immobilization of both proteins and cells in a high throughput, geometry-dynamic, and multi-patterning way. The precise Pluronic F127 passivation of a microchamber surface except the PμS-blocked regions was performed and characterized, and the spatial dynamics and consistency of both the PμSs and protein/cell micropatterning were optically evaluated and quantitatively demonstrated too. Furthermore, a systematic investigation of PμS-assisted micropatterning in microfluidics was carried out. The feature of high throughput and spatial control of micropatterning can be simply realized by using the well-designed PμS arrays. Meanwhile, the co-micropatterning of different proteins (bovine serum albumin and chicken egg albumin) and cells (human umbilical vein endothelial cells and human hepatocellular carcinoma cells) in a microfluidic device was successfully accomplished with the orderly serial manipulation of PμS groups. We demonstrate that PμS-assisted micropatterning can be applied as a convenient microfluidic component for large-scale and diversified protein/cell patterning and manipulation, which could be useful for cell-based tissue organization, high-throughput imaging, protein-related interactions and immunoassays.

  9. Infra-red Thermography for High Throughput Field Phenotyping in Solanum tuberosum

    Science.gov (United States)

    Prashar, Ankush; Yildiz, Jane; McNicol, James W.; Bryan, Glenn J.; Jones, Hamlyn G.

    2013-01-01

    The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions. PMID:23762433
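
    The normalization step described above, expressing a genotype's canopy temperature as a difference from the image mean, is straightforward to state in code; the sketch below assumes a per-pixel temperature array and a boolean mask marking the genotype's plot, both hypothetical inputs standing in for the thermal-image processing pipeline.

```python
import numpy as np

def normalized_canopy_temperature(pixel_temperatures, plot_mask):
    """Mean canopy temperature of one genotype's plot expressed as a difference
    from the whole-image mean, so plots imaged at different times of day remain comparable."""
    temps = np.asarray(pixel_temperatures, dtype=float)
    return temps[plot_mask].mean() - temps.mean()
```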

  10. High throughput protein-protein interaction data: clues for the architecture of protein complexes

    Directory of Open Access Journals (Sweden)

    Pang Chi

    2008-11-01

    Full Text Available Abstract. Background: High-throughput techniques are becoming widely used to study protein-protein interactions and protein complexes on a proteome-wide scale. Here we have explored the potential of these techniques to accurately determine the constituent proteins of complexes and their architecture within the complex. Results: Two-dimensional representations of the 19S and 20S proteasome, mediator, and SAGA complexes were generated and overlaid with high quality pairwise interaction data, core-module-attachment classifications from affinity purifications of complexes and predicted domain-domain interactions. Pairwise interaction data could accurately determine the members of each complex, but was unexpectedly poor at deciphering the topology of proteins in complexes. Core and module data from affinity purification studies were less useful for accurately defining the member proteins of these complexes. However, these data gave strong information on the spatial proximity of many proteins. Predicted domain-domain interactions provided some insight into the topology of proteins within complexes, but were affected by a lack of available structural data for the co-activator complexes and the presence of shared domains in paralogous proteins. Conclusion: The constituent proteins of complexes are likely to be determined with accuracy by combining data from high-throughput techniques. The topology of some proteins in the complexes will be able to be clearly inferred. We finally suggest strategies that can be employed to use high throughput interaction data to define the membership and understand the architecture of proteins in novel complexes.

  11. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Andre; Bruno, Kenneth S.; Collett, James R.; Baker, Scott E.; Seiboth, Bernhard; Kubicek, Christian P.; Schmoll, Monika

    2012-01-02

    The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  12. Acoustic Droplet Ejection Applications for High-Throughput Screening of Infectious Agents.

    Science.gov (United States)

    Rasmussen, Lynn; White, E Lucile; Bostwick, James R

    2016-02-01

    When acoustic droplet ejection technology was first introduced for high-throughput applications, it was used primarily for dispensing compounds dissolved in DMSO. The high precision and accuracy achieved for low-volume transfers in this application were noted by those working outside of the compound management area, and interest was generated in expanding the scope of the technology to include other liquid types. Later-generation instruments included calibrations for several aqueous buffers that were applicable to the life sciences. The High Throughput Screening Center at Southern Research has made use of this range of liquid calibrations for the Infectious Disease Program. The original calibration for DMSO has allowed the preparation of assay-ready plates that can be sent to remote locations. This process was used as part of the collaboration between Southern Research and Galveston National Laboratory, University of Texas Medical Branch, to develop high-throughput screening for biological safety level 4 containment and to provide compounds for two pilot screens that were run there with BSL-4-level pathogens. The aqueous calibrations have been instrumental in miniaturizing assays used for infectious disease, such as qPCR, tissue culture infectious dose 50, and bacterial motility, to make them compatible with HTS operations.

  13. Droplet microfluidic technology for single-cell high-throughput screening.

    Science.gov (United States)

    Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L

    2009-08-25

    We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at a very high-throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically-coded droplet library enabling the identification of the droplets composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.

  14. High-throughput SHAPE and hydroxyl radical analysis of RNA structure and ribonucleoprotein assembly.

    Science.gov (United States)

    McGinnis, Jennifer L; Duncan, Caia D S; Weeks, Kevin M

    2009-01-01

    RNA folds to form complex structures vital to many cellular functions. Proteins facilitate RNA folding at both the secondary and tertiary structure levels. An absolute prerequisite for understanding RNA folding and ribonucleoprotein (RNP) assembly reactions is a complete understanding of the RNA structure at each stage of the folding or assembly process. Here we provide a guide for comprehensive and high-throughput analysis of RNA secondary and tertiary structure using SHAPE and hydroxyl radical footprinting. As an example of the strong and sometimes surprising conclusions that can emerge from high-throughput analysis of RNA folding and RNP assembly, we summarize the structure of the bI3 group I intron RNA in four distinct states. Dramatic structural rearrangements occur in both secondary and tertiary structure as the RNA folds from the free state to the active, six-component, RNP complex. As high-throughput and high-resolution approaches are applied broadly to large protein-RNA complexes, other proteins previously viewed as making simple contributions to RNA folding are also likely to be found to exert multifaceted, long-range, cooperative, and nonadditive effects on RNA folding. These protein-induced contributions add another level of control, and potential regulatory function, in RNP complexes.

  15. Algorithm-driven high-throughput screening of colloidal nanoparticles under simulated physiological and therapeutic conditions.

    Science.gov (United States)

    Bhirde, Ashwinkumar A; Sindiri, Sivasish; Calco, Gina N; Aronova, Maria A; Beaucage, Serge L

    2017-02-09

    Colloidal nanoparticles have shown tremendous potential as cancer drug carriers and as phototherapeutics. However, the stability of nanoparticles under physiological and phototherapeutic conditions is a daunting issue, which needs to be addressed in order to ensure a successful clinical translation. The design, development and implementation of unique algorithms are described herein for high-throughput hydrodynamic size measurements of colloidal nanoparticles. The data obtained from such measurements provide clinically-relevant particle size distribution assessments that are directly related to the stability and aggregation profiles of the nanoparticles under putative physiological and phototherapeutic conditions; those profiles are not only dependent on the size and surface coating of the nanoparticles, but also on their composition. Uncoated nanoparticles showed varying degrees of association with bovine serum albumin, whereas PEGylated nanoparticles did not exhibit significant association with the protein. The algorithm-driven, high-throughput size screening method described in this report provides highly meaningful size measurement patterns stemming from the association of colloidal particles with bovine serum albumin used as a protein model. Noteworthy is that this algorithm-based high-throughput method can accomplish sophisticated hydrodynamic size measurement protocols within days instead of years it would take conventional hydrodynamic size measurement techniques to achieve a similar task.
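
    The screening algorithms themselves are not given in the abstract, but hydrodynamic size measurements of this kind ultimately rest on the Stokes-Einstein relation between diffusion coefficient and particle size. The sketch below performs that conversion; the temperature and viscosity defaults (water-like buffer at 25 C) are assumptions, not the paper's measurement conditions.

```python
import numpy as np

def hydrodynamic_diameter(diffusion_coeff_m2_s, temperature_k=298.15, viscosity_pa_s=8.9e-4):
    """Stokes-Einstein relation: d_H = k_B * T / (3 * pi * eta * D)."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temperature_k / (3.0 * np.pi * viscosity_pa_s * diffusion_coeff_m2_s)

# Example: a measured D of 4.3e-12 m^2/s gives d_H of roughly 1.1e-7 m (~115 nm),
# a size range in which protein-induced aggregation shifts are readily detectable.
```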

  16. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.

  17. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimal system is not necessarily the one with the highest product titer, but the one resulting in superior overall process performance across up- and downstream steps. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capture were subsequently validated. The overall best system was chosen based on a combination of excellent up- and downstream performance.

  18. Organic chemistry. Nanomole-scale high-throughput chemistry for the synthesis of complex molecules.

    Science.gov (United States)

    Buitrago Santanilla, Alexander; Regalado, Erik L; Pereira, Tony; Shevlin, Michael; Bateman, Kevin; Campeau, Louis-Charles; Schneeweis, Jonathan; Berritt, Simon; Shi, Zhi-Cai; Nantermet, Philippe; Liu, Yong; Helmy, Roy; Welch, Christopher J; Vachal, Petr; Davies, Ian W; Cernak, Tim; Dreher, Spencer D

    2015-01-02

    At the forefront of new synthetic endeavors, such as drug discovery or natural product synthesis, large quantities of material are rarely available and timelines are tight. A miniaturized automation platform enabling high-throughput experimentation for synthetic route scouting to identify conditions for preparative reaction scale-up would be a transformative advance. Because automated, miniaturized chemistry is difficult to carry out in the presence of solids or volatile organic solvents, most of the synthetic "toolkit" cannot be readily miniaturized. Using palladium-catalyzed cross-coupling reactions as a test case, we developed automation-friendly reactions to run in dimethyl sulfoxide at room temperature. This advance enabled us to couple the robotics used in biotechnology with emerging mass spectrometry-based high-throughput analysis techniques. More than 1500 chemistry experiments were carried out in less than a day, using as little as 0.02 milligrams of material per reaction.

  19. High throughput functional epitope mapping: revisiting phage display platform to scan target antigen surface.

    Science.gov (United States)

    Rojas, Gertrudis; Tundidor, Yaima; Infante, Yanelys Cabrera

    2014-01-01

    Antibody engineering must be accompanied by mapping strategies focused on identifying the epitope recognized by each antibody to define its unique functional identity. High throughput fine specificity determination remains technically challenging. We review recent experiences aimed at revisiting the oldest and most extended display technology to develop a robust epitope mapping platform, based on the ability to manipulate target-derived molecules (ranging from the whole native antigen to antigen domains and smaller fragments) on filamentous phages. Single, multiple and combinatorial mutagenesis allowed comprehensive scanning of phage-displayed antigen surface that resulted in the identification of clusters of residues contributing to epitope formation. Functional pictures of the epitope(s) were thus delineated in the natural context. Successful mapping of antibodies against interleukin-2, epidermal growth factor and its receptor, and vascular endothelial growth factor showed the versatility of these procedures, which combine the accuracy of site-directed mutagenesis with the high throughput potential of phage display.

  20. Multiple and high-throughput droplet reactions via combination of microsampling technique and microfluidic chip

    KAUST Repository

    Wu, Jinbo

    2012-11-20

    Microdroplets offer unique compartments for accommodating a large number of chemical and biological reactions in tiny volume with precise control. A major concern in droplet-based microfluidics is the difficulty to address droplets individually and achieve high throughput at the same time. Here, we have combined an improved cartridge sampling technique with a microfluidic chip to perform droplet screenings and aggressive reaction with minimal (nanoliter-scale) reagent consumption. The droplet composition, distance, volume (nanoliter to subnanoliter scale), number, and sequence could be precisely and digitally programmed through the improved sampling technique, while sample evaporation and cross-contamination are effectively eliminated. Our combined device provides a simple model to utilize multiple droplets for various reactions with low reagent consumption and high throughput. © 2012 American Chemical Society.

  1. Inhibitors of the Yersinia protein tyrosine phosphatase through high throughput and virtual screening approaches.

    Science.gov (United States)

    Hu, Xin; Vujanac, Milos; Southall, Noel; Stebbins, C Erec

    2013-02-15

    The bacterial protein tyrosine phosphatase YopH is an essential virulence determinant in Yersinia pestis and a potential antibacterial drug target. Here we report our studies of screening for small molecule inhibitors of YopH using both high throughput and in silico approaches. The identified inhibitors represent a diversity of chemotypes and novel pTyr mimetics, providing a starting point for further development and fragment-based design of multi-site binding inhibitors. We demonstrate that the applications of high throughput and virtual screening, when guided by structural binding mode analysis, is an effective approach for identifying potent and selective inhibitors of YopH and other protein phosphatases for rational drug design. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy.

    Science.gov (United States)

    Lai, Yiu Wai; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios; Hofmann, Martin R; Ludwig, Alfred

    2011-10-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.
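
    In reflection-mode digital holography the film thickness at a patterned step follows directly from the measured phase difference, h = lambda * delta_phi / (4 * pi). The sketch below applies this relation; the 532 nm wavelength is an assumed value (the instrument's laser wavelength is not stated here) and material-dependent phase shifts on reflection are ignored.

```python
import numpy as np

def step_height_from_phase(delta_phi_rad, wavelength_m=532e-9):
    """Reflection-mode holography: a step of height h adds an optical path of 2h,
    so the unwrapped phase step maps to h = wavelength * delta_phi / (4 * pi)."""
    return wavelength_m * np.asarray(delta_phi_rad, dtype=float) / (4.0 * np.pi)

# Example: a measured phase step of 2.36 rad at 532 nm corresponds to ~100 nm of film.
```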

  3. High-throughput syntheses of iron phosphite open frameworks in ionic liquids

    Science.gov (United States)

    Wang, Zhixiu; Mu, Ying; Wang, Yilin; Bing, Qiming; Su, Tan; Liu, Jingyao

    2017-02-01

    Three open-framework iron phosphites, Fe(II)5(NH4)2(HPO3)6 (1), Fe(II)2Fe(III)(NH4)(HPO3)4 (2) and Fe(III)2(HPO3)3 (3), have been synthesized under ionothermal conditions. How the different synthesis parameters, such as gel concentration, synthesis time, reaction temperature and solvent, affect the products has been monitored using high-throughput approaches, and within each type of experiment the relevant products have been investigated. The optimal reaction conditions were obtained from this series of high-throughput experiments. All the structures were determined by single-crystal X-ray diffraction analysis and further characterized by PXRD, TGA and FTIR analyses. A magnetic study reveals that the three compounds show interesting magnetic behavior at low temperature.

  4. Miniaturization of High-Throughput Epigenetic Methyltransferase Assays with Acoustic Liquid Handling.

    Science.gov (United States)

    Edwards, Bonnie; Lesnick, John; Wang, Jing; Tang, Nga; Peters, Carl

    2016-02-01

    Epigenetics continues to emerge as an important target class for drug discovery and cancer research. As programs scale to evaluate many new targets related to epigenetic expression, new tools and techniques are required to enable efficient and reproducible high-throughput epigenetic screening. Assay miniaturization increases screening throughput and reduces operating costs. Echo liquid handlers can transfer compounds, samples, reagents, and beads in submicroliter volumes to high-density assay formats using only acoustic energy-no contact or tips required. This eliminates tip costs and reduces the risk of reagent carryover. In this study, we demonstrate the miniaturization of a methyltransferase assay using Echo liquid handlers and two different assay technologies: AlphaLISA from PerkinElmer and EPIgeneous HTRF from Cisbio.

  5. A pulse-front-tilt–compensated streaked optical spectrometer with high throughput and picosecond time resolution

    Energy Technology Data Exchange (ETDEWEB)

    Katz, J., E-mail: jkat@lle.rochester.edu; Boni, R.; Rivlis, R. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623-1299 (United States); Muir, C. [Department of Mechanical Engineering, University of Rochester, Rochester, New York 14623-1299 (United States); Froula, D. H. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623-1299 (United States); Department of Physics, University of Rochester, Rochester, New York 14623-1299 (United States)

    2016-11-15

    A high-throughput, broadband optical spectrometer coupled to the Rochester optical streak system equipped with a Photonis P820 streak tube was designed to record time-resolved spectra with 1-ps time resolution. Spectral resolution of 0.8 nm is achieved over a wavelength coverage range of 480 to 580 nm, using a 300-groove/mm diffraction grating in conjunction with a pair of 225-mm-focal-length doublets operating at an f/2.9 aperture. Overall pulse-front tilt across the beam diameter generated by the diffraction grating is reduced by preferentially delaying discrete segments of the collimated input beam using a 34-element reflective echelon optic. The introduced delay temporally aligns the beam segments and the net pulse-front tilt is limited to the accumulation across an individual sub-element. The resulting spectrometer design balances resolving power and pulse-front tilt while maintaining high throughput.
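
    From the quoted grating and focal length, the reciprocal linear dispersion at the streak-tube photocathode can be estimated with the standard grating relation dx/dlambda = m * f / (d * cos(beta)); the small-angle approximation cos(beta) ~ 1 used below is an assumption about the exact mounting geometry.

```python
# Back-of-the-envelope dispersion estimate for a 300 groove/mm grating
# imaged by a 225 mm focal-length doublet.
groove_density = 300e3            # grooves per metre (300 grooves/mm)
focal_length = 0.225              # imaging doublet focal length, m
order = 1                         # diffraction order (assumed)
d = 1.0 / groove_density          # groove spacing, m

dispersion = order * focal_length / d          # metres at detector per metre of wavelength
print(dispersion * 1e-9 * 1e3)                 # ~0.068 mm of detector per nm
print(0.8e-9 * dispersion * 1e6)               # the quoted 0.8 nm resolution spans ~54 um
```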

  6. Miniature high-throughput chemosensing of yield, ee, and absolute configuration from crude reaction mixtures

    Science.gov (United States)

    Bentley, Keith W.; Zhang, Peng; Wolf, Christian

    2016-01-01

    High-throughput experimentation (HTE) has emerged as a widely used technology that accelerates discovery and optimization processes with parallel small-scale reaction setups. A high-throughput screening (HTS) method capable of comprehensive analysis of crude asymmetric reaction mixtures (eliminating product derivatization or isolation) would provide transformative impact by matching the pace of HTE. We report how spontaneous in situ construction of stereodynamic metal probes from readily available, inexpensive starting materials can be applied to chiroptical chemosensing of the total amount, enantiomeric excess (ee), and absolute configuration of a wide variety of amines, diamines, amino alcohols, amino acids, carboxylic acids, α-hydroxy acids, and diols. This advance and HTS potential are highlighted with the analysis of 1 mg of crude reaction mixtures of a catalytic asymmetric reaction. This operationally simple assay uses a robust mix-and-measure protocol, is amenable to microscale platforms and automation, and provides critical time efficiency and sustainability advantages over traditional serial methods. PMID:26933684

  7. Gradient Technology for High-Throughput Screening of Interactions between Cells and Nanostructured Materials

    Directory of Open Access Journals (Sweden)

    Andrew Michelmore

    2012-01-01

    Full Text Available We present a novel substrate suitable for the high-throughput analysis of cell response to variations in surface chemistry and nanotopography. Electrochemical etching was used to produce silicon wafers with nanopores between 10 and 100 nm in diameter. Over this substrate and flat silicon wafers, a gradient film ranging from hydrocarbon to carboxylic acid plasma polymer was deposited, with the concentration of surface carboxylic acid groups varying between 0.7 and 3% as measured by XPS. MG63 osteoblast-like cells were then cultured on these substrates and showed greatest cell spreading and adhesion onto porous silicon with a carboxylic acid group concentration between 2-3%. This method has great potential for high-throughput screening of cell-material interaction with particular relevance to tissue engineering.

  8. High-Throughput Synthesis and Characterization of BiMoVOX Materials

    Science.gov (United States)

    Russu, Sergio; Tromp, Moniek; Tsapatsaris, Nikolaos; Beesley, Angela M.; Schroeder, Sven L. M.; Weller, Mark T.; Evans, John

    2007-02-01

    The high throughput synthesis and characterization of a particular family of ceramic materials, bismuth molybdenum vanadium oxides (BiMoVOX), suitable as inorganic yellow pigments and low temperature oxidation catalysts, is described. Samples, synthesized by calcination and peroxo sol-gel methods, are characterized by X-ray powder diffraction, UV-visible and XAFS spectroscopy. A combined high-throughput XRD/XAFS study of a 54-sample array, with simultaneous refinement of data from both techniques, has been performed. Molybdenum doping of bismuth vanadate results in a phase transition from monoclinic BiVO4 to tetragonal Bi(V,Mo)O4, both of scheelite type. Both central metals, V5+ and Mo6+, remain in a tetrahedral coordination. UV/visible spectroscopy identifies a linear blue shift as a function of Mo6+ content.

  9. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review recent advances in immune repertoire studies of infectious diseases achieved with both traditional techniques and high-throughput sequencing techniques. High-throughput sequencing enables the determination of the complementarity-determining regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for the further development of novel diagnostic markers, immunotherapies and vaccines.

  10. From cradle to grave: high-throughput studies of aging in model organisms.

    Science.gov (United States)

    Spivey, Eric C; Finkelstein, Ilya J

    2014-07-01

    Aging, the progressive decline of biological functions, is a universal fact of life. Decades of intense research in unicellular and metazoan model organisms have highlighted that aging manifests at all levels of biological organization, from the decline of individual cells to tissue and organism degeneration. To better understand the aging process, we must first aim to integrate quantitative biological understanding at the cellular and systems levels. A second key challenge is then to understand the many heterogeneous outcomes that can arise in aging cells and to connect cellular aging to organism-wide degeneration. Addressing these challenges requires the development of high-throughput aging and longevity assays. In this review, we highlight the emergence of high-throughput aging approaches in the most commonly used model organisms. We conclude with a discussion of the critical questions that can be addressed with these new methods.

  11. A pulse-front-tilt-compensated streaked optical spectrometer with high throughput and picosecond time resolution

    Science.gov (United States)

    Katz, J.; Boni, R.; Rivlis, R.; Muir, C.; Froula, D. H.

    2016-11-01

    A high-throughput, broadband optical spectrometer coupled to the Rochester optical streak system equipped with a Photonis P820 streak tube was designed to record time-resolved spectra with 1-ps time resolution. Spectral resolution of 0.8 nm is achieved over a wavelength coverage range of 480 to 580 nm, using a 300-groove/mm diffraction grating in conjunction with a pair of 225-mm-focal-length doublets operating at an f/2.9 aperture. Overall pulse-front tilt across the beam diameter generated by the diffraction grating is reduced by preferentially delaying discrete segments of the collimated input beam using a 34-element reflective echelon optic. The introduced delay temporally aligns the beam segments and the net pulse-front tilt is limited to the accumulation across an individual sub-element. The resulting spectrometer design balances resolving power and pulse-front tilt while maintaining high throughput.
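    The pulse-front-tilt argument can be put in rough numbers. The sketch below uses the standard grating relation tan γ = mλ/(d cos θd) together with the 300-groove/mm grating and 34-element echelon quoted in the abstract; the beam diameter and diffraction angle are assumed values, so the printed picosecond figures are illustrative only.

```python
# Back-of-the-envelope sketch of the pulse-front-tilt argument in the abstract.
# The grating (300 grooves/mm), wavelength band, and echelon element count come
# from the text; the beam diameter and diffraction angle are assumptions.
import math

c = 3.0e8                      # speed of light, m/s
groove_density = 300e3         # grooves per metre (300 gr/mm)
d = 1 / groove_density         # groove spacing, m
wavelength = 530e-9            # mid-band wavelength, m
order = 1
theta_d = math.radians(10.0)   # assumed diffraction angle
beam_diameter = 25e-3          # assumed collimated beam diameter, m
n_segments = 34                # reflective echelon elements, from the text

# Pulse-front tilt from angular dispersion: tan(gamma) = m * lambda / (d * cos(theta_d))
tan_gamma = order * wavelength / (d * math.cos(theta_d))

full_tilt = beam_diameter * tan_gamma / c   # temporal smear across the whole beam
residual = full_tilt / n_segments           # smear remaining within one echelon segment

print(f"Tilt across full beam: {full_tilt * 1e12:.1f} ps")
print(f"Residual per segment:  {residual * 1e12:.2f} ps")
```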

  12. Transcriptome characteristics of filamentous fungi deduced using high-throughput analytical technologies.

    Science.gov (United States)

    Meijueiro, Martha Lucía; Santoyo, Francisco; Ramírez, Lucía; Pisabarro, Antonio G

    2014-11-01

    Transcriptomes are the complete set of transcripts expressed at a given time point by a given organism, organ, tissue or cell. The availability of high-throughput analytical techniques and, especially, the democratization of RNA sequencing on new platforms have made it possible to turn transcriptome analysis into a routine study affordable for most laboratories. In many cases, however, there is a certain reluctance toward the use of these technologies because of a lack of knowledge about what has been done, what can be done, and how high-throughput sequencing can help solve specific scientific questions. Here, we try to answer some initial questions about fungal transcriptome analysis, provide some examples of fungal biology questions that have been addressed using this approach, and extract some general conclusions about transcriptome structure and dynamics in fungal systems.

  13. Increasing the delivery of next generation therapeutics from high throughput screening libraries.

    Science.gov (United States)

    Wigglesworth, Mark J; Murray, David C; Blackett, Carolyn J; Kossenjans, Michael; Nissink, J Willem M

    2015-06-01

    The pharmaceutical industry has historically relied on high-throughput screening as a cornerstone for identifying chemical equity for drug discovery projects. However, with pharmaceutical companies moving through a phase of diminished returns and alternative hit identification strategies proving successful, it is more important than ever to understand how this approach can be used more effectively to increase the delivery of next-generation therapeutics from high-throughput screening libraries. A wide literature describes HTS and fragment-based screening approaches and offers clear direction on the process for these two distinct activities. However, few have considered how best to identify medium- to low-molecular-weight compounds from large diversity screening sets and increase downstream success.

  14. High-throughput engineering and analysis of peptide binding to class II MHC.

    Science.gov (United States)

    Jiang, Wei; Boder, Eric T

    2010-07-27

    Class II major histocompatibility complex (MHC-II) proteins govern stimulation of adaptive immunity by presenting antigenic peptides to CD4+ T lymphocytes. Many allelic variants of MHC-II exist with implications in peptide presentation and immunity; thus, high-throughput experimental tools for rapid and quantitative analysis of peptide binding to MHC-II are needed. Here, we present an expression system wherein peptide and MHC-II are codisplayed on the surface of yeast in an intracellular association-dependent manner and assayed by flow cytometry. Accordingly, the relative binding of different peptides and/or MHC-II variants can be assayed by genetically manipulating either partner, enabling the application of directed evolution approaches for high-throughput characterization or engineering. We demonstrate the application of this tool to map the side-chain preference for peptides binding to HLA-DR1 and to evolve novel HLA-DR1 mutants with altered peptide-binding specificity.

  15. Automated High-Throughput Root Phenotyping of Arabidopsis thaliana Under Nutrient Deficiency Conditions.

    Science.gov (United States)

    Satbhai, Santosh B; Göschl, Christian; Busch, Wolfgang

    2017-01-01

    The central question of genetics is how a genotype determines the phenotype of an organism. Genetic mapping approaches are a key for finding answers to this question. In particular, genome-wide association (GWA) studies have been rapidly adopted to study the architecture of complex quantitative traits. This was only possible due to the improvement of high-throughput and low-cost phenotyping methodologies. In this chapter we provide a detailed protocol for obtaining root trait data from the model species Arabidopsis thaliana using the semiautomated, high-throughput phenotyping pipeline BRAT (Busch-lab Root Analysis Toolchain) for early root growth under the stress condition of iron deficiency. Extracted root trait data can be directly used to perform GWA mapping using the freely accessible web application GWAPP to identify marker polymorphisms associated with the phenotype of interest.
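    As a purely conceptual illustration of the downstream GWA step (not BRAT or GWAPP code), the sketch below runs a naive per-marker linear association test between a root trait and SNP genotypes on random stand-in data; real analyses additionally correct for population structure and multiple testing.

```python
# Conceptual sketch of a GWA association scan: one linear regression per marker.
# All data are random stand-ins; this is not the pipeline described in the chapter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_accessions, n_snps = 200, 1000
genotypes = rng.integers(0, 2, size=(n_accessions, n_snps))   # biallelic markers coded 0/1
root_length = rng.normal(10.0, 2.0, size=n_accessions)        # e.g. early root length (mm)

# p-value of the trait ~ genotype regression for every marker
pvals = np.array([stats.linregress(genotypes[:, j], root_length).pvalue
                  for j in range(n_snps)])

top = np.argsort(pvals)[:5]
print("Top candidate markers:", top, pvals[top])
```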

  16. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.
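    One of the global structural parameters such a pipeline reports, the radius of gyration Rg, is commonly obtained from a Guinier fit, ln I(q) ≈ ln I(0) − q²Rg²/3 for q·Rg below about 1.3. The sketch below fits a synthetic noisy curve; it is a generic illustration, not the pipeline's own analysis code.

```python
# Illustrative Guinier analysis on a synthetic SAXS curve (not pipeline code).
import numpy as np

rg_true, i0_true = 20.0, 1000.0                   # Angstrom, arbitrary intensity units
q = np.linspace(0.005, 0.3, 300)                  # scattering vector, 1/Angstrom
intensity = i0_true * np.exp(-(q * rg_true) ** 2 / 3.0)
intensity *= 1 + np.random.default_rng(1).normal(0, 0.01, q.size)   # 1% noise

# Guinier approximation: ln I(q) ~ ln I(0) - (Rg^2 / 3) * q^2, valid for q*Rg < ~1.3
mask = q * rg_true < 1.3                          # in practice the range is chosen iteratively
slope, intercept = np.polyfit(q[mask] ** 2, np.log(intensity[mask]), 1)

rg_fit = np.sqrt(-3.0 * slope)
print(f"Fitted Rg = {rg_fit:.1f} A, I(0) = {np.exp(intercept):.0f}")
```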

  17. High-throughput strategies for the discovery and engineering of enzymes for biocatalysis.

    Science.gov (United States)

    Jacques, Philippe; Béchet, Max; Bigan, Muriel; Caly, Delphine; Chataigné, Gabrielle; Coutte, François; Flahaut, Christophe; Heuson, Egon; Leclère, Valérie; Lecouturier, Didier; Phalip, Vincent; Ravallec, Rozenn; Dhulster, Pascal; Froidevaux, Rénato

    2017-02-01

    Innovations in enzyme discovery have an impact on a wide range of industries for which biocatalysis and biotransformations represent a great challenge, e.g., the food, polymer, and chemical industries. Key tools and technologies, such as bioinformatics tools to guide mutant library design, molecular biology tools to create mutant libraries, microfluidics/microplates, parallel mini-scale bioreactors and mass spectrometry technologies to build high-throughput screening methods, and experimental design tools for screening and optimization, make it possible to advance the discovery, development and implementation of enzymes and whole cells in (bio)processes. These technological innovations are accompanied by the development and implementation of clean and sustainable integrated processes to meet the growing needs of the chemical, pharmaceutical, environmental and biorefinery industries. This review gives an overview of the benefits of the high-throughput screening approach, from the discovery and engineering of biocatalysts to cell culture for optimizing their production in integrated processes and their extraction/purification.

  18. High-throughput screening for integrative biomaterials design: exploring advances and new trends.

    Science.gov (United States)

    Oliveira, Mariana B; Mano, João F

    2014-12-01

    With the increasing need for biomaterials and tissue engineering alternatives, more accurate, rapid, and cost-saving methods and models to study biomaterial-cell interactions must be developed. We review the evolution of microarray platforms used for such studies in order to meet the criteria of complex tissue engineering biological environments. Particular aspects regarding biomaterials processing, data acquisition, and treatment are addressed. Apart from in vitro array-based strategies, we also address emerging in vivo high-throughput approaches and their associated trends, such as the role of inflammation in regeneration. The up-scaling of high-throughput methods using single-cell encapsulation systems is also explored. Possible limitations related to the use of such methods, such as spot-to-spot crosstalk, are also discussed.

  19. A droplet-based, optofluidic device for high-throughput, quantitative bioanalysis

    OpenAIRE

    Guo, Feng; Lapsley, Michael Ian; Nawaz, Ahmad Ahsan; Zhao, Yanhui; Lin, Sz-Chin Steven; Chen, Yuchao; Yang, Shikuan; Zhao, Xing-Zhong; Huang, Tony Jun

    2012-01-01

    Analysis of chemical or biomolecular contents in a tiny amount of specimen presents a significant challenge in many biochemical studies and diagnostic applications. In this work, we present a single-layer, optofluidic device for real-time, high-throughput, quantitative analysis of droplet contents. Our device integrates an optical fiber-based, on-chip detection unit with a droplet-based microfluidic unit. It can quantitatively analyze the contents of individual droplets in real-time. It also ...
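    A minimal sketch of the signal-processing side of such a device is shown below: per-droplet amplitudes are extracted from a photodetector time trace by simple peak detection. The sampling rate, droplet rate, and pulse shapes are assumed values applied to synthetic data, not the device's actual acquisition code.

```python
# Conceptual sketch (not the device's firmware): extracting per-droplet signal
# amplitudes from a photodetector time trace by simple peak detection.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
fs = 100_000                                   # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
trace = rng.normal(0, 0.02, t.size)            # baseline noise
for center in np.arange(0.005, 1.0, 0.01):     # ~100 synthetic droplets per second
    amp = rng.uniform(0.5, 1.5)
    trace += amp * np.exp(-((t - center) / 2e-4) ** 2)

peaks, props = find_peaks(trace, height=0.3, distance=int(2e-3 * fs))
print(f"Droplets detected: {peaks.size}, mean amplitude: {props['peak_heights'].mean():.2f}")
```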

  20. A high-throughput method for quantifying metabolically active yeast cells

    DEFF Research Database (Denmark)

    Nandy, Subir Kumar; Knudsen, Peter Boldsen; Rosenkjær, Alexander

    2015-01-01

    By redesigning the established methylene blue reduction test for bacteria and yeast, we present a cheap and efficient methodology for quantitative physiology of eukaryotic cells applicable to high-throughput systems. Validation of the method in fermenters and high-throughput systems proved ... equivalent, displaying reduction curves that interrelated directly with CFU counts. For growth rate estimation, the methylene blue reduction test (MBRT) proved superior, since the discriminatory nature of the method allowed for the quantification of metabolically active cells only, excluding dead cells
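    The growth-rate use of the assay can be illustrated with a short fit: if the MBRT reports metabolically active cell counts over time, the specific growth rate follows from a log-linear fit. The time series below is synthetic and the routine is a generic sketch, not the authors' analysis.

```python
# Minimal sketch of growth-rate estimation from metabolically active cell counts
# (as an MBRT-style assay reports). The counts below are made up.
import numpy as np

time_h = np.array([0, 1, 2, 3, 4, 5], dtype=float)                   # hours
active_cells = np.array([1.0e6, 1.8e6, 3.3e6, 6.1e6, 1.1e7, 2.0e7])  # cells/mL (synthetic)

# ln N(t) = ln N0 + mu * t  ->  a linear fit gives the specific growth rate mu
mu, ln_n0 = np.polyfit(time_h, np.log(active_cells), 1)

print(f"Specific growth rate mu = {mu:.2f} 1/h, doubling time = {np.log(2) / mu:.2f} h")
```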