WorldWideScience

Sample records for high throughput continuous

  1. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Barth

    2008-02-06

    AVA Solar has developed a very low cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct and test a production prototype system, fabricate PV modules and test module performance. The module manufacturing cost estimated in the original proposal was $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated and installed. Using this automated, high-throughput system, high-efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained two rounds of private equity funding, expanded to 50 people and initiated the development of a large-scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry-leading cost, which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40-$60 billion by 2012. Currently, a crystalline silicon raw-material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  2. High-throughput DNA Stretching in Continuous Elongational Flow for Genome Sequence Scanning

    Science.gov (United States)

    Meltzer, Robert; Griffis, Joshua; Safranovitch, Mikhail; Malkin, Gene; Cameron, Douglas

    2014-03-01

    Genome Sequence Scanning (GSS) identifies and compares bacterial genomes by stretching long (60-300 kb) genomic DNA restriction fragments and scanning for site-selective fluorescent probes. Practical application of GSS requires (1) high-throughput data acquisition, (2) efficient DNA stretching, and (3) reproducible DNA elasticity in the presence of intercalating fluorescent dyes. GSS utilizes a pseudo-two-dimensional, micron-scale funnel with convergent sheathing flows to stretch one molecule at a time in continuous elongational flow and to center the DNA stream over diffraction-limited confocal laser excitation spots. The funnel geometry has been optimized to maximize throughput of DNA within the desired length range (>10 million nucleobases per second). A constant-strain detection channel maximizes stretching efficiency by applying a constant parabolic tension profile to each molecule, minimizing relaxation and flow-induced tumbling. The effect of the intercalator on DNA elasticity is experimentally controlled by reacting one molecule of DNA at a time in convergent sheathing flows of the dye. Derivations of the accelerating flow and the non-linear tension distribution permit alignment of detected fluorescence traces to theoretical templates derived from whole-genome sequence data.

  3. Titer plate formatted continuous flow thermal reactors for high throughput applications: fabrication and testing

    Science.gov (United States)

    Sang-Won Park, Daniel; Chen, Pin-Chuan; You, Byoung Hee; Kim, Namwon; Park, Taehyun; Lee, Tae Yoon; Datta, Proyag; Desta, Yohannes; Soper, Steven A.; Nikitopoulos, Dimitris E.; Murphy, Michael C.

    2010-05-01

    A high-throughput, multi-well (96) polymerase chain reaction (PCR) platform, based on a continuous flow (CF) mode of operation, was developed. Each CFPCR device was confined to a footprint of 8 × 8 mm², matching the footprint of a well on a standard micro-titer plate. While several CFPCR devices have been demonstrated, this is the first example of a high-throughput, multi-well continuous flow thermal reactor configuration. The feasibility of the multi-well CFPCR device was verified at each stage of development, from manufacturing to demonstrating sample amplification. The multi-well CFPCR devices were fabricated by micro-replication in polymers (polycarbonate in this case, to accommodate the peak temperatures during thermal cycling) using double-sided hot embossing. One side of the substrate contained the thermal reactors and the opposite side was patterned with structures to enhance thermal isolation of the closely packed constant-temperature zones. A 99 bp target from a λ-DNA template was successfully amplified in a prototype multi-well CFPCR device with a total reaction time as low as ~5 min at a flow velocity of 3 mm s⁻¹ (15.3 s cycle⁻¹), with a relatively low amplification efficiency compared to a bench-top thermal cycler for a 20-cycle device; reducing the flow velocity to 1 mm s⁻¹ (46.2 s cycle⁻¹) gave a seven-fold improvement in amplification efficiency. Amplification efficiencies increased at all flow velocities for 25-cycle devices with the same configuration.
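    The cycle times quoted above follow directly from the flow velocity and the channel path length per thermal cycle. A minimal sketch of that arithmetic, with the per-cycle path length inferred from the reported 3 mm s⁻¹ data point (an inference, not a value stated in the abstract):

```python
# Back-of-the-envelope check of the CFPCR cycle times quoted in the abstract.
# The per-cycle channel length is inferred from the reported numbers and is an
# assumption, not a value given in the source.

def cycle_time_s(path_length_mm: float, velocity_mm_s: float) -> float:
    """Residence time for one thermal cycle at a given mean flow velocity."""
    return path_length_mm / velocity_mm_s

# Implied path length per cycle from the 3 mm/s data point: 3 * 15.3 = 45.9 mm
path_per_cycle_mm = 3.0 * 15.3

for v in (3.0, 1.0):                      # mm/s, the two velocities reported
    t_cycle = cycle_time_s(path_per_cycle_mm, v)
    t_total_min = 20 * t_cycle / 60.0     # 20-cycle device
    print(f"v = {v} mm/s -> {t_cycle:.1f} s/cycle, ~{t_total_min:.1f} min for 20 cycles")

# Gives ~15.3 s/cycle (~5 min total) at 3 mm/s and ~45.9 s/cycle (~15 min) at 1 mm/s,
# consistent with the "as low as ~5 min" figure and close to the quoted 46.2 s/cycle.
```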

  4. High-throughput continuous hydrothermal synthesis of an entire nanoceramic phase diagram.

    Science.gov (United States)

    Weng, Xiaole; Cockcroft, Jeremy K; Hyett, Geoffrey; Vickers, Martin; Boldrin, Paul; Tang, Chiu C; Thompson, Stephen P; Parker, Julia E; Knowles, Jonathan C; Rehman, Ihtesham; Parkin, Ivan; Evans, Julian R G; Darr, Jawwad A

    2009-01-01

    A novel High-Throughput Continuous Hydrothermal (HiTCH) flow synthesis reactor was used to make, directly and rapidly, a 66-sample nanoparticle library (an entire phase diagram) of nanocrystalline Ce(x)Zr(y)Y(z)O(2-delta) in less than 12 h. High-resolution, Rietveld-quality powder X-ray diffraction (PXRD) data were collected for the entire heat-treated library (1000 °C for 1 h) in less than a day using the new robotic beamline I11, located at Diamond Light Source (DLS). Consequently, the authors rapidly mapped out the phase and sintering behavior of the entire library. Of the 66 heat-treated samples, the PXRD data suggest that 43 possess the fluorite structure, of which 30 (out of 36) are ternary compositions. The speed, quantity and quality of data obtained by this new approach offer an exciting development that will allow structure-property relationships to be accessed for nanoceramics in much shorter time periods.
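    The 66-sample library size is consistent with sampling the Ce-Zr-Y ternary diagram in 10 mol% composition steps, which also yields the 36 ternary (three-cation) compositions mentioned above. A small sketch enumerating such a grid (the 10% step is an assumption that reproduces these counts, not a parameter quoted in the abstract):

```python
# Enumerate a ternary composition grid in 10 mol% steps (an assumed step size that
# reproduces the 66-sample library and the 36 ternary compositions mentioned above).
from itertools import product

step = 0.1
grid = []
for i, j in product(range(11), repeat=2):
    k = 10 - i - j
    if k >= 0:
        grid.append((i * step, j * step, k * step))  # (x_Ce, y_Zr, z_Y), x + y + z = 1

ternary = [c for c in grid if all(v > 0 for v in c)]  # all three cations present
print(len(grid), len(ternary))   # -> 66 36
```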

  5. FLIC: high-throughput, continuous analysis of feeding behaviors in Drosophila.

    Directory of Open Access Journals (Sweden)

    Jennifer Ro

    Full Text Available We present a complete hardware and software system for collecting and quantifying continuous measures of feeding behaviors in the fruit fly, Drosophila melanogaster. The FLIC (Fly Liquid-Food Interaction Counter) detects analog electronic signals as brief as 50 µs that occur when a fly makes physical contact with liquid food. Signal characteristics effectively distinguish between different types of behaviors, such as feeding and tasting events. The FLIC system performs as well as or better than popular methods for simple assays, and it provides an unprecedented opportunity to study novel components of feeding behavior, such as time-dependent changes in food preference and individual levels of motivation and hunger. Furthermore, FLIC experiments can persist indefinitely without disturbance, and we highlight this ability by establishing a detailed picture of circadian feeding behaviors in the fly. We believe that the FLIC system will work hand-in-hand with modern molecular techniques to facilitate mechanistic studies of feeding behaviors in Drosophila using modern, high-throughput technologies.

  6. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  7. Continuous high throughput molecular adhesion based cell sorting using ridged microchannels

    Science.gov (United States)

    Tasadduq, Bushra; Wang, Gonghao; Alexeev, Alexander; Sarioglu, Ali Fatih; Sulchek, Todd

    2016-11-01

    Cell molecular interactions govern important physiological processes such as stem cell homing, inflammation and cancer metastasis, but a lack of separation technologies selective for these interactions makes it challenging to sort cells by specific molecular phenotype. Other label-free separation techniques, based on size, stiffness and shape, do not provide sufficient specificity for cell type or correlation with clinical condition. We propose a novel microfluidic device capable of high-throughput, molecule-dependent separation of cells by flowing them through a microchannel decorated with ridges coated with a specific receptor molecule. The unique aspect of this sorting design is an optimized gap size: small enough to lightly squeeze cells flowing under the ridged part of the channel, increasing the surface area for interaction between the ligand on the cell surface and the coated receptor molecule, but large enough that biomechanical markers (stiffness and viscoelasticity) do not dominate the separation mechanism. We are able to separate Jurkat cells based on their expression of the PSGL-1 ligand using a ridged channel coated with P-selectin at a flow rate of 0.045 ml/min, achieving 2-fold and 5-fold enrichment of PSGL-1-positive and -negative Jurkat cells, respectively.

  8. High-Throughput Continuous Hydrothermal Synthesis of Transparent Conducting Aluminum and Gallium Co-doped Zinc Oxides.

    Science.gov (United States)

    Howard, Dougal P; Marchand, Peter; McCafferty, Liam; Carmalt, Claire J; Parkin, Ivan P; Darr, Jawwad A

    2017-04-10

    High-throughput continuous hydrothermal flow synthesis was used to generate a library of aluminum- and gallium-codoped zinc oxide nanoparticles of specific atomic ratios. Resistivities of the materials were determined by Hall effect measurements on heat-treated pressed discs and the results collated into a conductivity-composition map. Optimal resistivities of ~9 × 10⁻³ Ω cm were reproducibly achieved for several samples, for example, codoped ZnO with 2 at% Ga and 1 at% Al. The optimum sample on balance of performance and cost was deemed to be ZnO codoped with 3 at% Al and 1 at% Ga.

  9. Fabrication of continuous flow microfluidics device with 3D electrode structures for high throughput DEP applications using mechanical machining.

    Science.gov (United States)

    Zeinali, Soheila; Çetin, Barbaros; Oliaei, Samad Nadimi Bavil; Karpat, Yiğit

    2015-07-01

    Microfluidics combines micro/nano fabrication techniques with fluid flow at the microscale to provide powerful tools for controlling and manipulating chemical and biological processes. Sorting and separation of bio-particles are of great interest in diagnostics and biological analyses, and dielectrophoresis (DEP) offers unique advantages for microfluidic devices. In DEP devices, an asymmetric pair of planar electrodes can be employed to generate non-uniform electric fields. Facing 3D sidewall electrodes are considered one of the key solutions for increasing device throughput, because they generate homogeneous electric fields along the height of the microchannel. Despite these advantages, fabrication of 3D vertical electrodes presents a considerable challenge. In this study, two alternative fabrication techniques are proposed for a microfluidic device with 3D sidewall electrodes. In the first method, both the mold and the electrodes are fabricated using high-precision machining. In the second method, the mold with tilted sidewalls is fabricated using high-precision machining and the electrodes are deposited on the sidewalls by sputtering through a shadow mask fabricated by electric discharge machining. Both fabrication processes are assessed as highly repeatable and robust, and the two methods are found to be complementary with respect to channel height. Only manipulation of particles with negative DEP is demonstrated in the experiments, and throughput values of up to 10⁵ particles/min are reached in continuous flow. The experimental results are compared with simulation results, and the limitations of the fabrication techniques are also discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Ribonuclease activity of vaccinia DNA topoisomerase IB: kinetic and high-throughput inhibition studies using a robust continuous fluorescence assay.

    Science.gov (United States)

    Kwon, Keehwan; Nagarajan, Rajesh; Stivers, James T

    2004-11-30

    Vaccinia type I DNA topoisomerase exhibits a strong site-specific ribonuclease activity when provided a DNA substrate that contains a single uridine ribonucleotide within a duplex containing the sequence 5'-CCCTU-3'. The reaction involves two steps: attack of the active-site tyrosine nucleophile of topo I at the 3' phosphodiester of the uridine nucleotide to generate a covalent enzyme-DNA adduct, followed by nucleophilic attack of the uridine 2'-hydroxyl to release the covalently tethered enzyme. Here we report the first continuous spectroscopic assay for topoisomerase that allows monitoring of the ribonuclease reaction under multiple-turnover conditions. The assay is especially robust for high-throughput screening applications because sensitive molecular beacon technology is utilized, and the topoisomerase is released during the reaction, allowing turnover of multiple substrate molecules by a single enzyme molecule. Direct computer simulation of the fluorescence time courses was used to obtain the rate constants for substrate binding and release, covalent complex formation, and formation of the 2',3'-cyclic phosphodiester product of the ribonuclease reaction. The assay allowed rapid screening of a 500-member chemical library, from which several new inhibitors of topo I were identified with IC50 values in the range of 2-100 µM. Three of the most potent hits from the high-throughput screening were also found to inhibit plasmid supercoil relaxation by the enzyme, establishing the utility of the assay in identifying inhibitors of the biologically relevant DNA relaxation reaction. One of the most potent inhibitors of the vaccinia enzyme, 3-benzo[1,3]dioxol-5-yl-2-oxopropionic acid, did not inhibit the closely related human enzyme. The inhibitory mechanism of this compound is unique and involves a step required for recycling the enzyme for steady-state turnover.
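    As a rough illustration of the kinetic analysis described (simulating fluorescence time courses from the two-step mechanism under multiple-turnover conditions), the sketch below integrates a minimal binding/cleavage/release scheme. All rate constants and concentrations are illustrative placeholders, not the fitted values from the study:

```python
# Minimal simulation of a multiple-turnover topoisomerase/ribonuclease scheme:
#   E + S <-> ES -> E~DNA (covalent adduct) -> E + P (2',3'-cyclic phosphate)
# Rate constants and concentrations below are illustrative placeholders.
import numpy as np
from scipy.integrate import solve_ivp

k_on, k_off = 1e6, 1.0      # M^-1 s^-1, s^-1 (substrate binding / release)
k_cl = 0.5                  # s^-1, covalent cleavage (adduct formation)
k_rel = 0.2                 # s^-1, 2'-OH attack releasing enzyme + product

def rhs(t, y):
    E, S, ES, EA, P = y     # free enzyme, substrate, complex, covalent adduct, product
    v_bind = k_on * E * S - k_off * ES
    v_cl = k_cl * ES
    v_rel = k_rel * EA
    return [-v_bind + v_rel, -v_bind, v_bind - v_cl, v_cl - v_rel, v_rel]

y0 = [10e-9, 500e-9, 0.0, 0.0, 0.0]     # 10 nM enzyme, 500 nM substrate (placeholders)
sol = solve_ivp(rhs, (0, 600), y0, dense_output=True, max_step=1.0)

t = np.linspace(0, 600, 200)
E, S, ES, EA, P = sol.sol(t)
signal = P / y0[1]                      # fraction converted, a proxy for beacon fluorescence
print(f"fraction of substrate converted at 10 min: {signal[-1]:.2f}")
```

    Fitting the rate constants would then amount to adjusting k_on, k_off, k_cl and k_rel (e.g. with scipy.optimize) until the simulated trace matches the measured fluorescence time course.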

  11. Continuous-specimen-flow, high-throughput, 1-hour tissue processing. A system for rapid diagnostic tissue preparation.

    Science.gov (United States)

    Morales, Azorides R; Essenfeld, Harold; Essenfeld, Ervin; Duboue, Maria Carmen; Vincek, Vladimir; Nadji, Mehrdad

    2002-05-01

    Current conventional tissue-processing methods employ fixation of tissues with neutral buffered formalin, dehydration with alcohol, and clearing with xylene before paraffin impregnation. Because the time required for this procedure is usually 8 hours or longer, it is customary to process tissues in automated instruments throughout the night. Although this time-honored method continues to serve histology laboratories well, it has a number of shortcomings, such as a 1-day delay of diagnosis, the need to batch specimens, the relatively large volumes and toxicity of reagents used, and the extent of RNA degradation. Objective.-To describe a rapid new method of tissue processing using a continuous-throughput technique. Design.-We used a combination of common histologic reagents, excluding formalin and xylene, as well as microwave energy, to develop a rapid processing method. The effect of this method on the quality of histomorphology, histochemistry, immunohistochemistry, and RNA content of processed tissue was compared with that of adjacent tissue sections processed by the conventional technique. We also assessed the impact of this rapid processing system on our practice by comparing the turnaround times of surgical pathology reports before and after its implementation. Results.-The new processing method permitted preparation of paraffin blocks from fresh or prefixed tissue in about 1 hour. The procedure allowed a continuous flow of specimens at 15-minute intervals. It eliminated the use of formalin and xylene in processing and used considerably lower volumes of other chemical reagents. Histomorphologic, histochemical, and immunohistochemical results were comparable to those of parallel sections prepared by the conventional method. The new technique, however, preserved higher-quality RNA. Use of the new methodology led to the diagnosis and reporting of more than one third of surgical pathology specimens on the same day that they were received, as compared to 1% of same

  12. Throughput Optimization of Continuous Biopharmaceutical Manufacturing Facilities.

    Science.gov (United States)

    Garcia, Fernando A; Vandiver, Michael W

    2017-01-01

    In order to operate profitably under different product demand scenarios, biopharmaceutical companies must design their facilities with mass output flexibility in mind. Traditional biologics manufacturing technologies pose operational challenges in this regard due to their high costs and slow equipment turnaround times, restricting the types of products and mass quantities that can be processed. Modern plant design, however, has facilitated the development of lean and efficient bioprocessing facilities through footprint reduction and adoption of disposable and continuous manufacturing technologies. These development efforts have proven to be crucial in seeking to drastically reduce the high costs typically associated with the manufacturing of recombinant proteins. In this work, mathematical modeling is used to optimize annual production schedules for a single-product commercial facility operating with a continuous upstream and discrete batch downstream platform. Utilizing cell culture duration and volumetric productivity as process variables in the model, and annual plant throughput as the optimization objective, 3-D surface plots are created to understand the effect of process and facility design on expected mass output. The model shows that once a plant has been fully debottlenecked it is capable of processing well over a metric ton of product per year. Moreover, the analysis helped to uncover a major limiting constraint on plant performance, the stability of the neutralized, viral-inactivated pool, which may indicate that this should be a focus of attention during future process development efforts. LAY ABSTRACT: Biopharmaceutical process modeling can be used to design and optimize manufacturing facilities and help companies achieve a predetermined set of goals. One way to perform optimization is by making the most efficient use of process equipment in order to minimize the expenditure of capital, labor and plant resources. To that end, this paper introduces a
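    A minimal sketch of the kind of throughput surface described, treating annual mass output as a function of culture duration and volumetric productivity for a continuous-upstream, batch-downstream plant. All facility parameters below (bioreactor volume, yield, turnaround, operating days) are hypothetical placeholders; the abstract does not give the actual model or constraint set:

```python
# Toy annual-throughput surface for a continuous-upstream / batch-downstream facility.
# All parameters (bioreactor volume, yield, turnaround, operating days) are
# hypothetical placeholders used only to illustrate the shape of the calculation.
import numpy as np

V_bioreactor_L = 2000.0         # perfusion bioreactor working volume (placeholder)
dsp_yield = 0.7                 # overall downstream recovery (placeholder)
turnaround_days = 7.0           # clean/prepare time between culture runs (placeholder)
operating_days = 330.0          # available production days per year (placeholder)

def annual_output_kg(culture_days: float, productivity_g_L_day: float) -> float:
    """Mass output per year for a given culture duration and volumetric productivity."""
    runs_per_year = operating_days / (culture_days + turnaround_days)
    mass_per_run_g = productivity_g_L_day * V_bioreactor_L * culture_days * dsp_yield
    return runs_per_year * mass_per_run_g / 1e3

durations = np.arange(10, 61, 10)           # days of continuous culture
productivities = np.arange(0.5, 3.1, 0.5)   # volumetric productivity, g/L/day
surface = np.array([[annual_output_kg(d, p) for p in productivities] for d in durations])
print(surface.round(0))                      # rows: duration, columns: productivity (kg/yr)
```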

  13. Validation of a high-throughput fermentation system based on online monitoring of biomass and fluorescence in continuously shaken microtiter plates

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-06-01

    Full Text Available Abstract Background An advanced version of a recently reported high-throughput fermentation system with online measurement, called BioLector, and its validation are presented. The technology combines high-throughput screening and high information content by applying online monitoring of scattered light and fluorescence intensities in continuously shaken microtiter plates. Various examples of calibration of the optical measurements, clone and media screening, and promoter characterization are given. Results Bacterial and yeast biomass concentrations of up to 50 g/L cell dry weight could be linearly correlated to scattered light intensities. In media screening, the BioLector could clearly demonstrate its potential for detecting different biomass and product yields and for deducing specific growth rates to quantitatively evaluate media and nutrients. Growth inhibition due to inappropriate buffer conditions could be detected through reduced growth rates and a temporary increase in NADH fluorescence. GFP served very well as a reporter protein for investigating promoter regulation under different carbon sources in yeast strains. A clone screening of 90 different GFP-expressing Hansenula polymorpha clones depicted the broad distribution of growth behavior and an even broader distribution of GFP expression. The importance of mass transfer conditions was demonstrated by varying the filling volumes of an E. coli culture in a 96-well MTP; the different filling volumes caused deviations in culture growth and acidification, monitored via scattered light intensity and the fluorescence of a pH indicator, respectively. Conclusion The BioLector technology is a very useful tool for performing quantitative microfermentations under engineered reaction conditions. With this technique, specific yields and rates can be directly deduced from online biomass and product concentrations, which is superior to existing technologies such as microplate readers or optode
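    A minimal sketch of the calibration step implied by the linear correlation between scattered light and cell dry weight reported above; the example readings are synthetic placeholders, and only the fitting procedure is illustrative:

```python
# Linear calibration of scattered-light intensity against cell dry weight.
# The example readings are synthetic placeholders; only the fitting procedure
# reflects what the abstract describes.
import numpy as np

cdw_g_L = np.array([0.5, 1, 2, 5, 10, 20, 35, 50])              # gravimetric reference
scatter_au = np.array([12, 24, 47, 118, 242, 480, 845, 1210])   # online readings (a.u.)

slope, intercept = np.polyfit(cdw_g_L, scatter_au, 1)
predicted_cdw = (scatter_au - intercept) / slope                 # invert the calibration

r = np.corrcoef(cdw_g_L, scatter_au)[0, 1]
print(f"slope = {slope:.1f} a.u. per g/L, intercept = {intercept:.1f} a.u., r^2 = {r**2:.3f}")
```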

  14. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a high-throughput transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using high-throughput transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  15. Variations of rhizosphere bacterial communities in tea (Camellia sinensis L.) continuous cropping soil by high-throughput pyrosequencing approach.

    Science.gov (United States)

    Li, Y C; Li, Z; Li, Z W; Jiang, Y H; Weng, B Q; Lin, W X

    2016-09-01

    The goal was to investigate the dynamics of the soil bacterial community in chronosequence tea orchards. In this study, soils from tea orchards with continuous cropping histories of 1, 10 and 20 years were collected to investigate rhizosphere bacterial communities using 454 pyrosequencing. The results indicated that Gammaproteobacteria, Alphaproteobacteria, Acidobacteria and Actinobacteria were the main phyla in the tea orchard soils and accounted for more than 60% of the bacterial sequences. At the genus level, the relative abundance of beneficial bacteria, such as Pseudomonas, Rhodanobacter, Bradyrhizobium, Mycobacterium and Sphingomonas, significantly decreased in the 20-year tea orchard soils. Similar patterns of bacterial community structure were observed between the 1-year and 10-year tea orchards, which differed significantly from those of the 20-year tea orchards. Redundancy analysis indicated that soil organic carbon and pH showed high correlations (positive or negative) with the majority of the taxa. Long-term tea cultivation altered the composition and structure of the soil bacterial community, leading to a reduction in beneficial bacteria. The results can provide clues on how to regulate the soil microbial community and maintain soil health in tea orchard systems. © 2016 The Society for Applied Microbiology.

  16. High-throughput continuous hydrothermal flow synthesis of Zn-Ce oxides: unprecedented solubility of Zn in the nanoparticle fluorite lattice.

    Science.gov (United States)

    Kellici, Suela; Gong, Kenan; Lin, Tian; Brown, Sonal; Clark, Robin J H; Vickers, Martin; Cockcroft, Jeremy K; Middelkoop, Vesna; Barnes, Paul; Perkins, James M; Tighe, Christopher J; Darr, Jawwad A

    2010-09-28

    High-throughput continuous hydrothermal flow synthesis has been used as a rapid and efficient synthetic route to produce a range of crystalline nanopowders in the Ce-Zn oxide binary system. High-resolution powder X-ray diffraction data were obtained for both as-prepared and heat-treated (850 °C for 10 h in air) samples using the new robotic beamline I11, located at Diamond Light Source. The influence of the sample composition on the crystal structure and on the optical and physical properties was studied. All the nanomaterials were characterized using Raman spectroscopy, UV-visible spectrophotometry, Brunauer-Emmett-Teller surface area and elemental analysis (via energy-dispersive X-ray spectroscopy). Initially, for 'as-prepared' Ce(1-x)Zn(x)O(y), a phase-pure cerium oxide (fluorite) structure was obtained for nominal values of x = 0.1 and 0.2. Biphasic mixtures were obtained for nominal values of x in the range 0.3-0.9 (inclusive). High-resolution transmission electron microscopy images revealed that the phase-pure nano-CeO2 (x = 0) consisted of well-defined nanoparticles of ca. 3.7 nm. The nanomaterials produced herein generally had high surface areas (greater than 150 m² g⁻¹) and possessed combinations of particle properties (e.g. bandgap, crystallinity, size, etc.) that were unobtainable or difficult to achieve by other, more conventional synthetic methods.

  17. Integrated automation for continuous high-throughput synthetic chromosome assembly and transformation to identify improved yeast strains for industrial production of peptide sweetener brazzein

    Science.gov (United States)

    Production and recycling of recombinant sweetener peptides in industrial biorefineries involves the evaluation of large numbers of genes and proteins. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly synthesize, clone, and express heterologous gene ope...

  18. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods, and detection of the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.

  19. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  20. High Throughput Architecture for High Performance NoC

    OpenAIRE

    Ghany, Mohamed A. Abd El; El-Moursy, Magdy A.; Ismail, Mohammed

    2010-01-01

    In this chapter, a high-throughput NoC architecture is proposed to increase the throughput of the switch in an NoC. The proposed architecture can also improve the latency of the network. The proposed high-throughput interconnect architecture is applied to different NoC architectures. The architecture increases the throughput of the network by more than 38% while preserving the average latency. The area of the high-throughput NoC switch is decreased by 18% as compared to the area of the BFT switch. The...

  1. Conjugated Polymers Via Direct Arylation Polymerization in Continuous Flow: Minimizing the Cost and Batch-to-Batch Variations for High-Throughput Energy Conversion

    DEFF Research Database (Denmark)

    Gobalasingham, Nemal S.; Carlé, Jon Eggert; Krebs, Frederik C

    2017-01-01

    is comparable to the performance of PPDTBT polymerized through Stille cross coupling. These efforts demonstrate the distinct advantages of the continuous flow protocol with DArP avoiding use of toxic tin chemicals, reducing the associated costs of polymer upscaling, and minimizing batch-to-batch variations...... for high-quality material....

  2. Conjugated Polymers Via Direct Arylation Polymerization in Continuous Flow: Minimizing the Cost and Batch-to-Batch Variations for High-Throughput Energy Conversion.

    Science.gov (United States)

    Gobalasingham, Nemal S; Carlé, Jon E; Krebs, Frederik C; Thompson, Barry C; Bundgaard, Eva; Helgesen, Martin

    2017-11-01

    Continuous flow methods are utilized in conjunction with direct arylation polymerization (DArP) for the scaled synthesis of the roll-to-roll compatible polymer, poly[(2,5-bis(2-hexyldecyloxy)phenylene)-alt-(4,7-di(thiophen-2-yl)-benzo[c][1,2,5]thiadiazole)] (PPDTBT). PPDTBT is based on simple, inexpensive, and scalable monomers using thienyl-flanked benzothiadiazole as the acceptor, which is the first β-unprotected substrate to be used in continuous flow via DArP, enabling critical evaluation of the suitability of this emerging synthetic method for minimizing defects and for the scaled synthesis of high-performance materials. To demonstrate the usefulness of the method, DArP-prepared PPDTBT via continuous flow synthesis is employed for the preparation of indium tin oxide (ITO)-free and flexible roll-coated solar cells to achieve a power conversion efficiency of 3.5% for 1 cm² devices, which is comparable to the performance of PPDTBT polymerized through Stille cross coupling. These efforts demonstrate the distinct advantages of the continuous flow protocol with DArP avoiding use of toxic tin chemicals, reducing the associated costs of polymer upscaling, and minimizing batch-to-batch variations for high-quality material. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  4. Screening and synthesis: high throughput technologies applied to parasitology.

    Science.gov (United States)

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  5. Economic consequences of high throughput maskless lithography

    Science.gov (United States)

    Hartley, John G.; Govindaraju, Lakshmi

    2005-11-01

    Many people in the semiconductor industry bemoan the high cost of masks and view mask cost as one of the significant barriers to bringing new chip designs to market. All that is needed is a viable maskless technology and the problem will go away. Numerous sites around the world are working on maskless lithography but inevitably, the question asked is "Wouldn't a one-wafer-per-hour maskless tool make a really good mask writer?" Of course, the answer is yes; the hesitation you hear in the answer isn't based on technology concerns, it's financial. The industry needs maskless lithography because mask costs are too high. Mask costs are too high because mask pattern generators (PGs) are slow and expensive. If mask PGs become much faster, mask costs go down, the maskless market goes away and the PG supplier is faced with an even smaller tool demand from the mask shops. Technical success becomes financial suicide, or does it? In this paper we present the results of a model that examines some of the consequences of introducing high-throughput maskless pattern generation. Specific features in the model include tool throughput for masks and wafers, market segmentation by node for masks and wafers, and mask cost as an entry barrier to new chip designs. How does the availability of low-cost masks and maskless tools affect the industry's tool makeup, and what is the ultimate potential market for high-throughput maskless pattern generators?

  6. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing on the order of 10³-10⁴ discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  7. Validation of high throughput sequencing and microbial forensics applications

    OpenAIRE

    Budowle, Bruce; Connell, Nancy D.; Bielecka-Oder, Anna; Rita R Colwell; Corbett, Cindi R.; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A.; Murch, Randall S; Sajantila, Antti; Schemes, Sarah E; Ternus, Krista L; Turner, Stephen D

    2014-01-01

    Abstract High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results a...

  8. A high throughput spectral image microscopy system

    Science.gov (United States)

    Gesley, M.; Puri, R.

    2018-01-01

    A high-throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To overcome the rate limits of flow cytometry and the need for fluorophore tags, the system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require the use of exogenous stains to render cells with submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.

  9. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  10. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  11. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the
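    A minimal sketch of the concentration-response analysis implied by the 0.003-100 µM design: fitting a four-parameter Hill model to hormone levels measured across the dilution series. The response values and fitted parameters below are synthetic placeholders, not ToxCast data:

```python
# Fit a four-parameter Hill (log-logistic) model to a synthetic concentration-response
# series spanning the 0.003-100 uM test range described above. The "measured" values
# are placeholders; only the curve-fitting approach is illustrated.
import numpy as np
from scipy.optimize import curve_fit

def hill(logc, bottom, top, log_ac50, slope):
    """Decreasing 4-parameter logistic curve in log10(concentration)."""
    return bottom + (top - bottom) / (1.0 + 10 ** ((logc - log_ac50) * slope))

conc_uM = np.array([0.003, 0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30, 100])
# Fold change of one hormone relative to solvent control (synthetic example):
response = np.array([1.00, 0.98, 0.97, 0.95, 0.88, 0.72, 0.45, 0.25, 0.18, 0.15])

popt, _ = curve_fit(hill, np.log10(conc_uM), response,
                    p0=[0.1, 1.0, np.log10(1.0), 1.0])
bottom, top, log_ac50, slope = popt
print(f"AC50 ~ {10**log_ac50:.2f} uM, Hill slope ~ {slope:.2f}")
```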

  12. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue to increase the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, the Large Hadron Collider being a good example. The focal point of scientific data centres has shifted from coping efficiently with petabyte-scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centres is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...
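    A minimal sketch of the kind of dimensioning estimate described, checking aggregate storage bandwidth against the combined online and offline load. All rates and counts are hypothetical placeholders, not figures from the talk:

```python
# Back-of-the-envelope dimensioning of an HTC data centre: check that aggregate
# disk bandwidth covers online data acceptance plus offline reprocessing and
# tape migration. All figures are hypothetical placeholders.

ingest_GB_s = 10.0            # online data acceptance from the experiments (placeholder)
reprocess_GB_s = 25.0         # offline reads for reconstruction / user analysis (placeholder)
tape_migration_GB_s = 5.0     # writes migrated to tape for archival (placeholder)

disk_servers = 400            # number of disk servers (placeholder)
per_server_GB_s = 0.15        # sustainable throughput per disk server (placeholder)

required_GB_s = ingest_GB_s + reprocess_GB_s + tape_migration_GB_s
available_GB_s = disk_servers * per_server_GB_s

headroom = available_GB_s / required_GB_s
print(f"required {required_GB_s:.0f} GB/s, available {available_GB_s:.0f} GB/s, "
      f"headroom x{headroom:.2f}")
if headroom < 1.2:
    print("storage layer is a likely bottleneck: add servers or throttle batch I/O")
```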

  13. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques have been applied to many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so that the product does not have to have optical properties different from those of the substrate. UV absorption detection allows nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the light intensity change over time on an enzyme spot gives information about the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
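    A minimal sketch of the rate extraction described for the CCD-based technique: fitting the early, approximately linear part of a spot's intensity-versus-time trace to obtain an initial reaction rate. The time series is synthetic:

```python
# Estimate an initial reaction rate from the light-intensity time course of one
# immobilized-enzyme spot (synthetic data; only the slope analysis is illustrated).
import numpy as np

t_s = np.arange(0, 120, 10)                       # frame times, s (12 frames)
intensity = np.array([100, 112, 125, 136, 149, 160, 171, 180, 188, 195, 201, 206])
# intensity change on the spot tracks product formation at the chosen detection wavelength

# use only the early, approximately linear part of the trace for the initial rate
early = t_s <= 60
rate_au_per_s, offset = np.polyfit(t_s[early], intensity[early], 1)
print(f"initial rate ~ {rate_au_per_s:.2f} a.u./s for this spot")
```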

  14. High throughput assays for analyzing transcription factors.

    Science.gov (United States)

    Li, Xianqiang; Jiang, Xin; Yaoi, Takuro

    2006-06-01

    Transcription factors are a group of proteins that modulate the expression of genes involved in many biological processes, such as cell growth and differentiation. Alterations in transcription factor function are associated with many human diseases, and therefore these proteins are attractive potential drug targets. A key issue in the development of such therapeutics is the generation of effective tools that can be used for high throughput discovery of the critical transcription factors involved in human diseases, and the measurement of their activities in a variety of disease or compound-treated samples. Here, a number of innovative arrays and 96-well format assays for profiling and measuring the activities of transcription factors will be discussed.

  15. High-throughput hyperdimensional vertebrate phenotyping.

    Science.gov (United States)

    Pardo-Martin, Carlos; Allalou, Amin; Medina, Jaime; Eimon, Peter M; Wählby, Carolina; Fatih Yanik, Mehmet

    2013-01-01

    Most gene mutations and biologically active molecules cause complex responses in animals that cannot be predicted by cell culture models. Yet animal studies remain too slow and their analyses are often limited to only a few readouts. Here we demonstrate high-throughput optical projection tomography with micrometre resolution and hyperdimensional screening of entire vertebrates in tens of seconds using a simple fluidic system. Hundreds of independent morphological features and complex phenotypes are automatically captured in three dimensions with unprecedented speed and detail in semitransparent zebrafish larvae. By clustering quantitative phenotypic signatures, we can detect and classify even subtle alterations in many biological processes simultaneously. We term our approach hyperdimensional in vivo phenotyping. To illustrate the power of hyperdimensional in vivo phenotyping, we have analysed the effects of several classes of teratogens on cartilage formation using 200 independent morphological measurements, and identified similarities and differences that correlate well with their known mechanisms of actions in mammals.

  16. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  17. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...... equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing......, focusing on oft encountered problems in data processing, such as quality assurance, mapping, normalization, visualization, and interpretation. Presented in the second part are scientific endeavors representing solutions to problems of two sub-genres of next generation sequencing. For the first flavor, RNA-sequencing...

  18. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...... equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing......). For the second flavor, DNA-seq, a study presenting genome wide profiling of transcription factor CEBP/A in liver cells undergoing regeneration after partial hepatectomy (article IV) is included....

  19. High Throughput Spectroscopic Catalyst Screening via Surface Plasmon Spectroscopy

    Science.gov (United States)

    2015-07-15

    Report documentation extract: final report for AOARD Grant 144064 (contract FA2386-14-1-4064), "High Throughput Spectroscopic Catalyst Screening by Surface Plasmon Spectroscopy", covering 26 June 2014 to 25 March 2015; report dated July 15, 2015.

  20. High-throughput DNA sequencing: a genomic data manufacturing process.

    Science.gov (United States)

    Huang, G M

    1999-01-01

    The progress trends in automated DNA sequencing operation are reviewed. Technological development in sequencing instruments, enzymatic chemistry and robotic stations has resulted in ever-increasing capacity of sequence data production. This progress leads to a higher demand on laboratory information management and data quality assessment. High-throughput laboratories face the challenge of organizational management, as well as technology management. Engineering principles of process control should be adopted in this biological data manufacturing procedure. While various systems attempt to provide solutions to automate different parts of, or even the entire process, new technical advances will continue to change the paradigm and provide new challenges.

  1. High Throughput Direct Detection Doppler Lidar Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Lite Cycles, Inc. (LCI) proposes to develop a direct-detection Doppler lidar (D3L) technology called ELITE that improves the system optical throughput by more than...

  2. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e., reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...
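    A minimal sketch of the reverse-dosimetry (RTK) conversion described: scaling an in vitro bioactive concentration by the steady-state plasma concentration predicted for a unit oral dose. The Css value below is a placeholder; in practice it would come from an HTTK model:

```python
# HTTK-style reverse dosimetry (toy version): scale an in vitro AC50 by the
# steady-state plasma concentration predicted for a 1 mg/kg/day dose.
# The Css value below is a placeholder, not a modelled number.

def oral_equivalent_dose(ac50_uM: float, css_uM_per_mg_kg_day: float) -> float:
    """Dose (mg/kg/day) predicted to produce a plasma concentration equal to the AC50,
    assuming plasma concentration scales linearly with dose."""
    return ac50_uM / css_uM_per_mg_kg_day

ac50 = 3.0            # uM, bioactive concentration from an HTS assay (placeholder)
css_unit_dose = 1.5   # uM per 1 mg/kg/day, from a TK model (placeholder)
print(f"oral equivalent dose ~ {oral_equivalent_dose(ac50, css_unit_dose):.1f} mg/kg/day")
```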

  3. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification...

  4. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen) genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. The approach first selects sequences containing the SNPs present in all alleles of the locus of interest. It then calculates the number of base changes necessary to convert a candidate probe sequence to the closest subsequence within the set of sequences likely to be present in the sample, including the remainder of the human genome, in order to identify candidate probes that are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, preliminary steps such as PCR amplification may no longer be necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest-resolution typing can be achieved at minimal production cost. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides higher resolution than previously developed assays and includes more alleles than previously considered. Based on the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
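    A minimal sketch of the specificity test described in the second step: for each candidate probe, compute the minimum number of base changes separating it from any same-length subsequence of the background set, and keep only candidates above a mismatch threshold. Sequences and threshold are toy placeholders; a genome-scale implementation would use an indexed search rather than this brute-force scan:

```python
# Toy version of the "ultraspecific" probe filter: a candidate is kept only if every
# same-length window in the background sequences differs from it by at least
# MIN_MISMATCHES bases. Sequences and threshold are illustrative placeholders.

MIN_MISMATCHES = 3

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def min_mismatches_to_background(probe, background):
    """Smallest Hamming distance from the probe to any window of the background set."""
    best = len(probe)
    for seq in background:
        for i in range(len(seq) - len(probe) + 1):
            best = min(best, hamming(probe, seq[i:i + len(probe)]))
    return best

candidates = ["ACGTACGGTTCAGCTA", "TTGACCGTAGGCATCA"]          # allele-specific candidates
background = ["ACGTACGGTTCTGCTAGGACCTTGACCGTAGGCTTCAGGAC"]      # off-target sequences

for probe in candidates:
    d = min_mismatches_to_background(probe, background)
    verdict = "ultraspecific" if d >= MIN_MISMATCHES else "rejected"
    print(f"{probe}: min mismatches = {d} -> {verdict}")
```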

  5. High-throughput crystallography for structural genomics.

    Science.gov (United States)

    Joachimiak, Andrzej

    2009-10-01

    Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress and now more than 55,000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal, and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact.

  6. High-throughput Crystallography for Structural Genomics

    Science.gov (United States)

    Joachimiak, Andrzej

    2009-01-01

    Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress and now over 53,000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact. PMID:19765976

  7. Plant chip for high-throughput phenotyping of Arabidopsis.

    Science.gov (United States)

    Jiang, Huawei; Xu, Zhen; Aluru, Maneesha R; Dong, Liang

    2014-04-07

    We report on the development of a vertical and transparent microfluidic chip for high-throughput phenotyping of Arabidopsis thaliana plants. Multiple Arabidopsis seeds can be germinated and grown hydroponically over more than two weeks in the chip, thus enabling large-scale and quantitative monitoring of plant phenotypes. The novel vertical arrangement of this microfluidic device not only allows for normal gravitropic growth of the plants but also, more importantly, makes it convenient to continuously monitor phenotypic changes in plants at the whole organismal level, including seed germination and root and shoot growth (hypocotyls, cotyledons, and leaves), as well as at the cellular level. We also developed a hydrodynamic trapping method to automatically place single seeds into seed holding sites of the device and to avoid potential damage to seeds that might occur during manual loading. We demonstrated general utility of this microfluidic device by showing clear visible phenotypes of the immutans mutant of Arabidopsis, and we also showed changes occurring during plant-pathogen interactions at different developmental stages. Arabidopsis plants grown in the device maintained normal morphological and physiological behaviour, and distinct phenotypic variations consistent with a priori data were observed via high-resolution images taken in real time. Moreover, the timeline for different developmental stages for plants grown in this device was highly comparable to growth using a conventional agar plate method. This prototype plant chip technology is expected to lead to the establishment of a powerful experimental and cost-effective framework for high-throughput and precise plant phenotyping.

  8. High-throughput cultivation and screening platform for unicellular phototrophs.

    Science.gov (United States)

    Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus

    2014-09-16

    High-throughput cultivation and screening methods allow a parallel, miniaturized and cost-efficient processing of many samples. These methods, however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. The cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration, and MALDI-TOF-MS, as well as a novel vitality measurement protocol, has already been established and can be monitored during cultivation. Measurements of growth parameters can be used as inputs for the system to allow periodic automatic dilutions and therefore a semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid- and long-term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows for high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and should be adaptable to even more. A variety of analyses are already established, and the platform is easily expandable both in quality, i.e. with further parameters to screen for additional targets, and in quantity, i.e. the size or number of processed samples.
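
    A hypothetical sketch of the periodic-dilution step described in this record: when a culture's measured optical density exceeds an upper set point, part of the well volume is exchanged with fresh medium to return it to a target density. The set points and the assumption that optical density scales linearly with cell density are illustrative; only the 2.2 ml working volume comes from the record.

```python
# Hypothetical sketch of semi-continuous cultivation logic: if a culture's
# optical density exceeds an upper set point, compute how much culture to
# replace with fresh medium to bring it back to a target OD.

WELL_VOLUME_ML = 2.2   # deepwell working volume from the record
OD_UPPER = 0.8         # illustrative upper set point
OD_TARGET = 0.2        # illustrative post-dilution target

def dilution_volume_ml(od_measured: float,
                       od_target: float = OD_TARGET,
                       well_volume_ml: float = WELL_VOLUME_ML) -> float:
    """Volume of culture to withdraw and replace with fresh medium so that
    the remaining cells, topped up to the full well volume, sit at od_target."""
    if od_measured <= od_target:
        return 0.0
    keep_fraction = od_target / od_measured        # fraction of cells to keep
    return well_volume_ml * (1.0 - keep_fraction)  # volume to exchange

def maybe_dilute(od_measured: float) -> float:
    """Return the exchange volume only when the upper set point is exceeded."""
    return dilution_volume_ml(od_measured) if od_measured > OD_UPPER else 0.0

if __name__ == "__main__":
    for od in (0.15, 0.6, 1.0):
        print(od, "->", round(maybe_dilute(od), 2), "mL to exchange")
```

    In a robotic setup the returned exchange volume would simply be passed to the pipetting worklist for that well.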

  9. High throughput instruments, methods, and informatics for systems biology.

    Energy Technology Data Exchange (ETDEWEB)

    Sinclair, Michael B.; Cowie, Jim R. (New Mexico State University, Las Cruces, NM); Van Benthem, Mark Hilary; Wylie, Brian Neil; Davidson, George S.; Haaland, David Michael; Timlin, Jerilyn Ann; Aragon, Anthony D. (University of New Mexico, Albuquerque, NM); Keenan, Michael Robert; Boyack, Kevin W.; Thomas, Edward Victor; Werner-Washburne, Margaret C. (University of New Mexico, Albuquerque, NM); Mosquera-Caro, Monica P. (University of New Mexico, Albuquerque, NM); Martinez, M. Juanita (University of New Mexico, Albuquerque, NM); Martin, Shawn Bryan; Willman, Cheryl L. (University of New Mexico, Albuquerque, NM)

    2003-12-01

    High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes, or, for instance, in responses to therapies. Microarrays represent one such high throughput method and continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data to dramatically improve the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.

  10. High-throughput screening: update on practices and success.

    Science.gov (United States)

    Fox, Sandra; Farr-Jones, Shauna; Sopchak, Lynne; Boggs, Amy; Nicely, Helen Wang; Khoury, Richard; Biros, Michael

    2006-10-01

    High-throughput screening (HTS) has become an important part of drug discovery at most pharmaceutical and many biotechnology companies worldwide, and use of HTS technologies is expanding into new areas. Target validation, assay development, secondary screening, ADME/Tox, and lead optimization are among the areas in which there is an increasing use of HTS technologies. It is becoming fully integrated within drug discovery, both upstream and downstream, which includes increasing use of cell-based assays and high-content screening (HCS) technologies to achieve more physiologically relevant results and to find higher quality leads. In addition, HTS laboratories are continually evaluating new technologies as they struggle to increase their success rate for finding drug candidates. The material in this article is based on a 900-page HTS industry report involving 54 HTS directors representing 58 HTS laboratories and 34 suppliers.

  11. Resolution- and throughput-enhanced spectroscopy using a high-throughput computational slit

    Science.gov (United States)

    Kazemzadeh, Farnoud; Wong, Alexander

    2016-09-01

    There exists a fundamental tradeoff between spectral resolution and efficiency, or throughput, for all optical spectrometers. The primary factors affecting the spectral resolution and throughput of an optical spectrometer are the size of the entrance aperture and the optical power of the focusing element. Thus far, collective optimization of these two factors has proven difficult. Here, we introduce the concept of high-throughput computational slits (HTCS), a numerical technique for improving both the effective spectral resolution and the efficiency of a spectrometer. The proposed HTCS approach was experimentally validated using an optical spectrometer configured with a 200 um entrance aperture (test) and a 50 um entrance aperture (control), demonstrating an improvement in spectral resolution of ~50% over the control and an improvement in efficiency of more than 2 times over the largest entrance aperture used in the study, while producing highly accurate spectra.

  12. High-Throughput Approaches to Pinpoint Function within the Noncoding Genome.

    Science.gov (United States)

    Montalbano, Antonino; Canver, Matthew C; Sanjana, Neville E

    2017-10-05

    The clustered regularly interspaced short palindromic repeats (CRISPR)-Cas nuclease system is a powerful tool for genome editing, and its simple programmability has enabled high-throughput genetic and epigenetic studies. These high-throughput approaches offer investigators a toolkit for functional interrogation of not only protein-coding genes but also noncoding DNA. Historically, noncoding DNA has lacked the detailed characterization that has been applied to protein-coding genes in large part because there has not been a robust set of methodologies for perturbing these regions. Although the majority of high-throughput CRISPR screens have focused on the coding genome to date, an increasing number of CRISPR screens targeting noncoding genomic regions continue to emerge. Here, we review high-throughput CRISPR-based approaches to uncover and understand functional elements within the noncoding genome and discuss practical aspects of noncoding library design and screen analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Scanning droplet cell for high throughput electrochemical and photoelectrochemical measurements

    Science.gov (United States)

    Gregoire, John M.; Xiang, Chengxiang; Liu, Xiaonao; Marcin, Martin; Jin, Jian

    2013-02-01

    High throughput electrochemical techniques are widely applied in material discovery and optimization. For many applications, the most desirable electrochemical characterization requires a three-electrode cell under potentiostat control. In high throughput screening, a material library is explored by either employing an array of such cells, or rastering a single cell over the library. To attain this latter capability with unprecedented throughput, we have developed a highly integrated, compact scanning droplet cell that is optimized for rapid electrochemical and photoelectrochemical measurements. Using this cell, we screened a quaternary oxide library as (photo)electrocatalysts for the oxygen evolution (water splitting) reaction. High quality electrochemical measurements were carried out and key electrocatalytic properties were identified for each of 5456 samples with a throughput of 4 s per sample.

  14. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek is developing a high throughput nominal 100-W Hall Effect Thruster. This device is well sized for spacecraft ranging in size from several tens of kilograms to...

  15. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    Science.gov (United States)

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  16. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek Co. Inc. proposes to develop a high throughput, nominal 100 W Hall Effect Thruster (HET). This HET will be sized for small spacecraft (< 180 kg), including...

  17. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  18. Materiomics - High-Throughput Screening of Biomaterial Properties

    NARCIS (Netherlands)

    de Boer, Jan; van Blitterswijk, Clemens

    2013-01-01

    This complete, yet concise, guide introduces you to the rapidly developing field of high throughput screening of biomaterials: materiomics. Bringing together the key concepts and methodologies used to determine biomaterial properties, you will understand the adaptation and application of materiomics

  19. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  20. Applications of High Throughput Sequencing for Immunology and Clinical Diagnostics

    OpenAIRE

    Kim, Hyunsung John

    2014-01-01

    High throughput sequencing methods have fundamentally shifted the manner in which biological experiments are performed. In this dissertation, conventional and novel high throughput sequencing and bioinformatics methods are applied to immunology and diagnostics. In order to study rare subsets of cells, an RNA sequencing method was first optimized for use with minimal levels of RNA and cellular input. The optimized RNA sequencing method was then applied to study the transcriptional differences ...

  1. High-throughput optical coherence tomography at 800 nm.

    Science.gov (United States)

    Goda, Keisuke; Fard, Ali; Malik, Omer; Fu, Gilbert; Quach, Alan; Jalali, Bahram

    2012-08-27

    We report high-throughput optical coherence tomography (OCT) that offers 1,000 times higher axial scan rate than conventional OCT in the 800 nm spectral range. This is made possible by employing photonic time-stretch for chirping a pulse train and transforming it into a passive swept source. We demonstrate a record high axial scan rate of 90.9 MHz. To show the utility of our method, we also demonstrate real-time observation of laser ablation dynamics. Our high-throughput OCT is expected to be useful for industrial applications where the speed of conventional OCT falls short.

  2. A novel high throughput method to investigate polymer dissolution.

    Science.gov (United States)

    Zhang, Ying; Mallapragada, Surya K; Narasimhan, Balaji

    2010-02-16

    The dissolution behavior of polystyrene (PS) in biodiesel was studied by developing a novel high throughput approach based on Fourier-transform infrared (FTIR) microscopy. A multiwell device for high throughput dissolution testing was fabricated using a photolithographic rapid prototyping method. The dissolution of PS films in each well was tracked by following the characteristic IR band of PS and the effect of PS molecular weight and temperature on the dissolution rate was simultaneously investigated. The results were validated with conventional gravimetric methods. The high throughput method can be extended to evaluate the dissolution profiles of a large number of samples, or to simultaneously investigate the effect of variables such as polydispersity, crystallinity, and mixed solvents. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. High-Throughput Printing Process for Flexible Electronics

    Science.gov (United States)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  4. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.|info:eu-repo/dai/nl/074334603; Folkers, G.E.|info:eu-repo/dai/nl/162277202

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  5. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S r...

  6. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Full Text Available Development of next-generation batteries requires a breakthrough in materials. The traditional one-by-one method, which is suited to synthesizing a large number of single-composition materials, is time-consuming and costly. High-throughput and combinatorial experimentation is an effective way to synthesize and characterize a huge number of materials over a broader compositional region in a short time, which greatly speeds up the discovery and optimization of materials at lower cost. In this work, high-throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  7. High throughput calorimetry for evaluating enzymatic reactions generating phosphate.

    Science.gov (United States)

    Hoflack, Lieve; De Groeve, Manu; Desmet, Tom; Van Gerwen, Peter; Soetaert, Wim

    2010-05-01

    A calorimetric assay is described for the high-throughput screening of enzymes that produce inorganic phosphate. In the current example, cellobiose phosphorylase (EC 2.4.1.20) is tested for its ability to synthesise rare disaccharides. The generated phosphate is measured in a high-throughput calorimeter by coupling the reaction to pyruvate oxidase and catalase. This procedure allows for the simultaneous analysis of 48 reactions in microtiter plate format and has been validated by comparison with a colorimetric phosphate assay. The proposed assay has a coefficient of variation of 3.14% and is useful for screening enzyme libraries for enhanced activity and substrate libraries for enzyme promiscuity.

  8. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates a detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need for magnetic assistance or mixing structures. The concentration of our target molecules was estimated from agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed at flow rates of 150 µl/min and occurred in under a minute, with the potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.

  9. High-throughput microfluidic line scan imaging for cytological characterization

    Science.gov (United States)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult due to motion blur and data loss during frame readout, which cause discontinuities in data acquisition as cells move at relatively high speeds through the chamber. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidics chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X oil immersion, 1.4 NA objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidics chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell, which may be subsequently mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable to cell counting by flow cytometry, provided efficient examination, including counting, classification, and differentiation, of saliva, blood, and cultured human cancer cells.
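
    A back-of-the-envelope sketch of the line-rate/velocity matching mentioned in this record: to avoid stretching or compressing the image, each line period should correspond to one pixel of object-plane travel. The 40X magnification and 40 kHz maximum line rate come from the record; the 5 um sensor pixel pitch and the example flow velocity are illustrative assumptions.

```python
# Sketch of matching a line-scan camera's line rate to the fluid velocity so
# that one line period equals one object-plane pixel of travel.

MAGNIFICATION = 40.0          # objective magnification from the record
PIXEL_PITCH_UM = 5.0          # assumed sensor pixel pitch (illustrative)
MAX_LINE_RATE_HZ = 40_000.0   # maximum line rate quoted in the record

def required_line_rate_hz(flow_velocity_um_s: float,
                          magnification: float = MAGNIFICATION,
                          pixel_pitch_um: float = PIXEL_PITCH_UM) -> float:
    """Line rate at which one line period equals one object-plane pixel of travel."""
    object_plane_pixel_um = pixel_pitch_um / magnification
    return flow_velocity_um_s / object_plane_pixel_um

def max_matched_velocity_um_s(line_rate_hz: float = MAX_LINE_RATE_HZ) -> float:
    """Fastest flow that can be imaged without stretching or compressing cells."""
    return line_rate_hz * (PIXEL_PITCH_UM / MAGNIFICATION)

if __name__ == "__main__":
    print(required_line_rate_hz(1_000.0))   # ~8 kHz needed for 1 mm/s flow
    print(max_matched_velocity_um_s())      # ~5,000 um/s (5 mm/s) at 40 kHz
```

    Under these assumptions, the quoted 40 kHz line rate would support matched imaging at flow speeds up to roughly 5 mm/s; faster flows would require a coarser effective pixel or a higher line rate.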

  10. Validation of high throughput sequencing and microbial forensics applications.

    Science.gov (United States)

    Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of the specific application, and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.

  11. High throughput defect detection with multiple parallel electron beams

    NARCIS (Netherlands)

    Himbergen, H.M.P. van; Nijkerk, M.D.; Jager, P.W.H. de; Hosman, T.C.; Kruit, P.

    2007-01-01

    A new concept for high throughput defect detection with multiple parallel electron beams is described. As many as 30 000 beams can be placed on a footprint of 1 in.2, each beam having its own microcolumn and detection system without cross-talk. Based on the International Technology Roadmap for

  12. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured at efficiencies of 91% and 54% at flow rates of 1 mL/min and 4 mL/min, respectively. Integration of s...

  13. High-Throughput Toxicity Testing: New Strategies for ...

    Science.gov (United States)

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it

  14. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  15. High-throughput screening, predictive modeling and computational embryology - Abstract

    Science.gov (United States)

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  16. High-throughput screening, predictive modeling and computational embryology

    Science.gov (United States)

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  17. High-throughput sequencing in mitochondrial DNA research.

    Science.gov (United States)

    Ye, Fei; Samuels, David C; Clark, Travis; Guo, Yan

    2014-07-01

    Next-generation sequencing, also known as high-throughput sequencing, has greatly enhanced researchers' ability to conduct biomedical research on all levels. Mitochondrial research has also benefitted greatly from high-throughput sequencing; sequencing technology now allows for screening of all 16,569 base pairs of the mitochondrial genome simultaneously for SNPs and low level heteroplasmy and, in some cases, the estimation of mitochondrial DNA copy number. It is important to realize the full potential of high-throughput sequencing for the advancement of mitochondrial research. To this end, we review how high-throughput sequencing has impacted mitochondrial research in the categories of SNPs, low level heteroplasmy, copy number, and structural variants. We also discuss the different types of mitochondrial DNA sequencing and their pros and cons. Based on previous studies conducted by various groups, we provide strategies for processing mitochondrial DNA sequencing data, including assembly, variant calling, and quality control. Copyright © 2014 Elsevier B.V. and Mitochondria Research Society. All rights reserved.

  18. Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays

    Science.gov (United States)

    High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...

  19. Chemometric Optimization Studies in Catalysis Employing High-Throughput Experimentation

    NARCIS (Netherlands)

    Pereira, S.R.M.

    2008-01-01

    The main topic of this thesis is the investigation of the synergies between High-Throughput Experimentation (HTE) and Chemometric Optimization methodologies in Catalysis research and of the use of such methodologies to maximize the advantages of using HTE methods. Several case studies were analysed

  20. Savant: genome browser for high-throughput sequencing data.

    Science.gov (United States)

    Fiume, Marc; Williams, Vanessa; Brook, Andrew; Brudno, Michael

    2010-08-15

    The advent of high-throughput sequencing (HTS) technologies has made it affordable to sequence many individuals' genomes. Simultaneously the computational analysis of the large volumes of data generated by the new sequencing machines remains a challenge. While a plethora of tools are available to map the resulting reads to a reference genome, and to conduct primary analysis of the mappings, it is often necessary to visually examine the results and underlying data to confirm predictions and understand the functional effects, especially in the context of other datasets. We introduce Savant, the Sequence Annotation, Visualization and ANalysis Tool, a desktop visualization and analysis browser for genomic data. Savant was developed for visualizing and analyzing HTS data, with special care taken to enable dynamic visualization in the presence of gigabases of genomic reads and references the size of the human genome. Savant supports the visualization of genome-based sequence, point, interval and continuous datasets, and multiple visualization modes that enable easy identification of genomic variants (including single nucleotide polymorphisms, structural and copy number variants), and functional genomic information (e.g. peaks in ChIP-seq data) in the context of genomic annotations. Savant is freely available at http://compbio.cs.toronto.edu/savant.

  1. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances over previous high-throughput methods for screening biomass recalcitrance have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have reduced plate-to-plate variation in control biomass recalcitrance and improved confidence in sugar release differences between samples. With smaller errors, plant researchers can have a higher degree of assurance that more low-recalcitrance candidates can be identified. Significant changes to the plate reactor, control biomass preparation, pretreatment conditions and enzyme have significantly reduced sample and control replicate variability. Reactor plate metallurgy significantly impacts sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to specific biomass types, i.e. woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers also improved conversion levels, likely due to water-activity impacts on enzyme structure and substrate interactions; it was not attempted here because of the need to continually desalt and validate precise enzyme concentration and activity.

  2. High-Throughput Microfluidics for the Screening of Yeast Libraries.

    Science.gov (United States)

    Huang, Mingtao; Joensson, Haakan N; Nielsen, Jens

    2018-01-01

    Cell factory development is critically important for efficient biological production of chemicals, biofuels, and pharmaceuticals. Many rounds of the Design-Build-Test-Learn cycle may be required before an engineered strain meets the specific metrics required for industrial application. The bioindustry prefers products in secreted form (secreted products or extracellular metabolites) as this can lower the cost of downstream processing, reduce the metabolic burden on cell hosts, and allow necessary modification of the final products, such as biopharmaceuticals. Yet products in secreted form result in the disconnection of phenotype from genotype, which limits throughput in the Test step for identification of desired variants from large libraries of mutant strains. In droplet microfluidic screening, single cells are encapsulated in individual droplets, enabling high-throughput processing and sorting of single cells or clones. Encapsulation in droplets allows this technology to overcome the throughput limitations present in traditional methods for screening by extracellular phenotypes. In this chapter, we describe a protocol/guideline for high-throughput droplet microfluidic screening of yeast libraries for higher protein secretion. This protocol can be adapted to screening for a range of other extracellular products from yeast or other hosts.

  3. A high throughput droplet based electroporation system

    Science.gov (United States)

    Yoo, Byeongsun; Ahn, Myungmo; Im, Dojin; Kang, Inseok

    2014-11-01

    Delivery of exogenous genetic materials across the cell membrane is a powerful and popular research tool for bioengineering. Among conventional non-viral DNA delivery methods, electroporation (EP) is one of the most widely used technologies and is a standard lab procedure in molecular biology. We developed a novel digital microfluidic electroporation system which has higher transgene expression efficiency and better cell viability than conventional EP techniques. We present the successful performance of the digital EP system for transformation of various cell lines by investigating the effects of EP conditions such as electric pulse voltage, number, and duration on cell viability and transfection efficiency, in comparison with a conventional bulk EP system. Through numerical analysis, we have also calculated the electric field distribution around the cells precisely to verify the effect of the electric field on the high efficiency of the digital EP system. Furthermore, parallelization of the EP processes has been developed to increase transformation productivity. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (Grant Number: 2013R1A1A2011956).

  4. High-throughput theoretical design of lithium battery materials

    Science.gov (United States)

    Shi-Gang, Ling; Jian, Gao; Rui-Juan, Xiao; Li-Quan, Chen

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next generation lithium batteries are of great significance to achieve performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. Project supported by the National Natural Science Foundation of China (Grant Nos. 11234013 and 51172274) and the National High Technology Research and Development Program of China (Grant No. 2015AA034201).

  5. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  6. Upscaling and automation of electrophysiology: toward high throughput screening in ion channel drug discovery

    DEFF Research Database (Denmark)

    Asmild, Margit; Oswald, Nicholas; Krzywkowski, Karen M

    2003-01-01

    Effective screening of large compound libraries in ion channel drug discovery requires the development of new electrophysiological techniques with substantially increased throughputs compared to the conventional patch clamp technique. Sophion Bioscience is aiming to meet this challenge by developing two lines of automated patch clamp products, a traditional pipette-based system called Apatchi-1, and a silicon chip-based system QPatch. The degree of automation spans from semi-automation (Apatchi-1), where a trained technician interacts with the system in a limited way, to complete automation (QPatch 96), where the system works continuously and unattended until screening of a full compound library is completed. The performance of the systems ranges from medium to high throughputs.

  7. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  8. High-throughput sequence alignment using Graphics Processing Units.

    Science.gov (United States)

    Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh

    2007-12-10

    The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
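
    A conceptual sketch (not MUMmerGPU itself) of the data-parallel pattern both records describe: many independent query reads are matched against one shared, read-only reference index. The GPU/CUDA machinery and the suffix tree are replaced here by a simple k-mer dictionary and a CPU process pool, and all names and sequences are illustrative.

```python
# Conceptual sketch of "many queries, one shared reference" parallelism.
from multiprocessing import Pool
from collections import defaultdict

def build_index(reference: str, k: int) -> dict:
    """Map every k-mer in the reference to its start positions
    (a stand-in for the suffix tree used by MUMmerGPU)."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return dict(index)

REFERENCE = "ACGTACGTGGCCTTAACGGT"
K = 8
INDEX = build_index(REFERENCE, K)  # built once, then only read

def map_read(read: str) -> tuple:
    """Exact-match one read; reads are independent, so they parallelize trivially."""
    return read, INDEX.get(read[:K], [])

if __name__ == "__main__":
    reads = ["ACGTACGT", "GGCCTTAA", "TTTTTTTT"]
    with Pool() as pool:                      # queries processed in parallel
        for read, hits in pool.map(map_read, reads):
            print(read, "->", hits)
```

    The point of the pattern is that the reference index is built once and only read afterwards, so each query can be scored independently, which is what lets the problem map onto thousands of GPU threads.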

  9. Predictions versus high-throughput experiments in T-cell epitope discovery: competition or synergy?

    DEFF Research Database (Denmark)

    Lundegaard, Claus; Lund, Ole; Nielsen, Morten

    2012-01-01

    Prediction methods as well as experimental methods for T-cell epitope discovery have developed significantly in recent years. High-throughput experimental methods have made it possible to perform full-length protein scans for epitopes restricted to a limited number of MHC alleles. The high costs...... discovery. We expect prediction methods as well as experimental validation methods to continue to develop and that we will soon see clinical trials of products whose development has been guided by prediction methods....

  10. Human transcriptome array for high-throughput clinical studies

    Science.gov (United States)

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N.; Schweitzer, Anthony C.; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A.; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D.; Moldawer, Lyle L.; Maier, Ronald V.; Tompkins, Ronald G.; Wong, Wing Hung; Davis, Ronald W.; Xiao, Wenzhong; Toner, Mehmet; Warren, H. Shaw; Schoenfeld, David A.; Rahme, Laurence; McDonald-Smith, Grace P.; Hayden, Douglas; Mason, Philip; Fagan, Shawn; Yu, Yong-Ming; Cobb, J. Perren; Remick, Daniel G.; Mannick, John A.; Lederer, James A.; Gamelli, Richard L.; Silver, Geoffrey M.; West, Michael A.; Shapiro, Michael B.; Smith, Richard; Camp, David G.; Qian, Weijun; Tibshirani, Rob; Lowry, Stephen; Calvano, Steven; Chaudry, Irshad; Cohen, Mitchell; Moore, Ernest E.; Johnson, Jeffrey; Baker, Henry V.; Efron, Philip A.; Balis, Ulysses G. J.; Billiar, Timothy R.; Ochoa, Juan B.; Sperry, Jason L.; Miller-Graziano, Carol L.; De, Asit K.; Bankey, Paul E.; Herndon, David N.; Finnerty, Celeste C.; Jeschke, Marc G.; Minei, Joseph P.; Arnoldo, Brett D.; Hunt, John L.; Horton, Jureta; Cobb, J. Perren; Brownstein, Bernard; Freeman, Bradley; Nathens, Avery B.; Cuschieri, Joseph; Gibran, Nicole; Klein, Matthew; O'Keefe, Grant

    2011-01-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq to the sufficient depth for the discovery of transcriptome elements relevant to the disease process followed by high-throughput and reliable screening of these elements on thousands of patient samples using custom-designed arrays. PMID:21317363

  11. Computational analysis of high-throughput flow cytometry data.

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2012-08-01

    Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells and appropriate analytical tools are readily available to handle this traditional role of the technology. Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires both advances in hardware and analytical tools. The historical perspective of flow cytometry operation, as well as how the field has changed and what the key changes have been, is discussed. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible.

  12. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires both advances in hardware and analytical tools. The historical perspective of flow cytometry operation, as well as how the field has changed and what the key changes have been, is discussed. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  13. High throughput screening of starch structures using carbohydrate microarrays.

    Science.gov (United States)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Motawia, Mohammed Saddik; Shaik, Shahnoor Sultana; Mikkelsen, Maria Dalgaard; Krunic, Susanne Langgaard; Fangel, Jonatan Ulrik; Willats, William George Tycho; Blennow, Andreas

    2016-07-29

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content and branching degrees could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Typically, extreme branching or linearity were detected less than normal starch structures. The method offers the potential for rapidly analysing resistant and slowly digested dietary starches.

  14. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  15. Trade-Off Analysis in High-Throughput Materials Exploration.

    Science.gov (United States)

    Volety, Kalpana K; Huyberechts, Guido P J

    2017-03-13

    This Research Article presents a strategy to identify the optimum compositions in metal alloys with certain desired properties in a high-throughput screening environment, using a multiobjective optimization approach. In addition to the identification of the optimum compositions in a primary screening, the strategy also allows pointing to regions in the compositional space where further exploration in a secondary screening could be carried out. The strategy for the primary screening is a combination of two multiobjective optimization approaches namely Pareto optimality and desirability functions. The experimental data used in the present study have been collected from over 200 different compositions belonging to four different alloy systems. The metal alloys (comprising Fe, Ti, Al, Nb, Hf, Zr) are synthesized and screened using high-throughput technologies. The advantages of such a kind of approach compared to the limitations of the traditional and comparatively simpler approaches like ranking and calculating figures of merit are discussed.
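
    A minimal sketch of the two multiobjective ingredients named in this record, Pareto optimality and desirability functions, applied to a toy screening table; the property names, specification limits, and example values are illustrative and not taken from the article.

```python
# Illustrative sketch: Pareto filtering followed by desirability-based ranking.

def dominates(a: dict, b: dict, objectives: list) -> bool:
    """True if `a` is at least as good as `b` on every objective (all treated
    as maximize) and strictly better on at least one."""
    return (all(a[o] >= b[o] for o in objectives)
            and any(a[o] > b[o] for o in objectives))

def pareto_front(samples: list, objectives: list) -> list:
    """Samples not dominated by any other sample."""
    return [s for s in samples
            if not any(dominates(t, s, objectives) for t in samples if t is not s)]

def desirability(value: float, low: float, high: float) -> float:
    """Linear one-sided desirability: 0 below `low`, 1 above `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def overall_desirability(sample: dict, specs: dict) -> float:
    """Geometric mean of individual desirabilities (standard Derringer form)."""
    ds = [desirability(sample[prop], lo, hi) for prop, (lo, hi) in specs.items()]
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

if __name__ == "__main__":
    library = [
        {"id": "A1", "strength": 900, "ductility": 0.10},
        {"id": "A2", "strength": 850, "ductility": 0.18},
        {"id": "A3", "strength": 700, "ductility": 0.08},   # dominated by A1
    ]
    front = pareto_front(library, ["strength", "ductility"])
    specs = {"strength": (600, 1000), "ductility": (0.05, 0.20)}
    ranked = sorted(front, key=lambda s: overall_desirability(s, specs), reverse=True)
    for s in ranked:
        print(s["id"], round(overall_desirability(s, specs), 2))
```

    The Pareto step removes compositions that are beaten on every property, and the geometric-mean desirability then ranks the survivors, which mirrors how a region for secondary screening could be chosen around the top-ranked compositions.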

  16. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content and branching degrees could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Typically, extreme branching or linearity were detected less than normal starch structures. The method offers...

  17. High-throughput screening for modulators of cellular contractile force

    CERN Document Server

    Park, Chan Young; Tambe, Dhananjay; Chen, Bohao; Lavoie, Tera; Dowell, Maria; Simeonov, Anton; Maloney, David J; Marinkovic, Aleksandar; Tschumperlin, Daniel J; Burger, Stephanie; Frykenberg, Matthew; Butler, James P; Stamer, W Daniel; Johnson, Mark; Solway, Julian; Fredberg, Jeffrey J; Krishnan, Ramaswamy

    2014-01-01

    When cellular contractile forces are central to pathophysiology, these forces comprise a logical target of therapy. Nevertheless, existing high-throughput screens are limited to upstream signaling intermediates with poorly defined relationship to such a physiological endpoint. Using cellular force as the target, here we screened libraries to identify novel drug candidates in the case of human airway smooth muscle cells in the context of asthma, and also in the case of Schlemm's canal endothelial cells in the context of glaucoma. This approach identified several drug candidates for both asthma and glaucoma. We attained rates of 1000 compounds per screening day, thus establishing a force-based cellular platform for high-throughput drug discovery.

  18. High-throughput optical screening of cellular mechanotransduction

    OpenAIRE

    Compton, JL; Luo, JC; Ma, H.; Botvinick, E; Venugopalan, V

    2014-01-01

    We introduce an optical platform for rapid, high-throughput screening of exogenous molecules that affect cellular mechanotransduction. Our method initiates mechanotransduction in adherent cells using single laser-microbeam generated microcavitation bubbles without requiring flow chambers or microfluidics. These microcavitation bubbles expose adherent cells to a microtsunami, a transient microscale burst of hydrodynamic shear stress, which stimulates cells over areas approaching 1 mm2. We demo...

  19. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  20. High-throughput evaluation of synthetic metabolic pathways.

    Science.gov (United States)

    Klesmith, Justin R; Whitehead, Timothy A

    2016-03-01

    A central challenge in the field of metabolic engineering is the efficient identification of a metabolic pathway genotype that maximizes specific productivity over a robust range of process conditions. Here we review current methods for optimizing specific productivity of metabolic pathways in living cells. New tools for library generation, computational analysis of pathway sequence-flux space, and high-throughput screening and selection techniques are discussed.

  1. The high-throughput highway to computational materials design.

    Science.gov (United States)

    Curtarolo, Stefano; Hart, Gus L W; Nardelli, Marco Buongiorno; Mingo, Natalio; Sanvito, Stefano; Levy, Ohad

    2013-03-01

    High-throughput computational materials design is an emerging area of materials science. By combining advanced thermodynamic and electronic-structure methods with intelligent data mining and database construction, and exploiting the power of current supercomputer architectures, scientists generate, manage and analyse enormous data repositories for the discovery of novel materials. In this Review we provide a current snapshot of this rapidly evolving field, and highlight the challenges and opportunities that lie ahead.

  2. Web-based visual analysis for high-throughput genomics.

    Science.gov (United States)

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics.

  3. Graph-based signal integration for high-throughput phenotyping.

    Science.gov (United States)

    Herskovic, Jorge R; Subramanian, Devika; Cohen, Trevor; Bozzo-Silva, Pamela A; Bearden, Charles F; Bernstam, Elmer V

    2012-01-01

    Electronic Health Records aggregated in Clinical Data Warehouses (CDWs) promise to revolutionize Comparative Effectiveness Research and suggest new avenues of research. However, the effectiveness of CDWs is diminished by the lack of properly labeled data. We present a novel approach that integrates knowledge from the CDW, the biomedical literature, and the Unified Medical Language System (UMLS) to perform high-throughput phenotyping. In this paper, we automatically construct a graphical knowledge model and then use it to phenotype breast cancer patients. We compare the performance of this approach to using MetaMap when labeling records. MetaMap's overall accuracy at identifying breast cancer patients was 51.1% (n=428); recall=85.4%, precision=26.2%, and F1=40.1%. Our unsupervised graph-based high-throughput phenotyping had accuracy of 84.1%; recall=46.3%, precision=61.2%, and F1=52.8%. We conclude that our approach is a promising alternative for unsupervised high-throughput phenotyping.
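
    The F1 scores quoted above are simply the harmonic mean of the reported precision and recall; a short Python check (added here for illustration, not part of the study) reproduces both numbers:

        # F1 is the harmonic mean of precision and recall.
        def f1(precision, recall):
            return 2 * precision * recall / (precision + recall)

        print(round(f1(0.262, 0.854), 3))  # MetaMap: 0.401 -> 40.1%
        print(round(f1(0.612, 0.463), 3))  # graph-based phenotyping: 0.527, i.e. the reported 52.8% up to rounding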

  4. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

    Full Text Available Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.

  5. High-throughput computational and experimental techniques in structural genomics.

    Science.gov (United States)

    Chance, Mark R; Fiser, Andras; Sali, Andrej; Pieper, Ursula; Eswar, Narayanan; Xu, Guiping; Fajardo, J Eduardo; Radhakannan, Thirumuruhan; Marinkovic, Nebojsa

    2004-10-01

    Structural genomics has as its goal the provision of structural information for all possible ORF sequences through a combination of experimental and computational approaches. The access to genome sequences and cloning resources from an ever-widening array of organisms is driving high-throughput structural studies by the New York Structural Genomics Research Consortium. In this report, we outline the progress of the Consortium in establishing its pipeline for structural genomics, and some of the experimental and bioinformatics efforts leading to structural annotation of proteins. The Consortium has established a pipeline for structural biology studies, automated modeling of ORF sequences using solved (template) structures, and a novel high-throughput approach (metallomics) to examining the metal binding to purified protein targets. The Consortium has so far produced 493 purified proteins from >1077 expression vectors. A total of 95 have resulted in crystal structures, and 81 are deposited in the Protein Data Bank (PDB). Comparative modeling of these structures has generated >40,000 structural models. We also initiated a high-throughput metal analysis of the purified proteins; this has determined that 10%-15% of the targets contain a stoichiometric structural or catalytic transition metal atom. The progress of the structural genomics centers in the U.S. and around the world suggests that the goal of providing useful structural information on most all ORF domains will be realized. This projected resource will provide structural biology information important to understanding the function of most proteins of the cell.

  6. 76 FR 28990 - Ultra High Throughput Sequencing for Clinical Diagnostic Applications-Approaches To Assess...

    Science.gov (United States)

    2011-05-19

    Department of Health and Human Services, Food and Drug Administration: ``Ultra High Throughput Sequencing for Clinical Diagnostic Applications--Approaches To Assess Analytical Validity.'' The notice addresses approaches to assess the analytical validity of ultra high throughput sequencing for clinical diagnostic applications.

  7. High-Throughput Synthesis, Screening, and Scale-Up of Optimized Conducting Indium Tin Oxides

    OpenAIRE

    Marchand, P; Makwana, N. M.; Tighe, C. J.; Gruar, R. I.; Parkin, I. P.; Carmalt, C. J.; Darr, J. A.

    2016-01-01

    A high-throughput optimization and subsequent scale-up methodology has been used for the synthesis of conductive tin-doped indium oxide (known as ITO) nanoparticles. ITO nanoparticles with up to 12 at % Sn were synthesized using a laboratory scale (15 g/hour by dry mass) continuous hydrothermal synthesis process, and the as-synthesized powders were characterized by powder X-ray diffraction, transmission electron microscopy, energy-dispersive X-ray analysis, and X-ray photoelectron spectroscop...

  8. Large scale library generation for high throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Erik Borgström

    Full Text Available BACKGROUND: Large efforts have recently been made to automate the sample preparation protocols for massively parallel sequencing in order to match the increasing instrument throughput. Still, the size selection through agarose gel electrophoresis separation is a labor-intensive bottleneck of these protocols. METHODOLOGY/PRINCIPAL FINDINGS: In this study a method for automatic library preparation and size selection on a liquid handling robot is presented. The method utilizes selective precipitation of certain sizes of DNA molecules onto paramagnetic beads for cleanup and selection after standard enzymatic reactions. CONCLUSIONS/SIGNIFICANCE: The method is used to generate libraries for de novo and re-sequencing on the Illumina HiSeq 2000 instrument with a throughput of 12 samples per instrument in approximately 4 hours. The resulting output data show quality scores and pass filter rates comparable to manually prepared samples. The sample size distribution can be adjusted for each application, and the resulting libraries are suitable for all high throughput DNA processing protocols seeking to control size intervals.

  9. Achieving High Throughput for Data Transfer over ATM Networks

    Science.gov (United States)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
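
    The protocol described above pushes data over UDP and simply tolerates a small fraction of lost packets; the sketch below is a minimal Python illustration of the sending side (not the authors' implementation), where a sequence number in each datagram lets the receiver estimate the loss rate. The host, port and chunk size are placeholders.

        import socket
        import struct

        def send_file(path, host="192.0.2.10", port=9000, chunk=1400):
            """Unreliable bulk sender: prefix each datagram with a 4-byte sequence
            number so the receiver can count gaps and verify loss stays under ~5%."""
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            seq = 0
            with open(path, "rb") as f:
                while True:
                    payload = f.read(chunk)
                    if not payload:
                        break
                    sock.sendto(struct.pack("!I", seq) + payload, (host, port))
                    seq += 1
            sock.close()
            return seq  # total datagrams sent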

  10. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  11. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    Science.gov (United States)

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive imaging based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high throughput amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll
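
    For reference, the PSII operating efficiency screened for above is conventionally computed from the steady-state fluorescence F' and the light-adapted maximum Fm' recorded during a saturating pulse (the Genty parameter); the helper below is a minimal illustration and not part of the cited FluorCam/Scanalyzer setup.

        def phi_psii(fm_prime, f_prime):
            """PSII operating efficiency: PhiPSII = (Fm' - F') / Fm'."""
            if fm_prime <= 0:
                raise ValueError("Fm' must be positive")
            return (fm_prime - f_prime) / fm_prime

        # Illustrative fluorescence readings:
        print(phi_psii(fm_prime=1200.0, f_prime=700.0))  # ~0.42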

  12. High throughput inclusion body sizing: Nano particle tracking analysis.

    Science.gov (United States)

    Reichelt, Wieland N; Kaineder, Andreas; Brillmann, Markus; Neutsch, Lukas; Taschauer, Alexander; Lohninger, Hans; Herwig, Christoph

    2017-06-01

    The expression of pharmaceutically relevant proteins in Escherichia coli frequently triggers inclusion body (IB) formation caused by protein aggregation. In the scientific literature, substantial effort has been devoted to the quantification of IB size. However, particle-based methods used up to this point to analyze the physical properties of representative numbers of IBs lack sensitivity and/or orthogonal verification. Using high pressure freezing and automated freeze substitution for transmission electron microscopy (TEM), the cytosolic inclusion body structure was preserved within the cells. TEM imaging in combination with manual grey scale image segmentation allowed the quantification of relative areas covered by the inclusion body within the cytosol. As a high throughput method, nano particle tracking analysis (NTA) enables one to derive the diameter of inclusion bodies in cell homogenate based on a measurement of the Brownian motion. The NTA analysis of fixated (glutaraldehyde) and non-fixated IBs suggests that high pressure homogenization annihilates the native physiological shape of IBs. Nevertheless, the ratio of particle counts of non-fixated and fixated samples could potentially serve as a factor for particle stickiness. In this contribution, we establish image segmentation of TEM pictures as an orthogonal method to size biologic particles in the cytosol of cells. More importantly, NTA has been established as a particle-based, fast and high throughput method (1000-3000 particles), thus constituting a much more accurate and representative analysis than currently available methods. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
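
    NTA turns the measured Brownian diffusion of each particle into a hydrodynamic diameter through the Stokes-Einstein relation; the conversion below is standard physics rather than code from the cited study, with water-like viscosity as an illustrative default.

        import math

        K_B = 1.380649e-23  # Boltzmann constant, J/K

        def hydrodynamic_diameter(diffusion_coeff, temperature_k=298.15, viscosity=8.9e-4):
            """Stokes-Einstein: d = k_B * T / (3 * pi * eta * D).
            diffusion_coeff in m^2/s, viscosity in Pa*s; returns metres."""
            return K_B * temperature_k / (3.0 * math.pi * viscosity * diffusion_coeff)

        # Illustrative value: D = 5e-13 m^2/s corresponds to a roughly 1 micrometre particle.
        print(hydrodynamic_diameter(5e-13))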

  13. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen

    2017-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such screens is molecular libraries, that is, microtiter plates with solubilized reagents such as siRNAs, shRNAs, miRNA inhibitors or mimics, and sgRNAs, or small compounds, that is, drugs. These reagents are typically condensed to provide enough material for covering several screens. Library plates thus need to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links...

  14. High-throughput genomics enhances tomato breeding efficiency.

    Science.gov (United States)

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-03-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper and eggplant, since it exhibits a reduced genomic size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are currently available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assists the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers highly linked to resistance genes and cloned resistance genes are available and could be used for a high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are then exploited for tracking simultaneously favourable allele combinations in breeding programs using high-throughput genomic technologies. This aims at pyramiding, in the genetic background of commercial cultivars, alleles that increase their performance. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes even for complex traits.

  15. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
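
    The report describes keeping an incoming observation only when it is sufficiently dissimilar from the points already retained; the sketch below illustrates that rule with a plain Euclidean-distance threshold, which is an assumption for illustration rather than the authors' actual similarity measure.

        import math

        def euclidean(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        def adaptive_sample(stream, threshold):
            """Retain a point only if it is farther than `threshold` from every
            point kept so far (a simple novelty criterion)."""
            retained = []
            for point in stream:
                if all(euclidean(point, kept) > threshold for kept in retained):
                    retained.append(point)
            return retained

        # Illustrative stream of 2-D observations:
        stream = [(0.0, 0.0), (0.1, 0.0), (3.0, 4.0), (3.1, 4.1)]
        print(adaptive_sample(stream, threshold=1.0))  # [(0.0, 0.0), (3.0, 4.0)]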

  16. High-throughput sequencing: a roadmap toward community ecology.

    Science.gov (United States)

    Poisot, Timothée; Péquin, Bérangère; Gravel, Dominique

    2013-04-01

    High-throughput sequencing is becoming increasingly important in microbial ecology, yet it is surprisingly under-used to generate or test biogeographic hypotheses. In this contribution, we highlight how adding these methods to the ecologist toolbox will allow the detection of new patterns, and will help our understanding of the structure and dynamics of diversity. Starting with a review of ecological questions that can be addressed, we move on to the technical and analytical issues that will benefit from an increased collaboration between different disciplines.

  17. UAV-based high-throughput phenotyping in legume crops

    Science.gov (United States)

    Sankaran, Sindhuja; Khot, Lav R.; Quirós, Juan; Vandemark, George J.; McGee, Rebecca J.

    2016-05-01

    In plant breeding, one of the biggest obstacles in genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. Therefore, the major objective of this research was to evaluate the feasibility of utilizing high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The plant responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices are strongly correlated with the phenotyping traits.
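
    A typical vegetation index derived from such multispectral UAV imagery is the NDVI, computed per pixel from the near-infrared and red bands; the NumPy sketch below uses the standard formula and is not the authors' processing chain.

        import numpy as np

        def ndvi(nir, red, eps=1e-9):
            """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red);
            eps guards against division by zero on dark pixels."""
            nir = nir.astype(np.float64)
            red = red.astype(np.float64)
            return (nir - red) / (nir + red + eps)

        # Illustrative 2x2 reflectance bands:
        nir = np.array([[0.60, 0.55], [0.40, 0.70]])
        red = np.array([[0.10, 0.12], [0.20, 0.08]])
        print(ndvi(nir, red))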

  18. REDItools: high-throughput RNA editing detection made easy.

    Science.gov (United States)

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools is written in the Python programming language and is freely available at http://code.google.com/p/reditools/. ernesto.picardi@uniba.it or graziano.pesole@uniba.it Supplementary data are available at Bioinformatics online.

  19. High throughput platforms for structural genomics of integral membrane proteins.

    Science.gov (United States)

    Mancia, Filippo; Love, James

    2011-08-01

    Structural genomics approaches on integral membrane proteins have been postulated for over a decade, yet specific efforts are lagging years behind their soluble counterparts. Indeed, high throughput methodologies for production and characterization of prokaryotic integral membrane proteins are only now emerging, while large-scale efforts for eukaryotic ones are still in their infancy. Presented here is a review of recent literature on actively ongoing structural genomics of membrane protein initiatives, with a focus on those aimed at implementing interesting techniques aimed at increasing our rate of success for this class of macromolecules. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  1. Bifrost: Stream processing framework for high-throughput applications

    Science.gov (United States)

    Barsdell, Ben; Price, Daniel; Cranmer, Miles; Garsden, Hugh; Dowell, Jayce

    2017-11-01

    Bifrost is a stream processing framework that eases the development of high-throughput processing CPU/GPU pipelines. It is designed for digital signal processing (DSP) applications within radio astronomy. Bifrost uses a flexible ring buffer implementation that allows different signal processing blocks to be connected to form a pipeline. Each block may be assigned to a CPU core, and the ring buffers are used to transport data to and from blocks. Processing blocks may be run on either the CPU or GPU, and the ring buffer will take care of memory copies between the CPU and GPU spaces.

  2. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  3. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
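
    As a toy illustration of the batch/pipeline idea described above, the sketch below spreads independent trait evaluations over local worker processes with Python's multiprocessing; a real HTC deployment would instead submit each chunk to a cluster scheduler, and the evaluation function here is a hypothetical placeholder.

        from multiprocessing import Pool

        def evaluate_trait(trait_name):
            """Placeholder for one computationally heavy genomic evaluation
            (e.g. fitting the prediction model for a single trait)."""
            return trait_name, "done"

        if __name__ == "__main__":
            traits = ["milk_yield", "fertility", "longevity", "feed_efficiency"]
            with Pool(processes=4) as pool:   # one worker per trait
                results = pool.map(evaluate_trait, traits)
            print(results)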

  4. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  5. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are delivered forward the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative, reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the samples motion inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain, and with intrinsic useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently from the magnification factor. The throughput of the imaging system is dramatically increased as STDH provides unlimited FoV, refocusable imaging of samples inside the liquid volume with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged and tracked in 3D with high-throughput red blood cells moving inside the channel volume under non ideal flow conditions.

  6. An Updated Protocol for High Throughput Plant Tissue Sectioning

    Directory of Open Access Journals (Sweden)

    Jonathan A. Atkinson

    2017-10-01

    Full Text Available Quantification of the tissue and cellular structure of plant material is essential for the study of a variety of plant sciences applications. Currently, many methods for sectioning plant material are either low throughput or involve free-hand sectioning, which requires a significant amount of practice. Here, we present an updated method to provide rapid and high-quality cross sections, primarily of root tissue, but one that can also be readily applied to other tissues such as leaves or stems. To increase the throughput of traditional agarose embedding and sectioning, custom designed 3D printed molds were utilized to embed 5–15 roots in a block for sectioning in a single cut. A single fluorescent stain in combination with laser scanning confocal microscopy was used to obtain high quality images of thick sections. The provided CAD files allow production of the embedding molds described here from a number of online 3D printing services. Although originally developed for roots, this method provides rapid, high quality cross sections of many plant tissue types, making it suitable for use in forward genetic screens for differences in specific cell structures or developmental changes. To demonstrate the utility of the technique, the two parent lines of the wheat (Triticum aestivum) Chinese Spring × Paragon doubled haploid mapping population were phenotyped for root anatomical differences. Significant differences in adventitious cross section area, stele area, xylem, phloem, metaxylem, and cortical cell file count were found.

  7. High-throughput search for improved transparent conducting oxides

    Science.gov (United States)

    Miglio, Anna

    High-throughput methodologies are a very useful computational tool to explore the space of binary and ternary oxides. We use these methods to search for new and improved transparent conducting oxides (TCOs). TCOs exhibit both visible transparency and good carrier mobility and underpin many energy and electronic applications (e.g. photovoltaics, transparent transistors). We find several potential new n-type and p-type TCOs with a low effective mass. Combining different ab initio approaches, we characterize candidate oxides by their effective mass (mobility), band gap (transparency) and dopability. We present several compounds, not considered previously as TCOs, and discuss the chemical rationale for their promising properties. This analysis is useful to formulate design strategies for future high mobility oxides and has led to follow-up studies including preliminary experimental characterization of a p-type TCO candidate with unexpected chemistry. G. Hautier, A. Miglio, D. Waroquiers, G.-M. Rignanese, and X. Gonze, ``How Does Chemistry Influence Electron Effective Mass in Oxides? A High-Throughput Computational Analysis'', Chem. Mater. 26, 5447 (2014). G. Hautier, A. Miglio, G. Ceder, G.-M. Rignanese, and X. Gonze, ``Identification and design principles of low hole effective mass p-type transparent conducting oxides'', Nature Commun. 4, 2292 (2013).
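
    The low carrier effective masses screened for above follow from the curvature of the band edges; in the parabolic-band picture used in such high-throughput analyses, the inverse effective-mass tensor is the textbook expression below (general background, not reproduced from the cited papers):

        \left(\frac{1}{m^{*}}\right)_{ij} = \frac{1}{\hbar^{2}} \, \frac{\partial^{2} E(\mathbf{k})}{\partial k_{i}\,\partial k_{j}}

    A flatter band (smaller curvature) thus means a heavier carrier and lower mobility, which is why candidate oxides are ranked by effective mass in this kind of screening.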

  8. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    Science.gov (United States)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor phase nano-coating process that deposits very uniform and conformal thin film materials with sub-angstrom level thickness control on various substrates. These unique properties have made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to other deposition techniques, which hinders its industrial adoption. In addition to the low throughput, the process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs continuous) or process conditions (e.g., low temperature), absence of an appropriate reactor design for a specific substrate and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high pressure, low temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate to achieve fast depositions under these challenging processing conditions. Also in this work, two unique high throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head. Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD based process called sequential vapor

  9. Miniaturization of High-Throughput Epigenetic Methyltransferase Assays with Acoustic Liquid Handling.

    Science.gov (United States)

    Edwards, Bonnie; Lesnick, John; Wang, Jing; Tang, Nga; Peters, Carl

    2016-02-01

    Epigenetics continues to emerge as an important target class for drug discovery and cancer research. As programs scale to evaluate many new targets related to epigenetic expression, new tools and techniques are required to enable efficient and reproducible high-throughput epigenetic screening. Assay miniaturization increases screening throughput and reduces operating costs. Echo liquid handlers can transfer compounds, samples, reagents, and beads in submicroliter volumes to high-density assay formats using only acoustic energy-no contact or tips required. This eliminates tip costs and reduces the risk of reagent carryover. In this study, we demonstrate the miniaturization of a methyltransferase assay using Echo liquid handlers and two different assay technologies: AlphaLISA from PerkinElmer and EPIgeneous HTRF from Cisbio. © 2015 Society for Laboratory Automation and Screening.

  10. Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation

    Science.gov (United States)

    Potyrailo, Radislav A.; Mirsky, Vladimir M.

    New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.

  11. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
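
    As a rough illustration of the kind of foci quantitation FociQuant performs, the sketch below thresholds an image, labels connected bright spots and sums the fluorescence inside each; it uses generic NumPy/SciPy calls and is not the FociQuant code itself.

        import numpy as np
        from scipy import ndimage

        def quantify_foci(image, threshold):
            """Label connected pixels above `threshold` and return the integrated
            intensity of each focus (no background subtraction)."""
            mask = image > threshold
            labels, n_foci = ndimage.label(mask)
            return [float(v) for v in ndimage.sum(image, labels, index=range(1, n_foci + 1))]

        # Illustrative 5x5 image with two bright spots:
        img = np.zeros((5, 5))
        img[1, 1] = 10.0
        img[3, 3:5] = 7.0
        print(quantify_foci(img, threshold=1.0))  # [10.0, 14.0]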

  12. High-throughput technology for novel SO2 oxidation catalysts

    Directory of Open Access Journals (Sweden)

    Jonas Loskyll, Klaus Stoewe and Wilhelm F Maier

    2011-01-01

    Full Text Available We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of some elements used for prescreening and those remaining after the screening of the first catalyst generations.
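
    With UV-Vis monitoring of this kind, a simple way to express the SO2 conversion is to compare the SO2 absorbance measured with the reactor bypassed against that measured downstream of the catalyst bed; the helper below assumes Beer-Lambert linearity and a fixed optical path, and is an illustrative assumption rather than the authors' published data treatment.

        def so2_conversion(absorbance_in, absorbance_out):
            """Fractional SO2 conversion X = 1 - A_out / A_in, valid while the
            absorbance is proportional to the SO2 concentration."""
            return 1.0 - absorbance_out / absorbance_in

        print(so2_conversion(0.80, 0.32))  # ~0.6, i.e. 60 % conversion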

  13. Fusion genes and their discovery using high throughput sequencing.

    Science.gov (United States)

    Annala, M J; Parker, B C; Zhang, W; Nykter, M

    2013-11-01

    Fusion genes are hybrid genes that combine parts of two or more original genes. They can form as a result of chromosomal rearrangements or abnormal transcription, and have been shown to act as drivers of malignant transformation and progression in many human cancers. The biological significance of fusion genes together with their specificity to cancer cells has made them into excellent targets for molecular therapy. Fusion genes are also used as diagnostic and prognostic markers to confirm cancer diagnosis and monitor response to molecular therapies. High-throughput sequencing has enabled the systematic discovery of fusion genes in a wide variety of cancer types. In this review, we describe the history of fusion genes in cancer and the ways in which fusion genes form and affect cellular function. We also describe computational methodologies for detecting fusion genes from high-throughput sequencing experiments, and the most common sources of error that lead to false discovery of fusion genes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  15. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high throughput methods for analysis of genomes. Nowadays, any genetic experiment involves data on thousands or even tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch or the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which allows much more rapid data acquisition, higher accuracy of the assessment of phenotypic features, measurement of new parameters of these features and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between the genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  16. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.

  17. Nanoliter high-throughput PCR for DNA and RNA profiling.

    Science.gov (United States)

    Brenan, Colin J H; Roberts, Douglas; Hurley, James

    2009-01-01

    The increasing emphasis in life science research on utilization of genetic and genomic information underlies the need for high-throughput technologies capable of analyzing the expression of multiple genes or the presence of informative single nucleotide polymorphisms (SNPs) in large-scale, population-based applications. Human disease research, disease diagnosis, personalized therapeutics, environmental monitoring, blood testing, and identification of genetic traits impacting agricultural practices, both in terms of food quality and production efficiency, are a few areas where such systems are in demand. This has stimulated the need for PCR technologies that preserve the intrinsic analytical benefits of PCR yet enable higher throughputs without increasing the time to answer, labor and reagent expenses, or workflow complexity. An example of such a system based on a high-density array of nanoliter PCR assays is described here. Functionally equivalent to a microtiter plate, the nanoplate system makes possible up to 3,072 simultaneous end-point or real-time PCR measurements in a device the size of a standard microscope slide. Methods for SNP genotyping with end-point TaqMan PCR assays and quantitative measurement of gene expression with SYBR Green I real-time PCR are outlined, and illustrative data showing system performance are provided.

  18. Structuring intuition with theory: The high-throughput way

    Science.gov (United States)

    Fornari, Marco

    2015-03-01

    First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple framework to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies to guide the discovery process by quantitatively structuring the scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.

  19. An Air-Well sparging minifermenter system for high-throughput protein production.

    Science.gov (United States)

    Deantonio, Cecilia; Sedini, Valentina; Cesaro, Patrizia; Quasso, Fabio; Cotella, Diego; Persichetti, Francesca; Santoro, Claudio; Sblattero, Daniele

    2014-09-14

    Over the last few years, High-Throughput Protein Production (HTPP) has played a crucial role in functional proteomics. High-quality, high-yield and fast recombinant protein production is critical for new HTPP technologies. Escherichia coli is usually the expression system of choice in protein production thanks to its fast growth, ease of handling and high yields of protein produced. Even though shake-flask cultures are widely used, there is an increasing need for easy to handle, lab scale, high throughput systems. In this article we describe a novel minifermenter system suitable for HTPP. The Air-Well minifermenter system is made up of a homogeneous air-sparging device that includes an air diffusion system and a stainless steel 96-needle plate integrated with a 96-deep-well plate where cultures take place. This system provides aeration to achieve higher optical density growth compared to classical shaking growth, without the decrease in pH value and bacterial viability. Moreover, the yield of recombinant protein is up to 3-fold higher, with a considerable improvement in the amount of full length proteins. High throughput production of hundreds of proteins in parallel can be obtained by sparging air in a continuous and controlled manner. The system used is modular and can be easily modified and scaled up to meet the demands for HTPP.

  20. Microfluidic droplet-based PCR instrumentation for high-throughput gene expression profiling and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Christopher J. Hayes

    2015-06-01

    Full Text Available PCR is a common and often indispensable technique used in medical and biological research labs for a variety of applications. Real-time quantitative PCR (RT-qPCR) has become a definitive technique for quantitating differences in gene expression levels between samples. Yet, in spite of this importance, reliable methods to quantitate nucleic acid amounts at higher throughput remain elusive. In the following paper, a unique design to quantify gene expression levels at the nanoscale in a continuous flow system is presented. Fully automated, high-throughput, low-volume amplification of deoxynucleotides (DNA) in a droplet-based microfluidic system is described. Unlike some conventional qPCR instrumentation that uses integrated fluidic circuits or plate arrays, the instrument performs qPCR in a continuous, micro-droplet flowing process with droplet generation, distinctive reagent mixing, thermal cycling and optical detection platforms all combined on one complete instrument. Detailed experimental profiling of reactions of less than 300 nl total volume is achieved using the platform, demonstrating a dynamic range of four orders of magnitude and consistent instrument sensitivity. Furthermore, a reduction in pipetting steps by as much as 90% and a unique degree of hands-free automation make the analytical possibilities for this instrumentation far reaching. In conclusion, a discussion of the first demonstrations of this approach to perform novel, continuous high-throughput biological screens is presented. The results generated from the instrument, when compared with commercial instrumentation, demonstrate the instrument's reliability and robustness to carry out further studies of clinical significance with added throughput and economic benefits.
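
    As a concrete illustration of the relative quantification such an instrument supports, the short Python example below works through the standard delta-delta-Cq calculation; the Cq values are invented and roughly 100% amplification efficiency is assumed (this is not the instrument's own software).

        # Worked delta-delta-Cq example with made-up quantification cycles (Cq).
        target_cq = {"treated": 24.1, "control": 26.3}   # gene of interest
        ref_cq = {"treated": 18.0, "control": 18.2}      # housekeeping reference

        dcq_treated = target_cq["treated"] - ref_cq["treated"]   # ~6.1
        dcq_control = target_cq["control"] - ref_cq["control"]   # ~8.1
        ddcq = dcq_treated - dcq_control                         # ~-2.0

        fold_change = 2 ** (-ddcq)                   # assumes a doubling per cycle
        print(f"fold change = {fold_change:.1f}x")   # ~4-fold up-regulation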

  1. A robust robotic high-throughput antibody purification platform.

    Science.gov (United States)

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market with annual sales of more than 40 billion US$ in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts on the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns(®) from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO(®) from Tecan) has been a successful approach to establish high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at sub-milligram level with yields of up to 2 mg from 4 mL of cell-free culture supernatant. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. High-throughput microcavitation bubble induced cellular mechanotransduction

    Science.gov (United States)

    Compton, Jonathan Lee

    inhibitor to IP3-induced Ca2+ release. This capability opens the way to the development of a high-throughput screening platform for molecules that modulate cellular mechanotransduction. We have applied this approach to screen the effects of a small set of small molecules in a 96-well plate in less than an hour. These detailed studies offer a basis for the design, development, and implementation of a novel high-throughput mechanotransduction assay to rapidly screen the effect of small molecules on cellular mechanotransduction.

  3. High-throughput ballistic injection nanorheology to measure cell mechanics

    Science.gov (United States)

    Wu, Pei-Hsun; Hale, Christopher M; Chen, Wei-Chiang; Lee, Jerry S H; Tseng, Yiider; Wirtz, Denis

    2015-01-01

    High-throughput ballistic injection nanorheology is a method for the quantitative study of cell mechanics. Cell mechanics are measured by ballistic injection of submicron particles into the cytoplasm of living cells and tracking the spontaneous displacement of the particles at high spatial resolution. The trajectories of the cytoplasm-embedded particles are transformed into mean-squared displacements, which are subsequently transformed into frequency-dependent viscoelastic moduli and time-dependent creep compliance of the cytoplasm. This method allows for the study of a wide range of cellular conditions, including cells inside a 3D matrix, cells subjected to shear flows and biochemical stimuli, and cells in a live animal. Ballistic injection lasts < 1 min and is followed by overnight incubation. Multiple particle tracking for one cell lasts < 1 min. Forty cells can be examined in < 1 h. PMID:22222790
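
    The central analysis step described above, turning trajectories into mean-squared displacements (MSD), can be sketched in a few lines of NumPy; the trajectory below is synthetic, and the further conversion to viscoelastic moduli and creep compliance via the generalized Stokes-Einstein relation is omitted.

        # Time-averaged MSD from a (synthetic) 2D particle trajectory.
        import numpy as np

        def msd(track, max_lag):
            """Time-averaged MSD of a trajectory with shape (n_frames, 2)."""
            return np.array([
                np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                for lag in range(1, max_lag + 1)
            ])

        rng = np.random.default_rng(0)
        dt = 0.033                                                     # seconds per frame (~30 fps)
        track = np.cumsum(rng.normal(0.0, 0.05, (1800, 2)), axis=0)    # ~1 min fake track, in um

        lags = np.arange(1, 101)
        curve = msd(track, 100)
        print("MSD near tau = 1 s:", curve[np.searchsorted(lags * dt, 1.0)], "um^2")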

  4. Machine Learning for High-Throughput Stress Phenotyping in Plants.

    Science.gov (United States)

    Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh Kumar; Sarkar, Soumik

    2016-02-01

    Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits. Copyright © 2015 Elsevier Ltd. All rights reserved.
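
    As a toy illustration of the "classification" stage of the ICQP cycle, the sketch below trains a supervised model on synthetic image-derived features; the feature names, thresholds, and data are invented and not taken from the review.

        # Hypothetical stressed-vs-healthy classification from image-derived features.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 600
        features = np.column_stack([
            rng.normal(0.45, 0.10, n),   # e.g. mean greenness index
            rng.normal(0.30, 0.05, n),   # e.g. canopy cover fraction
            rng.normal(22.0, 3.00, n),   # e.g. canopy temperature (deg C)
        ])
        labels = ((features[:, 2] > 24) & (features[:, 0] < 0.45)).astype(int)  # toy ground truth

        X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))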

  5. The Principles and Practice of Distributed High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity was recognized and formalized in the 1970s. For more than three decades these principles of Distributed Computing have guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor is based on novel distributed computing technologies and is driven by the evolving needs of High Throughput scientific applications. We will review the principles that underpin our work, the distributed computing frameworks and technologies we developed and the lessons we learned from delivering effective and dependable software tools in an ever-changing landscape of computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds. About the speaker Miron Livny received a B.Sc. degree in Physics and Mat...
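
    A minimal sketch of queueing a batch of independent high-throughput jobs through HTCondor's Python bindings is shown below; it assumes the htcondor package with the Schedd.submit API of recent releases, and the worker script, file paths, and resource requests are placeholders.

        # Queue 100 independent jobs with the HTCondor Python bindings (assumed API).
        import htcondor

        sub = htcondor.Submit({
            "executable": "analyze_sample.sh",    # hypothetical worker script
            "arguments": "$(ProcId)",             # each job processes its own index
            "output": "out/job_$(ProcId).out",
            "error": "out/job_$(ProcId).err",
            "log": "out/jobs.log",
            "request_cpus": "1",
            "request_memory": "2GB",
        })

        schedd = htcondor.Schedd()                # local scheduler
        result = schedd.submit(sub, count=100)    # one cluster, 100 procs
        print("submitted cluster", result.cluster())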

  6. Single-platelet nanomechanics measured by high-throughput cytometry

    Science.gov (United States)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2017-02-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  7. High-throughput antibody development and retrospective epitope mapping

    DEFF Research Database (Denmark)

    Rydahl, Maja Gro

    Plant cell walls are composed of an interlinked network of polysaccharides, glycoproteins and phenolic polymers. When addressing the diverse polysaccharides in green plants, including land plants and the ancestral green algae, there are significant overlaps in the cell wall structures. Yet......, there are noteworthy differences in the less evolved species of algae as compared to land plants. The dynamic process orchestrating the deposition of these biopolymers both in algae and higher plants, is complex and highly heterogeneous, yet immensely important for the development and differentiation of the cell...... of green algae, during the development into land plants. Hence, there is a pressing need for rethinking the glycomic toolbox, by developing new and high-throughput (HTP) technology, in order to acquire information of the location and relative abundance of diverse cell wall polymers. In this dissertation...

  8. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  9. Ethoscopes: An open platform for high-throughput ethomics.

    Science.gov (United States)

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J; French, Alice S; Jamasb, Arian R; Gilestro, Giorgio F

    2017-10-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  10. Ethoscopes: An open platform for high-throughput ethomics

    Science.gov (United States)

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J.; French, Alice S.; Jamasb, Arian R.

    2017-01-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope. PMID:29049280

  11. High-throughput drawing and testing of metallic glass nanostructures.

    Science.gov (United States)

    Hasan, Molla; Kumar, Golden

    2017-03-02

    Thermoplastic embossing of metallic glasses promises direct imprinting of metal nanostructures using templates. However, embossing high-aspect-ratio nanostructures faces unworkable flow resistance due to friction and non-wetting conditions at the template interface. Herein, we show that these inherent challenges of embossing can be reversed by thermoplastic drawing using templates. The flow resistance not only remains independent of wetting but also decreases with increasing feature aspect-ratio. Arrays of assembled nanotips, nanowires, and nanotubes with aspect-ratios exceeding 1000 can be produced through controlled elongation and fracture of metallic glass structures. In contrast to embossing, the drawing approach generates two sets of nanostructures upon final fracture; one set remains anchored to the metallic glass substrate while the second set is assembled on the template. This method can be readily adapted for high-throughput fabrication and testing of nanoscale tensile specimens, enabling rapid screening of size-effects in mechanical behavior.

  12. Statistically invalid classification of high throughput gene expression data

    Science.gov (United States)

    Barbash, Shahar; Soreq, Hermona

    2013-01-01

    Classification analysis based on high throughput data is a common feature in neuroscience and other fields of science, with a rapidly increasing impact on both basic biology and disease-related studies. The outcome of such classifications often serves to delineate novel biochemical mechanisms in health and disease states, identify new targets for therapeutic interference, and develop innovative diagnostic approaches. Given the importance of this type of studies, we screened 111 recently-published high-impact manuscripts involving classification analysis of gene expression, and found that 58 of them (53%) based their conclusions on a statistically invalid method which can lead to bias in a statistical sense (lower true classification accuracy than the reported classification accuracy). In this report we characterize the potential methodological error and its scope, investigate how it is influenced by different experimental parameters, and describe statistically valid methods for avoiding such classification mistakes. PMID:23346359
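
    The abstract does not name the specific error, but a classic source of exactly this kind of bias is selecting discriminative genes on the full data set before cross-validation; the sketch below contrasts that with a valid pipeline in which selection is refit inside every training fold (scikit-learn assumed, data purely synthetic).

        # Biased vs. valid accuracy estimation when features vastly outnumber samples.
        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import Pipeline
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 5000))      # 50 samples, 5000 random "genes"
        y = rng.integers(0, 2, size=50)      # labels unrelated to the data

        # Biased: the selector has already seen the labels of every test fold.
        X_sel = SelectKBest(f_classif, k=20).fit_transform(X, y)
        biased = cross_val_score(LinearSVC(), X_sel, y, cv=5).mean()

        # Valid: selection is part of the model and refit within each training fold.
        pipe = Pipeline([("select", SelectKBest(f_classif, k=20)), ("clf", LinearSVC())])
        valid = cross_val_score(pipe, X, y, cv=5).mean()

        print(f"biased estimate ~{biased:.2f}, valid estimate ~{valid:.2f}")  # valid ~0.5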

  13. High throughput sequencing of microRNAs in chicken somites.

    Science.gov (United States)

    Rathjen, Tina; Pais, Helio; Sweetman, Dylan; Moulton, Vincent; Munsterberg, Andrea; Dalmay, Tamas

    2009-05-06

    High throughput Solexa sequencing technology was applied to identify microRNAs in somites of developing chicken embryos. We obtained 651,273 reads, from which 340,415 were mapped to the chicken genome representing 1701 distinct sequences. Eighty-five of these were known microRNAs and 42 novel miRNA candidates were identified. Accumulation of 18 of 42 sequences was confirmed by Northern blot analysis. Ten of the 18 sequences are new variants of known miRNAs and eight short RNAs are novel miRNAs. Six of these eight have not been reported by other deep sequencing projects. One of the six new miRNAs is highly enriched in somite tissue suggesting that deep sequencing of other specific tissues has the potential to identify novel tissue specific miRNAs.

  14. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Full Text Available Abstract Background Preparation of large quantity and high quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as, genetic mapping, TILLING (Targeting Induced Local Lesion IN Genome), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on MagAttract kit. The method yielded large quantity and high quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30 fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample of magnetic beads plus other consumables that other methods will also need.

  15. A high throughput DNA extraction method with high yield and quality.

    Science.gov (United States)

    Xin, Zhanguo; Chen, Junping

    2012-07-28

    Preparation of large quantity and high quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as, genetic mapping, TILLING (Targeting Induced Local Lesion IN Genome), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on MagAttract kit. The method yielded large quantity and high quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30 fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample of magnetic beads plus other consumables that other methods will also need.

  16. High-throughput phenotyping of seminal root traits in wheat.

    Science.gov (United States)

    Richard, Cecile Ai; Hickey, Lee T; Fletcher, Susan; Jennings, Raeleen; Chenu, Karine; Christopher, Jack T

    2015-01-01

    Water availability is a major limiting factor for wheat (Triticum aestivum L.) production in rain-fed agricultural systems worldwide. Root system architecture has important functional implications for the timing and extent of soil water extraction, yet selection for root architectural traits in breeding programs has been limited by a lack of suitable phenotyping methods. The aim of this research was to develop low-cost high-throughput phenotyping methods to facilitate selection for desirable root architectural traits. Here, we report two methods, one using clear pots and the other using growth pouches, to assess the angle and the number of seminal roots in wheat seedlings, two proxy traits associated with the root architecture of mature wheat plants. Both methods revealed genetic variation for seminal root angle and number in the panel of 24 wheat cultivars. The clear pot method provided higher heritability and higher genetic correlations across experiments compared to the growth pouch method. In addition, the clear pot method was more efficient, requiring less time, space, and labour compared to the growth pouch method. Therefore, the clear pot method was considered the most suitable for large-scale and high-throughput screening of seedling root characteristics in crop improvement programs. The clear pot method could be easily integrated in breeding programs targeting drought tolerance to rapidly enrich breeding populations with desirable alleles. For instance, selection for narrow root angle and high number of seminal roots could lead to deeper root systems with higher branching at depth. Such root characteristics are highly desirable in wheat to cope with anticipated future climate conditions, particularly where crops rely heavily on stored soil moisture at depth, including some Australian, Indian, South American, and African cropping regions.

  17. High-Throughput Genomics Enhances Tomato Breeding Efficiency

    Science.gov (United States)

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-01-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper, and eggplant, since it exhibits a reduced genomic size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are actually available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assists the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers highly linked to resistance genes and cloned resistance genes are available and could be used for a high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are then exploited for tracking simultaneously favourable allele combinations in breeding programs using high-throughput genomic technologies. This aims at pyramiding, in the genetic background of commercial cultivars, alleles that increase their performance. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes even for complex traits. PMID:19721805

  18. Surrogate-assisted feature extraction for high-throughput phenotyping.

    Science.gov (United States)

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision and natural language processing counts, acting as noisy surrogates to the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature extraction for phenotyping procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were remarkably higher than those from the other two, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the needed number of gold-standard labels. SAFE also potentially identifies important features missed by automated feature extraction for phenotyping or experts.
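
    A rough, simplified sketch of the surrogate-assisted idea follows: surrogate counts define silver-standard labels, and a sparse model keeps the candidate features that predict them. The data are simulated and the code is an illustration, not the published SAFE implementation.

        # Silver-standard labels from a noisy surrogate, then sparse feature selection.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n_patients, n_features = 2000, 300
        X = rng.poisson(1.0, (n_patients, n_features)).astype(float)   # candidate concept counts
        icd_count = X[:, 0] + X[:, 1] + rng.poisson(0.5, n_patients)   # surrogate driven by two true features

        # Silver-standard labels: patients with extreme surrogate counts are treated as cases.
        silver = (icd_count >= np.quantile(icd_count, 0.9)).astype(int)

        lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
        lasso.fit(np.log1p(X), silver)
        selected = np.flatnonzero(lasso.coef_[0])   # features passed on to supervised training
        print("selected feature indices:", selected)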

  19. A Primer on High-Throughput Computing for Genomic Selection

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
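
    As a minimal illustration of the batch-processing principle described above, the sketch below evaluates several traits in parallel with a worker pool; the ridge-type solve merely stands in for a real genomic prediction model, and all data are simulated.

        # Evaluate several traits concurrently to raise throughput on a multi-core node.
        from multiprocessing import Pool

        import numpy as np

        def evaluate_trait(trait_id):
            """Stand-in for one trait's marker-effect fit (ridge regression)."""
            rng = np.random.default_rng(trait_id)
            X = rng.normal(size=(500, 2000))                 # 500 animals x 2000 markers
            y = X[:, :10].sum(axis=1) + rng.normal(size=500)
            lam = 100.0                                      # shrinkage parameter
            b = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
            return trait_id, float(np.abs(b).max())

        if __name__ == "__main__":
            with Pool(processes=4) as pool:
                for trait, top_effect in pool.map(evaluate_trait, range(8)):
                    print(f"trait {trait}: largest estimated marker effect {top_effect:.3f}")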

  20. An improved high throughput sequencing method for studying oomycete communities

    DEFF Research Database (Denmark)

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-01-01

    Culture-independent studies using next generation sequencing have revolutionized microbial ecology; however, oomycete ecology in soils is severely lagging behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomycete...... agricultural fields in Denmark, and 11 samples from carrot tissue with symptoms of Pythium infection. Sequence data from the Pythium and Phytophthora mock communities showed that our strategy successfully detected all included species. Taxonomic assignments of OTUs from 26 soil samples showed that 95...... the usefulness of the method not only in soil DNA but also in a plant DNA background. In conclusion, we demonstrate a successful approach for pyrosequencing of oomycete communities using ITS1 as the barcode sequence with well-known primers for oomycete DNA amplification....

  1. High-Throughput Screening Using Fourier-Transform Infrared Imaging

    Directory of Open Access Journals (Sweden)

    Erdem Sasmaz

    2015-06-01

    Full Text Available Efficient parallel screening of combinatorial libraries is one of the most challenging aspects of the high-throughput (HT) heterogeneous catalysis workflow. Today, a number of methods have been used in HT catalyst studies, including various optical, mass-spectrometry, and gas-chromatography techniques. Of these, rapid-scanning Fourier-transform infrared (FTIR) imaging is one of the fastest and most versatile screening techniques. Here, the new design of the 16-channel HT reactor is presented and test results for its accuracy and reproducibility are shown. The performance of the system was evaluated through the oxidation of CO over commercial Pd/Al2O3 and cobalt oxide nanoparticles synthesized with different reducer-reductant molar ratios, surfactant types, metal and surfactant concentrations, synthesis temperatures, and ramp rates.

  2. High throughput parametric studies of the structure of complex nanomaterials

    Science.gov (United States)

    Tian, Peng

    The structure of nanoscale materials is difficult to study because crystallography, the gold-standard for structure studies, no longer works at the nanoscale. New tools are needed to study nanostructure. Furthermore, it is important to study the evolution of nanostructure of complex nanostructured materials as a function of various parameters such as temperature or other environmental variables. These are called parametric studies because an environmental parameter is being varied. This means that the new tools for studying nanostructure also need to be extended to work quickly and on large numbers of datasets. This thesis describes the development of new tools for high throughput studies of complex and nanostructured materials, and their application to study the structural evolution of bulk, and nanoparticles of, MnAs as a function of temperature. The tool for high throughput analysis of the bulk material was developed as part of this PhD thesis work and is called SrRietveld. A large part of making a new tool is to validate it and we did this for SrRietveld by carrying out a high-throughput study of uncertainties coming from the program using different ways of estimating the uncertainty. This tool was applied to study structural changes in MnAs as a function of temperature. We were also interested in studying different MnAs nanoparticles fabricated through different methods because of their applications in information storage. PDFgui, an existing tool for analyzing nanoparticles using Pair distribution function (PDF) refinement, was used in these cases. Comparing the results from the analysis by SrRietveld and PDFgui, we got more comprehensive structure information about MnAs. The layout of the thesis is as follows. First, the background knowledge about material structures is given. The conventional crystallographic analysis is introduced in both theoretical and practical ways. For high throughput study, the next-generation Rietveld analysis program: Sr

  3. High-Throughput Automation in Chemical Process Development.

    Science.gov (United States)

    Selekman, Joshua A; Qiu, Jun; Tran, Kristy; Stevens, Jason; Rosso, Victor; Simmons, Eric; Xiao, Yi; Janey, Jacob

    2017-06-07

    High-throughput (HT) techniques built upon laboratory automation technology and coupled to statistical experimental design and parallel experimentation have enabled the acceleration of chemical process development across multiple industries. HT technologies are often applied to interrogate wide, often multidimensional experimental spaces to inform the design and optimization of any number of unit operations that chemical engineers use in process development. In this review, we outline the evolution of HT technology and provide a comprehensive overview of how HT automation is used throughout different industries, with a particular focus on chemical and pharmaceutical process development. In addition, we highlight the common strategies of how HT automation is incorporated into routine development activities to maximize its impact in various academic and industrial settings.

  4. Interactive Visual Analysis of High Throughput Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Potok, Thomas E [ORNL; Patton, Robert M [ORNL; Goodall, John R [ORNL; Maness, Christopher S [ORNL; Senter, James K [ORNL; Potok, Thomas E [ORNL

    2012-01-01

    The scale, velocity, and dynamic nature of large scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest, which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  5. High-Throughput Mass Spectrometry Applied to Structural Genomics

    Directory of Open Access Journals (Sweden)

    Rod Chalk

    2014-10-01

    Full Text Available Mass spectrometry (MS) remains under-utilized for the analysis of expressed proteins because it is inaccessible to the non-specialist, and sample turnaround from service labs is slow. Here, we describe 3.5 min Liquid-Chromatography (LC-MS) and 16 min LC-MSMS methods which are tailored to validation and characterization of recombinant proteins in a high throughput structural biology pipeline. We illustrate the type and scope of MS data typically obtained from a 96-well expression and purification test for both soluble and integral membrane proteins (IMPs), and describe their utility in the selection of constructs for scale-up structural work, leading to cost and efficiency savings. We propose that the value of MS data lies in how quickly it becomes available and that this can fundamentally change the way in which it is used.
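
    One routine use of such intact-mass data is checking the observed mass of a purified construct against the mass predicted from its sequence; a hedged sketch using Biopython is shown below (the sequence and the "observed" mass are invented for illustration).

        # Compare a predicted average mass with a (pretend) deconvoluted LC-MS mass.
        from Bio.SeqUtils.ProtParam import ProteinAnalysis

        construct = "MHHHHHHSSGVDLGTENLYFQSNAMKVLITGAGSGIGLE"      # hypothetical His-tagged construct
        predicted = ProteinAnalysis(construct).molecular_weight()   # average mass in Da

        observed = predicted + 178.0   # pretend observed intact mass (illustrative)
        delta = observed - predicted
        print(f"predicted {predicted:.1f} Da, observed {observed:.1f} Da, delta {delta:+.1f} Da")
        # A ~+178 Da shift on an E. coli-expressed His-tagged protein is commonly
        # attributed to gluconoylation; ~-131 Da would suggest loss of the initiator Met.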

  6. Applications of High-Throughput Nucleotide Sequencing (PhD)

    DEFF Research Database (Denmark)

    Waage, Johannes

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come...... equally large demands in data handling, analysis and interpretation, perhaps defining the modern challenge of the computational biologist of the post-genomic era. The first part of this thesis consists of a general introduction to the history, common terms and challenges of next generation sequencing......). For the second flavor, DNA-seq, a study presenting genome wide profiling of transcription factor CEBP/A in liver cells undergoing regeneration after partial hepatectomy (article IV) is included....

  7. Automated high-throughput behavioral analyses in zebrafish larvae.

    Science.gov (United States)

    Richendrfer, Holly; Créton, Robbert

    2013-07-04

    We have created a novel high-throughput imaging system for the analysis of behavior in 7-day-old zebrafish larvae in multi-lane plates. This system measures spontaneous behaviors and the response to an aversive stimulus, which is shown to the larvae via a PowerPoint presentation. The recorded images are analyzed with an ImageJ macro, which automatically splits the color channels, subtracts the background, and applies a threshold to identify the placement of individual larvae in the lanes. We can then import the coordinates into an Excel sheet to quantify swim speed, preference for the edge or side of the lane, resting behavior, thigmotaxis, distance between larvae, and avoidance behavior. Subtle changes in behavior are easily detected using our system, making it useful for behavioral analyses after exposure to environmental toxicants or pharmaceuticals.
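
    The published analysis runs as an ImageJ macro; purely for illustration, a rough Python/scikit-image equivalent of the described steps (channel split, background subtraction, thresholding, locating larvae) might look like the sketch below, with the file name and parameters assumed.

        # Approximate the described ImageJ steps with scikit-image (illustrative only).
        from skimage import filters, io, measure
        from skimage.restoration import rolling_ball

        frame = io.imread("plate_frame.png")       # hypothetical RGB image of the multi-lane plate
        red = frame[..., 0].astype(float)          # work on one colour channel

        background = rolling_ball(red, radius=50)  # smooth background estimate (assumes bright larvae
        corrected = red - background               # on a dark background; invert the channel if reversed)

        mask = corrected > filters.threshold_otsu(corrected)
        labels = measure.label(mask)
        for region in measure.regionprops(labels):
            y, x = region.centroid                 # larva coordinates feed the speed/thigmotaxis metrics
            print(f"larva at x={x:.1f}, y={y:.1f}, area={region.area}")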

  8. High-throughput ab-initio dilute solute diffusion database

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-01-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world. PMID:27434308

  9. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    ...beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of siRNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, thereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using a robotic screening platform. Furthermore, we automated sample tracking and data analysis by developing a bundled bioinformatics tool named “MIRACLE”. Automation and RPPA-based viability/toxicity readouts enable rapid testing of large sample numbers, while granting the possibility for flexible consecutive...

  10. A High-Throughput Antibody-Based Microarray Typing Platform

    Science.gov (United States)

    Andrew, Gehring; Charles, Barnett; Chu, Ted; DebRoy, Chitrita; D'Souza, Doris; Eaker, Shannon; Fratamico, Pina; Gillespie, Barbara; Hegde, Narasimha; Jones, Kevin; Lin, Jun; Oliver, Stephen; Paoli, George; Perera, Ashan; Uknalis, Joseph

    2013-01-01

    Many rapid methods have been developed for screening foods for the presence of pathogenic microorganisms. Rapid methods that have the additional ability to identify microorganisms via multiplexed immunological recognition have the potential for classification or typing of microbial contaminants thus facilitating epidemiological investigations that aim to identify outbreaks and trace back the contamination to its source. This manuscript introduces a novel, high throughput typing platform that employs microarrayed multiwell plate substrates and laser-induced fluorescence of the nucleic acid intercalating dye/stain SYBR Gold for detection of antibody-captured bacteria. The aim of this study was to use this platform for comparison of different sets of antibodies raised against the same pathogens as well as demonstrate its potential effectiveness for serotyping. To that end, two sets of antibodies raised against each of the “Big Six” non-O157 Shiga toxin-producing E. coli (STEC) as well as E. coli O157:H7 were array-printed into microtiter plates, and serial dilutions of the bacteria were added and subsequently detected. Though antibody specificity was not sufficient for the development of an STEC serotyping method, the STEC antibody sets performed reasonably well, showing that specificity increased at lower capture antibody concentrations or, conversely, at lower bacterial target concentrations. The favorable results indicated that with sufficiently selective and ideally concentrated sets of biorecognition elements (e.g., antibodies or aptamers), this high-throughput platform can be used to rapidly type microbial isolates derived from food samples within ca. 80 min of total assay time. It can also potentially be used to detect the pathogens from food enrichments and at least serve as a platform for testing antibodies. PMID:23645110

  11. Compound Cytotoxicity Profiling Using Quantitative High-Throughput Screening

    Science.gov (United States)

    Xia, Menghang; Huang, Ruili; Witt, Kristine L.; Southall, Noel; Fostel, Jennifer; Cho, Ming-Hsuang; Jadhav, Ajit; Smith, Cynthia S.; Inglese, James; Portier, Christopher J.; Tice, Raymond R.; Austin, Christopher P.

    2008-01-01

    Background The propensity of compounds to produce adverse health effects in humans is generally evaluated using animal-based test methods. Such methods can be relatively expensive, low-throughput, and associated with pain suffered by the treated animals. In addition, differences in species biology may confound extrapolation to human health effects. Objective The National Toxicology Program and the National Institutes of Health Chemical Genomics Center are collaborating to identify a battery of cell-based screens to prioritize compounds for further toxicologic evaluation. Methods A collection of 1,408 compounds previously tested in one or more traditional toxicologic assays were profiled for cytotoxicity using quantitative high-throughput screening (qHTS) in 13 human and rodent cell types derived from six common targets of xenobiotic toxicity (liver, blood, kidney, nerve, lung, skin). Selected cytotoxicants were further tested to define response kinetics. Results qHTS of these compounds produced robust and reproducible results, which allowed cross-compound, cross-cell type, and cross-species comparisons. Some compounds were cytotoxic to all cell types at similar concentrations, whereas others exhibited species- or cell type–specific cytotoxicity. Closely related cell types and analogous cell types in human and rodent frequently showed different patterns of cytotoxicity. Some compounds inducing similar levels of cytotoxicity showed distinct time dependence in kinetic studies, consistent with known mechanisms of toxicity. Conclusions The generation of high-quality cytotoxicity data on this large library of known compounds using qHTS demonstrates the potential of this methodology to profile a much broader array of assays and compounds, which, in aggregate, may be valuable for prioritizing compounds for further toxicologic evaluation, identifying compounds with particular mechanisms of action, and potentially predicting in vivo biological response. PMID:18335092

  12. High throughput phenotyping for aphid resistance in large plant collections

    Directory of Open Access Journals (Sweden)

    Chen Xi

    2012-08-01

    Full Text Available Abstract Background Phloem-feeding insects are among the most devastating pests worldwide. They not only cause damage by feeding from the phloem, thereby depleting the plant of photo-assimilates, but also by vectoring viruses. Until now, the main way to prevent such problems is the frequent use of insecticides. Applying resistant varieties would be a more environmentally friendly and sustainable solution. For this, resistant sources need to be identified first. Up to now there were no methods suitable for high throughput phenotyping of plant germplasm to identify sources of resistance towards phloem-feeding insects. Results In this paper we present a high throughput screening system to identify plants with an increased resistance against aphids. Its versatility is demonstrated using an Arabidopsis thaliana activation tag mutant line collection. This system consists of the green peach aphid Myzus persicae (Sulzer) and the circulative virus Turnip yellows virus (TuYV). In an initial screening, with one plant representing one mutant line, 13 virus-free mutant lines were identified by ELISA. Using seeds produced from these lines, the putative candidates were re-evaluated and characterized, resulting in nine lines with increased resistance towards the aphid. Conclusions This M. persicae-TuYV screening system is an efficient, reliable and quick procedure to identify among thousands of mutated lines those resistant to aphids. In our study, nine mutant lines with increased resistance against the aphid were selected among 5160 mutant lines in just 5 months by one person. The system can be extended to other phloem-feeding insects and circulative viruses to identify insect resistant sources from several collections, including for example genebanks and artificially prepared mutant collections.

  13. High-throughput DNA extraction of forensic adhesive tapes.

    Science.gov (United States)

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has since its introduction in the early 2000s become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  14. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  15. Emerging metrology for high-throughput nanomaterial genotoxicology.

    Science.gov (United States)

    Nelson, Bryant C; Wright, Christa W; Ibuki, Yuko; Moreno-Villanueva, Maria; Karlsson, Hanna L; Hendriks, Giel; Sims, Christopher M; Singh, Neenu; Doak, Shareen H

    2017-01-01

    The rapid development of the engineered nanomaterial (ENM) manufacturing industry has accelerated the incorporation of ENMs into a wide variety of consumer products across the globe. Unintentionally or not, some of these ENMs may be introduced into the environment or come into contact with humans or other organisms resulting in unexpected biological effects. It is thus prudent to have rapid and robust analytical metrology in place that can be used to critically assess and/or predict the cytotoxicity, as well as the potential genotoxicity of these ENMs. Many of the traditional genotoxicity test methods [e.g. unscheduled DNA synthesis assay, bacterial reverse mutation (Ames) test, etc.,] for determining the DNA damaging potential of chemical and biological compounds are not suitable for the evaluation of ENMs, due to a variety of methodological issues ranging from potential assay interferences to problems centered on low sample throughput. Recently, a number of sensitive, high-throughput genotoxicity assays/platforms (CometChip assay, flow cytometry/micronucleus assay, flow cytometry/γ-H2AX assay, automated 'Fluorimetric Detection of Alkaline DNA Unwinding' (FADU) assay, ToxTracker reporter assay) have been developed, based on substantial modifications and enhancements of traditional genotoxicity assays. These new assays have been used for the rapid measurement of DNA damage (strand breaks), chromosomal damage (micronuclei) and for detecting upregulated DNA damage signalling pathways resulting from ENM exposures. In this critical review, we describe and discuss the fundamental measurement principles and measurement endpoints of these new assays, as well as the modes of operation, analytical metrics and potential interferences, as applicable to ENM exposures. An unbiased discussion of the major technical advantages and limitations of each assay for evaluating and predicting the genotoxic potential of ENMs is also provided. Published by Oxford University Press on

  16. Novel method for the high-throughput processing of slides for the comet assay.

    Science.gov (United States)

    Karbaschi, Mahsa; Cooke, Marcus S

    2014-11-26

    Single cell gel electrophoresis (the comet assay) continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors to its application. "Scoring", or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides may be manipulated, and undergo electrophoresis, in batches of 25 rather than individually; importantly, the method retains the use of standard microscope comet slides, which are the assay convention. This decreases assay time by 60%, and benefits from an electrophoresis tank with a substantially smaller footprint, and more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed, decreases assay time, number of individual slide manipulations, reagent requirements and risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure.

  17. High-throughput Biological Cell Classification Featuring Real-time Optical Data Compression

    CERN Document Server

    Jalali, Bahram; Chen, Claire L

    2015-01-01

    High throughput real-time instruments are needed to acquire large data sets for detection and classification of rare events. Enabled by the photonic time stretch digitizer, a new class of instruments with record throughputs has led to the discovery of optical rogue waves [1], detection of rare cancer cells [2], and the highest analog-to-digital conversion performance ever achieved [3]. Featuring continuous operation at 100 million frames per second and a shutter speed of less than a nanosecond, the time stretch camera is ideally suited for screening of blood and other biological samples. It has enabled detection of breast cancer cells in blood with record, one-in-a-million, sensitivity [2]. Owing to their high real-time throughput, these instruments produce a torrent of data, equivalent to several 4K movies per second, that overwhelms data acquisition, storage, and processing operations. This predicament calls for technologies that compress images in the optical domain and in real time. An example of this, based on war...

  18. Recent advances in high-throughput QCL-based infrared microspectral imaging (Conference Presentation)

    Science.gov (United States)

    Rowlette, Jeremy A.; Fotheringham, Edeline; Nichols, David; Weida, Miles J.; Kane, Justin; Priest, Allen; Arnone, David B.; Bird, Benjamin; Chapman, William B.; Caffey, David B.; Larson, Paul; Day, Timothy

    2017-02-01

    The field of infrared spectral imaging and microscopy is advancing rapidly due in large measure to the recent commercialization of the first high-throughput, high-spatial-definition quantum cascade laser (QCL) microscope. Having speed, resolution and noise performance advantages while also eliminating the need for cryogenic cooling, its introduction has established a clear path to translating the well-established diagnostic capability of infrared spectroscopy into clinical and pre-clinical histology, cytology and hematology workflows. Demand for even higher throughput while maintaining high spectral fidelity and low-noise performance continues to drive innovation in QCL-based spectral imaging instrumentation. In this talk, we will present, for the first time, recent technological advances in tunable QCL photonics which have led to an additional 10X enhancement in spectral image data collection speed while preserving the high spectral fidelity and SNR exhibited by the first generation of QCL microscopes. This new approach continues to leverage the benefits of uncooled microbolometer focal plane array cameras, which we find to be essential for ensuring both reproducibility of data across instruments and achieving the high reliability needed in clinical applications. We will discuss the physics underlying these technological advancements as well as the new biomedical applications these advancements are enabling, including automated whole-slide infrared chemical imaging on clinically relevant timescales.

  19. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    Science.gov (United States)

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies have focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning
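
    A quick back-of-the-envelope check of the aggregate load implied by the figures quoted above can be useful when dimensioning such a deployment. The sketch below is plain arithmetic: station counts and per-station rates are taken from the abstract, while the nominal PHY rate used for the utilisation estimate is an assumed placeholder, not a value from the paper.

```python
# Aggregate offered load for the IEEE 802.11ah scenarios quoted in the abstract.
# Per-station rates and station counts come from the abstract; the nominal PHY
# rate below is an assumption used only to illustrate channel utilisation.

def aggregate_load_kbps(stations: int, rate_kbps: float) -> float:
    """Total offered load when every station streams at the same rate."""
    return stations * rate_kbps

scenarios = {
    "20 IP-cameras @ 160 kbps": aggregate_load_kbps(20, 160.0),
    "10 IP-cameras @ 255 kbps": aggregate_load_kbps(10, 255.0),
}

ASSUMED_PHY_RATE_KBPS = 7800.0  # hypothetical MCS/bandwidth choice

for name, load in scenarios.items():
    utilisation = 100.0 * load / ASSUMED_PHY_RATE_KBPS
    print(f"{name}: {load / 1000:.2f} Mbps (~{utilisation:.0f}% of the assumed PHY rate)")

# Large-scale sensing scenario: 6960 stations reporting once every 60 s.
print(f"sensing scenario: ~{6960 / 60:.0f} uplink reports per second on average")
```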

  20. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not well suited to DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow developers to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...
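
    The abstract contrasts low-level Infiniband Verbs programming with high-level message services such as 0MQ. The sketch below shows what that high-level style looks like in Python with pyzmq; it is a generic publish/subscribe example, not the NetIO API, and the endpoint, topic name and payload are illustrative.

```python
# Generic 0MQ publish/subscribe sketch (pyzmq), illustrating the high-level
# message-service style the abstract contrasts with low-level Infiniband Verbs.
# This is not NetIO's API; endpoint, topic and payload are illustrative only.
import zmq

def publisher(endpoint: str = "tcp://*:5556") -> None:
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind(endpoint)
    for event_id in range(10):
        # In a DAQ-like setting each message could carry one event fragment.
        pub.send_multipart([b"fragments", f"event-{event_id}".encode()])

def subscriber(endpoint: str = "tcp://localhost:5556") -> None:
    ctx = zmq.Context.instance()
    sub = ctx.socket(zmq.SUB)
    sub.connect(endpoint)
    sub.setsockopt(zmq.SUBSCRIBE, b"fragments")  # topic filter
    topic, payload = sub.recv_multipart()
    print(topic, payload)
```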

  1. Field high-throughput phenotyping: the new crop breeding frontier.

    Science.gov (United States)

    Araus, José Luis; Cairns, Jill E

    2014-01-01

    Constraints in field phenotyping capability limit our ability to dissect the genetics of quantitative traits, particularly those related to yield and stress tolerance (e.g., yield potential as well as increased drought, heat tolerance, and nutrient efficiency, etc.). The development of effective field-based high-throughput phenotyping platforms (HTPPs) remains a bottleneck for future breeding advances. However, progress in sensors, aeronautics, and high-performance computing is paving the way. Here, we review recent advances in field HTPPs, which should combine, at an affordable cost, a high capacity for data recording, scoring and processing with non-invasive remote sensing methods and automated environmental data collection. Laboratory analyses of key plant parts may complement direct phenotyping under field conditions. Improvements in user-friendly data management together with a more powerful interpretation of results should increase the use of field HTPPs, therefore increasing the efficiency of crop genetic improvement to meet the needs of future generations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Bifrost: A Python/C++ Framework for High-Throughput Stream Processing in Astronomy

    Science.gov (United States)

    Cranmer, Miles D.; Barsdell, Benjamin R.; Price, Danny C.; Dowell, Jayce; Garsden, Hugh; Dike, Veronica; Eftekhari, Tarraneh; Hegedus, Alexander M.; Malins, Joseph; Obenberger, Kenneth S.; Schinzel, Frank; Stovall, Kevin; Taylor, Gregory B.; Greenhill, Lincoln J.

    Radio astronomy observatories with high throughput back end instruments require real-time data processing. While computing hardware continues to advance rapidly, development of real-time processing pipelines remains difficult and time-consuming, which can limit scientific productivity. Motivated by this, we have developed Bifrost: an open-source software framework for rapid pipeline development. Bifrost combines a high-level Python interface with highly efficient reconfigurable data transport and a library of computing blocks for CPU and GPU processing. The framework is generalizable, but initially it emphasizes the needs of high-throughput radio astronomy pipelines, such as the ability to process data buffers as if they were continuous streams, the capacity to partition processing into distinct data sequences (e.g. separate observations), and the ability to extract specific intervals from buffered data. Computing blocks in the library are designed for applications such as interferometry, pulsar dedispersion and timing, and transient search pipelines. We describe the design and implementation of the Bifrost framework and demonstrate its use as the backbone in the correlation and beamforming back end of the Long Wavelength Array (LWA) station in the Sevilleta National Wildlife Refuge, NM.

  3. The complete automation of cell culture: improvements for high-throughput and high-content screening.

    Science.gov (United States)

    Jain, Shushant; Sondervan, David; Rizzu, Patrizia; Bochdanovits, Zoltan; Caminada, Daniel; Heutink, Peter

    2011-09-01

    Genomic approaches provide enormous amounts of raw data with regard to genetic variation, the diversity of RNA species, and protein complement. High-throughput (HT) and high-content (HC) cellular screens are ideally suited to contextualize the information gathered from other "omic" approaches into networks and can be used for the identification of therapeutic targets. Current methods used for HT-HC screens are laborious, time-consuming, and prone to human error. The authors thus developed an automated high-throughput system with an integrated fluorescent imager for HC screens called the AI.CELLHOST. The implementation of user-defined culturing and assay plate setup parameters allows parallel operation of multiple screens in diverse mammalian cell types. The authors demonstrate that such a system is able to successfully maintain different cell lines in culture for extended periods of time as well as significantly increasing throughput, accuracy, and reproducibility of HT and HC screens.

  4. High-throughput microfluidic device for single cell analysis using multiple integrated soft lithographic pumps.

    Science.gov (United States)

    Patabadige, Damith E W; Mickleburgh, Tom; Ferris, Lorin; Brummer, Gage; Culbertson, Anne H; Culbertson, Christopher T

    2016-05-01

    The ability to accurately control fluid transport in microfluidic devices is key for developing high-throughput methods for single cell analysis. Making small, reproducible changes to flow rates, however, to optimize lysis and injection using pumps external to the microfluidic device are challenging and time-consuming. To improve the throughput and increase the number of cells analyzed, we have integrated previously reported micropumps into a microfluidic device that can increase the cell analysis rate to ∼1000 cells/h and operate for over an hour continuously. In order to increase the flow rates sufficiently to handle cells at a higher throughput, three sets of pumps were multiplexed. These pumps are simple, low-cost, durable, easy to fabricate, and biocompatible. They provide precise control of the flow rate up to 9.2 nL/s. These devices were used to automatically transport, lyse, and electrophoretically separate T-Lymphocyte cells loaded with Oregon green and 6-carboxyfluorescein. Peak overlap statistics predicted the number of fully resolved single-cell electropherograms seen. In addition, there was no change in the average fluorescent dye peak areas indicating that the cells remained intact and the dyes did not leak out of the cells over the 1 h analysis time. The cell lysate peak area distribution followed that expected of an asynchronous steady-state population of immortalized cells. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research
       - PhD work on discovery of new allergens
       - Postdoctoral work on Transcriptional Start Sites
         a) Tag based technologies allow higher throughput
         b) CAGE technology to define promoters
         c) CAGE data analysis to understand Transcription
       - Wo

  6. Quantitative High-Throughput Screening Using a Coincidence Reporter Biocircuit.

    Science.gov (United States)

    Schuck, Brittany W; MacArthur, Ryan; Inglese, James

    2017-04-10

    Reporter-biased artifacts, i.e., compounds that interact directly with the reporter enzyme used in a high-throughput screening (HTS) assay and not with the biological process or pharmacology being interrogated, are now widely recognized to reduce the efficiency and quality of HTS used for chemical probe and therapeutic development. Furthermore, narrow or single-concentration HTS perpetuates false negatives during primary screening campaigns. Titration-based HTS, or quantitative HTS (qHTS), and coincidence reporter technology can be employed to reduce false negatives and false positives, respectively, thereby increasing the quality and efficiency of primary screening efforts, where the number of compounds investigated can range from tens of thousands to millions. The three protocols described here allow for generation of a coincidence reporter (CR) biocircuit to interrogate a biological or pharmacological question of interest, generation of a stable cell line expressing the CR biocircuit, and qHTS using the CR biocircuit to efficiently identify high-quality biologically active small molecules. © 2017 by John Wiley & Sons, Inc.
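
    Titration-based qHTS replaces a single test concentration with a full concentration-response series per compound, which is then summarized by a curve fit. The sketch below fits a four-parameter Hill curve to synthetic titration data to recover a potency (AC50) estimate; it is a generic illustration under assumed parameter values, not the protocol's actual analysis pipeline.

```python
# Generic concentration-response (Hill) fit of the kind that underlies qHTS.
# Synthetic data; parameter names and values are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, slope):
    """Four-parameter logistic (Hill) concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

conc = np.logspace(-9, -4, 11)                       # 1 nM .. 100 uM titration
rng = np.random.default_rng(0)
response = hill(conc, 0.0, 100.0, 1e-6, 1.2) + rng.normal(0.0, 3.0, conc.size)

(bottom, top, ac50, slope), _ = curve_fit(
    hill, conc, response, p0=[0.0, 100.0, 1e-6, 1.0], maxfev=10_000
)
print(f"AC50 ~ {ac50:.2e} M, efficacy ~ {top - bottom:.0f}%")
```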

  7. High-throughput literature mining to support read-across ...

    Science.gov (United States)

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing biological similarity. Many research efforts are underway such as structuring mechanistic data in adverse outcome pathways and investigating the utility of high throughput (HT)/high content (HC) screening data. A largely untapped resource for read-across to date is the biomedical literature. This information has the potential to support read-across by facilitating the identification of valid source analogues with similar biological and toxicological profiles as well as providing the mechanistic understanding for any prediction made. A key challenge in using biomedical literature is to convert and translate its unstructured form into a computable format that can be linked to chemical structure. We developed a novel text-mining strategy to represent literature information for read across. Keywords were used to organize literature into toxicity signatures at the chemical level. These signatures were integrated with HT in vitro data and curated chemical structures. A rule-based algorithm assessed the strength of the literature relationship, providing a mechanism to rank and visualize the signature as literature ToxPIs (LitToxPIs). LitToxPIs were developed for over 6,000 chemicals for a varie

  8. High Throughput Heuristics for Prioritizing Human Exposure to ...

    Science.gov (United States)

    The risk posed to human health by any of the thousands of untested anthropogenic chemicals in our environment is a function of both the potential hazard presented by the chemical, and the possibility of being exposed. Without the capacity to make quantitative, albeit uncertain, forecasts of exposure, the putative risk of adverse health effect from a chemical cannot be evaluated. We used Bayesian methodology to infer ranges of exposure intakes that are consistent with biomarkers of chemical exposures identified in urine samples from the U.S. population by the National Health and Nutrition Examination Survey (NHANES). We perform linear regression on inferred exposure for demographic subsets of NHANES demarked by age, gender, and weight using high throughput chemical descriptors gleaned from databases and chemical structure-based calculators. We find that five of these descriptors are capable of explaining roughly 50% of the variability across chemicals for all the demographic groups examined, including children aged 6-11. For the thousands of chemicals with no other source of information, this approach allows rapid and efficient prediction of average exposure intake of environmental chemicals. The methods described by this manuscript provide a highly improved methodology for HTS of human exposure to environmental chemicals. The manuscript includes a ranking of 7785 environmental chemicals with respect to potential human exposure, including most of the Tox21 in vit
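
    The descriptor-based regression described above can be sketched with ordinary least squares. Everything in the sketch below is a synthetic placeholder: the descriptor values, the noise level and the resulting R-squared are chosen only to mimic the "roughly 50% of variability" figure, not the NHANES-derived inferences.

```python
# Sketch of regressing (log) inferred exposure on a handful of high-throughput
# chemical descriptors and reporting variance explained. All data here are
# synthetic placeholders, not the NHANES-based inferences from the abstract.
import numpy as np

rng = np.random.default_rng(1)
n_chemicals = 500
X = rng.normal(size=(n_chemicals, 5))            # 5 hypothetical descriptors
beta_true = np.array([0.8, 0.5, -0.3, 0.2, 0.1])
y = X @ beta_true + rng.normal(scale=1.0, size=n_chemicals)   # log10 intake

X1 = np.column_stack([np.ones(n_chemicals), X])  # add intercept column
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)
residuals = y - X1 @ beta_hat
r_squared = 1.0 - residuals.var() / y.var()
print(f"R^2 = {r_squared:.2f}")                  # ~0.5 with this noise level
```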

  9. Efficient Management of High-Throughput Screening Libraries with SAVANAH.

    Science.gov (United States)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen; Christiansen, Helle; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2017-02-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such screens is molecular libraries, that is, microtiter plates with solubilized reagents such as siRNAs, shRNAs, miRNA inhibitors or mimics, and sgRNAs, or small compounds, that is, drugs. These reagents are typically condensed to provide enough material for covering several screens. Library plates thus need to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links sample information from the library to experimental results from the assay plates. All results can be exported to the R statistical environment or piped into HiTSeekR (http://hitseekr.compbio.sdu.dk) for comprehensive follow-up analyses. In summary, SAVANAH supports the HTS community in managing and analyzing HTS experiments with an emphasis on serially diluted molecular libraries.
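
    The bookkeeping burden described above comes from tracking which assay plate corresponds to which dilution of which library plate. A minimal sketch of that arithmetic is shown below; the plate identifiers, stock concentration and dilution factor are illustrative assumptions, not SAVANAH's actual data model.

```python
# Minimal dilution-series bookkeeping sketch: concentrations of daughter
# (assay) plates derived from a condensed library plate. Plate names, stock
# concentration and dilution factor are illustrative assumptions.
def dilution_series(stock_um: float, factor: float, steps: int) -> list[float]:
    """Concentrations (uM) after successive 1:factor dilutions of the stock."""
    return [stock_um / factor ** i for i in range(1, steps + 1)]

library_plate = "LIB-0042"  # hypothetical condensed master plate
for i, conc in enumerate(dilution_series(stock_um=10_000.0, factor=10.0, steps=3), 1):
    assay_plate = f"{library_plate}-D{i}"
    print(f"{assay_plate}: {conc:g} uM (derived from {library_plate})")
```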

  10. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers.

    Science.gov (United States)

    Yi, Yunhai; You, Xinxin; Bian, Chao; Chen, Shixi; Lv, Zhao; Qiu, Limei; Shi, Qiong

    2017-11-22

    Widespread existence of antimicrobial peptides (AMPs) has been reported in various animals with comprehensive biological activities, which is consistent with the important roles of AMPs as the first line of host defense system. However, no big-data-based analysis on AMPs from any fish species is available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic with less time out of water, while the latter is primarily terrestrial with extended periods of time on land. Within these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically overlapped between BP and PM; 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with high inhibitions on Micrococcus luteus. In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for ongoing development of marine drugs.

  11. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers

    Directory of Open Access Journals (Sweden)

    Yunhai Yi

    2017-11-01

    Full Text Available Widespread existence of antimicrobial peptides (AMPs) has been reported in various animals with comprehensive biological activities, which is consistent with the important roles of AMPs as the first line of host defense system. However, no big-data-based analysis on AMPs from any fish species is available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic with less time out of water, while the latter is primarily terrestrial with extended periods of time on land. Within these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically overlapped between BP and PM; 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with high inhibitions on Micrococcus luteus. In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for ongoing development of marine drugs.

  12. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    Directory of Open Access Journals (Sweden)

    Julio Alonso-Padilla

    2014-12-01

    Full Text Available The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti-T. cruzi drug entities in the near future, are reviewed here.

  13. SNP-PHAGE – High throughput SNP discovery pipeline

    Directory of Open Access Journals (Sweden)

    Cregan Perry B

    2006-10-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) as defined here are single base sequence changes or short insertions/deletions between or within individuals of a given species. As a result of their abundance and the availability of high throughput analysis technologies, SNP markers have begun to replace other traditional markers such as restriction fragment length polymorphisms (RFLPs), amplified fragment length polymorphisms (AFLPs) and simple sequence repeats (SSRs) or microsatellite markers for fine mapping and association studies in several species. For SNP discovery from chromatogram data, several bioinformatics programs have to be combined to generate an analysis pipeline. Results have to be stored in a relational database to facilitate interrogation through queries or to generate data for further analyses such as determination of linkage disequilibrium and identification of common haplotypes. Although these tasks are routinely performed by several groups, an integrated open source SNP discovery pipeline that can be easily adapted by new groups interested in SNP marker development is currently unavailable. Results We developed SNP-PHAGE (SNP discovery Pipeline with additional features for identification of common haplotypes within a sequence tagged site (Haplotype Analysis) and GenBank (dbSNP) submissions). This tool was applied for analyzing sequence traces from diverse soybean genotypes to discover over 10,000 SNPs. This package was developed on a UNIX/Linux platform, written in Perl and uses a MySQL database. Scripts to generate a user-friendly web interface are also provided with common queries for preliminary data analysis. A machine learning tool developed by this group for increasing the efficiency of SNP discovery is integrated as a part of this package as an optional feature. The SNP-PHAGE package is being made available open source at http://bfgl.anri.barc.usda.gov/ML/snp-phage/. Conclusion SNP-PHAGE provides a bioinformatics

  14. The Complete Automation of Cell Culture: Improvements for High-Throughput and High-Content Screening

    NARCIS (Netherlands)

    Jain, S.; Sondervan, D.; Rizzu, P.; Bochdanovits, Z.; Caminada, D.; Heutink, P.

    2011-01-01

    Genomic approaches provide enormous amounts of raw data with regard to genetic variation, the diversity of RNA species, and protein complement. High-throughput (HT) and high-content (HC) cellular screens are ideally suited to contextualize the information gathered from other "omic" approaches into

  15. Performance of high-throughput DNA quantification methods

    Directory of Open Access Journals (Sweden)

    Chanock Stephen J

    2003-10-01

    Full Text Available Abstract Background The accuracy and precision of estimates of DNA concentration are critical factors for efficient use of DNA samples in high-throughput genotype and sequence analyses. We evaluated the performance of spectrophotometric (OD) DNA quantification, and compared it to two fluorometric quantification methods, the PicoGreen® assay (PG), and a novel real-time quantitative genomic PCR assay (QG) specific to a region at the human BRCA1 locus. Twenty-two lymphoblastoid cell line DNA samples with an initial concentration of ~350 ng/uL were diluted to 20 ng/uL. DNA concentration was estimated by OD and further diluted to 5 ng/uL. The concentrations of multiple aliquots of the final dilution were measured by the OD, QG and PG methods. The effects of manual and robotic laboratory sample handling procedures on the estimates of DNA concentration were assessed using variance components analyses. Results The OD method was the DNA quantification method most concordant with the reference sample among the three methods evaluated. A large fraction of the total variance for all three methods (36.0–95.7%) was explained by sample-to-sample variation, whereas the amount of variance attributable to sample handling was small (0.8–17.5%). Residual error (3.2–59.4%), corresponding to un-modelled factors, contributed more to the total variation than the sample handling procedures. Conclusion The application of a specific DNA quantification method to a particular molecular genetic laboratory protocol must take into account the accuracy and precision of the specific method, as well as the requirements of the experimental workflow with respect to sample volumes and throughput. While OD was the most concordant and precise DNA quantification method in this study, the information provided by the quantitative PCR assay regarding the suitability of DNA samples for PCR may be an essential factor for some protocols, despite the decreased concordance and
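
    The variance-components idea above can be illustrated with a one-way random-effects decomposition: replicate aliquot measurements per DNA sample are split into a sample-to-sample component and a residual component. The sketch below uses synthetic concentrations and a deliberately simplified design (the study itself modelled sample handling as an additional factor).

```python
# One-way random-effects variance decomposition on synthetic replicate
# concentration measurements: between-sample vs. residual variance.
# A simplification of the study's variance-components analysis.
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_reps = 22, 4
true_means = rng.normal(5.0, 0.8, n_samples)                    # ng/uL per sample
data = true_means[:, None] + rng.normal(0.0, 0.3, (n_samples, n_reps))

ms_within = data.var(axis=1, ddof=1).mean()                     # residual MS
ms_between = n_reps * data.mean(axis=1).var(ddof=1)             # between-sample MS
sigma2_between = max((ms_between - ms_within) / n_reps, 0.0)
total = sigma2_between + ms_within

print(f"sample-to-sample: {100 * sigma2_between / total:.0f}% of total variance")
print(f"residual (handling + measurement): {100 * ms_within / total:.0f}%")
```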

  16. High throughput comet assay to study genotoxicity of nanomaterials

    Directory of Open Access Journals (Sweden)

    Naouale El Yamani

    2015-06-01

    Full Text Available The unique physicochemical properties of engineered nanomaterials (NMs) have accelerated their use in diverse industrial and domestic products. Although their presence in consumer products represents a major concern for public health safety, their potential impact on human health is poorly understood. There is therefore an urgent need to clarify the toxic effects of NMs and to elucidate the mechanisms involved. In view of the large number of NMs currently being used, high throughput (HTP) screening technologies are clearly needed for efficient assessment of toxicity. The comet assay is the most used method in nanogenotoxicity studies and has great potential for increasing throughput as it is fast, versatile and robust; simple technical modifications of the assay make it possible to test many compounds (NMs) in a single experiment. The standard gel of 70-100 μL contains thousands of cells, of which only a tiny fraction are actually scored. Reducing the gel to a volume of 5 μL, with just a few hundred cells, allows twelve gels to be set on a standard slide, or 96 as a standard 8x12 array. For the 12 gel format, standard slides precoated with agarose are placed on a metal template and gels are set on the positions marked on the template. The HTP comet assay, incorporating digestion of DNA with formamidopyrimidine DNA glycosylase (FPG) to detect oxidised purines, has recently been applied to study the potential induction of genotoxicity by NMs via reactive oxygen. In the NanoTEST project we investigated the genotoxic potential of several well-characterized metal and polymeric nanoparticles with the comet assay. All in vitro studies were harmonized; i.e. NMs were from the same batch, and identical dispersion protocols, exposure time, concentration range, culture conditions, and time-courses were used. As a kidney model, Cos-1 fibroblast-like kidney cells were treated with different concentrations of iron oxide NMs, and cells embedded in minigels (12

  17. Recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers

    DEFF Research Database (Denmark)

    Andersen, Lars Dyrskjøt; Zieger, Karsten; Ørntoft, Torben Falck

    2007-01-01

    individually contributed to the management of the disease. However, the development of high-throughput techniques for simultaneous assessment of a large number of markers has allowed classification of tumors into clinically relevant molecular subgroups beyond those possible by pathological classification. Here......, we review the recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers....

  18. High-throughput and computational approaches for diagnostic and prognostic host tuberculosis biomarkers

    Directory of Open Access Journals (Sweden)

    January Weiner

    2017-03-01

    Full Text Available High-throughput techniques strive to identify new biomarkers that will be useful for the diagnosis, treatment, and prevention of tuberculosis (TB). However, their analysis and interpretation pose considerable challenges. Recent developments in the high-throughput detection of host biomarkers in TB are reported in this review.

  19. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  20. A high-throughput Arabidopsis reverse genetics system.

    Science.gov (United States)

    Sessions, Allen; Burke, Ellen; Presting, Gernot; Aux, George; McElver, John; Patton, David; Dietrich, Bob; Ho, Patrick; Bacwaden, Johana; Ko, Cynthia; Clarke, Joseph D; Cotton, David; Bullis, David; Snell, Jennifer; Miguel, Trini; Hutchison, Don; Kimmerly, Bill; Mitzel, Theresa; Katagiri, Fumiaki; Glazebrook, Jane; Law, Marc; Goff, Stephen A

    2002-12-01

    A collection of Arabidopsis lines with T-DNA insertions in known sites was generated to increase the efficiency of functional genomics. A high-throughput modified thermal asymmetric interlaced (TAIL)-PCR protocol was developed and used to amplify DNA fragments flanking the T-DNA left borders from approximately 100000 transformed lines. A total of 85108 TAIL-PCR products from 52964 T-DNA lines were sequenced and compared with the Arabidopsis genome to determine the positions of T-DNAs in each line. Predicted T-DNA insertion sites, when mapped, showed a bias against predicted coding sequences. Predicted insertion mutations in genes of interest can be identified using Arabidopsis Gene Index name searches or by BLAST (Basic Local Alignment Search Tool) search. Insertions can be confirmed by simple PCR assays on individual lines. Predicted insertions were confirmed in 257 of 340 lines tested (76%). This resource has been named SAIL (Syngenta Arabidopsis Insertion Library) and is available to the scientific community at www.tmri.org.

  1. Use of High Throughput Screening Data in IARC Monograph ...

    Science.gov (United States)

    Purpose: Evaluation of carcinogenic mechanisms serves a critical role in IARC monograph evaluations, and can lead to “upgrade” or “downgrade” of the carcinogenicity conclusions based on human and animal evidence alone. Three recent IARC monograph Working Groups (110, 112, and 113) pioneered analysis of high throughput in vitro screening data from the U.S. Environmental Protection Agency’s ToxCast program in evaluations of carcinogenic mechanisms. Methods: For monograph 110, ToxCast assay data across multiple nuclear receptors were used to test the hypothesis that PFOA acts exclusively through the PPAR family of receptors, with activity profiles compared to several prototypical nuclear receptor-activating compounds. For monographs 112 and 113, ToxCast assays were systematically evaluated and used as an additional data stream in the overall evaluation of the mechanistic evidence. Specifically, ToxCast assays were mapped to 10 “key characteristics of carcinogens” recently identified by an IARC expert group, and chemicals’ bioactivity profiles were evaluated both in absolute terms (number of relevant assays positive for bioactivity) and relative terms (ranking with respect to other compounds evaluated by IARC, using the ToxPi methodology). Results: PFOA activates multiple nuclear receptors in addition to the PPAR family in the ToxCast assays. ToxCast assays offered substantial coverage for 5 of the 10 “key characteristics,” with the greates

  2. High-throughput optical screening of cellular mechanotransduction

    Science.gov (United States)

    Compton, Jonathan L.; Luo, Justin C.; Ma, Huan; Botvinick, Elliot; Venugopalan, Vasan

    2014-09-01

    We introduce an optical platform for rapid, high-throughput screening of exogenous molecules that affect cellular mechanotransduction. Our method initiates mechanotransduction in adherent cells using single laser-microbeam generated microcavitation bubbles without requiring flow chambers or microfluidics. These microcavitation bubbles expose adherent cells to a microtsunami, a transient microscale burst of hydrodynamic shear stress, which stimulates cells over areas approaching 1 mm2. We demonstrate microtsunami-initiated mechanosignalling in primary human endothelial cells. This observed signalling is consistent with G-protein-coupled receptor stimulation, resulting in Ca2+ release by the endoplasmic reticulum. Moreover, we demonstrate the dose-dependent modulation of microtsunami-induced Ca2+ signalling by introducing a known inhibitor to this pathway. The imaging of Ca2+ signalling and its modulation by exogenous molecules demonstrates the capacity to initiate and assess cellular mechanosignalling in real time. We utilize this capability to screen the effects of a set of small molecules on cellular mechanotransduction in 96-well plates using standard imaging cytometry.

  3. Strategies for high-throughput gene cloning and expression.

    Science.gov (United States)

    Dieckman, L J; Hanly, W C; Collart, E R

    2006-01-01

    High-throughput approaches for gene cloning and expression require the development of new, nonstandard tools for use by molecular biologists and biochemists. We have developed and implemented a series of methods that enable the production of expression constructs in 96-well plate format. A screening process is described that facilitates the identification of bacterial clones expressing soluble protein. Application of the solubility screen then provides a plate map that identifies the location of wells containing clones producing soluble proteins. A series of semi-automated methods can then be applied for validation of solubility and production of freezer stocks for the protein production group. This process provides an 80% success rate for the identification of clones producing soluble protein and results in a significant decrease in the level of effort required for the labor-intensive components of validation and preparation of freezer stocks. This process is customized for large-scale structural genomics programs that rely on the production of large amounts of soluble proteins for crystallization trials.

  4. A high-throughput biliverdin assay using infrared fluorescence.

    Science.gov (United States)

    Berlec, Aleš; Štrukelj, Borut

    2014-07-01

    Biliverdin is an intermediate of heme degradation with an established role in veterinary clinical diagnostics of liver-related diseases. The need for chromatographic assays has so far prevented its wider use in diagnostic laboratories. The current report describes a simple, fast, high-throughput, and inexpensive assay, based on the interaction of biliverdin with infrared fluorescent protein (iRFP) that yields functional protein exhibiting infrared fluorescence. The assay is linear in the range of 0-10 µmol/l of biliverdin, has a limit of detection of 0.02 μmol/l, and has a limit of quantification of 0.03 µmol/l. The assay is accurate with relative error less than 0.15, and precise, with coefficient of variation less than 5% in the concentration range of 2-9 µmol/l of biliverdin. More than 95% of biliverdin was recovered from biological samples by simple dimethyl sulfoxide extraction. There was almost no interference by hemin, although bilirubin caused an increase in the biliverdin concentration, probably due to spontaneous oxidation of bilirubin to biliverdin. The newly developed biliverdin assay is appropriate for reliable quantification of large numbers of samples in veterinary medicine.
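
    One common convention for deriving detection and quantification limits from a calibration line (the ICH-style 3.3σ and 10σ rules) is sketched below on synthetic data. The abstract does not state which convention the authors used, so this sketch will not reproduce the reported 0.02 and 0.03 µmol/l values; it only shows how such limits can be computed from a linear fit.

```python
# ICH-style limits from a calibration line: LOD = 3.3*sigma/slope,
# LOQ = 10*sigma/slope, with sigma the residual SD of the linear fit.
# Synthetic calibration data; not the paper's actual calculation.
import numpy as np

standards_um = np.linspace(0.0, 10.0, 11)          # biliverdin standards (umol/L)
rng = np.random.default_rng(3)
signal = 120.0 * standards_um + 15.0 + rng.normal(0.0, 2.0, standards_um.size)

slope, intercept = np.polyfit(standards_um, signal, 1)
residual_sd = (signal - (slope * standards_um + intercept)).std(ddof=2)

print(f"LOD ~ {3.3 * residual_sd / slope:.3f} umol/L")
print(f"LOQ ~ {10.0 * residual_sd / slope:.3f} umol/L")
```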

  5. Functional approach to high-throughput plant growth analysis

    Science.gov (United States)

    2013-01-01

    Method Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which produces a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, by taking the complex leaf overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA to cfq mutant plants under fluctuating light reveals the correlation between low photosynthetic rates and small plant area (compared to wild type), which raises the hypothesis that knocking out cfq changes the sensitivity of the energy distribution under fluctuating light conditions to repress leaf growth. Availability HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437
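
    The growth-modeling step (GMA) fits a nonlinear model to each plant's area time series. A minimal sketch with a logistic curve and synthetic measurements is shown below; the model choice, parameter values and units are assumptions for illustration, and the functional data analysis stage is omitted.

```python
# Logistic growth-curve fit to a synthetic per-plant area time series, as a
# stand-in for the nonlinear growth modeling step. Values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, area_max, rate, t_mid):
    """Logistic growth: area_max / (1 + exp(-rate * (t - t_mid)))."""
    return area_max / (1.0 + np.exp(-rate * (t - t_mid)))

days = np.arange(0, 21, dtype=float)                     # days after sowing
rng = np.random.default_rng(4)
area = logistic(days, 12.0, 0.45, 10.0) + rng.normal(0.0, 0.3, days.size)  # cm^2

(area_max, rate, t_mid), _ = curve_fit(logistic, days, area, p0=[10.0, 0.3, 9.0])
print(f"fitted max area ~ {area_max:.1f} cm^2, growth rate ~ {rate:.2f} per day")
```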

  6. Mouse eye enucleation for remote high-throughput phenotyping.

    Science.gov (United States)

    Mahajan, Vinit B; Skeie, Jessica M; Assefnia, Amir H; Mahajan, Maryann; Tsang, Stephen H

    2011-11-19

    The mouse eye is an important genetic model for the translational study of human ophthalmic disease. Blinding diseases in humans, such as macular degeneration, photoreceptor degeneration, cataract, glaucoma, retinoblastoma, and diabetic retinopathy have been recapitulated in transgenic mice.(1-5) Most transgenic and knockout mice have been generated by laboratories to study non-ophthalmic diseases, but genetic conservation between organ systems suggests that many of the same genes may also play a role in ocular development and disease. Hence, these mice represent an important resource for discovering new genotype-phenotype correlations in the eye. Because these mice are scattered across the globe, it is difficult to acquire, maintain, and phenotype them in an efficient, cost-effective manner. Thus, most high-throughput ophthalmic phenotyping screens are restricted to a few locations that require on-site, ophthalmic expertise to examine eyes in live mice. (6-9) An alternative approach developed by our laboratory is a method for remote tissue-acquisition that can be used in large or small-scale surveys of transgenic mouse eyes. Standardized procedures for video-based surgical skill transfer, tissue fixation, and shipping allow any lab to collect whole eyes from mutant animals and send them for molecular and morphological phenotyping. In this video article, we present techniques to enucleate and transfer both unfixed and perfusion fixed mouse eyes for remote phenotyping analyses.

  7. High Throughput Multispectral Image Processing with Applications in Food Science.

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low-cost information extraction and faster quality assessment, without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we demonstrate its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
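
    To illustrate the Gaussian-mixture segmentation step named above, the sketch below clusters the pixels of a synthetic multispectral cube into two spectral components and keeps the brighter one as the sample mask. The band count, cube contents and two-component choice are assumptions; the published band-selection scheme is not reproduced.

```python
# GMM segmentation of a synthetic multispectral cube in spectral space.
# Illustrative only; band selection and downstream analysis are omitted.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
h, w, bands = 64, 64, 18
cube = rng.normal(0.2, 0.05, (h, w, bands))
cube[16:48, 16:48, :] += 0.4                    # brighter "sample" region

pixels = cube.reshape(-1, bands)                # one spectrum per pixel
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
labels = gmm.predict(pixels).reshape(h, w)

sample_label = int(np.argmax(gmm.means_.mean(axis=1)))  # brighter component
mask = labels == sample_label
print(f"segmented sample pixels: {mask.sum()} of {h * w}")
```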

  8. High-throughput membrane surface modification to control NOM fouling.

    Science.gov (United States)

    Zhou, Mingyan; Liu, Hongwei; Kilduff, James E; Langer, Robert; Anderson, Daniel G; Belfort, Georges

    2009-05-15

    A novel method for synthesis and screening of fouling-resistant membrane surfaces was developed by combining a high-throughput platform (HTP) approach with photoinduced graft polymerization (PGP) for facile modification of commercial poly(aryl sulfone) membranes. This method is an inexpensive, fast, simple, reproducible, and scalable approach to identify fouling-resistant surfaces appropriate for a specific feed. In this research, natural organic matter (NOM)-resistant surfaces were synthesized and identified from a library of 66 monomers. Surfaces were prepared via graft polymerization onto poly(ether sulfone) (PES) membranes and were evaluated using an assay involving NOM adsorption, followed by pressure-driven filtration. In this work new and previously tested low-fouling surfaces for NOM are identified, and their ability to mitigate NOM and protein (bovine serum albumin) fouling is compared. The best-performing monomers were the zwitterion [2-(methacryloyloxy)ethyl]dimethyl-(3-sulfopropyl)ammonium hydroxide, and diacetone acrylamide, a neutral monomer containing an amide group. Other excellent surfaces were synthesized from amides, amines, basic monomers, and long-chain poly(ethylene) glycols. Bench-scale studies conducted for selected monomers verified the scalability of HTP-PGP results. The results and the synthesis and screening method presented here offer new opportunities for choosing new membrane chemistries that minimize NOM fouling.

  9. Assessing the utility and limitations of high throughput virtual screening

    Directory of Open Access Journals (Sweden)

    Paul Daniel Phillips

    2016-05-01

    Full Text Available Due to low cost, speed, and unmatched ability to explore large numbers of compounds, high throughput virtual screening and molecular docking engines have become widely utilized by computational scientists. It is generally accepted that docking engines, such as AutoDock, produce reliable qualitative results for ligand-macromolecular receptor binding, and molecular docking results are commonly reported in the literature in the absence of complementary wet lab experimental data. In this investigation, three variants of the sixteen amino acid peptide, α-conotoxin MII, were docked to a homology model of the α3β2-nicotinic acetylcholine receptor. DockoMatic version 2.0 was used to perform a virtual screen of each peptide ligand to the receptor for ten docking trials consisting of 100 AutoDock cycles per trial. The results were analyzed for both variation in the calculated binding energy obtained from AutoDock, and the orientation of bound peptide within the receptor. The results show that, while no clear correlation exists between consistent ligand binding pose and the calculated binding energy, AutoDock is able to determine a consistent positioning of bound peptide in the majority of trials when at least ten trials were evaluated.

  10. The JCSG high-throughput structural biology pipeline.

    Science.gov (United States)

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  11. Generation of RNAi Libraries for High-Throughput Screens

    Directory of Open Access Journals (Sweden)

    Julie Clark

    2006-01-01

    Full Text Available The completion of the genome sequencing for several organisms has created a great demand for genomic tools that can systematically analyze the growing wealth of data. In contrast to the classical reverse genetics approach of creating specific knockout cell lines or animals that is time-consuming and expensive, RNA-mediated interference (RNAi) has emerged as a fast, simple, and cost-effective technique for gene knockdown in large scale. Since its discovery as a gene silencing response to double-stranded RNA (dsRNA) with homology to endogenous genes in Caenorhabditis elegans (C. elegans), RNAi technology has been adapted to various high-throughput screens (HTS) for genome-wide loss-of-function (LOF) analysis. Biochemical insights into the endogenous mechanism of RNAi have led to advances in RNAi methodology including RNAi molecule synthesis, delivery, and sequence design. In this article, we will briefly review these various RNAi library designs and discuss the benefits and drawbacks of each library strategy.

  12. The Utilization of Formalin Fixed-Paraffin-Embedded Specimens in High Throughput Genomic Studies

    Directory of Open Access Journals (Sweden)

    Pan Zhang

    2017-01-01

    Full Text Available High throughput genomic assays empower us to study the entire human genome in a short time at reasonable cost. Formalin fixed-paraffin-embedded (FFPE) tissue processing remains the most economical approach for longitudinal tissue specimen storage. Therefore, the ability to apply high throughput genomic applications to FFPE specimens can expand clinical assays and discovery. Many studies have measured the accuracy and repeatability of data generated from FFPE specimens using high throughput genomic assays. Together, these studies demonstrate feasibility and provide crucial guidance for future studies using FFPE specimens. Here, we summarize the findings of these studies and discuss the limitations of high throughput data generated from FFPE specimens across several platforms that include microarray, high throughput sequencing, and NanoString.

  13. High-throughput flow cytometry data normalization for clinical trials.

    Science.gov (United States)

    Finak, Greg; Jiang, Wenxin; Krouse, Kevin; Wei, Chungwen; Sanz, Ignacio; Phippard, Deborah; Asare, Adam; De Rosa, Stephen C; Self, Steve; Gottardo, Raphael

    2014-03-01

    Flow cytometry studies in clinical trials generate very large datasets and are usually highly standardized, focusing on endpoints that are well defined a priori. Staining variability of individual markers is not uncommon and complicates manual gating, requiring the analyst to adapt gates for each sample, which is unwieldy for large datasets. It can lead to unreliable measurements, especially if a template-gating approach is used without further correction to the gates. In this article, a computational framework is presented for normalizing the fluorescence intensity of multiple markers in specific cell populations across samples that is suitable for high-throughput processing of large clinical trial datasets. Previous approaches to normalization have been global and applied to all cells or data with debris removed. They provided no mechanism to handle specific cell subsets. This approach integrates tightly with the gating process so that normalization is performed during gating and is local to the specific cell subsets exhibiting variability. This improves peak alignment and the performance of the algorithm. The performance of this algorithm is demonstrated on two clinical trial datasets from the HIV Vaccine Trials Network (HVTN) and the Immune Tolerance Network (ITN). In the ITN data set we show that local normalization combined with template gating can account for sample-to-sample variability as effectively as manual gating. In the HVTN dataset, it is shown that local normalization mitigates false-positive vaccine response calls in an intracellular cytokine staining assay. In both datasets, local normalization performs better than global normalization. The normalization framework allows the use of template gates even in the presence of sample-to-sample staining variability, mitigates the subjectivity and bias of manual gating, and decreases the time necessary to analyze large datasets. © 2013 International Society for Advancement of Cytometry.
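
    The core of such per-population normalization is landmark (peak) alignment: estimate the density peak of each sample within a gated subset and shift the sample so its peak matches a reference. A much-simplified sketch on synthetic intensities is shown below; the published method works inside the gating hierarchy and handles multiple peaks per marker.

```python
# Simplified landmark-alignment sketch: shift each sample so its density peak
# matches a reference peak. Synthetic, arcsinh-like scaled intensities.
import numpy as np

def density_peak(values: np.ndarray, bins: int = 256) -> float:
    """Location of the histogram mode, used as a crude landmark."""
    counts, edges = np.histogram(values, bins=bins)
    i = int(np.argmax(counts))
    return 0.5 * (edges[i] + edges[i + 1])

rng = np.random.default_rng(6)
reference = rng.normal(3.0, 0.4, 20_000)
samples = [rng.normal(3.0 + shift, 0.4, 20_000) for shift in (-0.5, 0.2, 0.7)]

ref_peak = density_peak(reference)
normalized = [s + (ref_peak - density_peak(s)) for s in samples]
print([round(density_peak(s) - ref_peak, 2) for s in normalized])  # ~0 after shift
```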

  14. Mining Chemical Activity Status from High-Throughput Screening Assays.

    Science.gov (United States)

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B

    2015-01-01

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modification of a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we performed a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE has performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.
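
    The class-imbalance problem mentioned above (few actives among many inactives) is commonly tackled by minority oversampling. The sketch below shows the generic SMOTE-like idea of interpolating between active compounds in descriptor space; it is not DRAMOTE's modified scheme, and all data and dimensions are synthetic.

```python
# Generic SMOTE-like minority oversampling for an imbalanced HTS training set:
# synthesize new "active" points by interpolating between existing actives.
# Not DRAMOTE's modified technique; data and dimensions are synthetic.
import numpy as np

def oversample_minority(X_min: np.ndarray, n_new: int, rng) -> np.ndarray:
    """Create n_new synthetic minority points by pairwise interpolation."""
    synthetic = np.empty((n_new, X_min.shape[1]))
    for k in range(n_new):
        i, j = rng.choice(len(X_min), size=2, replace=False)
        lam = rng.random()
        synthetic[k] = X_min[i] + lam * (X_min[j] - X_min[i])
    return synthetic

rng = np.random.default_rng(7)
X_active = rng.normal(1.0, 0.5, (50, 8))       # few actives, 8 descriptors
X_inactive = rng.normal(0.0, 0.5, (5000, 8))   # many inactives

X_new = oversample_minority(X_active, len(X_inactive) - len(X_active), rng)
X_balanced = np.vstack([X_inactive, X_active, X_new])
y_balanced = np.r_[np.zeros(len(X_inactive)), np.ones(len(X_active) + len(X_new))]
print(X_balanced.shape, f"active fraction after oversampling: {y_balanced.mean():.2f}")
```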

  15. Mining Chemical Activity Status from High-Throughput Screening Assays

    KAUST Repository

    Soufan, Othman

    2015-12-14

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modification of a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we performed a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE has performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.

  16. Applications of Biophysics in High-Throughput Screening Hit Validation.

    Science.gov (United States)

    Genick, Christine Clougherty; Barlier, Danielle; Monna, Dominique; Brunner, Reto; Bé, Céline; Scheufler, Clemens; Ottl, Johannes

    2014-06-01

    For approximately a decade, biophysical methods have been used to validate positive hits selected from high-throughput screening (HTS) campaigns with the goal to verify binding interactions using label-free assays. By applying label-free readouts, screen artifacts created by compound interference and fluorescence are discovered, enabling further characterization of the hits for their target specificity and selectivity. The use of several biophysical methods to extract this type of high-content information is required to prevent the promotion of false positives to the next level of hit validation and to select the best candidates for further chemical optimization. The typical technologies applied in this arena include dynamic light scattering, turbidometry, resonance waveguide, surface plasmon resonance, differential scanning fluorimetry, mass spectrometry, and others. Each technology can provide different types of information to enable the characterization of the binding interaction. Thus, these technologies can be incorporated in a hit-validation strategy not only according to the profile of chemical matter that is desired by the medicinal chemists, but also in a manner that is in agreement with the target protein's amenability to the screening format. Here, we present the results of screening strategies using biophysics with the objective to evaluate the approaches, discuss the advantages and challenges, and summarize the benefits in reference to lead discovery. In summary, the biophysics screens presented here demonstrated various hit rates from a list of ~2000 preselected, IC50-validated hits from HTS (an IC50 is the inhibitor concentration at which 50% inhibition of activity is observed). There are several lessons learned from these biophysical screens, which will be discussed in this article. © 2014 Society for Laboratory Automation and Screening.

  17. High-Throughput Neuroimaging-Genetics Computational Infrastructure

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2014-04-01

    Full Text Available Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate and disseminate novel scientific methods, computational resources and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval and aggregation. Computational processing involves the necessary software, hardware and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical and phenotypic data and meta-data. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services. These pipeline workflows are represented as portable XML objects which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer’s and Parkinson’s data, we provide several examples of translational applications using this infrastructure.

  18. Throughput Maximization for Sensor-Aided Cognitive Radio Networks with Continuous Energy Arrivals.

    Science.gov (United States)

    Nguyen, Thanh-Tung; Koo, Insoo

    2015-11-27

    We consider a Sensor-Aided Cognitive Radio Network (SACRN) in which sensors capable of harvesting energy are distributed throughout the network to support secondary transmitters in sensing licensed channels, in order to improve both energy and spectral efficiency. Harvesting ambient energy is one of the most promising solutions to mitigate energy deficiency, prolong device lifetime, and partly reduce the battery size of devices. So far, many works related to SACRN have considered a single secondary user capable of harvesting energy over the whole slot, as well as short-term throughput. In this paper, we consider two types of energy harvesting sensor nodes (EHSN): type-I sensor nodes harvest ambient energy over the whole slot duration, whereas type-II sensor nodes only harvest energy after carrying out spectrum sensing. We also investigate long-term throughput in the scheduling window, and formulate the throughput maximization problem by considering the energy-neutral operation conditions of type-I and -II sensors and the target detection probability. Through simulations, it is shown that the sensing energy consumption of all sensor nodes can be efficiently managed with the proposed scheme to achieve optimal long-term throughput in the window.

  19. High-Throughput Immunogenetics for Clinical and Research Applications in Immunohematology: Potential and Challenges

    NARCIS (Netherlands)

    Langerak, A.W.; Bruggemann, M.; Davi, F.; Darzentas, N.; Dongen, J.J. van; Gonzalez, D.; Cazzaniga, G.; Giudicelli, V.; Lefranc, M.P.; Giraud, M.; Macintyre, E.A.; Hummel, M.; Pott, C.; Groenen, P.J.T.A.; Stamatopoulos, K.

    2017-01-01

    Analysis and interpretation of Ig and TCR gene rearrangements in the conventional, low-throughput way have their limitations in terms of resolution, coverage, and biases. With the advent of high-throughput, next-generation sequencing (NGS) technologies, a deeper analysis of Ig and/or TCR (IG/TR)

  20. Hypoxia-sensitive reporter system for high-throughput screening.

    Science.gov (United States)

    Tsujita, Tadayuki; Kawaguchi, Shin-ichi; Dan, Takashi; Baird, Liam; Miyata, Toshio; Yamamoto, Masayuki

    2015-02-01

    The induction of anti-hypoxic stress enzymes and proteins has the potential to be a potent therapeutic strategy to prevent the progression of ischemic heart, kidney or brain diseases. To realize this idea, small chemical compounds, which mimic hypoxic conditions by activating the PHD-HIF-α system, have been developed. However, to date, none of these compounds were identified by monitoring the transcriptional activation of hypoxia-inducible factors (HIFs). Thus, to facilitate the discovery of potent inducers of HIF-α, we have developed an effective high-throughput screening (HTS) system to directly monitor the output of HIF-α transcription. We generated a HIF-α-dependent reporter system that responds to hypoxic stimuli in a concentration- and time-dependent manner. This system was developed through multiple optimization steps, resulting in the generation of a construct that consists of the secretion-type luciferase gene (Metridia luciferase, MLuc) under the transcriptional regulation of an enhancer containing 7 copies of 40-bp hypoxia responsive element (HRE) upstream of a mini-TATA promoter. This construct was stably integrated into the human neuroblastoma cell line, SK-N-BE(2)c, to generate a reporter system, named SKN:HRE-MLuc. To improve this system and to increase its suitability for the HTS platform, we incorporated the next generation luciferase, Nano luciferase (NLuc), whose longer half-life provides us with flexibility for the use of this reporter. We thus generated a stably transformed clone with NLuc, named SKN:HRE-NLuc, and found that it showed significantly improved reporter activity compared to SKN:HRE-MLuc. In this study, we have successfully developed the SKN:HRE-NLuc screening system as an efficient platform for future HTS.

  1. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    Energy Technology Data Exchange (ETDEWEB)

    Ni, Jing [Iowa State Univ., Ames, IA (United States)

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores the surface enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman band, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, a simultaneous detection of a dual-analyte immunoassay was demonstrated. Using this scheme for quantitative analysis was also studied and preliminary dose-response curves from an immunoassay of a

  2. Missing call bias in high-throughput genotyping

    Directory of Open Access Journals (Sweden)

    Lin Rong

    2009-03-01

    Full Text Available Abstract Background The advent of high-throughput and cost-effective genotyping platforms made genome-wide association (GWA) studies a reality. While the primary focus has been on reducing genotyping error, the problems associated with missing calls are largely overlooked. Results To probe the effect of missing calls on GWA studies, we demonstrated experimentally the prevalence and severity of the problem of missing call bias (MCB) in four genotyping technologies (Affymetrix 500 K SNP array, SNPstream, TaqMan, and Illumina Beadlab). Subsequently, we showed theoretically that MCB leads to biased conclusions in the subsequent analyses, including estimation of allele/genotype frequencies, measurement of HWE, and association tests under various modes of inheritance. We showed that MCB usually leads to power loss in association tests, and that this loss is greater than what would result from an equivalent, unbiased reduction of sample size. We also compared the bias in allele frequency estimation and in association tests introduced by MCB with that introduced by genotyping errors. Our results illustrated that in most cases the bias can be greatly reduced by increasing the call-rate at the cost of a higher genotyping error rate. Conclusion The commonly used 'no-call' procedure for observations of borderline quality should be modified. If the objective is to minimize the bias, the cut-offs for call-rate and genotyping error rate should be properly coupled in GWA studies. We suggest that the current QC cut-off for call-rate should be increased, while the cut-off for genotyping error rate can be adjusted accordingly.
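
    To make the mechanism of missing call bias concrete, the short simulation below (my own illustration, not the authors' analysis) shows how an allele-frequency estimate shifts when heterozygous genotypes are preferentially set to 'no-call', compared with calls missing completely at random. The dropout probabilities are arbitrary placeholders.

    ```python
    # Illustrative sketch: allele-frequency bias when heterozygotes are
    # preferentially dropped as 'no-calls'. Parameters are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100_000, 0.3                      # sample size and true allele frequency

    # Simulate genotypes (count of minor alleles) under Hardy-Weinberg equilibrium.
    genotypes = rng.binomial(2, p, size=n)

    def allele_freq(g):
        return g.sum() / (2 * len(g))

    # Missing completely at random: 5% of calls dropped regardless of genotype.
    mcar = genotypes[rng.random(n) > 0.05]

    # Genotype-dependent missingness: heterozygotes are three times as likely to
    # fall below the calling threshold (a stylized form of missing call bias).
    drop_prob = np.where(genotypes == 1, 0.15, 0.05)
    biased = genotypes[rng.random(n) > drop_prob]

    print(f"true p = {p:.3f}")
    print(f"estimate, random missingness     = {allele_freq(mcar):.3f}")
    print(f"estimate, genotype-dependent     = {allele_freq(biased):.3f}")
    ```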

  3. A high-throughput microfluidic approach for 1000-fold leukocyte reduction of platelet-rich plasma

    Science.gov (United States)

    Xia, Hui; Strachan, Briony C.; Gifford, Sean C.; Shevkoplyas, Sergey S.

    2016-10-01

    Leukocyte reduction of donated blood products substantially reduces the risk of a number of transfusion-related complications. Current ‘leukoreduction’ filters operate by trapping leukocytes within specialized filtration material, while allowing desired blood components to pass through. However, the continuous release of inflammatory cytokines from the retained leukocytes, as well as the potential for platelet activation and clogging, are significant drawbacks of conventional ‘dead end’ filtration. To address these limitations, here we demonstrate our newly-developed ‘controlled incremental filtration’ (CIF) approach to perform high-throughput microfluidic removal of leukocytes from platelet-rich plasma (PRP) in a continuous flow regime. Leukocytes are separated from platelets within the PRP by progressively syphoning clarified PRP away from the concentrated leukocyte flowstream. Filtrate PRP collected from an optimally-designed CIF device typically showed a ~1000-fold (i.e. 99.9%) reduction in leukocyte concentration, while recovering >80% of the original platelets, at volumetric throughputs of ~1 mL/min. These results suggest that the CIF approach will enable users in many fields to now apply the advantages of microfluidic devices to particle separation, even for applications requiring macroscale flowrates.

  4. High-throughput ocular artifact reduction in multichannel electroencephalography (EEG) using component subspace projection.

    Science.gov (United States)

    Ma, Junshui; Bayram, Sevinç; Tao, Peining; Svetnik, Vladimir

    2011-03-15

    After a review of the ocular artifact reduction literature, a high-throughput method designed to reduce the ocular artifacts in multichannel continuous EEG recordings acquired at clinical EEG laboratories worldwide is proposed. The proposed method belongs to the category of component-based methods, and does not rely on any electrooculography (EOG) signals. Based on a concept that all ocular artifact components exist in a signal component subspace, the method can uniformly handle all types of ocular artifacts, including eye-blinks, saccades, and other eye movements, by automatically identifying ocular components from decomposed signal components. This study also proposes an improved strategy to objectively and quantitatively evaluate artifact reduction methods. The evaluation strategy uses real EEG signals to synthesize realistic simulated datasets with different amounts of ocular artifacts. The simulated datasets enable us to objectively demonstrate that the proposed method outperforms some existing methods when no high-quality EOG signals are available. Moreover, the results of the simulated datasets improve our understanding of the involved signal decomposition algorithms, and provide us with insights into the inconsistency regarding the performance of different methods in the literature. The proposed method was also applied to two independent clinical EEG datasets involving 28 volunteers and over 1000 EEG recordings. This effort further confirms that the proposed method can effectively reduce ocular artifacts in large clinical EEG datasets in a high-throughput fashion. Copyright © 2011 Elsevier B.V. All rights reserved.
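
    The proposed method is component-based and EOG-free; as a loose illustration of the general idea only (decompose multichannel EEG into components, flag artifact-dominated components, and reconstruct without them), the sketch below uses FastICA from scikit-learn and a simple frontal-channel correlation heuristic. The component-selection rule, threshold, and channel indices are placeholders, not the criteria used in the paper.

    ```python
    # Rough sketch of component-subspace artifact removal (not the paper's algorithm):
    # decompose the EEG into components, flag those that look ocular, reconstruct.
    import numpy as np
    from sklearn.decomposition import FastICA

    def remove_ocular_components(eeg, frontal_idx, corr_threshold=0.7, n_components=20):
        """eeg: array of shape (n_samples, n_channels); frontal_idx: indices of
        frontal channels where ocular activity dominates. Returns cleaned EEG."""
        ica = FastICA(n_components=n_components, random_state=0)
        sources = ica.fit_transform(eeg)              # (n_samples, n_components)

        # Heuristic placeholder: a component is 'ocular' if it correlates strongly
        # with the mean frontal-channel signal (a crude stand-in for the paper's
        # automatic identification step).
        frontal_mean = eeg[:, frontal_idx].mean(axis=1)
        flagged = [k for k in range(sources.shape[1])
                   if abs(np.corrcoef(sources[:, k], frontal_mean)[0, 1]) > corr_threshold]

        sources[:, flagged] = 0.0                     # project out the flagged subspace
        return ica.inverse_transform(sources)         # back to channel space

    # Usage on synthetic data: 10 s of 32-channel EEG sampled at 250 Hz.
    eeg = np.random.randn(2500, 32)
    cleaned = remove_ocular_components(eeg, frontal_idx=[0, 1])
    ```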

  5. High-throughput physiological phenotyping and screening system for the characterization of plant-environment interactions.

    Science.gov (United States)

    Halperin, Ofer; Gebremedhin, Alem; Wallach, Rony; Moshelion, Menachem

    2017-02-01

    We present a simple and effective high-throughput experimental platform for simultaneous and continuous monitoring of water relations in the soil-plant-atmosphere continuum of numerous plants under dynamic environmental conditions. This system provides a simultaneously measured, detailed physiological response profile for each plant in the array, over time periods ranging from a few minutes to the entire growing season, under normal, stress and recovery conditions and at any phenological stage. Three probes for each pot in the array and a specially designed algorithm enable detailed water-relations characterization of whole-plant transpiration, biomass gain, stomatal conductance and root flux. They also enable quantitative calculation of the whole plant water-use efficiency and relative water content at high resolution under dynamic soil and atmospheric conditions. The system has no moving parts and can fit into many growing environments. A screening of 65 introgression lines of a wild tomato species (Solanum pennellii) crossed with cultivated tomato (S. lycopersicum), using our system and conventional gas-exchange tools, confirmed the accuracy of the system as well as its diagnostic capabilities. The use of this high-throughput diagnostic screening method is discussed in light of the gaps in our understanding of the genetic regulation of whole-plant performance, particularly under abiotic stress. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.

  6. High-throughput combinatorial chemical bath deposition: The case of doping Cu (In, Ga) Se film with antimony

    Science.gov (United States)

    Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong

    2018-01-01

    Conventional wet-process methods for designing and preparing thin films remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin-film preparation based on chemical bath deposition (CBD). The method is well suited to preparing combinatorial material libraries with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To validate this system, a Cu(In, Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of the CIGS thin film. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties, through accurate monitoring of the doping content and material composition. Based on the characterization results, an Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. Beyond the CIGS thin film reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin-film material systems.

  7. Robo-Lector – a novel platform for automated high-throughput cultivations in microtiter plates with high information content

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-08-01

    Full Text Available Abstract Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study in detail high-throughput cultivation processes and especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescent protein EcFbFP, was hereby investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only ± 7%. The third method, 'biomass-specific replication', enabled equal initial biomass concentrations to be generated in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main

  8. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole organism studies of Caenorhabditis elegans; mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all of these phenotypes and may be limited by expense and limited throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  9. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    Science.gov (United States)

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (short typical run times) and method simplicity, and FIA-MS offers high throughput without compromising sensitivity, precision, and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve the efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and

  10. High throughput discovery of influenza virus neutralizing antibodies from phage-displayed synthetic antibody libraries.

    Science.gov (United States)

    Chen, Ing-Chien; Chiu, Yi-Kai; Yu, Chung-Ming; Lee, Cheng-Chung; Tung, Chao-Ping; Tsou, Yueh-Liang; Huang, Yi-Jen; Lin, Chia-Lung; Chen, Hong-Sen; Wang, Andrew H-J; Yang, An-Suei

    2017-10-31

    Pandemic and epidemic outbreaks of influenza A virus (IAV) infection pose severe challenges to human society. Passive immunotherapy with recombinant neutralizing antibodies can potentially mitigate the threats of IAV infection. With a high throughput neutralizing antibody discovery platform, we produced artificial anti-hemagglutinin (HA) IAV-neutralizing IgGs from phage-displayed synthetic scFv libraries without necessitating prior memory of antibody-antigen interactions or relying on affinity maturation essential for in vivo immune systems to generate highly specific neutralizing antibodies. At least two thirds of the epitope groups of the artificial anti-HA antibodies resemble those of natural protective anti-HA antibodies, providing alternatives to neutralizing antibodies from natural antibody repertoires. With continuing advancement in designing and constructing synthetic scFv libraries, this technological platform is useful in mitigating not only the threats of IAV pandemics but also those from other newly emerging viral infections.

  11. High pressure inertial focusing for separation and concentration of bacteria at high throughput

    Science.gov (United States)

    Cruz, F. J.; Hjort, K.

    2017-11-01

    Inertial focusing is a phenomenon in which particles migrate across streamlines in microchannels and focus at well-defined, size-dependent equilibrium points of the cross section. It can be exploited for focusing, separation and concentration of particles at high throughput and high efficiency. As particles decrease in size, smaller channels and higher pressures are needed. Hence, new designs are needed to decrease the pressure drop. In this work a novel design was adapted to focus and separate 1 µm from 3 µm spherical polystyrene particles. Spherical polystyrene particles of 0.5 µm were also separated, although into a band rather than a single line. The ability to separate, concentrate and focus bacteria, together with its simplicity of use and high throughput, makes this technology a candidate for daily routines in laboratories and hospitals.

  12. High throughput modular chambers for rapid evaluation of anesthetic sensitivity

    Directory of Open Access Journals (Sweden)

    Eckmann David M

    2006-11-01

    Full Text Available Abstract Background Anesthetic sensitivity is determined by the interaction of multiple genes. Hence, a dissection of genetic contributors would be aided by precise and high-throughput behavioral screens. Traditionally, anesthetic phenotyping has addressed only induction of anesthesia, evaluated with dose-response curves, while ignoring potentially important data on emergence from anesthesia. Methods We designed and built a controlled-environment apparatus to permit rapid phenotyping of twenty-four mice simultaneously. We used the loss of righting reflex to indicate anesthetic-induced unconsciousness. After fitting the data to a sigmoidal dose-response curve with variable slope, we calculated the MACLORR (EC50), the Hill coefficient, and the 95% confidence intervals bracketing these values. Upon termination of the anesthetic, the Emergence timeRR was determined and expressed as the mean ± standard error for each inhaled anesthetic. Results In agreement with several previously published reports, we find that the MACLORR of halothane, isoflurane, and sevoflurane in 8–12 week old C57BL/6J mice is 0.79% (95% confidence interval = 0.78–0.79%), 0.91% (95% confidence interval = 0.90–0.93%), and 1.96% (95% confidence interval = 1.94–1.97%), respectively. Hill coefficients for halothane, isoflurane, and sevoflurane are 24.7 (95% confidence interval = 19.8–29.7), 19.2 (95% confidence interval = 14.0–24.3), and 33.1 (95% confidence interval = 27.3–38.8), respectively. After roughly 2.5 MACLORR • hr exposures, mice take 16.00 ± 1.07, 6.19 ± 0.32, and 2.15 ± 0.12 minutes to emerge from halothane, isoflurane, and sevoflurane, respectively. Conclusion This system enabled assessment of inhaled anesthetic responsiveness with a higher precision than that previously reported. It is broadly adaptable for delivering an inhaled therapeutic (or toxin) to a population while monitoring its vital signs, motor reflexes, and providing precise control
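
    As a worked illustration of the dose-response fit described above (a variable-slope sigmoid yielding an EC50 and Hill coefficient from the fraction of mice showing loss of righting reflex), here is a minimal sketch using scipy. The anesthetic concentrations and response fractions are invented and are not the study's data.

    ```python
    # Minimal sketch of a variable-slope sigmoidal dose-response fit, as used to
    # derive a MAC_LORR (EC50) and Hill coefficient. The data points are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, ec50, n_hill):
        """Fraction of animals without a righting reflex at a given concentration."""
        return conc ** n_hill / (ec50 ** n_hill + conc ** n_hill)

    conc = np.array([0.60, 0.70, 0.75, 0.80, 0.85, 0.90, 1.00])   # vol% (hypothetical)
    frac_lorr = np.array([0.0, 0.08, 0.25, 0.55, 0.83, 0.96, 1.0])

    (ec50, n_hill), cov = curve_fit(hill, conc, frac_lorr, p0=[0.8, 10.0])
    se = np.sqrt(np.diag(cov))
    print(f"MAC_LORR (EC50) ≈ {ec50:.2f} vol%  (±{1.96 * se[0]:.2f}, approx. 95% CI)")
    print(f"Hill coefficient ≈ {n_hill:.1f}")
    ```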

  13. High throughput RNAi assay optimization using adherent cell cytometry

    Directory of Open Access Journals (Sweden)

    Pradhan Leena

    2011-04-01

    Full Text Available Abstract Background siRNA technology is a promising tool for gene therapy of vascular disease. Due to the multitude of reagents and cell types, RNAi experiment optimization can be time-consuming. In this study adherent cell cytometry was used to rapidly optimize siRNA transfection in human aortic vascular smooth muscle cells (AoSMC). Methods AoSMC were seeded at a density of 3000-8000 cells/well of a 96-well plate. 24 hours later, AoSMC were transfected with either non-targeting unlabeled siRNA (50 nM) or non-targeting labeled siRNA, siGLO Red (5 or 50 nM), using no transfection reagent, HiPerfect or Lipofectamine RNAiMax. For counting cells, Hoechst nuclei stain or Cell Tracker green were used. For data analysis an adherent cell cytometer, Celigo®, was used. Data were normalized to the transfection reagent alone group and expressed as red pixel count/cell. Results After 24 hours, none of the transfection conditions led to cell loss. Red fluorescence counts were normalized to the AoSMC count. RNAiMax was more potent compared to HiPerfect or no transfection reagent at 5 nM siGLO Red (4.12 +/-1.04 vs. 0.70 +/-0.26 vs. 0.15 +/-0.13 red pixel/cell) and 50 nM siGLO Red (6.49 +/-1.81 vs. 2.52 +/-0.67 vs. 0.34 +/-0.19). Fluorescence expression results supported gene knockdown achieved by using MARCKS-targeting siRNA in AoSMCs. Conclusion This study underscores that RNAi delivery depends heavily on the choice of delivery method. Adherent cell cytometry can be used as a high-throughput screening tool for the optimization of RNAi assays. This technology can accelerate in vitro cell assays and thus save costs.

  14. High-throughput metal susceptibility testing of microbial biofilms

    Directory of Open Access Journals (Sweden)

    Turner Raymond J

    2005-10-01

    Full Text Available Abstract Background Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO32-) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic

  15. High-throughput metal susceptibility testing of microbial biofilms

    Science.gov (United States)

    Harrison, Joe J; Turner, Raymond J; Ceri, Howard

    2005-01-01

    Background Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO32-) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic cell E. coli JM109 to metals

  16. Engineering High Affinity Protein-Protein Interactions Using a High-Throughput Microcapillary Array Platform.

    Science.gov (United States)

    Lim, Sungwon; Chen, Bob; Kariolis, Mihalis S; Dimov, Ivan K; Baer, Thomas M; Cochran, Jennifer R

    2017-02-17

    Affinity maturation of protein-protein interactions requires iterative rounds of protein library generation and high-throughput screening to identify variants that bind with increased affinity to a target of interest. We recently developed a multipurpose protein engineering platform, termed μSCALE (Microcapillary Single Cell Analysis and Laser Extraction). This technology enables high-throughput screening of libraries of millions of cells expressing protein variants based on their binding properties or functional activity. Here, we demonstrate the first use of the μSCALE platform for affinity maturation of a protein-protein binding interaction. In this proof-of-concept study, we engineered an extracellular domain of the Axl receptor tyrosine kinase to bind tighter to its ligand Gas6. Within 2 weeks, two iterative rounds of library generation and screening resulted in engineered Axl variants with a 50-fold decrease in kinetic dissociation rate, highlighting the use of μSCALE as a new tool for directed evolution.

  17. Microscopy with microlens arrays: high throughput, high resolution and light-field imaging.

    Science.gov (United States)

    Orth, Antony; Crozier, Kenneth

    2012-06-04

    We demonstrate highly parallelized fluorescence scanning microscopy using a refractive microlens array. Fluorescent beads and rat femur tissue are imaged over a 5.5 mm x 5.5 mm field of view at a pixel throughput of up to 4 megapixels/s and a resolution of 706 nm. We also demonstrate the ability to extract different perspective views of a pile of microspheres.

  18. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3.04 "Propulsion Systems," Busek Co. Inc. will develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  19. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    Science.gov (United States)

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, among others, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  20. High-throughput phenotyping of multicellular organisms: finding the link between genotype and phenotype

    OpenAIRE

    Sozzani, Rosangela; Benfey, Philip N

    2011-01-01

    High-throughput phenotyping approaches (phenomics) are being combined with genome-wide genetic screens to identify alterations in phenotype that result from gene inactivation. Here we highlight promising technologies for 'phenome-scale' analyses in multicellular organisms.

  1. High-throughput phenotyping of multicellular organisms: finding the link between genotype and phenotype

    Science.gov (United States)

    2011-01-01

    High-throughput phenotyping approaches (phenomics) are being combined with genome-wide genetic screens to identify alterations in phenotype that result from gene inactivation. Here we highlight promising technologies for 'phenome-scale' analyses in multicellular organisms. PMID:21457493

  2. EMPeror: a tool for visualizing high-throughput microbial community data

    National Research Council Canada - National Science Library

    Vázquez-Baeza, Yoshiki; Pirrung, Meg; Gonzalez, Antonio; Knight, Rob

    2013-01-01

    As microbial ecologists take advantage of high-throughput sequencing technologies to describe microbial communities across ever-increasing numbers of samples, new analysis tools are required to relate...

  3. High-throughput system-wide engineering and screening for microbial biotechnology.

    Science.gov (United States)

    Vervoort, Yannick; Linares, Alicia Gutiérrez; Roncoroni, Miguel; Liu, Chengxun; Steensels, Jan; Verstrepen, Kevin J

    2017-08-01

    Genetic engineering and screening of large numbers of cells or populations is a crucial bottleneck in today's systems biology and applied (micro)biology. Instead of using standard methods in bottles, flasks or 96-well plates, scientists are increasingly relying on high-throughput strategies that miniaturize their experiments to the nanoliter and picoliter scale and the single-cell level. In this review, we summarize different high-throughput system-wide genome engineering and screening strategies for microbes. More specifically, we will emphasize the use of multiplex automated genome evolution (MAGE) and CRISPR/Cas systems for high-throughput genome engineering and the application of (lab-on-chip) nanoreactors for high-throughput single-cell or population screening. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  5. An image analysis toolbox for high-throughput C. elegans assays.

    Science.gov (United States)

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H; Riklin-Raviv, Tammy; Conery, Annie L; O'Rourke, Eyleen J; Sokolnicki, Katherine L; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M; Carpenter, Anne E

    2012-04-22

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available through the open-source CellProfiler project and enables objective scoring of whole-worm high-throughput image-based assays of C. elegans for the study of diverse biological pathways that are relevant to human disease.

  6. Deep Recurrent Neural Network for Mobile Human Activity Recognition with High Throughput

    OpenAIRE

    Inoue, Masaya; Inoue, Sozo; Nishida, Takeshi

    2016-01-01

    In this paper, we propose a method for high-throughput human activity recognition from raw accelerometer data using a deep recurrent neural network (DRNN), and investigate various architectures and their combinations to find the best parameter values. Here, "high throughput" refers to the short time required for each recognition. We investigated various parameters and architectures of the DRNN using a training dataset of 432 trials with 6 activity classes from 7 people. The maximum recognition ...
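
    As a generic illustration of a recurrent classifier over raw accelerometer windows (not the architecture or hyperparameters of this paper), here is a minimal PyTorch sketch. The window length, hidden size, layer count, and six-class output are assumptions made for the example.

    ```python
    # Generic recurrent activity classifier over raw accelerometer windows.
    # Architecture and sizes are illustrative assumptions, not the paper's DRNN.
    import torch
    from torch import nn

    class ActivityRNN(nn.Module):
        def __init__(self, n_channels=3, hidden=64, n_layers=2, n_classes=6):
            super().__init__()
            self.rnn = nn.LSTM(n_channels, hidden, num_layers=n_layers, batch_first=True)
            self.head = nn.Linear(hidden, n_classes)

        def forward(self, x):                 # x: (batch, time_steps, n_channels)
            _, (h_n, _) = self.rnn(x)         # h_n: (n_layers, batch, hidden)
            return self.head(h_n[-1])         # logits from the last layer's final state

    # Forward pass on a dummy batch: 8 windows of 128 samples, 3-axis accelerometer.
    model = ActivityRNN()
    logits = model(torch.randn(8, 128, 3))
    print(logits.shape)                       # torch.Size([8, 6])
    ```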

  7. Construction and analysis of high-density linkage map using high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Dongyuan Liu

    Full Text Available Linkage maps enable the study of important biological questions. The construction of high-density linkage maps appears more feasible since the advent of next-generation sequencing (NGS), which eases SNP discovery and high-throughput genotyping of large populations. However, the marker number explosion and genotyping errors from NGS data challenge the computational efficiency and linkage map quality of linkage study methods. Here we report the HighMap method for constructing high-density linkage maps from NGS data. HighMap employs an iterative ordering and error correction strategy based on a k-nearest neighbor algorithm and a Monte Carlo multipoint maximum likelihood algorithm. Simulation studies show HighMap can create a linkage map with three times as many markers as ordering-only methods while offering more accurate marker orders and stable genetic distances. Using HighMap, we constructed a common carp linkage map with 10,004 markers. The singleton rate was less than one-ninth of that generated by JoinMap4.1. Its total map distance was 5,908 cM, consistent with reports on low-density maps. HighMap is an efficient method for constructing high-density, high-quality linkage maps from high-throughput population NGS data. It will facilitate genome assembly, comparative genomic analysis, and QTL studies. HighMap is available at http://highmap.biomarker.com.cn/.
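
    For readers unfamiliar with how the genetic distances (cM) in a linkage map relate to marker genotypes, the sketch below shows the standard conversion of a pairwise recombination fraction into map distance using the Haldane and Kosambi mapping functions. This is background illustration only, not HighMap's multipoint maximum likelihood algorithm, and the genotype data are invented.

    ```python
    # Background illustration only (not HighMap's algorithm): estimate a pairwise
    # recombination fraction and convert it to centiMorgans.
    import numpy as np

    def recombination_fraction(geno_a, geno_b):
        """Fraction of individuals whose alleles at two markers disagree
        (0/1-coded haploid genotypes from a mapping population)."""
        geno_a, geno_b = np.asarray(geno_a), np.asarray(geno_b)
        return np.mean(geno_a != geno_b)

    def haldane_cm(r):
        """Haldane mapping function: assumes no crossover interference."""
        return -50.0 * np.log(1.0 - 2.0 * r)

    def kosambi_cm(r):
        """Kosambi mapping function: allows for partial interference."""
        return 25.0 * np.log((1.0 + 2.0 * r) / (1.0 - 2.0 * r))

    # Hypothetical genotypes for 20 individuals at two linked markers.
    m1 = [0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
    m2 = [0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0]
    r = recombination_fraction(m1, m2)
    print(f"r = {r:.2f}, Haldane = {haldane_cm(r):.1f} cM, Kosambi = {kosambi_cm(r):.1f} cM")
    ```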

  8. High-Throughput Fabrication of Nanocomplexes Using 3D-Printed Micromixers

    DEFF Research Database (Denmark)

    Bohr, Adam; Boetker, Johan; Wang, Yingya

    2017-01-01

    3D printing allows a rapid and inexpensive manufacturing of custom made and prototype devices. Micromixers are used for rapid and controlled production of nanoparticles intended for therapeutic delivery. In this study, we demonstrate the fabrication of micromixers using computational design and 3D printing, which enable a continuous and industrial-scale production of nanocomplexes formed by electrostatic complexation, using the polymers poly(diallyldimethylammonium chloride) and poly(sodium 4-styrenesulfonate). Several parameters including polymer concentration, flow rate, and flow ratio were ... via bulk mixing. Moreover, each micromixer could process more than 2 liters per hour with unaffected performance and the setup could easily be scaled up by aligning several micromixers in parallel. This demonstrates that 3D printing can be used to prepare disposable high-throughput micromixers ...

  9. High-throughput format for the phenotyping of fungi on solid substrates.

    Science.gov (United States)

    Cánovas, David; Studt, Lena; Marcos, Ana T; Strauss, Joseph

    2017-06-27

    Filamentous fungi naturally grow on solid surfaces, yet most genetic and biochemical analyses are still performed in liquid cultures. Here, we report a multiplexing platform using high-throughput photometric continuous reading that allows parallel quantification of hyphal growth and reporter gene expression directly on solid medium, thereby mimicking natural environmental conditions. Using this system, we have quantified fungal growth and expression of secondary metabolite GFP-based reporter genes in saprophytic Aspergillus and phytopathogenic Fusarium species in response to different nutrients, stress conditions and epigenetic modifiers. With this method, we provide novel insights not only into the characteristics of fungal growth but also into the metabolic and time-dependent regulation of secondary metabolite gene expression.

  10. A novel high-throughput drip-flow system to grow autotrophic biofilms of contrasting diversities

    DEFF Research Database (Denmark)

    Kinnunen, Marta; Dechesne, Arnaud; Albrechtsen, Hans-Jørgen

    ... are often ill controlled and thus likely to be poorly reproducible. The purpose of this work is to develop a high-throughput continuous-flow system for growing replicate microbial biofilms of varying, but controlled, average thickness and associated community diversity. With these replicate biofilms, the effect of community composition and diversity on various ecological processes can then be rigorously examined. We hypothesize that the increased loading, resulting in thicker biofilms, will decrease the drift in the community and impose limited environmental filtering by providing more diverse niches. Thus, thicker biofilms are likely to host greater diversity. A system with 40 replicates has been constructed using flow-through polypropylene columns housing a defined number of single-sized glass beads supported by a stainless steel mesh. Biofilms consisting primarily of ammonia-oxidizing and nitrite...

  11. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    Science.gov (United States)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.

  12. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels...... channel targets accessible for drug screening. Specifically, genuine HTS parallel processing techniques based on arrays of planar silicon chips are being developed, but also lower throughput sequential techniques may be of value in compound screening, lead optimization, and safety screening....... The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery....

  13. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    Science.gov (United States)

    Luft, J.R.; Snell, E.H.; DeTitta, G.T.

    2011-01-01

    Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  14. Protocol: A high-throughput DNA extraction system suitable for conifers.

    Science.gov (United States)

    Bashalkhanov, Stanislav; Rajora, Om P

    2008-08-01

    High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve change to the buffer system and improving the protocol so that it almost doubles the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP) and another for high-throughput (HTP) DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce and a pine species. A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica-spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost effective solution for processing large numbers of conifer samples.

  15. Direct multiplex sequencing (DMPS)--a novel method for targeted high-throughput sequencing of ancient and highly degraded DNA

    National Research Council Canada - National Science Library

    Stiller, Mathias; Knapp, Michael; Stenzel, Udo; Hofreiter, Michael; Meyer, Matthias

    2009-01-01

    Although the emergence of high-throughput sequencing technologies has enabled whole-genome sequencing from extinct organisms, little progress has been made in accelerating targeted sequencing from highly degraded DNA...

  16. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    Directory of Open Access Journals (Sweden)

    Nicolas Pinto

    2009-11-01

    Full Text Available While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphics cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
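
    As a schematic of the screening idea described above (sample many model instantiations from a large parameter space, evaluate each, keep the promising ones), here is a generic random-search sketch. The parameter names, ranges, and scoring function are placeholders and do not correspond to the specific vision models in the paper.

    ```python
    # Generic sketch of high-throughput parameter screening (not the paper's models):
    # draw many candidate configurations, score each, keep the top performers.
    import random

    PARAM_SPACE = {                 # placeholder parameter ranges, for illustration only
        "units_per_layer": [64, 128, 256, 512],
        "pool_size": [3, 5, 7, 9],
        "norm_exponent": [1.0, 1.5, 2.0],
        "n_layers": [2, 3, 4],
    }

    def sample_candidate(rng):
        return {name: rng.choice(values) for name, values in PARAM_SPACE.items()}

    def evaluate(candidate):
        """Stand-in for training/testing one model instantiation; returns a score.
        In a real screen this is the expensive step that the hardware parallelizes."""
        return random.random()      # dummy objective for the sketch

    def screen(n_candidates=1000, keep=10, seed=0):
        rng = random.Random(seed)
        scored = []
        for _ in range(n_candidates):
            cand = sample_candidate(rng)
            scored.append((evaluate(cand), cand))
        return sorted(scored, key=lambda sc: sc[0], reverse=True)[:keep]

    for score, cand in screen():
        print(f"{score:.3f}  {cand}")
    ```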

  17. Throughput Analysis for a High-Performance FPGA-Accelerated Real-Time Search Application

    Directory of Open Access Journals (Sweden)

    Wim Vanderbauwhede

    2012-01-01

    Full Text Available We propose an FPGA design for the relevancy computation part of a high-throughput real-time search application. The application matches terms in a stream of documents against a static profile, held in off-chip memory. We present a mathematical analysis of the throughput of the application and apply it to the problem of scaling the Bloom filter used to discard nonmatches.
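
    The abstract mentions a mathematical throughput analysis tied to scaling the Bloom filter that discards non-matches. The sketch below reproduces only the standard textbook relations (false-positive probability and the bit/hash-count sizing rules), not the paper's FPGA-specific model; the profile size and target rate are invented example numbers.

    ```python
    # Standard Bloom filter relations (textbook formulas, not the paper's FPGA model).
    import math

    def false_positive_rate(m_bits, n_items, k_hashes):
        """Probability that a non-member is reported as present."""
        return (1.0 - math.exp(-k_hashes * n_items / m_bits)) ** k_hashes

    def optimal_bits(n_items, target_fp):
        """Bits needed for a target false-positive rate with an optimal hash count."""
        return math.ceil(-n_items * math.log(target_fp) / (math.log(2) ** 2))

    def optimal_hashes(m_bits, n_items):
        return max(1, round(m_bits / n_items * math.log(2)))

    # Hypothetical profile of 100,000 terms with a 0.1% false-positive target.
    n = 100_000
    m = optimal_bits(n, 1e-3)
    k = optimal_hashes(m, n)
    print(f"{m} bits ({m / 8 / 1024:.0f} KiB), {k} hash functions, "
          f"fp ≈ {false_positive_rate(m, n, k):.2e}")
    ```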

  18. Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing.

    Science.gov (United States)

    Shafer, Aaron B A; Northrup, Joseph M; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B W

    2016-01-01

    Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations.

  19. Automated degenerate PCR primer design for high-throughput sequencing improves efficiency of viral sequencing

    Directory of Open Access Journals (Sweden)

    Li Kelvin

    2012-11-01

    Full Text Available Abstract Background In a high-throughput environment, the use of degenerate PCR primers is an important strategy for PCR amplifying and sequencing a large set of viral isolates from populations that are potentially heterogeneous and continuously evolving. Degenerate primers allow for the PCR amplification of a wider range of viral isolates with only one set of pre-mixed primers, thus increasing amplification success rates and minimizing the necessity for genome finishing activities. To successfully select a large set of degenerate PCR primers necessary to tile across an entire viral genome and maximize their success, this process is best performed computationally. Results We have developed a fully automated degenerate PCR primer design system that plays a key role in the J. Craig Venter Institute’s (JCVI) high-throughput viral sequencing pipeline. A consensus viral genome, or a set of consensus segment sequences in the case of a segmented virus, is specified using IUPAC ambiguity codes in the consensus template sequence to represent the allelic diversity of the target population. PCR primer pairs are then selected computationally to produce a minimal amplicon set capable of tiling across the full length of the specified target region. As part of the tiling process, primer pairs are computationally screened to meet the criteria for successful PCR with one of two described amplification protocols. The actual sequencing success rates for designed primers for measles virus, mumps virus, human parainfluenza virus 1 and 3, human respiratory syncytial virus A and B and human metapneumovirus are described, where >90% of designed primer pairs consistently amplified >75% of the isolates. Conclusions Augmenting our previously developed and published JCVI Primer Design Pipeline, we achieved similarly high sequencing success rates with only minor software modifications. The recommended methodology for the construction of the consensus
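
    To make the role of IUPAC ambiguity codes concrete, the sketch below counts and enumerates the concrete primer sequences encoded by a degenerate primer. The example primer is hypothetical, and the code only illustrates the degeneracy bookkeeping; it is not the JCVI design pipeline.

```python
from itertools import product

# Standard IUPAC nucleotide ambiguity codes.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

def degeneracy(primer):
    """Number of distinct concrete primers encoded by a degenerate primer."""
    count = 1
    for base in primer.upper():
        count *= len(IUPAC[base])
    return count

def expand(primer):
    """Enumerate every concrete sequence covered by the degenerate primer."""
    choices = [IUPAC[base] for base in primer.upper()]
    return ["".join(combo) for combo in product(*choices)]

if __name__ == "__main__":
    primer = "ACYGTNAGR"  # hypothetical degenerate primer
    print("degeneracy:", degeneracy(primer))   # 1*1*2*1*1*4*1*1*2 = 16
    for seq in expand(primer)[:5]:
        print(seq)
```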

  20. High-Throughput Phase-Field Design of High-Energy-Density Polymer Nanocomposites.

    Science.gov (United States)

    Shen, Zhong-Hui; Wang, Jian-Jun; Lin, Yuanhua; Nan, Ce-Wen; Chen, Long-Qing; Shen, Yang

    2017-11-22

    Understanding the dielectric breakdown behavior of polymer nanocomposites is crucial to the design of high-energy-density dielectric materials with reliable performances. It is however challenging to predict the breakdown behavior due to the complicated factors involved in this highly nonequilibrium process. In this work, a comprehensive phase-field model is developed to investigate the breakdown behavior of polymer nanocomposites under electrostatic stimuli. It is found that the breakdown strength and path significantly depend on the microstructure of the nanocomposite. The predicted breakdown strengths for polymer nanocomposites with specific microstructures agree with existing experimental measurements. Using this phase-field model, a high throughput calculation is performed to seek the optimal microstructure. Based on the high-throughput calculation, a sandwich microstructure for PVDF-BaTiO3 nanocomposite is designed, where the upper and lower layers are filled with parallel nanosheets and the middle layer is filled with vertical nanofibers. It has an enhanced energy density of 2.44 times that of the pure PVDF polymer. The present work provides a computational approach for understanding the electrostatic breakdown, and it is expected to stimulate future experimental efforts on synthesizing polymer nanocomposites with novel microstructures to achieve high performances. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchanne...

  2. High-throughput atomic force microscopes operating in parallel

    Science.gov (United States)

    Sadeghian, Hamed; Herfst, Rodolf; Dekker, Bert; Winters, Jasper; Bijnagte, Tom; Rijnbeek, Ramon

    2017-03-01

    Atomic force microscopy (AFM) is an essential nanoinstrument technique for several applications such as cell biology and nanoelectronics metrology and inspection. The need for statistically significant sample sizes means that data collection can be an extremely lengthy process in AFM. A single AFM instrument is very slow and not suitable for scanning large areas, resulting in very low measurement throughput. We address this challenge by parallelizing AFM instruments. The parallelization is achieved by miniaturizing the AFM instrument and operating many of them simultaneously. This instrument has the advantages that each miniaturized AFM can be operated independently and that the advances in the field of AFM, both in terms of speed and imaging modalities, can be implemented more easily. Moreover, a parallel AFM instrument also allows one to measure several physical parameters simultaneously; while one instrument measures nano-scale topography, another instrument can measure mechanical, electrical, or thermal properties, making it a lab-on-an-instrument. In this paper, a proof of principle of such a parallel AFM instrument has been demonstrated by analyzing the topography of large samples such as semiconductor wafers. This nanoinstrument provides new research opportunities in the nanometrology of wafers and nanolithography masks by enabling real die-to-die and wafer-level measurements and in cell biology by measuring the nano-scale properties of a large number of cells.

  3. Software Switching for High Throughput Data Acquisition Networks

    CERN Document Server

    AUTHOR|(CDS)2089787; Lehmann Miotto, Giovanna

    The bursty many-to-one communication pattern, typical for data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. The problem arising from this pattern is widely known in the literature as 'incast' and can be observed as TCP throughput collapse. It is a result of overloading the switch buffers, when a specific node in a network requests data from multiple sources. This will become even more demanding for future upgrades of the experiments at the Large Hadron Collider at CERN. It is questionable whether commodity TCP/IP and Ethernet technologies in their current form will still be able to effectively adapt to bursty traffic without losing packets due to the scarcity of buffers in the networking hardware. This thesis provides an analysis of TCP/IP performance in data acquisition networks and presents a novel approach to incast congestion in these networks based on software-based packet forwarding. Our first contribution lies in confirming the strong analogies bet...

  4. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach enabling breeders to use genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  5. Preselection of shotgun clones by oligonucleotide fingerprinting: an efficient and high throughput strategy to reduce redundancy in large-scale sequencing projects

    National Research Council Canada - National Science Library

    Radelof, U; Hennig, S; Seranski, P; Steinfath, M; Ramser, J; Reinhardt, R; Poustka, A; Francis, F; Lehrach, H

    1998-01-01

    .... To reduce the overall effort and cost of those projects and to accelerate the sequencing throughput, we have developed an efficient, high throughput oligonucleotide fingerprinting protocol to select...

  6. Recent progress using high-throughput sequencing technologies in plant molecular breeding.

    Science.gov (United States)

    Gao, Qiang; Yue, Guidong; Li, Wenqi; Wang, Junyi; Xu, Jiaohui; Yin, Ye

    2012-04-01

    High-throughput sequencing is a revolutionary technological innovation in DNA sequencing. This technology has an ultra-low cost per base of sequencing and an overwhelmingly high data output. High-throughput sequencing has brought novel research methods and solutions to the research fields of genomics and post-genomics. Furthermore, this technology is leading to a new molecular breeding revolution that has landmark significance for scientific research and enables us to launch multi-level, multi-faceted, and multi-extent studies in the fields of crop genetics, genomics, and crop breeding. In this paper, we review progress in the application of high-throughput sequencing technologies to plant molecular breeding studies. © 2012 Institute of Botany, Chinese Academy of Sciences.

  7. Repeated Assessment by High-Throughput Assay Demonstrates that Sperm DNA Methylation Levels Are Highly Reproducible

    Science.gov (United States)

    Cortessis, Victoria K.; Siegmund, Kimberly; Houshdaran, Sahar; Laird, Peter W.; Sokol, Rebecca Z.

    2011-01-01

    Objective: To assess reliability of high-throughput assay of sperm DNA methylation. Design: Observational study comparing DNA methylation of sperm isolated from three divided and twelve longitudinally collected semen samples. Setting: Academic medical center. Patients: One man undergoing screening semen analysis during evaluation of the infertile couple and two healthy fertile male volunteers. Interventions: Spermatozoa were separated from seminal plasma and somatic cells using gradient separation. DNA was extracted from spermatozoa, and DNA methylation was assessed at 1,505 DNA-sequence specific sites. Main Outcome Measures: Repeatability of sperm DNA methylation measures, estimated by correlation coefficients. Results: DNA methylation levels were highly correlated within matched sets of divided samples (all r≥0.97) and longitudinal samples (average r=0.97). Conclusions: The described methodology reliably assesses methylation of sperm DNA at large numbers of sites. Methylation profiles were consistent over time. High-throughput assessment of sperm DNA methylation is a promising tool for studying the role of epigenetic state in male fertility. PMID:22035967

  8. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for studies of biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput of such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort of maintaining high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, has attracted a new faculty hire, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  9. Applications of high-throughput clonogenic survival assays in high-LET particle microbeams

    Directory of Open Access Journals (Sweden)

    Antonios eGeorgantzoglou

    2016-01-01

    Full Text Available Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-LET particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells’ clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron field size high-LET microbeam. Post-irradiation, time-lapse imaging follows cells’ response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on cell’s capacity to divide at least 4-5 times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.

  10. Applications of High-Throughput Clonogenic Survival Assays in High-LET Particle Microbeams.

    Science.gov (United States)

    Georgantzoglou, Antonios; Merchant, Michael J; Jeynes, Jonathan C G; Mayhead, Natalie; Punia, Natasha; Butler, Rachel E; Jena, Rajesh

    2015-01-01

    Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-linear energy transfer (LET) particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells' clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron field size high-LET microbeam. Post-irradiation, time-lapse imaging follows cells' response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on cell's capacity to divide at least four to five times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.
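
    The tracking step described above, which links trajectories across time frames by nearest position, can be sketched as follows. The displacement threshold and the toy detections are hypothetical, and the division detection based on cell appearance and corner density is omitted; this is only an illustration of the linking idea, not the authors' implementation.

```python
import math

def link_tracks(frames, max_move=20.0):
    """Link cell detections across frames by nearest position.

    frames: list of frames, each a list of (x, y) detections.
    Returns a list of tracks, each a list of (frame_index, (x, y)) tuples.
    max_move is a hypothetical cap on per-frame displacement (pixels).
    """
    tracks = [[(0, pos)] for pos in frames[0]]
    for t, detections in enumerate(frames[1:], start=1):
        unused = list(detections)
        for track in tracks:
            last_t, last_pos = track[-1]
            if last_t != t - 1 or not unused:
                continue  # track already lost, or nothing left to assign
            nearest = min(unused, key=lambda p: math.dist(p, last_pos))
            if math.dist(nearest, last_pos) <= max_move:
                track.append((t, nearest))
                unused.remove(nearest)
        # Detections not claimed by an existing track start new tracks.
        tracks.extend([[(t, pos)] for pos in unused])
    return tracks

if __name__ == "__main__":
    frames = [
        [(10.0, 10.0), (50.0, 50.0)],
        [(12.0, 11.0), (51.0, 49.0)],
        [(13.0, 12.0), (52.0, 48.0), (60.0, 55.0)],  # a new object appears
    ]
    for track in link_tracks(frames):
        print(track)
```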

  11. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    Energy Technology Data Exchange (ETDEWEB)

    Hiraki, Masahiko, E-mail: masahiko.hiraki@kek.jp; Yamada, Yusuke; Chavas, Leonard M. G. [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Wakatsuki, Soichi [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); SLAC National Accelerator Laboratory, 2575 Sand Hill Road, MS 69, Menlo Park, CA 94025-7015 (United States); Stanford University, Beckman Center B105, Stanford, CA 94305-5126 (United States); Matsugaki, Naohiro [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2013-11-01

    A special liquid-nitrogen Dewar with double capacity for the sample-exchange robot has been created at AR-NE3A at the Photon Factory, allowing continuous fully automated data collection. In this work, this new system is described and the stability of its calibration is discussed. Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes. Therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. As a result of investigation, four or more cassettes were used in AR-NE3A alone. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. In order to check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. Compared with the current system, the parameters of the novel system are shown to be stable.

  12. High-throughput high-resolution class I HLA genotyping in East Africa.

    Directory of Open Access Journals (Sweden)

    Rebecca N Koehler

    Full Text Available HLA, the most genetically diverse loci in the human genome, play a crucial role in host-pathogen interaction by mediating innate and adaptive cellular immune responses. A vast number of infectious diseases affect East Africa, including HIV/AIDS, malaria, and tuberculosis, but the HLA genetic diversity in this region remains incompletely described. This is a major obstacle for the design and evaluation of preventive vaccines. Available HLA typing techniques, that provide the 4-digit level resolution needed to interpret immune responses, lack sufficient throughput for large immunoepidemiological studies. Here we present a novel HLA typing assay bridging the gap between high resolution and high throughput. The assay is based on real-time PCR using sequence-specific primers (SSP and can genotype carriers of the 49 most common East African class I HLA-A, -B, and -C alleles, at the 4-digit level. Using a validation panel of 175 samples from Kampala, Uganda, previously defined by sequence-based typing, the new assay performed with 100% sensitivity and specificity. The assay was also implemented to define the HLA genetic complexity of a previously uncharacterized Tanzanian population, demonstrating its inclusion in the major East African genetic cluster. The availability of genotyping tools with this capacity will be extremely useful in the identification of correlates of immune protection and the evaluation of candidate vaccine efficacy.

  13. A high-throughput media design approach for high performance mammalian fed-batch cultures.

    Science.gov (United States)

    Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé

    2013-01-01

    An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaking 96-deepwell plates. Starting from a proprietary chemically-defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both cell expansion and fed-batch production phases in one single experiment. Three approaches were chosen to provide the best output of the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design Expert software. Finally, a multivariate analysis enabled identification of individual potential critical components for further optimization. Applying this high-throughput method on a fed-batch, rather than on a simple batch, process opens new perspectives for medium and feed development that enables identification of an optimized process in a short time frame.
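
    One plausible way to arrive at 376 blends from 16 starting formulations is to test each formulation on its own plus every pair at three mixing ratios (16 + 120 x 3 = 376); the sketch below enumerates such a design. The ratio levels and the inclusion of the pure media are assumptions made for illustration and are not necessarily the authors' exact mixture design.

```python
from itertools import combinations

def binary_blend_design(media_ids, ratios=(0.25, 0.5, 0.75)):
    """Enumerate single media plus all pairwise blends at the given ratios.

    The ratio levels and the inclusion of pure media are illustrative assumptions.
    """
    design = [((medium,), (1.0,)) for medium in media_ids]   # pure formulations
    for a, b in combinations(media_ids, 2):                  # binary blends
        for r in ratios:
            design.append(((a, b), (r, 1.0 - r)))
    return design

if __name__ == "__main__":
    media = [f"M{i:02d}" for i in range(1, 17)]              # 16 formulations
    design = binary_blend_design(media)
    print(len(design))  # 16 + C(16, 2) * 3 = 16 + 120 * 3 = 376
```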

  14. The sva package for removing batch effects and other unwanted variation in high-throughput experiments.

    Science.gov (United States)

    Leek, Jeffrey T; Johnson, W Evan; Parker, Hilary S; Jaffe, Andrew E; Storey, John D

    2012-03-15

    Heterogeneity and latent variables are now widely recognized as major sources of bias and variability in high-throughput experiments. The best-known source of latent variation in genomic experiments is batch effects: when samples are processed on different days, in different groups, or by different people. However, there are also a large number of other variables that may have a major impact on high-throughput measurements. Here we describe the sva package for identifying, estimating and removing unwanted sources of variation in high-throughput experiments. The sva package supports surrogate variable estimation with the sva function, direct adjustment for known batch effects with the ComBat function and adjustment for batch and latent variables in prediction problems with the fsva function.

  15. Plant phenomics and high-throughput phenotyping: accelerating rice functional genomics using multidisciplinary technologies.

    Science.gov (United States)

    Yang, Wanneng; Duan, Lingfeng; Chen, Guoxing; Xiong, Lizhong; Liu, Qian

    2013-05-01

    The functional analysis of the rice genome has entered into a high-throughput stage, and a project named RICE2020 has been proposed to determine the function of every gene in the rice genome by the year 2020. However, as compared with the robustness of genetic techniques, the evaluation of rice phenotypic traits is still performed manually, and the process is subjective, inefficient, destructive and error-prone. To overcome these limitations and help rice phenomics more closely parallel rice genomics, reliable, automatic, multifunctional, and high-throughput phenotyping platforms should be developed. In this article, we discuss the key plant phenotyping technologies, particularly photonics-based technologies, and then introduce their current applications in rice (wheat or barley) phenomics. We also note the major challenges in rice phenomics and are confident that these reliable high-throughput phenotyping tools will give plant scientists new perspectives on the information encoded in the rice genome. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. High throughput single-cell and multiple-cell micro-encapsulation.

    Science.gov (United States)

    Lagus, Todd P; Edd, Jon F

    2012-06-15

    Microfluidic encapsulation methods have been previously utilized to capture cells in picoliter-scale aqueous, monodisperse drops, providing confinement from a bulk fluid environment with applications in high throughput screening, cytometry, and mass spectrometry. We describe a method to not only encapsulate single cells, but to repeatedly capture a set number of cells (here we demonstrate one- and two-cell encapsulation) to study both isolation and the interactions between cells in groups of controlled sizes. By combining drop generation techniques with cell and particle ordering, we demonstrate controlled encapsulation of cell-sized particles for efficient, continuous encapsulation. Using an aqueous particle suspension and immiscible fluorocarbon oil, we generate aqueous drops in oil with a flow focusing nozzle. The aqueous flow rate is sufficiently high to create ordering of particles which reach the nozzle at integer multiple frequencies of the drop generation frequency, encapsulating a controlled number of cells in each drop. For representative results, 9.9 μm polystyrene particles are used as cell surrogates. This study shows a single-particle encapsulation efficiency P(k=1) of 83.7% and a double-particle encapsulation efficiency P(k=2) of 79.5% as compared to their respective Poisson efficiencies of 39.3% and 33.3%, respectively. The effect of consistent cell and particle concentration is demonstrated to be of major importance for efficient encapsulation, and dripping to jetting transitions are also addressed. Continuous media aqueous cell suspensions share a common fluid environment which allows cells to interact in parallel and also homogenizes the effects of specific cells in measurements from the media. High-throughput encapsulation of cells into picoliter-scale drops confines the samples to protect drops from cross-contamination, enable a measure of cellular diversity within samples, prevent dilution of reagents and expressed biomarkers, and amplify
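
    The Poisson baseline referred to above follows P(k; lam) = lam**k * exp(-lam) / k!, the fraction of drops containing exactly k particles under random, unordered loading with mean lam particles per drop. The sketch below evaluates it for a few mean-loading values; note that the quoted Poisson efficiencies in the abstract follow the paper's own definition of loading efficiency, so they need not map one-to-one onto these raw drop fractions.

```python
import math

def poisson_pmf(k, lam):
    """P(k; lam) = lam**k * exp(-lam) / k!: fraction of drops with exactly k
    particles under random (unordered) loading with mean lam particles per drop."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

if __name__ == "__main__":
    for lam in (0.5, 1.0, 2.0):
        row = ", ".join(f"P({k})={poisson_pmf(k, lam):.3f}" for k in range(4))
        print(f"lam={lam}: {row}")
```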

  17. Protocol: A high-throughput DNA extraction system suitable for conifers

    Directory of Open Access Journals (Sweden)

    Rajora Om P

    2008-08-01

    Full Text Available Abstract Background High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. Results We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve change to the buffer system and improving the protocol so that it almost doubles the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP and another for high-throughput (HTP DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce and a pine species. Conclusion A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica-spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost effective solution for processing large numbers of conifer samples.

  18. High throughput discovery of families of high activity WGS catalysts: part I--history and methodology.

    Science.gov (United States)

    Yaccato, Karin; Carhart, Ray; Hagemeyer, Alfred; Herrmann, Michael; Lesik, Andreas; Strasser, Peter; Volpe, Anthony; Turner, Howard; Weinberg, Henry; Grasselli, Robert K; Brooks, Christopher J; Pigos, John M

    2010-05-01

    State-of-the-art water gas shift catalysts (FeCr for high temperature shift and CuZn for low temperature shift) are not active enough to be used in fuel processors for the production of hydrogen from hydrocarbon fuels for fuel cells. The need for drastically lower catalyst volumes has triggered a search for novel WGS catalysts that are an order of magnitude more active than current systems. Novel catalytic materials for the high, medium and low temperature water gas shift reactions have been discovered by application of combinatorial methodologies. Catalyst libraries were synthesized on 4 inch wafers in 16 x 16 arrays and screened in a high throughput scanning mass spectrometer in the temperature range 200 degrees C to 400 degrees C. More than 200 wafers were screened under various conditions and more than 250,000 experiments were conducted to comprehensively examine catalyst performance for various binary, ternary and higher-order compositions.

  19. Rapid and high-throughput detection of highly pathogenic bacteria by Ibis PLEX-ID technology.

    Directory of Open Access Journals (Sweden)

    Daniela Jacob

    Full Text Available In this manuscript, we describe the identification of highly pathogenic bacteria using an assay coupling biothreat group-specific PCR with electrospray ionization mass spectrometry (PCR/ESI-MS run on an Ibis PLEX-ID high-throughput platform. The biothreat cluster assay identifies most of the potential bioterrorism-relevant microorganisms including Bacillus anthracis, Francisella tularensis, Yersinia pestis, Burkholderia mallei and pseudomallei, Brucella species, and Coxiella burnetii. DNA from 45 different reference materials with different formulations and different concentrations were chosen and sent to a service screening laboratory that uses the PCR/ESI-MS platform to provide a microbial identification service. The standard reference materials were produced out of a repository built up in the framework of the EU funded project "Establishment of Quality Assurances for Detection of Highly Pathogenic Bacteria of Potential Bioterrorism Risk" (EQADeBa. All samples were correctly identified at least to the genus level.

  20. Integrating high-throughput pyrosequencing and quantitative real-time PCR to analyze complex microbial communities.

    Science.gov (United States)

    Zhang, Husen; Parameswaran, Prathap; Badalamenti, Jonathan; Rittmann, Bruce E; Krajmalnik-Brown, Rosa

    2011-01-01

    New high-throughput technologies continue to emerge for studying complex microbial communities. In particular, massively parallel pyrosequencing enables very high numbers of sequences, providing a more complete view of community structures and a more accurate inference of the functions than has been possible just a few years ago. In parallel, quantitative real-time PCR (QPCR) allows quantitative monitoring of specific community members over time, space, or different environmental conditions. In this review, we discuss the principles of these two methods and their complementary applications in studying microbial ecology in bioenvironmental systems. We explain parallel sequencing of amplicon libraries and using bar codes to differentiate multiple samples in a pyrosequencing run. We also describe best procedures and chemistries for QPCR amplifications and address advantages of applying automation to increase accuracy. We provide three examples in which we used pyrosequencing and QPCR together to define and quantify members of microbial communities: in the human large intestine, in a methanogenic digester whose sludge was made more bioavailable by a high-voltage pretreatment, and on the biofilm anode of a microbial electrolytic cell. We highlight our key findings in these systems and how both methods were used in concert to achieve those findings. Finally, we supply detailed methods for generating PCR amplicon libraries for pyrosequencing, pyrosequencing data analysis, QPCR methodology, instrumentation, and automation.

  1. Complementing high-throughput X-ray powder diffraction data with quantum-chemical calculations

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; van de Streek, Jacco; Rantanen, Jukka

    2012-01-01

    High-throughput crystallisation and characterisation platforms provide an efficient means to carry out solid-form screening during the pre-formulation phase. To determine the crystal structures of identified new solid phases, however, usually requires independent crystallisation trials to produce...... obtained only during high-energy processing such as spray drying or milling....

  2. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  3. Accelerating the design of solar thermal fuel materials through high throughput simulations.

    Science.gov (United States)

    Liu, Yun; Grossman, Jeffrey C

    2014-12-10

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
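
    Conceptually, the screening reduces to computing an isomerization enthalpy dH = E(metastable) - E(ground) for each candidate molecule and keeping those above an energy-density threshold, as in the toy loop below. The molecule names, energies, and threshold are placeholders; in the actual workflow both energies come from ab initio calculations.

```python
def isomerization_enthalpy(ground_energy_ev, metastable_energy_ev):
    """Energy stored per molecule (eV): metastable minus ground-state energy."""
    return metastable_energy_ev - ground_energy_ev

def screen_candidates(candidates, min_delta_h_ev=1.0):
    """Keep candidates whose stored enthalpy clears a hypothetical threshold.

    candidates maps a molecule label to (ground_energy_eV, metastable_energy_eV);
    in a real workflow both numbers would come from ab initio calculations.
    """
    hits = []
    for name, (e_ground, e_meta) in candidates.items():
        delta_h = isomerization_enthalpy(e_ground, e_meta)
        if delta_h >= min_delta_h_ev:
            hits.append((name, delta_h))
    return sorted(hits, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    toy_candidates = {                      # illustrative numbers only
        "molecule_A": (-100.0, -98.7),
        "molecule_B": (-80.0, -79.6),
        "molecule_C": (-120.0, -118.2),
    }
    for name, delta_h in screen_candidates(toy_candidates):
        print(f"{name}: {delta_h:.2f} eV stored")
```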

  4. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
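
    The lifetime read-out behind such an assay is typically obtained by fitting the time-correlated single photon counting histogram to a decay model such as I(t) = A*exp(-t/tau) + B. The sketch below performs that fit on synthetic photon-counting data; it illustrates the fitting step only and is not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau, background):
    """Mono-exponential fluorescence decay model I(t) = A*exp(-t/tau) + B."""
    return amplitude * np.exp(-t / tau) + background

def fit_lifetime(t_ns, counts):
    """Fit a TCSPC histogram and return the estimated lifetime tau (ns)."""
    p0 = (counts.max(), 2.0, counts.min())  # rough initial guess
    params, _ = curve_fit(decay, t_ns, counts, p0=p0)
    return params[1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 25.0, 256)                 # time bins in nanoseconds
    ideal = decay(t, amplitude=1000.0, tau=3.2, background=5.0)
    noisy = rng.poisson(ideal).astype(float)        # photon-counting (shot) noise
    print(f"fitted lifetime: {fit_lifetime(t, noisy):.2f} ns")
```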

  5. High-Throughput 3D Tumor Culture in a Recyclable Microfluidic Platform.

    Science.gov (United States)

    Liu, Wenming; Wang, Jinyi

    2017-01-01

    Miniaturized three-dimensional (3D) tumor culture platforms are important for biomimetic model construction and pathophysiological studies. Controllable, high-throughput production of 3D tumors is desirable to make cell-based manipulation dynamic and efficient at the micro-scale. Moreover, a reusable 3D culture platform is convenient for researchers. In this chapter, we describe a dynamically controlled 3D tumor manipulation and culture method using pneumatic microstructure-based microfluidics, which has potential high-throughput applications in the fields of tissue engineering, tumor biology, and clinical medicine.

  6. Applications of high-throughput plant phenotyping to study nutrient use efficiency.

    Science.gov (United States)

    Berger, Bettina; de Regt, Bas; Tester, Mark

    2013-01-01

    Remote sensing and spectral reflectance measurements of plants have long been used to assess the growth and nutrient status of plants in a noninvasive manner. With improved imaging and computer technologies, these approaches can now be used at high throughput for more extensive physiological and genetic studies. Here, we present an example of how high-throughput imaging can be used to study the growth of plants exposed to different nutrient levels. In addition, the color of the leaves can be used to estimate leaf chlorophyll and nitrogen status of the plant.

  7. A platform for high-throughput screening of DNA-encoded catalyst libraries in organic solvents.

    Science.gov (United States)

    Hook, K Delaney; Chambers, John T; Hili, Ryan

    2017-10-01

    We have developed a novel high-throughput screening platform for the discovery of small-molecule catalysts for bond-forming reactions. The method employs an in vitro selection for bond formation using amphiphilic DNA-encoded small molecules charged with reaction substrate, which enables selections to be conducted in a variety of organic or aqueous solvents. Using the amine-catalysed aldol reaction as a catalytic model and high-throughput DNA sequencing as a selection read-out, we demonstrate the 1200-fold enrichment of a known aldol catalyst from a library of 16.7 million uncompetitive library members.

  8. Perspective: Composition–structure–property mapping in high-throughput experiments: Turning data into knowledge

    Directory of Open Access Journals (Sweden)

    Jason R. Hattrick-Simpers

    2016-05-01

    Full Text Available With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains the transformation of structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. We review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.

  9. Macro-to-micro structural proteomics: native source proteins for high-throughput crystallization.

    Science.gov (United States)

    Totir, Monica; Echols, Nathaniel; Nanao, Max; Gee, Christine L; Moskaleva, Alisa; Gradia, Scott; Iavarone, Anthony T; Berger, James M; May, Andrew P; Zubieta, Chloe; Alber, Tom

    2012-01-01

    Structural biology and structural genomics projects routinely rely on recombinantly expressed proteins, but many proteins and complexes are difficult to obtain by this approach. We investigated native source proteins for high-throughput protein crystallography applications. The Escherichia coli proteome was fractionated, purified, crystallized, and structurally characterized. Macro-scale fermentation and fractionation were used to subdivide the soluble proteome into 408 unique fractions of which 295 fractions yielded crystals in microfluidic crystallization chips. Of the 295 crystals, 152 were selected for optimization, diffraction screening, and data collection. Twenty-three structures were determined, four of which were novel. This study demonstrates the utility of native source proteins for high-throughput crystallography.

  10. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    Science.gov (United States)

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can extract cohorts rapidly. We present two advances in the High Throughput Phenotyping NLP system that support the aim of truly high throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially-processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  11. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    DEFF Research Database (Denmark)

    Csiszar, Susan A.; Ernstoff, Alexi; Fantke, Peter

    2016-01-01

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or population per mass...... intakes were associated with body lotion. Bioactive doses derived from high-throughput in vitro toxicity data were combined with the estimated PiFs to demonstrate an approach to estimate bioactive equivalent chemical content and to screen chemicals for risk....
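
    Given the product intake fraction defined above, a screening-level intake estimate reduces to multiplying the PiF by the mass of chemical contained in the product used, as in the minimal sketch below. The product scenario, numbers, and helper function are hypothetical and are not taken from the study.

```python
def chemical_intake_mg(pif, product_mass_g, chemical_weight_fraction):
    """Estimated intake (mg) = PiF * mass of chemical applied in the product.

    pif: product intake fraction (mass taken in per mass of chemical in the product)
    product_mass_g: mass of product applied per use (g)
    chemical_weight_fraction: fraction of the product mass that is the chemical
    """
    chemical_mass_mg = product_mass_g * chemical_weight_fraction * 1000.0
    return pif * chemical_mass_mg

if __name__ == "__main__":
    # Hypothetical body-lotion scenario: 5 g applied, 0.5% chemical, PiF = 0.02.
    intake = chemical_intake_mg(pif=0.02, product_mass_g=5.0, chemical_weight_fraction=0.005)
    print(f"{intake:.3f} mg taken in per use")
```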

  12. HT-Paxos: High Throughput State-Machine Replication Protocol for Large Clustered Data Centers

    Directory of Open Access Journals (Sweden)

    Vinit Kumar

    2015-01-01

    Full Text Available Paxos is a prominent theory of state-machine replication. Recent data intensive systems that implement state-machine replication generally require high throughput. Earlier versions of Paxos, such as classical Paxos, fast Paxos, and generalized Paxos, focus mainly on fault tolerance and latency but are lacking in throughput and scalability. A major reason for this is the heavyweight leader. By offloading the leader, we can further increase the throughput of the system. Ring Paxos, Multiring Paxos, and S-Paxos are a few prominent attempts in this direction for clustered data centers. In this paper, we propose HT-Paxos, a variant of Paxos that is well suited to large clustered data centers. HT-Paxos offloads the leader significantly further and hence increases the throughput and scalability of the system, while at the same time providing reasonably low latency and response time among high-throughput state-machine replication protocols.

  13. High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field.

    Science.gov (United States)

    Shakoor, Nadia; Lee, Scott; Mockler, Todd C

    2017-08-01

    Effective implementation of technology that facilitates accurate and high-throughput screening of thousands of field-grown lines is critical for accelerating crop improvement and breeding strategies for higher yield and disease tolerance. Progress in the development of field-based high throughput phenotyping methods has advanced considerably in the last 10 years through technological progress in sensor development and high-performance computing. Here, we review recent advances in high throughput field phenotyping technologies designed to inform the genetics of quantitative traits, including crop yield and disease tolerance. Successful application of phenotyping platforms to advance crop breeding and identify and monitor disease requires: (1) high resolution of imaging and environmental sensors; (2) quality data products that facilitate computer vision, machine learning and GIS; (3) capacity infrastructure for data management and analysis; and (4) automated environmental data collection. Accelerated breeding for agriculturally relevant crop traits is key to the development of improved varieties and is critically dependent on high-resolution, high-throughput field-scale phenotyping technologies that can efficiently discriminate better performing lines within a larger population and across multiple environments. Copyright © 2017. Published by Elsevier Ltd.

  14. Liquid Phase Multiplex High-Throughput Screening of Metagenomic Libraries Using p-Nitrophenyl-Linked Substrates for Accessory Lignocellulosic Enzymes.

    Science.gov (United States)

    Smart, Mariette; Huddy, Robert J; Cowan, Don A; Trindade, Marla

    2017-01-01

    To access the genetic potential contained in large metagenomic libraries, suitable high-throughput functional screening methods are required. Here we describe a high-throughput screening approach which enables the rapid identification of metagenomic library clones expressing functional accessory lignocellulosic enzymes. The high-throughput nature of this method hinges on the multiplexing of both the E. coli metagenomic library clones and the colorimetric p-nitrophenyl linked substrates which allows for the simultaneous screening for β-glucosidases, β-xylosidases, and α-L-arabinofuranosidases. This method is readily automated and compatible with high-throughput robotic screening systems.

  15. INSIDIA: A FIJI Macro Delivering High-Throughput and High-Content Spheroid Invasion Analysis.

    Science.gov (United States)

    Moriconi, Chiara; Palmieri, Valentina; Di Santo, Riccardo; Tornillo, Giusy; Papi, Massimiliano; Pilkington, Geoff; De Spirito, Marco; Gumbleton, Mark

    2017-10-01

    Time-series image capture of in vitro 3D spheroidal cancer models embedded within an extracellular matrix affords examination of spheroid growth and cancer cell invasion. However, a customizable, comprehensive and open source solution for the quantitative analysis of such spheroid images is lacking. Here, the authors describe INSIDIA (INvasion SpheroID ImageJ Analysis), an open-source macro implemented as a customizable software algorithm running on the FIJI platform, that enables high-throughput high-content quantitative analysis of spheroid images (both bright-field gray and fluorescent images) with the output of a range of parameters defining the spheroid "tumor" core and its invasive characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing high-throughput computing applications, common in High Energy Physics, work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  17. High pressure inertial focusing for separating and concentrating bacteria at high throughput

    Science.gov (United States)

    Cruz, J.; Hooshmand Zadeh, S.; Graells, T.; Andersson, M.; Malmström, J.; Wu, Z. G.; Hjort, K.

    2017-08-01

    Inertial focusing is a promising microfluidic technology for concentration and separation of particles by size. However, the required pressure increases strongly as particle size decreases. Theory and experimental results for larger particles were used to scale down the phenomenon and find the conditions that focus 1 µm particles. High pressure experiments in robust glass chips were used to demonstrate the alignment. We show how the technique works for 1 µm spherical polystyrene particles and for Escherichia coli, without harming the bacteria at 50 µl min-1. The potential to focus bacteria, simplicity of use and high throughput make this technology interesting for healthcare applications, where concentration and purification of a sample may be required as an initial step.

  18. Microfabrication of a High-Throughput Nanochannel Delivery/Filtration System

    Science.gov (United States)

    Ferrari, Mauro; Liu, Xuewu; Grattoni, Alessandro; Fine, Daniel; Hosali, Sharath; Goodall, Randi; Medema, Ryan; Hudson, Lee

    2011-01-01

    A microfabrication process is proposed to produce a nanopore membrane for continuous passive drug release to maintain constant drug concentrations in the patient's blood throughout the delivery period. Based on silicon microfabrication technology, the dimensions of the nanochannel area, as well as microchannel area, can be precisely controlled, thus providing a steady, constant drug release rate within an extended time period. The multilayered nanochannel structures extend the limit of release rate range of a single-layer nanochannel system, and allow a wide range of pre-defined porosity to achieve any arbitrary drug release rate using any preferred nanochannel size. This membrane system could also be applied to molecular filtration or isolation. In this case, the nanochannel length can be reduced to the nanofabrication limit, i.e., 10s of nm. The nanochannel delivery system membrane is composed of a sandwich of a thin top layer, the horizontal nanochannels, and a thicker bottom wafer. The thin top layer houses an array of microchannels that offers the inlet port for diffusing molecules. It also works as a lid for the nanochannels by providing the channels a top surface. The nanochannels are fabricated by a sacrificial layer technique that obtains smooth surfaces and precisely controlled dimensions. The structure of this nanopore membrane is optimized to yield high mechanical strength and high throughput.

  19. High-Throughput Separation of White Blood Cells From Whole Blood Using Inertial Microfluidics.

    Science.gov (United States)

    Zhang, Jun; Yuan, Dan; Sluyter, Ronald; Yan, Sheng; Zhao, Qianbin; Xia, Huanming; Tan, Say Hwa; Nguyen, Nam-Trung; Li, Weihua

    2017-08-29

    White blood cells (WBCs) constitute only about 0.1% of human blood cells, yet contain rich information about the immune status of the body; thus, separation of WBCs from the whole blood is an indispensable and critical sample preparation step in many scientific, clinical, and diagnostic applications. In this paper, we developed a continuous and high-throughput microfluidic WBC separation platform utilizing the differential inertial focusing of particles in serpentine microchannels. First, separation performance of the proposed method is characterized and evaluated using polystyrene beads in the serpentine channel. The purity of 10-μm polystyrene beads is increased from 0.1% to 80.3% after two cascaded processes, with an average enrichment ratio of 28 times. Next, we investigated focusing and separation properties of Jurkat cells spiked in the blood to mimic the presence of WBCs in whole blood. Finally, separation of WBCs from human whole blood was conducted and separation purity of WBCs was measured by the flow cytometry. The results show that the purity of WBCs can be increased to 48% after two consecutive processes, with an average enrichment ratio of ten times. Meanwhile, a parallelized inertial microfluidic device was designed to provide a high processing flow rate of 288 ml/h for the diluted (×1/20) whole blood. The proposed microfluidic device can potentially work as an upstream component for blood sample preparation and analysis in the integrated microfluidic systems.

  20. Methods and devices for high-throughput dielectrophoretic concentration

    Science.gov (United States)

    Simmons, Blake A.; Cummings, Eric B.; Fiechtner, Gregory J.; Fintschenko, Yolanda; McGraw, Gregory J.; Salmi, Allen

    2010-02-23

    Disclosed herein are methods and devices for assaying and concentrating analytes in a fluid sample using dielectrophoresis. As disclosed, the methods and devices utilize substrates having a plurality of pores through which analytes can be selectively prevented from passing, or inhibited, on application of an appropriate electric field waveform. The pores of the substrate produce nonuniform electric field having local extrema located near the pores. These nonuniform fields drive dielectrophoresis, which produces the inhibition. Arrangements of electrodes and porous substrates support continuous, bulk, multi-dimensional, and staged selective concentration.

  1. Disubstituted 1-aryl-4-aminopiperidine library synthesis using computational drug design and high-throughput batch and flow technologies.

    Science.gov (United States)

    Bryan, Marian C; Hein, Christopher D; Gao, Hua; Xia, Xiaoyang; Eastwood, Heather; Bruenner, Bernd A; Louie, Steven W; Doherty, Elizabeth M

    2013-09-09

    A platform that incorporates computational library design, parallel solution-phase synthesis, continuous flow hydrogenation, and automated high throughput purification and reformatting technologies was applied to the production of a 120-member library of 1-aryl-4-aminopiperidine analogues for drug discovery screening. The application described herein demonstrates the advantages of computational library design coupled with a flexible, modular approach to library synthesis. The enabling technologies described can be readily adopted by the traditional medicinal chemist without extensive training and lengthy process development times.

  2. High-throughput time-stretch microscopy with morphological and chemical specificity

    Science.gov (United States)

    Lei, Cheng; Ugawa, Masashi; Nozawa, Taisuke; Ideguchi, Takuro; Di Carlo, Dino; Ota, Sadao; Ozeki, Yasuyuki; Goda, Keisuke

    2016-03-01

    Particle analysis is an effective method in analytical chemistry for sizing and counting microparticles such as emulsions, colloids, and biological cells. However, conventional methods for particle analysis, which fall into two extreme categories, have severe limitations. Sieving and Coulter counting are capable of analyzing particles with high throughput, but due to their lack of detailed information such as morphological and chemical characteristics, they can only provide statistical results with low specificity. On the other hand, CCD or CMOS image sensors can be used to analyze individual microparticles with high content, but due to their slow charge download, the frame rate (hence, the throughput) is significantly limited. Here by integrating a time-stretch optical microscope with a three-color fluorescent analyzer on top of an inertial-focusing microfluidic device, we demonstrate an optofluidic particle analyzer with a sub-micrometer spatial resolution down to 780 nm and a high throughput of 10,000 particles/s. In addition to its morphological specificity, the particle analyzer provides chemical specificity to identify chemical expressions of particles via fluorescence detection. Our results indicate that we can identify different species of microparticles with high specificity without sacrificing throughput. Our method holds promise for high-precision statistical particle analysis in chemical industry and pharmaceutics.

  3. Predicting gene function through systematic analysis and quality assessment of high-throughput data.

    Science.gov (United States)

    Kemmeren, Patrick; Kockelkorn, Thessa T J P; Bijma, Theo; Donders, Rogier; Holstege, Frank C P

    2005-04-15

    Determining gene function is an important challenge arising from the availability of whole genome sequences. Until recently, approaches based on sequence homology were the only high-throughput method for predicting gene function. Use of high-throughput generated experimental data sets for determining gene function has been limited for several reasons. Here a new approach is presented for integration of high-throughput data sets, leading to prediction of function based on relationships supported by multiple types and sources of data. This is achieved with a database containing 125 different high-throughput data sets describing phenotypes, cellular localizations, protein interactions and mRNA expression levels from Saccharomyces cerevisiae, using a bit-vector representation and information content-based ranking. The approach takes characteristic and qualitative differences between the data sets into account, is highly flexible, efficient and scalable. Database queries result in predictions for 543 uncharacterized genes, based on multiple functional relationships each supported by at least three types of experimental data. Some of these are experimentally verified, further demonstrating their reliability. The results also generate insights into the relative merits of different data types and provide a coherent framework for functional genomic data mining. Availability: free over the Internet. Contact: f.c.p.holstege@med.uu.nl; http://www.genomics.med.uu.nl/pub/pk/comb_gen_network.
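
A minimal sketch of the kind of bit-vector / information-content scoring the abstract describes, with invented gene names and features (this is an illustration only, not the authors' database or code):

```python
# Illustrative sketch: genes represented as sets of binary high-throughput
# features, with shared features weighted by information content
# (-log2 of how often the feature occurs across all genes).
from collections import defaultdict
import math

# toy data: gene -> set of observed features (made-up example values)
gene_features = {
    "YFG1": {"loc:nucleus", "phen:slow_growth", "ppi:cluster7"},
    "YFG2": {"loc:nucleus", "phen:slow_growth"},
    "YFG3": {"loc:mitochondrion", "ppi:cluster7"},
}

# information content of each feature: rarer features are more informative
n_genes = len(gene_features)
feature_counts = defaultdict(int)
for feats in gene_features.values():
    for f in feats:
        feature_counts[f] += 1
info = {f: -math.log2(c / n_genes) for f, c in feature_counts.items()}

def ranked_partners(query):
    """Rank other genes by summed information content of shared features."""
    scores = []
    for gene, feats in gene_features.items():
        if gene == query:
            continue
        shared = gene_features[query] & feats
        scores.append((sum(info[f] for f in shared), gene, sorted(shared)))
    return sorted(scores, reverse=True)

if __name__ == "__main__":
    for score, gene, shared in ranked_partners("YFG1"):
        print(f"{gene}: score={score:.2f}, shared={shared}")
```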

  4. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    Science.gov (United States)

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture can allow precise control of cell microenvironments, and provide higher throughput by orders of magnitude. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  5. Lights, camera, action: high-throughput plant phenotyping is ready for a close-up.

    Science.gov (United States)

    Fahlgren, Noah; Gehan, Malia A; Baxter, Ivan

    2015-04-01

    Anticipated population growth, shifting demographics, and environmental variability over the next century are expected to threaten global food security. In the face of these challenges, crop yield for food and fuel must be maintained and improved using fewer input resources. In recent years, genetic tools for profiling crop germplasm have benefited from rapid advances in DNA sequencing, and now similar advances are needed to improve the throughput of plant phenotyping. We highlight recent developments in high-throughput plant phenotyping using robotic-assisted imaging platforms and computer vision-assisted analysis tools. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Generating Mouse Models Using Zygote Electroporation of Nucleases (ZEN) Technology with High Efficiency and Throughput.

    Science.gov (United States)

    Wang, Wenbo; Zhang, Yingfan; Wang, Haoyi

    2017-01-01

    Mouse models with genetic modifications are widely used in biology and biomedical research. Although the application of the CRISPR-Cas9 system greatly accelerated the process of generating genetically modified mice, the delivery method, which depends on manual injection of the components into the embryos, remains a bottleneck, as it is laborious, low throughput, and technically demanding. To overcome this limitation, we invented and optimized the ZEN (Zygote electroporation of nucleases) technology to deliver CRISPR-Cas9 reagents via electroporation. Using ZEN, we were able to generate genetically modified mouse models with high efficiency and throughput. Here, we describe the protocol in great detail.

  7. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis

    Directory of Open Access Journals (Sweden)

    Na Wen

    2016-07-01

    This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.

  8. NucTools: analysis of chromatin feature occupancy profiles from high-throughput sequencing data.

    Science.gov (United States)

    Vainshtein, Yevhen; Rippe, Karsten; Teif, Vladimir B

    2017-02-14

    Biomedical applications of high-throughput sequencing methods generate a vast amount of data in which numerous chromatin features are mapped along the genome. The results are frequently analysed by creating binary data sets that link the presence/absence of a given feature to specific genomic loci. However, the nucleosome occupancy or chromatin accessibility landscape is essentially continuous. It is currently a challenge in the field to cope with continuous distributions of deep sequencing chromatin readouts and to integrate the different types of discrete chromatin features to reveal linkages between them. Here we introduce the NucTools suite of Perl scripts as well as MATLAB- and R-based visualization programs for a nucleosome-centred downstream analysis of deep sequencing data. NucTools accounts for the continuous distribution of nucleosome occupancy. It allows calculations of nucleosome occupancy profiles averaged over several replicates, comparisons of nucleosome occupancy landscapes between different experimental conditions, and the estimation of the changes of integral chromatin properties such as the nucleosome repeat length. Furthermore, NucTools facilitates the annotation of nucleosome occupancy with other chromatin features like binding of transcription factors or architectural proteins, and epigenetic marks like histone modifications or DNA methylation. The applications of NucTools are demonstrated for the comparison of several datasets for nucleosome occupancy in mouse embryonic stem cells (ESCs) and mouse embryonic fibroblasts (MEFs). The typical workflows of data processing and integrative analysis with NucTools reveal information on the interplay of nucleosome positioning with other features such as for example binding of a transcription factor CTCF, regions with stable and unstable nucleosomes, and domains of large organized chromatin K9me2 modifications (LOCKs). As potential limitations and problems we discuss how inter-replicate variability of
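
NucTools itself is a suite of Perl scripts with MATLAB- and R-based visualization; the short numpy sketch below only illustrates one core step mentioned in the abstract, averaging continuous occupancy profiles over replicates and flagging bins with high inter-replicate variability (the cv_cutoff value is an arbitrary example):

```python
# Minimal numpy illustration (not NucTools code): average continuous
# nucleosome-occupancy profiles over replicates and flag bins whose
# inter-replicate coefficient of variation exceeds a chosen cutoff.
import numpy as np

def average_occupancy(replicates, cv_cutoff=0.5):
    """replicates: 2D array, shape (n_replicates, n_bins) of occupancy values."""
    reps = np.asarray(replicates, dtype=float)
    mean = reps.mean(axis=0)
    std = reps.std(axis=0, ddof=1)
    cv = np.divide(std, mean, out=np.zeros_like(std), where=mean > 0)
    stable = cv <= cv_cutoff          # bins with reproducible occupancy
    return mean, cv, stable

if __name__ == "__main__":
    # toy profiles for three replicates over ten genomic bins (made-up numbers)
    toy = [[10, 12, 30, 28, 5, 6, 40, 41, 2, 25],
           [11, 13, 29, 30, 4, 7, 38, 43, 9, 3],
           [ 9, 12, 31, 27, 6, 5, 42, 40, 1, 30]]
    mean, cv, stable = average_occupancy(toy)
    print(np.round(mean, 1), np.round(cv, 2), stable, sep="\n")
```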

  9. High throughput computing: a solution for scientific analysis

    Science.gov (United States)

    O'Donnell, M.

    2011-01-01

    Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

  10. Holographic memory module with ultra-high capacity and throughput

    Energy Technology Data Exchange (ETDEWEB)

    Vladimir A. Markov, Ph.D.

    2000-06-04

    High capacity, high transfer rate, random access memory systems are needed to archive and distribute the tremendous volume of digital information being generated, for example, human genome mapping data and online libraries. The development of multi-gigabit per second networks underscores the need for next-generation archival memory systems. During Phase I we conducted the theoretical analysis and accomplished experimental tests that validated the key aspects of the ultra-high density holographic data storage module with high transfer rate. We also inspected the secure nature of the encoding method and estimated the performance of a full-scale system. Two basic architectures were considered: a reversible, compact solid-state configuration with limited capacity, and a very-large-capacity write-once-read-many memory system.

  11. Multidimensional NMR approaches towards highly resolved, sensitive and high-throughput quantitative metabolomics.

    Science.gov (United States)

    Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick

    2017-02-01

    Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis

  13. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  14. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly

  15. Microfluidic Impedance Flow Cytometry Enabling High-Throughput Single-Cell Electrical Property Characterization

    OpenAIRE

    Jian Chen; Chengcheng Xue; Yang Zhao; Deyong Chen; Min-Hsien Wu; Junbo Wang

    2015-01-01

    This article reviews recent developments in microfluidic impedance flow cytometry for high-throughput electrical property characterization of single cells. Four major perspectives of microfluidic impedance flow cytometry for single-cell characterization are included in this review: (1) early developments of microfluidic impedance flow cytometry for single-cell electrical property characterization; (2) microfluidic impedance flow cytometry with enhanced sensitivity; (3) microfluidic impedance ...

  16. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.

  17. The Power of High-Throughput Experimentation in Homogeneous Catalysis Research for Fine Chemicals

    NARCIS (Netherlands)

    Vries, Johannes G. de; Vries, André H.M. de

    2003-01-01

    The use of high-throughput experimentation (HTE) in homogeneous catalysis research for the production of fine chemicals is an important breakthrough. Whereas in the past stoichiometric chemistry was often preferred because of time-to-market constraints, HTE allows catalytic solutions to be found

  18. Functional characterisation of human glycine receptors in a fluorescence-based high throughput screening assay

    DEFF Research Database (Denmark)

    Jensen, Anders A.

    2005-01-01

    receptors in this assay were found to be in good agreement with those from electrophysiology studies of the receptors expressed in Xenopus oocytes or mammalian cell lines. Hence, this high throughput screening assay will be of great use in future pharmacological studies of glycine receptors, particular...

  19. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    Science.gov (United States)

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...

  20. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based

  1. Roche genome sequencer FLX based high-throughput sequencing of ancient DNA

    DEFF Research Database (Denmark)

    Alquezar-Planas, David E; Fordyce, Sarah Louise

    2012-01-01

    Since the development of so-called "next generation" high-throughput sequencing in 2005, this technology has been applied to a variety of fields. Such applications include disease studies, evolutionary investigations, and ancient DNA. Each application requires a specialized protocol to ensure tha...

  2. 20170913 - Retrofit Strategies for Incorporating Xenobiotic Metabolism into High Throughput Screening Assays (EMGS)

    Science.gov (United States)

    The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracteri...

  3. PLASMA PROTEIN PROFILING AS A HIGH THROUGHPUT TOOL FOR CHEMICAL SCREENING USING A SMALL FISH MODEL

    Science.gov (United States)

    Hudson, R. Tod, Michael J. Hemmer, Kimberly A. Salinas, Sherry S. Wilkinson, James Watts, James T. Winstead, Peggy S. Harris, Amy Kirkpatrick and Calvin C. Walker. In press. Plasma Protein Profiling as a High Throughput Tool for Chemical Screening Using a Small Fish Model (Abstra...

  4. HTPheno: an image analysis pipeline for high-throughput plant phenotyping.

    Science.gov (United States)

    Hartmann, Anja; Czauderna, Tobias; Hoffmann, Roberto; Stein, Nils; Schreiber, Falk

    2011-05-12

    In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.
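
HTPheno is an ImageJ plugin; the sketch below is only meant to make the reported phenotypical parameters concrete, showing how height, width and projected shoot area can be read off a segmented (binary) plant mask. The mask, scale and toy silhouette are assumptions for illustration:

```python
# Illustration of the phenotypical parameters computed per image:
# bounding-box height and width plus projected shoot area of a binary mask.
import numpy as np

def shoot_parameters(mask, mm_per_pixel=1.0):
    """mask: 2D boolean array, True where pixels belong to the plant."""
    mask = np.asarray(mask, dtype=bool)
    rows = np.flatnonzero(mask.any(axis=1))   # image rows containing plant
    cols = np.flatnonzero(mask.any(axis=0))   # image columns containing plant
    if rows.size == 0:
        return {"height": 0.0, "width": 0.0, "area": 0.0}
    return {
        "height": (rows[-1] - rows[0] + 1) * mm_per_pixel,
        "width": (cols[-1] - cols[0] + 1) * mm_per_pixel,
        "area": mask.sum() * mm_per_pixel ** 2,   # projected shoot area
    }

if __name__ == "__main__":
    toy = np.zeros((6, 8), dtype=bool)
    toy[1:5, 2:5] = True                      # a made-up plant silhouette
    print(shoot_parameters(toy, mm_per_pixel=0.5))
```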

  5. HTPheno: An image analysis pipeline for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Stein Nils

    2011-05-01

    Background: In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. Results: This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. Conclusions: HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.

  6. Investigation of non-halogenated solvent mixtures for high throughput fabrication of polymer:fullerene solar cells

    NARCIS (Netherlands)

    Schmidt-Hansberg, B.; Sanyal, M.; Grossiord, N.; Galagan, Y.O.; Baunach, M.; Klein, M.F.G.; Colsmann, A.; Scharfer, P.; Lemmer, U.; Dosch, H.; Michels, J.J; Barrena, E.; Schabel, W.

    2012-01-01

    The rapidly increasing power conversion efficiencies of organic solar cells are an important prerequisite towards low-cost photovoltaics fabricated in high throughput. In this work we suggest indane as a non-halogenated replacement for the commonly used halogenated solvent o-dichlorobenzene. Indane

  7. ESSENTIALS: Software for Rapid Analysis of High Throughput Transposon Insertion Sequencing Data.

    NARCIS (Netherlands)

    Zomer, A.L.; Burghout, P.J.; Bootsma, H.J.; Hermans, P.W.M.; Hijum, S.A.F.T. van

    2012-01-01

    High-throughput analysis of genome-wide random transposon mutant libraries is a powerful tool for (conditional) essential gene discovery. Recently, several next-generation sequencing approaches, e.g. Tn-seq/INseq, HITS and TraDIS, have been developed that accurately map the site of transposon

  8. Development of a thyroperoxidase inhibition assay for high-throughput screening

    Science.gov (United States)

    High-throughput screening (HTPS) assays to detect inhibitors of thyroperoxidase (TPO), the enzymatic catalyst for thyroid hormone (TH) synthesis, are not currently available. Herein we describe the development of a HTPS TPO inhibition assay. Rat thyroid microsomes and a fluores...

  9. Evaluation of Simple and Inexpensive High-Throughput Methods for Phytic Acid Determination

    DEFF Research Database (Denmark)

    Raboy, Victor; Johnson, Amy; Bilyeu, Kristin

    2017-01-01

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal Biochem...

  10. High-throughput genotoxicity assay identifies antioxidants as inducers of DNA damage response and cell death

    Science.gov (United States)

    Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...

  11. Virtual high throughput screening and design of 14α-lanosterol ...

    African Journals Online (AJOL)

    2009-07-06

    Jul 6, 2009 ... Hildebert B. Maurice, Esther Tuarira and Kennedy Mwambete. School of Pharmaceutical Sciences, Institute of Allied Health Sciences, Muhimbili University of ... high throughput screening (Guardiola-Diaz et al., 2001). It is therefore logical to think that developing inhibitors against the mycobacterial ...

  12. Increasing ecological inference from high throughput sequencing of fungi in the environment through a tagging approach

    Science.gov (United States)

    D. Lee Taylor; Michael G. Booth; Jack W. McFarland; Ian C. Herriott; Niall J. Lennon; Chad Nusbaum; Thomas G. Marr

    2008-01-01

    High throughput sequencing methods are widely used in analyses of microbial diversity but are generally applied to small numbers of samples, which precludes characterization of patterns of microbial diversity across space and time. We have designed a primer-tagging approach that allows pooling and subsequent sorting of numerous samples, which is directed to...

  13. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    CERN Document Server

    Blume, H; Bourenkov, G P; Kosciesza, D; Bartunik, H D

    2001-01-01

    The wiggler beamline BW6 at DORIS has been optimized for de-novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed which provide adequate conditions for high-throughput applications, e.g., in structural genomics.

  14. A high throughput platform for understanding the influence of excipients on physical and chemical stability

    DEFF Research Database (Denmark)

    Raijada, Dhara; Cornett, Claus; Rantanen, Jukka

    2013-01-01

    The present study puts forward a miniaturized high-throughput platform to understand the influence of excipient selection and processing on the stability of a given drug compound. Four model drugs (sodium naproxen, theophylline, amlodipine besylate and nitrofurantoin) and ten different excipients were...

  15. A High-Throughput MALDI-TOF Mass Spectrometry-Based Assay of Chitinase Activity

    Science.gov (United States)

    A high-throughput MALDI-TOF mass spectrometric assay is described for assay of chitolytic enzyme activity. The assay uses unmodified chitin oligosaccharide substrates, and is readily achievable on a microliter scale (2 µL total volume, containing 2 µg of substrate and 1 ng of protein). The speed a...

  16. High-throughput siRNA screening applied to the ubiquitin-proteasome system

    DEFF Research Database (Denmark)

    Poulsen, Esben Guldahl; Nielsen, Sofie V.; Pietras, Elin J.

    2016-01-01

    that are not genetically tractable as, for instance, a yeast model system. Here, we describe a method relying on high-throughput cellular imaging of cells transfected with a targeted siRNA library to screen for components involved in degradation of a protein of interest. This method is a rapid and cost-effective tool...

  17. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by mircofluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  18. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free...

  19. High throughput generated micro-aggregates of chondrocytes stimulate cartilage formation in vitro and in vivo

    NARCIS (Netherlands)

    Moreira Teixeira, Liliana; Leijten, Jeroen Christianus Hermanus; Sobral, J.; Jin, R.; van Apeldoorn, Aart A.; Feijen, Jan; van Blitterswijk, Clemens; Dijkstra, Pieter J.; Karperien, Hermanus Bernardus Johannes

    2012-01-01

    Cell-based cartilage repair strategies such as matrix-induced autologous chondrocyte implantation (MACI) could be improved by enhancing cell performance. We hypothesised that micro-aggregates of chondrocytes generated in high-throughput prior to implantation in a defect could stimulate cartilaginous

  20. DNA from buccal swabs suitable for high-throughput SNP multiplex analysis.

    Science.gov (United States)

    McMichael, Gai L; Gibson, Catherine S; O'Callaghan, Michael E; Goldwater, Paul N; Dekker, Gustaaf A; Haan, Eric A; MacLennan, Alastair H

    2009-12-01

    We sought a convenient and reliable method for collection of genetic material that is inexpensive and noninvasive and suitable for self-collection and mailing and a compatible, commercial DNA extraction protocol to meet quantitative and qualitative requirements for high-throughput single nucleotide polymorphism (SNP) multiplex analysis on an automated platform. Buccal swabs were collected from 34 individuals as part of a pilot study to test commercially available buccal swabs and DNA extraction kits. DNA was quantified on a spectrofluorometer with Picogreen dsDNA prior to testing the DNA integrity with predesigned SNP multiplex assays. Based on the pilot study results, the Catch-All swabs and Isohelix buccal DNA isolation kit were selected for our high-throughput application and extended to a further 1140 samples as part of a large cohort study. The average DNA yield in the pilot study (n=34) was 1.94 µg ± 0.54 with a 94% genotyping pass rate. For the high-throughput application (n=1140), the average DNA yield was 2.44 µg ± 1.74 with a ≥93% genotyping pass rate. The Catch-All buccal swabs are a convenient and cost-effective alternative to blood sampling. Combined with the Isohelix buccal DNA isolation kit, they provided DNA of sufficient quantity and quality for high-throughput SNP multiplex analysis.

  1. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  2. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an

  3. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45...

  4. High throughput system for magnetic manipulation of cells, polymers, and biomaterials

    Science.gov (United States)

    Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.

    2008-01-01

    In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of S2R+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
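
A small sketch of how a power-law force calibration of the form F ∼ r^k (the abstract reports k = −2.7 ± 0.1) can be recovered by a linear fit in log-log space; the distance and force values below are synthetic, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
r = np.linspace(5.0, 50.0, 20)                            # bead-to-pole distance (arbitrary units)
F = 2.0e3 * r ** -2.7 * rng.lognormal(0.0, 0.05, r.size)  # synthetic noisy force values

# a power law F = A * r**k becomes a straight line in log-log space
k, logA = np.polyfit(np.log(r), np.log(F), 1)
print(f"fitted exponent k = {k:.2f} (reported value: -2.7 +/- 0.1)")
print(f"fitted prefactor A = {np.exp(logA):.0f}")
```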

  5. The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping

    Science.gov (United States)

    Wei, Weiqi

    2012-01-01

    Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…

  6. A high-throughput method for GMO multi-detection using a microfluidic dynamic array

    NARCIS (Netherlands)

    Brod, F.C.A.; Dijk, van J.P.; Voorhuijzen, M.M.; Dinon, A.Z.; Guimarães, L.H.S.; Scholtens, I.M.J.; Arisi, A.C.M.; Kok, E.J.

    2014-01-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organisms (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the

  7. High-throughput assessment of context-dependent effects of chromatin proteins

    NARCIS (Netherlands)

    Brueckner, L. (Laura); Van Arensbergen, J. (Joris); Akhtar, W. (Waseem); L. Pagie (Ludo); B. van Steensel (Bas)

    2016-01-01

    Background: Chromatin proteins control gene activity in a concerted manner. We developed a high-throughput assay to study the effects of the local chromatin environment on the regulatory activity of a protein of interest. The assay combines a previously reported multiplexing strategy

  8. How to design a good photoresist solvent package using solubility parameters and high-throughput research

    Science.gov (United States)

    Tate, Michael P.; Cutler, Charlotte; Sakillaris, Mike; Kaufman, Michael; Estelle, Thomas; Mohler, Carol; Tucker, Chris; Thackeray, Jim

    2014-03-01

    Understanding fundamental properties of photoresists and how interactions between photoresist components affect performance targets are crucial to the continued success of photoresists. More specifically, polymer solubility is critical to the overall performance capability of the photoresist formulation. While several theories describe polymer solvent solubility, the most common industrially applied method is Hansen's solubility parameters. Hansen's method, based on regular solution theory, describes a solute's ability to dissolve in a solvent or solvent blend using four physical properties determined experimentally through regression of solubility data in many known solvents. The four physical parameters are dispersion, polarity, hydrogen bonding, and radius of interaction. Using these parameters a relative cohesive energy difference (RED), which describes a polymer's likelihood to dissolve in a given solvent blend, may be calculated. Leveraging a high throughput workflow to prepare and analyze the thousands of samples necessary to calculate the Hansen's solubility parameters from many different methacrylate-based polymers, we compare the physical descriptors to reveal a large range of polarities and hydrogen bonding. Further, we find that Hansen's model correctly predicts the soluble/insoluble state of 3-component solvent blends where the dispersion, polar, hydrogen-bonding, and radius of interaction values were determined through regression of experimental values. These modeling capabilities have allowed for optimization of the photoresist solubility from initial blending through application, providing valuable insights into the nature of the photoresist.
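
A sketch of the relative energy difference (RED) calculation referred to in the abstract, using the standard Hansen distance; the numerical Hansen parameters below are placeholders for illustration, not values from the study:

```python
# Hansen distance Ra^2 = 4(dD1-dD2)^2 + (dP1-dP2)^2 + (dH1-dH2)^2 and
# RED = Ra / R0; RED < 1 predicts the polymer dissolves in the solvent blend.
def blend_parameters(components):
    """components: list of (volume_fraction, (dD, dP, dH)) for a solvent blend.
    Blend parameters are volume-fraction-weighted averages."""
    total = sum(phi for phi, _ in components)
    return tuple(sum(phi * d[i] for phi, d in components) / total for i in range(3))

def red(polymer, solvent, radius):
    dD1, dP1, dH1 = polymer
    dD2, dP2, dH2 = solvent
    ra = (4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2) ** 0.5
    return ra / radius

if __name__ == "__main__":
    polymer = (17.0, 9.0, 6.0)        # hypothetical (dD, dP, dH), MPa^0.5
    blend = blend_parameters([(0.7, (16.8, 5.7, 9.8)),    # hypothetical solvent A
                              (0.3, (15.8, 6.1, 12.3))])  # hypothetical solvent B
    print(f"RED = {red(polymer, blend, radius=8.0):.2f}  (<1 => predicted soluble)")
```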

  9. High-throughput nucleotide sequence analysis of diverse bacterial communities in leachates of decomposing pig carcasses

    Directory of Open Access Journals (Sweden)

    Seung Hak Yang

    2015-09-01

    The leachate generated by the decomposition of an animal carcass has been implicated as an environmental contaminant surrounding the burial site. High-throughput nucleotide sequencing was conducted to investigate the bacterial communities in leachates from the decomposition of pig carcasses. We acquired 51,230 reads from six different samples (1, 2, 3, 4, 6 and 14 week-old carcasses) and found that sequences representing the phylum Firmicutes predominated. The diversity of bacterial 16S rRNA gene sequences in the leachate was the highest at 6 weeks, in contrast to those at 2 and 14 weeks. The relative abundance of Firmicutes was reduced, while the proportion of Bacteroidetes and Proteobacteria increased from 3–6 weeks. The representation of phyla was restored after 14 weeks. However, the community structures between the samples taken at 1–2 and 14 weeks differed at the bacterial classification level. The trend in pH was similar to the changes seen in bacterial communities, indicating that the pH of the leachate could be related to the shift in the microbial community. The results indicate that the composition of bacterial communities in leachates of decomposing pig carcasses shifted continuously during the study period and might be influenced by the burial site.
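
To make the reported community summaries concrete, the toy sketch below computes per-phylum relative abundance and a Shannon diversity index from read counts; the counts are invented, and the study's own diversity analysis may differ:

```python
# Relative abundance and Shannon H' from taxon read counts of one sample.
import math

def relative_abundance(counts):
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

def shannon_index(counts):
    return -sum(p * math.log(p) for p in relative_abundance(counts).values() if p > 0)

if __name__ == "__main__":
    week6 = {"Firmicutes": 18000, "Bacteroidetes": 9000,
             "Proteobacteria": 7000, "Actinobacteria": 1200}   # toy counts
    for taxon, p in relative_abundance(week6).items():
        print(f"{taxon:15s} {p:.1%}")
    print(f"Shannon H' = {shannon_index(week6):.2f}")
```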

  10. High-Throughput Screen of Natural Product Libraries for Hsp90 Inhibitors

    Directory of Open Access Journals (Sweden)

    Jason Davenport

    2014-02-01

    Hsp90 has become the target of intensive investigation, as inhibition of its function has the ability to simultaneously incapacitate proteins that function in pathways that represent the six hallmarks of cancer. While a number of Hsp90 inhibitors have made it into clinical trials, a number of shortcomings have been noted, such that the search continues for novel Hsp90 inhibitors with superior pharmacological properties. To identify new potential Hsp90 inhibitors, we have utilized a high-throughput assay based on measuring Hsp90-dependent refolding of thermally denatured luciferase to screen natural compound libraries. Over 4,000 compounds were screened, yielding over 100 hits. Data mining of the literature indicated that 51 compounds had physiological effects that Hsp90 inhibitors also exhibit, and/or the ability to downregulate the expression levels of Hsp90-dependent proteins. Of these 51 compounds, seven were previously characterized as Hsp90 inhibitors. Four compounds, anthothecol, garcinol, piplartine, and rottlerin, were further characterized, and the ability of these compounds to inhibit the refolding of luciferase, and reduce the rate of growth of MCF7 breast cancer cells, correlated with their ability to suppress the Hsp90-dependent maturation of the heme-regulated eIF2α kinase, and deplete cultured cells of Hsp90-dependent client proteins. Thus, this screen has identified an additional 44 compounds with known beneficial pharmacological properties, but with unknown mechanisms of action as possible new inhibitors of the Hsp90 chaperone machine.

  11. High throughput, non-invasive and dynamic toxicity screening on adherent cells using respiratory measurements.

    Science.gov (United States)

    Beckers, Simone; Noor, Fozia; Müller-Vieira, Ursula; Mayer, Manuela; Strigun, Alexander; Heinzle, Elmar

    2010-03-01

    A dynamic respiration assay based on luminescence decay time detection of oxygen for high throughput toxicological assessment is presented. The method uses 24-well plates (OxoDishes) read with the help of a sensor dish reader placed in a humidified CO2 incubator. Adherent primary rat hepatocytes and the human hepatic cell line Hep G2 were exposed to known toxic compounds. Dissolved oxygen concentration, a measure of respiration, was measured with an oxygen sensor optode immobilized in the centre of each well. The cells were maintained in the dishes during the assay period and can afterwards be processed for further analyses. This dynamic, non-invasive measurement allowed calculation of 50% lethal concentrations (LC50) for any incubation time point giving concentration-time-dependent responses without further manipulation or removal of the cells from the incubator. Toxicokinetic profiles are compared with the Sulforhodamine B assay, a common cytotoxicity assay. The novel assay is robust and flexible, very easy to carry out and provides continuous online respiration data reflecting dynamic toxicity responses. It can be adapted to any cell-based system and the calculated kinetics contributes to understanding of cell death mechanisms. Copyright © 2009 Elsevier Ltd. All rights reserved.
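
A sketch of the LC50 estimation step the abstract describes: fit a concentration-response curve to respiration data (normalised to untreated controls) at one incubation time point and read off the 50% effect level. The Hill model, starting guesses and data points below are illustrative assumptions, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, lc50, slope):
    """Fraction of control respiration remaining at a given concentration."""
    return 1.0 / (1.0 + (conc / lc50) ** slope)

conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300], dtype=float)   # concentration (uM)
resp = np.array([0.99, 0.97, 0.93, 0.80, 0.52, 0.22, 0.08, 0.03])  # fraction of control

(lc50, slope), _ = curve_fit(hill, conc, resp, p0=(10.0, 1.0))
print(f"LC50 ~= {lc50:.1f} uM (Hill slope {slope:.2f})")
```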

  12. Surveying the repair of ancient DNA from bones via high-throughput sequencing.

    Science.gov (United States)

    Mouttham, Nathalie; Klunk, Jennifer; Kuch, Melanie; Fourney, Ron; Poinar, Hendrik

    2015-07-01

    DNA damage in the form of abasic sites, chemically altered nucleotides, and strand fragmentation is the foremost limitation in obtaining genetic information from many ancient samples. Upon cell death, DNA continues to endure various chemical attacks such as hydrolysis and oxidation, but repair pathways found in vivo no longer operate. By incubating degraded DNA with specific enzyme combinations adopted from these pathways, it is possible to reverse some of the post-mortem nucleic acid damage prior to downstream analyses such as library preparation, targeted enrichment, and high-throughput sequencing. Here, we evaluate the performance of two available repair protocols on previously characterized DNA extracts from four mammoths. Both methods use endonucleases and glycosylases along with a DNA polymerase-ligase combination. PreCR Repair Mix increases the number of molecules converted to sequencing libraries, leading to an increase in endogenous content and a decrease in cytosine-to-thymine transitions due to cytosine deamination. However, the effects of Nelson Repair Mix on repair of DNA damage remain inconclusive.

  13. pep2pro: the high-throughput proteomics data processing, analysis and visualization tool

    Directory of Open Access Journals (Sweden)

    Matthias Hirsch-Hoffmann

    2012-06-01

    The pep2pro database was built to support effective high-throughput proteome data analysis. Its database schema allows the coherent integration of search results from different database-dependent search algorithms and filtering of the data including control for unambiguous assignment of peptides to proteins. The capacity of the pep2pro database has been exploited in data analysis of various Arabidopsis proteome datasets. The diversity of the datasets and the associated scientific questions required thorough querying of the data. This was supported by the relational format structure of the data that links all information on the sample, spectrum, search database and algorithm to peptide and protein identifications and their post-translational modifications. After publication of datasets they are made available on the pep2pro website at www.pep2pro.ethz.ch. Further, the pep2pro data analysis pipeline also handles data export to the PRIDE database (http://www.ebi.ac.uk/pride) and data retrieval by the MASCP Gator (http://gator.masc-proteomics.org/). pep2pro will continue to be used for analysis of additional datasets and as a data warehouse. The capacity of the pep2pro database for proteome data analysis has now also been made publicly available through the release of pep2pro4all, which consists of a database schema and a script that will populate the database with mass spectrometry data provided in mzIdentML format.

  14. pep2pro: the high-throughput proteomics data processing, analysis, and visualization tool.

    Science.gov (United States)

    Hirsch-Hoffmann, Matthias; Gruissem, Wilhelm; Baerenfaller, Katja

    2012-01-01

    The pep2pro database was built to support effective high-throughput proteome data analysis. Its database schema allows the coherent integration of search results from different database-dependent search algorithms and filtering of the data including control for unambiguous assignment of peptides to proteins. The capacity of the pep2pro database has been exploited in data analysis of various Arabidopsis proteome datasets. The diversity of the datasets and the associated scientific questions required thorough querying of the data. This was supported by the relational format structure of the data that links all information on the sample, spectrum, search database, and algorithm to peptide and protein identifications and their post-translational modifications. After publication of datasets they are made available on the pep2pro website at www.pep2pro.ethz.ch. Further, the pep2pro data analysis pipeline also handles data export to the PRIDE database (http://www.ebi.ac.uk/pride) and data retrieval by the MASCP Gator (http://gator.masc-proteomics.org/). pep2pro will continue to be used for analysis of additional datasets and as a data warehouse. The capacity of the pep2pro database for proteome data analysis has now also been made publicly available through the release of pep2pro4all, which consists of a database schema and a script that will populate the database with mass spectrometry data provided in mzIdentML format.

  15. MerMade: an oligodeoxyribonucleotide synthesizer for high throughput oligonucleotide production in dual 96-well plates.

    Science.gov (United States)

    Rayner, S; Brignac, S; Bumeister, R; Belosludtsev, Y; Ward, T; Grant, O; O'Brien, K; Evans, G A; Garner, H R

    1998-07-01

    We have designed and constructed a machine that synthesizes two standard 96-well plates of oligonucleotides in a single run using standard phosphoramidite chemistry. The machine is capable of making a combination of standard, degenerate, or modified oligos in a single plate. The run time is typically 17 hr for two plates of 20-mers and a reaction scale of 40 nmol. The reaction vessel is a standard polypropylene 96-well plate with a hole drilled in the bottom of each well. The two plates are placed in separate vacuum chucks and mounted on an xy table. Each well in turn is positioned under the appropriate reagent injection line and the reagent is injected by switching a dedicated valve. All aspects of machine operation are controlled by a Macintosh computer, which also guides the user through the startup and shutdown procedures, provides a continuous update on the status of the run, and facilitates a number of service procedures that need to be carried out periodically. Over 25,000 oligos have been synthesized for use in dye terminator sequencing reactions, polymerase chain reactions (PCRs), hybridization, and RT-PCR. Oligos up to 100 bases in length have been made with a coupling efficiency in excess of 99%. These machines, working in conjunction with our oligo prediction code are particularly well suited to application in automated high throughput genomic sequencing.
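
As a rough check on why coupling efficiency "in excess of 99%" matters for long oligos (standard phosphoramidite arithmetic, not text from the record): with stepwise coupling efficiency p, the expected fraction of full-length n-mer product is

$$Y_{\text{full-length}} = p^{\,n-1}, \qquad 0.99^{19} \approx 0.83 \ \text{for a 20-mer}, \qquad 0.99^{99} \approx 0.37 \ \text{for a 100-mer},$$

so even small losses per coupling cycle compound quickly as oligo length grows.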

  16. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    ...the high computing power of the main supercomputer. Each supercomputer is different in node architecture as well as hardware specifications.

  17. Embedded image enhancement for high-throughput cameras

    Science.gov (United States)

    Geerts, Stan J. C.; Cornelissen, Dion; de With, Peter H. N.

    2014-03-01

    This paper presents image enhancement for a novel Ultra-High-Definition (UHD) video camera offering 4K images and higher. Conventional image enhancement techniques need to be reconsidered for the high-resolution images and the low-light sensitivity of the new sensor. We study two image enhancement functions and evaluate and optimize the algorithms for embedded implementation in programmable logic (FPGA). The enhancement study involves high-quality Auto White Balancing (AWB) and Local Contrast Enhancement (LCE). We have compared multiple algorithms from literature, both with objective and subjective metrics. In order to objectively compare Local Contrast (LC), an existing LC metric is modified for LC measurement in UHD images. For AWB, we have found that color histogram stretching offers a subjective high image quality and it is among the algorithms with the lowest complexity, while giving only a small balancing error. We impose a color-to-color gain constraint, which improves robustness of low-light images. For local contrast enhancement, a combination of contrast preserving gamma and single-scale Retinex is selected. A modified bilateral filter is designed to prevent halo artifacts, while significantly reducing the complexity and simultaneously preserving quality. We show that by cascading contrast preserving gamma and single-scale Retinex, the visibility of details is improved towards the level appropriate for high-quality surveillance applications. The user is offered control over the amount of enhancement. Also, we discuss the mapping of those functions on a heterogeneous platform to come to an effective implementation while preserving quality and robustness.
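
A compact numpy sketch of the auto-white-balance step described here, per-channel histogram stretching with a colour-to-colour gain constraint; the percentile limits and maximum gain ratio are illustrative assumptions, not the values used in the paper:

```python
import numpy as np

def awb_histogram_stretch(img, low_pct=1.0, high_pct=99.0, max_gain_ratio=2.0):
    """img: float RGB array in [0, 1], shape (H, W, 3)."""
    out = np.empty_like(img)
    gains = []
    for c in range(3):
        lo, hi = np.percentile(img[..., c], [low_pct, high_pct])
        gains.append(1.0 / max(hi - lo, 1e-6))    # stretch each channel's histogram
        out[..., c] = img[..., c] - lo
    # constrain red/blue gains relative to green to stay robust in low light
    g = gains[1]
    gains = [min(gain, max_gain_ratio * g) for gain in gains]
    for c in range(3):
        out[..., c] = np.clip(out[..., c] * gains[c], 0.0, 1.0)
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frame = rng.uniform(0.1, 0.6, size=(4, 4, 3))   # toy low-contrast frame
    balanced = awb_histogram_stretch(frame)
    print(balanced.min(), balanced.max())
```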

  18. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
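
CrossCheck itself is a web application backed by a curated database; the toy sketch below only illustrates the underlying cross-referencing idea of intersecting a user hit list with published datasets and ranking by overlap (the dataset names and gene sets are made up):

```python
def cross_reference(hits, datasets):
    """hits: iterable of gene symbols; datasets: dict name -> set of symbols."""
    hits = {h.upper() for h in hits}
    report = []
    for name, genes in datasets.items():
        overlap = hits & genes
        if overlap:
            report.append((len(overlap), name, sorted(overlap)))
    return sorted(report, reverse=True)    # largest overlap first

if __name__ == "__main__":
    my_screen = ["RIPK1", "CASP8", "TP53", "MYC"]
    published = {                          # hypothetical example datasets
        "RNAi_screen_A": {"RIPK1", "CASP8", "BRCA1"},
        "CRISPR_screen_B": {"TP53", "MYC", "KRAS", "CASP8"},
        "phosphoproteome_C": {"EGFR", "AKT1"},
    }
    for n, name, genes in cross_reference(my_screen, published):
        print(f"{name}: {n} shared -> {', '.join(genes)}")
```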

  19. Nanostructured biosensing platform-shadow edge lithography for high-throughput nanofabrication.

    Science.gov (United States)

    Bai, John G; Yeo, Woon-Hong; Chung, Jae-Hyun

    2009-02-07

    One of the critical challenges in nanostructured biosensors is to manufacture an addressable array of nanopatterns at low cost. The addressable array (1) provides multiplexing for biomolecule detection and (2) enables direct detection of biomolecules without labeling and amplification. To fabricate such an array of nanostructures, current nanolithography methods are limited by the lack of either high throughput or high resolution. This paper presents a high-resolution and high-throughput nanolithography method using the compensated shadow effect in high-vacuum evaporation. The approach enables the fabrication of uniform nanogaps down to 20 nm in width across a 100 mm silicon wafer. The nanogap pattern is used as a template for the routine fabrication of zero-, one-, and two-dimensional nanostructures with a high yield. The method can facilitate the fabrication of nanostructured biosensors on a wafer scale at a low manufacturing cost.

  20. Versatile High Throughput Microarray Analysis for Marine Glycobiology

    DEFF Research Database (Denmark)

    Asunción Salmeán, Armando

    Algal cell walls are a type of extracellular matrix mainly made of polysaccharides, highly diverse, complex and heterogeneous. They possess unique and original polymers in their composition including several polysaccharides with industrial relevance such as agar, agarose, carrageenans (red algae) ... decaying tissue as a nutrient source. We successfully optimized the method and characterized four new carbohydrate-recognizing proteins (two carrageenan binders, one arabinoxylan binder and one xyloglucan binder), which may be used as analytical probes for polysaccharide research. Finally, we also wanted

  1. Evaluation of a New Remote Handling Design for High Throughput Annular Centrifugal Contactors

    Energy Technology Data Exchange (ETDEWEB)

    David H. Meikrantz; Troy G. Garn; Jack D. Law; Lawrence L. Macaluso

    2009-09-01

    Advanced designs of nuclear fuel recycling plants are expected to include more ambitious goals for aqueous-based separations, including higher separation efficiency, high-level waste minimization, and a greater focus on continuous processes to minimize cost and footprint. Therefore, Annular Centrifugal Contactors (ACCs) are destined to play a more important role for such future processing schemes. Previous efforts defined and characterized the performance of commercial 5 cm and 12.5 cm single-stage ACCs in a “cold” environment. The next logical step, the design and evaluation of remote capable pilot scale ACCs in a “hot” or radioactive environment, was reported earlier. This report includes the development of remote designs for ACCs that can process the large throughput rates needed in future nuclear fuel recycling plants. Novel designs were developed for the remote interconnection of contactor units, clean-in-place and drain connections, and a new solids removal collection chamber. A three stage, 12.5 cm diameter rotor module has been constructed and evaluated for operational function and remote handling in highly radioactive environments. This design is scalable to commercial CINC ACC models from V-05 to V-20 with total throughput rates ranging from 20 to 650 liters per minute. The V-05R three stage prototype was manufactured by the commercial vendor for ACCs in the U.S., CINC mfg. It employs three standard V-05 clean-in-place (CIP) units modified for remote service and replacement via new methods of connection for solution inlets, outlets, drain and CIP. Hydraulic testing and functional checks were successfully conducted and then the prototype was evaluated for remote handling and maintenance suitability. Removal and replacement of the center position V-05R ACC unit in the three stage prototype was demonstrated using an overhead rail mounted PaR manipulator. This evaluation confirmed the efficacy of this innovative design for interconnecting and cleaning

  2. Using Mendelian inheritance to improve high-throughput SNP discovery.

    Science.gov (United States)

    Chen, Nancy; Van Hout, Cristopher V; Gottipati, Srikanth; Clark, Andrew G

    2014-11-01

    Restriction site-associated DNA sequencing or genotyping-by-sequencing (GBS) approaches allow for rapid and cost-effective discovery and genotyping of thousands of single-nucleotide polymorphisms (SNPs) in multiple individuals. However, rigorous quality control practices are needed to avoid high levels of error and bias with these reduced representation methods. We developed a formal statistical framework for filtering spurious loci, using Mendelian inheritance patterns in nuclear families, that accommodates variable-quality genotype calls and missing data--both rampant issues with GBS data--and for identifying sex-linked SNPs. Simulations predict excellent performance of both the Mendelian filter and the sex-linkage assignment under a variety of conditions. We further evaluate our method by applying it to real GBS data and validating a subset of high-quality SNPs. These results demonstrate that our metric of Mendelian inheritance is a powerful quality filter for GBS loci that is complementary to standard coverage and Hardy-Weinberg filters. The described method, implemented in the software MendelChecker, will improve quality control during SNP discovery in nonmodel as well as model organisms. Copyright © 2014 by the Genetics Society of America.
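
A minimal sketch of the Mendelian-consistency idea behind this kind of filter, for biallelic genotypes in mother-father-offspring trios; MendelChecker additionally models variable genotype quality and missing data in a formal statistical framework, which this toy check does not:

```python
# Genotypes coded as alternate-allele counts 0/1/2; None marks missing data.
from itertools import product

def mendel_consistent(mother, father, child):
    """True if the child genotype can be formed from one allele of each parent."""
    if None in (mother, father, child):
        return True                     # cannot falsify with missing data
    to_alleles = {0: (0, 0), 1: (0, 1), 2: (1, 1)}
    possible = {m + f for m, f in product(to_alleles[mother], to_alleles[father])}
    return child in possible

def fraction_consistent(trios):
    """trios: list of (mother, father, child) genotypes at one SNP."""
    return sum(mendel_consistent(*t) for t in trios) / len(trios)

if __name__ == "__main__":
    snp_trios = [(0, 1, 1), (2, 2, 2), (0, 0, 1), (1, 1, None)]   # toy data
    print(f"consistent trios: {fraction_consistent(snp_trios):.0%}")
```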

  3. Melter Throughput Enhancements for High-Iron HLW

    Energy Technology Data Exchange (ETDEWEB)

    Kruger, A. A. [Department of Energy, Office of River Protection, Richland, Washington (United States); Gan, Hoa [The Catholic University of America, Washington, DC (United States); Joseph, Innocent [The Catholic University of America, Washington, DC (United States); Pegg, Ian L. [The Catholic University of America, Washington, DC (United States); Matlack, Keith S. [The Catholic University of America, Washington, DC (United States); Chaudhuri, Malabika [The Catholic University of America, Washington, DC (United States); Kot, Wing [The Catholic University of America, Washington, DC (United States)

    2012-12-26

    This report describes work performed to develop and test new glass and feed formulations in order to increase glass melting rates in high waste loading glass formulations for HLW with high concentrations of iron. Testing was designed to identify glass and melter feed formulations that optimize waste loading and waste processing rate while meeting all processing and product quality requirements. The work included preparation and characterization of crucible melts to assess melt rate using a vertical gradient furnace system and to develop new formulations with enhanced melt rate. Testing evaluated the effects of waste loading on glass properties and the maximum waste loading that can be achieved. The results from crucible-scale testing supported subsequent DuraMelter 100 (DM100) tests designed to examine the effects of enhanced glass and feed formulations on waste processing rate and product quality. The DM100 was selected as the platform for these tests due to its extensive previous use in processing rate determination for various HLW streams and glass compositions.

  4. High throughput selection of effective serodiagnostics for Trypanosoma cruzi infection.

    Directory of Open Access Journals (Sweden)

    Gretchen Cooley

    2008-10-01

    Full Text Available Diagnosis of Trypanosoma cruzi infection by direct pathogen detection is complicated by the low parasite burden in subjects persistently infected with this agent of human Chagas disease. Determination of infection status by serological analysis has also been faulty, largely due to the lack of well-characterized parasite reagents for the detection of anti-parasite antibodies.In this study, we screened more than 400 recombinant proteins of T. cruzi, including randomly selected and those known to be highly expressed in the parasite stages present in mammalian hosts, for the ability to detect anti-parasite antibodies in the sera of subjects with confirmed or suspected T. cruzi infection.A set of 16 protein groups were identified and incorporated into a multiplex bead array format which detected 100% of >100 confirmed positive sera and also documented consistent, strong and broad responses in samples undetected or discordant using conventional serologic tests. Each serum had a distinct but highly stable reaction pattern. This diagnostic panel was also useful for monitoring drug treatment efficacy in chronic Chagas disease.These results substantially extend the variety and quality of diagnostic targets for Chagas disease and offer a useful tool for determining treatment success or failure.

  5. High-throughput discovery of novel developmental phenotypes

    Science.gov (United States)

    Dickinson, Mary E.; Flenniken, Ann M.; Ji, Xiao; Teboul, Lydia; Wong, Michael D.; White, Jacqueline K.; Meehan, Terrence F.; Weninger, Wolfgang J.; Westerberg, Henrik; Adissu, Hibret; Baker, Candice N.; Bower, Lynette; Brown, James M.; Caddle, L. Brianna; Chiani, Francesco; Clary, Dave; Cleak, James; Daly, Mark J.; Denegre, James M.; Doe, Brendan; Dolan, Mary E.; Edie, Sarah M.; Fuchs, Helmut; Gailus-Durner, Valerie; Galli, Antonella; Gambadoro, Alessia; Gallegos, Juan; Guo, Shiying; Horner, Neil R.; Hsu, Chih-wei; Johnson, Sara J.; Kalaga, Sowmya; Keith, Lance C.; Lanoue, Louise; Lawson, Thomas N.; Lek, Monkol; Mark, Manuel; Marschall, Susan; Mason, Jeremy; McElwee, Melissa L.; Newbigging, Susan; Nutter, Lauryl M.J.; Peterson, Kevin A.; Ramirez-Solis, Ramiro; Rowland, Douglas J.; Ryder, Edward; Samocha, Kaitlin E.; Seavitt, John R.; Selloum, Mohammed; Szoke-Kovacs, Zsombor; Tamura, Masaru; Trainor, Amanda G; Tudose, Ilinca; Wakana, Shigeharu; Warren, Jonathan; Wendling, Olivia; West, David B.; Wong, Leeyean; Yoshiki, Atsushi; MacArthur, Daniel G.; Tocchini-Valentini, Glauco P.; Gao, Xiang; Flicek, Paul; Bradley, Allan; Skarnes, William C.; Justice, Monica J.; Parkinson, Helen E.; Moore, Mark; Wells, Sara; Braun, Robert E.; Svenson, Karen L.; de Angelis, Martin Hrabe; Herault, Yann; Mohun, Tim; Mallon, Ann-Marie; Henkelman, R. Mark; Brown, Steve D.M.; Adams, David J.; Lloyd, K.C. Kent; McKerlie, Colin; Beaudet, Arthur L.; Bucan, Maja; Murray, Stephen A.

    2016-01-01

    Approximately one third of all mammalian genes are essential for life. Phenotypes resulting from mouse knockouts of these genes have provided tremendous insight into gene function and congenital disorders. As part of the International Mouse Phenotyping Consortium effort to generate and phenotypically characterize 5000 knockout mouse lines, we have identified 410 lethal genes during the production of the first 1751 unique gene knockouts. Using a standardised phenotyping platform that incorporates high-resolution 3D imaging, we identified novel phenotypes at multiple time points for previously uncharacterized genes and additional phenotypes for genes with previously reported mutant phenotypes. Unexpectedly, our analysis reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background. In addition, we show that human disease genes are enriched for essential genes identified in our screen, thus providing a novel dataset that facilitates prioritization and validation of mutations identified in clinical sequencing efforts. PMID:27626380

  6. Single DNA molecule patterning for high-throughput epigenetic mapping.

    Science.gov (United States)

    Cerf, Aline; Cipriany, Benjamin R; Benítez, Jaime J; Craighead, Harold G

    2011-11-01

    We present a method for profiling the 5-methyl cytosine distribution on single DNA molecules. Our method combines soft-lithography and molecular elongation to form ordered arrays estimated to contain more than 250 000 individual DNA molecules immobilized on a solid substrate. The methylation state of the DNA is detected and mapped by binding of fluorescently labeled methyl-CpG binding domain peptides to the elongated dsDNA molecules and imaging of their distribution. The stretched molecules are fixed in their extended configuration by adsorption onto the substrate so analysis can be performed with high spatial resolution and signal averaging. We further prove this technique allows imaging of DNA molecules with different methylation states.

  7. High-Throughput Mutational Analysis of a Twister Ribozyme.

    Science.gov (United States)

    Kobori, Shungo; Yokobayashi, Yohei

    2016-08-22

    Recent discoveries of new classes of self-cleaving ribozymes in diverse organisms have triggered renewed interest in the chemistry and biology of ribozymes. Functional analysis and engineering of ribozymes often involve performing biochemical assays on multiple ribozyme mutants. However, because each ribozyme mutant must be individually prepared and assayed, the number and variety of mutants that can be studied are severely limited. All of the single and double mutants of a twister ribozyme (a total of 10 296 mutants) were generated and assayed for their self-cleaving activity by exploiting deep sequencing to count the numbers of cleaved and uncleaved sequences for every mutant. Interestingly, we found that the ribozyme is highly robust against mutations such that 71 % and 30 % of all single and double mutants, respectively, retain detectable activity under the assay conditions. It was also observed that the structural elements that comprise the ribozyme exhibit distinct sensitivity to mutations. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
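    The per-mutant activity estimate implied by this counting approach can be written in a few lines. The sketch below computes a cleaved fraction for each mutant from cleaved and uncleaved read counts; the counts and mutant names are hypothetical, not data from the study.

```python
# Illustrative per-mutant self-cleavage activity from deep-sequencing counts:
# activity is estimated as cleaved reads / (cleaved + uncleaved) reads.
# All counts below are hypothetical placeholders.

from collections import namedtuple

Counts = namedtuple("Counts", ["cleaved", "uncleaved"])

reads = {
    "WT":      Counts(cleaved=9_600, uncleaved=400),
    "A12G":    Counts(cleaved=7_000, uncleaved=3_000),
    "C5U_G8A": Counts(cleaved=50, uncleaved=9_950),
}

def cleaved_fraction(c, pseudocount=0.5):
    """Fraction cleaved, with a small pseudocount to avoid division by zero."""
    return (c.cleaved + pseudocount) / (c.cleaved + c.uncleaved + 2 * pseudocount)

for mutant, c in reads.items():
    print(f"{mutant:8s} cleaved fraction = {cleaved_fraction(c):.3f}")
```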

  8. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fenglei [Iowa State Univ., Ames, IA (United States)

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy to use, automated, high throughput system for large scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption and complete compatibility with high throughput screening system. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. Firstly, we demonstrated large scale protein crystallization screening can be performed in a high throughput manner with low cost, easy operation. The overall system integrates liquid dispensing, crystallization and detection and serves as a whole solution to protein crystallization screening. The system can dispense protein and multiple different precipitants in nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme has capability of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high throughput and non-destructive manner. The entire system from liquid dispensing, crystallization to crystal detection is essentially parallel, high throughput and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Secondly, we developed a new crystallization method with high screening efficiency, low protein consumption and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for given amount of protein. In addition

  9. High-Throughput Synthesis, Screening, and Scale-Up of Optimized Conducting Indium Tin Oxides.

    Science.gov (United States)

    Marchand, Peter; Makwana, Neel M; Tighe, Christopher J; Gruar, Robert I; Parkin, Ivan P; Carmalt, Claire J; Darr, Jawwad A

    2016-02-08

    A high-throughput optimization and subsequent scale-up methodology has been used for the synthesis of conductive tin-doped indium oxide (known as ITO) nanoparticles. ITO nanoparticles with up to 12 at % Sn were synthesized using a laboratory scale (15 g/hour by dry mass) continuous hydrothermal synthesis process, and the as-synthesized powders were characterized by powder X-ray diffraction, transmission electron microscopy, energy-dispersive X-ray analysis, and X-ray photoelectron spectroscopy. Under standard synthetic conditions, either the cubic In2O3 phase, or a mixture of InO(OH) and In2O3 phases were observed in the as-synthesized materials. These materials were pressed into compacts and heat-treated in an inert atmosphere, and their electrical resistivities were then measured using the Van der Pauw method. Sn doping yielded resistivities of ∼10⁻² Ω cm for most samples with the lowest resistivity of 6.0 × 10⁻³ Ω cm (exceptionally conductive for such pressed nanopowders) at a Sn concentration of 10 at %. Thereafter, the optimized lab-scale composition was scaled-up using a pilot-scale continuous hydrothermal synthesis process (at a rate of 100 g/hour by dry mass), and a comparable resistivity of 9.4 × 10⁻³ Ω cm was obtained. The use of the synthesized TCO nanomaterials for thin film fabrication was finally demonstrated by deposition of a transparent, conductive film using a simple spin-coating process.
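    For reference, the van der Pauw relation used for these compacts can be solved numerically from the two four-probe resistances and the sample thickness. The sketch below does this in Python; the resistance and thickness values are made up for illustration, not measurements from the study.

```python
# Sketch of a resistivity calculation with the van der Pauw relation:
#     exp(-pi * R_A * t / rho) + exp(-pi * R_B * t / rho) = 1
# where R_A and R_B are the two four-probe resistances and t is the
# sample thickness. Numbers below are illustrative only.

import math
from scipy.optimize import brentq

def van_der_pauw_resistivity(r_a, r_b, thickness_cm):
    """Solve the van der Pauw equation numerically for resistivity (ohm*cm)."""
    def f(rho):
        return (math.exp(-math.pi * r_a * thickness_cm / rho)
                + math.exp(-math.pi * r_b * thickness_cm / rho) - 1.0)
    # For R_A ~ R_B the solution is near pi * t * (R_A + R_B) / (2 * ln 2).
    guess = math.pi * thickness_cm * (r_a + r_b) / (2 * math.log(2))
    return brentq(f, guess * 1e-3, guess * 1e3)

# Symmetric case reduces to rho = pi * t * R / ln 2.
rho = van_der_pauw_resistivity(r_a=0.33, r_b=0.33, thickness_cm=0.1)
print(f"resistivity ~ {rho:.2e} ohm*cm")
```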

  10. Development of a high-throughput Candida albicans biofilm chip.

    Directory of Open Access Journals (Sweden)

    Anand Srinivasan

    2011-04-01

    We have developed a high-density microarray platform consisting of nano-biofilms of Candida albicans. A robotic microarrayer was used to print yeast cells of C. albicans encapsulated in a collagen matrix at a volume as low as 50 nL onto surface-modified microscope slides. Upon incubation, the cells grow into fully formed "nano-biofilms". The morphological and architectural complexity of these biofilms was evaluated by scanning electron and confocal scanning laser microscopy. The extent of biofilm formation was determined using a microarray scanner from changes in fluorescence intensities due to FUN 1 metabolic processing. This staining technique was also adapted for antifungal susceptibility testing, which demonstrated that, similar to regular biofilms, cells within the on-chip biofilms displayed elevated levels of resistance against antifungal agents (fluconazole and amphotericin B). Thus, results from structural analyses and antifungal susceptibility testing indicated that despite miniaturization, these biofilms display the typical phenotypic properties associated with the biofilm mode of growth. In its final format, the C. albicans biofilm chip (CaBChip) is composed of 768 equivalent and spatially distinct nano-biofilms on a single slide; multiple chips can be printed and processed simultaneously. Compared to current methods for the formation of microbial biofilms, namely the 96-well microtiter plate model, this fungal biofilm chip has advantages in terms of miniaturization and automation, which combine to cut reagent use and analysis time, minimize labor intensive steps, and dramatically reduce assay costs. Such a chip should accelerate the antifungal drug discovery process by enabling rapid, convenient and inexpensive screening of hundreds-to-thousands of compounds simultaneously.

  11. High Throughput Sequencing of Extracellular RNA from Human Plasma.

    Directory of Open Access Journals (Sweden)

    Kirsty M Danielson

    The presence and relative stability of extracellular RNAs (exRNAs) in biofluids has led to an emerging recognition of their promise as 'liquid biopsies' for diseases. Most prior studies on discovery of exRNAs as disease-specific biomarkers have focused on microRNAs (miRNAs) using technologies such as qRT-PCR and microarrays. The recent application of next-generation sequencing to discovery of exRNA biomarkers has revealed the presence of potential novel miRNAs as well as other RNA species such as tRNAs, snoRNAs, piRNAs and lncRNAs in biofluids. At the same time, the use of RNA sequencing for biofluids poses unique challenges, including low amounts of input RNAs, the presence of exRNAs in different compartments with varying degrees of vulnerability to isolation techniques, and the high abundance of specific RNA species (thereby limiting the sensitivity of detection of less abundant species). Moreover, discovery in human diseases often relies on archival biospecimens of varying age and limiting amounts of samples. In this study, we have tested RNA isolation methods to optimize profiling exRNAs by RNA sequencing in individuals without any known diseases. Our findings are consistent with other recent studies that detect microRNAs and ribosomal RNAs as the major exRNA species in plasma. Similar to other recent studies, we found that the landscape of biofluid microRNA transcriptome is dominated by several abundant microRNAs that appear to comprise conserved extracellular miRNAs. There is reasonable correlation of sets of conserved miRNAs across biological replicates, and even across other data sets obtained at different investigative sites. Conversely, the detection of less abundant miRNAs is far more dependent on the exact methodology of RNA isolation and profiling. This study highlights the challenges in detecting and quantifying less abundant plasma miRNAs in health and disease using RNA sequencing platforms.

  12. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies (small-pore microchannel plate devices with very high time resolution and high-speed multi-channel ASIC electronics developed for the LHC at CERN) provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.

  13. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    BACKGROUND: Bacterial viruses (phages) play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes, and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phage. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl purified phage particles), and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL) or of a whole genome shotgun library (WGSL), or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  14. Design of novel solar thermal fuels with high-throughput ab initio simulations

    Science.gov (United States)

    Liu, Yun; Grossman, Jeffrey

    2014-03-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. Previously we have predicted a new class of functional materials that have the potential to address these challenges. Recently, we have developed an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening algorithm we have developed can run through large numbers of molecules composed of earth-abundant elements, and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical design principles to guide further STF materials design through the correlation between isomerization enthalpy and structural properties.
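    The screening criterion described above reduces to a simple loop over candidates: compute the isomerization enthalpy as the energy difference between the metastable and ground isomers, convert it to a gravimetric energy density, and keep candidates above a threshold. The sketch below illustrates that logic; the candidate names, energies, and threshold are placeholders, not results from the ab initio workflow.

```python
# Schematic screening loop: isomerization enthalpy = E(metastable) - E(ground),
# gravimetric energy density = enthalpy / molar mass. All values are
# illustrative placeholders.

candidates = [
    # name, E_ground (eV), E_metastable (eV), molar mass (g/mol)
    ("azobenzene-like-1", -1052.31, -1051.26, 182.2),
    ("norbornadiene-like", -312.80, -311.77, 92.1),
    ("candidate-X", -845.10, -844.95, 260.4),
]

EV_TO_KJ_PER_MOL = 96.485        # 1 eV per molecule ~ 96.485 kJ/mol
MIN_DENSITY_KJ_PER_G = 0.3       # illustrative screening threshold

for name, e_ground, e_meta, mass in candidates:
    delta_h_kj_mol = (e_meta - e_ground) * EV_TO_KJ_PER_MOL
    energy_density = delta_h_kj_mol / mass      # kJ/g
    keep = energy_density >= MIN_DENSITY_KJ_PER_G
    print(f"{name:20s} dH = {delta_h_kj_mol:7.1f} kJ/mol, "
          f"{energy_density:.2f} kJ/g -> {'keep' if keep else 'reject'}")
```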

  15. A high-throughput readout architecture based on PCI-Express Gen3 and DirectGMA technology

    Science.gov (United States)

    Rota, L.; Vogelgesang, M.; Ardila Perez, L. E.; Caselle, M.; Chilingaryan, S.; Dritschler, T.; Zilio, N.; Kopmann, A.; Balzer, M.; Weber, M.

    2016-02-01

    Modern physics experiments produce multi-GB/s data rates. Fast data links and high performance computing stages are required for continuous data acquisition and processing. Because of their intrinsic parallelism and computational power, GPUs emerged as an ideal solution to process this data in high performance computing applications. In this paper we present a high-throughput platform based on direct FPGA-GPU communication. The architecture consists of a Direct Memory Access (DMA) engine compatible with the Xilinx PCI-Express core, a Linux driver for register access, and high-level software to manage direct memory transfers using AMD's DirectGMA technology. Measurements with a Gen3 x8 link show a throughput of 6.4 GB/s for transfers to GPU memory and 6.6 GB/s to system memory. We also assess the possibility of using the architecture in low latency systems: preliminary measurements show a round-trip latency as low as 1 μs for data transfers to system memory, while the additional latency introduced by OpenCL scheduling is the current limitation for GPU based systems. Our implementation is suitable for real-time DAQ system applications ranging from photon science and medical imaging to High Energy Physics (HEP) systems.
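    As a rough sanity check, the reported figures can be compared against the raw capacity of a PCIe Gen3 x8 link (8 GT/s per lane with 128b/130b encoding). The measured throughputs below are taken from the abstract; the link-efficiency numbers are derived here for illustration and are not part of the paper.

```python
# Back-of-the-envelope comparison of measured DMA throughput against the
# raw PCIe Gen3 x8 line rate (8 GT/s per lane, 128b/130b encoding).

LANES = 8
GT_PER_S = 8e9                 # Gen3 line rate per lane (transfers/s)
ENCODING = 128 / 130           # 128b/130b line-coding overhead

raw_bytes_per_s = LANES * GT_PER_S * ENCODING / 8   # bits -> bytes
raw_gb_per_s = raw_bytes_per_s / 1e9                # ~7.88 GB/s

for target, measured_gb_s in [("GPU memory", 6.4), ("system memory", 6.6)]:
    efficiency = measured_gb_s / raw_gb_per_s
    print(f"{target:13s}: {measured_gb_s:.1f} GB/s of {raw_gb_per_s:.2f} GB/s raw "
          f"({efficiency:.0%} of line rate)")
```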

  16. Deep Mutational Scanning: Library Construction, Functional Selection, and High-Throughput Sequencing.

    Science.gov (United States)

    Starita, Lea M; Fields, Stanley

    2015-08-03

    Deep mutational scanning is a highly parallel method that uses high-throughput sequencing to track changes in >10⁵ protein variants before and after selection to measure the effects of mutations on protein function. Here we outline the stages of a deep mutational scanning experiment, focusing on the construction of libraries of protein sequence variants and the preparation of Illumina sequencing libraries. © 2015 Cold Spring Harbor Laboratory Press.
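    One common way to score variants in such an experiment (a sketch only, not necessarily the exact statistic of this protocol) is a log2 enrichment ratio: each variant's change in read frequency across selection, normalized to wild type. The counts below are hypothetical.

```python
# Illustrative enrichment scoring for a deep mutational scan:
# log2 of (post/pre frequency of a variant) relative to wild type.
# Counts are synthetic placeholders.

import math

pre_counts = {"WT": 120_000, "V1": 10_000, "V2": 8_000}
post_counts = {"WT": 150_000, "V1": 25_000, "V2": 500}

def enrichment_score(variant, pre, post, pseudo=0.5):
    """log2 enrichment of a variant relative to wild type across selection."""
    ratio_variant = (post[variant] + pseudo) / (pre[variant] + pseudo)
    ratio_wt = (post["WT"] + pseudo) / (pre["WT"] + pseudo)
    return math.log2(ratio_variant / ratio_wt)

for v in ("V1", "V2"):
    print(f"{v}: enrichment = {enrichment_score(v, pre_counts, post_counts):+.2f}")
```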

  17. A High-Throughput, Adaptive FFT Architecture for FPGA-Based Space-Borne Data Processors

    Science.gov (United States)

    Nguyen, Kayla; Zheng, Jason; He, Yutao; Shah, Biren

    2010-01-01

    Historically, computationally-intensive data processing for space-borne instruments has heavily relied on ground-based computing resources. But with recent advances in functional densities of Field-Programmable Gate-Arrays (FPGAs), there has been an increasing desire to shift more processing on-board, thereby relaxing the downlink data bandwidth requirements. Fast Fourier Transforms (FFTs) are commonly used building blocks for data processing applications, with a growing need to increase the FFT block size. Many existing FFT architectures have mainly emphasized low power consumption or resource usage, but as the block size of the FFT grows, the throughput is often compromised first. In addition to power and resource constraints, space-borne digital systems are also limited to a small set of space-qualified memory elements, which typically lag behind the commercially available counterparts in capacity and bandwidth. The bandwidth limitation of the external memory creates a bottleneck for a large, high-throughput FFT design with large block size. In this paper, we present the Multi-Pass Wide Kernel FFT (MPWK-FFT) architecture for a moderately large block size (32K) with consideration of power consumption and resource usage, as well as throughput. We will also show that the architecture can be easily adapted for different FFT block sizes with different throughput and power requirements. The resulting design is completely contained within an FPGA without relying on external memories. Implementation results are summarized.

  18. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker as an indicator of a specific biological or pathological process is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to their easy accessibility, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many nonsignificant species at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. The current clinical protein biomarker measurement primarily relies on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time-consuming. The limited capability of assay multiplexing also makes the measurement extremely low throughput, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS-based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  19. A Novel High-Throughput Approach to Measure Hydroxyl Radicals Induced by Airborne Particulate Matter

    Directory of Open Access Journals (Sweden)

    Yeongkwon Son

    2015-10-01

    Oxidative stress is one of the key mechanisms linking ambient particulate matter (PM) exposure with various adverse health effects. The oxidative potential of PM has been used to characterize the ability of PM to induce oxidative stress. The hydroxyl radical (•OH) is the most destructive radical produced by PM. However, there is currently no high-throughput approach which can rapidly measure PM-induced •OH for a large number of samples with an automated system. This study evaluated four existing molecular probes (disodium terephthalate, 3′-p-(aminophenyl)fluorescein, coumarin-3-carboxylic acid, and sodium benzoate) for their applicability to measure •OH induced by PM in a high-throughput cell-free system using fluorescence techniques, based on both our experiments and on an assessment of the physicochemical properties of the probes reported in the literature. Disodium terephthalate (TPT) was the most applicable molecular probe to measure •OH induced by PM, due to its high solubility, high stability of the corresponding fluorescent product (i.e., 2-hydroxyterephthalic acid), high yield compared with the other molecular probes, and stable fluorescence intensity in a wide range of pH environments. TPT was applied in a high-throughput format to measure PM (NIST 1648a)-induced •OH in phosphate buffered saline. The formed fluorescent product was measured at designated time points up to 2 h. The fluorescent product of TPT had a detection limit of 17.59 nM. The soluble fraction of PM contributed approximately 76.9% of the •OH induced by total PM, and the soluble metal ions of PM contributed 57.4% of the overall •OH formation. This study provides a promising cost-effective high-throughput method to measure •OH induced by PM on a routine basis.
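    Quantification in an assay like this typically runs a calibration curve of the fluorescent product and converts sample readings to concentrations; a detection limit is often estimated from the blank variability and the calibration slope. The sketch below shows that arithmetic using the common 3.3·σ(blank)/slope convention; all numbers are synthetic placeholders, not the study's calibration data.

```python
# Sketch: fluorescence calibration of 2-hydroxyterephthalic acid (2-OH-TPT),
# conversion of a sample reading to concentration, and an LOD estimate
# using the common 3.3 * sigma_blank / slope convention. Synthetic data.

import numpy as np

standards_nM = np.array([0, 25, 50, 100, 200, 400])      # 2-OH-TPT standards
fluorescence = np.array([12, 61, 110, 212, 408, 815])    # arbitrary units
blank_replicates = np.array([11, 13, 12, 14, 10, 12])

slope, intercept = np.polyfit(standards_nM, fluorescence, 1)
lod_nM = 3.3 * blank_replicates.std(ddof=1) / slope

def to_concentration(signal):
    """Convert a fluorescence reading to 2-OH-TPT concentration (nM)."""
    return (signal - intercept) / slope

print(f"slope = {slope:.2f} AU/nM, LOD ~ {lod_nM:.1f} nM")
print(f"sample at 350 AU -> {to_concentration(350):.0f} nM 2-OH-TPT")
```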

  20. Accurate Classification of Protein Subcellular Localization from High-Throughput Microscopy Images Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Tanel Pärnamaa

    2017-05-01

    High-throughput microscopy of many single cells generates high-dimensional data that are far from straightforward to analyze. One important problem is automatically detecting the cellular compartment where a fluorescently-tagged protein resides, a task relatively simple for an experienced human, but difficult to automate on a computer. Here, we train an 11-layer neural network on data from mapping thousands of yeast proteins, achieving per cell localization classification accuracy of 91%, and per protein accuracy of 99% on held-out images. We confirm that low-level network features correspond to basic image characteristics, while deeper layers separate localization classes. Using this network as a feature calculator, we train standard classifiers that assign proteins to previously unseen compartments after observing only a small number of training examples. Our results are the most accurate subcellular localization classifications to date, and demonstrate the usefulness of deep learning for high-throughput microscopy.
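    The "network as a feature calculator" idea amounts to extracting deep features once per cell image and training a simple classifier on a handful of labelled examples. The sketch below illustrates that second step with scikit-learn; the deep features are simulated with random vectors here, whereas in practice they would come from an intermediate layer of the trained convolutional network.

```python
# Sketch: train a simple classifier on precomputed deep features to assign
# cells to a previously unseen compartment. Features are simulated.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_per_class, n_features = 20, 512        # few labelled examples, deep-feature size

# Simulated deep features for two new localization classes with shifted means.
class_a = rng.normal(loc=0.0, size=(n_per_class, n_features))
class_b = rng.normal(loc=0.5, size=(n_per_class, n_features))
X = np.vstack([class_a, class_b])
y = np.array([0] * n_per_class + [1] * n_per_class)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(f"training accuracy on simulated deep features: {clf.score(X, y):.2f}")
```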

  1. The promise and challenge of high-throughput sequencing of the antibody repertoire

    Science.gov (United States)

    Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R

    2014-01-01

    Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474

  2. tcpl: the ToxCast pipeline for high-throughput screening data.

    Science.gov (United States)

    Filer, Dayne L; Kothiya, Parth; Setzer, R Woodrow; Judson, Richard S; Martin, Matthew T

    2017-02-15

    Large high-throughput screening (HTS) efforts are widely used in drug development and chemical toxicity screening. Wide use and integration of these data can benefit from an efficient, transparent and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform for efficiently storing, normalizing and dose-response modeling of large high-throughput and high-content chemical screening data. The novel dose-response modeling algorithm has been tested against millions of diverse dose-response series, and robustly fits data with outliers and cytotoxicity-related signal loss. tcpl is freely available on the Comprehensive R Archive Network under the GPL-2 license. martin.matt@epa.gov.
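    For readers unfamiliar with dose-response modeling, the sketch below fits a generic Hill model to a synthetic concentration-response series in Python. It only illustrates the kind of curve fitting such a pipeline performs; it is not tcpl's API (tcpl is an R package with its own model set), and the data are made up.

```python
# Generic Hill-model dose-response fit (illustrative only, not tcpl).

import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50, hill_coef):
    """Hill model: response rising from 0 to `top` with midpoint `ac50`."""
    return top / (1.0 + (ac50 / conc) ** hill_coef)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30, 100])   # uM
resp = np.array([1, 2, 4, 9, 22, 45, 68, 79, 83])            # % of control

params, _ = curve_fit(hill, conc, resp, p0=[80, 1.0, 1.0], maxfev=10_000)
top, ac50, n = params
print(f"top = {top:.1f}%, AC50 = {ac50:.2f} uM, Hill coefficient = {n:.2f}")
```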

  3. Quantitative monitoring of Arabidopsis thaliana growth and development using high-throughput plant phenotyping.

    Science.gov (United States)

    Arend, Daniel; Lange, Matthias; Pape, Jean-Michel; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Mücke, Ingo; Klukas, Christian; Altmann, Thomas; Scholz, Uwe; Junker, Astrid

    2016-08-16

    With the implementation of novel automated, high-throughput methods and facilities in recent years, plant phenomics has developed into a highly interdisciplinary research domain integrating biology, engineering and bioinformatics. Here we present a dataset of a non-invasive high-throughput plant phenotyping experiment, which uses image- and image-analysis-based approaches to monitor the growth and development of 484 Arabidopsis thaliana plants (thale cress). The result is a comprehensive dataset of images and extracted phenotypical features. Such datasets require detailed documentation, standardized description of experimental metadata as well as sustainable data storage and publication in order to ensure the reproducibility of experiments, data reuse and comparability among the scientific community. Therefore, the dataset presented here has been annotated using the standardized ISA-Tab format, following the recently published recommendations for the semantic description of plant phenotyping experiments.

  4. High-throughput clone screening followed by protein expression cross-check: A visual assay platform.

    Science.gov (United States)

    Bose, Partha Pratim; Kumar, Prakash

    2017-01-01

    In high-throughput biotechnology and structural biology, molecular cloning is an essential prerequisite for attaining high yields of recombinant protein. However, a rapid, cost-effective and easy clone screening protocol is still required to identify colonies with the desired insert, along with a cross-check method to certify expression of the desired protein as the end product. We report an easy, fast, sensitive and cheap visual clone screening and protein expression cross-check protocol employing a gold nanoparticle-based plasmonic detection phenomenon. This is a non-gel, non-PCR-based visual detection technique, which can be used for simultaneous high-throughput clone screening followed by determination of expression of the desired protein. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. A High-Throughput Microfluidic Platform for Mammalian Cell Transfection and Culturing

    Science.gov (United States)

    Woodruff, Kristina; Maerkl, Sebastian J.

    2016-01-01

    Mammalian synthetic biology could be augmented through the development of high-throughput microfluidic systems that integrate cellular transfection, culturing, and imaging. We created a microfluidic chip that cultures cells and implements 280 independent transfections at up to 99% efficiency. The chip can perform co-transfections, in which the number of cells expressing each protein and the average protein expression level can be precisely tuned as a function of input DNA concentration and synthetic gene circuits can be optimized on chip. We co-transfected four plasmids to test a histidine kinase signaling pathway and mapped the dose dependence of this network on the level of one of its constituents. The chip is readily integrated with high-content imaging, enabling the evaluation of cellular behavior and protein expression dynamics over time. These features make the transfection chip applicable to high-throughput mammalian protein and synthetic biology studies. PMID:27030663

  6. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit

    National Research Council Canada - National Science Library

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-01-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated...

  7. Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.

    Science.gov (United States)

    Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil

    2015-07-17

    In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.

  8. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments limited by low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry, with its theoretical foundation, that provides an integrated solution for characterizing the thermal properties of materials with high throughput. By acquiring time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.

  9. High-Throughput Sequencing of Three Lemnoideae (Duckweeds) Chloroplast Genomes from Total DNA

    Science.gov (United States)

    Wang, Wenqin; Messing, Joachim

    2011-01-01

    Background: Chloroplast genomes provide a wealth of information for evolutionary and population genetic studies. Chloroplasts play a particularly important role in the adaptation of aquatic plants because they float on water and their major surface is exposed continuously to sunlight. The subfamily of Lemnoideae represents such a collection of aquatic species that because of photosynthesis represents one of the fastest growing plant species on earth. Methods: We sequenced the chloroplast genomes from three different genera of Lemnoideae, Spirodela polyrhiza, Wolffiella lingulata and Wolffia australiana by high-throughput DNA sequencing of genomic DNA using the SOLiD platform. Unfractionated total DNA contains high copy numbers of plastid DNA so that sequences from the nucleus and mitochondria can easily be filtered computationally. Remaining sequence reads were assembled into contiguous sequences (contigs) using SOLiD software tools. Contigs were mapped to a reference genome of Lemna minor and gaps, selected by PCR, were sequenced on the ABI3730xl platform. Conclusions: This combinatorial approach yielded whole genomic contiguous sequences in a cost-effective manner. Over 1,000-fold coverage of the chloroplast genome was reached from total DNA by the SOLiD platform in a single spot on a quadrant slide without purification. Comparative analysis indicated that the chloroplast genome was conserved in gene number and organization with respect to the reference genome of L. minor. However, higher nucleotide substitution, abundant deletions and insertions occurred in non-coding regions of these genomes, indicating greater genomic dynamics than expected from the comparison of other related species in the Pooideae. Notably, there was no transition bias over transversions in Lemnoideae. The data should have immediate applications in evolutionary biology and plant taxonomy with increased resolution and statistical power. PMID:21931804

  10. High-throughput sequencing of three Lemnoideae (duckweeds) chloroplast genomes from total DNA.

    Directory of Open Access Journals (Sweden)

    Wenqin Wang

    BACKGROUND: Chloroplast genomes provide a wealth of information for evolutionary and population genetic studies. Chloroplasts play a particularly important role in the adaptation of aquatic plants because they float on water and their major surface is exposed continuously to sunlight. The subfamily of Lemnoideae represents such a collection of aquatic species that because of photosynthesis represents one of the fastest growing plant species on earth. METHODS: We sequenced the chloroplast genomes from three different genera of Lemnoideae, Spirodela polyrhiza, Wolffiella lingulata and Wolffia australiana by high-throughput DNA sequencing of genomic DNA using the SOLiD platform. Unfractionated total DNA contains high copy numbers of plastid DNA so that sequences from the nucleus and mitochondria can easily be filtered computationally. Remaining sequence reads were assembled into contiguous sequences (contigs) using SOLiD software tools. Contigs were mapped to a reference genome of Lemna minor and gaps, selected by PCR, were sequenced on the ABI3730xl platform. CONCLUSIONS: This combinatorial approach yielded whole genomic contiguous sequences in a cost-effective manner. Over 1,000-fold coverage of the chloroplast genome was reached from total DNA by the SOLiD platform in a single spot on a quadrant slide without purification. Comparative analysis indicated that the chloroplast genome was conserved in gene number and organization with respect to the reference genome of L. minor. However, higher nucleotide substitution, abundant deletions and insertions occurred in non-coding regions of these genomes, indicating greater genomic dynamics than expected from the comparison of other related species in the Pooideae. Notably, there was no transition bias over transversions in Lemnoideae. The data should have immediate applications in evolutionary biology and plant taxonomy with increased resolution and statistical power.

  11. Synthetic Substrate for Application in both High and Low Throughput Assays for Botulinum Neurotoxin B Protease Inhibitors

    OpenAIRE

    Salzameda, Nicholas T.; Barbieri, Joseph T.; Janda, Kim D.

    2009-01-01

    A FRET peptide substrate was synthesized and evaluated for enzymatic cleavage by the BoNT/B light chain protease. The FRET substrate was found to be useful in both a high throughput assay to uncover initial “hits” and a low throughput HPLC assay to determine kinetic parameters and modes of inhibition.

  12. High-Throughput Combinatorial Development of High-Entropy Alloys For Light-Weight Structural Applications

    Energy Technology Data Exchange (ETDEWEB)

    Van Duren, Jeroen K; Koch, Carl; Luo, Alan; Sample, Vivek; Sachdev, Anil

    2017-12-29

    The primary limitation of today’s lightweight structural alloys is that specific yield strengths (SYS) higher than 200 MPa·cc/g (typical value for titanium alloys) are extremely difficult to achieve. This holds true especially at a cost lower than $5/kg (typical value for magnesium alloys). Recently, high-entropy alloys (HEA) have shown promising SYS, yet the large composition space of HEA makes screening compositions complex and time-consuming. Over the course of this 2-year project we started from 150 billion compositions and reduced the number of potential low-density (<5 g/cc), low-cost (<$5/kg) high-entropy alloy (LDHEA) candidates that are single-phase, disordered, solid-solution (SPSS) to a few thousand compositions. This was accomplished by means of machine learning to guide design for SPSS LDHEA based on a combination of recursive partitioning, an extensive, experimental HEA database compiled from 24 literature sources, and 91 calculated parameters serving as phenomenological selection rules. Machine learning shows an accuracy of 82% in identifying which compositions of a separate, smaller, experimental HEA database are SPSS HEA. Calculation of Phase Diagrams (CALPHAD) shows an accuracy of 71-77% for the alloys supported by the CALPHAD database, where 30% of the compiled HEA database is not supported by CALPHAD. In addition to machine learning and CALPHAD, a third tool was developed to aid design of SPSS LDHEA. Phase diagrams were calculated by constructing the Gibbs free energy convex hull based on easily accessible enthalpy and entropy terms. Surprisingly, accuracy was 78%. Pursuing these LDHEA candidates by high-throughput experimental methods resulted in SPSS LDHEA composed of transition metals (e.g. Cr, Mn, Fe, Ni, Cu) alloyed with Al, yet the high concentration of Al, necessary to bring the mass density below 5.0 g/cc, makes these materials hard and brittle, body-centered-cubic (BCC) alloys. A related, yet multi-phase BCC alloy, based
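    Recursive partitioning of the kind described above can be illustrated with a small decision-tree classifier over alloy descriptors. The sketch below uses scikit-learn; the descriptors (ideal mixing entropy, atomic-size mismatch, mixing enthalpy) and the tiny training set are synthetic placeholders, not the compiled experimental HEA database.

```python
# Sketch: recursive-partitioning (decision tree) prediction of whether a
# composition forms a single-phase solid solution (SPSS). Synthetic data.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Columns: ideal mixing entropy (R units), size mismatch delta (%), dHmix (kJ/mol)
X = np.array([
    [1.61, 3.2,  -4.0],   # near-equiatomic, small mismatch  -> SPSS
    [1.58, 4.1,  -6.5],   #                                   -> SPSS
    [1.55, 7.8, -18.0],   # large mismatch, strong ordering   -> not SPSS
    [1.30, 6.9, -15.0],   #                                   -> not SPSS
    [1.61, 2.9,  -2.0],   #                                   -> SPSS
    [1.49, 8.5, -22.0],   #                                   -> not SPSS
])
y = np.array([1, 1, 0, 0, 1, 0])   # 1 = single-phase solid solution

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
candidate = np.array([[1.59, 3.5, -5.0]])
print("predicted SPSS" if tree.predict(candidate)[0] else "predicted multi-phase")
```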

  13. Inertial-ordering-assisted droplet microfluidics for high-throughput single-cell RNA-sequencing.

    Science.gov (United States)

    Moon, Hui-Sung; Je, Kwanghwi; Min, Jae-Woong; Park, Donghyun; Han, Kyung-Yeon; Shin, Seung-Ho; Park, Woong-Yang; Yoo, Chang Eun; Kim, Shin-Hyun

    2018-02-27

    Single-cell RNA-seq reveals the cellular heterogeneity inherent in the population of cells, which is very important in many clinical and research applications. Recent advances in droplet microfluidics have achieved the automatic isolation, lysis, and labeling of single cells in droplet compartments without complex instrumentation. However, barcoding errors occurring in the cell encapsulation process because of the multiple-beads-in-droplet and insufficient throughput because of the low concentration of beads for avoiding multiple-beads-in-a-droplet remain important challenges for precise and efficient expression profiling of single cells. In this study, we developed a new droplet-based microfluidic platform that significantly improved the throughput while reducing barcoding errors through deterministic encapsulation of inertially ordered beads. Highly concentrated beads containing oligonucleotide barcodes were spontaneously ordered in a spiral channel by an inertial effect, which were in turn encapsulated in droplets one-by-one, while cells were simultaneously encapsulated in the droplets. The deterministic encapsulation of beads resulted in a high fraction of single-bead-in-a-droplet and rare multiple-beads-in-a-droplet although the bead concentration increased to 1000 μl⁻¹, which diminished barcoding errors and enabled accurate high-throughput barcoding. We successfully validated our device with single-cell RNA-seq. In addition, we found that multiple-beads-in-a-droplet, generated using a normal Drop-Seq device with a high concentration of beads, underestimated transcript numbers and overestimated cell numbers. This accurate high-throughput platform can expand the capability and practicality of Drop-Seq in single-cell analysis.

  14. Development of High-Throughput Quantitative Assays for Glucose Uptake in Cancer Cell Lines

    Science.gov (United States)

    Hassanein, Mohamed; Weidow, Brandy; Koehler, Elizabeth; Bakane, Naimish; Garbett, Shawn; Shyr, Yu; Quaranta, Vito

    2013-01-01

    Purpose: Metabolism, and especially glucose uptake, is a key quantitative cell trait that is closely linked to cancer initiation and progression. Therefore, developing high-throughput assays for measuring glucose uptake in cancer cells would be enviable for simultaneous comparisons of multiple cell lines and microenvironmental conditions. This study was designed with two specific aims in mind: the first was to develop and validate a high-throughput screening method for quantitative assessment of glucose uptake in “normal” and tumor cells using the fluorescent 2-deoxyglucose analog 2-[N-(7-nitrobenz-2-oxa-1,3-diazol-4-yl)amino]-2-deoxyglucose (2-NBDG), and the second was to develop an image-based, quantitative, single-cell assay for measuring glucose uptake using the same probe to dissect the full spectrum of metabolic variability within populations of tumor cells in vitro in higher resolution. Procedure: The kinetics of population-based glucose uptake was evaluated for MCF10A mammary epithelial and CA1d breast cancer cell lines, using 2-NBDG and a fluorometric microplate reader. Glucose uptake for the same cell lines was also examined at the single-cell level using high-content automated microscopy coupled with semi-automated cell-cytometric image analysis approaches. Statistical treatments were also implemented to analyze intra-population variability. Results: Our results demonstrate that the high-throughput fluorometric assay using 2-NBDG is a reliable method to assess population-level kinetics of glucose uptake in cell lines in vitro. Similarly, single-cell image-based assays and analyses of 2-NBDG fluorescence proved an effective and accurate means for assessing glucose uptake, which revealed that breast tumor cell lines display intra-population variability that is modulated by growth conditions. Conclusions: These studies indicate that 2-NBDG can be used to aid in the high-throughput analysis of the influence of chemotherapeutics on glucose uptake in cancer

  15. High-throughput measurement of rice tillers using a conveyor equipped with x-ray computed tomography.

    Science.gov (United States)

    Yang, Wanneng; Xu, Xiaochun; Duan, Lingfeng; Luo, Qingming; Chen, Shangbin; Zeng, Shaoqun; Liu, Qian

    2011-02-01

    Tillering is one of the most important agronomic traits because the number of shoots per plant determines panicle number, a key component of grain yield. The conventional method of counting tillers is still manual. Under mass-measurement conditions, accuracy and efficiency gradually degrade as even experienced staff become fatigued. Thus, manual measurement, including counting and recording, is not only time consuming but also lacks objectivity. To automate this process, we developed a high-throughput facility, dubbed the high-throughput system for measuring automatically rice tillers (H-SMART), based on a conventional x-ray computed tomography (CT) system and an industrial conveyor. Each pot-grown rice plant was delivered into the CT system for scanning via the conveyor equipment. A filtered back-projection algorithm was used to reconstruct the transverse section image of the rice culms. The number of tillers was then automatically extracted by image segmentation. To evaluate the accuracy of this system, three batches of rice at different growth stages (tillering, heading, or filling) were tested, yielding mean absolute errors of 0.22, 0.36, and 0.36, respectively. Subsequently, the complete machine was used under industry conditions to estimate its efficiency, which was 4320 pots per continuous 24 h workday. Thus, the H-SMART could determine the number of tillers of pot-grown rice plants, providing three advantages over the manual tillering method: absence of human disturbance, automation, and high throughput. This facility expands the application of agricultural photonics in plant phenomics.
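    The counting step after reconstruction reduces to segmenting the cross-section and counting connected components (culms). The sketch below illustrates this with scipy.ndimage on a synthetic binary slice; it stands in for real CT data and is not the H-SMART segmentation code.

```python
# Sketch: count tillers in one reconstructed cross-section by thresholding
# and connected-component labelling, with a size filter to reject noise.
# The "slice" is a synthetic image with three disc-shaped culms.

import numpy as np
from scipy import ndimage

yy, xx = np.mgrid[0:64, 0:64]
slice_img = np.zeros((64, 64))
for cy, cx in [(16, 16), (20, 44), (45, 30)]:
    slice_img[(yy - cy) ** 2 + (xx - cx) ** 2 <= 16] = 1.0

binary = slice_img > 0.5                      # intensity threshold
labels, n_components = ndimage.label(binary)  # connected-component labelling
sizes = np.asarray(ndimage.sum(binary, labels, index=range(1, n_components + 1)))
tiller_count = int(np.sum(sizes >= 10))       # drop specks below 10 pixels

print(f"tillers counted in this slice: {tiller_count}")   # expect 3
```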

  16. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Energy Technology Data Exchange (ETDEWEB)

    Schuster, Andre; Bruno, Kenneth S.; Collett, James R.; Baker, Scott E.; Seiboth, Bernhard; Kubicek, Christian P.; Schmoll, Monika

    2012-01-02

    The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  17. PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.

    Science.gov (United States)

    Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R

    2017-10-01

    High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old-type specimens, which provide little amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries in the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.

  18. High-throughput DNA sequencing errors are reduced by orders of magnitude using circle sequencing

    Science.gov (United States)

    Lou, Dianne I.; Hussmann, Jeffrey A.; McBee, Ross M.; Acevedo, Ashley; Andino, Raul; Press, William H.; Sawyer, Sara L.

    2013-01-01

    A major limitation of high-throughput DNA sequencing is the high rate of erroneous base calls produced. For instance, Illumina sequencing machines produce errors at a rate of ∼0.1–1 × 10⁻² per base sequenced. These technologies typically produce billions of base calls per experiment, translating to millions of errors. We have developed a unique library preparation strategy, “circle sequencing,” which allows for robust downstream computational correction of these errors. In this strategy, DNA templates are circularized, copied multiple times in tandem with a rolling circle polymerase, and then sequenced on any high-throughput sequencing machine. Each read produced is computationally processed to obtain a consensus sequence of all linked copies of the original molecule. Physically linking the copies ensures that each copy is independently derived from the original molecule and allows for efficient formation of consensus sequences. The circle-sequencing protocol precedes standard library preparations and is therefore suitable for a broad range of sequencing applications. We tested our method using the Illumina MiSeq platform and obtained errors in our processed sequencing reads at a rate as low as 7.6 × 10⁻⁶ per base sequenced, dramatically improving the error rate of Illumina sequencing and putting error on par with low-throughput, but highly accurate, Sanger sequencing. Circle sequencing also had substantially higher efficiency and lower cost than existing barcode-based schemes for correcting sequencing errors. PMID:24243955
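
    A minimal sketch of the consensus step, assuming the repeat unit length of each read is already known (how the published pipeline identifies the tandem repeats is not described here; the function below is only an illustration): the linked copies are stacked and a majority base is called at each position.

        from collections import Counter

        def consensus_from_tandem_read(read, unit_length):
            copies = [read[i:i + unit_length] for i in range(0, len(read), unit_length)]
            copies = [c for c in copies if len(c) == unit_length]   # drop the partial copy at the end
            consensus = []
            for pos in range(unit_length):
                base, count = Counter(c[pos] for c in copies).most_common(1)[0]
                consensus.append(base if count > len(copies) / 2 else "N")  # require a clear majority
            return "".join(consensus)

        print(consensus_from_tandem_read("ACGTACGTACGA", 4))   # -> "ACGT"; the error in the last copy is outvoted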

  19. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Science.gov (United States)

    Prashar, Ankush; Yildiz, Jane; McNicol, James W; Bryan, Glenn J; Jones, Hamlyn G

    2013-01-01

    The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
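
    The normalisation described above (expressing each genotype's canopy temperature as a deviation from its image mean, then pooling deviations over overlapping images) can be sketched as follows; the data structures and ranking rule are illustrative assumptions, not the authors' code:

        import numpy as np

        def normalised_temperatures(image_temps):
            # image_temps: dict of genotype -> mean canopy temperature (deg C) in one thermal image
            image_mean = np.mean(list(image_temps.values()))
            return {g: t - image_mean for g, t in image_temps.items()}

        def genotype_ranking(images):
            deltas = {}
            for img in images:                       # several overlapping images of the same trial
                for g, d in normalised_temperatures(img).items():
                    deltas.setdefault(g, []).append(d)
            # cooler-than-average genotypes (more negative mean deviation) rank first
            return sorted(deltas, key=lambda g: np.mean(deltas[g]))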

  20. A versatile toolkit for high throughput functional genomics with Trichoderma reesei

    Directory of Open Access Journals (Sweden)

    Schuster André

    2012-01-01

    Full Text Available Abstract Background The ascomycete fungus, Trichoderma reesei (anamorph of Hypocrea jecorina), represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. Results Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knock out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knock outs, we constructed vectors using yeast mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ-defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. Conclusions Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ-defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.

  1. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system need not be the one with the highest product titers, but the one resulting in an overall superior process performance in up- and downstream processing. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  2. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.

  3. Droplet microfluidic technology for single-cell high-throughput screening.

    Science.gov (United States)

    Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L

    2009-08-25

    We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at very high throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically coded droplet library enabling the identification of the droplets' composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together, our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.

  4. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review recent advances in immune repertoire studies of infectious diseases, achieved both by traditional techniques and by high-throughput sequencing techniques. High-throughput sequencing techniques enable the determination of complementary regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for further development of novel diagnostic markers, immunotherapies and vaccines.

  5. An automated system for high-throughput single cell-based breeding

    Science.gov (United States)

    Yoshimoto, Nobuo; Kida, Akiko; Jie, Xu; Kurokawa, Masaya; Iijima, Masumi; Niimi, Tomoaki; Maturana, Andrés D.; Nikaido, Itoshi; Ueda, Hiroki R.; Tatematsu, Kenji; Tanizawa, Katsuyuki; Kondo, Akihiko; Fujii, Ikuo; Kuroda, Shun'ichi

    2013-01-01

    When selecting the most appropriate cells from the huge number in a cell library for practical use in regenerative medicine and the production of various biopharmaceuticals, the cell heterogeneity often found in an isogenic cell population limits the refinement of clonal cell culture. Here, we demonstrated high-throughput screening of the most suitable cells in a cell library by an automated, undisruptive single-cell analysis and isolation system, followed by expansion of isolated single cells. This system enabled establishment of the most suitable cells, such as embryonic stem cells with the highest expression of the pluripotency marker Rex1 and hybridomas with the highest antibody secretion, which could not be achieved by conventional high-throughput cell screening systems (e.g., a fluorescence-activated cell sorter). This single cell-based breeding system may be a powerful tool to analyze stochastic fluctuations and delineate their molecular mechanisms. PMID:23378922

  6. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.

  7. Current developments in high-throughput analysis for microalgae cellular contents.

    Science.gov (United States)

    Lee, Tsung-Hua; Chang, Jo-Shu; Wang, Hsiang-Yu

    2013-11-01

    Microalgae have emerged as one of the most promising feedstocks for biofuels and bio-based chemical production. However, due to the lack of effective tools enabling rapid and high-throughput analysis of the content of microalgae biomass, the efficiency of screening and identification of microalgae with desired functional components from the natural environment is usually quite low. Moreover, the real-time monitoring of the production of target components from microalgae is also difficult. Recently, research efforts focusing on overcoming this limitation have started. In this review, the recent development of high-throughput methods for analyzing microalgae cellular contents is summarized. The future prospects and impacts of these detection methods in microalgae-related processing and industries are also addressed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Automated High-Throughput Root Phenotyping of Arabidopsis thaliana Under Nutrient Deficiency Conditions.

    Science.gov (United States)

    Satbhai, Santosh B; Göschl, Christian; Busch, Wolfgang

    2017-01-01

    The central question of genetics is how a genotype determines the phenotype of an organism. Genetic mapping approaches are a key for finding answers to this question. In particular, genome-wide association (GWA) studies have been rapidly adopted to study the architecture of complex quantitative traits. This was only possible due to the improvement of high-throughput and low-cost phenotyping methodologies. In this chapter we provide a detailed protocol for obtaining root trait data from the model species Arabidopsis thaliana using the semiautomated, high-throughput phenotyping pipeline BRAT (Busch-lab Root Analysis Toolchain) for early root growth under the stress condition of iron deficiency. Extracted root trait data can be directly used to perform GWA mapping using the freely accessible web application GWAPP to identify marker polymorphisms associated with the phenotype of interest.

  9. Post-high-throughput screening analysis: an empirical compound prioritization scheme.

    Science.gov (United States)

    Oprea, Tudor I; Bologa, Cristian G; Edwards, Bruce S; Prossnitz, Eric R; Sklar, Larry A

    2005-08-01

    An empirical scheme to evaluate and prioritize screening hits from high-throughput screening (HTS) is proposed. Negative scores are given when chemotypes found in the HTS hits are present in annotated databases such as MDDR and WOMBAT or for testing positive in toxicity-related experiments reported in TOXNET. Positive scores were given for higher measured biological activities, for testing negative in toxicity-related literature, and for good overlap when profiled against drug-related properties. Particular emphasis is placed on estimating aqueous solubility to prioritize in vivo experiments. This empirical scheme is given as an illustration to assist the decision-making process in selecting chemotypes and individual compounds for further experimentation, when confronted with multiple hits from high-throughput experiments. The decision-making process is discussed for a set of G-protein coupled receptor antagonists and validated on a literature example for dihydrofolate reductase inhibition.
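
    The additive logic of such a scheme can be sketched as below; the weights, field names, and thresholds are hypothetical placeholders rather than the published scoring rules:

        # Illustrative scoring sketch: penalise known chemotypes and toxicity flags,
        # reward potency and predicted aqueous solubility, then rank the hits.
        def prioritisation_score(hit):
            score = 0.0
            if hit.get("chemotype_in_mddr_or_wombat"):   # chemotype already annotated in MDDR/WOMBAT
                score -= 1.0
            if hit.get("toxnet_positive"):               # positive in toxicity-related literature
                score -= 1.0
            else:
                score += 0.5                             # tested negative in toxicity literature
            score += hit.get("pIC50", 0.0) / 10.0        # reward higher measured activity
            if hit.get("predicted_logS", -10) > -4:      # estimated aqueous solubility threshold (assumed)
                score += 1.0                             # prioritise for in vivo follow-up
            return score

        hits = [{"pIC50": 7.2, "toxnet_positive": False, "predicted_logS": -3.5},
                {"pIC50": 6.0, "chemotype_in_mddr_or_wombat": True, "predicted_logS": -5.2}]
        ranked = sorted(hits, key=prioritisation_score, reverse=True)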

  10. High-throughput investigation of catalysts for JP-8 fuel cracking to liquefied petroleum gas.

    Science.gov (United States)

    Bedenbaugh, John E; Kim, Sungtak; Sasmaz, Erdem; Lauterbach, Jochen

    2013-09-09

    Portable power technologies for military applications necessitate the production of fuels similar to LPG from existing feedstocks. Catalytic cracking of military jet fuel to form a mixture of C₂-C₄ hydrocarbons was investigated using high-throughput experimentation. Cracking experiments were performed in a gas-phase, 16-sample high-throughput reactor. Zeolite ZSM-5 catalysts with low Si/Al ratios (≤25) demonstrated the highest production of C₂-C₄ hydrocarbons at moderate reaction temperatures (623-823 K). ZSM-5 catalysts were optimized for JP-8 cracking activity to LPG through varying reaction temperature and framework Si/Al ratio. The reducing atmosphere required during catalytic cracking resulted in coking of the catalyst and a commensurate decrease in conversion rate. Rare earth metal promoters for ZSM-5 catalysts were screened to reduce coking deactivation rates, while noble metal promoters reduced onset temperatures for coke burnoff regeneration.

  11. Towards low-delay and high-throughput cognitive radio vehicular networks

    Directory of Open Access Journals (Sweden)

    Nada Elgaml

    2017-12-01

    Full Text Available Cognitive Radio Vehicular Ad-hoc Networks (CR-VANETs) exploit cognitive radios to allow vehicles to access the unused channels in their radio environment. Thus, CR-VANETs not only suffer from the traditional CR problems, especially spectrum sensing, but also face new challenges due to the highly dynamic nature of VANETs. In this paper, we present a low-delay and high-throughput radio environment assessment scheme for CR-VANETs that can be easily incorporated with the IEEE 802.11p standard developed for VANETs. Simulation results show that the proposed scheme significantly reduces the time to get the radio environment map and increases the CR-VANET throughput.

  12. Macro-to-micro structural proteomics: native source proteins for high-throughput crystallization.

    Directory of Open Access Journals (Sweden)

    Monica Totir

    Full Text Available Structural biology and structural genomics projects routinely rely on recombinantly expressed proteins, but many proteins and complexes are difficult to obtain by this approach. We investigated native source proteins for high-throughput protein crystallography applications. The Escherichia coli proteome was fractionated, purified, crystallized, and structurally characterized. Macro-scale fermentation and fractionation were used to subdivide the soluble proteome into 408 unique fractions of which 295 fractions yielded crystals in microfluidic crystallization chips. Of the 295 crystals, 152 were selected for optimization, diffraction screening, and data collection. Twenty-three structures were determined, four of which were novel. This study demonstrates the utility of native source proteins for high-throughput crystallography.

  13. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy.

    Science.gov (United States)

    Lai, Yiu Wai; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios; Hofmann, Martin R; Ludwig, Alfred

    2011-10-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.

  14. Multiple and high-throughput droplet reactions via combination of microsampling technique and microfluidic chip

    KAUST Repository

    Wu, Jinbo

    2012-11-20

    Microdroplets offer unique compartments for accommodating a large number of chemical and biological reactions in tiny volume with precise control. A major concern in droplet-based microfluidics is the difficulty to address droplets individually and achieve high throughput at the same time. Here, we have combined an improved cartridge sampling technique with a microfluidic chip to perform droplet screenings and aggressive reaction with minimal (nanoliter-scale) reagent consumption. The droplet composition, distance, volume (nanoliter to subnanoliter scale), number, and sequence could be precisely and digitally programmed through the improved sampling technique, while sample evaporation and cross-contamination are effectively eliminated. Our combined device provides a simple model to utilize multiple droplets for various reactions with low reagent consumption and high throughput. © 2012 American Chemical Society.

  15. High-Throughput Computing on High-Performance Platforms: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Matteo, Turilli [Rutgers University; Angius, Alessio [Rutgers University; Oral, H Sarp [ORNL; De, K [University of Texas at Arlington; Klimentov, A [Brookhaven National Laboratory (BNL); Wells, Jack C. [ORNL; Jha, S [Rutgers University

    2017-10-01

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan -- a DOE leadership computing facility -- in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next-generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  16. Marine natural product libraries for high-throughput screening and rapid drug discovery.

    Science.gov (United States)

    Bugni, Tim S; Richards, Burt; Bhoite, Leen; Cimbora, Daniel; Harper, Mary Kay; Ireland, Chris M

    2008-06-01

    There is a need for diverse molecular libraries for phenotype-selective and high-throughput screening. To make marine natural products (MNPs) more amenable to newer screening paradigms and shorten discovery time lines, we have created an MNP library characterized online using MS. To test the potential of the library, we screened a subset of the library in a phenotype-selective screen to identify compounds that inhibited the growth of BRCA2-deficient cells.

  17. Geochip: A high throughput genomic tool for linking community structure to functions

    Energy Technology Data Exchange (ETDEWEB)

    Van Nostrand, Joy D.; Liang, Yuting; He, Zhili; Li, Guanghe; Zhou, Jizhong

    2009-01-30

    GeoChip is a comprehensive functional gene array that targets key functional genes involved in the geochemical cycling of N, C, and P, sulfate reduction, metal resistance and reduction, and contaminant degradation. Studies have shown the GeoChip to be a sensitive, specific, and high-throughput tool for microbial community analysis that has the power to link geochemical processes with microbial community structure. However, several challenges remain regarding the development and applications of microarrays for microbial community analysis.

  18. Computational and Statistical Methods for High-Throughput Mass Spectrometry-Based PTM Analysis.

    Science.gov (United States)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analysis allows the quantitative comparison of thousands of modified peptides over different conditions. However, the large and complex datasets produced pose multiple data interpretation challenges, ranging from spectral interpretation to statistical and multivariate analyses. Here, we present a typical workflow to interpret such data.

  19. In Vitro High Throughput Screening, What Next? Lessons from the Screening for Aurora Kinase Inhibitors

    Directory of Open Access Journals (Sweden)

    Thi-My-Nhung Hoang

    2014-02-01

    Full Text Available Based on in vitro assays, we performed a High Throughput Screening (HTS) to identify kinase inhibitors among 10,000 small chemical compounds. In this didactic paper, we describe step-by-step the approach to validate the hits as well as the major pitfalls encountered in the development of active molecules. We propose a decision tree that could be adapted to most in vitro HTS.

  20. In vitro high throughput screening, what next? Lessons from the screening for aurora kinase inhibitors.

    Science.gov (United States)

    Hoang, Thi-My-Nhung; Vu, Hong-Lien; Le, Ly-Thuy-Tram; Nguyen, Chi-Hung; Molla, Annie

    2014-02-27

    Based on in vitro assays, we performed a High Throughput Screening (HTS) to identify kinase inhibitors among 10,000 small chemical compounds. In this didactic paper, we describe step-by-step the approach to validate the hits as well as the major pitfalls encountered in the development of active molecules. We propose a decision tree that could be adapted to most in vitro HTS.

  1. High-Throughput SNP Discovery And Genetic Mapping In Perennial Ryegrass

    DEFF Research Database (Denmark)

    Asp, Torben; Studer, Bruno; Lübberstedt, Thomas

    Gene-associated single nucleotide polymorphisms (SNPs) are of major interest for genome analysis and breeding applications in the key grassland species perennial ryegrass. High-throughput 454 Titanium transcriptome sequencing was performed on two genotypes, which have previously been used ... in the VrnA mapping population. Here we report on large-scale SNP discovery, and the construction of a genetic map enabling QTL fine mapping, map-based cloning, and comparative genomics in perennial ryegrass.

  2. DRABAL: novel method to mine large high-throughput screening assays using Bayesian active learning

    OpenAIRE

    Soufan, Othman; Ba-Alawi, Wail; Afeef, Moataz; Essack, Magbubah; Kalnis, Panos; Bajic, Vladimir B.

    2016-01-01

    Background Mining high-throughput screening (HTS) assays is key for enhancing decisions in the area of drug repositioning and drug discovery. However, many challenges are encountered in the process of developing suitable and accurate methods for extracting useful information from these assays. Virtual screening and a wide variety of databases, methods and solutions proposed to-date, did not completely overcome these challenges. This study is based on a multi-label classification (MLC) techniq...

  3. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    Motivation: The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Results: Combining computational biology ...

  4. A Concept for a Sensitive Micro Total Analysis System for High Throughput Fluorescence Imaging

    OpenAIRE

    Rabner, Arthur; Shacham, Yosi

    2006-01-01

    This paper discusses possible methods for on-chip fluorescent imaging for integrated bio-sensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging. It can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis...

  5. Patterning cell using Si-stencil for high-throughput assay

    KAUST Repository

    Wu, Jinbo

    2011-01-01

    In this communication, we report a newly developed cell patterning methodology based on a silicon stencil, which exhibited advantages such as easy handling, reusability, a hydrophilic surface and mature fabrication technologies. Cell arrays obtained by this method were used to investigate cell growth under a temperature gradient, which demonstrated the possibility of studying cell behavior in a high-throughput assay. This journal is © The Royal Society of Chemistry 2011.

  6. Streptococcus mutans Protein Synthesis during Mixed-Species Biofilm Development by High-Throughput Quantitative Proteomics

    OpenAIRE

    Klein, Marlise I.; Xiao, Jin; Lu, Bingwen; Delahunty, Claire M.; Yates, John R.; Koo, Hyun

    2012-01-01

    Biofilms formed on tooth surfaces are comprised of mixed microbiota enmeshed in an extracellular matrix. Oral biofilms are constantly exposed to environmental changes, which influence the microbial composition, matrix formation and expression of virulence. Streptococcus mutans and sucrose are key modulators associated with the evolution of virulent-cariogenic biofilms. In this study, we used a high-throughput quantitative proteomics approach to examine how S. mutans produces relevant proteins...

  7. Statistical Methods for Integrating Multiple Types of High-Throughput Data

    OpenAIRE

    Xie, Yang; Ahn, Chul

    2010-01-01

    Large-scale sequencing, copy number, mRNA, and protein data hold great promise for biomedical research, while posing great challenges to data management and data analysis. Integrating different types of high-throughput data from diverse sources can increase the statistical power of data analysis and provide deeper biological understanding. This chapter uses two biomedical research examples to illustrate why there is an urgent need to develop reliable and robust methods for integratin...

  8. Acanthamoeba castellanii: a new high-throughput method for drug screening in vitro

    OpenAIRE

    Ortega-Rivas, Antonio; Padrón, José M; Valladares, Basilio; Elsheikha, Hany M

    2016-01-01

    Despite significant public health impact, there is no specific antiprotozoal therapy for prevention and treatment of Acanthamoeba castellanii infection. There is a need for new and efficient anti-Acanthamoeba drugs that are less toxic and can reduce treatment duration and frequency of administration. In this context a new, rapid and sensitive assay is required for high-throughput activity testing and screening of new therapeutic compounds. A colorimetric assay based on sulforhodamine B (SRB) ...

  9. The Role of Molecular Biology and HTS (High Throughput Screening) in the Development of New Synthetic Drugs [Peranan Biologi Molekuler dan HTS dalam Pengembangan Obat Sintetik Baru]

    OpenAIRE

    Nurrochmad, Arief

    2004-01-01

    Recently, the discovery of new drugs has relied on new concepts and modern techniques rather than conventional techniques. With the development of scientific knowledge, the role of molecular biology and modern techniques in the investigation and discovery of new drugs has become increasingly important. Many methods and modern techniques are used in the discovery of new drugs, e.g., genetic engineering, recombinant DNA, radioligand binding assay techniques, HTS (High Throughput Screening) techniques, and mass ligan...

  10. A high-throughput method for quantifying metabolically active yeast cells

    DEFF Research Database (Denmark)

    Nandy, Subir Kumar; Knudsen, Peter Boldsen; Rosenkjær, Alexander

    2015-01-01

    By redesigning the established methylene blue reduction test for bacteria and yeast, we present a cheap and efficient methodology for quantitative physiology of eukaryotic cells applicable for high-throughput systems. Validation of the method in fermenters and high-throughput systems proved ... equivalent, displaying reduction curves that interrelated directly with CFU counts. For growth rate estimation, the methylene blue reduction test (MBRT) proved superior, since the discriminatory nature of the method allowed for the quantification of metabolically active cells only, excluding dead cells...

  11. High-throughput, image-based screening of pooled genetic-variant libraries.

    Science.gov (United States)

    Emanuel, George; Moffitt, Jeffrey R; Zhuang, Xiaowei

    2017-12-01

    We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in individual cells. We achieve genotyping by introducing barcoded genetic variants into cells as pooled libraries and reading the barcodes out using massively multiplexed fluorescence in situ hybridization. To demonstrate the power of image-based pooled screening, we identified brighter and more photostable variants of the fluorescent protein YFAST among 60,000 variants.
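
    The read-out step can be pictured with a small decoding sketch: each cell's barcode is called as an on/off bit per hybridization round and matched against a codebook. The codeword length, mismatch tolerance, and variant names below are hypothetical, not the published design:

        # Illustrative decoder: return the variant whose codeword is nearest to the
        # observed bit pattern, within an allowed mismatch budget.
        def decode_barcode(bits, codebook, max_mismatches=1):
            best, best_dist = None, max_mismatches + 1
            for code, variant in codebook.items():
                dist = sum(b != c for b, c in zip(bits, code))   # Hamming distance
                if dist < best_dist:
                    best, best_dist = variant, dist
            return best   # None if no codeword lies within the mismatch budget

        codebook = {(1, 0, 1, 0): "YFAST_variant_17", (0, 1, 1, 0): "YFAST_variant_42"}
        print(decode_barcode((1, 0, 1, 1), codebook))   # -> 'YFAST_variant_17'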

  12. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data

    OpenAIRE

    Althammer, Sonja Daniela; González-Vallinas Rostes, Juan, 1983-; Ballaré, Cecilia Julia; Beato, Miguel; Eyras Jiménez, Eduardo

    2011-01-01

    Motivation: High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein–DNA and protein–RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. Results: We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or b...

  13. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis...

  14. High Throughput Single-cell and Multiple-cell Micro-encapsulation

    OpenAIRE

    Lagus, Todd P.; Edd, Jon F.

    2012-01-01

    Microfluidic encapsulation methods have been previously utilized to capture cells in picoliter-scale aqueous, monodisperse drops, providing confinement from a bulk fluid environment with applications in high throughput screening, cytometry, and mass spectrometry. We describe a method to not only encapsulate single cells, but to repeatedly capture a set number of cells (here we demonstrate one- and two-cell encapsulation) to study both isolation and the interactions between cells in groups of ...

  15. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...

  16. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry based proteomics, needs to be more accurate and to offer rapid processing times. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative and reproducible manner. The study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the system on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using the IDEALQ software. The results show that the online system, capable of handling samples in 96-well format at high throughput, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase in peptide quantities with increasing concentration, with much better linearity than the off-line method. Hence we suggest that inclusion of this online system in the proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  17. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  18. Engineering serendipity: High-throughput discovery of materials that resist bacterial attachment.

    Science.gov (United States)

    Magennis, E P; Hook, A L; Davies, M C; Alexander, C; Williams, P; Alexander, M R

    2016-04-01

    Controlling the colonisation of materials by microorganisms is important in a wide range of industries and clinical settings. To date, the underlying mechanisms that govern the interactions of bacteria with material surfaces remain poorly understood, limiting the ab initio design and engineering of biomaterials to control bacterial attachment. Combinatorial approaches involving high-throughput screening have emerged as key tools for identifying materials to control bacterial attachment. The assessment of hundreds of different materials using these methods can be carried out with the aid of computational modelling. This approach can develop an understanding of the rules used to predict bacterial attachment to surfaces of non-toxic synthetic materials. Here we outline our view on the state of this field and the challenges and opportunities in this area for the coming years. This opinion article on high-throughput screening methods reflects one aspect of how the field of biomaterials research has developed and progressed. The piece takes the reader through key developments in biomaterials discovery, particularly focusing on the need to reduce bacterial colonisation of surfaces. Such bacteria-resistant surfaces are increasingly required in this age of antibiotic resistance. The influence and origin of high-throughput methods are discussed with insights into the future of biomaterials development where computational methods may drive materials development into new fertile areas of discovery. New biomaterials will exhibit responsiveness to adapt to the biological environment and promote better integration and reduced rejection or infection. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  19. Carbohydrate chips for studying high-throughput carbohydrate-protein interactions.

    Science.gov (United States)

    Park, Sungjin; Lee, Myung-ryul; Pyo, Soon-Jin; Shin, Injae

    2004-04-21

    Carbohydrate-protein interactions play important biological roles in living organisms. For the most part, biophysical and biochemical methods have been used for studying these biomolecular interactions. Less attention has been given to the development of high-throughput methods to elucidate recognition events between carbohydrates and proteins. In the current effort to develop a novel high-throughput tool for monitoring carbohydrate-protein interactions, we prepared carbohydrate microarrays by immobilizing maleimide-linked carbohydrates on thiol-derivatized glass slides and carried out lectin binding experiments by using these microarrays. The results showed that carbohydrates with different structural features selectively bound to the corresponding lectins with relative binding affinities that correlated with those obtained from solution-based assays. In addition, binding affinities of lectins to carbohydrates were also quantitatively analyzed by determining IC(50) values of soluble carbohydrates with the carbohydrate microarrays. To fabricate carbohydrate chips that contained more diverse carbohydrate probes, solution-phase parallel and enzymatic glycosylations were performed. Three model disaccharides were synthesized in parallel in solution phase and used as carbohydrate probes for the fabrication of carbohydrate chips. Three enzymatic glycosylations on glass slides were performed consecutively to generate carbohydrate microarrays that contained the complex oligosaccharide, sialyl Le(x). Overall, this work demonstrated that carbohydrate chips can be efficiently prepared by covalent immobilization of maleimide-linked carbohydrates on thiol-coated glass slides and applied to the high-throughput analysis of carbohydrate-protein interactions.

  20. A strategy for primary high throughput cytotoxicity screening in pharmaceutical toxicology.

    Science.gov (United States)

    Bugelski, P J; Atif, U; Molton, S; Toeg, I; Lord, P G; Morgan, D G

    2000-10-01

    Recent advances in combinatorial chemistry and high throughput screens for pharmacologic activity have created an increasing demand for in vitro high throughput screens for toxicological evaluation in the early phases of drug discovery. To develop a strategy for such a screen, we have conducted a data mining study of the National Cancer Institute's Developmental Therapeutics Program (DTP) cytotoxicity database. Using hierarchical cluster analysis, we confirmed that the different tissues of origin and individual cell lines showed differential sensitivity to compounds in the DTP Standard Agents database. Surprisingly, however, approaching the data globally, linear regression analysis showed that the differences were relatively minor. Comparison with the literature on acute toxicity in mice showed that the predictive power of growth inhibition was marginally superior to that of cell death. This data mining study suggests that, in designing a strategy for high throughput cytotoxicity screening: a single cell line, the choice of which may not be critical, can be used as a primary screen; a single end point may be an adequate measure; and a cut-off value for 50% growth inhibition between 10⁻⁶ and 10⁻⁸ M may be a reasonable starting point for accepting a cytotoxic compound for scale-up and further study.
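
    The suggested primary-screen rule can be written down directly; the specific cut-off chosen below (1e-7 M) is an assumption inside the 10⁻⁸–10⁻⁶ M window mentioned above, and the compound names are placeholders:

        GI50_CUTOFF_M = 1e-7   # assumed value within the suggested window

        def accept_for_follow_up(gi50_molar):
            # accept a compound for scale-up when its 50% growth-inhibition
            # concentration in the single screening cell line is at or below the cut-off
            return gi50_molar is not None and gi50_molar <= GI50_CUTOFF_M

        screen = {"compound_A": 3e-8, "compound_B": 5e-6, "compound_C": None}   # GI50 values in M
        shortlist = [name for name, gi50 in screen.items() if accept_for_follow_up(gi50)]
        print(shortlist)   # -> ['compound_A']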

  1. High-throughput gene expression profiling of memory differentiation in primary human T cells

    Directory of Open Access Journals (Sweden)

    Russell Kate

    2008-08-01

    Full Text Available Abstract Background The differentiation of naive T and B cells into memory lymphocytes is essential for immunity to pathogens. Therapeutic manipulation of this cellular differentiation program could improve vaccine efficacy and the in vitro expansion of memory cells. However, chemical screens to identify compounds that induce memory differentiation have been limited by (1) the lack of reporter-gene or functional assays that can distinguish naive and memory-phenotype T cells at high throughput and (2) the lack of a suitable cell line representative of naive T cells. Results Here, we describe a method for gene-expression based screening that allows primary naive and memory-phenotype lymphocytes to be discriminated based on complex gene signatures corresponding to these differentiation states. We used ligation-mediated amplification and a fluorescent, bead-based detection system to quantify simultaneously 55 transcripts representing naive and memory-phenotype signatures in purified populations of human T cells. The use of a multi-gene panel allowed better resolution than any constituent single gene. The method was precise, correlated well with Affymetrix microarray data, and could be easily scaled up for high throughput. Conclusion This method provides a generic solution for high-throughput differentiation screens in primary human T cells where no single-gene or functional assay is available. This screening platform will allow the identification of small molecules, genes or soluble factors that direct memory differentiation in naive human lymphocytes.

  2. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online.
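
    The feature set named above (aromaticity, hydropathy, isoelectric point) maps directly onto standard sequence descriptors; the sketch below, which only assumes Biopython and scikit-learn and uses placeholder sequences and labels, illustrates the idea rather than reproducing the notebooks published by the authors:

        from Bio.SeqUtils.ProtParam import ProteinAnalysis
        from sklearn.ensemble import RandomForestClassifier

        def sequence_features(seq):
            pa = ProteinAnalysis(seq)
            return [pa.aromaticity(), pa.gravy(), pa.isoelectric_point()]

        # fragments and labels are placeholders; real data would come from the HPA pipeline
        fragments = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "MDDDIAALVVDNGSGMCKAGFAGDDAPRAVFPS"]
        expressed = [1, 0]   # 1 = expressed in E. coli, 0 = failed (illustrative labels)

        X = [sequence_features(s) for s in fragments]
        model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, expressed)
        print(model.predict([sequence_features(fragments[0])]))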

  3. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    Science.gov (United States)

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047

  4. Arabidopsis Seed Content QTL Mapping Using High-Throughput Phenotyping: The Assets of Near Infrared Spectroscopy.

    Science.gov (United States)

    Jasinski, Sophie; Lécureuil, Alain; Durandet, Monique; Bernard-Moulin, Patrick; Guerche, Philippe

    2016-01-01

    Seed storage compounds are of crucial importance for human diet, feed and industrial uses. In oleo-proteaginous species like rapeseed, seed oil and protein are the qualitative determinants that conferred economic value to the harvested seed. To date, although the biosynthesis pathways of oil and storage protein are rather well-known, the factors that determine how these types of reserves are partitioned in seeds have to be identified. With the aim of implementing a quantitative genetics approach, requiring phenotyping of 100s of plants, our first objective was to establish near-infrared reflectance spectroscopic (NIRS) predictive equations in order to estimate oil, protein, carbon, and nitrogen content in Arabidopsis seed with high-throughput level. Our results demonstrated that NIRS is a powerful non-destructive, high-throughput method to assess the content of these four major components studied in Arabidopsis seed. With this tool in hand, we analyzed Arabidopsis natural variation for these four components and illustrated that they all displayed a wide range of variation. Finally, NIRS was used in order to map QTL for these four traits using seeds from the Arabidopsis thaliana Ct-1 × Col-0 recombinant inbred line population. Some QTL co-localized with QTL previously identified, but others mapped to chromosomal regions never identified so far for such traits. This paper illustrates the usefulness of NIRS predictive equations to perform accurate high-throughput phenotyping of Arabidopsis seed content, opening new perspectives in gene identification following QTL mapping and genome wide association studies.
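
    The abstract does not name the regression method behind the NIRS predictive equations; partial least squares is a common choice for NIRS calibration, so the hedged sketch below uses it on synthetic spectra purely to illustrate the calibration-and-cross-validation idea:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        spectra = rng.random((60, 200))                        # 60 seed samples x 200 wavelengths (synthetic)
        oil = spectra[:, 50] * 10 + rng.normal(0, 0.2, 60)     # synthetic reference values (% oil)

        pls = PLSRegression(n_components=5)
        predicted = cross_val_predict(pls, spectra, oil, cv=5).ravel()
        r2 = 1 - np.sum((oil - predicted) ** 2) / np.sum((oil - oil.mean()) ** 2)
        print(f"cross-validated R^2 = {r2:.2f}")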

  5. Arabidopsis seed content QTL mapping using high-throughput phenotyping: the assets of Near Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Sophie Jasinski

    2016-11-01

    Full Text Available Seed storage compounds are of crucial importance for human diet, feed and industrial uses. In oleo-proteaginous species like rapeseed, seed oil and protein are the qualitative determinants that conferred economic value to the harvested seed. To date, although the biosynthesis pathways of oil and storage protein are rather well known, the factors that determine how these types of reserves are partitioned in seeds have to be identified. With the aim of implementing a quantitative genetics approach, requiring phenotyping of hundreds of plants, our first objective was to establish near-infrared reflectance spectroscopic (NIRS) predictive equations in order to estimate oil, protein, carbon and nitrogen content in Arabidopsis seed with high-throughput level. Our results demonstrated that NIRS is a powerful non-destructive, high-throughput method to assess the content of these four major components studied in Arabidopsis seed. With this tool in hand, we analysed Arabidopsis natural variation for these four components and illustrated that they all displayed a wide range of variation. Finally, NIRS was used in order to map QTL for these four traits using seeds from the Arabidopsis thaliana Ct-1 x Col-0 recombinant inbred line population. Some QTL co-localised with QTL previously identified, but others mapped to chromosomal regions never identified so far for such traits. This paper illustrates the usefulness of NIRS predictive equations to perform accurate high-throughput phenotyping of Arabidopsis seed content, opening new perspectives in gene identification following QTL mapping and Genome Wide Association Studies.

  6. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    Science.gov (United States)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10⁴-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.

  7. Evaluation of a pooled strategy for high-throughput sequencing of cosmid clones from metagenomic libraries.

    Directory of Open Access Journals (Sweden)

    Kathy N Lam

    Full Text Available High-throughput sequencing methods have been instrumental in the growing field of metagenomics, with technological improvements enabling greater throughput at decreased costs. Nonetheless, the economy of high-throughput sequencing cannot be fully leveraged in the subdiscipline of functional metagenomics. In this area of research, environmental DNA is typically cloned to generate large-insert libraries from which individual clones are isolated, based on specific activities of interest. Sequence data are required for complete characterization of such clones, but the sequencing of a large set of clones requires individual barcode-based sample preparation; this can become costly, as the cost of clone barcoding scales linearly with the number of clones processed, and thus sequencing a large number of metagenomic clones often remains cost-prohibitive. We investigated a hybrid Sanger/Illumina pooled sequencing strategy that omits barcoding altogether, and we evaluated this strategy by comparing the pooled sequencing results to reference sequence data obtained from traditional barcode-based sequencing of the same set of clones. Using identity and coverage metrics in our evaluation, we show that pooled sequencing can generate high-quality sequence data, without producing problematic chimeras. Though caveats of a pooled strategy exist and further optimization of the method is required to improve recovery of complete clone sequences and to avoid circumstances that generate unrecoverable clone sequences, our results demonstrate that pooled sequencing represents an effective and low-cost alternative for sequencing large sets of metagenomic clones.

  8. High-throughput screening to identify selective inhibitors of microbial sulfate reduction (and beyond)

    Science.gov (United States)

    Carlson, H. K.; Coates, J. D.; Deutschbauer, A. M.

    2015-12-01

    The selective perturbation of complex microbial ecosystems to predictably influence outcomes in engineered and industrial environments remains a grand challenge for geomicrobiology. In some industrial ecosystems, such as oil reservoirs, sulfate-reducing microorganisms (SRM) produce hydrogen sulfide, which is toxic, explosive and corrosive. Current strategies to selectively inhibit sulfidogenesis are based on non-specific biocide treatments, bio-competitive exclusion using alternative electron acceptors, or sulfate analogs that act as competitive inhibitors or futile/alternative substrates of the sulfate reduction pathway. Despite the economic cost of sulfidogenesis, there has been minimal exploration of the chemical space of possible inhibitory compounds, and very little work has quantitatively assessed the selectivity of putative souring treatments. We have developed a high-throughput screening strategy to target SRM, quantitatively ranked the selectivity and potency of hundreds of compounds, and identified previously unrecognized SRM-selective inhibitors and synergistic interactions between inhibitors. Once inhibitor selectivity is defined, high-throughput characterization of microbial community structure across compound gradients and identification of fitness determinants using isolate bar-coded transposon mutant libraries can give insights into the genetic mechanisms whereby compounds structure microbial communities. The high-throughput (HT) approach we present can be readily applied to target SRM in diverse environments and, more broadly, could be used to identify and quantify the potency and selectivity of inhibitors of a variety of microbial metabolisms. Our findings and approach are relevant for engineering environmental ecosystems and for understanding the role of natural gradients in shaping microbial niche space.

  9. High throughput workflow for coacervate formation and characterization in shampoo systems.

    Science.gov (United States)

    Kalantar, T H; Tucker, C J; Zalusky, A S; Boomgaard, T A; Wilson, B E; Ladika, M; Jordan, S L; Li, W K; Zhang, X; Goh, C G

    2007-01-01

    Cationic cellulosic polymers find wide utility as benefit agents in shampoo. Deposition of these polymers onto hair has been shown to mend split ends, improve appearance and wet combing, and provide controlled delivery of insoluble actives. The deposition is thought to be enhanced by the formation of a polymer/surfactant complex that phase-separates from the bulk solution upon dilution. A standard method exists to characterize coacervate formation upon dilution, but the test is prohibitive in both time and material. We have developed a semi-automated high throughput workflow to characterize the coacervate-forming behavior of different shampoo formulations. A procedure that allows testing of real-use shampoo dilutions without first formulating a complete shampoo was identified. This procedure was adapted to a Tecan liquid handler by optimizing the parameters for liquid dispensing as well as for mixing. The high throughput workflow enabled preparation and testing of hundreds of formulations with different types and levels of cationic cellulosic polymers and surfactants, and for each formulation a haze diagram was constructed. Optimal formulations and their dilutions that give substantial coacervate formation (determined by haze measurements) were identified. Results from this high throughput workflow were shown to reproduce standard haze and bench-top turbidity measurements, and the workflow has the advantages of using less material and allowing more variables to be tested with significant time savings.

  10. A high throughput array microscope for the mechanical characterization of biomaterials

    Science.gov (United States)

    Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard

    2015-02-01

    In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer of connective tissue whose expression has been implicated in cancer progression.
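
    The apparent-viscosity read-out described above rests on standard passive microrheology: the mean squared displacement (MSD) of a diffusing bead gives a diffusion coefficient, which the Stokes-Einstein relation converts to viscosity. The sketch below illustrates that calculation only; the bead trajectory is a simulated placeholder and the analysis is not the authors' software.

      # Standard passive-microrheology calculation assumed to underlie bead-diffusion
      # viscometry: mean squared displacement -> diffusion coefficient -> Stokes-Einstein.
      # `track` is a simulated (N, 2) array of x, y bead positions in metres.
      import numpy as np

      k_B, T = 1.380649e-23, 298.15       # J/K, K
      radius = 0.5e-6                     # bead radius in metres (assumed 1 um bead)
      dt = 0.02                           # seconds between frames (50 frames per second)

      rng = np.random.default_rng(1)
      track = np.cumsum(rng.normal(scale=50e-9, size=(5000, 2)), axis=0)

      def msd(track, max_lag=50):
          lags = np.arange(1, max_lag + 1)
          return lags, np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                                 for lag in lags])

      lags, values = msd(track)
      D = np.polyfit(lags * dt, values, 1)[0] / 4.0     # 2D Brownian motion: MSD = 4*D*t
      eta = k_B * T / (6 * np.pi * D * radius)          # Stokes-Einstein
      print(f"apparent viscosity ~ {eta:.3e} Pa*s")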

  11. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  12. High Throughput, Polymeric Aqueous Two-Phase Printing of Tumor Spheroids

    Science.gov (United States)

    Atefi, Ehsan; Lemmo, Stephanie; Fyffe, Darcy; Luker, Gary D.; Tavana, Hossein

    2014-01-01

    This paper presents a new 3D culture microtechnology for high throughput production of tumor spheroids and validates its utility for screening anti-cancer drugs. We use two immiscible polymeric aqueous solutions and microprint a submicroliter drop of the “patterning” phase containing cells into a bath of the “immersion” phase. Selecting proper formulations of biphasic systems using a panel of biocompatible polymers results in the formation of a round drop that confines cells to facilitate spontaneous formation of a spheroid without any external stimuli. Adapting this approach to robotic tools enables straightforward generation and maintenance of spheroids of well-defined size in standard microwell plates and biochemical analysis of spheroids in situ, which is not possible with existing techniques for spheroid culture. To enable high throughput screening, we establish a phase diagram to identify minimum cell densities within specific volumes of the patterning drop to result in a single spheroid. Spheroids show normal growth over long-term incubation and dose-dependent decrease in cellular viability when treated with drug compounds, but present significant resistance compared to monolayer cultures. The unprecedented ease of implementing this microtechnology and its robust performance will benefit high throughput studies of drug screening against cancer cells with physiologically-relevant 3D tumor models. PMID:25411577

  13. The application of the high throughput sequencing technology in the transposable elements.

    Science.gov (United States)

    Liu, Zhen; Xu, Jian-hong

    2015-09-01

    High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing, and decreased the costs to a great extent. Meanwhile, this technology usually has advantages of better specificity, higher sensitivity and accuracy. Therefore, it has been applied to the research on genetic variations, transcriptomics and epigenomics. Recently, this technology has been widely employed in the studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology in the fields of transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers as well as transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high throughput sequencing technology, especially the third generation sequencing technology, and its application in transposon studies in the future, hopefully providing a comprehensive understanding and reference for related scientific researchers.

  14. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequences analysis.

  15. A general approach for discriminative de novo motif discovery from high-throughput data.

    Science.gov (United States)

    Grau, Jan; Posch, Stefan; Grosse, Ivo; Keilwagen, Jens

    2013-11-01

    De novo motif discovery has been an important challenge of bioinformatics for the past two decades. Since the emergence of high-throughput techniques like ChIP-seq, ChIP-exo and protein-binding microarrays (PBMs), the focus of de novo motif discovery has shifted to runtime and accuracy on large data sets. For this purpose, specialized algorithms have been designed for discovering motifs in ChIP-seq or PBM data. However, none of the existing approaches work perfectly for all three high-throughput techniques. In this article, we propose Dimont, a general approach for fast and accurate de novo motif discovery from high-throughput data. We demonstrate that Dimont yields a higher number of correct motifs from ChIP-seq data than any of the specialized approaches and achieves a higher accuracy for predicting PBM intensities from probe sequence than any of the approaches specifically designed for that purpose. Dimont also reports the expected motifs for several ChIP-exo data sets. Investigating differences between in vitro and in vivo binding, we find that for most transcription factors, the motifs discovered by Dimont are in good accordance between techniques, but we also find notable exceptions. We also observe that modeling intra-motif dependencies may increase accuracy, which indicates that more complex motif models are a worthwhile field of research.
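
    For readers unfamiliar with the underlying machinery, most de novo motif discovery tools, Dimont included, ultimately score candidate binding sites against a position weight matrix (PWM). The sketch below shows only that basic scoring step on an invented motif and invented sequences; it is a conceptual illustration, not Dimont's algorithm.

      # Conceptual illustration only: scoring sequences against a position weight matrix
      # (PWM), the core primitive that motif-discovery tools such as Dimont build on.
      # This is NOT Dimont's algorithm; motif counts and sequences are invented.
      import numpy as np

      BASES = "ACGT"
      counts = np.array([[8, 1, 1, 0],        # rows = motif positions
                         [0, 9, 0, 1],        # columns = observed A, C, G, T counts
                         [0, 0, 10, 0],
                         [1, 0, 1, 8]], dtype=float)
      probs = (counts + 0.25) / (counts + 0.25).sum(axis=1, keepdims=True)
      pwm = np.log2(probs / 0.25)             # log-odds against a uniform background

      def best_site_score(seq, pwm):
          """Best log-odds score of any window of `seq` against the PWM."""
          width = pwm.shape[0]
          idx = [BASES.index(b) for b in seq]
          return max(sum(pwm[j, idx[i + j]] for j in range(width))
                     for i in range(len(seq) - width + 1))

      for seq in ("TTACGTTT", "TTTTTTTT"):
          print(seq, round(best_site_score(seq, pwm), 2))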

  16. Comprehensive molecular diagnosis of Bardet-Biedl syndrome by high-throughput targeted exome sequencing.

    Directory of Open Access Journals (Sweden)

    Dong-Jun Xing

    Full Text Available Bardet-Biedl syndrome (BBS) is an autosomal recessive disorder with significant genetic heterogeneity. BBS is linked to mutations in 17 genes, which contain more than 200 coding exons. Currently, BBS is diagnosed by direct DNA sequencing for mutations in these genes, which, because of the large genomic screening region, is both time-consuming and expensive. In order to develop a practical method for the clinical diagnosis of BBS, we have developed a high-throughput targeted exome sequencing (TES) method for genetic diagnosis. Five typical BBS patients were recruited and screened for mutations in a total of 144 known genes responsible for inherited retinal diseases, a hallmark symptom of BBS. The genomic DNA of these patients and their families were subjected to high-throughput DNA re-sequencing. Deep bioinformatics analysis was carried out to filter the massive sequencing data, which were further confirmed through co-segregation analysis. TES successfully revealed mutations in BBS genes in each patient and family member. Six pathological mutations, including five novel mutations, were revealed in the genes BBS2, MKKS, ARL6, and MKS1. This study represents the first report of targeted exome sequencing in BBS patients and demonstrates that high-throughput TES is an accurate and rapid method for the genetic diagnosis of BBS.

  17. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    Science.gov (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  18. Multi-shaped-beam (MSB): an evolutionary approach for high throughput e-beam lithography

    Science.gov (United States)

    Slodowski, Matthias; Döring, Hans-Joachim; Stolberg, Ines A.; Dorl, Wolfgang

    2010-09-01

    The development of next-generation lithography (NGL) technologies such as EUV, NIL and maskless lithography (ML2) is driven by the half-pitch reduction and increasing integration density of integrated circuits down to the 22 nm node and beyond. For electron beam direct write (EBDW), several revolutionary pixel-based concepts have been under development for several years. By contrast, this paper presents an evolutionary, full-package, high-throughput multi-electron-beam approach called Multi Shaped Beam (MSB), which is based on proven Variable Shaped Beam (VSB) technology. In the past decade VSB has already been applied in EBDW for device learning, early prototyping and low-volume fabrication in production environments for both silicon and compound semiconductor applications. Above all, the high resolution and the high flexibility afforded by avoiding expensive masks for critical layers have made it an attractive solution for advanced technology nodes down to 32 nm half pitch. The throughput limitation of VSB has been mitigated in a major extension of the technology by qualifying cell projection (CP) for concurrent use with VSB. With CP, more pixels in complex shapes can be projected in one shot, enabling a remarkable shot-count reduction for repetitive patterns. The most advanced step to extend the mature VSB technology toward higher throughput is its parallelization in one column using MEMS-based multi-deflection arrays. With this Vistec MSB technology, multiple shaped beamlets are generated simultaneously, each individually controllable in shape, size and beam-on time. Compared to pixel-based ML2 approaches, the MSB technology enables the maskless, variable and parallel projection of a large number of pixels per beamlet times the number of beamlets. Basic concepts, exposure examples and performance results for each of the described throughput-enhancement steps are presented.

  19. Neuraminidase activity provides a practical read-out for a high throughput influenza antiviral screening assay

    Directory of Open Access Journals (Sweden)

    Wu Meng

    2008-09-01

    Full Text Available Background: The emergence of influenza strains that are resistant to commonly used antivirals has highlighted the need to develop new compounds that target viral gene products or host mechanisms that are essential for effective virus replication. Existing assays to identify potential antiviral compounds often use high throughput screening assays that target specific viral replication steps. To broaden the search for antivirals, cell-based replication assays can be performed, but these are often labor intensive and have limited throughput. Results: We have adapted a traditional virus neutralization assay to develop a practical, cell-based, high throughput screening assay. This assay uses viral neuraminidase (NA) as a read-out to quantify influenza replication, thereby offering an assay that is both rapid and sensitive. In addition to identification of inhibitors that target either viral or host factors, the assay allows simultaneous evaluation of drug toxicity. Antiviral activity was demonstrated for a number of known influenza inhibitors including amantadine, which targets the M2 ion channel; zanamivir, which targets NA; ribavirin, which targets IMP dehydrogenase; and bis-indolyl maleimide, which targets protein kinase A/C. Amantadine-resistant strains were identified by comparing IC50 with that of the wild-type virus. Conclusion: Antivirals with specificity for a broad range of targets are easily identified in an accelerated viral inhibition assay that uses NA as a read-out of replication. This assay is suitable for high throughput screening to identify potential antivirals or can be used to identify drug-resistant influenza strains.

  20. A 3D-printed mini-hydrocyclone for high throughput particle separation: application to primary harvesting of microalgae.

    Science.gov (United States)

    Shakeel Syed, Maira; Rafeie, Mehdi; Henderson, Rita; Vandamme, Dries; Asadnia, Mohsen; Ebrahimi Warkiani, Majid

    2017-07-11

    The separation of micro-sized particles in a continuous flow is a crucial part of many industrial processes, from biopharmaceutical manufacturing to water treatment. Conventional separation techniques such as centrifugation and membrane filtration are largely limited by factors such as clogging, processing time and operational efficiency. Microfluidic techniques have been gaining great attention in recent years as efficient and powerful approaches for particle-liquid separation. Yet producing such systems using standard microfabrication techniques has proven to be tedious and costly, and the resulting devices often have cumbersome user interfaces, all of which render commercialization difficult. Here, we demonstrate the design, fabrication and evaluation, by both CFD simulation and experiment, of 3D-printed miniaturized hydrocyclones with smaller cut-size for high-throughput particle/cell sorting. The characteristics of the mini-cyclones were numerically investigated using computational fluid dynamics (CFD) techniques, revealing that reducing the size of the cyclone results in a smaller cut-size of the particles. To showcase its utility, high-throughput harvesting of the marine microalga Tetraselmis suecica from its culture medium with low energy input is demonstrated. The final microalgal biomass concentration was increased 7.13-fold in 11 minutes of operation using our designed hydrocyclone (HC-1). We expect that this elegant approach can surmount the shortcomings of other microfluidic technologies such as clogging, low throughput, cost and difficulty of operation. By moving away from the production of planar microfluidic systems using conventional microfabrication techniques and embracing 3D-printing technology for the construction of discrete elements, we envision that 3D-printed mini-cyclones can become part of a library of standardized active and passive microfluidic components, suitable for particle-liquid separation.

  1. A simple dual online ultra-high pressure liquid chromatography system (sDO-UHPLC) for high throughput proteome analysis.

    Science.gov (United States)

    Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won

    2015-08-21

    We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system. The system employs only two nano-volume switching valves (a two-position four-port valve and a two-position ten-port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features, including a high duty cycle, back-flushing injection for fast, narrow-zone sample loading, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high throughput deep proteome profiling experiments but also in high throughput MRM experiments.

  2. High-throughput screening for industrial enzyme production hosts by droplet microfluidics

    DEFF Research Database (Denmark)

    Sjostrom, Staffan L.; Bai, Yunpeng; Huang, Mingtao

    2014-01-01

    A high-throughput method for single-cell screening by microfluidic droplet sorting is applied to a whole-genome mutated yeast cell library, yielding improved production hosts for secreted industrial enzymes. The sorting method is validated by enriching a yeast strain 14-fold based on its α-amylase production, close to the theoretical maximum enrichment. Furthermore, a 10^5-member yeast cell library is screened, yielding a clone with a more than 2-fold increase in α-amylase production. The increase in enzyme production results from an improvement of the cellular functions of the production host … with the genotype (contained in the cell) inside a droplet enables selection of single cells with improved enzyme production capacity by droplet sorting. The platform has a throughput over 300 times higher than that of the current industry standard, an automated microtiter plate screening system. At the same time …

  3. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules.

    Directory of Open Access Journals (Sweden)

    Antonino Ingargiola

    Full Text Available We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions.
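
    For context, the per-burst quantity at the heart of such diffusion-based smFRET measurements is the proximity ratio and its gamma-corrected FRET efficiency, E = I_A / (I_A + gamma * I_D). The sketch below computes both from invented photon counts and an assumed gamma; real analysis pipelines add further corrections (background, spectral leakage, direct excitation) that are omitted here.

      # Proximity ratio and gamma-corrected FRET efficiency per burst,
      # E = I_A / (I_A + gamma * I_D). Photon counts and gamma are invented; real
      # pipelines add background, leakage and direct-excitation corrections.
      import numpy as np

      donor_counts    = np.array([120,  80, 200,  40])   # placeholder per-burst donor photons
      acceptor_counts = np.array([ 60, 110,  50, 160])   # placeholder per-burst acceptor photons
      gamma = 1.2                                         # assumed detection-correction factor

      proximity_ratio = acceptor_counts / (acceptor_counts + donor_counts)
      fret_efficiency = acceptor_counts / (acceptor_counts + gamma * donor_counts)

      print(np.round(proximity_ratio, 2))
      print(np.round(fret_efficiency, 2))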

  4. Optimisation Issues of High Throughput Medical Data and Video Streaming Traffic in 3G Wireless Environments.

    Science.gov (United States)

    Istepanian, R S H; Philip, N

    2005-01-01

    In this paper we describe some of the optimisation issues relevant to the requirements of high-throughput medical data and video streaming traffic in 3G wireless environments. In particular, we present a challenging 3G mobile healthcare application that demands high 3G medical data throughput. We also describe the 3G QoS requirements of the mObile Tele-Echography ultra-Light rObot (OTELO) system, which is designed to provide seamless 3G connectivity for real-time ultrasound video streams and diagnosis between a remote site (the robotic and patient station) and an expert site (specialists) that controls the robotic scanning operation and provides real-time diagnostic feedback over 3G wireless communication links.

  5. Droplet-based microfluidics platform for ultra-high-throughput bioprospecting of cellulolytic microorganisms.

    Science.gov (United States)

    Najah, Majdi; Calbrix, Raphaël; Mahendra-Wijaya, I Putu; Beneyton, Thomas; Griffiths, Andrew D; Drevelle, Antoine

    2014-12-18

    Discovery of microorganisms producing enzymes that can efficiently hydrolyze cellulosic biomass is of great importance for biofuel production. To date, however, only a minuscule fraction of natural biodiversity has been tested because of the relatively low throughput of screening systems and their limitation to screening only culturable microorganisms. Here, we describe an ultra-high-throughput droplet-based microfluidic system that allowed the screening of over 100,000 cells in less than 20 min. Uncultured bacteria from a wheat stubble field were screened directly by compartmentalization of single bacteria in 20 pl droplets containing a fluorogenic cellobiohydrolase substrate. Sorting of droplets based on cellobiohydrolase activity resulted in a bacterial population with 17- and 7-fold higher cellobiohydrolase and endoglucanase activity, respectively, and very different taxonomic diversity than when selected for growth on medium containing starch and carboxymethylcellulose as carbon source. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. High Throughput Pseudorandom Number Generator Based on Variable Argument Unified Hyperchaos

    Directory of Open Access Journals (Sweden)

    Kaiyu Wang

    2014-01-01

    Full Text Available This paper presents a new multi-output, high-throughput pseudorandom number generator. The scheme uses a homogenized logistic chaotic sequence as a parameter of the unified hyperchaotic system, so that the unified hyperchaotic system can move between different chaotic regimes and its output becomes more complex as the homogenized logistic output changes. By processing the four outputs of the unified hyperchaotic system, the output is extended to 26 channels. The generated pseudorandom sequences have all passed the NIST SP800-22 standard tests and the DIEHARD test. The system is designed in Verilog HDL and experimentally verified on a Xilinx Spartan 6 FPGA, achieving a maximum throughput of 16.91 Gbits/s for the native chaotic output and 13.49 Gbits/s for the resulting pseudorandom number generators.
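
    As a much-simplified picture of the chaos-to-bits idea behind such generators, the sketch below iterates a plain logistic map and thresholds its state into a bit stream. It is purely illustrative and makes no claim to the statistical quality or the FPGA architecture of the published design.

      # Purely illustrative chaos-to-bits sketch: iterate the logistic map and threshold
      # its state into bits. The published generator (homogenized logistic sequence
      # driving a unified hyperchaotic system on an FPGA) is far more elaborate, and no
      # claim is made here about the statistical quality of this toy stream.
      def logistic_bits(seed=0.123456789, r=4.0, n_bits=64, skip=1000):
          """Iterate x <- r*x*(1-x) and emit one thresholded bit per iteration."""
          x = seed
          for _ in range(skip):                  # discard the transient
              x = r * x * (1.0 - x)
          bits = []
          for _ in range(n_bits):
              x = r * x * (1.0 - x)
              bits.append(1 if x >= 0.5 else 0)  # crude threshold extraction
          return bits

      print("".join(str(b) for b in logistic_bits()))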

  7. High Throughput Line-of-Sight MIMO Systems for Next Generation Backhaul Applications

    Science.gov (United States)

    Song, Xiaohang; Cvetkovski, Darko; Hälsig, Tim; Rave, Wolfgang; Fettweis, Gerhard; Grass, Eckhard; Lankl, Berthold

    2017-09-01

    The evolution to ultra-dense next generation networks requires a massive increase in throughput and deployment flexibility. Therefore, novel wireless backhaul solutions that can support these demands are needed. In this work we present an approach for a millimeter wave line-of-sight MIMO backhaul design, targeting transmission rates in the order of 100 Gbit/s. We provide theoretical foundations for the concept showcasing its potential, which are confirmed through channel measurements. Furthermore, we provide insights into the system design with respect to antenna array setup, baseband processing, synchronization, and channel equalization. Implementation in a 60 GHz demonstrator setup proves the feasibility of the system concept for high throughput backhauling in next generation networks.
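
    A useful rule of thumb behind line-of-sight MIMO designs of this kind is that, for uniform linear arrays, the product of transmit and receive antenna spacings should be roughly the wavelength times the link distance divided by the number of antennas, so that the spatial sub-channels stay orthogonal. The sketch below evaluates that rule for an assumed 60 GHz hop; the link distance and array size are illustrative and not taken from the paper.

      # Back-of-the-envelope line-of-sight MIMO design rule for uniform linear arrays:
      # d_tx * d_rx ~= wavelength * link_distance / N keeps the spatial sub-channels
      # orthogonal. Carrier, link distance and array size below are assumptions for
      # illustration, not figures from the paper.
      c = 299_792_458.0            # m/s
      frequency = 60e9             # assumed 60 GHz carrier
      wavelength = c / frequency

      n_antennas = 4               # antennas per array (assumed)
      link_distance = 500.0        # metres (assumed backhaul hop)

      spacing = (wavelength * link_distance / n_antennas) ** 0.5   # equal spacing both ends
      print(f"optimal antenna spacing ~ {spacing:.2f} m for a {link_distance:.0f} m hop")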

  8. High-throughput sockets over RDMA for the Intel Xeon Phi coprocessor

    CERN Document Server

    Santogidis, Aram

    2017-01-01

    In this paper we describe the design, implementation and performance of Trans4SCIF, a user-level socket-like transport library for the Intel Xeon Phi coprocessor. Trans4SCIF library is primarily intended for high-throughput applications. It uses RDMA transfers over the native SCIF support, in a way that is transparent for the application, which has the illusion of using conventional stream sockets. We also discuss the integration of Trans4SCIF with the ZeroMQ messaging library, used extensively by several applications running at CERN. We show that this can lead to a substantial, up to 3x, increase of application throughput compared to the default TCP/IP transport option.

  9. A high-throughput and quantitative method to assess the mutagenic potential of translesion DNA synthesis

    Science.gov (United States)

    Taggart, David J.; Camerlengo, Terry L.; Harrison, Jason K.; Sherrer, Shanen M.; Kshetry, Ajay K.; Taylor, John-Stephen; Huang, Kun; Suo, Zucai

    2013-01-01

    Cellular genomes are constantly damaged by endogenous and exogenous agents that covalently and structurally modify DNA to produce DNA lesions. Although most lesions are mended by various DNA repair pathways in vivo, a significant number of damage sites persist during genomic replication. Our understanding of the mutagenic outcomes derived from these unrepaired DNA lesions has been hindered by the low throughput of existing sequencing methods. Therefore, we have developed a cost-effective high-throughput short oligonucleotide sequencing assay that uses next-generation DNA sequencing technology for the assessment of the mutagenic profiles of translesion DNA synthesis catalyzed by any error-prone DNA polymerase. The vast amount of sequencing data produced was aligned and quantified using our novel software. As an example, the high-throughput short oligonucleotide sequencing assay was used to analyze the types and frequencies of mutations upstream, downstream and at a site-specifically placed cis–syn thymidine–thymidine dimer generated individually by three lesion-bypass human Y-family DNA polymerases. PMID:23470999
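
    The quantification step of such an assay boils down to tabulating, per position around the lesion, how often each base substitution appears in the aligned reads. The toy sketch below shows that bookkeeping on an invented reference and a handful of invented reads; it stands in for, and is much simpler than, the authors' alignment and quantification software.

      # Toy bookkeeping step: tabulate mutation types and frequencies per position from
      # reads aligned to a short reference around a lesion site. Reference and reads are
      # invented; this stands in for the study's own alignment/quantification software.
      from collections import Counter, defaultdict

      reference = "ACGTTAGC"          # the lesion would sit at a known position (e.g. the TT)
      reads = ["ACGTTAGC", "ACGATAGC", "ACGTTAGC", "ACGTAAGC", "ACGATAGC"]

      per_position = defaultdict(Counter)
      for read in reads:
          for pos, (ref_base, read_base) in enumerate(zip(reference, read)):
              if read_base != ref_base:
                  per_position[pos][f"{ref_base}->{read_base}"] += 1

      for pos in sorted(per_position):
          for change, count in per_position[pos].items():
              print(f"position {pos}: {change} in {count}/{len(reads)} reads "
                    f"({100 * count / len(reads):.0f}%)")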

  10. Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines

    Science.gov (United States)

    Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.

    2017-01-01

    Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU 'circular memory' data buffers that enable ready introduction of arbitrary functions into the processing path for 'streams' of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.

  11. DESIGN OF LOW EPI AND HIGH THROUGHPUT CORDIC CELL TO IMPROVE THE PERFORMANCE OF MOBILE ROBOT

    Directory of Open Access Journals (Sweden)

    P. VELRAJKUMAR

    2014-04-01

    Full Text Available This paper focuses on a pass-logic-based design that yields a low Energy Per Instruction (EPI) and high-throughput COordinate Rotation DIgital Computer (CORDIC) cell for robotic exploration applications. The basic components of the CORDIC cell, namely the register, multiplexer and proposed adder, are designed using pass transistor logic (PTL). The proposed adder is implemented in a bit-parallel iterative CORDIC circuit; the circuits are designed with the DSCH2 VLSI CAD tool and their layouts are generated with the Microwind 3 VLSI CAD tool. The propagation delay, area and power dissipation are calculated from the simulated results for the proposed adder-based CORDIC cell, and the EPI, throughput and effect of temperature are calculated from the generated layout. The output parameters of the generated layout are analysed using the BSIM4 advanced analyzer. The simulated results for the proposed adder-based CORDIC circuit are compared with those of other adder-based CORDIC circuits. From this analysis, it was found that the proposed adder-based CORDIC circuit dissipates less power, responds faster, and offers lower EPI and higher throughput.
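
    For readers unfamiliar with CORDIC, the cell computes trigonometric functions purely with shifts, adds and a small angle table. The software sketch below runs the textbook rotation-mode iteration to show what the hardware implements; it says nothing about the paper's pass-transistor-level circuit design.

      # Textbook rotation-mode CORDIC in software, showing the shift-and-add iterations
      # the hardware cell implements. This is the generic algorithm, not the paper's
      # pass-transistor-level design.
      import math

      def cordic_sin_cos(angle, iterations=24):
          """Return (cos(angle), sin(angle)) for |angle| <= pi/2 using CORDIC."""
          table = [math.atan(2.0 ** -i) for i in range(iterations)]
          gain = 1.0
          for i in range(iterations):
              gain /= math.sqrt(1.0 + 2.0 ** (-2 * i))   # pre-compensate the CORDIC gain
          x, y, z = gain, 0.0, angle
          for i in range(iterations):
              d = 1.0 if z >= 0.0 else -1.0
              x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
              z -= d * table[i]
          return x, y

      print(cordic_sin_cos(math.pi / 6))   # ~ (0.8660, 0.5000)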

  12. A High Throughput Medium Access Control Implementation Based on IEEE 802.11e Standard

    Science.gov (United States)

    Huang, Min Li; Lee, Jin; Setiawan, Hendra; Ochi, Hiroshi; Park, Sin-Chong

    With the growing demand for high-performance multimedia applications over wireless channels, we need to develop a Medium Access Control (MAC) system that supports high throughput and quality of service enhancements. This paper presents the standard analysis, design architecture and design issues leading to the implementation of an IEEE 802.11e based MAC system that supports MAC throughput of over 100Mbps. In order to meet the MAC layer timing constraints, a hardware/software co-design approach is adopted. The proposed MAC architecture is implemented on the Xilinx Virtex-II Pro Field-Programmable Gate Array (FPGA) (XC2VP70-5FF1704C) prototype, and connected to a host computer through an external Universal Serial Bus (USB) interface. The total FPGA resource utilization is 11,508 out of 33,088 (34%) available slices. The measured MAC throughput is 100.7Mbps and 109.2Mbps for voice and video access categories, transmitted at a data rate of 260Mbps based on IEEE 802.11n Physical Layer (PHY), using the contention-based hybrid coordination function channel access mechanism.

  13. High-throughput cell mechanical phenotyping for label-free titration assays of cytoskeletal modifications.

    Science.gov (United States)

    Golfier, Stefan; Rosendahl, Philipp; Mietke, Alexander; Herbig, Maik; Guck, Jochen; Otto, Oliver

    2017-08-01

    The mechanical fingerprint of cells is inherently linked to the structure of the cytoskeleton and can serve as a label-free marker for cell homeostasis or pathologic states. How cytoskeletal composition affects the physical response of cells to external loads has been intensively studied with a spectrum of techniques, yet quantitative and statistically powerful investigations in the form of titration assays are hampered by the low throughput of most available methods. In this study, we employ real-time deformability cytometry (RT-DC), a novel microfluidic tool to examine the effects of biochemically modified F-actin and microtubule stability and nuclear chromatin structure on cell deformation in a human leukemia cell line (HL60). The high throughput of our method facilitates extensive titration assays that allow for significance assessment of the observed effects and extraction of half-maximal concentrations for most of the applied reagents. We quantitatively show that integrity of the F-actin cortex and microtubule network dominate cell deformation on millisecond timescales probed with RT-DC. Drug-induced alterations in the nuclear chromatin structure were not found to consistently affect cell deformation. The sensitivity of the high-throughput cell mechanical measurements to the cytoskeletal modifications we present in this study opens up new possibilities for label-free dose-response assays of cytoskeletal modifications. © 2017 The Authors Cytoskeleton Published by Wiley Periodicals, Inc.
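
    Extracting a half-maximal concentration from such a titration is a standard dose-response fit. The sketch below fits a four-parameter Hill curve to an invented set of concentration/deformation pairs using scipy; it illustrates the fitting step only and uses no data from the study.

      # Generic four-parameter dose-response (Hill) fit for extracting a half-maximal
      # concentration from a titration read-out. Concentrations and deformation values
      # are invented placeholders, not measurements from the study.
      import numpy as np
      from scipy.optimize import curve_fit

      def hill(c, bottom, top, ec50, slope):
          return bottom + (top - bottom) / (1.0 + (ec50 / c) ** slope)

      concentration = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])            # uM
      deformation   = np.array([0.031, 0.033, 0.040, 0.055, 0.072, 0.081, 0.083])

      popt, _ = curve_fit(hill, concentration, deformation,
                          p0=[deformation.min(), deformation.max(), 0.3, 1.0])
      print(f"half-maximal concentration ~ {popt[2]:.2f} uM, Hill slope ~ {popt[3]:.2f}")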

  14. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Full Text Available Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in its regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, chromatographic monoliths enable a very fast analysis cycle, thus significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation and increased the sensitivity approximately three-fold compared to the initial experimental conditions, with very good reproducibility. This work is licensed under a Creative Commons Attribution 4.0 International License.

  15. High-Throughput Quantification of Nanoparticle Degradation Using Computational Microscopy and Its Application to Drug Delivery Nanocapsules

    KAUST Repository

    Ray, Aniruddha

    2017-04-25

    Design and synthesis of degradable nanoparticles are very important in drug delivery and biosensing fields. Although accurate assessment of nanoparticle degradation rate would improve the characterization and optimization of drug delivery vehicles, current methods rely on estimating the size of the particles at discrete points over time using, for example, electron microscopy or dynamic light scattering (DLS), among other techniques, all of which have drawbacks and practical limitations. There is a significant need for a high-throughput and cost-effective technology to accurately monitor nanoparticle degradation as a function of time and using small amounts of sample. To address this need, here we present two different computational imaging-based methods for monitoring and quantification of nanoparticle degradation. The first method is suitable for discrete testing, where a computational holographic microscope is designed to track the size changes of protease-sensitive protein-core nanoparticles following degradation, by periodically sampling a subset of particles mixed with proteases. In the second method, a sandwich structure was utilized to observe, in real-time, the change in the properties of liquid nanolenses that were self-assembled around degrading nanoparticles, permitting continuous monitoring and quantification of the degradation process. These cost-effective holographic imaging based techniques enable high-throughput monitoring of the degradation of any type of nanoparticle, using an extremely small amount of sample volume that is at least 3 orders of magnitude smaller than what is required by, for example, DLS-based techniques.

  16. Target-dependent enrichment of virions determines the reduction of high-throughput sequencing in virus discovery.

    Directory of Open Access Journals (Sweden)

    Randi Holm Jensen

    Full Text Available Viral infections cause many different diseases stemming both from well-characterized viral pathogens and from emerging viruses, and the search for novel viruses continues to be of great importance. High-throughput sequencing is an important technology for this purpose. However, viral nucleic acids often constitute a minute proportion of the total genetic material in a sample from infected tissue. Techniques to enrich viral targets for high-throughput sequencing have been reported, but the sensitivity of such methods is not well established. This study compares different library preparation techniques targeting both DNA and RNA, with and without virion enrichment. By optimizing the selection of intact virus particles, using both physical and enzymatic approaches, we assessed the effectiveness of specific enrichment of viral sequences compared to non-enriched sample preparations by selectively identifying and counting read sequences obtained from shotgun sequencing. Using shotgun sequencing of total DNA or RNA, viral targets were detected at concentrations corresponding to the predicted level, providing a foundation for estimating the effectiveness of virion enrichment. Virion enrichment typically produced a 1000-fold increase in the proportion of DNA virus sequences. For RNA virions the gain was less pronounced, with a maximum 13-fold increase. This enrichment varied between the different sample concentrations, with no clear trend. Although less sequencing was required to identify target sequences, it was not evident from our data that a lower detection limit was achieved by virion enrichment compared to shotgun sequencing.

  17. Human Genome Sequencing at the Population Scale: A Primer on High-Throughput DNA Sequencing and Analysis.

    Science.gov (United States)

    Goldfeder, Rachel L; Wall, Dennis P; Khoury, Muin J; Ioannidis, John P A; Ashley, Euan A

    2017-10-15

    Most human diseases have underlying genetic causes. To better understand the impact of genes on disease and its implications for medicine and public health, researchers have pursued methods for determining the sequences of individual genes, then all genes, and now complete human genomes. Massively parallel high-throughput sequencing technology, where DNA is sheared into smaller pieces, sequenced, and then computationally reordered and analyzed, enables fast and affordable sequencing of full human genomes. As the price of sequencing continues to decline, more and more individuals are having their genomes sequenced. This may facilitate better population-level disease subtyping and characterization, as well as individual-level diagnosis and personalized treatment and prevention plans. In this review, we describe several massively parallel high-throughput DNA sequencing technologies and their associated strengths, limitations, and error modes, with a focus on applications in epidemiologic research and precision medicine. We detail the methods used to computationally process and interpret sequence data to inform medical or preventative action. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. AMBIENT: Active Modules for Bipartite Networks--using high-throughput transcriptomic data to dissect metabolic response.

    Science.gov (United States)

    Bryant, William A; Sternberg, Michael J E; Pinney, John W

    2013-03-25

    With the continued proliferation of high-throughput biological experiments, there is a pressing need for tools to integrate the data produced in ways that produce biologically meaningful conclusions. Many microarray studies have analysed transcriptomic data from a pathway perspective, for instance by testing for KEGG pathway enrichment in sets of upregulated genes. However, the increasing availability of species-specific metabolic models provides the opportunity to analyse these data in a more objective, system-wide manner. Here we introduce ambient (Active Modules for Bipartite Networks), a simulated annealing approach to the discovery of metabolic subnetworks (modules) that are significantly affected by a given genetic or environmental change. The metabolic modules returned by ambient are connected parts of the bipartite network that change coherently between conditions, providing a more detailed view of metabolic changes than standard approaches based on pathway enrichment. ambient is an effective and flexible tool for the analysis of high-throughput data in a metabolic context. The same approach can be applied to any system in which reactions (or metabolites) can be assigned a score based on some biological observation, without the limitation of predefined pathways. A Python implementation of ambient is available at http://www.theosysbio.bio.ic.ac.uk/ambient.
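
    To make the search strategy concrete, the sketch below runs a stripped-down simulated annealing loop that grows and shrinks a connected subgraph of a toy network while accepting score-decreasing moves with a temperature-dependent probability. The graph, node scores and scoring function are invented; this is the general annealing idea, not AMBIENT's actual scoring model or implementation.

      # Stripped-down simulated annealing search for a high-scoring connected subgraph
      # ("active module"), in the spirit of AMBIENT but not its scoring model or code.
      # The toy graph and node scores are invented.
      import math, random
      import networkx as nx

      random.seed(0)
      G = nx.erdos_renyi_graph(30, 0.15, seed=0)
      score = {n: random.gauss(0.0, 1.0) for n in G}        # e.g. per-reaction expression change

      def module_score(nodes):
          return sum(score[n] for n in nodes)

      current = {max(G.degree, key=lambda kv: kv[1])[0]}    # start from the best-connected node
      best, best_score = set(current), module_score(current)
      temperature = 1.0
      for step in range(5000):
          boundary = {v for n in current for v in G[n]} - current
          if boundary and (len(current) == 1 or random.random() < 0.5):
              candidate = current | {random.choice(sorted(boundary))}       # grow the module
          else:
              candidate = current - {random.choice(sorted(current))}        # shrink the module
          if not candidate or not nx.is_connected(G.subgraph(candidate)):
              continue
          delta = module_score(candidate) - module_score(current)
          if delta > 0 or random.random() < math.exp(delta / temperature):  # Metropolis rule
              current = candidate
              if module_score(current) > best_score:
                  best, best_score = set(current), module_score(current)
          temperature *= 0.999                                              # geometric cooling
      print(sorted(best), round(best_score, 2))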

  19. RECENT PROCESS AND EQUIPMENT IMPROVEMENTS TO INCREASE HIGH LEVEL WASTE THROUGHPUT AT THE DEFENSE WASTE PROCESSING FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Odriscoll, R; Allan Barnes, A; Jim Coleman, J; Timothy Glover, T; Robert Hopkins, R; Dan Iverson, D; Jeff Leita, J

    2008-01-15

    The Savannah River Site's (SRS) Defense Waste Processing Facility (DWPF) began stabilizing high level waste (HLW) in a glass matrix in 1996. Over the past few years, there have been several process and equipment improvements at the DWPF to increase the rate at which the high level waste can be stabilized. These improvements have either directly increased waste processing rates or have desensitized the process to upsets, thereby minimizing downtime and increasing production. Optimization of waste throughput with increased HLW loading of the glass resulted in a 6% waste throughput increase based upon operational efficiencies. Improvements in canister production include the pour spout heated bellows liner (5%), glass surge (siphon) protection software (2%), a melter feed pump software logic change to prevent spurious interlocks of the feed pump with subsequent dilution of feed stock (2%), and optimization of the steam atomized scrubber (SAS) operation to minimize downtime (3%), for a total increase in canister production of 12%. A number of process recovery efforts have allowed continued operation. These include off-gas system pluggage and restoration, slurry mix evaporator (SME) tank repair and replacement, remote cleaning of the melter top head center nozzle, remote melter internal inspection, SAS pump J-tube recovery, inadvertent-pour scenario resolutions, repair of a dome heater transformer bus bar cooling water leak, and a new infrared camera for determining glass height in the canister.

  20. High-throughput RNA structure probing reveals critical folding events during early 60S ribosome assembly in yeast.

    Science.gov (United States)

    Burlacu, Elena; Lackmann, Fredrik; Aguilar, Lisbeth-Carolina; Belikov, Sergey; Nues, Rob van; Trahan, Christian; Hector, Ralph D; Dominelli-Whiteley, Nicholas; Cockroft, Scott L; Wieslander, Lars; Oeffinger, Marlene; Granneman, Sander

    2017-09-28

    While the protein composition of various yeast 60S ribosomal subunit assembly intermediates has been studied in detail, little is known about ribosomal RNA (rRNA) structural rearrangements that take place during early 60S assembly steps. Using a high-throughput RNA structure probing method, we provide nucleotide resolution insights into rRNA structural rearrangements during nucleolar 60S assembly. Our results suggest that many rRNA-folding steps, such as folding of 5.8S rRNA, occur at a very specific stage of assembly, and propose that downstream nuclear assembly events can only continue once 5.8S folding has been completed. Our maps of nucleotide flexibility enable making predictions about the establishment of protein-rRNA interactions, providing intriguing insights into the temporal order of protein-rRNA as well as long-range inter-domain rRNA interactions. These data argue that many distant domains in the rRNA can assemble simultaneously during early 60S assembly and underscore the enormous complexity of 60S synthesis. Ribosome biogenesis is a dynamic process that involves the ordered assembly of ribosomal proteins and numerous RNA structural rearrangements. Here the authors apply ChemModSeq, a high-throughput RNA structure probing method, to quantitatively measure changes in RNA flexibility during the nucleolar stages of 60S assembly in yeast.

  1. High-throughput retrotransposon-based fluorescent markers: improved information content and allele discrimination

    Directory of Open Access Journals (Sweden)

    Baker David

    2009-07-01

    Full Text Available Background: Dense genetic maps, together with the efficiency and accuracy of their construction, are integral to genetic studies and marker-assisted selection for plant breeding. High-throughput multiplex markers that are robust and reproducible can contribute to both efficiency and accuracy. Multiplex markers are often dominant and so have low information content; this, coupled with the pressure to find alternatives to radio-labelling, has led us to adapt the SSAP (sequence-specific amplified polymorphism) marker method from a ³³P labelling procedure to fluorescently tagged markers analysed on an automated ABI 3730 xl platform. This method is illustrated for multiplexed SSAP markers based on retrotransposon insertions of pea and is applicable for the rapid and efficient generation of markers from genomes where repetitive-element sequence information is available for primer design. We cross-reference SSAP markers previously generated using the ³³P manual PAGE system to fluorescent peaks, and use these high-throughput fluorescent SSAP markers for further genetic studies in Pisum. Results: The optimal conditions for the fluorescent-labelling method used a triplex set of primers in the PCR. These included a fluorescently labelled specific primer together with its unlabelled counterpart, plus an adapter-based primer with two bases of selection on the 3' end. The introduction of the unlabelled specific primer helped to optimise the fluorescent signal across the range of fragment sizes expected, and eliminated the need for extensive dilutions of PCR amplicons. The software (GeneMarker Version 1.6) used for the high-throughput data analysis provided an assessment of amplicon size in nucleotides, peak areas and fluorescence intensity in a table format, so providing additional information content for each marker. The method has been tested in a small-scale study with 12 pea accessions resulting in 467 polymorphic fluorescent SSAP markers of which

  2. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Background: In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, and cell populations can have variances that depend on their mean fluorescence intensities and may exhibit heavily skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results: We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter
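
    As a small illustration of why the transformation and its parameters matter, the sketch below applies the inverse hyperbolic sine transform, one of the family compared in the paper, to a few fluorescence values with different cofactors. It is a numpy toy, not the R/Bioconductor flowCore implementation, and the intensity values are invented.

      # Inverse hyperbolic sine transform with different cofactors: roughly linear near
      # zero, roughly logarithmic for large values. A numpy toy with invented intensities,
      # not the R/Bioconductor flowCore implementation discussed in the paper.
      import numpy as np

      intensities = np.array([-50.0, 0.0, 10.0, 100.0, 1_000.0, 100_000.0])

      def arcsinh_transform(x, cofactor):
          return np.arcsinh(x / cofactor)

      for cofactor in (5.0, 150.0, 1_000.0):
          print(cofactor, np.round(arcsinh_transform(intensities, cofactor), 2))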

  3. A cell-based high-throughput screening assay for radiation susceptibility using automated cell counting.

    Science.gov (United States)

    Hodzic, Jasmina; Dingjan, Ilse; Maas, Mariëlle Jp; van der Meulen-Muileman, Ida H; de Menezes, Renee X; Heukelom, Stan; Verheij, Marcel; Gerritsen, Winald R; Geldof, Albert A; van Triest, Baukelien; van Beusechem, Victor W

    2015-02-27

    Radiotherapy is one of the mainstays in the treatment of cancer, but its success can be limited by inherent or acquired resistance. Mechanisms underlying radioresistance in various cancers are poorly understood and available radiosensitizers have shown only modest clinical benefit. There is thus a need to identify new targets and drugs for more effective sensitization of cancer cells to irradiation. Compound and RNA interference high-throughput screening technologies allow comprehensive efforts to identify new agents and targets for radiosensitization. However, the gold standard assay to investigate radiosensitivity of cancer cells in vitro, the colony formation assay (CFA), is unsuitable for high-throughput screening. We developed a new high-throughput screening method for determining radiation susceptibility. Fast and uniform irradiation of batches of up to 30 microplates was achieved using a Perspex container and a clinically employed linear accelerator. The readout was done by automated counting of fluorescently stained nuclei using the Acumen eX3 laser scanning cytometer. Assay performance was compared to that of the CFA and the CellTiter-Blue homogeneous cell viability assay. The assay was validated in a whole-genome siRNA library screening setting using PC-3 prostate cancer cells. On 4 different cancer cell lines, the automated cell counting assay produced radiation dose-response curves that followed a linear-quadratic equation and that exhibited a better correlation with the results of the CFA than did the cell viability assay. Moreover, the cell counting assay could be used to detect radiosensitization by silencing DNA-PKcs or by adding caffeine. In a high-throughput screening setting, using 4 Gy irradiated and control PC-3 cells, the effects of DNA-PKcs siRNA and non-targeting control siRNA could be clearly discriminated. We developed a simple assay for radiation susceptibility that can be used for high-throughput screening. This will
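
    The linear-quadratic dose-response behaviour mentioned above can be fitted in a few lines; the sketch below uses scipy's curve_fit on placeholder surviving fractions (not data from the study) to recover the alpha and beta parameters.

      # Sketch: fit the linear-quadratic model S(D) = exp(-(alpha*D + beta*D^2))
      # to surviving-fraction data. The numbers below are illustrative placeholders,
      # not measurements from the study.
      import numpy as np
      from scipy.optimize import curve_fit

      def lq_model(dose, alpha, beta):
          return np.exp(-(alpha * dose + beta * dose ** 2))

      dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])            # Gy
      surviving = np.array([1.0, 0.75, 0.52, 0.21, 0.07, 0.02])  # placeholder fractions

      (alpha, beta), _ = curve_fit(lq_model, dose, surviving, p0=(0.2, 0.02))
      print(f"alpha = {alpha:.3f} / Gy, beta = {beta:.4f} / Gy^2")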

  4. High throughput vegetable oil-in-water emulsification with a high porosity micro-engineered membrane

    NARCIS (Netherlands)

    Wagdare, N.A.; Marcelis, A.T.M.; Ho, O.B.; Boom, R.M.; Rijn, van C.J.M.

    2010-01-01

    Emulsification with high porosity micro-engineered membranes leads to stable emulsions with a low droplet span when, besides a surfactant in the continuous phase, an additional, suitable surfactant is used in the dispersed phase. This surfactant should exhibit relatively fast adsorption dynamics,

  5. Protocols and programs for high-throughput growth and aging phenotyping in yeast.

    Directory of Open Access Journals (Sweden)

    Paul P Jung

    Full Text Available In microorganisms, and more particularly in yeasts, a standard phenotyping approach consists in the analysis of fitness by growth rate determination in different conditions. One growth assay that combines high throughput with high resolution involves the generation of growth curves from 96-well plate microcultivations in thermostated and shaking plate readers. To push the throughput of this method to the next level, we have adapted it in this study to the use of 384-well plates. The values of the extracted growth parameters (lag time, doubling time and yield of biomass) correlated well between experiments carried out in 384-well plates as compared to 96-well plates or batch cultures, validating the higher-throughput approach for phenotypic screens. The method is not restricted to the use of the budding yeast Saccharomyces cerevisiae, as shown by consistent results for other species selected from the Hemiascomycete class. Furthermore, we used the 384-well plate microcultivations to develop and validate a higher-throughput assay for yeast Chronological Life Span (CLS), a parameter that is still commonly determined by a cumbersome method based on counting "Colony Forming Units". To accelerate analysis of the large datasets generated by the described growth and aging assays, we developed the freely available software tools GATHODE and CATHODE. These tools allow for semi-automatic determination of growth parameters and CLS behavior from typical plate reader output files. The described protocols and programs will increase the time- and cost-efficiency of a number of yeast-based systems genetics experiments as well as various types of screens.

  6. Protocols and programs for high-throughput growth and aging phenotyping in yeast.

    Science.gov (United States)

    Jung, Paul P; Christian, Nils; Kay, Daniel P; Skupin, Alexander; Linster, Carole L

    2015-01-01

    In microorganisms, and more particularly in yeasts, a standard phenotyping approach consists in the analysis of fitness by growth rate determination in different conditions. One growth assay that combines high throughput with high resolution involves the generation of growth curves from 96-well plate microcultivations in thermostated and shaking plate readers. To push the throughput of this method to the next level, we have adapted it in this study to the use of 384-well plates. The values of the extracted growth parameters (lag time, doubling time and yield of biomass) correlated well between experiments carried out in 384-well plates as compared to 96-well plates or batch cultures, validating the higher-throughput approach for phenotypic screens. The method is not restricted to the use of the budding yeast Saccharomyces cerevisiae, as shown by consistent results for other species selected from the Hemiascomycete class. Furthermore, we used the 384-well plate microcultivations to develop and validate a higher-throughput assay for yeast Chronological Life Span (CLS), a parameter that is still commonly determined by a cumbersome method based on counting "Colony Forming Units". To accelerate analysis of the large datasets generated by the described growth and aging assays, we developed the freely available software tools GATHODE and CATHODE. These tools allow for semi-automatic determination of growth parameters and CLS behavior from typical plate reader output files. The described protocols and programs will increase the time- and cost-efficiency of a number of yeast-based systems genetics experiments as well as various types of screens.
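
    A minimal sketch of the kind of growth-parameter extraction described above (not GATHODE itself): estimate the doubling time from a plate-reader OD time series by a log-linear fit over an assumed exponential-phase OD window.

      # Minimal sketch (not GATHODE): estimate doubling time from a plate-reader OD
      # time series by a log-linear fit over the exponential phase. The OD window
      # defining "exponential phase" is an assumption.
      import numpy as np

      def doubling_time(time_h, od, lower=0.2, upper=0.6):
          od = np.asarray(od, dtype=float)
          time_h = np.asarray(time_h, dtype=float)
          mask = (od > lower) & (od < upper)
          slope, _ = np.polyfit(time_h[mask], np.log2(od[mask]), 1)
          return 1.0 / slope  # hours per doubling

      # Example with synthetic data: 2 h lag, then roughly 1.5 h doubling time
      t = np.linspace(0, 12, 49)
      od = 0.05 * np.where(t < 2, 1.0, 2 ** ((t - 2) / 1.5))
      od = np.minimum(od, 1.2)  # crude saturation
      print(f"estimated doubling time: {doubling_time(t, od):.2f} h")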

  7. Advanced transfection with Lipofectamine 2000 reagent: primary neurons, siRNA, and high-throughput applications.

    Science.gov (United States)

    Dalby, Brian; Cates, Sharon; Harris, Adam; Ohki, Elise C; Tilkins, Mary L; Price, Paul J; Ciccarone, Valentina C

    2004-06-01

    Lipofectamine 2000 is a cationic liposome-based reagent that provides high transfection efficiency and high levels of transgene expression in a range of mammalian cell types in vitro using a simple protocol. Optimum transfection efficiency and subsequent cell viability depend on a number of experimental variables such as cell density, liposome and DNA concentrations, liposome-DNA complexing time, and the presence or absence of media components such as antibiotics and serum. The importance of these factors in Lipofectamine 2000-mediated transfection is discussed together with some specific applications: transfection of primary neurons, high throughput transfection, and delivery of small interfering RNAs. Copyright 2003 Elsevier Inc.

  8. Continuous high PRF waveforms for challenging environments

    Science.gov (United States)

    Jaroszewski, Steven; Corbeil, Allan; Ryland, Robert; Sobota, David

    2017-05-01

    Current airborne radar systems segment the available time-on-target during each beam dwell into multiple Coherent Processing Intervals (CPIs) in order to eliminate range eclipsing, solve for unambiguous range, and increase the detection performance against larger Radar Cross Section (RCS) targets. As a consequence, these radars do not realize the full Signal-to-Noise Ratio (SNR) increase and detection performance improvement that is possible. Continuous High Pulse Repetition Frequency (HPRF) waveforms and processing enable the coherent integration of all available radar data over the full time-on-target. This can greatly increase the SNR for air targets at long range and/or with weak radar returns and significantly improve the detection performance against such targets. TSC worked with its partner KeyW to implement a Continuous HPRF waveform in their Sahara radar testbed and obtained measured radar data on both a ground vehicle target and an airborne target of opportunity. This experimental data was processed by TSC to validate the expected benefits of Continuous HPRF waveforms.
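
    The SNR benefit referred to above follows from the standard coherent-integration relation: integrating N pulses coherently improves SNR by a factor of N, i.e. 10*log10(N) dB. The short example below compares a single CPI against the full time-on-target using arbitrary example numbers, not the Sahara testbed's parameters.

      # Worked illustration of coherent-integration gain: N coherently integrated
      # pulses improve SNR by 10*log10(N) dB. PRF and dwell values are arbitrary
      # examples, not the Sahara radar's parameters.
      import math

      prf_hz = 100_000           # example HPRF
      time_on_target_s = 0.050   # example full dwell
      cpi_s = 0.005              # example single CPI within that dwell

      n_cpi = prf_hz * cpi_s
      n_full = prf_hz * time_on_target_s

      gain_db = 10 * math.log10(n_full / n_cpi)
      print(f"extra coherent gain from using the full dwell: {gain_db:.1f} dB")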

  9. High-throughput detection, genotyping and quantification of the human papillomavirus using real-time PCR.

    Science.gov (United States)

    Micalessi, Isabel M; Boulet, Gaëlle A V; Bogers, Johannes J; Benoy, Ina H; Depuydt, Christophe E

    2011-12-20

    The establishment of the causal relationship between high-risk human papillomavirus (HR-HPV) infection and cervical cancer and its precursors has resulted in the development of HPV DNA detection systems. Currently, real-time PCR assays for the detection of HPV, such as the RealTime High Risk (HR) HPV assay (Abbott) and the cobas® 4800 HPV Test (Roche Molecular Diagnostics), are commercially available. However, none of them enables the detection and typing of all HR-HPV types in a clinical high-throughput setting. This paper describes the laboratory workflow and the validation of a type-specific real-time quantitative PCR (qPCR) assay for high-throughput HPV detection, genotyping and quantification. This assay is routinely applied in a liquid-based cytology screening setting (700 samples in 24 h) and has been used in many epidemiological and clinical studies. The TaqMan-based qPCR assay enables the detection of 17 HPV genotypes and β-globin in seven multiplex reactions. These HPV types include all 12 high-risk types (HPV16, 18, 31, 33, 35, 39, 45, 51, 52, 56, 58, 59), three probably high-risk types (HPV53, 66 and 68), one low-risk type (HPV6) and one undetermined risk type (HPV67). An analytical sensitivity of ≤100 copies was obtained for all the HPV types, and the analytical specificity of each primer pair was 100%; intra- and inter-run variability was also assessed. This real-time PCR approach enables detection of 17 HPV types, identification of the HPV type and determination of the viral load in a single sensitive assay suitable for high-throughput screening.
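
    Viral-load determination in assays of this kind typically relies on a standard curve relating quantification cycle (Cq) to log10 copy number; the sketch below fits such a curve and interpolates an unknown sample. The dilution-series values are illustrative placeholders, not figures from the assay described above.

      # Sketch of standard-curve quantification as commonly done in real-time qPCR:
      # fit Cq against log10(copy number) for a dilution series, then read unknown
      # samples off the curve. Values are illustrative, not from the assay above.
      import numpy as np

      log10_copies = np.array([6, 5, 4, 3, 2])           # plasmid dilution series
      cq = np.array([17.1, 20.5, 23.8, 27.2, 30.6])      # illustrative Cq values

      slope, intercept = np.polyfit(log10_copies, cq, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0            # ~1.0 means 100% efficient

      def copies_from_cq(cq_value):
          return 10 ** ((cq_value - intercept) / slope)

      print(f"PCR efficiency: {efficiency:.2f}")
      print(f"sample at Cq 25.0 is roughly {copies_from_cq(25.0):,.0f} copies/reaction")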

  10. Barcoded sequencing workflow for high throughput digitization of hybridoma antibody variable domain sequences.

    Science.gov (United States)

    Chen, Yongmei; Kim, Si Hyun; Shang, Yonglei; Guillory, Joseph; Stinson, Jeremy; Zhang, Qing; Hötzel, Isidro; Hoi, Kam Hon

    2018-01-20

    Since the invention of hybridoma technology by Milstein and Köhler in 1975, its application has greatly advanced the antibody discovery process. The technology enables both functional screening and long-term archival of the immortalized monoclonal antibody-producing B cells. Despite dependable cryopreservation technology for hybridoma cells, the practicality of long-term storage has been outpaced by recent progress in robotics and automation, which enables routine identification of thousands of antigen-specific hybridoma clones. This increase in throughput creates two challenges in the antibody discovery process, namely limited cryopreservation storage space and limited throughput in conventional antibody sequencing. We herein provide a barcoded sequencing workflow that utilizes next-generation sequencing to expand conventional sequencing capacity. Together with the bioinformatics tools we describe, the barcoded sequencing workflow robustly reports unambiguous antibody sequences, as confirmed with Sanger sequencing controls. In combination with commonly accessible recombinant DNA technology, the barcoded sequencing workflow allows for high-throughput digitization of antibody sequences and provides an effective solution to the limitations imposed by physical storage and sequencing capacity. Copyright © 2018 Genentech, Inc. Published by Elsevier B.V. All rights reserved.
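
    The core step of any barcoded sequencing workflow, assigning reads back to their source well, can be sketched as follows; the barcode-to-well map, barcode length and read handling are illustrative assumptions rather than the published pipeline.

      # Hedged sketch of barcoded demultiplexing: assign each read to a hybridoma
      # well by its leading barcode. The barcode-to-well map, barcode length, and
      # read format are assumptions, not the published workflow.
      from collections import Counter, defaultdict

      BARCODE_TO_WELL = {"ACGTACGT": "A01", "TGCATGCA": "A02"}  # hypothetical 8-mers
      BARCODE_LEN = 8

      def demultiplex(reads):
          """reads: iterable of sequence strings; returns well -> list of trimmed reads."""
          assigned = defaultdict(list)
          unassigned = Counter()
          for seq in reads:
              barcode = seq[:BARCODE_LEN]
              well = BARCODE_TO_WELL.get(barcode)
              if well is None:
                  unassigned[barcode] += 1
              else:
                  assigned[well].append(seq[BARCODE_LEN:])
          return assigned, unassigned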

  11. A comparison of DNA extraction methods for high-throughput DNA analyses.

    Science.gov (United States)

    Schiebelhut, Lauren M; Abboud, Sarah S; Gómez Daglio, Liza E; Swift, Holly F; Dawson, Michael N

    2017-07-01

    The inclusion of next-generation sequencing technologies in population genetic and phylogenetic studies has elevated the need to balance time and cost of DNA extraction without compromising DNA quality. We tested eight extraction methods - ranging from low- to high-throughput techniques - and eight phyla: Annelida, Arthropoda, Cnidaria, Chordata, Echinodermata, Mollusca, Ochrophyta and Porifera. We assessed DNA yield, purity, efficacy and cost of each method. Extraction efficacy was quantified using the proportion of successful polymerase chain reaction (PCR) amplification of two molecular markers for metazoans (mitochondrial COI and nuclear histone 3) and one for Ochrophyta (mitochondrial nad6) at four time points - 0.5, 1, 2 and 3 years following extraction. DNA yield and purity were quantified using NanoDrop absorbance ratios. Cost was estimated in terms of time and material expense. Results show differences in DNA yield, purity and PCR success between extraction methods and that performance also varied by taxon. The traditional time-intensive, low-throughput CTAB phenol-chloroform extraction performed well across taxa, but other methods also performed well and provide the opportunity to reduce time spent at the bench and increase throughput. © 2016 John Wiley & Sons Ltd.
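
    For context on the NanoDrop purity metrics mentioned above, the usual rules of thumb are an A260/A280 ratio near 1.8 and an A260/A230 ratio of roughly 2.0-2.2 for clean DNA; the helper below flags extracts outside generic acceptance windows (these thresholds are common conventions, not the study's criteria).

      # Tiny helper reflecting common NanoDrop rules of thumb for DNA extracts.
      # Acceptance windows are generic conventions, not the thresholds of the study.
      def flag_extract(a260_a280, a260_a230):
          notes = []
          if not 1.7 <= a260_a280 <= 1.9:
              notes.append("A260/A280 outside 1.7-1.9 (possible protein/phenol carryover)")
          if a260_a230 < 1.8:
              notes.append("A260/A230 below 1.8 (possible salt/guanidine/phenol carryover)")
          return notes or ["ratios within typical acceptance windows"]

      print(flag_extract(1.85, 2.1))
      print(flag_extract(1.55, 1.2))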

  12. A homogeneous, high-throughput fluorescence anisotropy-based DNA supercoiling assay.

    Science.gov (United States)

    Shapiro, Adam; Jahic, Haris; Prasad, Swati; Ehmann, David; Thresher, Jason; Gao, Ning; Hajec, Laurel

    2010-10-01

    The degree of supercoiling of DNA is vital for cellular processes, such as replication and transcription. DNA topology is controlled by the action of DNA topoisomerase enzymes. Topoisomerases, because of their importance in cellular replication, are the targets of several anticancer and antibacterial drugs. In the search for new drugs targeting topoisomerases, a biochemical assay compatible with automated high-throughput screening (HTS) would be valuable. Gel electrophoresis is the standard method for measuring changes in the extent of supercoiling of plasmid DNA when acted upon by topoisomerases, but this is a low-throughput and laborious method. A medium-throughput method was described previously that quantitatively distinguishes relaxed and supercoiled plasmids by the difference in their abilities to form triplex structures with an immobilized oligonucleotide. In this article, the authors describe a homogeneous supercoiling assay based on triplex formation in which the oligonucleotide strand is labeled with a fluorescent dye and the readout is fluorescence anisotropy. The new assay requires no immobilization, filtration, or plate washing steps and is therefore well suited to HTS for inhibitors of topoisomerases. The utility of this assay is demonstrated with relaxation of supercoiled plasmid by Escherichia coli topoisomerase I, supercoiling of relaxed plasmid by E. coli DNA gyrase, and inhibition of gyrase by fluoroquinolones and nalidixic acid.
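
    Fluorescence anisotropy is computed from the parallel and perpendicular emission intensities, r = (I_par - G*I_perp) / (I_par + 2*G*I_perp), where G corrects for instrument polarization bias; a triplex-bound oligonucleotide tumbles more slowly than a free one and therefore reports a higher r. The intensities in the example below are illustrative only.

      # Standard fluorescence anisotropy calculation from parallel and perpendicular
      # emission intensities; G corrects for instrument polarization bias. The
      # example intensities are illustrative, not data from the assay above.
      def anisotropy(i_parallel, i_perpendicular, g_factor=1.0):
          return (i_parallel - g_factor * i_perpendicular) / (
              i_parallel + 2.0 * g_factor * i_perpendicular
          )

      print(f"free oligo (fast tumbling):    r = {anisotropy(1000, 900):.3f}")
      print(f"triplex-bound (slow tumbling): r = {anisotropy(1000, 600):.3f}")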

  13. RESULTS OF THE FY09 ENHANCED DOE HIGH LEVEL WASTE MELTER THROUGHPUT STUDIES AT SRNL

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, F.; Edwards, T.

    2010-06-23

    High-level waste (HLW) throughput (i.e., the amount of waste processed per unit time) is a function of two critical parameters: waste loading (WL) and melt rate. For the Waste Treatment and Immobilization Plant (WTP) at the Hanford Site and the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS), increasing HLW throughput would significantly reduce the overall mission life cycle costs for the Department of Energy (DOE). The objective of this task is to develop data, assess property models, and refine or develop the necessary models to support increased WL of HLW at SRS. It is a continuation of the studies initiated in FY07, but is under the specific guidance of a Task Change Request (TCR)/Work Authorization received from DOE headquarters (Project Number RV071301). Using the data generated in FY07, FY08 and historical data, two test matrices (60 glasses total) were developed at the Savannah River National Laboratory (SRNL) in order to generate data in broader compositional regions. These glasses were fabricated and characterized using chemical composition analysis, X-ray Diffraction (XRD), viscosity, liquidus temperature (TL) measurement and durability as defined by the Product Consistency Test (PCT). The results of this study are summarized below: (1) In general, the current durability model predicts the durabilities of higher waste loading glasses quite well. A few of the glasses exhibited poorer durability than predicted. (2) Some of the glasses exhibited anomalous behavior with respect to durability (normalized leachate for boron, NL[B]). The quenched samples of FY09EM21-02, -07 and -21 contained no nepheline or other wasteform-affecting crystals, but had unacceptable NL[B] values (> 10 g/L). The ccc sample of FY09EM21-07 has an NL[B] value that is more than one half the value of the quenched sample. These glasses also have lower concentrations of Al2O3 and SiO2. (3) Five of the ccc samples (EM-13, -14, -15, -29 and
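
    As a rough worked relation (an illustration, not a figure from this report): waste throughput is approximately the glass melt rate multiplied by the waste loading, so a melter pouring 100 kg of glass per hour at 40 wt% waste loading processes about 40 kg of waste oxides per hour, versus 35 kg per hour at 35 wt%. This is why raising WL without degrading melt rate or glass quality translates directly into reduced mission life-cycle cost.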

  14. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping.

    Science.gov (United States)

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-06-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Because of the large amount of data to manage, we utilized a common data structure for efficient storage and organization of both input and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment with 33 maize (Zea mays 'Fernandez') plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides a broad set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. © 2014 American Society of Plant Biologists. All Rights Reserved.

  15. A high-throughput phenotypic screen identifies clofazimine as a potential treatment for cryptosporidiosis.

    Directory of Open Access Journals (Sweden)

    Melissa S Love

    2017-02-01

    Full Text Available Cryptosporidiosis has emerged as a leading cause of non-viral diarrhea in children under five years of age in the developing world, yet the current standard of care to treat Cryptosporidium infections, nitazoxanide, demonstrates limited and immune-dependent efficacy. Given the lack of treatments with universal efficacy, drug discovery efforts against cryptosporidiosis are necessary to find therapeutics more efficacious than the standard of care. To date, cryptosporidiosis drug discovery efforts have been limited to a few targeted mechanisms in the parasite and whole cell phenotypic screens against small, focused collections of compounds. Using a previous screen as a basis, we initiated the largest known drug discovery effort to identify novel anticryptosporidial agents. A high-content imaging assay for inhibitors of Cryptosporidium parvum proliferation within a human intestinal epithelial cell line was miniaturized and automated to enable high-throughput phenotypic screening against a large, diverse library of small molecules. A screen of 78,942 compounds identified 12 anticryptosporidial hits with sub-micromolar activity, including clofazimine, an FDA-approved drug for the treatment of leprosy, which demonstrated potent and selective in vitro activity (EC50 = 15 nM) against C. parvum. Clofazimine also displayed activity against C. hominis, the other most clinically relevant species of Cryptosporidium. Importantly, clofazimine is known to accumulate within epithelial cells of the small intestine, the primary site of Cryptosporidium infection. In a mouse model of acute cryptosporidiosis, a once-daily dosage regimen for three consecutive days or a single high dose resulted in reduction of oocyst shedding below the limit detectable by flow cytometry. Recently, a target product profile (TPP) for an anticryptosporidial compound was proposed by Huston et al. and highlights the need for a short dosing regimen (< 7 days) and formulations for children < 2
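
    An EC50 such as the 15 nM reported for clofazimine is typically obtained by fitting a four-parameter logistic (Hill) curve to the normalized proliferation readout; the sketch below shows such a fit on placeholder concentration-response values, not data from the screen.

      # Sketch of how an EC50 is typically derived: fit a four-parameter logistic
      # (Hill) curve to normalized responses. Concentrations and responses below
      # are illustrative placeholders, not data from the screen described above.
      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, bottom, top, ec50, hill_slope):
          return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill_slope)

      conc_nm = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
      inhibition = np.array([0.04, 0.12, 0.41, 0.72, 0.91, 0.97, 0.99])  # placeholder

      popt, _ = curve_fit(hill, conc_nm, inhibition, p0=(0.0, 1.0, 15.0, 1.0))
      print(f"fitted EC50 of roughly {popt[2]:.1f} nM, Hill slope {popt[3]:.2f}")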

  16. High Throughput Preparation of Aligned Nanofibers Using an Improved Bubble-Electrospinning

    Directory of Open Access Journals (Sweden)

    Liang Yu

    2017-11-01

    Full Text Available An improved bubble-electrospinning setup, consisting of a cone-shaped air nozzle, a copper solution reservoir connected directly to the power generator, and a high-speed rotating copper wire drum as the collector, was developed to achieve high-throughput preparation of aligned nanofibers. The influence of drum rotation speed on the morphology and properties of the obtained nanofibers was investigated. The results showed that the alignment degree, diameter distribution, and properties of the nanofibers improved as the drum rotation speed increased.

  17. High-throughput method for optimum solubility screening for homogeneity and crystallization of proteins

    Science.gov (United States)

    Kim, Sung-Hou [Moraga, CA; Kim, Rosalind [Moraga, CA; Jancarik, Jamila [Walnut Creek, CA

    2012-01-31

    An optimum solubility screen in which a panel of buffers and many additives are provided in order to obtain the most homogeneous and monodisperse protein condition for protein crystallization. The present methods are useful for proteins that aggregate and cannot be concentrated prior to setting up crystallization screens. A high-throughput method using the hanging-drop vapor-diffusion equilibrium technique and a panel of twenty-four buffers is further provided. Using the present methods, 14 poorly behaving proteins were screened; 11 of them showed markedly improved dynamic light scattering results that allowed the proteins to be concentrated, and 9 were crystallized.

  18. High-throughput glycosylation analysis of therapeutic immunoglobulin G by capillary gel electrophoresis using a DNA analyzer.

    NARCIS (Netherlands)

    Reusch, D.; Haberger, M.; Kailich, T.; Heidenreich, A.K.; Kampe, M.; Bulau, P.; Wuhrer, M.

    2014-01-01

    The Fc glycosylation of therapeutic antibodies is crucial for their effector functions and their behavior in pharmacokinetics and pharmacodynamics. To monitor the Fc glycosylation in bioprocess development and characterization, high-throughput techniques for glycosylation analysis are needed. Here,

  19. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Contact Substances for Use in Chemical Prioritization

    Data.gov (United States)

    U.S. Environmental Protection Agency — Under the ExpoCast program, United States Environmental Protection Agency (EPA) researchers have developed a high-throughput (HT) framework for estimating aggregate...

  20. HTTK R Package v1.4 - JSS Article on HTTK: R Package for High-Throughput Toxicokinetics

    Data.gov (United States)

    U.S. Environmental Protection Agency — httk: High-Throughput Toxicokinetics Functions and data tables for simulation and statistical analysis of chemical toxicokinetics ("TK") using data obtained from...