WorldWideScience

Sample records for high-throughput quantum chemistry

  1. High-throughput quantum chemistry and virtual screening for OLED material components

    Science.gov (United States)

    Halls, Mathew D.; Giesen, David J.; Hughes, Thomas F.; Goldberg, Alexander; Cao, Yixiang

    2013-09-01

    Computational structure enumeration, analysis using an automated simulation workflow, and filtering of large chemical structure libraries to identify lead systems have become a central paradigm in drug discovery research. Transferring this paradigm to challenges in materials science is now possible thanks to advances in the speed of computational resources and the efficiency and stability of chemical simulation packages. State-of-the-art software tools developed for drug discovery can be applied to efficiently explore the chemical design space and identify solutions for problems such as organic light-emitting diode (OLED) material components. In this work, virtual screening for OLED materials based on intrinsic quantum mechanical properties is illustrated. A new approach to more reliably identify candidate systems is also introduced, based on the chemical reaction energetics of defect pathways for OLED materials.
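
The property-based screening paradigm described above can be sketched as a simple filter: keep only candidates whose computed properties clear minimal thresholds. The candidate names, property values, and cutoffs below are invented for illustration; a real screen would pull these values from the automated QM workflow.

```python
# Hypothetical property filter for OLED host candidates. The names, computed
# properties, and thresholds are invented for illustration; a real screen
# would pull these values from the automated QM workflow.
candidates = [
    {"name": "host-A", "triplet_ev": 2.9, "homo_ev": -5.8, "lumo_ev": -1.9},
    {"name": "host-B", "triplet_ev": 2.4, "homo_ev": -5.5, "lumo_ev": -2.4},
    {"name": "host-C", "triplet_ev": 3.0, "homo_ev": -6.1, "lumo_ev": -1.6},
]

def passes(c, min_triplet=2.7, min_gap=3.0):
    gap = c["lumo_ev"] - c["homo_ev"]   # HOMO-LUMO gap, eV
    return c["triplet_ev"] >= min_triplet and gap >= min_gap

leads = [c["name"] for c in candidates if passes(c)]
print(leads)  # ['host-A', 'host-C']
```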

  2. Organic chemistry. Nanomole-scale high-throughput chemistry for the synthesis of complex molecules.

    Science.gov (United States)

    Buitrago Santanilla, Alexander; Regalado, Erik L; Pereira, Tony; Shevlin, Michael; Bateman, Kevin; Campeau, Louis-Charles; Schneeweis, Jonathan; Berritt, Simon; Shi, Zhi-Cai; Nantermet, Philippe; Liu, Yong; Helmy, Roy; Welch, Christopher J; Vachal, Petr; Davies, Ian W; Cernak, Tim; Dreher, Spencer D

    2015-01-02

    At the forefront of new synthetic endeavors, such as drug discovery or natural product synthesis, large quantities of material are rarely available and timelines are tight. A miniaturized automation platform enabling high-throughput experimentation for synthetic route scouting to identify conditions for preparative reaction scale-up would be a transformative advance. Because automated, miniaturized chemistry is difficult to carry out in the presence of solids or volatile organic solvents, most of the synthetic "toolkit" cannot be readily miniaturized. Using palladium-catalyzed cross-coupling reactions as a test case, we developed automation-friendly reactions to run in dimethyl sulfoxide at room temperature. This advance enabled us to couple the robotics used in biotechnology with emerging mass spectrometry-based high-throughput analysis techniques. More than 1500 chemistry experiments were carried out in less than a day, using as little as 0.02 milligrams of material per reaction.

  3. High-Throughput Screening and Optimization of Binary Quantum Dots Cosensitized Solar Cell.

    Science.gov (United States)

    Yuan, Ding; Xiao, Lina; Luo, Jianheng; Luo, Yanhong; Meng, Qingbo; Mao, Bing-Wei; Zhan, Dongping

    2016-07-20

    Quantum dots (QDs) are considered an alternative to dye sensitizers for solar cells. However, the interfacial construction and evaluation of photocatalytic nanomaterials remain challenging with the conventional methodology of building demo devices. We propose here a high-throughput screening and optimization method based on combinatorial chemistry and scanning electrochemical microscopy (SECM). A homogeneous TiO2 catalyst layer is coated on an FTO substrate, which is then covered by a dark mask to expose the photocatalyst array. On each photocatalyst spot, different successive ionic layer adsorption and reaction (SILAR) processes are performed by a programmed solution dispenser to load the binary PbxCd1-xS QD sensitizers. An optical fiber is employed as the scanning tip of the SECM, and the photocatalytic current is recorded during the imaging experiment, from which the optimal technical parameters are determined. To verify the validity of the combinatorial SECM imaging results, control trials were performed with the corresponding photovoltaic demo devices. The close agreement proved that the methodology based on combinatorial chemistry and SECM is valuable for the interfacial construction, high-throughput screening, and optimization of QDSSCs. Furthermore, the PbxCd1-xS/CdS QD-cosensitized solar cell optimized by SECM achieves a short-circuit current density of 24.47 mA/cm², an open-circuit potential of 421 mV, a fill factor of 0.52, and a photovoltaic conversion efficiency of 5.33%.
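
The reported figures can be cross-checked, since efficiency follows from the other three metrics as Jsc × Voc × FF / Pin. The 100 mW/cm² input power below is an assumption (the usual AM1.5 standard), not stated in the abstract.

```python
# Cross-check of the reported cell metrics: efficiency = Jsc * Voc * FF / Pin.
# The 100 mW/cm^2 input power is an assumption (the usual AM1.5 standard),
# not stated in the abstract.
def efficiency(j_sc_ma_cm2, v_oc_v, ff, p_in_mw_cm2=100.0):
    """Photovoltaic conversion efficiency in percent."""
    p_out_mw_cm2 = j_sc_ma_cm2 * v_oc_v * ff   # mA/cm^2 * V = mW/cm^2
    return 100.0 * p_out_mw_cm2 / p_in_mw_cm2

print(round(efficiency(24.47, 0.421, 0.52), 2))  # 5.36, vs. the reported 5.33%
```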

  4. Computational chemistry, data mining, high-throughput synthesis and screening - informatics and integration in drug discovery

    OpenAIRE

    Manly, Charles J.

    2001-01-01

    Drug discovery today includes a considerable focus of laboratory automation and other resources on both combinatorial chemistry and high-throughput screening, and computational chemistry has been a part of pharmaceutical research for many years. The real benefit of these technologies lies beyond the exploitation of each individually. Only recently have significant efforts focused on effectively integrating these and other discovery disciplines to realize their larger potential. This technical not...

  5. Turning tumor-promoting copper into an anti-cancer weapon via high-throughput chemistry.

    Science.gov (United States)

    Wang, F; Jiao, P; Qi, M; Frezza, M; Dou, Q P; Yan, B

    2010-01-01

    Copper is an essential element for multiple biological processes. Its concentration is elevated to very high levels in cancer tissues, where it promotes cancer development through processes such as angiogenesis. Organic chelators of copper can passively reduce cellular copper and serve as inhibitors of angiogenesis. However, they can also actively attack cellular targets such as the proteasome, which plays a critical role in cancer development and survival. The discovery of such molecules initially relied on step-by-step synthesis followed by biological assays. Today, high-throughput chemistry and high-throughput screening have significantly expedited the discovery of copper-binding molecules that turn "cancer-promoting" copper into anti-cancer agents.

  6. Quantum chemistry

    CERN Document Server

    Lowe, John P

    1993-01-01

    Praised for its appealing writing style and clear pedagogy, Lowe's Quantum Chemistry is now available in its Second Edition as a text for senior undergraduate- and graduate-level chemistry students. The book assumes little mathematical or physical sophistication and emphasizes an understanding of the techniques and results of quantum chemistry, thus enabling students to comprehend much of the current chemical literature in which quantum chemical methods or concepts are used as tools. The book begins with a six-chapter introduction of standard one-dimensional systems, the hydrogen atom,

  7. Complementing high-throughput X-ray powder diffraction data with quantum-chemical calculations

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; van de Streek, Jacco; Rantanen, Jukka

    2012-01-01

    High-throughput crystallisation and characterisation platforms provide an efficient means to carry out solid-form screening during the pre-formulation phase. Determining the crystal structures of newly identified solid phases, however, usually requires independent crystallisation trials to produce single crystals or bulk samples of sufficient quantity for high-quality X-ray diffraction measurements. This process could be made more efficient by a robust procedure for crystal structure determination directly from high-throughput X-ray powder diffraction (XRPD) data. Quantum-chemical calculations based on dispersion-corrected density functional theory (DFT-D) have now become feasible for the typical small organic molecules used as active pharmaceutical ingredients. We demonstrate how these calculations can be applied to complement high-throughput XRPD data by determining the crystal structure...

  8. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  9. Solid electrodes in electroanalytical chemistry: present applications and prospects for high throughput screening of drug compounds.

    Science.gov (United States)

    Uslu, Bengi; Ozkan, Sibel A

    2007-08-01

    This review summarizes recent progress in the development and application of solid electrodes to the screening of pharmaceutical dosage forms and biological fluids. Recent trends and advances in the electroanalytical chemistry of solid electrodes, microelectrodes and electrochemical sensors are reviewed. The varieties of solid electrodes and their basic physico-chemical properties and some specific characteristics including some supramolecular phenomena at their surface are surveyed. This review also includes some selected designs and their applications. Despite many reviews about individual solid electrodes in the literature, this review offers the first comprehensive report on all forms of solid electrodes. Special attention is paid to the possibilities of solid electrodes in high throughput electroanalytical investigation of drug dosage forms and biological samples using modern electroanalytical techniques. Various selected studies on these subjects since 1996 are reviewed in this paper.

  10. High-throughput virtual screening using quantum mechanical probes: discovery of selective kinase inhibitors.

    Science.gov (United States)

    Zhou, Ting; Caflisch, Amedeo

    2010-07-05

    A procedure based on semi-empirical quantum mechanical (QM) calculations of interaction energy is proposed for the rapid screening of compound poses generated by high-throughput docking. Small molecules (consisting of 2-10 atoms and termed "probes") are overlapped with polar groups in the binding site of the protein target. The interaction energy values between each compound pose and the probes, calculated by a semi-empirical Hamiltonian, are used as filters. The QM probe method does not require fixed partial charges and takes into account polarization and charge-transfer effects which are not captured by conventional force fields. The procedure is applied to screen approximately 100 million poses (of 2.7 million commercially available compounds) obtained by high-throughput docking in the ATP binding site of the tyrosine kinase erythropoietin-producing human hepatocellular carcinoma receptor B4 (EphB4). Three QM probes on the hinge region and one at the entrance pocket are employed to select for binding affinity, while a QM probe on the side chain of the so-called gatekeeper residue (a hypervariable residue in the kinome) is used to enforce selectivity. The poses with favorable interactions with the five QM probes are filtered further for hydrophobic matching and low ligand strain. In this way, a single-digit micromolar inhibitor of EphB4 with a relatively good selectivity profile is identified in a multimillion-compound library upon experimental tests of only 23 molecules.
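
The probe-energy filtering step lends itself to a compact sketch: a pose advances only if its interaction energy with every probe is favorable. The pose names, energies, and cutoff below are invented for illustration, not taken from the study.

```python
# Hypothetical version of the probe-energy filter: a pose survives only if
# its interaction energy with every QM probe is favorable. Pose names,
# energies (kcal/mol), and the cutoff are invented for illustration.
poses = {
    "pose-1": [-3.2, -2.1, -4.0, -1.5, -2.8],   # favorable with all 5 probes
    "pose-2": [-3.5, 0.4, -3.9, -2.2, -3.1],    # repulsive with probe 2
}

def keep(probe_energies, cutoff=-1.0):
    return all(e <= cutoff for e in probe_energies)

survivors = [name for name, e in poses.items() if keep(e)]
print(survivors)  # ['pose-1']
```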

  11. Self-encoding Functional Resin Applying for Combinatorial Chemistry and High Throughput Screening

    Institute of Scientific and Technical Information of China (English)

    DU Lei; CHEN Tong-sheng

    2004-01-01

    A novel solid-phase organic synthesis resin, based on FTIR-spectra self-encoding functional resin technology, was synthesized for combinatorial high-throughput screening. A new deconvolution strategy, termed position-encoding deconvolution, is illustrated and compared with popular combinatorial deconvolution strategies in terms of efficiency and information content. A mimic high-throughput screen of a hexapeptide library successfully demonstrated the application of the self-encoding functional resin technology and the position-encoding deconvolution strategy.

  12. Turbocharged molecular discovery of OLED emitters: from high-throughput quantum simulation to highly efficient TADF devices

    Science.gov (United States)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán

    2016-09-01

    Discovering new OLED emitters requires many experiments to synthesize candidates and test performance in devices. Large scale computer simulation can greatly speed this search process but the problem remains challenging enough that brute force application of massive computing power is not enough to successfully identify novel structures. We report a successful High Throughput Virtual Screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain combinatorial explosion. Simulations were tuned to the specific problem and calibrated with experimental results. Experimentalists and theorists actively collaborated such that experimental feedback was regularly utilized to update and shape the computational search. Supervised machine learning methods prioritized candidate structures prior to quantum chemistry simulation to prevent wasting compute on likely poor performers. With this combination of techniques, each multiplying the strength of the search, this effort managed to navigate an area of molecular space and identify hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.
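
The cheap-before-expensive funnel described above can be sketched as follows: rank the library with a learned surrogate and send only the top fraction to quantum chemistry. The linear scorer and all numbers below are invented stand-ins, not the supervised models used in the study.

```python
# Stand-in for the ML prioritization step: rank the library by a cheap
# surrogate score and send only the top fraction to quantum chemistry.
# The linear scorer and all numbers are invented for illustration.
def surrogate_score(features):
    weights = [0.6, 0.3, 0.1]   # placeholder for a trained model
    return sum(w * f for w, f in zip(weights, features))

library = {
    "mol-1": [0.9, 0.2, 0.5],
    "mol-2": [0.1, 0.8, 0.3],
    "mol-3": [0.7, 0.7, 0.9],
}

ranked = sorted(library, key=lambda m: surrogate_score(library[m]), reverse=True)
top_for_qm = ranked[:2]   # only these get full quantum chemistry simulation
print(top_for_qm)  # ['mol-3', 'mol-1']
```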

  13. High-throughput flow cytometric screening of combinatorial chemistry bead libraries for proteomics and drug discovery

    Science.gov (United States)

    Leary, James F.; Reece, Lisa M.; Yang, Xian-Bin; Gorenstein, David

    2005-04-01

    For proteomics and drug-discovery applications, combinatorial microbead thioaptamer libraries (one thioaptamer sequence per bead) are being created by the split-synthesis method, yielding a "proteomics library" of protein-capture beads that can be analyzed by high-throughput screening methods, in this case flow cytometry and cell sorting. Thioaptamers, oligonucleotides with thiophosphate backbone substitutions, function like antibodies in recognizing specific protein sequences but have a number of advantages over antibody libraries. These proteomics beads can be analyzed by high-speed flow cytometry and sorted down to the single-bead level according to the relative fluorescence brightness of fluorescently labeled proteins, or of a specific protein from all of the molecules of the cell subpopulations being analyzed. The thioaptamer sequences on a given bead showing high affinity for that protein can then be sequenced. Alternatively, the protein-capturing beads can be analyzed by MALDI-TOF mass spectrometry to characterize the bound proteins. The beads can be thought of as equivalent to single-element positions of a proteomics chip array, but with the advantage of being able to analyze hundreds of millions of possible amino acid sequences/epitopes much more rapidly on the basis of thioaptamer sequence affinities to select single sequences of interest. Additionally, those beads can be manipulated and isolated at the single-bead level by high-throughput flow cytometry/cell sorting for subsequent sequencing of the thioaptamers.

  14. High Throughput and High Speed Organic Synthesis in Drug Discovery:An Integration of Chemistry and Technology

    Institute of Scientific and Technical Information of China (English)

    Li Chen

    2004-01-01

    Drug discovery is a complicated process that involves multiple synthetic chemistry tasks. Among them, lead generation and optimization is the core business of discovery research. During the lead-generation stage, a large library of many thousands of individual compounds is screened against a biological target to identify a set of hits showing the desired activity. Once a hit has been identified, analog synthesis, development of SAR around the hit, and establishment of the relationship between the targeted protein and its cellular function become the key activities in the drug discovery project. During this process, focused libraries of a few hundred compounds (X00) are synthesized and tested. For synthetic chemistry tasks, efficiency can be enhanced through "parallel processing or high-throughput chemistry" using automation technology, with which a large number of compounds can be synthesized, purified, and plated at the same time for biological and pharmaceutical screenings. To accelerate lead optimization, high-speed synthesis is conducted when necessary to improve "single process" efficiency using automation technology, when the rate-limiting step becomes developing a synthetic method to produce large amounts of compounds (in gram quantities) for in vivo pharmacology studies. With these in mind, we designed an integrated HTC technology platform that supports the key chemical synthesis activities in discovery research. In this presentation, several examples will be presented to illustrate the applications of high-throughput and high-speed organic synthesis in drug discovery.

  15. A rapid and high-throughput quantum dots bioassay for monitoring of perfluorooctane sulfonate in environmental water samples

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Jiong; Wan Yanjian; Li Yuanyuan; Zhang Qiongfang; Xu Shunqing [Ministry of Education Key Laboratory of Environment and Health, School of Public Health, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei Province 430030 (China); Zhu Huijun [Cranfield Health, Cranfield University, Kempston, Bedfordshire, MK43 0AL (United Kingdom); Shu Baihua, E-mail: shubaihua@hotmail.com [Ministry of Education Key Laboratory of Environment and Health, School of Public Health, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei Province 430030 (China)

    2011-05-15

    Currently HPLC/MS is the state-of-the-art tool for monitoring perfluorooctane sulfonate (PFOS) in environmental and drinking water. PFOS can bind to peroxisomal proliferator-activated receptor-alpha (PPARα), which forms heterodimers with retinoid X receptors (RXRs) and binds to PPAR response elements. In this bioassay, free PFOS in water samples competes with immobilized PFOS in ELISA plates for a given amount of PPARα-RXRα. It can be determined indirectly by immobilizing the PPARα-RXRα-PFOS complex on another plate coated with PPARα antibody and subsequently measuring the level of PPARα-RXRα using a biotin-modified PPARα-RXRα probe-quantum dot-streptavidin detection system. The rapid, high-throughput bioassay demonstrated a detection limit of 2.5 ng L⁻¹ with a linear range between 2.5 ng L⁻¹ and 75 ng L⁻¹. Detection results for environmental water samples were highly consistent between the bioassay and HPLC/MS. Highlights: a rapid, high-throughput bioassay was developed for monitoring PFOS in water; PFOS concentrations in water samples were detected by two methods; the bioassay is effective for evaluating PFOS contamination levels.
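
Within the reported linear range (2.5-75 ng/L), quantifying an unknown typically amounts to inverting a calibration line fitted to standards. The sketch below uses invented signal readings for a competitive assay (signal falls as free PFOS rises); it is illustrative, not the paper's actual calibration.

```python
# Illustrative calibration for the reported linear range (2.5-75 ng/L):
# fit a least-squares line through hypothetical standard readings, then
# invert it to quantify an unknown. The signal values are invented; in a
# competitive assay the signal falls as free PFOS rises.
from statistics import mean

stds = [2.5, 10.0, 25.0, 50.0, 75.0]        # PFOS standards, ng/L
signal = [0.95, 0.80, 0.50, 0.25, 0.05]     # hypothetical assay readings

mx, my = mean(stds), mean(signal)
num = sum((x - mx) * (y - my) for x, y in zip(stds, signal))
den = sum((x - mx) ** 2 for x in stds)
slope = num / den
intercept = my - slope * mx

def concentration(sig):
    """Invert the fitted line: estimated ng/L for a sample reading."""
    return (sig - intercept) / slope
```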

  16. High-Throughput Synthetic Chemistry Enabled by Organic Solvent Disintegrating Tablet.

    Science.gov (United States)

    Li, Tingting; Xu, Lei; Xing, Yanjun; Xu, Bo

    2017-01-17

    Synthetic chemistry remains a time- and labor-intensive process of an inherently hazardous nature. Our organic solvent disintegrating tablet (O-Tab) technology has shown potential to make industrial/synthetic chemistry more efficient. As is the case with pharmaceutical tablets, our reagent-containing O-Tabs are mechanically strong but disintegrate rapidly on contact with reaction media (organic solvents). O-Tabs containing sensitive chemicals can be further coated to insulate them from air and moisture. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. High throughput Sr isotope analysis using an automated column chemistry system

    Science.gov (United States)

    Mackey, G. N.; Fernandez, D.

    2011-12-01

    A new method has been developed for rapidly measuring 87Sr/86Sr isotope ratios using an autosampler that automates column chemistry for Sr purification. The autosampler, a SC2 DX with FAST2 valve block, produced by Elemental Scientific, Inc., utilizes a pair of six-way valves, a vacuum, and a peristaltic pump to load a sample from an autosampler tube onto the Eichrom Sr Resin in the separation column. The autosampler then elutes the sample from the column directly into the spray chamber of the mass spectrometer. Measurements are made on a Thermo-Finnigan Neptune ICP-MS. Sample-blank pairs require approximately 30 minutes for analysis. Normal throughput for the system is 24 samples and 11 standards per day. Adjustment of the pump speed allows for rapid loading of the column followed by a 3-minute data acquisition period with no fractionation of the Sr being eluted from the column. All data are blank-, interference-, and normalization-corrected online using 86Sr/88Sr = 0.1194. Analytical precision on a typical 66 ng/g analysis is ±0.00003 (2σ SE). Reproducibility of the SRM987 Sr standard (66 ng/g) over the course of a typical sequence is ±0.00004 (2σ SD, n=11). For comparison, offline column separation of the SRM987 Sr standard (66 ng/g) was conducted and measured using the same instrument method, yielding a reproducibility of ±0.00004 (2σ SD, n=7). The long-term average of the SRM987 standard (10-200 ng/g) utilizing the online column chemistry method is 0.71027 ± 0.00010 (2σ SD, n=239). A small memory effect has been measured by alternating spiked samples (87Sr/86Sr = 0.67465) with the SRM987 standard. The bias measured in this test (87Sr/86Sr +0.00006) slightly exceeds the 2σ standard reproducibility for a typical run with sample and standard concentrations near 66 ng/g, but is within the 2σ long-term reproducibility of the method. The optimal concentration range for the offline column chemistry system is 50-250 ng/g Sr. Sample concentrations above 250
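
The online normalization mentioned above (86Sr/88Sr = 0.1194) is conventionally an exponential-law mass-bias correction, which can be sketched as follows. The measured ratios below are invented and the isotope masses rounded; a real data reduction also subtracts blanks and corrects the 87Rb isobaric interference, as the abstract notes.

```python
# Sketch of exponential-law mass-bias normalization to 86Sr/88Sr = 0.1194.
# The measured ratios below are invented and the masses rounded; a real
# reduction also subtracts blanks and corrects the 87Rb interference.
import math

M86, M87, M88 = 85.9093, 86.9089, 87.9056   # Sr isotope masses (u)
TRUE_86_88 = 0.1194                          # canonical normalization ratio

def correct_87_86(meas_87_86, meas_86_88):
    beta = math.log(TRUE_86_88 / meas_86_88) / math.log(M86 / M88)
    return meas_87_86 * (M87 / M86) ** beta

print(round(correct_87_86(0.71100, 0.11850), 5))
```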

  18. Advances in quantum chemistry

    CERN Document Server

    Sabin, John R

    2013-01-01

    Advances in Quantum Chemistry presents surveys of current topics in this rapidly developing field that has emerged at the cross section of the historically established areas of mathematics, physics, chemistry, and biology. It features detailed reviews written by leading international researchers. This volume focuses on the theory of heavy-ion physics in medicine.

  19. Principles of quantum chemistry

    CERN Document Server

    George, David V

    2013-01-01

    Principles of Quantum Chemistry focuses on the application of quantum mechanics in physical models and experiments of chemical systems.This book describes chemical bonding and its two specific problems - bonding in complexes and in conjugated organic molecules. The very basic theory of spectroscopy is also considered. Other topics include the early development of quantum theory; particle-in-a-box; general formulation of the theory of quantum mechanics; and treatment of angular momentum in quantum mechanics. The examples of solutions of Schroedinger equations; approximation methods in quantum c

  20. Quantum mechanics in chemistry

    CERN Document Server

    Schatz, George C

    2002-01-01

    Intended for graduate and advanced undergraduate students, this text explores quantum mechanical techniques from the viewpoint of chemistry and materials science. Dynamics, symmetry, and formalism are emphasized. An initial review of basic concepts from introductory quantum mechanics is followed by chapters examining symmetry, rotations, and angular momentum addition. Chapter 4 introduces the basic formalism of time-dependent quantum mechanics, emphasizing time-dependent perturbation theory and Fermi's golden rule. Chapter 5 sees this formalism applied to the interaction of radiation and matt

  1. Quantum chemistry an introduction

    CERN Document Server

    Kauzmann, Walter

    2013-01-01

    Quantum Chemistry: An Introduction provides information pertinent to the fundamental aspects of quantum mechanics. This book presents the theory of partial differential equations by using the classical theory of vibrations as a means of developing physical insight into this essential branch of mathematics. Organized into five parts encompassing 16 chapters, this book begins with an overview of how quantum mechanical deductions are made. This text then describes the achievements and limitations of the application of quantum mechanics to chemical problems. Other chapters provide a brief survey

  2. Fundamentals of quantum chemistry

    CERN Document Server

    House, J E

    2004-01-01

    An introduction to the principles of quantum mechanics needed in physical chemistry. Mathematical tools are presented and developed as needed, and only basic calculus, chemistry, and physics are assumed. Applications include atomic and molecular structure, spectroscopy, alpha decay, tunneling, and superconductivity. The new edition includes sections on perturbation theory, orbital symmetry of diatomic molecules, the Hückel MO method and Woodward-Hoffmann rules, as well as a new chapter on SCF and Hartree-Fock methods. * This revised text clearly presents basic q

  3. Mapping quantum yield for (Fe-Zn-Sn-Ti)Ox photoabsorbers using a high throughput photoelectrochemical screening system.

    Science.gov (United States)

    Xiang, Chengxiang; Haber, Joel; Marcin, Martin; Mitrovic, Slobodan; Jin, Jian; Gregoire, John M

    2014-03-10

    Combinatorial synthesis and screening of light absorbers are critical to material discoveries for photovoltaic and photoelectrochemical applications. One of the most effective ways to evaluate the energy-conversion properties of a semiconducting light absorber is to form an asymmetric junction and investigate the photogeneration, transport and recombination processes at the semiconductor interface. This standard photoelectrochemical measurement is readily made on a semiconductor sample with a back-side metallic contact (working electrode) and front-side solution contact. In a typical combinatorial material library, each sample shares a common back contact, requiring novel instrumentation to provide spatially resolved and thus sample-resolved measurements. We developed a multiplexing counter electrode with a thin layer assembly, in which a rectifying semiconductor/liquid junction was formed and the short-circuit photocurrent was measured under chopped illumination for each sample in a material library. The multiplexing counter electrode assembly demonstrated a photocurrent sensitivity of sub-10 μA cm⁻² with an external quantum yield sensitivity of 0.5% for each semiconductor sample under a monochromatic ultraviolet illumination source. The combination of cell architecture and multiplexing allows high-throughput modes of operation, including both fast-serial and parallel measurements. To demonstrate the performance of the instrument, the external quantum yields of 1819 different compositions from a pseudoquaternary metal oxide library, (Fe-Zn-Sn-Ti)Ox, at 385 nm were collected in scanning serial mode with a throughput as fast as 1 s per sample. Preliminary screening results identified a promising ternary composition region centered at Fe0.894Sn0.103Ti0.0034Ox, with an external quantum yield of 6.7% at 385 nm.
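
The screened quantity, external quantum yield at a single wavelength, is simply electrons collected per incident photon. The 385 nm wavelength comes from the abstract; the photocurrent and illumination intensity in the example are assumed inputs for the sketch.

```python
# External quantum yield at a single wavelength: electrons collected per
# incident photon. The 385 nm wavelength comes from the abstract; the
# photocurrent and illumination intensity below are assumed example inputs.
E_CHARGE = 1.602e-19   # elementary charge, C
PLANCK = 6.626e-34     # Planck constant, J s
C_LIGHT = 2.998e8      # speed of light, m/s

def eqe(j_a_cm2, p_w_cm2, wavelength_m):
    electrons = j_a_cm2 / E_CHARGE                         # per s per cm^2
    photons = p_w_cm2 * wavelength_m / (PLANCK * C_LIGHT)  # per s per cm^2
    return electrons / photons

# 10 uA/cm^2 of photocurrent under 1 mW/cm^2 of 385 nm light:
print(round(100 * eqe(10e-6, 1e-3, 385e-9), 1))  # external quantum yield, %
```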

  4. Product Chemistry and Process Efficiency of Biomass Torrefaction, Pyrolysis and Gasification Studied by High-Throughput Techniques and Multivariate Analysis

    Science.gov (United States)

    Xiao, Li

    Despite great enthusiasm for and sustained effort on the development of renewable energy from biomass, the commercialization and scale-up of biofuel production is still under pressure and facing challenges. New ideas and facilities are being tested around the world, aimed at reducing cost and improving product value. Cutting-edge technologies involving analytical chemistry, statistical analysis, industrial engineering, computer simulation, and mathematical modeling keep integrating modern elements into this classic research. One challenge of commercializing biofuel production is the complexity of the chemical composition of biomass feedstocks and their products. Because of this, feedstock selection and process optimization cannot be conducted efficiently. This dissertation attempts to further evaluate the biomass thermal decomposition process using both traditional methods and an advanced technique (pyrolysis molecular beam mass spectrometry). Focus is placed on generating a database of thermal decomposition products from biomass at different temperatures; finding the relationship between traditional methods and advanced techniques; evaluating process efficiency and optimizing reaction conditions; and comparing typically utilized biomass feedstocks with innovative species for economically viable feedstock preparation concepts. Lab-scale quartz tube reactors and 80 µL stainless steel sample cups coupled with an auto-sampling system were utilized to simulate the complicated reactions occurring in real fluidized or entrained-flow reactors. The two main high-throughput analytical techniques used are near-infrared spectroscopy (NIR) and pyrolysis molecular beam mass spectrometry (Py-MBMS). Mass balance, carbon balance, and product distribution are presented in detail. Thermal decomposition temperatures range from 200°C to 950°C. Feedstocks used in the study include typical hardwoods and softwoods (red oak, white oak, yellow poplar, loblolly pine

  5. Handbook of relativistic quantum chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Wenjian (ed.) [Peking Univ., Beijing (China). Center for Computational Science and Engineering

    2017-03-01

    This handbook focuses on the foundations of relativistic quantum mechanics and addresses a number of fundamental issues never covered before in a book. For instance: How can many-body theory be combined with quantum electrodynamics? How can quantum electrodynamics be interfaced with relativistic quantum chemistry? What is the most appropriate relativistic many-electron Hamiltonian? How can we achieve relativistic explicit correlation? How can we formulate relativistic properties? - just to name a few. Since relativistic quantum chemistry is an integral component of computational chemistry, this handbook also supplements the "Handbook of Computational Chemistry". Generally speaking, it aims to establish the "big picture" of relativistic molecular quantum mechanics as the union of quantum electrodynamics and relativistic quantum chemistry. Accordingly, it provides an accessible introduction for readers new to the field, presents advanced methodologies for experts, and discusses possible future perspectives, helping readers understand when/how to apply/develop the methodologies.

  6. Overview on the current status on virtual high-throughput screening and combinatorial chemistry approaches in multi-target anticancer drug discovery; Part II.

    Science.gov (United States)

    Geromichalos, George D; Alifieris, Constantinos E; Geromichalou, Elena G; Trafalis, Dimitrios T

    2016-01-01

    Conventional drug design embraces the "one gene, one drug, one disease" philosophy. Nowadays, a new generation of anticancer drugs, able to inhibit more than one pathway, is believed to play a major role in contemporary anticancer drug research. In this way, polypharmacology, focusing on multi-target drugs, has emerged as a new paradigm in drug discovery. A number of recent successful drugs have in part or in whole emerged from a structure-based research approach. Many advances, including crystallography and informatics, are behind these successes. In this part II we review the role and methodology of ligand-, structure- and fragment-based computer-aided drug design (CADD), virtual high-throughput screening (vHTS), de novo drug design, fragment-based design and structure-based molecular docking, homology modeling, combinatorial chemistry and library design, pharmacophore model chemistry, and informatics in modern drug discovery.

  7. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    Science.gov (United States)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high-throughput, end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far-ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  9. Quantum Computing for Quantum Chemistry

    Science.gov (United States)

    2010-09-01

    This three-year project consisted of the development and application of quantum computer algorithms for chemical applications. In particular, we developed algorithms for chemical reaction dynamics, electronic structure, and protein folding.

  10. Elementary quantum chemistry

    CERN Document Server

    Pilar, Frank L

    2003-01-01

    Useful introductory course and reference covers origins of quantum theory, Schrödinger wave equation, quantum mechanics of simple systems, electron spin, quantum states of atoms, Hartree-Fock self-consistent field method, more. 1990 edition.

  11. Chemogenomics: a discipline at the crossroad of high throughput technologies, biomarker research, combinatorial chemistry, genomics, cheminformatics, bioinformatics and artificial intelligence.

    Science.gov (United States)

    Maréchal, Eric

    2008-09-01

    Chemogenomics is the study of the interaction of functional biological systems with exogenous small molecules, or in a broader sense the study of the intersection of biological and chemical spaces. Chemogenomics requires expertise in biology, chemistry and computational sciences (bioinformatics, cheminformatics, large-scale statistics and machine learning methods), but it is more than the simple apposition of each of these disciplines. Biological entities interacting with small molecules can be isolated proteins or more elaborate systems, from single cells to complete organisms. The biological space is therefore analyzed at various postgenomic levels (genomic, transcriptomic, proteomic or any phenotypic level). The space of small molecules is partially real, corresponding to commercial and academic collections of compounds, and partially virtual, corresponding to the chemical space that could possibly be synthesized. Synthetic chemistry has developed novel strategies allowing a physical exploration of this universe of possibilities. A major challenge of cheminformatics is to chart the virtual space of small molecules using realistic biological constraints (bioavailability, druggability, structural biological information). Chemogenomics is a descendant of conventional pharmaceutical approaches, since it involves the screening of chemolibraries for their effect on biological targets, and benefits from the advances in the corresponding enabling technologies and the introduction of new biological markers. Screening was originally motivated by the rigorous discovery of new drugs, neglecting and throwing away any molecule that would fail to meet the standards required for a therapeutic treatment. It is now the basis for the discovery of small molecules that might or might not be directly used as drugs, but which have an immense potential for basic research, as probes to explore an increasing number of biological phenomena. Concerns about the environmental impact of the chemical industry…

  12. Relativistic quantum chemistry on quantum computers

    DEFF Research Database (Denmark)

    Veis, L.; Visnak, J.; Fleig, T.

    2012-01-01

    The past few years have witnessed a remarkable interest in the application of quantum computing for solving problems in quantum chemistry more efficiently than classical computers allow. Very recently, proof-of-principle experimental realizations have been reported. However, so far only the nonrelativistic regime (i.e., the Schrödinger equation) has been explored, while it is well known that relativistic effects can be very important in chemistry. We present a quantum algorithm for relativistic computations of molecular energies. We show how to efficiently solve the eigenproblem of the Dirac-Coulomb Hamiltonian on a quantum computer and demonstrate the functionality of the proposed procedure by numerical simulations of computations of the spin-orbit splitting in the SbH molecule. Finally, we propose quantum circuits with three qubits and nine or ten controlled-NOT (CNOT) gates, which implement a proof…

  13. Topological quantum chemistry

    Science.gov (United States)

    Bradlyn, Barry; Elcoro, L.; Cano, Jennifer; Vergniory, M. G.; Wang, Zhijun; Felser, C.; Aroyo, M. I.; Bernevig, B. Andrei

    2017-07-01

    Since the discovery of topological insulators and semimetals, there has been much research into predicting and experimentally discovering distinct classes of these materials, in which the topology of electronic states leads to robust surface states and electromagnetic responses. This apparent success, however, masks a fundamental shortcoming: topological insulators represent only a few hundred of the 200,000 stoichiometric compounds in material databases. However, it is unclear whether this low number is indicative of the esoteric nature of topological insulators or of a fundamental problem with the current approaches to finding them. Here we propose a complete electronic band theory, which builds on the conventional band theory of electrons, highlighting the link between the topology and local chemical bonding. This theory of topological quantum chemistry provides a description of the universal (across materials), global properties of all possible band structures and (weakly correlated) materials, consisting of a graph-theoretic description of momentum (reciprocal) space and a complementary group-theoretic description in real space. For all 230 crystal symmetry groups, we classify the possible band structures that arise from local atomic orbitals, and show which are topologically non-trivial. Our electronic band theory sheds new light on known topological insulators, and can be used to predict many more.

  14. Introducing Relativity into Quantum Chemistry

    Science.gov (United States)

    Li, Wai-Kee; Blinder, S. M.

    2011-01-01

    It is not often realized by chemists that the special theory of relativity is behind several aspects of quantum chemistry. The Schrödinger equation itself is based on relations between space-time and energy-momentum four-vectors. Electron spin is, of course, the most obvious manifestation of relativity. The chemistry of some heavy elements is…

  15. Remedial Mathematics for Quantum Chemistry

    Science.gov (United States)

    Koopman, Lodewijk; Brouwer, Natasa; Heck, Andre; Buma, Wybren Jan

    2008-01-01

    Proper mathematical skills are important for every science course and mathematics-intensive chemistry courses rely on a sound mathematical pre-knowledge. In the first-year quantum chemistry course at this university, it was noticed that many students lack basic mathematical knowledge. To tackle the mathematics problem, a remedial mathematics…

  17. High throughput drug profiling

    OpenAIRE

    Entzeroth, Michael; Chapelain, Béatrice; Guilbert, Jacques; Hamon, Valérie

    2000-01-01

    High throughput screening has significantly contributed to advances in drug discovery. The great increase in the number of samples screened has been accompanied by increases in costs and in the data required for the investigated compounds. High throughput profiling addresses the issues of compound selectivity and specificity. It combines conventional screening with data mining technologies to give a full set of data, enabling development candidates to be more fully compared.

  18. Problems in Quantum Chemistry and Spectroscopy

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2015-01-01

    A collection of 22 introductory exercise problems for the course "Quantum Chemistry and Spectroscopy (QCS)".

  19. Handbook of computational quantum chemistry

    CERN Document Server

    Cook, David B

    2005-01-01

    Quantum chemistry forms the basis of molecular modeling, a tool widely used to obtain important chemical information and visual images of molecular systems. Recent advances in computing have resulted in considerable developments in molecular modeling, and these developments have led to significant achievements in the design and synthesis of drugs and catalysts. This comprehensive text provides upper-level undergraduates and graduate students with an introduction to the implementation of quantum ideas in molecular modeling, exploring practical applications alongside theoretical explanations.

  20. Quantum Chemistry via the Periodic Law.

    Science.gov (United States)

    Blinder, S. M.

    1981-01-01

    Describes an approach to quantum mechanics exploiting the periodic structure of the elements as a foundation for the quantum theory of matter. Indicates that a quantum chemistry course can be developed using this approach. (SK)

  1. A high-throughput homogeneous immunoassay based on Förster resonance energy transfer between quantum dots and gold nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Jing [School of Chemistry and Chemical Engineering, Southeast University, Nanjing 211189 (China); School of Chemistry and Chemical Engineering, Jiangsu University, Zhengjiang 212013 (China); Wang, Chengquan [Changzhou College of Information Technology, Changzhou 213164 (China); Pan, Xiaohu [School of Chemistry and Chemical Engineering, Southeast University, Nanjing 211189 (China); Liu, Songqin, E-mail: liusq@seu.edu.cn [School of Chemistry and Chemical Engineering, Southeast University, Nanjing 211189 (China)

    2013-02-06

    Graphical abstract: A Förster resonance energy transfer (FRET) system using polyclonal goat anti-CEA antibody-labeled luminescent CdTe quantum dots (QDs) as donor and monoclonal goat anti-CEA antibody-labeled gold nanoparticles (AuNPs) as acceptor was proposed for sensitive detection of a tumor marker. Highlights: ► A homogeneous immunosensing strategy based on FRET for detection of a tumor marker was proposed. ► Proximity of the QDs and AuNPs quenches the photoluminescence of the nano-bio-probes. ► Signal quenching was monitored with a self-developed image analyzer. ► The fluorometric assay format is attractive for widespread carcinoma screening and even field use. -- Abstract: A novel homogeneous immunoassay based on Förster resonance energy transfer for sensitive detection of a tumor marker, carcinoembryonic antigen (CEA), is proposed. The assay consisted of polyclonal goat anti-CEA antibody-labeled luminescent CdTe quantum dots (QDs) as donor and monoclonal goat anti-CEA antibody-labeled gold nanoparticles (AuNPs) as acceptor. In the presence of CEA, the bio-affinity between antigen and antibody brings the QDs and AuNPs close enough that photoluminescence (PL) quenching of the CdTe QDs occurs. The PL change could be transformed into a fluorometric variation corresponding to the target antigen concentration, easily monitored and analyzed with home-made image analysis software. The fluorometric results indicated a linear detection range of 1–110 ng mL⁻¹ for CEA, with a detection limit of 0.3 ng mL⁻¹. The proposed assay configuration is attractive for carcinoma screening, point-of-care testing of single samples, and even field use. Although demonstrated here for one model analyte, this approach could easily be extended to the detection of a wide range of biomarkers.
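    The distance dependence that makes this QD-AuNP quenching scheme work is the standard Förster relation, E = R0⁶/(R0⁶ + r⁶): antibody-antigen binding pulls donor and acceptor within the Förster radius, switching energy transfer on. A minimal sketch; the Förster radius and distances below are illustrative assumptions, not values reported in the record.

```python
# Förster resonance energy transfer (FRET) efficiency vs. donor-acceptor
# distance r. The Förster radius r0 (distance of 50% transfer) is an
# assumed, illustrative value, not one measured in the record above.

def fret_efficiency(r_nm: float, r0_nm: float = 5.0) -> float:
    """Standard Förster relation: E = r0^6 / (r0^6 + r^6)."""
    return r0_nm**6 / (r0_nm**6 + r_nm**6)

# Antibody-antigen binding pulls QD donor and AuNP acceptor close together:
near = fret_efficiency(3.0)    # bound complex -> strong quenching
far = fret_efficiency(12.0)    # unbound -> negligible transfer
assert near > 0.9 and far < 0.01
```

    At r = R0 the efficiency is exactly 0.5, which is why binding-induced distance changes around R0 give the largest signal contrast.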

  2. High-throughput and rapid fluorescent visualization sensor of urinary citrate by CdTe quantum dots.

    Science.gov (United States)

    Zhuo, Shujuan; Gong, Jiajia; Zhang, Ping; Zhu, Changqing

    2015-08-15

    In this paper, we present a novel CdTe quantum dot (QD)-based fluorescent sensor for visual, turn-on sensing of citrate in human urine samples. The europium ion (Eu³⁺) quenches the fluorescence of thioglycollic acid (TGA)-modified CdTe QDs through photoinduced electron transfer, accompanied by a change of emission color from yellow to orange. Addition of citrate then breaks the preformed assembly, because Eu³⁺ has a higher affinity for citrate than for the CdTe QDs. The photoinduced electron transfer is thus switched off, the fluorescence emission of the CdTe QDs is rapidly (within 5 min) recovered, and the orange emission color returns to yellow. The proposed strategy may conveniently discriminate renal-stone patients from healthy individuals by the naked eye. Beyond visual detection, the fluorescence response can be used to quantify citrate in the range of 0.67–133 μM. This simple, low-cost, visual citrate fluorescence sensor therefore has great potential for early screening in clinical detection.
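    Quantification in such a turn-on sensor rests on a linear calibration of fluorescence recovery against concentration over the stated 0.67–133 μM range. A sketch of that workflow; the slope, intercept, and "measured" recoveries are made-up numbers, since the record only states that the response is linear in that range.

```python
import numpy as np

# Illustrative linear calibration over the reported 0.67-133 uM range.
# The calibration line (0.005 per uM, offset 0.10) is a fabricated example.
citrate_uM = np.array([0.67, 10.0, 40.0, 80.0, 133.0])
recovery = 0.005 * citrate_uM + 0.10      # synthetic fluorescence recovery

# Least-squares fit of the calibration line:
slope, intercept = np.polyfit(citrate_uM, recovery, 1)

def citrate_from_recovery(f):
    """Invert the calibration line to estimate citrate concentration (uM)."""
    return (f - intercept) / slope

est = citrate_from_recovery(0.005 * 50.0 + 0.10)
assert abs(est - 50.0) < 1e-6
```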

  3. Real-time Quantum Chemistry

    CERN Document Server

    Haag, Moritz P

    2012-01-01

    Significant progress in the development of efficient and fast algorithms for quantum chemical calculations has been made in the past two decades. The main focus has always been the desire to be able to treat ever larger molecules or molecular assemblies---especially linear and sub-linear scaling techniques are devoted to the accomplishment of this goal. However, as many chemical reactions are rather local, they usually involve only a limited number of atoms so that models of about two hundred (or even less) atoms embedded in a suitable environment are sufficient to study their mechanisms. Thus, the system size does not need to be enlarged, but remains constant for reactions of this type that can be described by less than two hundred atoms. The question then arises how fast one can obtain the quantum chemical results. This question is not directly answered by linear-scaling techniques. In fact, ideas such as haptic quantum chemistry or interactive quantum chemistry require an immediate provision of quantum che...

  4. Computing protein infrared spectroscopy with quantum chemistry.

    Science.gov (United States)

    Besley, Nicholas A

    2007-12-15

    Quantum chemistry is a field of science that has undergone unprecedented advances in the last 50 years. From the pioneering work of Boys in the 1950s, quantum chemistry has evolved from being regarded as a specialized and esoteric discipline to a widely used tool that underpins much of the current research in chemistry today. This achievement was recognized with the award of the 1998 Nobel Prize in Chemistry to John Pople and Walter Kohn. As the new millennium unfolds, quantum chemistry stands at the forefront of an exciting new era. Quantitative calculations on systems of the magnitude of proteins are becoming a realistic possibility, an achievement that would have been unimaginable to the early pioneers of quantum chemistry. In this article we will describe ongoing work towards this goal, focusing on the calculation of protein infrared amide bands directly with quantum chemical methods.

  5. Quantum Nanobiology and Biophysical Chemistry

    DEFF Research Database (Denmark)

    2013-01-01

    An introduction was provided in the first issue by way of an Editorial to this special two-issue volume of Current Physical Chemistry – "Quantum Nanobiology and Biophysical Chemistry" [1]. The Guest Editors would like to thank all the authors and referees who have contributed to this second issue. … demonstrate extremely low detection performance of acyl-homoserine lactone in a biologically relevant system using surface-enhanced Raman spectroscopy. Sugihara and Bondar evaluate the influence of methyl groups and the protein environment on retinal geometries in rhodopsin and bacteriorhodopsin, two…

  6. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    Science.gov (United States)

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry and provides numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.
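    The locality argument above concerns how many qubits a mapped fermionic operator touches. Under the Jordan-Wigner transformation, a_j maps to (X_j ± iY_j)/2 times a string of Z operators on qubits 0..j-1, so the Pauli weight grows linearly with the orbital index; the Bravyi-Kitaev transformation discussed in the record reduces this to O(log n). Only the Jordan-Wigner case is sketched here, ignoring the X/Y linear combination.

```python
# Pauli support of the Jordan-Wigner image of the fermionic operator a_j
# on n qubits: Z parity string on qubits 0..j-1, then X on qubit j.

def jw_pauli_string(j: int, n: int) -> str:
    """Qubits acted on by a_j under Jordan-Wigner (X stands in for X/Y)."""
    return "Z" * j + "X" + "I" * (n - j - 1)

def pauli_weight(s: str) -> int:
    """Number of non-identity Pauli factors."""
    return sum(1 for p in s if p != "I")

n = 8
weights = [pauli_weight(jw_pauli_string(j, n)) for j in range(n)]
assert weights == [1, 2, 3, 4, 5, 6, 7, 8]   # linear growth with j
```

    The linear growth of these Z strings is exactly the overhead that the Bravyi-Kitaev encoding, combined with spatial locality of the integrals, helps to tame.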

  7. Understanding Quantum Numbers in General Chemistry Textbooks

    Science.gov (United States)

    Niaz, Mansoor; Fernandez, Ramon

    2008-01-01

    Quantum numbers and electron configurations form an important part of the general chemistry curriculum and textbooks. The objectives of this study are: (1) Elaboration of a framework based on the following aspects: (a) Origin of the quantum hypothesis, (b) Alternative interpretations of quantum mechanics, (c) Differentiation between an orbital and…

  9. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation, in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is scale-up of the process volume. In this work, we investigate a potentially scalable, high-throughput plasma water reactor that utilizes a packed-bed, dielectric-barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which provide insight into the reactor's operation so that its efficiency can be assessed. Supported by NSF (CBET 1336375).
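    Mapping dye concentration against treatment time, as described above, is typically analyzed with a pseudo-first-order decay model, C(t) = C0·exp(-kt). A sketch of extracting the rate constant from such a time series; the rate constant and synthetic data are illustrative assumptions, not measurements from the reactor in the record.

```python
import numpy as np

# Pseudo-first-order decay of methylene blue under plasma treatment,
# C(t) = C0 * exp(-k t). k_true and the data are fabricated for illustration.
k_true = 0.12                      # assumed rate constant, 1/min
t = np.linspace(0.0, 20.0, 11)     # treatment times, min
C = 5.0 * np.exp(-k_true * t)      # synthetic concentrations, mg/L

# A log-linear least-squares fit recovers the rate constant from ln C vs t:
slope, intercept = np.polyfit(t, np.log(C), 1)
k_fit = -slope
assert abs(k_fit - k_true) < 1e-9
```

    With real (noisy) data the same fit yields k with an uncertainty, and comparing k across reactor geometries is one way to assess treatment efficiency.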

  10. A Quantum Chemistry Concept Inventory for Physical Chemistry Classes

    Science.gov (United States)

    Dick-Perez, Marilu; Luxford, Cynthia J.; Windus, Theresa L.; Holme, Thomas

    2016-01-01

    A 14-item, multiple-choice diagnostic assessment tool, the quantum chemistry concept inventory or QCCI, is presented. Items were developed based on published student misconceptions and content coverage and then piloted and used in advanced physical chemistry undergraduate courses. In addition to the instrument itself, data from both a pretest,…

  12. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with only polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth of the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation for quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry beyond classical computation.

  13. Overview on the current status of virtual high-throughput screening and combinatorial chemistry approaches in multi-target anticancer drug discovery; Part I.

    Science.gov (United States)

    Geromichalos, George D; Alifieris, Constantinos E; Geromichalou, Elena G; Trafalis, Dimitrios T

    2016-01-01

    Conventional drug design embraces the "one gene, one drug, one disease" philosophy. Nowadays, a new generation of anticancer drugs, able to inhibit more than one pathway, is believed to play a major role in contemporary anticancer drug research. In this way, polypharmacology, focusing on multi-target drugs, has emerged as a new paradigm in drug discovery. A number of recent successful drugs have in part or in whole emerged from a structure-based research approach. Many advances, including crystallography and informatics, are behind these successes. Increasing insight into the genetics and molecular biology of cancer has resulted in the identification of a growing number of potential molecular targets for anticancer drug discovery and development. These targets can be approached through exploitation of emerging structural biology, "rational" drug design, screening of chemical libraries, or a combination of these methods. The result is the rapid discovery of new anticancer drugs. In this article we discuss the application of molecular modeling, molecular docking and virtual high-throughput screening to multi-targeted anticancer drug discovery. Efforts have been made to employ in silico methods to facilitate the search for and design of selective multi-target agents. These computer-aided molecular design methods have shown promising potential in facilitating drug discovery directed at multiple selective targets and are expected to contribute to the identification of lead anticancer compounds.

  14. Density functional theory in quantum chemistry

    CERN Document Server

    Tsuneda, Takao

    2014-01-01

    This book examines density functional theory based on the foundation of quantum chemistry. Unconventional in approach, it reviews basic concepts, then describes the physical meanings of state-of-the-art exchange-correlation functionals and their corrections.

  15. The Application of SCC-DV-Xα Computational Method of Quantum Chemistry in Cement Chemistry

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The reasons for applying quantum chemistry to the research field of cement chemistry are explored. The fundamental theory of the SCC-DV-Xα computational method of quantum chemistry is synopsized. Results obtained in recent years by computational quantum chemistry methods on the valence-bond structures and hydration activity of some cement clinker minerals, and on the mechanical strength and stability of some hydrates, are summarized and evaluated. Finally, prospects for the future application of quantum chemistry to cement chemistry are outlined.

  16. Simulating chemistry using quantum computers

    CERN Document Server

    Kassal, Ivan; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2010-01-01

    The difficulty of simulating quantum systems, well-known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  17. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  18. Towards quantum chemistry on a quantum computer.

    Science.gov (United States)

    Lanyon, B P; Whitfield, J D; Gillett, G G; Goggin, M E; Almeida, M P; Kassal, I; Biamonte, J D; Mohseni, M; Powell, B J; Barbieri, M; Aspuru-Guzik, A; White, A G

    2010-02-01

    Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.
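    The "20 bits of precision" reported above corresponds to reading the binary expansion of the molecular eigenphase one digit at a time, as in iterative phase estimation. A noiseless classical emulation of that readout; the phase value is an illustrative, exactly representable number, not the actual H₂ eigenphase.

```python
# Iterative phase estimation, idealized: extract the binary digits of an
# eigenphase phi in [0, 1), so that phi = sum_k bits[k] / 2**(k+1).

def phase_bits(phi, nbits):
    """Return the first nbits binary digits of phi (no noise, no aliasing)."""
    bits = []
    for _ in range(nbits):
        phi *= 2.0
        bit = int(phi)        # most significant remaining digit
        bits.append(bit)
        phi -= bit
    return bits

bits = phase_bits(0.8125, 20)            # 0.8125 = 0.1101 in binary
assert bits[:4] == [1, 1, 0, 1]
assert sum(b / 2**(k + 1) for k, b in enumerate(bits)) == 0.8125
```

    On hardware each digit comes from a measurement statistic rather than an exact arithmetic step, so the achievable bit count is set by circuit fidelity, which is what makes 20 bits a notable benchmark.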

  19. Adiabatic quantum simulation of quantum chemistry.

    Science.gov (United States)

    Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-10-13

    We show how to apply the quantum adiabatic algorithm directly to the quantum computation of molecular properties. We describe a procedure to map electronic structure Hamiltonians to 2-body qubit Hamiltonians with a small set of physically realizable couplings. By combining the Bravyi-Kitaev construction to map fermions to qubits with perturbative gadgets to reduce the Hamiltonian to 2-body, we obtain precision requirements on the coupling strengths and a number of ancilla qubits that scale polynomially in the problem size. Hence our mapping is efficient. The required set of controllable interactions includes only two types of interaction beyond the Ising interactions required to apply the quantum adiabatic algorithm to combinatorial optimization problems. Our mapping may also be of interest to chemists directly as it defines a dictionary from electronic structure to spin Hamiltonians with physical interactions.

  20. Quantum theory and chemistry: Two propositions

    Science.gov (United States)

    Aronowitz, S.

    1980-01-01

    Two propositions concerning quantum chemistry are put forward. First, it is proposed that the nonrelativistic Schroedinger equation, where the Hamiltonian operator is associated with an assemblage of nuclei and electrons, can never be arranged to yield specific molecules in the chemists' sense. It is argued that this result is a necessary condition if the Schroedinger equation is to have relevance to chemistry. Second, once a system is in a particular state with regard to interactions among its components (the assemblage of nuclei and electrons), it cannot spontaneously eliminate any of those interactions. This leads to a subtle form of irreversibility.

  1. Cellular Effect of High Doses of Silica-Coated Quantum Dot Profiled with High Throughput Gene Expression Analysis and High Content Cellomics Measurements

    OpenAIRE

    Zhang, Tingting; Stilwell, Jackie L.; Gerion, Daniele; Ding, Lianghao; Elboudwarej, Omeed; Cooke, Patrick A.; Gray, Joe W.; Alivisatos, A. Paul; Chen, Fanqing Frank

    2006-01-01

    Quantum dots (Qdots) are now used extensively for labeling in biomedical research, and this use is predicted to grow because of their many advantages over alternative labeling methods. Uncoated Qdots made of core/shell CdSe/ZnS are toxic to cells because of the release of Cd2+ ions into the cellular environment. This problem has been partially overcome by coating Qdots with polymers, poly(ethylene glycol) (PEG), or other inert molecules. The most promising coating to date, for reducing toxici...

  2. Orbital entanglement in quantum chemistry

    CERN Document Server

    Boguslawski, Katharina

    2014-01-01

The basic concepts of orbital entanglement and its application to chemistry are briefly reviewed. The calculation of orbital entanglement measures from correlated wavefunctions is discussed in terms of reduced $n$-particle density matrices. Possible simplifications in their evaluation are highlighted in the case of seniority-zero wavefunctions. Specifically, orbital entanglement allows us to dissect electron correlation effects into their strong and weak contributions, to determine bond orders, to assess the quality and stability of active space calculations, to monitor chemical reactions, and to identify points along the reaction coordinate where electronic wavefunctions change drastically. Thus, orbital entanglement represents a useful and intuitive tool to interpret complex electronic wavefunctions and to facilitate a qualitative understanding of electronic structure and how it changes in chemical processes.
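A minimal sketch of the single-orbital entanglement measure discussed above, assuming the eigenvalues of the one-orbital reduced density matrix are already in hand; the example occupation probabilities are invented.

```python
import numpy as np

# The single-orbital entanglement entropy is the von Neumann entropy of
# the one-orbital reduced density matrix. For a seniority-zero wavefunction
# this RDM is diagonal in the occupation basis (empty, up, down, doubly
# occupied), so the entropy follows from four probabilities directly.
def orbital_entropy(w):
    """von Neumann entropy s = -sum_a w_a ln w_a of RDM eigenvalues w."""
    w = np.asarray(w, dtype=float)
    w = w[w > 0]                      # 0 ln 0 = 0 by convention
    return float(-np.sum(w * np.log(w)))

# A strongly correlated orbital: empty and doubly occupied with equal weight.
print(orbital_entropy([0.5, 0.0, 0.0, 0.5]))  # ln 2 ≈ 0.693
```

An entropy near zero flags a weakly correlated orbital; values approaching ln 4 flag strong correlation.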

  3. High-throughput methods for electron crystallography.

    Science.gov (United States)

    Stokes, David L; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas

    2013-01-01

Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing a native lipid environment for these proteins. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, electron microscopy can be used to collect images and diffraction patterns, and the corresponding data can be combined to produce a three-dimensional reconstruction, which under favorable conditions can extend to atomic resolution. Like X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on titration of cyclodextrin as a chelating agent for detergent; a specialized pipetting robot has been designed not only to add cyclodextrin in a systematic way, but to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described.

  4. Quantum Dots: An Experiment for Physical or Materials Chemistry

    Science.gov (United States)

    Winkler, L. D.; Arceo, J. F.; Hughes, W. C.; DeGraff, B. A.; Augustine, B. H.

    2005-01-01

An experiment for preparing quantum dots, suitable for a physical or materials chemistry course, is described. The experiment serves both to reinforce the basic concept of quantum confinement and to provide a useful bridge between the molecular and solid-state worlds.

  5. Cellular effect of high doses of silica-coated quantum dot profiled with high throughput gene expression analysis and high content cellomics measurements.

    Science.gov (United States)

    Zhang, Tingting; Stilwell, Jackie L; Gerion, Daniele; Ding, Lianghao; Elboudwarej, Omeed; Cooke, Patrick A; Gray, Joe W; Alivisatos, A Paul; Chen, Fanqing Frank

    2006-04-01

Quantum dots (Qdots) are now used extensively for labeling in biomedical research, and this use is predicted to grow because of their many advantages over alternative labeling methods. Uncoated Qdots made of core/shell CdSe/ZnS are toxic to cells because of the release of Cd2+ ions into the cellular environment. This problem has been partially overcome by coating Qdots with polymers, poly(ethylene glycol) (PEG), or other inert molecules. The most promising coating to date, for reducing toxicity, appears to be PEG. When PEG-coated silanized Qdots (PEG-silane-Qdots) are used to treat cells, toxicity is not observed, even at dosages above 10-20 nM, a concentration inducing death when cells are treated with polymer- or mercaptoacid-coated Qdots. Because of the importance of Qdots in current and future biomedical and clinical applications, we believe it is essential to more completely understand and verify this negative global response from cells treated with PEG-silane-Qdots. Consequently, we examined the molecular and cellular response of cells treated with two different dosages of PEG-silane-Qdots. Human fibroblasts were exposed to 8 and 80 nM of these Qdots, and both phenotypic and whole-genome expression measurements were made. PEG-silane-Qdots did not induce any statistically significant cell cycle changes and minimal apoptosis/necrosis in lung fibroblasts (IMR-90), as measured by high content image analysis, regardless of the treatment dosage. A slight increase in apoptosis/necrosis was observed in treated human skin fibroblasts (HSF-42) at both the low and the high dosages. We performed genome-wide expression array analysis of HSF-42 exposed to doses of 8 and 80 nM to link the global cell response to a molecular and genetic phenotype. We used a gene array containing approximately 22,000 total probe sets, including 18,400 probe sets from known genes. Only approximately 50 genes (approximately 0.2% of all the genes tested) exhibited a statistically significant …

  6. Inverse Quantum Chemistry: Concepts and Strategies for Rational Compound Design

    CERN Document Server

    Weymuth, Thomas

    2014-01-01

The rational design of molecules and materials is becoming increasingly important. With the advent of powerful computer systems and sophisticated algorithms, quantum chemistry plays an important role in rational design. While traditional quantum chemical approaches predict the properties of a predefined molecular structure, the goal of inverse quantum chemistry is to find a structure featuring one or more desired properties. Herein, we review inverse quantum chemical approaches proposed so far and discuss their advantages as well as their weaknesses.

  7. Experimental study of quantum simulation for quantum chemistry with a nuclear magnetic resonance simulator.

    Science.gov (United States)

    Lu, Dawei; Xu, Nanyang; Xu, Boruo; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-10-13

Quantum computers have been proven able to mimic quantum systems efficiently in polynomial time. Quantum chemistry problems, such as static molecular energy calculations and dynamical chemical reaction simulations, become intractable on classical computers as the system size scales up. Therefore, quantum simulation is a feasible and effective approach to tackling quantum chemistry problems. Proof-of-principle experiments have been implemented on the calculation of hydrogen molecular energies and one-dimensional chemical isomerization reaction dynamics using nuclear magnetic resonance systems. We conclude that quantum simulation will surpass classical computers for quantum chemistry in the near future.

  8. First-principles quantum chemistry in the life sciences.

    Science.gov (United States)

    van Mourik, Tanja

    2004-12-15

The area of computational quantum chemistry, which applies the principles of quantum mechanics to molecular and condensed systems, has developed dramatically over the last decades, owing both to increased computer power and to the efficient implementation of quantum chemical methods in readily available computer programs. Because of this, accurate computational techniques can now be applied to much larger systems than before, bringing the area of biochemistry within the scope of electronic-structure quantum chemical methods. The rapid pace of progress of quantum chemistry makes it a very exciting research field; calculations that are too computationally expensive today may be feasible in a few months' time! This article reviews the current application of 'first-principles' quantum chemistry in biochemical and life sciences research, and discusses its future potential. The current capability of first-principles quantum chemistry is illustrated in a brief examination of computational studies on neurotransmitters, helical peptides, and DNA complexes.

  9. Alternative algebraic approaches in quantum chemistry

    Science.gov (United States)

    Mezey, Paul G.

    2015-01-01

Various algebraic approaches of quantum chemistry all follow a common principle: the fundamental properties and interrelations providing the most essential features of a quantum chemical representation of a molecule or a chemical process, such as a reaction, can always be described by algebraic methods. Whereas such algebraic methods often provide precise, even numerical answers, nevertheless their main role is to give a framework that can be elaborated and converted into computational methods by involving alternative mathematical techniques, subject to the constraints and directions provided by algebra. In general, algebra describes sets of interrelations, often phrased in terms of algebraic operations, without much concern for the actual entities exhibiting these interrelations. However, in many instances, the very realizations of two seemingly unrelated algebraic structures by actual quantum chemical entities or properties play additional roles, and unexpected connections between different algebraic structures often give new insight. Here we shall be concerned with two alternative algebraic structures: the fundamental group of reaction mechanisms, based on the energy-dependent topology of potential energy surfaces, and the interrelations among point symmetry groups for various distorted nuclear arrangements of molecules. These two distinct algebraic structures provide interesting interrelations, which can be exploited in actual studies of molecular conformational and reaction processes. Two relevant theorems will be discussed.

  10. Orthogonal NGS for High Throughput Clinical Diagnostics.

    Science.gov (United States)

    Chennagiri, Niru; White, Eric J; Frieden, Alexander; Lopez, Edgardo; Lieber, Daniel S; Nikiforov, Anastasia; Ross, Tristen; Batorsky, Rebecca; Hansen, Sherry; Lip, Va; Luquette, Lovelace J; Mauceli, Evan; Margulies, David; Milos, Patrice M; Napolitano, Nichole; Nizzari, Marcia M; Yu, Timothy; Thompson, John F

    2016-04-19

    Next generation sequencing is a transformative technology for discovering and diagnosing genetic disorders. However, high-throughput sequencing remains error-prone, necessitating variant confirmation in order to meet the exacting demands of clinical diagnostic sequencing. To address this, we devised an orthogonal, dual platform approach employing complementary target capture and sequencing chemistries to improve speed and accuracy of variant calls at a genomic scale. We combined DNA selection by bait-based hybridization followed by Illumina NextSeq reversible terminator sequencing with DNA selection by amplification followed by Ion Proton semiconductor sequencing. This approach yields genomic scale orthogonal confirmation of ~95% of exome variants. Overall variant sensitivity improves as each method covers thousands of coding exons missed by the other. We conclude that orthogonal NGS offers improvements in variant calling sensitivity when two platforms are used, better specificity for variants identified on both platforms, and greatly reduces the time and expense of Sanger follow-up, thus enabling physicians to act on genomic results more quickly.
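A toy sketch of the orthogonal-confirmation logic described above; the variant keys (chrom, pos, ref, alt) and platform call sets are invented examples, not data from the study.

```python
# Variants called independently on two platforms are cross-checked:
# a call is "confirmed" when both chemistries report it, and
# "platform-specific" (needing follow-up) otherwise.
illumina_calls = {("chr1", 12345, "A", "G"), ("chr2", 500, "C", "T"),
                  ("chr7", 999, "G", "A")}
proton_calls = {("chr1", 12345, "A", "G"), ("chr7", 999, "G", "A"),
                ("chrX", 42, "T", "C")}

confirmed = illumina_calls & proton_calls      # seen by both platforms
needs_review = illumina_calls ^ proton_calls   # seen by only one platform

print(len(confirmed), len(needs_review))  # 2 2
```

In practice the comparison is done per-exon over the genome, and each platform also rescues exons the other fails to cover.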

  11. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market extends far beyond the community of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  12. From Physical Chemistry to Quantum Chemistry: How Chemists Dealt with Mathematics

    OpenAIRE

    Kostas Gavroglu; Ana Simões

    2012-01-01

    Discussing the relationship of mathematics to chemistry is closely related to the emergence of physical chemistry and of quantum chemistry. We argue that, perhaps, the most significant issue that the 'mathematization of chemistry' has historically raised is not so much methodological, as it is philosophical: the discussion over the ontological status of theoretical entities which were introduced in the process. A systematic study of such an approach to the mathematization of chemistry may, pe...

  13. Steps toward fault-tolerant quantum chemistry.

    Energy Technology Data Exchange (ETDEWEB)

    Taube, Andrew Garvin

    2010-05-01

Developing quantum chemistry programs on the coming generation of exascale computers will be a difficult task. The programs will need to be fault-tolerant and minimize the use of global operations. This work explores the use of a task-based model that takes a data-centric approach to allocating work to different processes as it applies to quantum chemistry. After introducing the key problems that appear when trying to parallelize a complicated quantum chemistry method such as coupled-cluster theory, we discuss the implications of that model as it pertains to the computational kernel of a coupled-cluster program - matrix multiplication. Also, we discuss the extensions that would be required to build a full coupled-cluster program using the task-based model. Current programming models for high-performance computing are fault-intolerant and use global operations. Those properties are unsustainable as computers scale to millions of CPUs; instead one must recognize that these systems will be hierarchical in structure, prone to constant faults, and global operations will be infeasible. The FAST-OS HARE project is introducing a scale-free computing model to address these issues. This model is hierarchical and fault-tolerant by design, allows for the clean overlap of computation and communication, reducing the network load, does not require checkpointing, and avoids the complexity of many HPC runtimes. Development of an algorithm within this model requires a change in focus from imperative programming to a data-centric approach. Quantum chemistry (QC) algorithms, in particular electronic structure methods, are an ideal test bed for this computing model. These methods describe the distribution of electrons in a molecule, which determines the properties of the molecule. The computational cost of these methods is high, scaling quartically or higher in the size of the molecule, which is why QC applications are major users of HPC resources. The complexity of these algorithms means that …

  14. Quantum chemistry-assisted synthesis route development

    Energy Technology Data Exchange (ETDEWEB)

    Hori, Kenji; Sumimoto, Michinori [Graduate School of Science and Engineering, Yamaguchi University, Tokiwadai, Ube, Yamaguchi 755-8611 (Japan); Murafuji, Toshihiro [Graduate School of Medicine, Yamaguchi University, Yamaguchi, Yamaguchi 753-8512 (Japan)

    2015-12-31

We have been investigating “quantum chemistry-assisted synthesis route development” using in silico screenings and applied the method to several targets. One example was conducted to develop synthesis routes for a urea derivative, namely 1-(4-(trifluoromethyl)-2-oxo-2H-chromen-7-yl)urea. While five synthesis routes were examined, only three routes passed the second in silico screening. Among them, the reaction of 7-amino-4-(trifluoromethyl)-2H-chromen-2-one and O-methyl carbamate with BF₃ as an additive was ranked as the first choice for synthetic work. We were able to experimentally obtain the target compound, even though its yield was as low as 21%. The theoretical result was thus consistent with that observed. A summary of the transition state database (TSDB) is also provided. TSDB is the key to reducing the time required for in silico screenings.

  15. Virtually going green: The role of quantum computational chemistry in reducing pollution and toxicity in chemistry

    Science.gov (United States)

    Stevens, Jonathan

    2017-07-01

Continuing advances in computational chemistry have permitted quantum mechanical calculations to assist research in green chemistry and to contribute to the greening of chemical practice. Presented here are recent examples illustrating the contribution of computational quantum chemistry to green chemistry, including the possibility of using computation as a green alternative to experiments, but also illustrating contributions to greener catalysis and the search for greener solvents. Examples of applications of computation to ambitious projects for green synthetic chemistry using carbon dioxide are also presented.

  16. From transistor to trapped-ion computers for quantum chemistry.

    Science.gov (United States)

    Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E

    2014-01-07

Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology.

  17. Solutions to selected exercise problems in quantum chemistry and spectroscopy

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

Suggested solutions to a number of problems from the collection "Exercise Problems in Quantum Chemistry and Spectroscopy", previously published on ResearchGate (DOI: 10.13140/RG.2.1.4024.8162).

  19. Fluorescent biosensors for high throughput screening of protein kinase inhibitors.

    Science.gov (United States)

    Prével, Camille; Pellerano, Morgan; Van, Thi Nhu Ngoc; Morris, May C

    2014-02-01

    High throughput screening assays aim to identify small molecules that interfere with protein function, activity, or conformation, which can serve as effective tools for chemical biology studies of targets involved in physiological processes or pathways of interest or disease models, as well as templates for development of therapeutics in medicinal chemistry. Fluorescent biosensors constitute attractive and powerful tools for drug discovery programs, from high throughput screening assays, to postscreen characterization of hits, optimization of lead compounds, and preclinical evaluation of candidate drugs. They provide a means of screening for inhibitors that selectively target enzymatic activity, conformation, and/or function in vitro. Moreover, fluorescent biosensors constitute useful tools for cell- and image-based, multiplex and multiparametric, high-content screening. Application of fluorescence-based sensors to screen large and complex libraries of compounds in vitro, in cell-based formats or whole organisms requires several levels of optimization to establish robust and reproducible assays. In this review, we describe the different fluorescent biosensor technologies which have been applied to high throughput screens, and discuss the prerequisite criteria underlying their successful application. Special emphasis is placed on protein kinase biosensors, since these enzymes constitute one of the most important classes of therapeutic targets in drug discovery.

  20. HTRF(®): pioneering technology for high-throughput screening.

    Science.gov (United States)

    Degorce, François

    2006-12-01

Cisbio international pioneered the field of homogeneous fluorescence methodologies, and time-resolved fluorescence resonance energy transfer in particular, through its proprietary technology, HTRF(®). The development was based on Prof. Jean-Marie Lehn's research on rare earth fluorescence properties (awarded the Nobel Prize in Chemistry in 1987) and on Cisbio's expertise in homogeneous time-resolved fluorescence (HTRF). The technology is used in assay development and drug screening, most notably in high-throughput screening applications. This highly powerful technology is particularly applied to the areas of G-protein-coupled receptor and kinase screening, as well as a series of targets related to inflammation, metabolic diseases and CNS disorders.

  1. High-throughput DNA sequencing: a genomic data manufacturing process.

    Science.gov (United States)

    Huang, G M

    1999-01-01

Progress trends in automated DNA sequencing operations are reviewed. Technological development in sequencing instruments, enzymatic chemistry and robotic stations has resulted in an ever-increasing capacity for sequence data production. This progress leads to higher demands on laboratory information management and data quality assessment. High-throughput laboratories face the challenge of organizational management, as well as technology management. Engineering principles of process control should be adopted in this biological data manufacturing procedure. While various systems attempt to provide solutions to automate different parts of, or even the entire, process, new technical advances will continue to change the paradigm and present new challenges.

  2. Data Management for High-Throughput Genomics

    CERN Document Server

    Roehm, Uwe

    2009-01-01

    Today's sequencing technology allows sequencing an individual genome within a few weeks for a fraction of the costs of the original Human Genome project. Genomics labs are faced with dozens of TB of data per week that have to be automatically processed and made available to scientists for further analysis. This paper explores the potential and the limitations of using relational database systems as the data processing platform for high-throughput genomics. In particular, we are interested in the storage management for high-throughput sequence data and in leveraging SQL and user-defined functions for data analysis inside a database system. We give an overview of a database design for high-throughput genomics, how we used a SQL Server database in some unconventional ways to prototype this scenario, and we will discuss some initial findings about the scalability and performance of such a more database-centric approach.
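One hedged way to picture the "analysis inside the database" idea above: store reads in a relational table and push a per-read computation into the engine as a user-defined function. The schema, reads, and GC-content metric are invented examples, and SQLite stands in for the SQL Server setup the paper prototyped.

```python
import sqlite3

def gc_content(seq):
    """Fraction of G/C bases in a read (a stand-in analysis function)."""
    return (seq.count("G") + seq.count("C")) / len(seq)

con = sqlite3.connect(":memory:")
# Register the Python function so SQL queries can call it directly.
con.create_function("gc_content", 1, gc_content)
con.execute("CREATE TABLE reads (id INTEGER PRIMARY KEY, seq TEXT)")
con.executemany("INSERT INTO reads (seq) VALUES (?)",
                [("ACGTACGT",), ("GGGGCCCC",), ("ATATATAT",)])

# The filtering happens inside the database engine, not in client code.
rows = con.execute(
    "SELECT id, gc_content(seq) FROM reads WHERE gc_content(seq) > 0.5"
).fetchall()
print(rows)  # [(2, 1.0)]
```

At genomics scale the same pattern trades client-side scripting for set-oriented queries the database can plan and parallelize.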

  3. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

Gunnar Brunborg

    2014-10-01

The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary widely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  4. High-throughput computing in the sciences.

    Science.gov (United States)

    Morgan, Mark; Grimshaw, Andrew

    2009-01-01

While it is true that the modern computer is many orders of magnitude faster than that of yesteryear, this tremendous growth in CPU clock rates is now over. Unfortunately, the growth in demand for computational power has not abated; whereas researchers a decade ago could simply wait for computers to get faster, today the only solution to the growing need for more powerful computational resources lies in the exploitation of parallelism. Software parallelization generally falls into two broad categories--"true parallel" and high-throughput computing. This chapter focuses on the latter of these two types of parallelism. With high-throughput computing, users can run many copies of their software at the same time across many different computers. This technique for achieving parallelism is powerful in its ability to provide high degrees of parallelism, yet simple in its conceptual implementation. This chapter covers various patterns of high-throughput computing usage and the skills and techniques necessary to take full advantage of them. By utilizing numerous examples and sample codes and scripts, we hope to provide the reader not only with a deeper understanding of the principles behind high-throughput computing, but also with a set of tools and references that will prove invaluable as she explores software parallelism with her own software applications and research.
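The high-throughput pattern described above (many independent copies of one program, with no communication between tasks) can be sketched as follows. The workload function is a stand-in, and a thread pool is used only to illustrate the task-farming pattern, not true multi-machine parallelism.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(seed):
    """Stand-in for one independent run of a scientific code."""
    x = seed
    for _ in range(1000):
        x = (1103515245 * x + 12345) % 2**31   # toy deterministic workload
    return x

# Each parameter value becomes one independent task; the pool schedules
# them across workers exactly as a high-throughput system farms jobs
# across machines.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(simulate, range(8)))
print(len(results))  # 8
```

Because the tasks share no state, the same code scales from one machine to a cluster simply by swapping the executor for a batch scheduler.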

  5. INTRODUCTION OF THE HIGH THROUGHPUT SCREENING SYSTEM

    Institute of Scientific and Technical Information of China (English)

    李元

    2001-01-01

In this article, we introduce the high throughput screening (HTS) system. Its role in new drug research and its current development are described. The relationship between research achievements in genome studies and new screening models for new drugs is emphasized. Personal opinions on current problems in HTS research in China are also offered.

  7. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Romao, Joana; Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for lar

  8. High-Throughput Contact Flow Lithography.

    Science.gov (United States)

    Le Goff, Gaelle C; Lee, Jiseok; Gupta, Ankur; Hill, William Adam; Doyle, Patrick S

    2015-10-01

    High-throughput fabrication of graphically encoded hydrogel microparticles is achieved by combining flow contact lithography in a multichannel microfluidic device and a high capacity 25 mm LED UV source. Production rates of chemically homogeneous particles are improved by two orders of magnitude. Additionally, the custom-built contact lithography instrument provides an affordable solution for patterning complex microstructures on surfaces.

  9. Quantum chemistry and charge transport in biomolecules with superconducting circuits

    Science.gov (United States)

    García-Álvarez, L.; Las Heras, U.; Mezzacapo, A.; Sanz, M.; Solano, E.; Lamata, L.

    2016-06-01

    We propose an efficient protocol for digital quantum simulation of quantum chemistry problems and enhanced digital-analog quantum simulation of transport phenomena in biomolecules with superconducting circuits. Along these lines, we optimally digitize fermionic models of molecular structure with single-qubit and two-qubit gates, by means of Trotter-Suzuki decomposition and Jordan-Wigner transformation. Furthermore, we address the modelling of system-environment interactions of biomolecules involving bosonic degrees of freedom with a digital-analog approach. Finally, we consider gate-truncated quantum algorithms to allow the study of environmental effects.
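A hedged numerical illustration of the Trotter-Suzuki decomposition the protocol relies on, using arbitrary 2x2 Hermitian matrices rather than a molecular Hamiltonian: evolution under H = A + B with [A, B] ≠ 0 is approximated by alternating short evolutions under A and B, and the error shrinks as the number of steps grows.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def expm_hermitian(H, t):
    """exp(-i H t) for Hermitian H via eigendecomposition."""
    vals, vecs = np.linalg.eigh(H)
    return vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T

A, B, t = X, Z, 1.0                    # arbitrary non-commuting example
exact = expm_hermitian(A + B, t)

def trotter(n):
    """First-order Trotter approximation with n steps."""
    step = expm_hermitian(A, t / n) @ expm_hermitian(B, t / n)
    return np.linalg.matrix_power(step, n)

err = [np.linalg.norm(trotter(n) - exact) for n in (1, 10, 100)]
print(err[0] > err[1] > err[2])  # True: error decreases with n
```

In the digital simulations above, each `expm_hermitian` factor corresponds to a short sequence of single- and two-qubit gates.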

  12. The density matrix renormalization group for ab initio quantum chemistry

    CERN Document Server

    Wouters, Sebastian

    2014-01-01

    During the past 15 years, the density matrix renormalization group (DMRG) has become increasingly important for ab initio quantum chemistry. Its underlying wavefunction ansatz, the matrix product state (MPS), is a low-rank decomposition of the full configuration interaction tensor. The virtual dimension of the MPS, the rank of the decomposition, controls the size of the corner of the many-body Hilbert space that can be reached with the ansatz. This parameter can be systematically increased until numerical convergence is reached. The MPS ansatz naturally captures exponentially decaying correlation functions. Therefore DMRG works extremely well for noncritical one-dimensional systems. The active orbital spaces in quantum chemistry are however often far from one-dimensional, and relatively large virtual dimensions are required to use DMRG for ab initio quantum chemistry (QC-DMRG). The QC-DMRG algorithm, its computational cost, and its properties are discussed. Two important aspects to reduce the computational co...
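The low-rank MPS decomposition described above can be illustrated with a short numpy sketch (not the QC-DMRG implementation itself): sequential SVDs factor a full state vector into site tensors, and truncating the singular-value spectrum at each step would bound the virtual dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                   # sites/orbitals, local dimension 2
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)

# Sequential SVDs turn the full tensor into a matrix product state.
tensors, rest = [], psi.reshape(1, -1)
for _ in range(n - 1):
    chi, dim = rest.shape
    u, s, vh = np.linalg.svd(rest.reshape(chi * 2, dim // 2),
                             full_matrices=False)
    tensors.append(u.reshape(chi, 2, -1))   # site tensor A[alpha, s, beta]
    rest = np.diag(s) @ vh                  # push the weights to the right
tensors.append(rest.reshape(rest.shape[0], 2, 1))

# Contract the MPS back into a full state vector.
vec = tensors[0].reshape(2, -1)
for A in tensors[1:]:
    chi = A.shape[0]
    vec = (vec @ A.reshape(chi, -1)).reshape(-1, A.shape[2])

# Without truncation the decomposition is exact.
print("exact reconstruction:", np.allclose(vec.ravel(), psi))
```

Keeping only the largest chi_max singular values at each SVD gives the variational ansatz whose rank parameter the abstract describes: larger virtual dimension, larger reachable corner of Hilbert space.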

  13. Computational quantum chemistry and adaptive ligand modeling in mechanistic QSAR.

    Science.gov (United States)

    De Benedetti, Pier G; Fanelli, Francesca

    2010-10-01

    Drugs are adaptive molecules. They realize this peculiarity by generating different ensembles of prototropic forms and conformers that depend on the environment. Among the impressive amount of available computational drug discovery technologies, quantitative structure-activity relationship approaches that rely on computational quantum chemistry descriptors are the most appropriate to model adaptive drugs. Indeed, computational quantum chemistry descriptors are able to account for the variation of the intramolecular interactions of the training compounds, which reflect their adaptive intermolecular interaction propensities. This enables the development of causative, interpretive and reasonably predictive quantitative structure-activity relationship models, and, hence, sound chemical information finalized to drug design and discovery.

  14. Quantum Mechanics in Chemistry (by Jack Simons and Jeff Nichols)

    Science.gov (United States)

    McCallum, C. Michael

    1998-12-01

Topics in Physical Chemistry Series. Oxford University Press: New York, 1997. xxiii + 612 pp. Illustrations. ISBN 0-19-508200-1. $75.00. One of the problems faced by graduate-level quantum mechanics courses in chemistry is that there is often little time for studying chemical problems. Students must learn so much matrix algebra and notation that a first-semester course seems more like a math or physics course than chemistry. Another problem is the focus of most graduate texts. Excellent texts, such as those by Sakurai, and older treatments, such as Messiah and Cohen-Tannoudji, offer comprehensive mathematical rigor to go along with chemistry problems, but their intended audience is hard-core theoretical or physical chemistry students. More general requirements, such as reaction-path dynamics, structure and term symbols, and symmetry in quantum mechanical problems, are often left behind. Schatz and Ratner's book Quantum Mechanics in Chemistry (Prentice Hall) is one book that fills this gap (at least for second-semester students); Simons and Nichols' new book is another, but it is a book that requires revision before it can be seriously considered.

  15. Microfabricated high-throughput electronic particle detector

    Science.gov (United States)

    Wood, D. K.; Requa, M. V.; Cleland, A. N.

    2007-10-01

We describe the design, fabrication, and use of a radio frequency reflectometer integrated with a microfluidic system, applied to the very high-throughput measurement of micron-scale particles passing through the sensor region in a microfluidic channel. The device operates as a microfabricated Coulter counter [U.S. Patent No. 2656508 (1953)], similar to a design we have described previously, but with significantly improved electrode geometry as well as electronic tuning of the reflectometer; together, the two improvements yield more than a factor of 10 improvement in the signal to noise and in the diametric discrimination of single particles. We demonstrate the high-throughput discrimination of polystyrene beads with diameters in the 4-10 μm range, achieving diametric resolutions comparable to the intrinsic spread of diameters in the bead distribution, at rates in excess of 15×10⁶ beads/h.
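Why a resistive-pulse sensor can discriminate bead diameters at all comes down to the cubic scaling of the signal with particle size. A sketch using the textbook small-particle (Maxwell) approximation for a Coulter-type pore; the pore dimensions below are illustrative and are not taken from the paper:

```python
# Relative resistance pulse for an insulating sphere of diameter d in a
# cylindrical pore of diameter D and length L, in the d << D limit:
# dR/R ~ d^3 / (D^2 L). Illustrative geometry, not the authors' device.
def pulse(d_um, D_um=20.0, L_um=40.0):
    return d_um**3 / (D_um**2 * L_um)

for d in (4.0, 6.0, 8.0, 10.0):
    print(f"{d:4.1f} um bead: dR/R = {pulse(d):.2e}")

# Cubic scaling: a 10 um bead gives (10/4)^3 ~ 15.6x the signal of a
# 4 um bead, which is what makes diametric discrimination possible.
```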

  16. HIGH THROUGHPUT DRILLING OF TITANIUM ALLOYS

    Institute of Scientific and Technical Information of China (English)

    LI Rui; SHIH Albert Jau-Min

    2007-01-01

The experiments of high throughput drilling of Ti-6Al-4V at 183 m/min cutting speed and 156 mm³/s material removal rate using a 4 mm diameter WC-Co spiral point drill are conducted. At this material removal rate, it took only 0.57 s to drill a hole in a 6.35 mm thick Ti plate. Supplying the cutting fluid via through-the-drill holes and the balance of cutting speed and feed have proven to be critical for drill life. An inverse heat transfer model is developed to predict the heat flux and the drill temperature distribution in drilling. A three-dimensional finite element modeling of drilling is conducted to predict the thrust force and torque. Experimental result demonstrates that, using proper machining process parameters, tool geometry, and fine-grained WC-Co tool material, the high throughput machining of Ti alloy is technically feasible.

  17. Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry

    OpenAIRE

    Rappoport, Dmitrij; Galvin, Cooper J.; Zubarev, Dmitry; Aspuru-Guzik, Alan

    2014-01-01

    While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reacti...

  18. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high dimensional neuroinformatic representations, an index containing O(10³-10⁴) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science, including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  19. Quantum Chemistry in Nanoscale Environments: Insights on Surface-Enhanced Raman Scattering and Organic Photovoltaics

    Science.gov (United States)

    Olivares-Amaya, Roberto

The understanding of molecular effects in nanoscale environments is becoming increasingly relevant for various emerging fields. These include spectroscopy for molecular identification as well as in finding molecules for energy harvesting. Theoretical quantum chemistry has been increasingly useful to address these phenomena to yield an understanding of these effects. In the first part of this dissertation, we study the chemical effect of surface-enhanced Raman scattering (SERS). We use quantum chemistry simulations to study the metal-molecule interactions present in these systems. We find that the excitations that provide a chemical enhancement contain a mixed contribution from the metal and the molecule. Moreover, using atomistic studies we propose an additional source of enhancement, where a transition metal dopant surface could provide an additional enhancement. We also develop methods to study the electrostatic effects of molecules in metallic environments. We study the importance of image-charge effects, as well as field-bias to molecules interacting with perfect conductors. The atomistic modeling and the electrostatic approximation enable us to study the effects of the metal interacting with the molecule in a complementary fashion, which provides a better understanding of the complex effects present in SERS. In the second part of this dissertation, we present the Harvard Clean Energy Project, a high-throughput approach for a large-scale computational screening and design of organic photovoltaic materials. We create molecular libraries to search for candidate structures and use quantum chemistry, machine learning and cheminformatics methods to characterize these systems and find structure-property relations. The scale of this study requires an equally large computational resource, and we rely on distributed volunteer computing to obtain these properties. In the third part of this dissertation we present our work related to the acceleration of electronic structure

  20. Quantum kernel applications in medicinal chemistry.

    Science.gov (United States)

    Huang, Lulu; Massa, Lou

    2012-07-01

    Progress in the quantum mechanics of biological molecules is being driven by computational advances. The notion of quantum kernels can be introduced to simplify the formalism of quantum mechanics, making it especially suitable for parallel computation of very large biological molecules. The essential idea is to mathematically break large biological molecules into smaller kernels that are calculationally tractable, and then to represent the full molecule by a summation over the kernels. The accuracy of the kernel energy method (KEM) is shown by systematic application to a great variety of molecular types found in biology. These include peptides, proteins, DNA and RNA. Examples are given that explore the KEM across a variety of chemical models, and to the outer limits of energy accuracy and molecular size. KEM represents an advance in quantum biology applicable to problems in medicine and drug design.
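The "summation over the kernels" can be made concrete with the standard double-kernel KEM expression, E ≈ Σ_{a<b} E_ab − (n−2) Σ_a E_a for n kernels. The sketch below applies it to a toy pairwise-additive energy model, for which the expression is exact by counting; for real quantum-mechanical energies it is only approximate:

```python
import numpy as np

rng = np.random.default_rng(1)
n_atoms, n_kernels = 12, 4
e = rng.normal(size=n_atoms)                       # one-body energies
J = np.triu(rng.normal(size=(n_atoms, n_atoms)), 1)  # pairwise terms

# Partition the "molecule" into kernels of atoms.
kernels = np.array_split(np.arange(n_atoms), n_kernels)

def E(atoms):
    """Energy of a fragment in the toy pairwise-additive model."""
    atoms = np.asarray(atoms)
    return e[atoms].sum() + J[np.ix_(atoms, atoms)].sum()

E_full = E(np.arange(n_atoms))

# KEM: sum of double-kernel energies minus over-counted single kernels.
n = n_kernels
E_kem = sum(E(np.concatenate([kernels[a], kernels[b]]))
            for a in range(n) for b in range(a + 1, n))
E_kem -= (n - 2) * sum(E(k) for k in kernels)

print(E_full, E_kem)   # agree (up to rounding) for this additive model
```

Each one-body term appears in n−1 double-kernel energies and is subtracted n−2 times, so every contribution is counted exactly once; the practical gain is that only kernel-sized quantum calculations are ever performed.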

  1. State of the art in quantum chemistry today

    Science.gov (United States)

    Stepanov, Nikolai F.

    2004-01-01

Modern quantum chemistry is the quantum theory of the structure and dynamics of molecular systems. Over almost eight decades, its development has given birth to many concepts of modern chemistry and produced many calculation techniques that are widely used to obtain both preliminary and high-precision information on molecular properties. By the end of the twentieth century it had become a real fundamental basis of chemistry and an active tool for the qualitative interpretation of structural features and physical and chemical properties, including the dynamics of chemical transformations. It has proved a very powerful means of obtaining quantitative results for molecular clusters as well as for isolated molecules in free states and in external fields. The computational programs created in the last decades of the twentieth century, and permanently refined since, provide reliable quantitative information on molecular equilibrium configurations, harmonic vibrational frequencies and anharmonic force constants, frequencies and intensities of the first electronic transitions, formation energies and potential barriers, the parameters important for the interpretation of ESR and NMR spectra, electric and magnetic moments, and many other characteristics of molecular systems. The last two decades have turned many quantum chemists toward comparatively large molecules, especially those with distinctly pronounced biological activity. Nevertheless, small molecules, for which calculated results can claim the highest precision, still serve as a strong attractor for those who deal with methodological as well as applied problems.

  2. Quantum Mechanics and Conceptual Change in High School Chemistry Textbooks.

    Science.gov (United States)

    Shiland, Thomas W.

    1997-01-01

    Examines the presentation of quantum mechanics in eight secondary chemistry texts for elements associated with a conceptual change model: (1) dissatisfaction; (2) intelligibility; (3) plausibility; and (4) fruitfulness. Reports that these elements were not present in sufficient quantities to promote conceptual change. Presents recommendations for…

  3. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  4. Quantum information and computation for chemistry

    CERN Document Server

    Kais, Sabre; Rice, Stuart A

    2014-01-01

    Examines the intersection of quantum information and chemical physics The Advances in Chemical Physics series is dedicated to reviewing new and emerging topics as well as the latest developments in traditional areas of study in the field of chemical physics. Each volume features detailed comprehensive analyses coupled with individual points of view that integrate the many disciplines of science that are needed for a full understanding of chemical physics. This volume of the series explores the latest research findings, applications, and new research paths from the quantum information science

  5. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and for the data mining and analysis tools needed to generate new leads from experimental protein drug target structures.

  6. Quantum chemistry, band structures and polymers

    Science.gov (United States)

    André, Jean-Marie

    2012-06-01

A short review is given of the long way from the first calculations on polyenes after the Second World War to recent electronic devices such as Organic Light Emitting Diodes and Photovoltaic Cells. It shows how quantum chemical methods, on the one side, and synthesis and experiment, on the other, have (or should have) interacted as incentives for new methods and technologies.

  7. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    Science.gov (United States)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-07-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi-Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources.
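The block diagonality the authors exploit can be illustrated on the two-site Fermi-Hubbard model from the abstract: the Hamiltonian conserves particle number N and spin projection Sz, so the half-filled, Sz = 0 physics lives in a 4-dimensional block that fits on two qubits. A numpy sketch of the symmetry argument (not the authors' operator-space reduction scheme; t and U values are illustrative):

```python
import numpy as np

# Jordan-Wigner ladder operators on 4 modes ordered (1up, 1dn, 2up, 2dn).
a = np.array([[0., 1.], [0., 0.]])      # on-site annihilator (|1> occupied)
Z, I2 = np.diag([1., -1.]), np.eye(2)

def mode_op(j, local, n=4):
    ops = [Z] * j + [local] + [I2] * (n - j - 1)
    out = np.array([[1.]])
    for o in ops:
        out = np.kron(out, o)
    return out

c = [mode_op(j, a) for j in range(4)]
cd = [m.T for m in c]
num = [cd[j] @ c[j] for j in range(4)]

t, U = 1.0, 4.0                          # illustrative parameters
H = -t * (cd[0] @ c[2] + cd[2] @ c[0] + cd[1] @ c[3] + cd[3] @ c[1])
H += U * (num[0] @ num[1] + num[2] @ num[3])

# H is block diagonal in N and Sz; select the half-filled, Sz = 0 block.
N = sum(num)
Sz = 0.5 * (num[0] - num[1] + num[2] - num[3])
sel = np.where((np.diag(N) == 2) & (np.diag(Sz) == 0))[0]
block = H[np.ix_(sel, sel)]              # 4x4: two qubits suffice

e0 = np.linalg.eigvalsh(block)[0]
exact = (U - np.sqrt(U**2 + 16 * t**2)) / 2   # textbook singlet energy
print(block.shape, e0, exact)
```

The 16-dimensional Fock space collapses to a 4-dimensional sector, and the block reproduces the known singlet ground-state energy (U − √(U² + 16t²))/2; the price, as the abstract notes, is a Hamiltonian with more (and less local) terms after the reduction.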

  8. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  9. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  10. Control through operators for quantum chemistry

    CERN Document Server

    Laurent, Philippe; Salomon, Julien; Turinici, Gabriel

    2012-01-01

    We consider the problem of operator identification in quantum control. The free Hamiltonian and the dipole moment are searched such that a given target state is reached at a given time. A local existence result is obtained. As a by-product, our works reveals necessary conditions on the laser field to make the identification feasible. In the last part of this work, some algorithms are proposed to compute effectively these operators.

  11. Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.

    Science.gov (United States)

    Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán

    2014-03-11

    While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.

  12. The Dalton quantum chemistry program system

    DEFF Research Database (Denmark)

    Aidas, Kestutis; Angeli, C.; Bak, K.L.

    2014-01-01

Dalton is a powerful general-purpose program system for the study of molecular electronic structure at the Hartree–Fock, Kohn–Sham, multiconfigurational self-consistent-field, Møller–Plesset, configuration-interaction, and coupled-cluster levels of theory. Apart from the total energy, a wide vari… …-medium and quantum-mechanics/molecular-mechanics models. Large molecules may be studied using linear-scaling and massively parallel algorithms. Dalton is distributed at no cost from http://www.daltonprogram.org for a number of UNIX platforms.

  13. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; in order to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not have to have optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening of local distribution variations of specific bio-molecules in a tissue or in screening of multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives information on the reaction rate. The same microarray can be used many times; thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
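Extracting a rate from the intensity-versus-time trace of one microarray spot amounts to fitting the early, linear part of the product signal. A sketch on synthetic data; the linear initial-rate treatment is standard enzyme kinetics, not code from the dissertation:

```python
import numpy as np

# Synthetic time course of product signal on one spot: baseline plus
# linear product growth plus camera noise (all values illustrative).
rng = np.random.default_rng(7)
t = np.linspace(0, 60, 31)                   # s
true_rate = 2.5                              # intensity units / s
signal = 40 + true_rate * t + rng.normal(0, 1.5, t.size)

# Least-squares line: the slope estimates the initial reaction rate.
rate, intercept = np.polyfit(t, signal, 1)
print(f"fitted rate: {rate:.2f} intensity units/s (true {true_rate})")
```

In an HTS setting this fit is repeated per spot per frame stack, which is why a single CCD exposure sequence can yield hundreds of kinetic traces in parallel.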

  14. Quantum Chemistry of Solids LCAO Treatment of Crystals and Nanostructures

    CERN Document Server

    Evarestov, Robert A

    2012-01-01

Quantum Chemistry of Solids delivers a comprehensive account of the main features and possibilities of LCAO methods for first-principles calculations of the electronic structure of periodic systems. The first part describes the basic theory underlying the LCAO methods applied to periodic systems and the use of Hartree-Fock (HF), density functional theory (DFT) and hybrid Hamiltonians. Translation and site symmetry considerations are included to establish the connection between k-space solid-state physics and real-space quantum chemistry. The inclusion of electron correlation effects for periodic systems is considered on the basis of localized crystalline orbitals. The possibilities of LCAO methods for chemical bonding analysis in periodic systems are discussed. The second part deals with the applications of LCAO methods to calculations of bulk crystal properties, including magnetic ordering and crystal structure optimization. In the second edition two new chapters are added in the application part II of t...

  15. A high-throughput neutron spectrometer

    Science.gov (United States)

    Stampfl, Anton; Noakes, Terry; Bartsch, Friedl; Bertinshaw, Joel; Veliscek-Carolan, Jessica; Nateghi, Ebrahim; Raeside, Tyler; Yethiraj, Mohana; Danilkin, Sergey; Kearley, Gordon

    2010-03-01

A cross-disciplinary high-throughput neutron spectrometer is currently under construction at OPAL, ANSTO's open-pool light-water research reactor. The spectrometer is based on the design of the Be-filter spectrometer (FANS) operating at the National Institute of Standards and Technology research reactor in the USA. The ANSTO filter spectrometer will be switched in and out with another neutron spectrometer, the triple-axis spectrometer Taipan. Thus two distinct types of neutron spectrometer will be accessible: one specialized for phonon dispersion analysis and the other, the filter spectrometer, designed specifically to measure vibrational densities of states. A summary of the design will be given along with a detailed ray-tracing analysis, and some preliminary results from the spectrometer will be presented.

  16. Applications of High Throughput Nucleotide Sequencing

    DEFF Research Database (Denmark)

    Waage, Johannes Eichler

The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come… …-sequencing: a study of the effects on alternative RNA splicing of knockout of the nonsense-mediated RNA decay system in Mus, using digital gene expression and a custom-built exon-exon junction mapping pipeline, is presented (article I). Evolved from this work, a Bioconductor package, spliceR, for classifying alternative splicing events and the coding potential of isoforms from full isoform deconvolution software, such as Cufflinks, is presented (article II). Finally, a study using 5'-end RNA-seq for alternative promoter detection between healthy patients and patients with acute promyelocytic leukemia is presented (article III...

  17. Quasi-classical alternatives in quantum chemistry

    CERN Document Server

    Gineityte, V

    2014-01-01

The article contains an overview of the author's achievements in the development of alternative quantum-chemical approaches oriented towards reviving the classical tradition of qualitative chemical thinking instead of obtaining numerical results. This tradition is based mainly on principles (rules) of additivity, transferability and locality of molecular properties. Accordingly, model Hamiltonian matrices are used in the approaches under development (called quasi-classical alternatives), wherein algebraic parameters play the role of matrix elements; these parameters are additionally assumed to be transferable between similar atoms and/or atomic orbitals. Further, passing to delocalized descriptions of electronic structure (as usual) is expected to be the main source of difficulty in formulating quasi-classical alternatives. In the framework of the canonical method of molecular orbitals (MOs), delocalization is shown to be partially avoidable by invoking a recently-suggested approach to sec...

  18. Quantum Chemistry via Walks in Determinant Space

    Energy Technology Data Exchange (ETDEWEB)

    Umrigar, Cyrus J. [Cornell Univ., Ithaca, NY (United States)

    2016-01-05

There are many chemical questions of practical interest to the DOE that could be answered if there were an electronic structure method that provided consistently accurate results for all systems at an affordable computational cost. The coupled cluster method with single, double and perturbative triple excitations (CCSD(T)) is the most frequently used high-order method, but it has known deficiencies, e.g., in the description of stretched bonds. The full configuration interaction (FCI) method is the most robust method for treating electronic correlations, but it is little used because its computational cost scales exponentially in the size of the system. The largest calculation that has been done to date employed 10 billion determinants. In this regard, there was a major advance in 2010. The Alavi group at Cambridge University developed a stochastic approach to FCI --- combining it with ideas from quantum Monte Carlo (QMC) --- called FCIQMC, which allows one to reach a far larger number of determinants in certain circumstances. The computational cost is exponential in the system and basis size but with a much reduced exponent compared to conventional FCI. In this project Umrigar's group made several major improvements to the FCIQMC method that increased its efficiency by many orders of magnitude. In addition this project resulted in a cross-fertilization of ideas between the FCIQMC method, the older phaseless auxiliary-field quantum Monte Carlo (AFQMC) method developed by Zhang and Krakauer (two of the PIs of this project), and symmetry-restored wavefunctions developed by Scuseria (also a PI of this project).

  19. Let Students Derive, by Themselves, Two-Dimensional Atomic and Molecular Quantum Chemistry from Scratch

    Science.gov (United States)

    Ge, Yingbin

    2016-01-01

    Hands-on exercises are designed for undergraduate physical chemistry students to derive two-dimensional quantum chemistry from scratch for the H atom and H2 molecule, both in the ground state and in excited states. By reducing the mathematical complexity of traditional quantum chemistry teaching, these exercises can be completed…

  20. High-throughput DNA droplet assays using picoliter reactor volumes.

    Science.gov (United States)

    Srisa-Art, Monpichar; deMello, Andrew J; Edel, Joshua B

    2007-09-01

    The online characterization and detection of individual droplets at high speeds, low analyte concentrations, and perfect detection efficiencies is a significant challenge underpinning the application of microfluidic droplet reactors to high-throughput chemistry and biology. Herein, we describe the integration of confocal fluorescence spectroscopy as a high-efficiency detection method for droplet-based microfluidics. Compared with conventional laminar-flow fluidics, the approach addresses issues such as surface contamination, rapid mixing, rapid detection, and low detection limits. Using such a system, droplet size, shape, formation frequency, and composition can be measured accurately and precisely at kilohertz frequencies. Taking advantage of this approach, we demonstrate a high-throughput biological assay based on fluorescence resonance energy transfer (FRET). By attaching a FRET donor (Alexa Fluor 488) to streptavidin and labeling a FRET acceptor (Alexa Fluor 647) on one DNA strand and biotin on the complementary strand, donor and acceptor molecules are brought into proximity through streptavidin-biotin binding, resulting in FRET. Fluorescence bursts of the donor and acceptor from each droplet can be monitored simultaneously using separate avalanche photodiode detectors operating in single-photon counting mode. Binding assays were investigated and compared at fixed streptavidin and varying DNA concentrations. Binding curves fit perfectly to Hill-Waud models, and the binding ratio between streptavidin and biotin was evaluated and found to agree with the number of biotin binding sites on streptavidin. The FRET efficiency for this FRET pair was also investigated from the binding results. The efficiency results show that this detection system can precisely measure FRET even at low FRET efficiencies.
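From the simultaneously recorded donor and acceptor bursts, a per-droplet FRET efficiency can be estimated with the standard proximity ratio E = I_A / (I_A + γ·I_D). The sketch below assumes γ = 1 (no detection correction) and uses made-up photon counts.

```python
def fret_efficiency(donor_counts, acceptor_counts, gamma=1.0):
    """Proximity-ratio FRET efficiency for one droplet's photon bursts.

    gamma corrects for unequal donor/acceptor detection efficiencies;
    gamma = 1.0 here is an assumption of this sketch.
    """
    return acceptor_counts / (acceptor_counts + gamma * donor_counts)

# Hypothetical bursts from the two avalanche photodiodes for three droplets
donor = [900, 500, 120]
acceptor = [100, 500, 880]
efficiencies = [fret_efficiency(d, a) for d, a in zip(donor, acceptor)]
```

A low ratio indicates unbound strands; a high ratio indicates donor and acceptor held in proximity by the streptavidin-biotin bridge.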

  1. NWChem: Quantum Chemistry Simulations at Scale

    Energy Technology Data Exchange (ETDEWEB)

    Apra, Edoardo; Kowalski, Karol; Hammond, Jeff R.; Klemm, Michael

    2015-01-17

    Methods based on quantum mechanics equations have been developed since the 1930s with the purpose of accurately studying the electronic structure of molecules. However, it is only during the last two decades that intense development of new computational algorithms has opened the possibility of performing accurate simulations of challenging molecular processes with high-order many-body methods. A wealth of evidence indicates that the proper inclusion of instantaneous interactions between electrons (or the so-called electron correlation effects) is indispensable for the accurate characterization of chemical reactivity, molecular properties, and interactions of light with matter. The availability of reliable methods for benchmarking of medium-size molecular systems also provides a unique chance to propagate high-level accuracy across spatial scales through multiscale methodologies. Some of these methods have the potential to utilize computational resources in an efficient way since they are characterized by high numerical complexity and an appropriate level of data granularity, which can be efficiently distributed over multi-processor architectures. The broad spectrum of coupled cluster (CC) methods falls into this class of methodologies. Several recent CC implementations have clearly demonstrated the scalability of CC formalisms on architectures composed of hundreds of thousands of computational cores. In this context NWChem provides a collection of Tensor Contraction Engine (TCE)-generated parallel implementations of various coupled cluster methods capable of taking advantage of many thousands of cores on leadership-class parallel architectures.

  2. Bond additivity corrections for quantum chemistry methods

    Energy Technology Data Exchange (ETDEWEB)

    C. F. Melius; M. D. Allendorf

    1999-04-01

    In the 1980s, the authors developed a bond-additivity correction procedure for quantum chemical calculations called BAC-MP4, which has proven reliable in calculating the thermochemical properties of molecular species, including radicals as well as stable closed-shell species. New Bond Additivity Correction (BAC) methods have been developed for the G2 method, BAC-G2, as well as for a hybrid DFT/MP2 method, BAC-Hybrid. These BAC methods use a new form of BAC corrections, involving atomic, molecular, and bond-wise additive terms. These terms enable one to treat positive and negative ions as well as neutrals. The BAC-G2 method reduces errors in the G2 method due to nearest-neighbor bonds. The parameters within the BAC-G2 method depend only on atom types. Thus the BAC-G2 method can be used to determine the parameters needed by BAC methods involving lower levels of theory, such as BAC-Hybrid and BAC-MP4. The BAC-Hybrid method should scale well for large molecules. The BAC-Hybrid method uses the differences between the DFT and MP2 results as an indicator of the method's accuracy, while the BAC-G2 method uses its internal methods (G1 and G2MP2) to provide an indicator of its accuracy. Indications of the average error as well as worst cases are provided for each of the BAC methods.
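The bond-wise additive part of such a correction can be sketched in a few lines: the corrected heat of formation is the raw ab initio value plus a sum of per-bond terms. The correction values below are illustrative placeholders, not fitted BAC-MP4 or BAC-G2 parameters.

```python
# Hypothetical per-bond corrections in kcal/mol (placeholders, not real BAC fits)
BOND_CORRECTIONS = {("C", "C"): -1.10, ("C", "H"): -0.28, ("H", "O"): -0.45}

def bac_corrected(raw_hof, bonds):
    """Apply bond-additive corrections to a raw heat of formation.

    raw_hof: uncorrected value in kcal/mol.
    bonds: list of (atom, atom) pairs, one per bond in the molecule.
    """
    correction = sum(BOND_CORRECTIONS[tuple(sorted(b))] for b in bonds)
    return raw_hof + correction

# Ethane: one C-C bond and six C-H bonds (raw value is a made-up number)
hof = bac_corrected(-18.0, [("C", "C")] + [("C", "H")] * 6)
```

Because the terms depend only on bond (atom-pair) types, the same small parameter table transfers across molecules, which is what makes the scheme cheap to apply.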

  3. High Throughput Screening for Neurodegeneration and Complex Disease Phenotypes

    OpenAIRE

    Varma, Hemant; Lo, Donald C.; Stockwell, Brent R.

    2008-01-01

    High throughput screening (HTS) for complex diseases is challenging. This stems from the fact that complex phenotypes are difficult to adapt to rapid, high throughput assays. We describe the recent development of high throughput and high-content screens (HCS) for neurodegenerative diseases, with a focus on inherited neurodegenerative disorders, such as Huntington's disease. We describe, among others, HTS assays based on protein aggregation, neuronal death, caspase activation and mutant protei...

  4. Block-adaptive quantum mechanics: an adaptive divide-and-conquer approach to interactive quantum chemistry.

    Science.gov (United States)

    Bosson, Maël; Grudinin, Sergei; Redon, Stephane

    2013-03-05

    We present a novel Block-Adaptive Quantum Mechanics (BAQM) approach to interactive quantum chemistry. Although quantum chemistry models are known to be computationally demanding, we achieve interactive rates by focusing computational resources on the most active parts of the system. BAQM is based on a divide-and-conquer technique and constrains some nucleus positions and some electronic degrees of freedom on the fly to simplify the simulation. As a result, each time step may be performed significantly faster, which in turn may accelerate attraction to the neighboring local minima. By applying our approach to the non-self-consistent Atom Superposition and Electron Delocalization Molecular Orbital theory, we demonstrate interactive rates and efficient virtual prototyping for systems containing more than a thousand atoms on a standard desktop computer.

  5. High-throughput rod-induced electrospinning

    Science.gov (United States)

    Wu, Dezhi; Xiao, Zhiming; Teh, Kwok Siong; Han, Zhibin; Luo, Guoxi; Shi, Chuan; Sun, Daoheng; Zhao, Jinbao; Lin, Liwei

    2016-09-01

    A high-throughput electrospinning process, acting directly on flat polymer solution surfaces and induced by a moving insulating rod, has been proposed and demonstrated. Rods made of either phenolic resin or paper, with diameters of 1-3 cm and resistances of about 100-500 MΩ, have been successfully utilized in the process. The rod is placed approximately 10 mm above the flat polymer solution surface and moved at 0.005-0.4 m s-1; this causes the solution to generate multiple liquid jets under an applied voltage of 15-60 kV in the tip-less electrospinning process. The local electric field induced by the rod can boost electrohydrodynamic instability in order to generate Taylor cones and liquid jets. Experimentally, it is found that a large rod diameter and a small solution-to-rod distance enhance the local electric field and reduce the magnitude of the required applied voltage. In the prototype setup with poly(ethylene oxide) polymer solution, an area of 5 cm  ×  10 cm and an applied voltage of 60 kV, the maximum throughput of nanofibers is recorded to be approximately 144 g m-2 h-1.
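The quoted figure is an areal throughput: mass collected per unit area per unit time. The mass and duration in the sketch below are hypothetical values chosen to reproduce the ~144 g m-2 h-1 reported for the 5 cm × 10 cm prototype.

```python
def areal_throughput(mass_g, area_m2, hours):
    """Return nanofiber throughput in g per square metre per hour."""
    return mass_g / (area_m2 * hours)

# Hypothetical: 0.72 g of fiber collected over the 5 cm x 10 cm area in 1 hour
rate = areal_throughput(mass_g=0.72, area_m2=0.05 * 0.10, hours=1.0)
```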

  6. Patterning cell using Si-stencil for high-throughput assay

    KAUST Repository

    Wu, Jinbo

    2011-01-01

    In this communication, we report a newly developed cell patterning methodology based on a silicon stencil, which exhibits advantages such as easy handling, reusability, a hydrophilic surface and mature fabrication technologies. Cell arrays obtained by this method were used to investigate cell growth under a temperature gradient, which demonstrated the possibility of studying cell behavior in a high-throughput assay. This journal is © The Royal Society of Chemistry 2011.

  7. Development of massively parallel quantum chemistry program SMASH

    Energy Technology Data Exchange (ETDEWEB)

    Ishimura, Kazuya [Department of Theoretical and Computational Molecular Science, Institute for Molecular Science 38 Nishigo-Naka, Myodaiji, Okazaki, Aichi 444-8585 (Japan)

    2015-12-31

    A massively parallel program for quantum chemistry calculations SMASH was released under the Apache License 2.0 in September 2014. The SMASH program is written in the Fortran90/95 language with MPI and OpenMP standards for parallelization. Frequently used routines, such as one- and two-electron integral calculations, are modularized to make program developments simple. The speed-up of the B3LYP energy calculation for (C{sub 150}H{sub 30}){sub 2} with the cc-pVDZ basis set (4500 basis functions) was 50,499 on 98,304 cores of the K computer.
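The scaling figure quoted above implies a parallel efficiency (speed-up divided by core count) of roughly 51%, which can be checked directly:

```python
def parallel_efficiency(speedup, cores):
    """Speed-up divided by core count; 1.0 would be ideal linear scaling."""
    return speedup / cores

# Figures quoted in the abstract: speed-up of 50,499 on 98,304 K-computer cores
eff = parallel_efficiency(50_499, 98_304)
```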

  8. Relativistic quantum chemistry the fundamental theory of molecular science

    CERN Document Server

    Reiher, Markus

    2014-01-01

    Einstein proposed his theory of special relativity in 1905. For a long time it was believed that this theory has no significant impact on chemistry. This view changed in the 1970s when it was realized that (nonrelativistic) Schrödinger quantum mechanics yields results on molecular properties that depart significantly from experimental results. Especially when heavy elements are involved, these quantitative deviations can be so large that qualitative chemical reasoning and understanding is affected. To grasp this, the appropriate many-electron theory has rapidly evolved. Nowadays relativist

  9. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. This approach first selects sequences containing the SNPs present in all alleles of the locus of interest and next calculates the number of base changes necessary to convert a candidate probe sequence to the closest subsequence within the set of sequences that are likely to be present in the sample, including the remainder of the human genome, in order to identify those candidate probes which are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, it is possible that preliminary steps such as PCR amplification are no longer necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest resolution typing can be achieved for the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
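The specificity criterion in the second step can be sketched as a mismatch scan: count the minimum number of base changes needed to turn a candidate probe into any same-length subsequence of the background; a large minimum distance suggests low cross-hybridization risk. The sequences below are made-up toy data, not HLA alleles.

```python
def min_edits_to_background(probe, background):
    """Minimum Hamming distance from `probe` to any same-length window
    of `background` (substitutions only; a simplifying assumption)."""
    k = len(probe)
    best = k
    for i in range(len(background) - k + 1):
        window = background[i:i + k]
        dist = sum(a != b for a, b in zip(probe, window))
        best = min(best, dist)
    return best

background = "ACGTACGTGGCCTTAAACGT"       # toy stand-in for off-target sequence
d = min_edits_to_background("GGCCTTAA", background)   # probe occurs exactly
```

A real pipeline would scan the whole genome plus the other alleles in the sample, and keep only probes whose minimum distance exceeds a specificity threshold.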

  10. Fluorescent Approaches to High Throughput Crystallography

    Science.gov (United States)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    We have shown that by covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high throughput crystallizations. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, as the protein or protein structures are all that show up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. We are now testing the use of high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider

  11. High Throughput Profiling of Molecular Shapes in Crystals

    Science.gov (United States)

    Spackman, Peter R.; Thomas, Sajesh P.; Jayatilaka, Dylan

    2016-02-01

    Molecular shape is important in both crystallisation and supramolecular assembly, yet its role is not completely understood. We present a computationally efficient scheme to describe and classify the molecular shapes in crystals. The method involves a rotation-invariant description of Hirshfeld surfaces in terms of spherical harmonic functions. Hirshfeld surfaces represent the boundaries of a molecule in the crystalline environment, and are widely used to visualise and interpret crystalline interactions. The spherical harmonic descriptions of molecular shapes are compared and classified by means of principal component analysis and cluster analysis. When applied to a series of metals, the method results in a clear classification based on their lattice type. When applied to around 300 crystal structures comprising series of substituted benzenes, naphthalenes and phenylbenzamides, it shows the capacity to classify structures based on chemical scaffolds, chemical isosterism, and conformational similarity. The computational efficiency of the method is demonstrated with an application to over 14 thousand crystal structures. High throughput screening of molecular shapes and interaction surfaces in the Cambridge Structural Database (CSD) using this method has direct applications in drug discovery, supramolecular chemistry and materials design.
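Rotation invariance of such descriptors typically comes from the spherical harmonic power spectrum: from expansion coefficients c_{l,m} of a surface, the per-l power P_l = Σ_m |c_{l,m}|² is unchanged under rotation, so two shapes can be compared via their spectra. The coefficients below are random stand-ins, not actual Hirshfeld-surface expansions.

```python
import numpy as np

def power_spectrum(coeffs):
    """Rotation-invariant power spectrum from spherical harmonic coefficients.

    coeffs: dict mapping degree l to an array of its 2l+1 values c_{l,m}.
    Returns one non-negative power value per degree l.
    """
    return np.array([np.sum(np.abs(c) ** 2) for _, c in sorted(coeffs.items())])

# Random complex coefficients for degrees l = 0..3 (toy stand-in data)
rng = np.random.default_rng(1)
coeffs = {l: rng.standard_normal(2 * l + 1) + 1j * rng.standard_normal(2 * l + 1)
          for l in range(4)}
spec = power_spectrum(coeffs)
```

Shape comparison then reduces to distances between these fixed-length spectra, which feed directly into PCA and clustering as described in the abstract.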

  12. Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules

    CERN Document Server

    Lester, William A; Reynolds, PJ

    1994-01-01

    This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n

  13. Applications of Quantum Chemistry to the Study of Carbon Nanotubes

    Science.gov (United States)

    Jaffe, Richard L.

    2005-01-01

    For several years, scientists at NASA Ames have been studying the properties of carbon nanotubes using various experimental and computational methods. In this talk, I will compare different strategies for using quantum chemistry calculations to describe the electronic structure, deformation and chemical functionalization of single wall carbon nanotubes (SWNT) and the physisorption of small molecules on nanotube surfaces. The SWNT can be treated as an infinite (periodic) or finite length carbon cylinder or as a polycyclic aromatic hydrocarbon (PAH) molecule with an imposed curvature maintained by external constraints (as if it were cut out of the SWNT surface). Calculations are carried out using DFT and MP2 methods and a variety of atomic orbital basis sets from minimal (STO-3G) to valence triple zeta. The optimal approach is based on the particular SWNT property of interest. Examples to be discussed include: nanotube fluorination and other functionalization reactions; coating of nanotubes by water vapor and low-molecular weight organic molecules; and the nature of the interface between SWNT and liquids such as water and amines. In many cases, the quantum chemistry calculations are used to parameterize or validate force fields for molecular dynamics simulations. The results of these calculations have helped explain experimental data and contributed to the design of novel materials and sensors based on carbon nanotubes. Some of this research is described in the following papers:

  14. Fuzzy electron density fragments in macromolecular quantum chemistry, combinatorial quantum chemistry, functional group analysis, and shape-activity relations.

    Science.gov (United States)

    Mezey, Paul G

    2014-09-16

    Conspectus Just as complete molecules have no boundaries and have "fuzzy" electron density clouds approaching zero density exponentially at large distances from the nearest nucleus, a physically justified choice for electron density fragments exhibits similar behavior. Whereas fuzzy electron densities, just as any fuzzy object, such as a thicker cloud on a foggy day, do not lend themselves to easy visualization, one may partially overcome this by using isocontours. Whereas a faithful representation of the complete fuzzy density would need infinitely many such isocontours, nevertheless, by choosing a selected few, one can still obtain a limited pictorial representation. Clearly, such images are of limited value, and one better relies on more complete mathematical representations, using, for example, density matrices of fuzzy fragment densities. A fuzzy density fragmentation can be obtained in an exactly additive way, using the output from any of the common quantum chemical computational techniques, such as Hartree-Fock, MP2, and various density functional approaches. 
Such "fuzzy" electron density fragments properly represented have proven to be useful in a rather wide range of applications, for example, (a) using them as additive building blocks leading to efficient linear scaling macromolecular quantum chemistry computational techniques, (b) the study of quantum chemical functional groups, (c) using approximate fuzzy fragment information as allowed by the holographic electron density theorem, (d) the study of correlations between local shape and activity, including through-bond and through-space components of interactions between parts of molecules and relations between local molecular shape and substituent effects, (e) using them as tools of density matrix extrapolation in conformational changes, (f) physically valid averaging and statistical distribution of several local electron densities of common stoichiometry, useful in electron density databank mining, for

  15. Expression of results in quantum chemistry physical chemistry division commission on physicochemical symbols, terminology and units

    CERN Document Server

    Whiffen, D H

    2013-01-01

    Expression of Results in Quantum Chemistry recommends the appropriate insertion of physical constants in the output information of a theoretical paper in order to make the numerical end results of theoretical work easily transformed to SI units by the reader. The acceptance of this recommendation would circumvent the need for a set of atomic units each with its own symbol and name. It is the traditional use of the phrase "atomic units" in this area which has obscured the real problem. The four SI dimensions of length, mass, time, and current require four physical constants to be permitte

  16. High throughput optoelectronic smart pixel systems using diffractive optics

    Science.gov (United States)

    Chen, Chih-Hao

    1999-12-01

    Recent developments in digital video, multimedia technology and data networks have greatly increased the demand for high bandwidth communication channels and high throughput data processing. Electronics is particularly suited for switching, amplification and logic functions, while optics is more suitable for interconnections and communications with lower energy and crosstalk. In this research, we present the design, testing, integration and demonstration of several optoelectronic smart pixel devices and system architectures. These systems integrate electronic switching/processing capability with parallel optical interconnections to provide high throughput network communication and pipeline data processing. The Smart Pixel Array Cellular Logic processor (SPARCL) is designed in 0.8 µm CMOS and hybrid integrated with Multiple-Quantum-Well (MQW) devices for pipeline image processing. The Smart Pixel Network Interface (SAPIENT) is designed in 0.6 µm GaAs and monolithically integrated with LEDs to implement a highly parallel optical interconnection network. The Translucent Smart Pixel Array (TRANSPAR) design is implemented in two different versions. The first version, TRANSPAR-MQW, is designed in 0.5 µm CMOS and flip-chip integrated with MQW devices to provide 2-D pipeline processing and translucent networking using the Carrier-Sense-Multiple-Access/Collision-Detection (CSMA/CD) protocol. The other version, TRANSPAR-VM, is designed in 1.2 µm CMOS and discretely integrated with VCSEL-MSM (Vertical-Cavity-Surface-Emitting-Laser and Metal-Semiconductor-Metal detectors) chips and driver/receiver chips on a printed circuit board. The TRANSPAR-VM provides an option of using the token ring network protocol in addition to the embedded functions of TRANSPAR-MQW. These optoelectronic smart pixel systems also require micro-optics devices to provide high resolution, high quality optical interconnections and external source arrays. In this research, we describe an innovative

  17. Few-Qubit Magnetic Resonance Quantum Information Processors: Simulating Chemistry and Physics

    CERN Document Server

    Criger, Ben; Baugh, Jonathan

    2012-01-01

    We review recent progress made in quantum information processing (QIP) which can be applied in the simulation of quantum systems and chemical phenomena. The review is focused on quantum algorithms which are useful for quantum simulation of chemistry and advances in nuclear magnetic resonance (NMR) and electron spin resonance (ESR) QIP. Discussions also include a number of recent experiments demonstrating the current capabilities of the NMR QIP for quantum simulation and prospects for spin-based implementations of QIP.

  18. Quantum-State-Resolved Ion-Molecule Chemistry

    Science.gov (United States)

    Chen, Gary; Yang, Tiangang; Campbell, Wesley; Hudson, Eric

    2016-05-01

    We propose a method to achieve quantum-state-resolved ion-molecule chemistry by utilizing cryogenic buffer gas cooling techniques and a combination of ion imaging and mass spectrometry of targets in an RF Paul trap. Cold molecular species produced by a cryogenic buffer gas beam (CBGB) are introduced to target ion species in a linear quadrupole trap (LQT), where ion imaging techniques and time-of-flight mass spectrometry (ToF) are then used to observe the target ions and the charged reaction products [1,2]. By taking advantage of the large ion-neutral interaction cross sections and characteristically long ion trap lifetimes, we can utilize the precision control over quantum states afforded by an ion trap to resolve state-to-state quantum chemical reactions without high-density molecular sample production, well within proposed capabilities. The combination of these two very general cold-species production techniques allows for production and observation of a broad range of ion-neutral reactions. We initially plan to study chemical reactions between carbon ions (sympathetically cooled via laser-cooled beryllium ions) and buffer-gas-cooled water. This work is supported by the US Air Force Office of Scientific Research.

  19. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  20. Walking in the woods with quantum chemistry--applications of quantum chemical calculations in natural products research.

    Science.gov (United States)

    Tantillo, Dean J

    2013-08-01

    This Highlight describes applications of quantum chemical calculations to problems in natural products chemistry, including the elucidation of natural product structures (distinguishing between constitutional isomers, distinguishing between diastereomers, and assigning absolute configuration) and determination of reasonable mechanisms for their formation.

  1. Development of a high-throughput replicon assay for the identification of respiratory syncytial virus inhibitors.

    Science.gov (United States)

    Tiong-Yip, Choi-Lai; Plant, Helen; Sharpe, Paul; Fan, Jun; Rich, Kirsty; Gorseth, Elise; Yu, Qin

    2014-01-01

    Respiratory syncytial virus (RSV) drug discovery has been hindered by the lack of good chemistry starting points and would benefit from robust and convenient assays for high-throughput screening (HTS). In this paper, we present the development and optimization of a 384-well RSV replicon assay that enabled HTS for RSV replication inhibitors with a low bio-containment requirement. The established replicon assay was successfully implemented for high-throughput screening. A validation screen was performed which demonstrated high assay performance and reproducibility. Assay quality was further confirmed via demonstration of appropriate pharmacology for different classes of RSV replication tool inhibitors. RSV replicon and cytotoxicity assays were further developed into a multiplexed format that measured both inhibition of viral replication and cytotoxicity from the same well. This provided a time and cost efficient approach to support lead optimization. In summary, we have developed a robust RSV replicon assay to help expedite the discovery of novel RSV therapeutics.

  2. Gradient Technology for High-Throughput Screening of Interactions between Cells and Nanostructured Materials

    Directory of Open Access Journals (Sweden)

    Andrew Michelmore

    2012-01-01

    We present a novel substrate suitable for the high-throughput analysis of cell response to variations in surface chemistry and nanotopography. Electrochemical etching was used to produce silicon wafers with nanopores between 10 and 100 nm in diameter. Over this substrate and flat silicon wafers, a gradient film ranging from hydrocarbon to carboxylic acid plasma polymer was deposited, with the concentration of surface carboxylic acid groups varying between 0.7 and 3% as measured by XPS. MG63 osteoblast-like cells were then cultured on these substrates and showed the greatest spreading and adhesion on porous silicon with a carboxylic acid group concentration of 2-3%. This method has great potential for high-throughput screening of cell-material interactions, with particular relevance to tissue engineering.

  3. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek Co. Inc. proposes to develop a high throughput, nominal 100 W Hall Effect Thruster (HET). This HET will be sized for small spacecraft (< 180 kg), including...

  4. High Throughput Hall Thruster for Small Spacecraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek is developing a high throughput nominal 100-W Hall Effect Thruster. This device is well sized for spacecraft ranging in size from several tens of kilograms to...

  5. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    Science.gov (United States)

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  6. HIGH THROUGHPUT OF MAP PROCESSOR USING PIPELINE WINDOW DECODING

    Directory of Open Access Journals (Sweden)

    P. Nithya

    2012-11-01

    Turbo codes are among the most efficient error-correcting codes, approaching the Shannon limit. High throughput in a turbo decoder can be achieved by parallelizing several Soft-Input Soft-Output (SISO) units, so that multiple SISO decoders work on the same data frame at the same time. The soft outputs they deliver can be split into three terms: the soft channel input, the a priori input, and the extrinsic value, the last of which is fed into the next iteration. We present a high-throughput Max-Log-MAP processor that supports both single-binary (SB) and double-binary (DB) convolutional turbo codes. Decoding these codes, however, is an iterative process that requires a high computation rate and incurs latency with serial processing techniques. To achieve high throughput and reduce latency, pipeline window (PW) decoding is introduced to support arbitrary frame sizes.

  7. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  8. A Quantum Chemistry Study on Structural Properties of Petroleum Resin

    Institute of Scientific and Technical Information of China (English)

    Wang Daxi; Pan Yueqiu; Zhang Hongye

    2007-01-01

    The geometries of resins with single-layer (SG), double-layer (DG) and triple-layer (TG) structures were calculated with quantum chemistry methods, yielding the geometries and net atomic charges. The calculated average interlayer distances were 0.5348 nm and 0.5051 nm, and the interaction energies were -9.6355 kJ/mol and -32.2803 kJ/mol for resins DG and TG, respectively. Highly electronegative polar atoms can readily form hydrogen bonds with hydrogen atoms of other resin molecules, resulting in resin aggregates. The minimum cross-sectional diameters of resin molecules are too large to enter the pores of zeolite, so they are likely to crack on the surface of the zeolite.

  9. FLASH Assembly of TALENs Enables High-Throughput Genome Editing

    OpenAIRE

    Reyon, Deepak; Tsai, Shengdar Q.; Khayter, Cyd; Foden, Jennifer A.; Sander, Jeffry D.; Joung, J. Keith

    2012-01-01

    Engineered transcription activator-like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the Fast Ligation-based Automatable Solid-phase High-throughput (FLASH) platform, a rapid and cost-effective method we developed to enable ...

  10. Inferential literacy for experimental high-throughput biology.

    Science.gov (United States)

    Miron, Mathieu; Nadon, Robert

    2006-02-01

    Many biologists believe that data analysis expertise lags behind the capacity for producing high-throughput data. One view within the bioinformatics community is that biological scientists need to develop algorithmic skills to meet the demands of the new technologies. In this article, we argue that the broader concept of inferential literacy, which includes understanding of data characteristics, experimental design and statistical analysis, in addition to computation, more adequately encompasses what is needed for efficient progress in high-throughput biology.

  11. Virtual high throughput screening (vHTS) - A perspective

    OpenAIRE

    Subramaniam, Sangeetha; Mehrotra, Monica; Gupta, Dinesh,

    2008-01-01

    With the exponential rise in the number of viable novel drug targets, computational methods are being increasingly applied to accelerate the drug discovery process. Virtual High Throughput Screening (vHTS) is one such established methodology to identify drug candidates from large collections of compound libraries. Although it complements the expensive and time-consuming High Throughput Screening (HTS) of compound libraries, vHTS possesses inherent challenges. The successful vHTS requires the car...

  12. C. elegans in high-throughput drug discovery

    OpenAIRE

    O’Reilly, Linda P.; Cliff J Luke; Perlmutter, David H.; Silverman, Gary A.; Pak, Stephen C.

    2013-01-01

    C. elegans has proven to be a useful model organism for investigating molecular and cellular aspects of numerous human diseases. More recently, investigators have explored the use of this organism as a tool for drug discovery. Although earlier drug screens were labor-intensive and low in throughput, recent advances in high-throughput liquid workflows, imaging platforms and data analysis software have made C. elegans a viable option for automated high-throughput drug screens. This review will ...

  13. Optically encoded microspheres for high-throughput analysis of genes and proteins

    Science.gov (United States)

    Gao, Xiaohu; Han, Mingyong; Nie, Shuming

    2002-06-01

    We have developed a novel optical coding technology for massively parallel and high-throughput analysis of biological molecules. Its unprecedented multiplexing capability is based on the unique optical properties of semiconductor quantum dots (QDs) and the ability to incorporate multicolor QDs into small polymer beads at precisely controlled ratios. The use of 10 intensity levels and 6 colors could theoretically code one million nucleic acid or protein sequences. Imaging and spectroscopic studies indicate that the QD tagged beads are highly uniform and reproducible, yielding bead identification accuracies as high as 99.99 percent under favorable conditions. DNA hybridization results demonstrate that the coding and target signals can be simultaneously read at the single-bead level. This spectral coding technology is expected to open new opportunities in gene expression studies, high-throughput screening, and medical diagnosis.
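The "one million sequences" estimate is a simple combinatorial count. Under one plausible counting convention (n intensity levels per color, including "off", across m colors, excluding the undetectable all-off bead), the capacity is n^m - 1; the function name below is illustrative:

```python
def coding_capacity(levels: int, colors: int) -> int:
    # Each of the m colors can take one of n intensity levels (including
    # "off"); the all-off combination is excluded because such a bead
    # carries no detectable signal.
    return levels ** colors - 1

# 10 intensity levels and 6 QD colors -> 999,999, i.e. ~one million codes
capacity = coding_capacity(10, 6)
```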

  14. Going beyond "no-pair relativistic quantum chemistry".

    Science.gov (United States)

    Liu, Wenjian; Lindgren, Ingvar

    2013-07-07

    The current field of relativistic quantum chemistry (RQC) has been built upon the no-pair and no-retardation approximations. While retardation effects must be treated in a time-dependent manner through quantum electrodynamics (QED) and are hence outside RQC, the no-pair approximation (NPA) has to be removed from RQC for it has some fundamental defects. Both configuration space and Fock space formulations have been proposed in the literature to do this. However, the former is simply wrong, whereas the latter is still incomplete. To resolve the old problems pertinent to the NPA itself and new problems beyond the NPA, we propose here an effective many-body (EMB) QED approach that is in full accordance with standard methodologies of electronic structure. As a first application, the full second order energy E2 of a closed-shell many-electron system subject to the instantaneous Coulomb-Breit interaction is derived, both algebraically and diagrammatically. It is shown that the same E2 can be obtained by means of 3 Goldstone-like diagrams through the standard many-body perturbation theory or 28 Feynman diagrams through the S-matrix technique. The NPA arises naturally by retaining only the terms involving the positive energy states. The potential dependence of the NPA can be removed by adding in the QED one-body counter terms involving the negative energy states, thereby leading to a "potential-independent no-pair approximation" (PI-NPA). The NPA, PI-NPA, EMB-QED, and full QED then span a continuous spectrum of relativistic molecular quantum mechanics.

  15. Exponentially more precise quantum simulation of fermions I: Quantum chemistry in second quantization

    CERN Document Server

    Babbush, Ryan; Kivlichan, Ian D; Wei, Annie Y; Love, Peter J; Aspuru-Guzik, Alán

    2015-01-01

    We introduce novel algorithms for the quantum simulation of molecular systems which are asymptotically more efficient than those based on the Trotter-Suzuki decomposition. We present the first application of a recently developed technique for simulating Hamiltonian evolution using a truncated Taylor series to obtain logarithmic scaling with the inverse of the desired precision, an exponential improvement over all prior methods. The two algorithms developed in this work rely on a second quantized encoding of the wavefunction in which the state of an $N$ spin-orbital system is encoded in ${\\cal O}(N)$ qubits. Our first algorithm has time complexity $\\widetilde{\\cal O}(N^8 t)$. Our second algorithm involves on-the-fly computation of molecular integrals, in a way that is exponentially more precise than classical sampling methods, by using the truncated Taylor series simulation technique. Our second algorithm has the lowest asymptotic time complexity of any second quantized approach to quantum chemistry simulation...

  16. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters in which denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for biopolymer folding and ligand interaction studies, they require considerable manual intervention for cleaning and setup. As such, throughput is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, where 25 or more runs can be conducted per day. The cost and effort of maintaining high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to a higher throughput of data, which has been leveraged into grant support, attracted new faculty hires, and led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  17. Solved and unsolved problems in relativistic quantum chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Kutzelnigg, Werner, E-mail: werner.kutzelnigg@rub.de [Lehrstuhl fuer Theoretische Chemie, Ruhr-Universitaet Bochum, D-44780 Bochum (Germany)

    2012-02-20

    Graphical abstract: The graphical abstract represents the Dirac-Coulomb Hamiltonian in Fock space in a diagrammatic notation. A line (vertical or slanted) with an upgoing arrow represents an electron, with a downgoing arrow a positron. A cross in the first line means the potential created by a nucleus; a broken line represents the Coulomb interaction between electrons and positrons. Highlights: ► Relativistic many-electron theory needs a Fock space and a field-dependent vacuum. ► A good starting point is QED in Coulomb gauge without transversal photons. ► The Dirac underworld picture is obsolete. ► A kinetically balanced even-tempered Gaussian basis is complete. ► Quantum chemistry in Fock space is preferable over QED. - Abstract: A hierarchy of approximations in relativistic many-electron theory is discussed, starting with the Dirac equation and its expansion in a kinetically balanced basis, via a formulation of non-interacting electrons in Fock space (which is the only consistent way to deal with negative-energy states). The most straightforward approximate Hamiltonian for interacting electrons is derived from quantum electrodynamics (QED) in Coulomb gauge with the neglect of transversal photons. This allows an exact (non-perturbative) decoupling of the electromagnetic field from the fermionic field. The electric interaction of the fermions is non-retarded and non-quantized. The quantization of the fermionic field leads to a polarizable vacuum. The simplest (but somewhat problematic) approximation is a no-pair projected theory with external-field projectors. The Dirac-Coulomb operator in configuration space (first quantization) is not acceptable, even if the Brown-Ravenhall disease is much less virulent than often claimed. Effects of transversal photons, such as the Breit interaction and renormalized self-interaction can be

  18. Quantum Chemistry Meets Rotational Spectroscopy for Astrochemistry: Increasing Molecular Complexity

    Science.gov (United States)

    Puzzarini, Cristina

    2016-06-01

    For many years, scientists suspected that the interstellar medium was too hostile for organic species and that only a few simple molecules could be formed under such extreme conditions. However, the detection of approximately 180 molecules in interstellar or circumstellar environments in recent decades has changed this view dramatically. A rich chemistry has emerged, and relatively complex molecules such as C60 and C70 are formed. Recently, researchers have also detected complex organic and potentially prebiotic molecules, such as amino acids, in meteorites and in other space environments. Those discoveries have further stimulated the debate on the origin of the building blocks of life in the universe. Rotational spectroscopy plays a crucial role in the investigation of planetary atmosphere and the interstellar medium. Increasingly these astrochemical investigations are assisted by quantum-mechanical calculations of structures as well as spectroscopic and thermodynamic properties to guide and support observations, line assignments, and data analysis in these new and chemically complicated situations. However, it has proved challenging to extend accurate quantum-chemical computational approaches to larger systems because of the unfavorable scaling with the number of degrees of freedom (both electronic and nuclear). In this contribution, it is demonstrated that it is now possible to compute physicochemical properties of building blocks of biomolecules with an accuracy rivaling that of the most sophisticated experimental techniques. We analyze the spectroscopic properties of representative building blocks of DNA bases (uracil and thiouracil), of proteins (glycine and glycine dipeptide analogue), and also of PAH (phenalenyl radical and cation). V. Barone, M. Biczysko, C. Puzzarini 2015, Acc. Chem. Res., 48, 1413

  19. Whole-Parts Strategies in Quantum Chemistry: Some Philosophical and Mereological Lessons

    Directory of Open Access Journals (Sweden)

    Jean-Pierre Llored

    2014-12-01

    Full Text Available Philosophers mainly refer to quantum chemistry in order to address questions about the reducibility or autonomy of chemistry relative to quantum physics, and to argue for or against ontological emergence. To make their point, they scrutinize quantum approximations and formalisms as if they were independent of the questions at stake. This paper proposes a return to history and to the laboratory so as to emphasize how quantum chemists never cease to negotiate the relationships between a molecule, its parts, and its environment. This investigation will enable us to draw methodological conclusions about the role of history within philosophical studies, and to examine how quantum chemistry can clarify important philosophical and mereological issues related to the emergence/reduction debate, or to the way instruments and contexts are involved in the material making and the formal description of wholes and parts.

  20. Dynamic nuclear polarization NMR spectroscopy allows high-throughput characterization of microporous organic polymers.

    Science.gov (United States)

    Blanc, Frédéric; Chong, Samantha Y; McDonald, Tom O; Adams, Dave J; Pawsey, Shane; Caporini, Marc A; Cooper, Andrew I

    2013-10-16

    Dynamic nuclear polarization (DNP) solid-state NMR was used to obtain natural abundance (13)C and (15)N CP MAS NMR spectra of microporous organic polymers with excellent signal-to-noise ratio, allowing for unprecedented details in the molecular structure to be determined for these complex polymer networks. Sensitivity enhancements larger than 10 were obtained with bis-nitroxide radical at 14.1 T and low temperature (∼105 K). This DNP MAS NMR approach allows efficient, high-throughput characterization of libraries of porous polymers prepared by combinatorial chemistry methods.
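The practical value of the reported sensitivity enhancement follows from standard signal averaging: SNR grows with the square root of the number of accumulated scans, so an enhancement of ε shortens the acquisition needed for a given SNR by a factor of ε². A quick sketch (function name is illustrative):

```python
def scans_saving_factor(enhancement: float) -> float:
    # In signal averaging, SNR grows as the square root of the number
    # of accumulated scans, so a signal enhancement of epsilon reduces
    # the scans needed for a given SNR by epsilon squared.
    return enhancement ** 2

# The reported enhancement of >10 implies >100-fold shorter acquisitions,
# which is what makes natural-abundance 13C/15N spectra practical here.
saving = scans_saving_factor(10.0)
```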

  1. Combinatorial and high-throughput screening approaches for strain engineering.

    Science.gov (United States)

    Liu, Wenshan; Jiang, Rongrong

    2015-03-01

    Microbes have long been used in the industry to produce valuable biochemicals. Combinatorial engineering approaches, new strain engineering tools derived from inverse metabolic engineering, have started to attract attention in recent years, including genome shuffling, error-prone DNA polymerase, global transcription machinery engineering (gTME), random knockout/overexpression libraries, ribosome engineering, multiplex automated genome engineering (MAGE), customized optimization of metabolic pathways by combinatorial transcriptional engineering (COMPACTER), and library construction of "tunable intergenic regions" (TIGR). Since combinatorial approaches and high-throughput screening methods are fundamentally interconnected, color/fluorescence-based, growth-based, and biosensor-based high-throughput screening methods have been reviewed. We believe that with the help of metabolic engineering tools and new combinatorial approaches, plus effective high-throughput screening methods, researchers will be able to achieve better results on improving microorganism performance under stress or enhancing biochemical yield.

  2. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  3. A quantum chemistry study of natural gas hydrates.

    Science.gov (United States)

    Atilhan, Mert; Pala, Nezih; Aparicio, Santiago

    2014-04-01

    The structure and properties of natural gas hydrates containing hydrocarbons, CO₂, and N₂ molecules were studied by using computational quantum chemistry methods via the density functional theory approach. All host cages involved in I, II, and H type structures were filled with hydrocarbons up to pentanes, CO₂ and N₂ molecules, depending on their size, and the structures of these host-guest systems were optimized. Structural properties, vibrational spectra, and density of states were analyzed together with results from atoms-in-a-molecule and natural bond orbital methods. The inclusion of dispersion terms in the functional used plays a vital role in obtaining reliable information, and thus the B97D functional was shown to be useful for these systems. Results showed remarkable interaction energies, not strongly affected by the type of host cage, with molecules tending to be placed at the center of the cavities when host cages and guest molecules are of similar size, but with molecules approaching the hexagonal faces for larger cages. Vibrational properties show remarkable features in certain regions, with shifts arising from host-guest interactions, and useful patterns in the terahertz region arising from water surface vibrations strongly coupled with guest molecules. Likewise, calculations on crystal systems for the I and H types were carried out using a pseudopotential approach combined with Grimme's method to account for dispersion.

  4. Droplet microfluidics for high-throughput biological assays.

    Science.gov (United States)

    Guo, Mira T; Rotem, Assaf; Heyman, John A; Weitz, David A

    2012-06-21

    Droplet microfluidics offers significant advantages for performing high-throughput screens and sensitive assays. Droplets allow sample volumes to be significantly reduced, leading to concomitant reductions in cost. Manipulation and measurement at kilohertz speeds enable up to 10(8) samples to be screened in one day. Compartmentalization in droplets increases assay sensitivity by increasing the effective concentration of rare species and decreasing the time required to reach detection thresholds. Droplet microfluidics combines these powerful features to enable currently inaccessible high-throughput screening applications, including single-cell and single-molecule assays.
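The 10(8)-samples-per-day figure quoted above is consistent with simple arithmetic on kilohertz droplet rates, assuming idealized continuous operation; the function below is an illustrative back-of-the-envelope check, not from the paper:

```python
def droplets_per_day(rate_hz: float) -> float:
    # Droplets processed per day at a sustained rate, assuming
    # continuous round-the-clock operation (an idealization).
    return rate_hz * 24 * 60 * 60

# ~1.2 kHz sustained already exceeds 10^8 droplets in a single day
per_day = droplets_per_day(1200.0)
```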

  5. High-throughput Binary Vectors for Plant Gene Function Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhi-Yong Lei; Ping Zhao; Min-Jie Cao; Rong Cui; Xi Chen; Li-Zhong Xiong; Qi-Fa Zhang; David J. Oliver; Cheng-Bin Xiang

    2007-01-01

    A series of high-throughput binary cloning vectors were constructed to facilitate gene function analysis in higher plants. This vector series consists of plasmids designed for plant expression, promoter analysis, gene silencing, and green fluorescent protein fusions for protein localization. These vectors provide for high-throughput and efficient cloning utilizing sites for λ phage integrase/excisionase. In addition, unique restriction sites are incorporated in a multiple cloning site and enable promoter replacement. The entire vector series is available with complete sequence information and detailed annotations and is freely distributed to the scientific community for non-commercial uses.

  6. Screening and synthesis: high throughput technologies applied to parasitology.

    Science.gov (United States)

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  7. Perspective: Data infrastructure for high throughput materials discovery

    Science.gov (United States)

    Pfeif, E. A.; Kroenlein, K.

    2016-05-01

    Computational capability has enabled materials design to evolve from trial-and-error towards more informed methodologies that require large amounts of data. Expert-designed tools and their underlying databases facilitate modern-day high throughput computational methods. Standard data formats and communication standards increase the impact of traditional data, and applying these technologies to a high throughput experimental design provides dense, targeted materials data that are valuable for material discovery. Integrated computational materials engineering requires both experimentally and computationally derived data. Harvesting these comprehensively requires different methods of varying degrees of automation to accommodate variety and volume. Issues of data quality persist independent of type.

  8. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Full Text Available Development of next-generation batteries requires a breakthrough in materials. The traditional one-by-one method, which is suited to synthesizing a large number of single-composition materials, is time-consuming and costly. High-throughput and combinatorial experimentation is an effective way to synthesize and characterize a huge number of materials over a broader compositional region in a short time, greatly speeding up the discovery and optimization of materials at lower cost. In this work, high-throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  9. High throughput recombinant protein production of fungal secreted proteins

    DEFF Research Database (Denmark)

    Vala, Andrea Lages Lino; Roth, Doris; Grell, Morten Nedergaard

    2011-01-01

    a high-throughput protein production system with a special focus on fungal secreted proteins. We use a ligation independent cloning to clone target genes into expression vectors for E. coli and P. pastoris and a small scale test expression to identify constructs producing soluble protein. Expressed...... interaction), between fungi of the order Entomophthorales and aphids (pathogenic interaction), and in the mycoparasitic interaction between the oomycetes Pythium oligandrum and P. ultimum. In general, the high-throughput protein production system can lead to a better understanding of fungal/host interactions...

  10. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  11. The many roles of quantum chemical predictions in synthetic organic chemistry.

    Science.gov (United States)

    Nguyen, Quynh Nhu N; Tantillo, Dean J

    2014-03-01

    This account discusses representative case studies for various applications of quantum chemical calculations in synthetic organic chemistry. These include confirmation of target structures, methodology development, and catalyst design. These examples demonstrate how predictions from quantum chemical calculations can be utilized to streamline synthetic efforts.

  12. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen;

    2016-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such scr...

  13. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses a

  14. Algorithms for mapping high-throughput DNA sequences

    DEFF Research Database (Denmark)

    Frellsen, Jes; Menzel, Peter; Krogh, Anders

    2014-01-01

    Abstract High-throughput sequencing (HTS) technologies revolutionized the field of molecular biology by enabling large scale whole genome sequencing as well as a broad range of experiments for studying the cell's inner workings directly on DNA or RNA level. Given the dramatically increased rate...

  15. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning (EF

  16. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    2007-01-01

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an organism

  18. Chemometric Optimization Studies in Catalysis Employing High-Throughput Experimentation

    NARCIS (Netherlands)

    Pereira, S.R.M.

    2008-01-01

    The main topic of this thesis is the investigation of the synergies between High-Throughput Experimentation (HTE) and Chemometric Optimization methodologies in Catalysis research and of the use of such methodologies to maximize the advantages of using HTE methods. Several case studies were analysed

  19. High-Throughput, Large-Scale SNP Genotyping: Bioinformatics Considerations

    OpenAIRE

    Margetic, Nino

    2004-01-01

    In order to provide a high-throughput, large-scale genotyping facility at the national level we have developed a set of inter-dependent information systems. A combination of commercial, publicly-available and in-house developed tools links a series of data repositories based both on flat files and relational databases providing an almost complete semi-automated pipeline.

  20. An improved high throughput sequencing method for studying oomycete communities

    DEFF Research Database (Denmark)

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-01-01

    Culture-independent studies using next-generation sequencing have revolutionized microbial ecology; however, oomycete ecology in soils is severely lagging behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomyce...

  1. Efficient Management of High-Throughput Screening Libraries with SAVANAH

    DEFF Research Database (Denmark)

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen

    2017-01-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis...

  2. Automatic Spot Identification for High Throughput Microarray Analysis

    Science.gov (United States)

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noise. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
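As an illustrative aside on the record above (this is not the auto-spot code; the function names and the projection-profile approach are a common textbook stand-in, not the authors' method): one simple way to automate spot gridding is to project image intensity onto each axis and read spot rows and columns off the profile peaks.

```python
# Illustrative sketch, not the auto-spot algorithm: locate microarray
# spot centers from row/column intensity projection profiles.

def projection_peaks(profile, threshold):
    """Return centers of contiguous runs where profile exceeds threshold."""
    peaks, start = [], None
    for i, v in enumerate(profile):
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            peaks.append((start + i - 1) // 2)  # midpoint of run [start, i-1]
            start = None
    if start is not None:
        peaks.append((start + len(profile) - 1) // 2)
    return peaks

def grid_spot_centers(image):
    """image: 2D list of intensities. Returns (row, col) spot centers."""
    rows = [sum(r) for r in image]               # project onto y axis
    cols = [sum(c) for c in zip(*image)]         # project onto x axis
    r_thr, c_thr = 0.5 * max(rows), 0.5 * max(cols)
    return [(r, c)
            for r in projection_peaks(rows, r_thr)
            for c in projection_peaks(cols, c_thr)]
```

Real spot finders add convolution with a spot template and robustness to rotation and noise, but the projection step above is the usual starting point.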

  4. MIPHENO: data normalization for high throughput metabolite analysis

    Directory of Open Access Journals (Sweden)

    Bell Shannon M

    2012-01-01

Abstract Background High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course of months and years, often without the controls needed to compare directly across the dataset. Few methods are available to facilitate comparisons of high throughput metabolic data generated in batches where explicit in-group controls for normalization are lacking. Results Here we describe MIPHENO (Mutant Identification by Probabilistic High throughput-Enabled Normalization), an approach for post-hoc normalization of quantitative first-pass screening data in the absence of explicit in-group controls. This approach includes a quality control step and facilitates cross-experiment comparisons that decrease the false non-discovery rates, while maintaining the high accuracy needed to limit false positives in first-pass screening. Results from simulation show an improvement in both accuracy and false non-discovery rate over a range of population parameters (p < 10^-16) and a modest but significant (p < 10^-16) improvement in area under the receiver operator characteristic curve of 0.955 for MIPHENO vs 0.923 for a group-based statistic (z-score). Analysis of the high throughput phenotypic data from the Arabidopsis Chloroplast 2010 Project (http://www.plastid.msu.edu/) showed a ~4-fold increase in the ability to detect previously described or expected phenotypes over the group-based statistic. Conclusions Results demonstrate MIPHENO offers substantial benefit in improving the ability to detect putative mutant phenotypes from post-hoc analysis of large data sets. Additionally, it facilitates data interpretation and permits cross-dataset comparison where group-based controls are missing. MIPHENO is applicable to a wide range of high throughput screenings and the code is
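A hedged sketch of the general idea behind post-hoc normalization without in-group controls (this is not MIPHENO's actual algorithm, which is probabilistic and includes a quality-control step): if most first-pass wells behave like wild type, each plate's own median can serve as a surrogate baseline, putting batches generated months apart on a common scale.

```python
# Illustrative only: median-rescaling as a stand-in for post-hoc
# normalization when explicit in-plate controls are missing.

from statistics import median

def posthoc_normalize(plates):
    """plates: dict plate_id -> list of raw well values.
    Returns dict plate_id -> values rescaled so every plate's
    median baseline is 1.0, enabling cross-batch comparison."""
    return {pid: [v / median(vals) for v in vals]
            for pid, vals in plates.items()}
```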

  5. High-Throughput Small Molecule Identification using MALDI-TOF and a Nano-Layered Substrate

    Science.gov (United States)

    Lee, Jeong Heon; Choi, Hak Soo; Nasr, Khaled A.; Ha, Miyoung; Kim, Yangsun; Frangioni, John V.

    2011-01-01

Encoderless combinatorial chemistry requires high-throughput product identification without the use of chemical or other tags. We developed a novel nano-layered substrate plate and combined it with a microarraying robot, matrix-assisted laser desorption/ionization (MALDI) mass spectrometry, and custom software to produce a high-throughput small molecule identification system. To optimize system performance, we spotted 5 different chemical entities, spanning an m/z range of 195 to 1338, in 20,304 spots for a total of 101,520 molecules. The initial spot identification rate was 99.85% (20,273 spots), and after a proofreading algorithm was added, 100% of 20,304 spots and 101,520 molecules were identified. An internal recalibration algorithm also significantly improved mass accuracy to as low as 45 ppm. Using this optimized system, 47 different chemical entities, spanning an m/z range of 138 to 1,592, were spotted over 5,076 spots and could be identified with 100% accuracy. Our study lays the foundation for improved encoderless combinatorial chemistry. PMID:21651231

  6. Nanoliter homogenous ultra-high throughput screening microarray for lead discoveries and IC50 profiling.

    Science.gov (United States)

    Ma, Haiching; Horiuchi, Kurumi Y; Wang, Yuan; Kucharewicz, Stefan A; Diamond, Scott L

    2005-04-01

    Microfluidic technologies offer the potential for highly productive and low-cost ultra-high throughput screening and high throughput selectivity profiling. Such technologies need to provide the flexibility of plate-based assays as well as be less expensive to operate. Presented here is a unique microarray system (the Reaction Biology [Malvern, PA] DiscoveryDot), which runs over 6,000 homogeneous reactions per 1" x 3" microarray using chemical libraries or compound dilutions printed in 1-nl volumes. A simple and rapid piezo-activation method delivers from 30 to 300 pl of biochemical targets and detector chemistries to each reaction. The fluorescent signals are detected and analyzed with conventional microarray scanners and software. The DiscoveryDot platform is highly customizable, and reduces consumption of targets and reaction chemistries by >40-fold and the consumption of compounds by >10,000-fold, compared to 384-well plate assay. We demonstrate here that the DiscoveryDot platform is compatible with conventional large-volume well-based reactions, with a Z' factor of >0.6 for many enzymes, such as the caspase family enzymes, matrix metalloproteinase, serine proteases, kinases, and histone deacetylases. The platform is well equipped for 50% inhibitory concentration (IC50) profiling studies of enzyme inhibitors, with up to 10 dilution conditions of each test compound printed in duplicate, and each microarray chip can generate over 300 IC50 measurements against a given target.
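Two of the figures of merit named in the record above have standard definitions that are easy to compute; the sketch below is illustrative (not the DiscoveryDot software). The Z' factor is Z' = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg|, and a rough IC50 can be read off a dilution series by log-linear interpolation at the 50% crossing (production pipelines instead fit a four-parameter logistic curve).

```python
# Illustrative assay-quality and potency calculations.
import math
from statistics import mean, pstdev

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 indicate an excellent screening assay."""
    return 1 - 3 * (pstdev(pos) + pstdev(neg)) / abs(mean(pos) - mean(neg))

def ic50_interpolate(doses, responses):
    """responses: percent activity, decreasing with dose.
    Returns the dose at the 50% crossing by interpolating in log-dose."""
    pts = list(zip(doses, responses))
    for (d0, r0), (d1, r1) in zip(pts, pts[1:]):
        if (r0 - 50) * (r1 - 50) <= 0:          # bracket found
            t = (50 - r0) / (r1 - r0) if r1 != r0 else 0.0
            return 10 ** (math.log10(d0) + t * (math.log10(d1) - math.log10(d0)))
    return None                                  # 50% never crossed
```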

  7. Teaching the Philosophical Interpretations of Quantum Mechanics and Quantum Chemistry Through Controversies

    Science.gov (United States)

    Garritz, Andoni

    2013-07-01

This study has the key premise of teaching history and philosophy of physical sciences to illustrate how controversies and rivalries among scientists play a key role in the progress of science and why scientific development is not only founded on the accumulation of experimental data. The author is a defender of teachers who consider philosophical, historical and socio-scientific issues. In particular, the disputes can be used in science teaching to promote students' awareness of the "historicity" of science and to facilitate the understanding of scientific progress beyond that of inductive generalizations. The establishment of a theory is accompanied by philosophical interpretations all the way. The author will try to show that it gives excellent results in teaching and learning to bring to the foreground the complexity that surrounds the development of ideas in science, illustrating how controversies, presuppositions, contradictions and inconsistencies find a place in the work of scientists and philosophers alike. In this sense, the case of quantum mechanics and quantum chemistry is very solid because it is historically full of controversies among its leading figures: Einstein, Bohr, De Broglie, Heisenberg, Schrödinger, Born, Lewis, Langmuir, Bader, Hoffmann and Pauling, at least.

  8. Principles of conjugating quantum dots to proteins via carbodiimide chemistry.

    Science.gov (United States)

    Song, Fayi; Chan, Warren C W

    2011-12-09

    The covalent coupling of nanomaterials to bio-recognition molecules is a critical intermediate step in using nanomaterials for biology and medicine. Here we investigate the carbodiimide-mediated conjugation of fluorescent quantum dots to different proteins (e.g., immunoglobulin G, bovine serum albumin, and horseradish peroxidase). To enable these studies, we developed a simple method to isolate quantum dot bioconjugates from unconjugated quantum dots. The results show that the reactant concentrations and protein type will impact the overall number of proteins conjugated onto the surfaces of the quantum dots, homogeneity of the protein-quantum dot conjugate population, quantum efficiency, binding avidity, and enzymatic kinetics. We propose general principles that should be followed for the successful coupling of proteins to quantum dots.

  9. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA), and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas9)-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss-of-function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.
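As a toy illustration of one step in engineering such libraries (not taken from the article; the helper is hypothetical, and real guide-design tools also score off-targets and GC content): enumerating candidate SpCas9 target sites amounts to scanning a sequence for 20-nt protospacers sitting immediately 5' of an NGG PAM.

```python
# Hypothetical helper: list SpCas9 protospacers on the forward strand.
# A target site is a 20-nt spacer immediately upstream of an NGG PAM.

def find_guides(seq, spacer_len=20):
    guides = []
    for i in range(spacer_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":          # PAM = N-G-G at i..i+2
            guides.append(seq[i - spacer_len : i])
    return guides
```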

  10. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated...... maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content...... and branching degrees could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Extreme branching or linearity was typically detected less readily than normal starch structures. The method offers...

  11. Sensitivity study of reliable, high-throughput resolution metrics for photoresists

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Christopher N.; Naulleau, Patrick P.

    2007-07-30

The resolution of chemically amplified resists is becoming an increasing concern, especially for lithography in the extreme ultraviolet (EUV) regime. Large-scale screening and performance-based down-selection is currently underway to identify resist platforms that can support shrinking feature sizes. Resist screening efforts, however, are hampered by the absence of reliable resolution metrics that can objectively quantify resist resolution in a high-throughput fashion. Here we examine two high-throughput metrics for resist resolution determination. After summarizing their details and justifying their utility, we characterize the sensitivity of both metrics to two of the main experimental uncertainties associated with lithographic exposure tools, namely: limited focus control and limited knowledge of optical aberrations. For an implementation at EUV wavelengths, we report aberration and focus limited error bars in extracted resolution of ~1.25 nm RMS for both metrics, making them attractive candidates for future screening and down-selection efforts.

  12. A high-throughput label-free nanoparticle analyser

    Science.gov (United States)

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M.; Ruoslahti, Erkki; Cleland, Andrew N.

    2011-05-01

Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  13. High-throughput theoretical design of lithium battery materials

    Science.gov (United States)

    Shi-Gang, Ling; Jian, Gao; Rui-Juan, Xiao; Li-Quan, Chen

    2016-01-01

The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next generation lithium batteries are of great significance for achieving improved performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. Project supported by the National Natural Science Foundation of China (Grant Nos. 11234013 and 51172274) and the National High Technology Research and Development Program of China (Grant No. 2015AA034201).
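The screening workflow sketched in the abstract (filter a structural database against specific search criteria) can be caricatured in a few lines. The property names and thresholds below are invented for illustration and are not taken from the review; a real pipeline would compute these quantities with first-principles codes before filtering.

```python
# Illustrative criteria-based screen over a candidate materials table.
# Field names and cutoffs are hypothetical examples, not from the review.

CRITERIA = {
    "energy_density_wh_kg": lambda v: v >= 300,    # higher energy density
    "band_gap_ev": lambda v: v >= 3.0,             # stability proxy
    "li_migration_barrier_ev": lambda v: v <= 0.4, # fast charge/discharge
}

def screen(database):
    """database: list of dicts, one per candidate material.
    Returns the ids of candidates passing every criterion."""
    return [m["id"] for m in database
            if all(test(m[key]) for key, test in CRITERIA.items())]
```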

  14. A system for performing high throughput assays of synaptic function.

    Directory of Open Access Journals (Sweden)

    Chris M Hempel

Unbiased, high-throughput screening has proven invaluable for dissecting complex biological processes. Application of this general approach to synaptic function would have a major impact on neuroscience research and drug discovery. However, existing techniques for studying synaptic physiology are labor intensive and low-throughput. Here, we describe a new high-throughput technology for performing assays of synaptic function in primary neurons cultured in microtiter plates. We show that this system can perform 96 synaptic vesicle cycling assays in parallel with high sensitivity, precision, uniformity, and reproducibility and can detect modulators of presynaptic function. By screening libraries of pharmacologically defined compounds on rat forebrain cultures, we have used this system to identify novel effects of compounds on specific aspects of presynaptic function. As a system for unbiased compound as well as genomic screening, this technology has significant applications for basic neuroscience research and for the discovery of novel, mechanism-based treatments for central nervous system disorders.

  15. Multiple column high-throughput e-beam inspection (EBI)

    Science.gov (United States)

    Lam, David K.; Monahan, Kevin M.; Liu, Enden D.; Tran, Cong; Prescop, Ted

    2012-03-01

Single-column e-beam systems are used in production for the detection of electrical defects, but are too slow to be used for the detection of small physical defects, and can't meet future inspection requirements. This paper presents a multiple-column e-beam technology for high throughput wafer inspection. Multibeam has developed all-electrostatic columns for high-resolution imaging. The elimination of magnetic coils enables the columns to be small; e-beam deflection is faster in the absence of magnetic hysteresis. Multiple miniature columns are assembled in an array. An array of 100 columns covers the entire surface of a 300mm wafer, affording simultaneous cross-wafer sampling. Column performance simulations and system architecture are presented. Also provided are examples of high throughput, more efficient, multiple-column wafer inspection.

  16. Galaxy High Throughput Genotyping Pipeline for GeneTitan.

    Science.gov (United States)

    Karpenko, Oleksiy; Bahroos, Neil; Chukhman, Morris; Dong, Xiao; Kanabar, Pinal; Arbieva, Zarema; Jackson, Tommie; Hendrickson, William

    2013-01-01

    Latest genotyping solutions allow for rapid testing of more than two million markers in one experiment. Fully automated instruments such as Affymetrix GeneTitan enable processing of large numbers of samples in a truly high-throughput manner. In concert with solutions like Axiom, fully customizable array plates can now utilize automated workflows that can leverage multi-channel instrumentation like the GeneTitan. With the growing size of raw data output, the serial computational architecture of the software, typically distributed by the vendors on turnkey desktop solutions for quality control and genotype calling, becomes legacy rather than an advantage. Advanced software techniques provide power, flexibility, and can be deployed in an HPC environment, but become technically inconvenient for biologists to use. Here we present a pipeline that uses Galaxy as a mechanism to lower the barrier for complex analysis, and increase efficiency by leveraging high-throughput computing.

  17. High-throughput screening in the C. elegans nervous system.

    Science.gov (United States)

    Kinser, Holly E; Pincus, Zachary

    2016-06-03

    The nematode Caenorhabditis elegans is widely used as a model organism in the field of neurobiology. The wiring of the C. elegans nervous system has been entirely mapped, and the animal's optical transparency allows for in vivo observation of neuronal activity. The nematode is also small in size, self-fertilizing, and inexpensive to cultivate and maintain, greatly lending to its utility as a whole-animal model for high-throughput screening (HTS) in the nervous system. However, the use of this organism in large-scale screens presents unique technical challenges, including reversible immobilization of the animal, parallel single-animal culture and containment, automation of laser surgery, and high-throughput image acquisition and phenotyping. These obstacles require significant modification of existing techniques and the creation of new C. elegans-based HTS platforms. In this review, we outline these challenges in detail and survey the novel technologies and methods that have been developed to address them.

  18. High-throughput screening for modulators of cellular contractile force

    CERN Document Server

    Park, Chan Young; Tambe, Dhananjay; Chen, Bohao; Lavoie, Tera; Dowell, Maria; Simeonov, Anton; Maloney, David J; Marinkovic, Aleksandar; Tschumperlin, Daniel J; Burger, Stephanie; Frykenberg, Matthew; Butler, James P; Stamer, W Daniel; Johnson, Mark; Solway, Julian; Fredberg, Jeffrey J; Krishnan, Ramaswamy

    2014-01-01

    When cellular contractile forces are central to pathophysiology, these forces comprise a logical target of therapy. Nevertheless, existing high-throughput screens are limited to upstream signaling intermediates with poorly defined relationship to such a physiological endpoint. Using cellular force as the target, here we screened libraries to identify novel drug candidates in the case of human airway smooth muscle cells in the context of asthma, and also in the case of Schlemm's canal endothelial cells in the context of glaucoma. This approach identified several drug candidates for both asthma and glaucoma. We attained rates of 1000 compounds per screening day, thus establishing a force-based cellular platform for high-throughput drug discovery.

  19. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for identification of drug targets and drug discovery and recently it found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are used widespread for quantification of protein markers. We reasoned that RPPAs also can be utilized...... beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily cumulate in cell death. We used transfection of si...... a robotic screening platform. Furthermore, we automated sample tracking and data analysis by developing a bundled bioinformatics tool named “MIRACLE”. Automation and RPPA-based viability/toxicity readouts enable rapid testing of large sample numbers, while granting the possibility for flexible consecutive...

  20. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582
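The core measurement in the record above (colony size from a plate image) reduces to thresholding followed by connected-component labelling. The sketch below is an illustrative stand-in, not Spotsizer's implementation, which additionally handles grid detection, input formats, and batch processing.

```python
# Illustrative colony sizing: threshold a grayscale image (2D list of
# intensities) and measure each 4-connected bright region's pixel area.

def colony_areas(image, threshold):
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if image[y][x] > threshold and not seen[y][x]:
                area, stack = 0, [(y, x)]        # flood fill one colony
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas
```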

  1. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from endogenous reference genes of the taxa concerned, from GMO constructs (screening targets, construct-specific, and event-specific targets), and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  2. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    Directory of Open Access Journals (Sweden)

    Hossein Pourmodheji

    2016-06-01

Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery.

  3. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy.

    Science.gov (United States)

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-06-09

Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery.

  4. High throughput screening operations at the University of Kansas.

    Science.gov (United States)

    Roy, Anuradha

    2014-05-01

    The High Throughput Screening Laboratory at University of Kansas plays a critical role in advancing academic interest in the identification of chemical probes as tools to better understand the biological and biochemical basis of new therapeutic targets. The HTS laboratory has an open service policy and collaborates with internal and external academia as well as for-profit organizations to execute projects requiring HTS-compatible assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization.

  5. Systematic error detection in experimental high-throughput screening

    OpenAIRE

    2011-01-01

    Abstract Background High-throughput screening (HTS) is a key part of the drug discovery process during which thousands of chemical compounds are screened and their activity levels measured in order to identify potential drug candidates (i.e., hits). Many technical, procedural or environmental factors can cause systematic measurement error or inequalities in the conditions in which the measurements are taken. Such systematic error has the potential to critically affect the hit selection proces...

  6. Targeted high-throughput sequencing of tagged nucleic acid samples

    OpenAIRE

    M.; Meyer; Stenzel, U.; Myles, S.; Prüfer, K; Hofreiter, M.

    2007-01-01

    High-throughput 454 DNA sequencing technology allows much faster and more cost-effective sequencing than traditional Sanger sequencing. However, the technology imposes inherent limitations on the number of samples that can be processed in parallel. Here we introduce parallel tagged sequencing (PTS), a simple, inexpensive and flexible barcoding technique that can be used for parallel sequencing any number and type of double-stranded nucleic acid samples. We demonstrate that PTS is particularly...
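Barcoded parallel sequencing ultimately relies on demultiplexing reads by their tag. The sketch below is illustrative only (PTS itself is a laboratory barcoding protocol, and production demultiplexers also tolerate sequencing errors in the barcode); barcode sequences here are invented.

```python
# Minimal exact-prefix demultiplexer in the spirit of tagged sequencing.

def demultiplex(reads, barcodes):
    """reads: list of read strings; barcodes: dict sample -> tag.
    Returns (bins, unassigned): reads binned by sample with the
    barcode trimmed, plus reads matching no barcode."""
    bins = {name: [] for name in barcodes}
    unassigned = []
    for read in reads:
        for name, tag in barcodes.items():
            if read.startswith(tag):
                bins[name].append(read[len(tag):])  # trim the tag
                break
        else:
            unassigned.append(read)
    return bins, unassigned
```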

  7. Mass spectrometry for high-throughput metabolomics analysis of urine

    OpenAIRE

    Abdelrazig, Salah M.A.

    2015-01-01

    Direct electrospray ionisation-mass spectrometry (direct ESI-MS), by omitting the chromatographic step, has great potential for application as a high-throughput approach for untargeted urine metabolomics analysis compared to liquid chromatography-mass spectrometry (LC-MS). The rapid development and technical innovations revealed in the field of ambient ionisation MS such as nanoelectrospray ionisation (nanoESI) chip-based infusion and liquid extraction surface analysis mass spectrometry (LESA...

  8. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low-cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
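The data-parallel pattern MUMmerGPU exploits, many independent queries matched against one shared reference index, can be shown in miniature. This toy Python sketch is purely illustrative, not the CUDA implementation: it substitutes a k-mer dictionary for the suffix tree and a thread pool for GPU threads.

```python
from collections import defaultdict
from multiprocessing.dummy import Pool  # thread pool stands in for GPU threads

def build_index(reference, k):
    """Index every k-mer position in the reference (a stand-in for the suffix tree)."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def match_query(args):
    """Find all exact k-mer matches of one query; each query is independent."""
    query, index, k = args
    hits = []
    for j in range(len(query) - k + 1):
        for pos in index.get(query[j:j + k], []):
            hits.append((j, pos))  # (query offset, reference offset)
    return hits

ref = "ACGTACGTGA"
idx = build_index(ref, 4)
queries = ["ACGT", "CGTG", "TTTT"]
with Pool(4) as pool:
    results = pool.map(match_query, [(q, idx, 4) for q in queries])
```

The key property mirrored here is that the index is built once and shared read-only, while queries are processed in parallel with no interaction between them.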

  9. Generating barcoded libraries for multiplex high-throughput sequencing.

    Science.gov (United States)

    Knapp, Michael; Stiller, Mathias; Meyer, Matthias

    2012-01-01

    Molecular barcoding is an essential tool to use the high throughput of next generation sequencing platforms optimally in studies involving more than one sample. Various barcoding strategies allow for the incorporation of short recognition sequences (barcodes) into sequencing libraries, either by ligation or polymerase chain reaction (PCR). Here, we present two approaches optimized for generating barcoded sequencing libraries from low copy number extracts and amplification products typical of ancient DNA studies.
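After sequencing, reads from such multiplexed libraries are assigned back to samples by their barcodes. A minimal demultiplexing sketch (illustrative only; real pipelines also tolerate barcode sequencing errors):

```python
def demultiplex(reads, barcodes):
    """Assign each read to a sample by its leading barcode, trimming the barcode off."""
    by_sample = {sample: [] for sample in barcodes.values()}
    by_sample["unassigned"] = []
    for read in reads:
        for bc, sample in barcodes.items():
            if read.startswith(bc):
                by_sample[sample].append(read[len(bc):])
                break
        else:  # no barcode matched
            by_sample["unassigned"].append(read)
    return by_sample
```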

  10. Condor-COPASI: high-throughput computing for biochemical networks

    OpenAIRE

    Kent Edward; Hoops Stefan; Mendes Pedro

    2012-01-01

    Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary experti...

  11. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Document Server

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  12. FLASH assembly of TALENs for high-throughput genome editing.

    Science.gov (United States)

    Reyon, Deepak; Tsai, Shengdar Q; Khayter, Cyd; Foden, Jennifer A; Sander, Jeffry D; Joung, J Keith

    2012-05-01

    Engineered transcription activator–like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published, and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the fast ligation-based automatable solid-phase high-throughput (FLASH) system, a rapid and cost-effective method for large-scale assembly of TALENs. We tested 48 FLASH-assembled TALEN pairs in a human cell–based EGFP reporter system and found that all 48 possessed efficient gene-modification activities. We also used FLASH to assemble TALENs for 96 endogenous human genes implicated in cancer and/or epigenetic regulation and found that 84 pairs were able to efficiently introduce targeted alterations. Our results establish the robustness of TALEN technology and demonstrate that FLASH facilitates high-throughput genome editing at a scale not currently possible with other genome modification technologies.

  13. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

    Full Text Available Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.
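The transparent splitting of a large task into smaller parallel parts, which Condor-COPASI performs for jobs such as parameter scans, reduces to partitioning the work before submission. A simplified sketch (the function name is an assumption; actual Condor job submission is omitted):

```python
def split_scan(values, n_chunks):
    """Partition a list of parameter values into near-equal chunks,
    one chunk per job submitted to the Condor pool."""
    base, extra = divmod(len(values), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        size = base + (1 if i < extra else 0)  # spread the remainder evenly
        chunks.append(values[start:start + size])
        start += size
    return chunks
```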

  14. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.

  15. High throughput biotechnology in traditional fermented food industry.

    Science.gov (United States)

    Yang, Yong; Xu, Rong-man; Song, Jia; Wang, Wei-min

    2010-11-01

Traditional fermented foods are not only staple foods in most developing countries but also key health foods in developed countries. As the health functions of these foods are gradually discovered, more and more high-throughput biotechnologies are being used to advance both the old and new segments of the industry. As a result, knowledge of the microflora, manufacturing processes and health functions of these foods has been pushed forward in both depth and breadth. The application and progress of high-throughput biotechnologies differ across the traditional fermented food industries, and are reviewed here by category: fermented milk products (yogurt, cheese), fermented sausages, fermented vegetables (kimchi, sauerkraut), fermented cereals (sourdough) and fermented beans (tempeh, natto). With further support from high-throughput biotechnologies, the middle- and down-stream processes of traditional fermented foods could be optimized, and the industrialization of local traditional fermented foods that are rich in functional factors but produced in small quantities could be accelerated. The article presents some promising patents in the traditional fermented food industry.

  16. A microdroplet dilutor for high-throughput screening

    Science.gov (United States)

    Niu, Xize; Gielen, Fabrice; Edel, Joshua B.; Demello, Andrew J.

    2011-06-01

Pipetting and dilution are universal processes used in chemical and biological laboratories for assays and experiments. In microfluidics such operations are equally in demand, but difficult to implement. Recently, droplet-based microfluidics has emerged as an exciting new platform for high-throughput experimentation. However, it is challenging to vary the concentration of droplets rapidly and controllably. To this end, we developed a dilution module for high-throughput screening using droplet-based microfluidics. Briefly, a nanolitre-sized sample droplet of defined concentration is trapped within a microfluidic chamber. Through a process of droplet merging, mixing and re-splitting, this droplet is combined with a series of smaller buffer droplets to generate a sequence of output droplets that define a digital concentration gradient. Importantly, the formed droplets can be merged with other reagent droplets to enable rapid chemical and biological screens. As a proof of concept, we used the dilutor to perform a high-throughput homogeneous DNA-binding assay using only nanolitres of sample.
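Assuming ideal mixing and that the trapped droplet is re-split back to its original volume after each merge, the digital concentration gradient is a simple geometric series. This sketch is an idealized model, not code from the paper:

```python
def dilution_gradient(c0, v_sample, v_buffer, n_steps):
    """Concentration of successive output droplets when a trapped sample
    droplet is repeatedly merged with a buffer droplet and re-split."""
    concs, c = [], c0
    for _ in range(n_steps):
        c *= v_sample / (v_sample + v_buffer)  # dilution factor per merge
        concs.append(c)
    return concs
```

For equal sample and buffer volumes this yields a twofold serial dilution: 50%, 25%, 12.5% of the starting concentration, and so on.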

  17. High-throughput screening of cell responses to biomaterials.

    Science.gov (United States)

    Yliperttula, Marjo; Chung, Bong Geun; Navaladi, Akshay; Manbachi, Amir; Urtti, Arto

    2008-10-02

Biomaterials have emerged as powerful regulators of the cellular microenvironment for drug discovery, tissue engineering research and chemical testing. Although biomaterial-based matrices control the cellular behavior, these matrices are still far from being optimal. In principle, the efficacy of biomaterial development for cell cultures can be improved by using high-throughput techniques that allow screening of a large number of materials and manipulation of microenvironments in a controlled manner. Several cell responses such as toxicity, proliferation, and differentiation have been used to evaluate the biomaterials, thus providing a basis for further selection of the lead biomimetic materials or microenvironments. Although high-throughput techniques provide an initial screening of the desired properties, more detailed follow-up studies of the selected materials are required to understand the true value of a 'positive hit'. High-throughput methods may become important tools in the future development of biomaterials-based cell cultures that will enable more realistic pre-clinical prediction of pharmacokinetics, pharmacodynamics, and toxicity. This is highly important, because predictive pre-clinical methods are needed to improve the high attrition rate of drug candidates during clinical testing.

  18. NCBI GEO: archive for high-throughput functional genomic data.

    Science.gov (United States)

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron

    2009-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.

  19. Human development VIII: a theory of "deep" quantum chemistry and cell consciousness: quantum chemistry controls genes and biochemistry to give cells and higher organisms consciousness and complex behavior.

    Science.gov (United States)

    Ventegodt, Søren; Hermansen, Tyge Dahl; Flensborg-Madsen, Trine; Nielsen, Maj Lyck; Merrick, Joav

    2006-11-14

Deep quantum chemistry is a theory of deeply structured quantum fields carrying the biological information of the cell, making it able to remember, intend, represent the inner and outer world for comparison, understand what it "sees", and make choices on its structure, form, behavior and division. We suggest that deep quantum chemistry gives the cell consciousness and all the qualities and abilities related to consciousness. We use geometric symbolism, which is a pre-mathematical and philosophical approach to problems that cannot yet be handled mathematically. Using Occam's razor we have started with the simplest model that works; we presume this to be a many-dimensional, spiral fractal. We suggest that all the electrons of the large biological molecules' orbitals make one huge "cell-orbital", which is structured according to the spiral fractal nature of quantum fields. Consciousness of single cells, multi-cellular structures (e.g., organs), multi-cellular organisms and multi-individual colonies (like ants) and human societies can thus be explained by deep quantum chemistry. When biochemical activity is strictly controlled by the quantum-mechanical super-orbital of the cell, this orbital can deliver energetic quanta as biological information, distributed through many fractal levels of the cell to guide the form and behavior of an individual single or multi-cellular organism. The top level of information is the consciousness of the cell or organism, which controls all the biochemical processes. By this speculative work inspired by Penrose and Hameroff we hope to inspire other researchers to formulate more rigorous and mathematically precise hypotheses on the complex and coherent nature of matter, life and consciousness.

  20. Problems and solutions in quantum chemistry and physics

    CERN Document Server

    Johnson, Charles S

    1988-01-01

    Unusually varied problems, with detailed solutions, cover quantum mechanics, wave mechanics, angular momentum, molecular spectroscopy, scattering theory, more. 280 problems, plus 139 supplementary exercises.

  1. Atoms and molecules in cavities, from weak to strong coupling in quantum-electrodynamics (QED) chemistry.

    Science.gov (United States)

    Flick, Johannes; Ruggenthaler, Michael; Appel, Heiko; Rubio, Angel

    2017-03-21

    In this work, we provide an overview of how well-established concepts in the fields of quantum chemistry and material sciences have to be adapted when the quantum nature of light becomes important in correlated matter-photon problems. We analyze model systems in optical cavities, where the matter-photon interaction is considered from the weak- to the strong-coupling limit and for individual photon modes as well as for the multimode case. We identify fundamental changes in Born-Oppenheimer surfaces, spectroscopic quantities, conical intersections, and efficiency for quantum control. We conclude by applying our recently developed quantum-electrodynamical density-functional theory to spontaneous emission and show how a straightforward approximation accurately describes the correlated electron-photon dynamics. This work paves the way to describe matter-photon interactions from first principles and addresses the emergence of new states of matter in chemistry and material science.

  2. Photoelectron Imaging as a Quantum Chemistry Visualization Tool

    Science.gov (United States)

    Grumbling, Emily R.; Pichugin, Kostyantyn; Mabbs, Richard; Sanov, Andrei

    2011-01-01

    An overview and simple example of photoelectron imaging is presented, highlighting its efficacy as a pedagogical tool for visualizing quantum phenomena. Specifically, photoelectron imaging of H[superscript -] (the simplest negative ion) is used to demonstrate several quantum mechanical principles. This example could be incorporated into an…

  3. Linear-scaling and parallelizable algorithms for stochastic quantum chemistry

    CERN Document Server

    Booth, George H; Alavi, Ali

    2013-01-01

    For many decades, quantum chemical method development has been dominated by algorithms which involve increasingly complex series of tensor contractions over one-electron orbital spaces. Procedures for their derivation and implementation have evolved to require the minimum amount of logic and rely heavily on computationally efficient library-based matrix algebra and optimized paging schemes. In this regard, the recent development of exact stochastic quantum chemical algorithms to reduce computational scaling and memory overhead requires a contrasting algorithmic philosophy, but one which when implemented efficiently can often achieve higher accuracy/cost ratios with small random errors. Additionally, they can exploit the continuing trend for massive parallelization which hinders the progress of deterministic high-level quantum chemical algorithms. In the Quantum Monte Carlo community, stochastic algorithms are ubiquitous but the discrete Fock space of quantum chemical methods is often unfamiliar, and the metho...

  4. Multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans.

    Science.gov (United States)

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-02-17

    The booming nanotechnology industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At the organism level, our automated system analyzes hundreds of individual animals for body length, locomotion speed, and lifespan. To demonstrate the utility of our system, we applied this technology to test the toxicity of 20 nanomaterials at four concentrations. Only fullerene nanoparticles (nC60), fullerol, TiO2, and CeO2 showed little or no toxicity. Various degrees of toxicity were detected from different forms of carbon nanotubes, graphene, carbon black, Ag, and fumed SiO2 nanoparticles. Aminofullerene and ultraviolet-irradiated nC60 also showed small but significant toxicity. We further investigated the effects of nanomaterial size, shape, surface chemistry, and exposure conditions on toxicity. Our data are publicly available at the open-access nanotoxicity database www.QuantWorm.org/nano.
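A crude way to turn such multi-endpoint measurements into toxicity calls, sketched here purely for illustration (the threshold and endpoint names are assumptions, not the authors' statistical analysis), is to compare treated means against control means:

```python
def toxicity_flags(control, treated, threshold=0.8):
    """Flag endpoints (body length, locomotion speed, lifespan, ...) where
    the treated mean falls below a fraction of the control mean."""
    flags = {}
    for endpoint, ctrl_vals in control.items():
        ctrl_mean = sum(ctrl_vals) / len(ctrl_vals)
        trt_mean = sum(treated[endpoint]) / len(treated[endpoint])
        flags[endpoint] = trt_mean < threshold * ctrl_mean
    return flags
```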

  5. A multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans

    Science.gov (United States)

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-01-01

The booming nanotech industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At the organism level, our automated system analyzes hundreds of individual animals for body length, locomotion speed, and lifespan. To demonstrate the utility of our system, we applied this technology to test the toxicity of 20 nanomaterials at four concentrations. Only fullerene nanoparticles (nC60), fullerol, TiO2, and CeO2 showed little or no toxicity. Various degrees of toxicity were detected from different forms of carbon nanotubes, graphene, carbon black, Ag, and fumed SiO2 nanoparticles. Aminofullerene and UV-irradiated nC60 also showed small but significant toxicity. We further investigated the effects of nanomaterial size, shape, surface chemistry, and exposure conditions on toxicity. Our data are publicly available at the open-access nanotoxicity database www.QuantWorm.org/nano. PMID:25611253

  6. Whole cell strategies based on lux genes for high throughput applications toward new antimicrobials.

    Science.gov (United States)

    Galluzzi, Lorenzo; Karp, Matti

    2006-08-01

The discovery/development of novel drug candidates has witnessed dramatic changes over the last two decades. Old methods to identify lead compounds are not suitable for screening the large libraries generated by combinatorial chemistry techniques. High throughput screening (HTS) has become irreplaceable and hundreds of different approaches have been described. Assays based on purified components are flanked by whole cell-based assays, in which reporter genes are used to monitor, directly or indirectly, the influence of a chemical on the metabolism of living cells. The most convenient and widely used reporters for real-time measurements are luciferases, light-emitting enzymes from evolutionarily distant organisms. Autofluorescent proteins have also been extensively employed, but proved to be more suitable for end-point measurements, in situ applications - such as the localization of fusion proteins in specific subcellular compartments - or environmental studies on microbial populations. The trend toward miniaturization and the technical advances in detection and liquid handling systems will enable ultra-high-throughput screening (uHTS), with 100,000 compounds routinely screened each day. Here we show how similar approaches may be applied to the search for new and potent antimicrobial agents.

  7. High-Throughput Continuous Flow Synthesis of Nickel Nanoparticles for the Catalytic Hydrodeoxygenation of Guaiacol

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Emily J.; Habas, Susan E.; Wang, Lu; Ruddy, Daniel A.; White, Erick A.; Baddour, Frederick G.; Griffin, Michael B.; Schaidle, Joshua A.; Malmstadt, Noah; Brutchey, Richard L.

    2016-11-07

The translation of batch chemistries to high-throughput continuous flow methods addresses scaling, automation, and reproducibility concerns associated with the implementation of colloidally prepared nanoparticle (NP) catalysts for industrial catalytic processes. Nickel NPs were synthesized by the high-temperature amine reduction of a Ni2+ precursor using a continuous millifluidic (mF) flow method, achieving yields greater than 60%. The resulting Ni NP catalysts were compared against catalysts prepared in a batch reaction under conditions analogous to the continuous flow conditions with respect to total reaction volume, time, and temperature and by traditional incipient wetness (IW) impregnation for the hydrodeoxygenation (HDO) of guaiacol under ex situ catalytic fast pyrolysis conditions. Compared to the IW method, the colloidally prepared NPs displayed increased morphological control and narrowed size distributions, and the NPs prepared by both methods showed similar size, shape, and crystallinity. The Ni NP catalyst synthesized by the continuous flow method exhibited similar H-adsorption site densities, site-time yields, and selectivities towards deoxygenated products as compared to the analogous batch reaction, and outperformed the IW catalyst with respect to higher selectivity to lower oxygen content products and a 6.9-fold slower deactivation rate. These results demonstrate the utility of synthesizing colloidal Ni NP catalysts using continuous flow methods while maintaining the catalytic properties displayed by the batch equivalent. This methodology can be extended to other catalytically relevant base metals for the high-throughput synthesis of metal NPs for the catalytic production of biofuels.

  8. Development of fast and high throughput tomography using CMOS image detector at SPring-8

    Science.gov (United States)

    Uesugi, Kentaro; Hoshino, Masato; Takeuchi, Akihisa; Suzuki, Yoshio; Yagi, Naoto

    2012-10-01

A fast micro-tomography system and a high-throughput micro-tomography system using state-of-the-art Complementary Metal Oxide Semiconductor (CMOS) imaging devices have been developed at SPring-8. These systems adopt simple projection-type tomography using synchrotron radiation X-rays. The fast micro-tomography system achieves a scan time of around 2 s with 1000 projections, which is 15 times faster than the previously developed system at SPring-8. The CMOS camera for fast tomography has 64 GB of on-board memory; the obtained images must therefore be transferred to a PC at the appropriate time. The melting of snow at room temperature was imaged every 30 s as a demonstration of the system. The high-throughput tomography system adopts a scientific CMOS (sCMOS) camera with low noise and high quantum efficiency. The system achieves a scan time of around 5 minutes, which is three times faster than before. The image quality of the system has been compared to that of the existing system with a Charge-Coupled Device (CCD) camera. The results show the advantage of the new sCMOS camera.

  9. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    Directory of Open Access Journals (Sweden)

    Karson S Putt

Full Text Available Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter plate based assay based upon well-known and trusted titration chemical reactions. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.
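Plate-based absorbance readings like these are converted to concentrations through a linear standard curve in the Beer-Lambert regime. A minimal sketch of that conversion (generic, not the authors' exact procedure):

```python
def fit_standard_curve(concs, absorbances):
    """Least-squares fit of A = m*c + b to wells of known concentration."""
    n = len(concs)
    mean_c = sum(concs) / n
    mean_a = sum(absorbances) / n
    m = (sum((c - mean_c) * (a - mean_a) for c, a in zip(concs, absorbances))
         / sum((c - mean_c) ** 2 for c in concs))
    b = mean_a - m * mean_c
    return m, b

def to_concentration(absorbance, m, b):
    """Invert the standard curve for an unknown well."""
    return (absorbance - b) / m
```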

  10. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    Science.gov (United States)

    Putt, Karson S; Pugh, Randall B

    2013-01-01

Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter plate based assay based upon well-known and trusted titration chemical reactions. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.

  11. High-throughput screening to enhance oncolytic virus immunotherapy

    Directory of Open Access Journals (Sweden)

    Allan KJ

    2016-04-01

Full Text Available KJ Allan, David F Stojdl, SL Swift (Children’s Hospital of Eastern Ontario (CHEO) Research Institute; Department of Biology, Microbiology and Immunology; Department of Pediatrics, University of Ottawa, Ottawa, ON, Canada) Abstract: High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. Keywords: oncolytic, virus, screen, high-throughput, cancer, chemical, genomic, immunotherapy

  12. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
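The closed-loop idea described in this abstract (in-situ sensing feeding model-guided actuation) can be illustrated with a toy feedback loop. The plant model, gain, and variable names below are hypothetical stand-ins, not part of the cited work:

```python
# Minimal sketch of closed-loop process control: a sensor reading is compared
# with the target and the actuation is corrected until the process converges.

def run_closed_loop(setpoint, gain, steps=50):
    """Drive a simulated process variable toward `setpoint` via feedback."""
    value = 0.0       # sensed process output, e.g. imprint depth (nm)
    actuation = 0.0   # control input, e.g. applied pressure (arbitrary units)
    for _ in range(steps):
        error = setpoint - value               # in-situ metrology vs. target
        actuation += gain * error              # integral-style correction
        value = 0.8 * value + 0.2 * actuation  # toy first-order plant response
    return value

final = run_closed_loop(setpoint=100.0, gain=0.5)
```

In a real nanomanufacturing line the toy plant response would be replaced by the physics-based process model, and the loop rate would be set by the metrology instrument.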

  13. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
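A minimal sketch of the similarity-gated inclusion rule described above, assuming Euclidean distance as the similarity measure and an illustrative threshold (the report specifies neither):

```python
import math

def adaptive_sample(stream, threshold=1.0):
    """Keep an incoming point only if it is dissimilar to all retained points."""
    retained = []
    for point in stream:
        # Euclidean distance to the nearest retained point
        if not retained or min(math.dist(point, r) for r in retained) > threshold:
            retained.append(point)  # novel enough: include in the analysis
        # otherwise drop the observation: it adds little new information
    return retained

kept = adaptive_sample([(0, 0), (0.1, 0), (5, 5), (5, 5.2), (9, 0)])
```

The per-point cost grows with the retained set, so practical versions would bound or index the retained points; this sketch only shows the decision rule.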

  14. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  15. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual… toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families…

  16. SSFinder: High Throughput CRISPR-Cas Target Sites Prediction Tool

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Upadhyay

    2014-01-01

The clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein (Cas) system facilitates targeted genome editing in organisms. Despite high demand for this system, finding a reliable tool for the determination of specific target sites in large genomic data has remained challenging. Here, we report SSFinder, a Python script to perform high-throughput detection of specific target sites in large nucleotide datasets. SSFinder is a user-friendly tool, compatible with Windows, Mac OS, and Linux operating systems, and freely available online.
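As an illustration of the task SSFinder automates, the sketch below scans the forward strand of a sequence for SpCas9-style sites (a 20-nt protospacer followed by an NGG PAM). This is a generic example of target-site detection, not SSFinder's actual algorithm:

```python
import re

# Lookahead pattern so overlapping candidate sites are all reported:
# 20-nt protospacer, then any base followed by GG (the NGG PAM).
PAM_SITE = re.compile(r"(?=([ACGT]{20}[ACGT]GG))")

def find_target_sites(seq):
    """Return (position, 23-mer) for every forward-strand SpCas9-style site."""
    seq = seq.upper()
    return [(m.start(), m.group(1)) for m in PAM_SITE.finditer(seq)]

sites = find_target_sites("A" * 20 + "TGG")  # one site, at position 0
```

A full tool would also scan the reverse complement and score off-target similarity; the regex above only shows the core pattern match.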

  17. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.;

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification....... Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated...

  18. High-throughput synthesis and analysis of acylated cyanohydrins.

    Science.gov (United States)

    Hamberg, Anders; Lundgren, Stina; Wingstrand, Erica; Moberg, Christina; Hult, Karl

    2007-01-01

    The yields and optical purities of products obtained from chiral Lewis acid/Lewis base-catalysed additions of alpha-ketonitriles to prochiral aldehydes could be accurately determined by an enzymatic method. The amount of remaining aldehyde was determined after its reduction to an alcohol, whilst the two product enantiomers were analysed after subsequent hydrolysis first by the (S)-selective Candida antarctica lipase B and then by the unselective pig liver esterase. The method could be used for analysis of products obtained from a number of aromatic aldehydes and aliphatic ketonitriles. Microreactor technology was successfully combined with high-throughput analysis for efficient catalyst optimization.

  19. Computational Proteomics: High-throughput Analysis for Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers the global profiling of proteins from a biological system. The HTP technological advances are fueling a revolution in biology, enabling analyses at the scales of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems level investigations are relying more and more on computational analyses, especially in the field of proteomics generating large-scale global data.

  20. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S r...... to the presence of filamentous microorganisms was monitored weekly over 4 months. Microthrix was identified as a causative filament and suitable control measures were introduced. The level of Microthrix was reduced after 1-2 months but a number of other filamentous species were still present, with most of them...

  1. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.
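For chromogenic assays of the kind reviewed here (e.g. nitrophenol release), the raw absorbance slope is converted to a reaction rate via the Beer-Lambert law. A minimal sketch, with an illustrative molar absorptivity for 4-nitrophenolate (the specific numbers are example assay conditions, not values from the paper):

```python
def rate_from_absorbance(dA_dt, molar_absorptivity, path_length_cm=1.0):
    """Convert an absorbance slope (1/s) to a product formation rate (M/s)."""
    # Beer-Lambert: A = epsilon * l * c, so dc/dt = (dA/dt) / (epsilon * l)
    return dA_dt / (molar_absorptivity * path_length_cm)

# e.g. 0.018 absorbance units per minute of 4-nitrophenolate release,
# with epsilon ~ 18000 1/(M*cm) at 405 nm and a 1 cm path length
v = rate_from_absorbance(0.018 / 60, 18000)  # rate in M/s
```

In plate-based screening the path length is set by the well fill volume rather than a cuvette, so it must be calibrated per assay.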

  2. Molecular Orbitals of NO, NO⁺, and NO⁻: A Computational Quantum Chemistry Experiment

    Science.gov (United States)

    Orenha, Renato P.; Galembeck, Sérgio E.

    2014-01-01

    This computational experiment presents qualitative molecular orbital (QMO) and computational quantum chemistry exercises of NO, NO⁺, and NO⁻. Initially students explore several properties of the target molecules by Lewis diagrams and the QMO theory. Then, they compare qualitative conclusions with EHT and DFT calculations…

  3. The Relationships between PCK Components: The Case of Quantum Chemistry Professors

    Science.gov (United States)

    Padilla, Kira; Van Driel, Jan

    2011-01-01

    The purpose of this paper is to capture the pedagogical content knowledge (PCK) of university professors about quantum chemistry. More specifically, we aimed to identify and analyze relationships between specific PCK components, using an adapted version of the model of PCK of Magnusson "et al.". A sample of university professors (n = 6)…

  4. Students' Levels of Explanations, Models, and Misconceptions in Basic Quantum Chemistry: A Phenomenographic Study

    Science.gov (United States)

    Stefani, Christina; Tsaparlis, Georgios

    2009-01-01

    We investigated students' knowledge constructions of basic quantum chemistry concepts, namely atomic orbitals, the Schrödinger equation, molecular orbitals, hybridization, and chemical bonding. Ausubel's theory of meaningful learning provided the theoretical framework and phenomenography the method of analysis. The semi-structured interview with…

  5. High-Throughput Particle Manipulation Based on Hydrodynamic Effects in Microchannels

    Directory of Open Access Journals (Sweden)

    Chao Liu

    2017-03-01

Microfluidic techniques are effective tools for precise manipulation of particles and cells, whose enrichment and separation is crucial for a wide range of applications in biology, medicine, and chemistry. Recently, lateral particle migration induced by the intrinsic hydrodynamic effects in microchannels, such as inertia and elasticity, has shown its promise for high-throughput and label-free particle manipulation. The particle migration can be engineered to realize the controllable focusing and separation of particles based on a difference in size. The widespread use of inertial and viscoelastic microfluidics depends on the understanding of hydrodynamic effects on particle motion. This review will summarize the progress in the fundamental mechanisms and key applications of inertial and viscoelastic particle manipulation.

  6. Perfect high throughput screening assay: a crucial technique for drug discovery

    Institute of Scientific and Technical Information of China (English)

    Guan-hua DU

    2005-01-01

Since being developed approximately 20 years ago, high throughput screening (HTS) has become one of the key techniques used in drug discovery[1]. However, three main problems are recognized with the use of HTS, namely with the compound library, drug targets, and assay methods. Until now, the compound library has evolved based on the techniques of combinatorial chemistry and modern phytochemistry. Several functional proteins have emerged following the advance of genomics and proteomics. However, although many functional proteins have been discovered recently, they are not, as sometimes claimed, real drug targets; at best, they might be potential drug targets. The ideal targets selected for drug screening should qualify as drug targets[2]. The selection of targets is therefore a crucial step in drug screening.

  7. Big Data Meets Quantum Chemistry Approximations: The Δ-Machine Learning Approach.

    Science.gov (United States)

    Ramakrishnan, Raghunathan; Dral, Pavlo O; Rupp, Matthias; von Lilienfeld, O Anatole

    2015-05-12

    Chemically accurate and comprehensive studies of the virtual space of all possible molecules are severely limited by the computational cost of quantum chemistry. We introduce a composite strategy that adds machine learning corrections to computationally inexpensive approximate legacy quantum methods. After training, highly accurate predictions of enthalpies, free energies, entropies, and electron correlation energies are possible, for significantly larger molecular sets than used for training. For thermochemical properties of up to 16k isomers of C7H10O2 we present numerical evidence that chemical accuracy can be reached. We also predict electron correlation energy in post Hartree-Fock methods, at the computational cost of Hartree-Fock, and we establish a qualitative relationship between molecular entropy and electron correlation. The transferability of our approach is demonstrated, using semiempirical quantum chemistry and machine learning models trained on 1 and 10% of 134k organic molecules, to reproduce enthalpies of all remaining molecules at density functional theory level of accuracy.
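The Δ-learning recipe described above (a cheap baseline method plus a machine-learned correction toward the expensive target method) can be sketched with synthetic data. The descriptors, weights, and the plain least-squares model below are illustrative stand-ins, not the paper's actual features or kernel-based learner:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # synthetic molecular descriptors
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
target = X @ w_true                            # stand-in "high-level" property
# cheap baseline: captures the trend but carries a systematic error
baseline = target + X @ np.array([0.3, 0.0, -0.1, 0.2, 0.0])

# learn the correction (target - baseline) on the first 100 "molecules"
delta = target - baseline
w_fit, *_ = np.linalg.lstsq(X[:100], delta[:100], rcond=None)

# delta-corrected prediction for the held-out half
pred = baseline[100:] + X[100:] @ w_fit
rmse = float(np.sqrt(np.mean((pred - target[100:]) ** 2)))
```

The point of the construction is that the model only has to learn the (smoother, smaller) difference between methods, which is why modest training sets can yield accurate predictions across much larger molecular sets.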

  8. Big Data meets Quantum Chemistry Approximations: The $\\Delta$-Machine Learning Approach

    CERN Document Server

    Ramakrishnan, Raghunathan; Rupp, Matthias; von Lilienfeld, O Anatole

    2015-01-01

    Chemically accurate and comprehensive studies of the virtual space of all possible molecules are severely limited by the computational cost of quantum chemistry. We introduce a composite strategy that adds machine learning corrections to computationally inexpensive approximate legacy quantum methods. After training, highly accurate predictions of enthalpies, free energies, entropies, and electron correlation energies are possible, for significantly larger molecular sets than used for training. For thermochemical properties of up to 16k constitutional isomers of C$_7$H$_{10}$O$_2$ we present numerical evidence that chemical accuracy can be reached. We also predict electron correlation energy in post Hartree-Fock methods, at the computational cost of Hartree-Fock, and we establish a qualitative relationship between molecular entropy and electron correlation. The transferability of our approach is demonstrated, using semi-empirical quantum chemistry and machine learning models trained on 1 and 10\\% of 134k organ...

  9. New high-throughput methods of investigating polymer electrolytes

    Science.gov (United States)

    Alcock, Hannah J.; White, Oliver C.; Jegelevicius, Grazvydas; Roberts, Matthew R.; Owen, John R.

    2011-03-01

    Polymer electrolyte films have been prepared by solution casting techniques from precursor solutions of a poly(vinylidene fluoride-co-hexafluoropropylene) (PVdF-HFP), lithium-bis(trifluoromethane) sulfonimide (LiTFSI), and propylene carbonate (PC). Arrays of graded composition were characterised by electrochemical impedance spectroscopy (EIS), differential scanning calorimetry (DSC) and X-ray diffraction (XRD) using high-throughput techniques. Impedance analysis showed the resistance of the films as a function of LiTFSI, PC and polymer content. The ternary plot of conductivity shows an area that combines solid-like mechanical stability with high conductivity, 1 × 10⁻⁵ S cm⁻¹ at the composition 0.55/0.15/0.30 PVdF-HFP/LiTFSI/PC by weight, increasing with PC content. In regions with less than a 50 wt% fraction of PVdF-HFP the films were too soft to give meaningful results by this method. The DSC measurements on solvent-free, salt-doped polymers show reduced crystallinity, and high-throughput XRD patterns show that non-polar crystalline phases are suppressed by the presence of LiTFSI and PC.
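Conductivities like the quoted value follow from the EIS bulk resistance via σ = t / (R·A). A minimal sketch with hypothetical film dimensions (the paper does not report cell geometry here):

```python
def conductivity(resistance_ohm, thickness_cm, area_cm2):
    """Ionic conductivity (S/cm) from an EIS bulk resistance measurement."""
    # sigma = t / (R * A): thinner films or larger electrodes lower R
    return thickness_cm / (resistance_ohm * area_cm2)

# e.g. a 100 um (0.01 cm) film on a 1 cm^2 electrode measuring 1000 ohm
sigma = conductivity(1000.0, 0.01, 1.0)  # 1e-5 S/cm
```

In a high-throughput array, the same formula is applied at each composition point to build the ternary conductivity plot.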

  10. Evaluation of a high throughput starch analysis optimised for wood.

    Directory of Open Access Journals (Sweden)

    Chandra Bellasio

Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  11. Evaluation of a high throughput starch analysis optimised for wood.

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  12. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863
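Enzymatic protocols of this kind (e.g. AOAC 996.11) digest starch to glucose, quantify the glucose, and back-calculate the starch. The conversion uses the standard 162/180 anhydroglucose factor, since each glucose unit in the polymer lacks one water; the function name below is ours:

```python
def starch_from_glucose(glucose_mg):
    """Back-calculate starch mass from measured glucose mass."""
    # anhydroglucose factor: 162 g/mol per polymer unit vs. 180 g/mol free glucose
    return glucose_mg * 162.0 / 180.0

starch_mg = starch_from_glucose(25.0)  # 22.5 mg starch
```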

  13. A high-throughput cidality screen for Mycobacterium tuberculosis.

    Directory of Open Access Journals (Sweden)

    Parvinder Kaur

Exposure to Mycobacterium tuberculosis (Mtb) aerosols is a major threat to tuberculosis (TB) researchers, even in bio-safety level-3 (BSL-3) facilities. Automation and high-throughput screens (HTS) in BSL-3 facilities are essential for minimizing manual aerosol-generating interventions and facilitating TB research. In the present study, we report the development and validation of a high-throughput, 24-well 'spot-assay' for selecting bactericidal compounds against Mtb. The bactericidal screen concept was first validated in the fast-growing surrogate Mycobacterium smegmatis (Msm) and subsequently confirmed in Mtb using the following reference anti-tubercular drugs: rifampicin, isoniazid, ofloxacin and ethambutol (RIOE), acting on different targets. The potential use of the spot-assay to select bactericidal compounds from a large library was confirmed by screening on Mtb, with parallel plating by the conventional gold standard method (correlation, r2 = 0.808). An automated spot-assay further enabled an MBC90 determination on resistant and sensitive Mtb clinical isolates. The implementation of the spot-assay in kinetic screens to enumerate residual Mtb after either genetic silencing (anti-sense RNA, AS-RNA) or chemical inhibition corroborated its ability to detect cidality. This relatively simple, economical and quantitative HTS considerably minimized the bio-hazard risk and enabled the selection of novel vulnerable Mtb targets and mycobactericidal compounds. Thus, spot-assays have great potential to impact the TB drug discovery process.

  14. High-throughput protein analysis integrating bioinformatics and experimental assays.

    Science.gov (United States)

    del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan

    2004-01-01

    The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis such as similarity searches, protein domain architecture determination and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISS-PROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL-Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administering relevant biological data from high-throughput investigations of cDNAs in order to systematically identify and characterize novel genes, as well as to comprehensively describe the function of the encoded proteins.

  15. New technologies for ultra-high throughput genotyping in plants.

    Science.gov (United States)

    Appleby, Nikki; Edwards, David; Batley, Jacqueline

    2009-01-01

    Molecular genetic markers represent one of the most powerful tools for the analysis of plant genomes and the association of heritable traits with underlying genetic variation. Molecular marker technology has developed rapidly over the last decade with the advent of high-throughput genotyping methods. Two forms of sequence-based marker, simple sequence repeats (SSRs), also known as microsatellites, and single nucleotide polymorphisms (SNPs), now predominate applications in modern plant genetic analysis, alongside anonymous marker systems such as amplified fragment length polymorphisms (AFLPs) and diversity array technology (DArT). The reducing cost of DNA sequencing and the increasing availability of large sequence data sets permit the mining of this data for large numbers of SSRs and SNPs. These may then be used in applications such as genetic linkage analysis and trait mapping, diversity analysis, association studies and marker-assisted selection. Here, we describe automated methods for the discovery of molecular markers and new technologies for high-throughput, low-cost molecular marker genotyping. Genotyping examples include multiplexing of SSRs using Multiplex-Ready marker technology (MRT); DArT genotyping; and SNP genotyping using the Invader assay, single base extension (SBE), the oligonucleotide ligation assay (OLA) SNPlex system, and Illumina GoldenGate and Infinium methods.

  16. High throughput instruments, methods, and informatics for systems biology.

    Energy Technology Data Exchange (ETDEWEB)

    Sinclair, Michael B.; Cowie, Jim R. (New Mexico State University, Las Cruces, NM); Van Benthem, Mark Hilary; Wylie, Brian Neil; Davidson, George S.; Haaland, David Michael; Timlin, Jerilyn Ann; Aragon, Anthony D. (University of New Mexico, Albuquerque, NM); Keenan, Michael Robert; Boyack, Kevin W.; Thomas, Edward Victor; Werner-Washburne, Margaret C. (University of New Mexico, Albuquerque, NM); Mosquera-Caro, Monica P. (University of New Mexico, Albuquerque, NM); Martinez, M. Juanita (University of New Mexico, Albuquerque, NM); Martin, Shawn Bryan; Willman, Cheryl L. (University of New Mexico, Albuquerque, NM)

    2003-12-01

    High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes, or, for instance, in responses to therapies. Microarrays represent one such high throughput method, which continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data to dramatically improve the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.

  17. High-throughput screening to enhance oncolytic virus immunotherapy

    Science.gov (United States)

    Allan, KJ; Stojdl, David F; Swift, SL

    2016-01-01

    High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. PMID:27579293

  18. Human transcriptome array for high-throughput clinical studies.

    Science.gov (United States)

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N; Schweitzer, Anthony C; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D; Moldawer, Lyle L; Maier, Ronald V; Tompkins, Ronald G; Wong, Wing Hung; Davis, Ronald W; Xiao, Wenzhong

    2011-03-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq to the sufficient depth for the discovery of transcriptome elements relevant to the disease process followed by high-throughput and reliable screening of these elements on thousands of patient samples using custom-designed arrays.

  19. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands and tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, or the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which warrants much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants, as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  20. High-throughput electrical characterization for robust overlay lithography control

    Science.gov (United States)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput and robust overlay measurement is a challenge in the current 14 nm and upcoming advanced nodes, with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where slight deviations in overlay have a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements, owing to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical-measurement-based technique in which resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by fitting a parabolic model to resistance, from which minima and inflection points are extracted to characterize overlay control and the process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation is found between overlay from electrical measurements and existing image- and diffraction-based techniques. We also discuss the challenges of integrating an electrical-measurement-based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for simultaneously accessing overlay as well as process windows and margins with a robust, high-throughput electrical measurement approach.
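
    The parabolic-fit overlay extraction described above can be sketched numerically: resistance is measured across a sweep of intentional misalignment offsets, a parabola is fitted, and its vertex recovers the overlay error. The offsets, resistance response, and overlay value below are invented for illustration, not taken from the record.

```python
import numpy as np

# Synthetic resistance measurements across intentional misalignment offsets (nm).
# Values are illustrative, not from the paper.
offsets = np.array([-30.0, -20.0, -10.0, 0.0, 10.0, 20.0, 30.0])
true_overlay = 5.0  # the process overlay error we want to recover
resistance = 100.0 + 0.02 * (offsets - true_overlay) ** 2  # parabolic response (ohms)

# Fit R(x) = a*x^2 + b*x + c; the vertex x0 = -b / (2a) estimates overlay.
a, b, c = np.polyfit(offsets, resistance, 2)
overlay_estimate = -b / (2.0 * a)

print(round(overlay_estimate, 2))  # recovers ~5.0 nm
```

    In the paper's scheme the fitted minimum characterizes overlay control, while inflection points of the measured response bound the process window.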

  1. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High-throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high-throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, and scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput.

  2. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires advances in both hardware and analytical tools. The historical perspective of flow cytometry operation is discussed, as well as how the field has changed and what the key changes have been. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  3. High throughput discovery of new fouling-resistant surfaces

    Science.gov (United States)

    Zhou, Mingyan; Liu, Hongwei; Venkiteshwaran, Adith; Kilduff, James; Anderson, Daniel G.; Langer, Robert; Belfort, Georges

    2017-01-01

    A novel high throughput method for synthesis and screening of customized protein-resistant surfaces was developed. This method is an inexpensive, fast, reproducible and scalable approach to synthesize and screen protein-resistant surfaces appropriate for a specific feed. The method is illustrated here by combining a high throughput platform (HTP) approach together with our patented photo-induced graft polymerization (PGP) method developed for facile modification of commercial poly(aryl sulfone) membranes. We demonstrate that the HTP–PGP approach to synthesize and screen fouling-resistant surfaces is general, and thus provides the capability to develop surfaces optimized for specific feeds. Surfaces were prepared via graft polymerization onto poly(ether sulfone) (PES) membranes and were evaluated using a protein adsorption assay followed by pressure-driven filtration. We have employed the HTP–PGP approach to confirm previously reported successful monomers and to develop new antifouling surfaces from a library of 66 monomers for four different challenges of interest to the biotechnology community: hen egg-white lysozyme, supernatant from Chinese Hamster Ovary (CHO) cells in phosphate buffered saline (PBS) solution as a model cell suspension, and immunoglobulin G (IgG) precipitated in the absence and presence of bovine serum albumin (BSA) in high salt solution as a model precipitation process.

  4. High throughput electrophysiology: new perspectives for ion channel drug discovery.

    Science.gov (United States)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter; Jensen, Bo Skaaning; Korsgaard, Mads P G; Christophersen, Palle

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone of current drug discovery is high throughput screening assays, which allow examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the sufficiently high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of the high throughput philosophy are emerging and are predicted to make a number of ion channel targets accessible for drug screening. Specifically, genuine HTS parallel processing techniques based on arrays of planar silicon chips are being developed, but lower throughput sequential techniques may also be of value in compound screening, lead optimization, and safety screening. The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  5. High-throughput nanoparticle catalysis: partial oxidation of propylene.

    Science.gov (United States)

    Duan, Shici; Kahn, Michael; Senkan, Selim

    2007-02-01

    Partial oxidation of propylene was investigated at 1 atm pressure over Rh/TiO(2) catalysts as a function of reaction temperature, metal loading and particle size using high-throughput methods. Catalysts were prepared by ablating thin sheets of pure rhodium metal using an excimer laser and by collecting the nanoparticles created on the external surfaces of TiO(2) pellets that were placed inside the ablation plume. Rh nanoparticles before the experiments were characterized by transmission electron microscopy (TEM) by collecting them on carbon film. Catalyst evaluations were performed using a high-throughput array channel microreactor system coupled to quadrupole mass spectrometry (MS) and gas chromatography (GC). The reaction conditions were 23% C(3)H(6), 20% O(2) and the balance helium in the feed, 20,000 h(-1) GHSV and a temperature range of 250-325 °C. The reaction products included primarily acetone (AT) and to a lesser degree propionaldehyde (PaL) as the C(3) products, together with deep oxidation products CO(x).

  6. Compression of structured high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Fabien Campagne

    Full Text Available Large biological datasets are being produced at a rapid pace and create substantial storage challenges, particularly in the domain of high-throughput sequencing (HTS. Most approaches currently used to store HTS data are either unable to quickly adapt to the requirements of new sequencing or analysis methods (because they do not support schema evolution, or fail to provide state of the art compression of the datasets. We have devised new approaches to store HTS data that support seamless data schema evolution and compress datasets substantially better than existing approaches. Building on these new approaches, we discuss and demonstrate how a multi-tier data organization can dramatically reduce the storage, computational and network burden of collecting, analyzing, and archiving large sequencing datasets. For instance, we show that spliced RNA-Seq alignments can be stored in less than 4% the size of a BAM file with perfect data fidelity. Compared to the previous compression state of the art, these methods reduce dataset size more than 40% when storing exome, gene expression or DNA methylation datasets. The approaches have been integrated in a comprehensive suite of software tools (http://goby.campagnelab.org that support common analyses for a range of high-throughput sequencing assays.

  7. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, with their own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed at high throughput (3500 fragments per day). Thirty hits were defined, which subsequently entered a secondary screening using an active-site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (K(D)) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore promises to be a valuable method for fragment screening.

  8. Discovery of novel targets with high throughput RNA interference screening.

    Science.gov (United States)

    Kassner, Paul D

    2008-03-01

    High throughput technologies have the potential to affect all aspects of drug discovery. Considerable attention is paid to high throughput screening (HTS) for small molecule lead compounds. The identification of the targets that enter those HTS campaigns had been driven by basic research until the advent of genomics-level data acquisition such as sequencing and gene expression microarrays. Large-scale profiling approaches (e.g., microarrays, protein analysis by mass spectrometry, and metabolite profiling) can yield vast quantities of data and important information. However, these approaches usually require painstaking in silico analysis and low-throughput basic wet-lab research to identify the function of a gene and validate the gene product as a potential therapeutic drug target. Functional genomic screening offers the promise of direct identification of genes involved in phenotypes of interest. In this review, RNA interference (RNAi)-mediated loss-of-function screens will be discussed, as well as their utility in target identification. Some of the genes identified in these screens should produce similar phenotypes if their gene products are antagonized with drugs. With a carefully chosen phenotype, an understanding of the biology of RNAi, and an appreciation of the limitations of RNAi screening, there is great potential for the discovery of new drug targets.

  9. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
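
    The core measurement behind foci quantitation, integrating fluorescence above local background inside a thresholded focus, can be sketched as below. This is a minimal illustration of the general idea, not the FociQuant implementation; the image, threshold, and counts are synthetic.

```python
import numpy as np

# Minimal sketch of focus-intensity quantitation (not the FociQuant code):
# integrate fluorescence above background inside a thresholded focus.
rng = np.random.default_rng(0)
image = rng.normal(100.0, 1.0, size=(64, 64))      # background ~100 counts
image[30:34, 30:34] += 500.0                       # a bright 4x4 focus

background = np.median(image)                      # robust background estimate
mask = image > background + 10.0                   # simple global threshold
focus_intensity = float((image[mask] - background).sum())

print(int(mask.sum()), round(focus_intensity))     # 16 pixels, ~8000 counts
```

    Real pipelines add spot detection and local (rather than global) background estimation, but the background-subtracted integrated intensity is the same surrogate for local protein concentration described above.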

  10. Benchmarking procedures for high-throughput context specific reconstruction algorithms

    Directory of Open Access Journals (Sweden)

    Maria ePires Pacheco

    2016-01-01

    Full Text Available Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context-specific reconstruction based on generic genome-scale models like ReconX (Duarte et al., 2007; Thiele et al., 2013) or HMR (Agren et al., 2013) has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context-specific reconstruction algorithms were published in the last ten years, only a fraction of them are suitable for model building based on human high-throughput data. Among other reasons, this might be due to problems arising from the limitation to only one metabolic target function or arbitrary thresholding. This review describes and analyses common validation methods used for testing model-building algorithms. Two major methods can be distinguished: consistency testing and comparison-based testing. The former includes methods like cross-validation or testing with artificial networks. The latter covers methods comparing sets of functionalities, comparison with existing networks, or comparison with additional databases. We test these methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms.

  11. Development and Operation of High-throughput Accurate-wavelength Lens-based Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Bell, Ronald E

    2014-07-01

    A high-throughput spectrometer for the 400-820 nm wavelength range has been developed for charge exchange recombination spectroscopy or general spectroscopy. A large 2160 mm^-1 grating is matched with fast f/1.8 200 mm lenses, which provide stigmatic imaging. A precision optical encoder measures the grating angle with an accuracy of < 0.075 arc seconds. A high-quantum-efficiency, low-etaloning CCD detector allows operation at longer wavelengths. A patch panel allows input fibers to interface with interchangeable fiber holders that attach to a kinematic mount behind the entrance slit. Computer-controlled hardware allows automated control of wavelength, timing, and f-number, as well as automated data collection and wavelength calibration.
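
    A quick sanity check on the geometry of such a grating can be done with the standard grating equation. The 500 nm test wavelength and Littrow (retro-reflecting) configuration below are assumptions for illustration, not details given in the record.

```python
import math

# First-order Littrow angle for a 2160 line/mm grating (illustrative check,
# not a value quoted in the record).
lines_per_mm = 2160.0
groove_spacing_nm = 1e6 / lines_per_mm          # ~462.96 nm groove spacing
wavelength_nm = 500.0

# Littrow condition: m * lambda = 2 * d * sin(theta), with order m = 1.
sin_theta = wavelength_nm / (2.0 * groove_spacing_nm)
theta_deg = math.degrees(math.asin(sin_theta))

print(round(theta_deg, 1))  # ~32.7 degrees
```

    The steep grating angle at such a high groove density is what buys the spectrometer its dispersion, which the fast f/1.8 optics then image without sacrificing throughput.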

  12. Dynamical mean-field theory for quantum chemistry.

    Science.gov (United States)

    Lin, Nan; Marianetti, C A; Millis, Andrew J; Reichman, David R

    2011-03-04

    The dynamical mean-field concept of approximating an unsolvable many-body problem in terms of the solution of an auxiliary quantum impurity problem, introduced to study bulk materials with a continuous energy spectrum, is here extended to molecules, i.e., finite systems with a discrete energy spectrum. The application to small clusters of hydrogen atoms yields ground state energies which are competitive with leading quantum chemical approaches at intermediate and large interatomic distances as well as good approximations to the excitation spectrum.

  13. High-throughput microcavitation bubble induced cellular mechanotransduction

    Science.gov (United States)

    Compton, Jonathan Lee

    inhibitor of IP3-induced Ca2+ release. This capability enables the development of a high-throughput screening platform for molecules that modulate cellular mechanotransduction. We have applied this approach to screen the effects of a small set of small molecules in a 96-well plate in less than an hour. These detailed studies offer a basis for the design, development, and implementation of a novel assay to rapidly screen the effects of small molecules on cellular mechanotransduction at high throughput.

  14. Test of quantum chemistry in vibrationally-hot hydrogen molecules

    CERN Document Server

    Niu, M L; Ubachs, W

    2015-01-01

    Precision measurements are performed on highly excited vibrational quantum states of molecular hydrogen. The $v=12, J=0-3$ rovibrational levels of H$_2$ ($X^1\\Sigma_g^+$), lying only $2000$ cm$^{-1}$ below the first dissociation limit, were populated by photodissociation of H$_2$S and their level energies were accurately determined by two-photon Doppler-free spectroscopy. A comparison between the experimental results on $v=12$ level energies with the best \\textit{ab initio} calculations shows good agreement, where the present experimental accuracy of $3.5 \\times10^{-3}$ cm$^{-1}$ is more precise than theory, hence providing a gateway to further test theoretical advances in this benchmark quantum system.

  15. Photodissociation of a diatomic molecule in the quantum regime reveals ultracold chemistry

    CERN Document Server

    McDonald, M; Apfelbeck, F; Lee, C -H; Majewska, I; Moszynski, R; Zelevinsky, T

    2015-01-01

    Chemical reactions at temperatures near absolute zero require a full quantum description of the reaction pathways and enable enhanced control of the products via quantum state selection. Ultracold molecule experiments have provided initial insight into the quantum nature of basic chemical processes involving diatomic molecules, for example from studies of bimolecular reactions, but complete control over the reactants and products has remained elusive. The "half-collision" process of photodissociation is an indispensable tool in molecular physics and offers significantly more control than the reverse process of photoassociation. Here we reach a fully quantum regime with photodissociation of ultracold $^{88}$Sr$_2$ molecules where the initial bound state of the molecule and the target continuum state of the fragments are strictly controlled. Detection of the photodissociation products via optical absorption imaging reveals the hallmarks of ultracold chemistry: resonant and nonresonant barrier tunneling, importa...

  16. High-Throughput Tools for Characterization of Antibody Epitopes

    DEFF Research Database (Denmark)

    Christiansen, Anders

    , it is important to characterize antibodies thoroughly. In parallel to the characterization of antibodies, it is also important to characterize the binding area that is recognized by the antibody, known as an epitope. With the development of new technologies, such as high-throughput sequencing (HTS) … In this study, these improvements were utilized to characterize epitopes at high resolution, i.e. to determine the importance of each residue for antibody binding, for all major peanut allergens. Epitope reactivity among patients often converged on known epitope hotspots; however, the binding patterns were somewhat … multiple years. Taken together, the presented studies demonstrated new applications for the investigated techniques, focusing on their utilization in epitope mapping. In the process, new insights were obtained into how antibodies recognize their targets in a major disease, i.e. food allergy.

  17. Single-platelet nanomechanics measured by high-throughput cytometry

    Science.gov (United States)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2016-10-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  18. Microfluidic cell chips for high-throughput drug screening.

    Science.gov (United States)

    Chi, Chun-Wei; Ahmed, Ah Rezwanuddin; Dereli-Korkut, Zeynep; Wang, Sihong

    2016-05-01

    The current state of screening methods for drug discovery is still riddled with several inefficiencies. Although some widely used high-throughput screening platforms may enhance the drug screening process, their cost and oversimplification of cell-drug interactions pose a translational difficulty. Microfluidic cell-chips resolve many issues found in conventional HTS technology, providing benefits such as reduced sample quantity and integration of 3D cell culture physically more representative of the physiological/pathological microenvironment. In this review, we introduce the advantages of microfluidic devices in drug screening, and outline the critical factors which influence device design, highlighting recent innovations and advances in the field including a summary of commercialization efforts on microfluidic cell chips. Future perspectives of microfluidic cell devices are also provided based on considerations of present technological limitations and translational barriers.

  19. High-throughput Identification of Phage-derived Imaging Agents

    Directory of Open Access Journals (Sweden)

    Kimberly A. Kelly

    2006-01-01

    Full Text Available The use of phage-displayed peptide libraries is a powerful method for selecting peptides with desired binding properties. However, the validation and prioritization of “hits” obtained from this screening approach remains challenging. Here, we describe the development and testing of a new analysis method to identify and display hits from phage-display experiments and high-throughput enzyme-linked immunosorbent assay screens. We test the method using a phage screen against activated macrophages to develop imaging agents with higher specificity for active disease processes. The new methodology should be useful in identifying phage hits and is extendable to other library screening methods such as small-molecule and nanoparticle libraries.

  20. High-throughput ab-initio dilute solute diffusion database

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
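
    The weighted RMS error metric quoted above can be sketched as follows. The barrier values and weights are invented for illustration (the database itself reports 0.176 eV over its full set of solute-host pairs).

```python
import math

# Weighted RMS error between calculated and experimental activation barriers (eV).
# The numbers below are invented for illustration; the record reports 0.176 eV
# over its full database.
calc = [1.20, 0.85, 2.10, 1.55]
expt = [1.10, 0.90, 2.30, 1.50]
weights = [1.0, 2.0, 1.0, 2.0]   # e.g. weighting by data quality

num = sum(w * (c - e) ** 2 for c, e, w in zip(calc, expt, weights))
rms = math.sqrt(num / sum(weights))
print(round(rms, 3))
```

    Weighting lets better-characterized experimental diffusivities count for more when validating the DFT barriers against measurement.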

  1. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    Science.gov (United States)

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard Henry

    2017-08-18

    A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparison against high-level theoretical calculations shows that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will be improved by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.
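
    The canonical transition state theory step in such a pipeline reduces, in its simplest Eyring form, to a one-line rate expression. The 100 kJ/mol barrier below is a hypothetical example, and the partition-function ratio is folded into a free energy of activation; AutoTST itself computes these quantities from the located saddle point.

```python
import math

# Canonical transition state theory (Eyring) rate for a hypothetical barrier.
kB = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(delta_g_activation_j_mol, temperature_k):
    """k(T) = (kB*T/h) * exp(-dG_act/(R*T)), first-order rate in 1/s."""
    return (kB * temperature_k / h) * math.exp(
        -delta_g_activation_j_mol / (R * temperature_k)
    )

# A 100 kJ/mol free-energy barrier at 298.15 K gives a slow unimolecular rate.
k = tst_rate(100e3, 298.15)
print(f"{k:.3e}")
```

    Automating the hard part, finding the saddle-point geometry that fixes the activation energy and partition functions, is what makes high-throughput evaluation of this formula possible.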

  2. UAV-based high-throughput phenotyping in legume crops

    Science.gov (United States)

    Sankaran, Sindhuja; Khot, Lav R.; Quirós, Juan; Vandemark, George J.; McGee, Rebecca J.

    2016-05-01

    In plant breeding, one of the biggest obstacles to genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. Therefore, the major objective of this research was to evaluate the feasibility of utilizing high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices are strongly correlated (p<0.05) with the seed yield of legume crops. The results endorse the potential of UAV-based sensing technology to rapidly measure these phenotyping traits.
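
    A vegetation-index-to-yield correlation of the kind reported above can be sketched with NDVI, the most common index derived from multispectral red and near-infrared bands. The reflectance and yield values below are invented for illustration; the record does not state which indices were used.

```python
import numpy as np

# NDVI from multispectral red / near-infrared reflectance and its correlation
# with plot yield. Reflectance and yield values are invented for illustration.
red = np.array([0.10, 0.08, 0.12, 0.06, 0.09])
nir = np.array([0.50, 0.55, 0.45, 0.60, 0.52])
yield_kg = np.array([2.1, 2.6, 1.8, 3.0, 2.4])

ndvi = (nir - red) / (nir + red)                # normalized difference index
r = np.corrcoef(ndvi, yield_kg)[0, 1]          # Pearson correlation with yield

print(np.round(ndvi, 3), round(r, 3))
```

    In a breeding trial, each plot's mean NDVI from the UAV imagery would stand in for the per-plot reflectances here, letting thousands of plots be ranked without destructive sampling.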

  3. High-throughput profiling in the hematopoietic system.

    Science.gov (United States)

    Fabbri, Muller; Spizzo, Riccardo; Calin, George A

    2010-01-01

    The expression profile of microRNAs varies significantly between physiological and pathological conditions. Increasing evidence from the literature shows that abnormalities of the miRNome (defined as the full spectrum of miRNAs expressed in a genome) occur in almost all human diseases and have important pathogenetic, prognostic, and therapeutic implications. The study of these aberrancies of the miRNome has become possible through the development of high-throughput profiling techniques that allow the simultaneous detection of differences in miRNA expression between normal and pathologic tissues, or simply between tissues at different stages of differentiation. These techniques provide the basis for further investigations focused on the miRNAs that are most frequently and widely differentially expressed under the different investigated conditions.

  4. Interactive Visual Analysis of High Throughput Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Potok, Thomas E [ORNL; Patton, Robert M [ORNL; Goodall, John R [ORNL; Maness, Christopher S [ORNL; Senter, James K [ORNL

    2012-01-01

    The scale, velocity, and dynamic nature of large-scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large-scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest, which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  5. High-Throughput Screening Using Fourier-Transform Infrared Imaging

    Directory of Open Access Journals (Sweden)

    Erdem Sasmaz

    2015-06-01

    Efficient parallel screening of combinatorial libraries is one of the most challenging aspects of the high-throughput (HT) heterogeneous catalysis workflow. Today, a number of methods have been used in HT catalyst studies, including various optical, mass-spectrometry, and gas-chromatography techniques. Of these, rapid-scanning Fourier-transform infrared (FTIR) imaging is one of the fastest and most versatile screening techniques. Here, the new design of the 16-channel HT reactor is presented and test results for its accuracy and reproducibility are shown. The performance of the system was evaluated through the oxidation of CO over commercial Pd/Al2O3 and cobalt oxide nanoparticles synthesized with different reducer-reductant molar ratios, surfactant types, metal and surfactant concentrations, synthesis temperatures, and ramp rates.

  6. Field high-throughput phenotyping: the new crop breeding frontier.

    Science.gov (United States)

    Araus, José Luis; Cairns, Jill E

    2014-01-01

    Constraints in field phenotyping capability limit our ability to dissect the genetics of quantitative traits, particularly those related to yield and stress tolerance (e.g., yield potential as well as increased drought and heat tolerance and nutrient efficiency). The development of effective field-based high-throughput phenotyping platforms (HTPPs) remains a bottleneck for future breeding advances. However, progress in sensors, aeronautics, and high-performance computing is paving the way. Here, we review recent advances in field HTPPs, which should combine, at an affordable cost, a high capacity for data recording, scoring, and processing with non-invasive remote sensing methods and automated environmental data collection. Laboratory analyses of key plant parts may complement direct phenotyping under field conditions. Improvements in user-friendly data management, together with more powerful interpretation of results, should increase the use of field HTPPs, thereby increasing the efficiency of crop genetic improvement to meet the needs of future generations.

  7. High-throughput antibody development and retrospective epitope mapping

    DEFF Research Database (Denmark)

    Rydahl, Maja Gro

    Plant cell walls are composed of an interlinked network of polysaccharides, glycoproteins and phenolic polymers. When addressing the diverse polysaccharides in green plants, including land plants and the ancestral green algae, there are significant overlaps in the cell wall structures. Yet, there are noteworthy differences in the less evolved species of algae as compared to land plants. The dynamic process orchestrating the deposition of these biopolymers both in algae and higher plants is complex and highly heterogeneous, yet immensely important for the development and differentiation of the cell … of green algae, during the development into land plants. Hence, there is a pressing need for rethinking the glycomic toolbox, by developing new and high-throughput (HTP) technology, in order to acquire information on the location and relative abundance of diverse cell wall polymers. In this dissertation …

  8. High-throughput sequencing in veterinary infection biology and diagnostics.

    Science.gov (United States)

    Belák, S; Karlsson, O E; Leijon, M; Granberg, F

    2013-12-01

    Sequencing methods have improved rapidly since the first versions of the Sanger techniques, facilitating the development of very powerful tools for detecting and identifying various pathogens, such as viruses, bacteria and other microbes. The ongoing development of high-throughput sequencing (HTS; also known as next-generation sequencing) technologies has resulted in a dramatic reduction in DNA sequencing costs, making the technology more accessible to the average laboratory. In this White Paper of the World Organisation for Animal Health (OIE) Collaborating Centre for the Biotechnology-based Diagnosis of Infectious Diseases in Veterinary Medicine (Uppsala, Sweden), several approaches and examples of HTS are summarised, and their diagnostic applicability is briefly discussed. Selected future aspects of HTS are outlined, including the need for bioinformatic resources, with a focus on improving the diagnosis and control of infectious diseases in veterinary medicine.

  9. Applications of High-Throughput Nucleotide Sequencing (PhD)

    DEFF Research Database (Denmark)

    Waage, Johannes

    The recent advent of high throughput sequencing of nucleic acids (RNA and DNA) has vastly expanded research into the functional and structural biology of the genome of all living organisms (and even a few dead ones). With this enormous and exponential growth in biological data generation come … With …-sequencing, a study of the effects on alternative RNA splicing of KO of the nonsense-mediated RNA decay system in Mus, using digital gene expression and a custom-built exon-exon junction mapping pipeline, is presented (article I). Evolved from this work, a Bioconductor package, spliceR, for classifying alternative splicing events and the coding potential of isoforms from full isoform deconvolution software, such as Cufflinks, is presented (article II). Finally, a study using 5'-end RNA-seq for alternative promoter detection between healthy patients and patients with acute promyelocytic leukemia is presented (article III).

  10. High throughput sequencing reveals a novel fabavirus infecting sweet cherry.

    Science.gov (United States)

    Villamor, D E V; Pillai, S S; Eastwell, K C

    2017-03-01

    The genus Fabavirus currently consists of five species represented by viruses that infect a wide range of hosts, but none has been reported from temperate-climate fruit trees. A virus with genomic features resembling fabaviruses (tentatively named Prunus virus F, PrVF) was revealed by high throughput sequencing of extracts from a sweet cherry tree (Prunus avium). PrVF was subsequently shown to be graft transmissible and was further identified in three other non-symptomatic Prunus spp. from different geographical locations. Two genetic variants of RNA1 and RNA2 coexisted in the same samples. RNA1 consisted of 6,165 and 6,163 nucleotides, and RNA2 consisted of 3,622 and 3,468 nucleotides.

  11. Numerical techniques for high-throughput reflectance interference biosensing

    Science.gov (United States)

    Sevenler, Derin; Ünlü, M. Selim

    2016-06-01

    We have developed a robust and rapid computational method for processing the raw spectral data collected from thin-film optical interference biosensors. Applied to Interference Reflectance Imaging Sensor (IRIS) measurements, the method yields a 10,000-fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. An additional benefit is that the LUT method can be used with a wider range of interference layer thicknesses and experimental configurations, including some that are incompatible with methods that require fitting the spectral response.
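
The LUT idea can be sketched as follows: precompute model spectra once over a grid of candidate film thicknesses, then reduce each measurement to a nearest-neighbour search instead of an iterative spectral fit. The two-beam interference reflectance model below is an illustrative toy, not the IRIS processing code:

```python
import numpy as np

def build_lut(wavelengths, thicknesses, n_film=1.46):
    """Tabulate a simplified reflectance model R(lambda; d) for each candidate
    film thickness d (toy two-beam interference, for illustration only)."""
    # Phase from the optical path difference, for every (thickness, wavelength) pair
    phase = 4 * np.pi * n_film * thicknesses[:, None] / wavelengths[None, :]
    return 0.5 * (1 + np.cos(phase))  # rows: thicknesses, cols: wavelengths

def lookup_thickness(spectrum, lut, thicknesses):
    """Return the thickness whose tabulated spectrum is closest (L2 distance)
    to the measured spectrum -- a single argmin instead of an iterative fit."""
    err = np.sum((lut - spectrum[None, :]) ** 2, axis=1)
    return thicknesses[np.argmin(err)]

wavelengths = np.linspace(450e-9, 750e-9, 200)   # meters
thicknesses = np.linspace(90e-9, 110e-9, 2001)   # candidate film thicknesses
lut = build_lut(wavelengths, thicknesses)

# "Measure" a spectrum from a known 100 nm film and recover its thickness
measured = build_lut(wavelengths, np.array([100e-9]))[0]
estimate = lookup_thickness(measured, lut, thicknesses)
```

The speedup comes from amortization: the table is built once, after which each of millions of pixels costs only one vectorized distance computation.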

  12. EDITORIAL: Combinatorial and High-Throughput Materials Research

    Science.gov (United States)

    Potyrailo, Radislav A.; Takeuchi, Ichiro

    2005-01-01

    The success of combinatorial and high-throughput methodologies relies greatly on the availability of various characterization tools with new and improved capabilities [1]. Indeed, how useful can a combinatorial library of 250, 400, 25 000 or 2 000 000 compounds be [2-5] if one is unable to characterize its properties of interest fairly quickly? How useful can a set of thousands of spectra or chromatograms be if one is unable to analyse them in a timely manner? For these reasons, the development of new approaches for materials characterization is one of the most active areas in combinatorial materials science. The importance of this aspect of research in the field has been discussed in numerous conferences including the Pittsburgh Conferences, the American Chemical Society Meetings, the American Physical Society Meetings, the Materials Research Society Symposia and various Gordon Research Conferences. Naturally, the development of new measurement instrumentation attracts the attention not only of practitioners of combinatorial materials science but also of those who design new software for data manipulation and mining. Experimental designs of combinatorial libraries are pursued with available and realistic synthetic and characterization capabilities in mind. It is becoming increasingly critical to link the design of new equipment for high-throughput parallel materials synthesis with integrated measurement tools in order to enhance the efficacy of the overall experimental strategy. We have received an overwhelming response to our proposal and call for papers for this Special Issue on Combinatorial Materials Science. The papers in this issue of Measurement Science and Technology are a very timely collection that captures the state of modern combinatorial materials science. They demonstrate the significant advances that are taking place in the field. In some cases, characterization tools are now being operated in the factory mode. At the same time, major challenges

  13. The Principals and Practice of Distributed High Throughput Computing

    CERN Document Server

    CERN. Geneva

    2016-01-01

    The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity was recognized and formalized in the 1970s. For more than three decades these principles of Distributed Computing have guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor is based on novel distributed computing technologies and is driven by the evolving needs of High Throughput scientific applications. We will review the principles that underpin our work, the distributed computing frameworks and technologies we developed, and the lessons we learned from delivering effective and dependable software tools in an ever-changing landscape of computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds. About the speaker: Miron Livny received a B.Sc. degree in Physics and Mat...

  14. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. … characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of the high-throughput philosophy are emerging and are predicted to make a number of ion … The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  15. A robust robotic high-throughput antibody purification platform.

    Science.gov (United States)

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market, with annual sales of more than US$40 billion in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns® from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO® from Tecan) has been a successful approach to establishing high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at sub-milligram level, with yields of up to 2 mg from 4 mL of cell-free culture supernatant.

  16. High-Throughput Genomics Enhances Tomato Breeding Efficiency

    Science.gov (United States)

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-01-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper, and eggplant, since it exhibits a reduced genome size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are currently available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assists the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers highly linked to resistance genes, as well as cloned resistance genes, are available and could be used for high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are exploited for simultaneously tracking favourable allele combinations in breeding programs using high-throughput genomic technologies. This aims at pyramiding, in the genetic background of commercial cultivars, alleles that increase their performance. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes, even for complex traits. PMID:19721805

  17. A high-throughput chemically induced inflammation assay in zebrafish

    Directory of Open Access Journals (Sweden)

    Liebel Urban

    2010-12-01

    Background: Studies on innate immunity have benefited from the introduction of zebrafish as a model system. Transgenic fish expressing fluorescent proteins in leukocyte populations allow direct, quantitative visualization of an inflammatory response in vivo. It has been proposed that this animal model can be used for high-throughput screens aimed at the identification of novel immunomodulatory lead compounds. However, current assays require invasive manipulation of fish individually, thus preventing high-content screening. Results: Here we show that specific, noninvasive damage to lateral line neuromast cells can induce a robust acute inflammatory response. Exposure of fish larvae to sublethal concentrations of copper sulfate selectively damages the sensory hair cell population, inducing infiltration of leukocytes to neuromasts within 20 minutes. Inflammation can be assayed in real time using transgenic fish expressing fluorescent proteins in leukocytes or by histochemical assays in fixed larvae. We demonstrate the usefulness of this method for chemical and genetic screens to detect the effect of immunomodulatory compounds and mutations affecting the leukocyte response. Moreover, we transformed the assay into a high-throughput screening method by using a customized automated imaging and processing system that quantifies the magnitude of the inflammatory reaction. Conclusions: This approach allows rapid screening of thousands of compounds or mutagenized zebrafish for effects on inflammation and enables the identification of novel players in the regulation of innate immunity and potential lead compounds toward new immunomodulatory therapies. We have called this method the chemically induced inflammation assay, or ChIn assay. See Commentary article: http://www.biomedcentral.com/1741-7007/8/148.

  18. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
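
The batch-processing pattern described above — many independent evaluations fanned out over workers — can be sketched in miniature with Python's standard library. Here a local thread pool stands in for a cluster scheduler, and `evaluate_trait` is a made-up placeholder for one computationally heavy job:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_trait(trait_id: int) -> tuple:
    """Stand-in for one heavy genomic evaluation job (placeholder workload)."""
    score = sum(i * i for i in range(10_000)) % (trait_id + 7)
    return trait_id, score

def run_batch(trait_ids, workers=4):
    """Fan independent jobs out across a worker pool; a cluster scheduler such
    as HTCondor applies the same pattern at much larger scale."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(evaluate_trait, trait_ids))

results = run_batch(range(8))  # one result per trait, computed concurrently
```

Because the jobs share nothing, throughput scales with the number of workers until the cluster (or, here, the CPU) saturates.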

  19. Interpretation of mass spectrometry data for high-throughput proteomics.

    Science.gov (United States)

    Chamrad, Daniel C; Koerting, Gerhard; Gobom, Johan; Thiele, Herbert; Klose, Joachim; Meyer, Helmut E; Blueggel, Martin

    2003-08-01

    Recent developments in proteomics have revealed a bottleneck in bioinformatics: high-quality interpretation of acquired MS data. The ability to generate thousands of MS spectra per day, and the demand for this, makes manual methods inadequate for analysis and underlines the need to transfer the advanced capabilities of an expert human user into sophisticated MS interpretation algorithms. The identification rate in current high-throughput proteomics studies is not only a matter of instrumentation. We present software for high-throughput PMF identification, which enables robust and confident protein identification at higher rates. This has been achieved by automated calibration, peak rejection, and use of a meta search approach which employs various PMF search engines. The automatic calibration consists of a dynamic, spectral information-dependent algorithm, which combines various known calibration methods and iteratively establishes an optimised calibration. The peak rejection algorithm filters signals that are unrelated to the analysed protein by use of automatically generated and dataset-dependent exclusion lists. In the "meta search" several known PMF search engines are triggered and their results are merged by use of a meta score. The significance of the meta score was assessed by simulation of PMF identification with 10,000 artificial spectra resembling a data situation close to the measured dataset. By means of this simulation the meta score is linked to expectation values as a statistical measure. The presented software is part of the proteome database ProteinScape which links the information derived from MS data to other relevant proteomics data. We demonstrate the performance of the presented system with MS data from 1891 PMF spectra. As a result of automatic calibration and peak rejection the identification rate increased from 6% to 44%.
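
The merging step of such a meta search can be illustrated with a toy meta score. The scheme below (min-max normalization of each engine's scores, then averaging per protein) is an illustrative stand-in, not the ProteinScape formula:

```python
def meta_score(engine_scores: dict) -> dict:
    """Merge per-protein scores from several search engines: normalize each
    engine's scores to [0, 1], then average across engines per protein."""
    merged = {}
    for engine, scores in engine_scores.items():
        lo, hi = min(scores.values()), max(scores.values())
        span = (hi - lo) or 1.0  # guard against a degenerate engine
        for protein, s in scores.items():
            merged.setdefault(protein, []).append((s - lo) / span)
    return {p: sum(v) / len(v) for p, v in merged.items()}

# Two hypothetical engines scoring three candidate proteins on different scales
scores = meta_score({
    "engineA": {"P1": 120.0, "P2": 40.0, "P3": 80.0},
    "engineB": {"P1": 0.90, "P2": 0.10, "P3": 0.30},
})
best = max(scores, key=scores.get)
```

Normalization matters because raw scores from different engines are not comparable; the published approach goes further and calibrates the merged score against expectation values via simulation.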

  20. High throughput phenotyping for aphid resistance in large plant collections

    Directory of Open Access Journals (Sweden)

    Chen Xi

    2012-08-01

    Background: Phloem-feeding insects are among the most devastating pests worldwide. They not only cause damage by feeding from the phloem, thereby depleting the plant of photo-assimilates, but also by vectoring viruses. Until now, the main way to prevent such problems has been the frequent use of insecticides. Applying resistant varieties would be a more environmentally friendly and sustainable solution. For this, resistant sources need to be identified first. Up to now there were no methods suitable for high throughput phenotyping of plant germplasm to identify sources of resistance towards phloem-feeding insects. Results: In this paper we present a high throughput screening system to identify plants with increased resistance against aphids. Its versatility is demonstrated using an Arabidopsis thaliana activation tag mutant line collection. The system consists of the green peach aphid Myzus persicae (Sulzer) and the circulative virus Turnip yellows virus (TuYV). In an initial screening, with one plant representing one mutant line, 13 virus-free mutant lines were identified by ELISA. Using seeds produced from these lines, the putative candidates were re-evaluated and characterized, resulting in nine lines with increased resistance towards the aphid. Conclusions: This M. persicae-TuYV screening system is an efficient, reliable and quick procedure to identify, among thousands of mutated lines, those resistant to aphids. In our study, nine mutant lines with increased resistance against the aphid were selected from 5160 mutant lines in just 5 months by one person. The system can be extended to other phloem-feeding insects and circulative viruses to identify insect-resistant sources from several collections, including for example genebanks and artificially prepared mutant collections.

  1. Surrogate-assisted feature extraction for high-throughput phenotyping.

    Science.gov (United States)

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision and natural language processing counts, acting as noisy surrogates to the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature extraction for phenotyping procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were remarkably higher than those from the other two, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the needed number of gold-standard labels. SAFE also potentially identifies important features missed by automated feature extraction for phenotyping or experts.
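
The surrogate idea can be sketched as follows: threshold noisy surrogate counts (e.g., target ICD codes) into silver-standard labels, then keep the features most associated with them. This is a simplified illustration on synthetic data, not the published SAFE procedure (which uses a more sophisticated selection criterion):

```python
import numpy as np

def silver_labels(surrogate_counts, threshold=3):
    """Noisy silver-standard labels: patients with many target codes are
    presumed cases (no gold-standard chart review needed)."""
    return (surrogate_counts >= threshold).astype(float)

def select_features(X, y_silver, top_k=2):
    """Keep the top_k features with the largest |correlation| to the silver labels."""
    Xc = X - X.mean(axis=0)
    yc = y_silver - y_silver.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()) + 1e-12)
    return np.argsort(-np.abs(r))[:top_k]

rng = np.random.default_rng(0)
n = 200
truth = rng.integers(0, 2, n)                          # unobserved true phenotype
icd = truth * rng.poisson(5, n) + rng.poisson(0.3, n)  # noisy surrogate counts
X = np.column_stack([
    truth + rng.normal(0, 0.5, n),   # informative feature 0
    rng.normal(0, 1, n),             # pure-noise feature 1
    truth + rng.normal(0, 0.5, n),   # informative feature 2
])
kept = select_features(X, silver_labels(icd))
```

Even though the silver labels disagree with the truth on a fraction of patients, the informative features still dominate the ranking, which is the premise that lets selection proceed without gold-standard labels.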

  2. A Química Quântica na compreensão de teorias de Química Orgânica The Quantum Chemistry in the understanding of theories of Organic Chemistry

    Directory of Open Access Journals (Sweden)

    Régis Casimiro Leal

    2010-01-01

    Quantum chemical calculations were performed in order to obtain molecular properties such as electronic density, dipole moment, atomic charges, and bond lengths, which were compared to qualitative results based on the theories of organic chemistry. Computational quantum chemistry can be a useful tool to support the main theories of organic chemistry.

  3. Structure and Quantum Chemistry Study on Hexaacetyl D-Mannose Hydrazine

    Institute of Scientific and Technical Information of China (English)

    CAO Rui; LIU Xiao-Hong; CHENG Chang-Mei; WANG Ru-Ji; ZHAO Yu-Fen; WANG Lai-Xi

    2007-01-01

    Hexaacetyl D-mannose hydrazine is an important type of intermediate in saccharide chemistry. In this paper, its single crystal was obtained, and X-ray diffraction and quantum chemistry calculations were performed. It belongs to the orthorhombic system, space group P212121, with a=16.267(3), b=19.263(3), c=7.1948(12) Å, Mr=446.41, Dc=1.315 g/cm³, V=2254.5(6) Å³ and Z=4. Meanwhile, the experimental results also provide information for designing a kind of molecular switch based on the mannose nitrogenous derivatives.

  4. Implementation of replica-exchange umbrella sampling in the DFTB+ semiempirical quantum chemistry package

    Science.gov (United States)

    Ito, Shingo; Irle, Stephan; Okamoto, Yuko

    2016-07-01

    The replica-exchange umbrella sampling (REUS) method combines the replica-exchange and umbrella sampling methods and allows larger conformational sampling than conventional simulation methods. This method has been used in many studies to understand docking mechanisms and the functions of molecules. However, REUS had not previously been combined with quantum chemical codes. Therefore, we implemented the REUS simulation technique in the DFTB+ quantum chemistry code, which utilizes approximate density functional theory. We performed REUS simulations of an intramolecular proton transfer reaction of malonaldehyde and the formation of a phthalocyanine from four phthalonitriles and one iron atom to validate the reliability of our implemented REUS-DFTB+ combination.
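
The exchange step of REUS can be illustrated independently of the quantum chemistry backend: two umbrella windows at the same temperature swap configurations according to a Metropolis criterion on the bias energies. The one-dimensional sketch below is illustrative (not DFTB+ internals); the force constant and window centers are arbitrary:

```python
import math
import random

def bias(x, center, k=50.0):
    """Harmonic umbrella restraint V(x) = (k/2)(x - center)^2."""
    return 0.5 * k * (x - center) ** 2

def accept_exchange(x_i, c_i, x_j, c_j, beta=1.0, k=50.0):
    """Metropolis criterion for swapping configurations x_i, x_j between two
    umbrella windows centered at c_i, c_j (same temperature in both replicas)."""
    delta = beta * ((bias(x_j, c_i, k) + bias(x_i, c_j, k))
                    - (bias(x_i, c_i, k) + bias(x_j, c_j, k)))
    return delta <= 0 or random.random() < math.exp(-delta)

random.seed(1)
# Configurations near their own window centers readily swap with a close
# neighboring window, which keeps the replica ladder mixing.
near = accept_exchange(0.50, 0.5, 0.52, 0.55)
```

In a production run this test is applied periodically between neighboring windows along the reaction coordinate, and the accepted swaps let each replica diffuse across the whole umbrella ladder.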

  5. Mass spectrometric techniques for label-free high-throughput screening in drug discovery.

    Science.gov (United States)

    Roddy, Thomas P; Horvath, Christopher R; Stout, Steven J; Kenney, Kristin L; Ho, Pei-I; Zhang, Ji-Hu; Vickers, Chad; Kaushik, Virendar; Hubbard, Brian; Wang, Y Karen

    2007-11-01

    High-throughput screening (HTS) is an important tool for finding active compounds to initiate medicinal chemistry programs in pharmaceutical discovery research. Traditional HTS methods rely on fluorescent or radiolabeled reagents and/or coupling assays to permit quantitation of enzymatic target inhibition or activation. Mass spectrometry-based high-throughput screening (MS-HTS) is an alternative that is not susceptible to the limitations imposed by labeling and coupling enzymes. MS-HTS offers a selective and sensitive analytical method for unlabeled substrates and products. Furthermore, method development times are reduced without the need to incorporate labels or coupling assays. MS-HTS also permits screening of targets that are difficult or impossible to screen by other techniques. For example, enzymes that are challenging to purify can lead to the nonspecific detection of structurally similar components of the impure enzyme or matrix of membraneous enzymes. The high selectivity of tandem mass spectrometry (MS/MS) enables these screens to proceed with low levels of background noise and to sensitively discover interesting hits even with relatively weak activity. In this article, we describe three techniques that we have adapted for large-scale (approximately 175,000-sample) compound library screening, including four-way parallel multiplexed electrospray liquid chromatography tandem mass spectrometry (MUX-LC/MS/MS), four-way parallel staggered gradient liquid chromatography tandem mass spectrometry (LC/MS/MS), and eight-way staggered flow injection MS/MS following 384-well plate solid-phase extraction (SPE). These methods are capable of analyzing a 384-well plate in 37 min, with typical analysis times of less than 2 h. The quality of the MS-HTS approach is demonstrated herein with screening data from two large-scale screens.

  6. Determination of Quantum Chemistry Based Force Fields for Molecular Dynamics Simulations of Aromatic Polymers

    Science.gov (United States)

    Jaffe, Richard; Langhoff, Stephen R. (Technical Monitor)

    1995-01-01

    Ab initio quantum chemistry calculations for model molecules can be used to parameterize force fields for molecular dynamics simulations of polymers. Emphasis in our research group is on using quantum chemistry-based force fields for molecular dynamics simulations of organic polymers in the melt and glassy states, but the methodology is applicable to simulations of small molecules, multicomponent systems and solutions. Special attention is paid to deriving reliable descriptions of the non-bonded and electrostatic interactions. Several procedures have been developed for deriving and calibrating these parameters. Our force fields for aromatic polyimide simulations will be described. In this application, the intermolecular interactions are the critical factor in determining many properties of the polymer (including its color).
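The core fitting step described above (parameterizing non-bonded terms against ab initio interaction energies) reduces, in the simplest case, to a least-squares fit. The 12-6 Lennard-Jones form and the synthetic target energies below are illustrative assumptions, not the force field derived in this work:

```python
import numpy as np

# Separations (angstrom) at which "ab initio" interaction energies were
# computed; here the target curve is synthetic with known parameters.
r = np.linspace(3.0, 6.0, 25)
qm_energy = 4 * 0.3 * ((3.4 / r) ** 12 - (3.4 / r) ** 6)  # eps=0.3, sigma=3.4

# E(r) = A/r^12 - B/r^6 is linear in (A, B), so ordinary least squares
# recovers the repulsive and attractive coefficients directly.
X = np.column_stack([r ** -12, -(r ** -6)])
A, B = np.linalg.lstsq(X, qm_energy, rcond=None)[0]

# Convert back to the conventional Lennard-Jones parameters.
sigma = (A / B) ** (1.0 / 6.0)
epsilon = B * B / (4 * A)
```

Because the model is exactly linear in its coefficients, the fit recovers the generating parameters; real parameterizations iterate this against many molecular configurations and also fit electrostatic (partial-charge) terms.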

  7. Quantum Chemistry of Solids The LCAO First Principles Treatment of Crystals

    CERN Document Server

    Evarestov, Robert A

    2007-01-01

    Quantum Chemistry of Solids delivers a comprehensive account of the main features and possibilities of LCAO methods for first-principles calculations of the electronic structure of periodic systems. The first part describes the basic theory underlying the LCAO methods applied to periodic systems and the use of wave-function-based (Hartree-Fock), density-based (DFT) and hybrid Hamiltonians. Translation and site symmetry considerations are included to establish the connection between k-space solid-state physics and real-space quantum chemistry methods in the framework of the cyclic model of an infinite crystal. The inclusion of electron correlation effects for periodic systems is considered on the basis of localized crystalline orbitals. The possibilities of LCAO methods for chemical bonding analysis in periodic systems are discussed. The second part deals with the applications of LCAO methods to calculations of bulk crystal properties, including magnetic ordering and crystal structure optimization. The discussion o...

  8. The use of quantum chemistry in pharmaceutical research as illustrated by case studies of indometacin and carbamazepine

    DEFF Research Database (Denmark)

    Gordon, Keith C; McGoverin, Cushla M; Strachan, Clare J

    2007-01-01

    A number of case studies that illustrate how quantum chemistry may be used in studying pharmaceutical systems are reviewed. A brief introduction to quantum methods is provided, and the use of these methods in understanding the structure and properties of indometacin and carbamazepine is discussed. The use of calculated structures and molecular electrostatic potentials in developing quantitative structure-activity relationships is discussed, along with the use of computational chemistry to predict spectroscopic properties.

  9. STUDY OF GRAFT SITES IN EPOXY GRAFT COPOLYMERS BY QUANTUM CHEMISTRY CALCULATIONS

    Institute of Scientific and Technical Information of China (English)

    Song Chen; Xiao-yu Li

    2009-01-01

    Exploration and characterization of graft products by experimental methods is often cumbersome or sometimes impossible. Therefore, quantum chemistry calculations were performed to characterize the graft sites of epoxy resin. According to the Gibbs free energy criterion of the second law of thermodynamics, the reported graft sites were confirmed and, more importantly, some unreported graft sites were found. In addition, a method for increasing the number of graft sites was studied in this article.

  10. A Bayesian Approach to Calibrating High-Throughput Virtual Screening Results and Application to Organic Photovoltaic Materials

    CERN Document Server

    Pyzer-Knapp, Edward O; Aspuru-Guzik, Alan

    2015-01-01

    A novel approach is presented for calibrating quantum-chemical properties, determined as part of a high-throughput virtual screen, to experimental analogs. Information on the molecular graph is extracted through the use of extended connectivity fingerprints and exploited using a Gaussian process to calibrate both electronic properties, such as frontier orbital energies and optical gaps, and device properties, such as short-circuit current density, open-circuit voltage and power conversion efficiency. The Bayesian nature of this process affords a value for uncertainty in addition to each calibrated value. This allows the researcher to gain intuition about the model as well as the ability to respect its bounds.
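The calibration scheme described in the abstract (fingerprint features fed to a Gaussian process that maps computed to experimental values, with a per-prediction uncertainty) can be sketched as follows. The random bit vectors stand in for extended-connectivity fingerprints and all data are synthetic; this illustrates the approach, not the authors' code:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical stand-ins for extended-connectivity fingerprints: 64-bit
# binary vectors, plus a synthetic "computed" frontier-orbital energy.
n_mols, n_bits = 120, 64
fp = rng.integers(0, 2, size=(n_mols, n_bits)).astype(float)
computed = -5.0 + fp @ rng.normal(0.0, 0.05, n_bits)               # eV, synthetic
experimental = computed + 0.15 * fp[:, 0] + rng.normal(0, 0.02, n_mols)

# Learn the systematic computed -> experimental correction as a function
# of the molecular fingerprint (plus the computed value itself).
X = np.hstack([fp, computed[:, None]])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(),
                              normalize_y=True)
gp.fit(X[:100], experimental[:100])

# The Bayesian posterior yields a calibrated value and an uncertainty
# for each screened molecule, as described in the abstract.
calibrated, sigma = gp.predict(X[100:], return_std=True)
```

The returned `sigma` is what lets a researcher "respect the model's bounds": predictions far from the training fingerprints come back with visibly larger uncertainty.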

  11. A Need to Reassess Physical-Organic Curricula: A Course Enhancement Using Readily Available Quantum Chemistry Programs.

    Science.gov (United States)

    Lipkowitz, Kenny B.

    1982-01-01

    Describes a graduate-level course in physical-organic chemistry in which students learn to solve problems using computer programs available through the Quantum Chemistry Program Exchange. Includes condensed syllabus and time line showing where various computational programs are introduced. (Author/JN)

  12. The molecular electron density distribution meeting place of X-ray diffraction and quantum chemistry intermediate - between theory and experiment

    NARCIS (Netherlands)

    Feil, Dirk

    1992-01-01

    Quantum chemistry and the concepts used daily in chemistry are increasingly growing apart. Among the concepts that are able to bridge the gap between theory and experimental practice, electron density distribution has an important place. The study of this distribution has led to new developments in

  13. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, and can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
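The batch-processing idea, many independent model fits pushed through a pool of workers to raise throughput, can be sketched in a few lines. The trait names and the per-trait "solver" are hypothetical placeholders for a real genomic-prediction model fit:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def evaluate_trait(trait, markers):
    """Hypothetical per-trait job: stands in for one genomic-prediction
    model fit (in practice, the expensive step that dominates runtime)."""
    rng = random.Random(trait)  # deterministic per trait, for illustration
    return trait, sum(rng.gauss(0, 1) for _ in markers) / len(markers)

def run_batch(traits, markers, workers=4):
    # Each trait is an independent job; pushing them through a worker
    # pool raises throughput without changing any single job's runtime.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(evaluate_trait, t, markers) for t in traits]
        return dict(f.result() for f in futures)

results = run_batch(["milk_yield", "fertility", "stature"], range(500))
```

On a real cluster the worker pool would be replaced by a batch scheduler (e.g. HTCondor-style job submission), but the structure, independent jobs fanned out and results gathered, is the same.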

  14. On the applicability of one- and many-electron quantum chemistry models for hydrated electron clusters.

    Science.gov (United States)

    Turi, László

    2016-04-21

    We evaluate the applicability of a hierarchy of quantum models in characterizing the binding energy of excess electrons to water clusters. In particular, we calculate the vertical detachment energy of an excess electron from water cluster anions with methods that include one-electron pseudopotential calculations, density functional theory (DFT)-based calculations, and ab initio quantum chemistry at the MP2 and EOM-EA-CCSD levels of theory. The examined clusters range from the smallest cluster size (n = 2) up to nearly nanosize clusters with n = 1000 molecules. The examined cluster configurations are extracted from mixed quantum-classical molecular dynamics trajectories of cluster anions with n = 1000 water molecules using two different one-electron pseudopotential models. We find that while MP2 calculations with a large diffuse basis set provide a reasonable description of the hydrated electron system, DFT methods should be used with caution and only after careful benchmarking. Strictly tested one-electron pseudopotentials can still be considered reasonable alternatives to DFT methods, especially in large systems. The results of quantum chemistry calculations performed on configurations that represent possible excess electron binding motifs in the clusters appear to be consistent with the results of a cavity-structure-preferring one-electron pseudopotential for the hydrated electron, while they are in sharp disagreement with the structural predictions of a non-cavity model.

  15. High-Throughput Preparation of New Photoactive Nanocomposites.

    Science.gov (United States)

    Conterosito, Eleonora; Benesperi, Iacopo; Toson, Valentina; Saccone, Davide; Barbero, Nadia; Palin, Luca; Barolo, Claudia; Gianotti, Valentina; Milanesio, Marco

    2016-06-08

    New low-cost photoactive hybrid materials, based on organic luminescent molecules inserted into hydrotalcite (layered double hydroxides; LDH), were produced by exploiting the high-throughput liquid-assisted grinding (LAG) method. These materials are conceived for applications in dye-sensitized solar cells (DSSCs) as co-absorbers and in silicon photovoltaic (PV) panels to improve their efficiency, as they are able to emit where PV modules show the maximum efficiency. A molecule that shows a large Stokes shift was designed, synthesized, and intercalated into LDH. Two dyes already used in DSSCs were also intercalated to produce two new nanocomposites. LDH intercalation allows the stability of the organic dyes to be improved and enables their direct use in polymer melt blending. The prepared nanocomposites absorb sunlight from the UV to the visible and emit from the blue to the near-IR, and thus can be exploited for light-energy management. Finally, one nanocomposite was dispersed by melt blending into a poly(methyl methacrylate)-block-poly(n-butyl acrylate) copolymer to obtain a photoactive film.

  16. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires a non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow developers to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...

  17. A fully automated high-throughput training system for rodents.

    Directory of Open Access Journals (Sweden)

    Rajesh Poddar

    Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled, general-purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal's home cage, our system dramatically reduces the effort involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting, this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors.

  18. PRISM: a data management system for high-throughput proteomics.

    Science.gov (United States)

    Kiebel, Gary R; Auberry, Ken J; Jaitly, Navdeep; Clark, David A; Monroe, Matthew E; Peterson, Elena S; Tolić, Nikola; Anderson, Gordon A; Smith, Richard D

    2006-03-01

    Advanced proteomic research efforts involving areas such as systems biology or biomarker discovery are enabled by the use of high-level informatics tools that allow the effective analysis of large quantities of differing types of data originating from various studies. Performing such analyses on a large scale is not feasible without a computational platform that performs data processing and management tasks. Such a platform must be able to provide high-throughput operation while having sufficient flexibility to accommodate evolving data analysis tools and methodologies. The Proteomics Research Information Storage and Management system (PRISM) provides a platform that serves the needs of the accurate mass and time tag approach developed at Pacific Northwest National Laboratory. PRISM incorporates a diverse set of analysis tools and allows a wide range of operations to be incorporated by using a state machine that is accessible to independent, distributed computational nodes. The system has scaled well as data volume has increased over several years, while allowing adaptability for incorporating new and improved data analysis tools for more effective proteomics research.
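The central design element, a job state machine that distributed nodes advance independently, can be sketched as follows. The state names and transitions are assumptions for illustration, not PRISM's actual schema:

```python
# Allowed transitions of a hypothetical analysis-job state machine.
# A worker node claims a job and advances it one step; "failed" jobs
# loop back to "new" so another node can retry them.
TRANSITIONS = {
    "new": "in_progress",
    "in_progress": "complete",
    "failed": "new",
}

class Job:
    """One analysis task tracked by the (illustrative) state machine."""

    def __init__(self, job_id):
        self.job_id = job_id
        self.state = "new"

    def advance(self):
        if self.state not in TRANSITIONS:
            raise ValueError(f"job {self.job_id} is terminal: {self.state}")
        self.state = TRANSITIONS[self.state]
        return self.state

job = Job(1)
job.advance()   # new -> in_progress
job.advance()   # in_progress -> complete
```

In a distributed deployment, the state table would live in a shared database and each transition would be a guarded update, so independent nodes never process the same job twice.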

  19. High-throughput comet assay using 96 minigels.

    Science.gov (United States)

    Gutzkow, Kristine B; Langleite, Torgrim M; Meier, Silja; Graupner, Anne; Collins, Andrew R; Brunborg, Gunnar

    2013-05-01

    The single-cell gel electrophoresis--the comet assay--has proved to be a sensitive and relatively simple method that is much used in research for the analysis of specific types of DNA damage, and its use in genotoxicity testing is increasing. The efficiency of the comet assay, in terms of number of samples processed per experiment, has been rather poor, and both research and toxicological testing should profit from an increased throughput. We have designed and validated a format involving 96 agarose minigels supported by a hydrophilic polyester film. Using simple technology, hundreds of samples may be processed in one experiment by one person, with less time needed for processing, less use of chemicals and requiring fewer cells per sample. Controlled electrophoresis, including circulation of the electrophoresis solution, improves the homogeneity between replicate samples in the 96-minigel format. The high-throughput method described in this paper should greatly increase the overall capacity, versatility and robustness of the comet assay.

  20. High Throughput T Epitope Mapping and Vaccine Development

    Directory of Open Access Journals (Sweden)

    Giuseppina Li Pira

    2010-01-01

    Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high-throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost.

  1. An apparatus for high throughput nanomechanical muscle cell experimentation.

    Science.gov (United States)

    Garcia-Webb, M; Hunter, I; Taberner, A

    2004-01-01

    An array of independent muscle cell testing modules is being developed to explore the mechanics of cardiac myocytes. The instrument will be able to perform established physiological tests and utilize novel system identification techniques to measure the dynamic stiffness and stress frequency response of single cells, with possible applications in the pharmaceutical industry for high-throughput screening. Currently, each module consists of two independently controlled Lorentz force actuators in the form of stainless steel cantilevers with dimensions 0.025 mm x 0.8 mm x 3 mm, 0.1 m/N compliance and 1.5 kHz resonant frequency. Confocal position sensors focused on each cantilever provide a position resolution of 0.1 mm and resolve forces > 0.1 mN. A custom Visual Basic.Net software interface to a National Instruments data acquisition card implements real-time digital control over 4 input channels and 2 output channels at 20 kHz. In addition, algorithms for both swept-sine and stochastic system identification have been written to probe mechanical systems. The device has been used to find the dynamic stiffness of a 5 μm diameter polymer fiber between 0 and 500 Hz.

  2. High throughput jet singlet oxygen generator for multi kilowatt SCOIL

    Science.gov (United States)

    Rajesh, R.; Singhal, Gaurav; Mainuddin; Tyagi, R. K.; Dawar, A. L.

    2010-06-01

    A jet flow singlet oxygen generator (JSOG) capable of handling chlorine flows of nearly 1.5 mol s−1 has been designed, developed, and tested. The generator is designed in a modular configuration, taking into consideration the practical aspects of handling high-throughput flows without catastrophic BHP carry-over. While a cross-flow configuration has been reported for such high flow rates, the generator utilized in the present study is a counter-flow configuration. A near-vertical extraction of singlet oxygen is effected at the generator exit, followed by a 90° rotation of the flow, forming a novel verti-horizontal COIL scheme. This allows the COIL to be operated with a vertical-extraction SOG followed by the horizontal arrangement of subsequent COIL systems such as the supersonic nozzle, cavity, supersonic diffuser, etc. This enables a more uniform weight distribution from the point of view of mobile and other platform-mounted systems, which is highly relevant for large-scale systems. The present study discusses the design aspects of the jet singlet oxygen generator along with its test results for various operating ranges. Typically, for the intended design flow rates, the chlorine utilization and singlet oxygen yield have been observed to be ~94% and ~64%, respectively.

  3. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances on previous high-throughput methods for screening biomass recalcitrance have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have reduced plate-to-plate variation of control biomass recalcitrance and improved confidence in sugar release differences between samples. With smaller errors, plant researchers can have a higher degree of assurance that more low-recalcitrance candidates can be identified. In summary, significant changes in the plate reactor, control biomass preparation, pretreatment conditions and enzyme have significantly reduced sample and control replicate variability. Reactor plate metallurgy significantly impacts sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to specific biomass types, i.e., woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers improved conversion levels, likely due to water-activity impacts on enzyme structure and substrate interactions; it was not adopted here due to the need to continually desalt and validate precise enzyme concentration and activity.

  4. Assessing the utility and limitations of high throughput virtual screening

    Directory of Open Access Journals (Sweden)

    Paul Daniel Phillips

    2016-05-01

    Due to low cost, speed, and an unmatched ability to explore large numbers of compounds, high-throughput virtual screening and molecular docking engines have become widely utilized by computational scientists. It is generally accepted that docking engines, such as AutoDock, produce reliable qualitative results for ligand-macromolecular receptor binding, and molecular docking results are commonly reported in the literature in the absence of complementary wet-lab experimental data. In this investigation, three variants of the sixteen-amino-acid peptide α-conotoxin MII were docked to a homology model of the α3β2-nicotinic acetylcholine receptor. DockoMatic version 2.0 was used to perform a virtual screen of each peptide ligand against the receptor for ten docking trials consisting of 100 AutoDock cycles per trial. The results were analyzed for both variation in the calculated binding energy obtained from AutoDock and the orientation of the bound peptide within the receptor. The results show that, while no clear correlation exists between a consistent ligand binding pose and the calculated binding energy, AutoDock is able to determine a consistent positioning of the bound peptide in the majority of trials when at least ten trials were evaluated.
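Aggregating repeated docking trials in the way the abstract describes, comparing spread in binding energy against consistency of the bound pose, might look like the sketch below. The energies and pose labels are synthetic, and parsing of AutoDock output is omitted:

```python
import statistics
from collections import Counter

def summarize_trials(trials):
    """Aggregate repeated docking trials. Each trial is a hypothetical
    (binding_energy_kcal, pose_cluster) pair, e.g. as parsed from docking
    output; the parsing step itself is not shown here."""
    energies = [e for e, _ in trials]
    poses = Counter(p for _, p in trials)
    top_pose, count = poses.most_common(1)[0]
    return {
        "mean_energy": statistics.mean(energies),
        "stdev_energy": statistics.stdev(energies),
        "top_pose": top_pose,
        "pose_consistency": count / len(trials),  # fraction of trials agreeing
    }

# Ten synthetic trials: the energy varies while one pose dominates,
# mirroring the paper's finding that pose is more reproducible than energy.
trials = [(-8.2, "A"), (-7.1, "A"), (-9.0, "A"), (-6.8, "B"), (-8.5, "A"),
          (-7.7, "A"), (-8.9, "A"), (-7.3, "B"), (-8.0, "A"), (-7.6, "A")]
summary = summarize_trials(trials)
```

With at least ten trials, `pose_consistency` gives a simple quantitative handle on whether the docking engine is converging to one binding mode even when the energies scatter.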

  5. Comprehensive analysis of high-throughput screening data

    Science.gov (United States)

    Heyse, Stephan

    2002-06-01

    High-throughput screening (HTS) data in its entirety is a valuable raw material for the drug-discovery process. It provides the most complete information about the biological activity of a company's compounds. However, its quantity, complexity and heterogeneity require novel, sophisticated approaches to data analysis. At GeneData, we are developing methods for large-scale, synoptical mining of screening data in a five-step analysis: (1) Quality assurance: checking data for experimental artifacts and eliminating low-quality data. (2) Biological profiling: clustering and ranking of compounds based on their biological activity, taking into account specific characteristics of HTS data. (3) Rule-based classification: applying user-defined rules to biological and chemical properties, and providing hypotheses on the biological mode of action of compounds. (4) Joint biological-chemical analysis: associating chemical compound data with HTS data, providing hypotheses for structure-activity relationships. (5) Integration with genomic and gene expression data: linking into other components of GeneData's bioinformatics platform, and assessing the compounds' modes of action, toxicity, and metabolic properties. These analyses address issues that are crucial for a correct interpretation and full exploitation of screening data. They lead to a sound rating of assays and compounds at an early stage of the lead-finding process.
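Steps (1) and (2) of such a pipeline can be sketched with standard HTS statistics. The Z'-factor used below for quality assurance is a common plate-quality measure (not necessarily GeneData's exact criterion), and all data are synthetic:

```python
import numpy as np

def z_prime(pos_ctrl, neg_ctrl):
    """Step 1, quality assurance: the Z'-factor is a standard HTS
    plate-quality statistic; plates below ~0.5 are usually discarded."""
    return 1 - 3 * (pos_ctrl.std() + neg_ctrl.std()) / abs(
        pos_ctrl.mean() - neg_ctrl.mean())

def rank_and_flag(activities, hit_threshold=0.8):
    """Step 2, biological profiling (simplified): rank compounds by
    normalized activity and flag hits above a threshold. A full pipeline
    would cluster multi-assay activity profiles instead."""
    order = np.argsort(activities)[::-1]          # most active first
    hits = [int(i) for i in order if activities[i] >= hit_threshold]
    return order, hits

rng = np.random.default_rng(1)
pos = rng.normal(1.0, 0.05, 32)      # synthetic positive controls
neg = rng.normal(0.0, 0.05, 32)      # synthetic negative controls
plate = rng.uniform(0, 1, 320)       # synthetic normalized activities

plate_ok = z_prime(pos, neg) >= 0.5  # only well-behaved plates proceed
order, hits = rank_and_flag(plate)
```

Filtering on plate quality before ranking is what keeps experimental artifacts from masquerading as potent hits downstream.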

  6. Hydrodynamic Cell Trapping for High Throughput Single-Cell Applications

    Directory of Open Access Journals (Sweden)

    Amin Abbaszadeh Banaeiyan

    2013-12-01

    The possibility of conducting complete cell assays under a precisely controlled environment while consuming minimal amounts of chemicals and precious drugs has made microfluidics an interesting candidate for quantitative single-cell studies. Here, we present an application-specific microfluidic device, the cellcomb, capable of conducting high-throughput single-cell experiments. The system employs pure hydrodynamic forces for easy cell trapping and is readily fabricated in polydimethylsiloxane (PDMS) using soft lithography techniques. The cell-trapping array consists of V-shaped pockets designed to accommodate up to six Saccharomyces cerevisiae (yeast) cells with an average diameter of 4 μm. We used this platform to monitor the impact of flow-rate modulation on arsenite (As(III)) uptake in yeast. Redistribution of a green fluorescent protein (GFP)-tagged version of the heat shock protein Hsp104 was followed over time as the readout. Results showed a clear inverse correlation between arsenite uptake and flow rate across three settings (low = 25 nL min−1, moderate = 50 nL min−1, and high = 100 nL min−1). We consider the presented device the first building block of a future integrated, application-specific cell-trapping array that can be used to conduct complete single-cell experiments on different cell types.

  7. Adaptation to high throughput batch chromatography enhances multivariate screening.

    Science.gov (United States)

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
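The multivariate layout at the heart of the IBC approach, where every well is an independent combination of load and buffer conditions, can be sketched as a full-factorial plate map. The factor names and levels below are hypothetical:

```python
from itertools import product

# Hypothetical factor levels for one 96-well screen: protein load,
# wash-buffer pH, and elution salt concentration varied together.
loads_mg_ml = [10, 20, 30, 40]
wash_ph = [5.0, 6.0, 7.0, 8.0]
elution_mM = [100, 200, 300, 400, 500, 600]

# Full-factorial design: 4 x 4 x 6 = 96 unique conditions, so every
# well on the plate is a distinct experiment (the IBC premise).
conditions = list(product(loads_mg_ml, wash_ph, elution_mM))

# Map each condition to a well ID (A1..H12) for the liquid handler.
rows, cols = "ABCDEFGH", range(1, 13)
wells = [f"{r}{c}" for r in rows for c in cols]
plate_map = dict(zip(wells, conditions))
```

Because the design covers all factor combinations in one plate, interactions between load and buffer conditions are estimated directly rather than in follow-up experiments, which is the throughput gain the abstract describes.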

  8. Use of High Throughput Screening Data in IARC Monograph ...

    Science.gov (United States)

    Purpose: Evaluation of carcinogenic mechanisms serves a critical role in IARC monograph evaluations, and can lead to “upgrade” or “downgrade” of the carcinogenicity conclusions based on human and animal evidence alone. Three recent IARC monograph Working Groups (110, 112, and 113) pioneered analysis of high throughput in vitro screening data from the U.S. Environmental Protection Agency’s ToxCast program in evaluations of carcinogenic mechanisms. Methods: For monograph 110, ToxCast assay data across multiple nuclear receptors were used to test the hypothesis that PFOA acts exclusively through the PPAR family of receptors, with activity profiles compared to several prototypical nuclear receptor-activating compounds. For monographs 112 and 113, ToxCast assays were systematically evaluated and used as an additional data stream in the overall evaluation of the mechanistic evidence. Specifically, ToxCast assays were mapped to 10 “key characteristics of carcinogens” recently identified by an IARC expert group, and chemicals’ bioactivity profiles were evaluated both in absolute terms (number of relevant assays positive for bioactivity) and relative terms (ranking with respect to other compounds evaluated by IARC, using the ToxPi methodology). Results: PFOA activates multiple nuclear receptors in addition to the PPAR family in the ToxCast assays. ToxCast assays offered substantial coverage for 5 of the 10 “key characteristics,” with the greates
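A ToxPi-style ranking of the kind referenced above can be sketched as a weighted mean over scaled bioactivity "slices". The domain names, weights, and values below are hypothetical, not ToxCast data:

```python
def toxpi_score(profile, weights):
    """Hypothetical ToxPi-style score: each 'slice' is a bioactivity
    domain scaled to [0, 1]; the overall score is the weighted mean."""
    total = sum(weights.values())
    return sum(profile[k] * w for k, w in weights.items()) / total

# Illustrative slices loosely inspired by the 'key characteristics'
# mapping described in the abstract (values are invented).
profile = {"receptor_binding": 0.9, "oxidative_stress": 0.4, "genotoxicity": 0.1}
weights = {"receptor_binding": 2.0, "oxidative_stress": 1.0, "genotoxicity": 1.0}
score = toxpi_score(profile, weights)
```

Ranking chemicals by such a score is what allows the "relative" comparison described in the abstract: a compound's position among previously evaluated agents, rather than its raw assay counts.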

  9. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Barth

    2008-02-06

    AVA Solar has developed a very low-cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct and test a production prototype system, fabricate PV modules and test the module performance. The original module manufacturing costs in the proposal were estimated at $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated and installed. Using this automated, high-throughput system, high-efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained two rounds of private equity funding, expanded to 50 people and initiated the development of a large-scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry-leading cost, which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40-$60 billion by 2012. Currently, a crystalline silicon raw-material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  10. High Throughput Multispectral Image Processing with Applications in Food Science.

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

Machine vision has recently been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility and to deliver low-cost information extraction and faster quality assessment, without human intervention. The outcome of the image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for optimization of the segmentation process. Through the evaluation we demonstrate its efficiency and robustness against currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
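The segmentation step described above rests on fitting a Gaussian Mixture Model to pixel intensities. A minimal one-dimensional sketch of the underlying EM fit is shown below; the intensities and class parameters are synthetic placeholders, not the paper's actual data or implementation:

```python
import math
import random

random.seed(3)

# Synthetic 1-D pixel intensities from two hypothetical classes
# (e.g. background vs. product region); values are illustrative only.
data = ([random.gauss(0.2, 0.05) for _ in range(300)] +
        [random.gauss(0.7, 0.08) for _ in range(300)])

def gauss_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# EM for a two-component Gaussian mixture
weights, means, variances = [0.5, 0.5], [0.1, 0.9], [0.1, 0.1]
for _ in range(50):
    # E-step: class responsibilities for every pixel
    resp = []
    for x in data:
        p = [weights[k] * gauss_pdf(x, means[k], variances[k]) for k in (0, 1)]
        total = p[0] + p[1]
        resp.append([pk / total for pk in p])
    # M-step: re-estimate mixture parameters from the responsibilities
    for k in (0, 1):
        nk = sum(r[k] for r in resp)
        weights[k] = nk / len(data)
        means[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        variances[k] = sum(r[k] * (x - means[k]) ** 2
                           for r, x in zip(resp, data)) / nk

# Each pixel would then be assigned to the component with the
# higher responsibility, yielding the segmentation mask.
print(sorted(means))
```

In the multispectral setting the same machinery runs per selected band (or on band combinations), which is where the unsupervised band-selection scheme enters.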

  11. High-throughput translational medicine: challenges and solutions.

    Science.gov (United States)

    Sulakhe, Dinanath; Balasubramanian, Sandhya; Xie, Bingqing; Berrocal, Eduardo; Feng, Bo; Taylor, Andrew; Chitturi, Bhadrachalam; Dave, Utpal; Agam, Gady; Xu, Jinbo; Börnigen, Daniela; Dubchak, Inna; Gilliam, T Conrad; Maltsev, Natalia

    2014-01-01

Recent technological advances in genomics now allow biological data to be produced at unprecedented tera- and petabyte scales. Yet the extraction of useful knowledge from this voluminous data presents a significant challenge to the scientific community. Efficient mining of vast and complex data sets for the needs of biomedical research critically depends on seamless integration of clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships accumulated in a plethora of publicly available databases. Furthermore, such experimental data should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. Translational projects require sophisticated approaches that coordinate and perform the various analytical steps involved in the extraction of useful knowledge from accumulated clinical and experimental data in an orderly, semiautomated manner. This presents a number of challenges, such as (1) high-throughput data management involving data transfer, data storage, and access control; (2) scalable computational infrastructure; and (3) analysis of large-scale multidimensional data for the extraction of actionable knowledge. We present a scalable computational platform based on crosscutting requirements from multiple scientific groups for data integration, management, and analysis. The goal of this integrated platform is to address these challenges and to support the end-to-end analytical needs of various translational projects.

  12. Towards high-throughput microfluidic Raman-activated cell sorting.

    Science.gov (United States)

    Zhang, Qiang; Zhang, Peiran; Gou, Honglei; Mou, Chunbo; Huang, Wei E; Yang, Menglong; Xu, Jian; Ma, Bo

    2015-09-21

Raman-activated cell sorting (RACS) is a promising single-cell analysis technology that can identify and isolate individual cells of a targeted type, state or environment from an isogenic population or complex consortium of cells, in a label-free and non-invasive manner. However, compared with widely used cell sorting technologies that require labeling or staining, such as FACS and MACS, the weak Raman signal greatly limits the throughput achievable by existing RACS systems. Strategies that can tackle this bottleneck include, first, improvement of Raman-acquisition efficiency and quality based on advanced Raman spectrometers and enhanced Raman techniques, and second, development of novel microfluidic devices for cell sorting and their integration into a complete RACS system. Exploiting these strategies, prototypes for a new generation of RACS have been demonstrated, such as flow-based OT-RACS, DEP-RACS, and SERS/CARS flow cytometry. Such high-throughput microfluidic RACS can provide biologists with a powerful single-cell analysis tool to explore scientific questions and applications that have been beyond the reach of FACS and MACS.

  13. High Throughput Interrogation of Behavioral Transitions in C. elegans

    Science.gov (United States)

    Liu, Mochi; Shaevitz, Joshua; Leifer, Andrew

We present a high-throughput method to probe transformations from neural activity to behavior in Caenorhabditis elegans, to better understand how organisms change behavioral states. We optogenetically deliver white-noise stimuli to targeted sensory neurons or interneurons while simultaneously recording the movement of a population of worms. Using all the postural movement data collected, we computationally classify stereotyped behaviors in C. elegans by clustering based on the spectral properties of the instantaneous posture (Berman et al., 2014). Transitions between these behavioral clusters indicate discrete behavioral changes. To study the neural correlates dictating these transitions, we perform model-driven experiments and employ Linear-Nonlinear-Poisson cascades that take the white-noise stimulus as the input. The parameters of these models are fitted by reverse correlation from our measurements. The parameterized models of behavioral transitions predict the worm's response to novel stimuli and reveal the internal computations the animal makes before carrying out behavioral decisions. Preliminary results are shown that describe the neural-behavioral transformation between activity in mechanosensory neurons and reversal behavior.
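The reverse-correlation fitting step can be illustrated with a toy Linear-Nonlinear-Poisson simulation: drive events from a known linear filter and recover that filter as the event-triggered average of the white-noise stimulus. Every number here (filter shape, base rate, stimulus length) is a hypothetical placeholder, not a value from the study:

```python
import math
import random

random.seed(0)

TRUE_FILTER = [0.1, 0.3, 0.6, 0.3, 0.1]   # hypothetical linear kernel
T = 20000
stim = [random.gauss(0.0, 1.0) for _ in range(T)]  # white-noise stimulus

def event_prob(t):
    """LN stage: filter the recent stimulus, then apply an
    exponential nonlinearity (capped so it is a valid probability)."""
    drive = sum(TRUE_FILTER[i] * stim[t - i] for i in range(len(TRUE_FILTER)))
    return min(1.0, 0.05 * math.exp(drive))  # Bernoulli approximation of Poisson

# "Poisson" stage: draw discrete behavioral-transition events
events = [t for t in range(len(TRUE_FILTER), T)
          if random.random() < event_prob(t)]

# Reverse correlation: for Gaussian white noise, the event-triggered
# average of the stimulus recovers the linear filter up to a scale factor.
sta = [sum(stim[t - i] for t in events) / len(events)
       for i in range(len(TRUE_FILTER))]
print(sta)
```

With real data, `events` would be the observed transitions between behavioral clusters and `stim` the delivered optogenetic white noise.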

  14. Probabilistic Assessment of High-Throughput Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Robin E. Kim

    2016-05-01

Full Text Available Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a larger number of packets delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can probabilistically assess the long-term performance of the network is proposed. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. An empirical limit-state function is then determined and used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small-scale and a full-bridge wireless network. By performing the proposed analysis on complex sensor networks, an optimized sensor topology can be achieved.
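The assessment described above can be sketched as a plain Monte Carlo estimate of the failure probability P[g(X) < 0] for an empirical limit-state function g. The distributions and coefficients below are invented placeholders for illustration, not the paper's fitted values:

```python
import random

random.seed(42)

def limit_state(snr_margin, packet_loss):
    """Hypothetical empirical limit-state function: negative values mean
    the high-throughput data transfer fails to complete reliably."""
    return snr_margin - 25.0 * packet_loss

N = 100_000
failures = 0
for _ in range(N):
    # Assumed communication-quality model for one sensor link
    snr_margin = random.gauss(3.0, 1.0)                       # dB of link margin
    packet_loss = min(1.0, max(0.0, random.gauss(0.05, 0.03)))  # loss fraction
    if limit_state(snr_margin, packet_loss) < 0:
        failures += 1

p_fail = failures / N
print(f"estimated P(communication failure) ~ {p_fail:.3f}")
```

In the actual method, the limit-state function and input distributions would be calibrated from the measured communication-quality data sets, and the resulting failure probabilities compared across candidate sensor topologies.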

  15. A microfluidic, high throughput protein crystal growth method for microgravity.

    Directory of Open Access Journals (Sweden)

    Carl W Carruthers

Full Text Available The attenuation of sedimentation and convection in microgravity can sometimes decrease irregularities formed during macromolecular crystal growth. Current terrestrial protein crystal growth (PCG) capabilities are very different from those used during the Shuttle era and those currently on the International Space Station (ISS). The focus of this experiment was to demonstrate the use of a commercial off-the-shelf, high-throughput PCG method in microgravity. Using Protein BioSolutions' microfluidic Plug Maker™/CrystalCard™ system, we tested the ability to grow crystals of the regulator of glucose metabolism and adipogenesis, peroxisome proliferator-activated receptor gamma (apo-hPPAR-γ LBD), as well as several PCG standards. Overall, we sent 25 CrystalCards™ to the ISS, containing ~10,000 individual microgravity PCG experiments in a 3U NanoRacks NanoLab (1U = 10³ cm³). After 70 days on the ISS, our samples were returned with 16 of 25 (64%) microgravity cards having crystals, compared to 12 of 25 (48%) of the ground controls. Encouragingly, there were more apo-hPPAR-γ LBD crystals in the microgravity PCG cards than in the 1g controls. These positive results should help introduce the PCG standard of low sample volume and large experimental density to the microgravity environment and provide new opportunities for macromolecular samples that may crystallize poorly in standard laboratories.
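Whether 16/25 versus 12/25 reflects more than chance can be checked with a one-sided Fisher exact test. The sketch below is our illustration of that check, not an analysis performed in the study:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    P(X >= a) under the hypergeometric null of no association."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    hi = min(row1, col1)
    return sum(comb(col1, x) * comb(n - col1, row1 - x)
               for x in range(a, hi + 1)) / comb(n, row1)

# Microgravity: 16 of 25 cards with crystals; ground control: 12 of 25
p = fisher_one_sided(16, 9, 12, 13)
print(f"one-sided p = {p:.3f}")
```

With these counts the p value comes out well above 0.05, consistent with the abstract's cautious "encouraging" framing rather than a claim of statistical significance.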

  16. High Throughput Multispectral Image Processing with Applications in Food Science.

    Directory of Open Access Journals (Sweden)

    Panagiotis Tsakanikas

Full Text Available Machine vision has recently been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility and to deliver low-cost information extraction and faster quality assessment, without human intervention. The outcome of the image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for optimization of the segmentation process. Through the evaluation we demonstrate its efficiency and robustness against currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.

  17. A high-throughput screen for antibiotic drug discovery.

    Science.gov (United States)

    Scanlon, Thomas C; Dostal, Sarah M; Griswold, Karl E

    2014-02-01

    We describe an ultra-high-throughput screening platform enabling discovery and/or engineering of natural product antibiotics. The methodology involves creation of hydrogel-in-oil emulsions in which recombinant microorganisms are co-emulsified with bacterial pathogens; antibiotic activity is assayed by use of a fluorescent viability dye. We have successfully utilized both bulk emulsification and microfluidic technology for the generation of hydrogel microdroplets that are size-compatible with conventional flow cytometry. Hydrogel droplets are ∼25 pL in volume, and can be synthesized and sorted at rates exceeding 3,000 drops/s. Using this technique, we have achieved screening throughputs exceeding 5 million clones/day. Proof-of-concept experiments demonstrate efficient selection of antibiotic-secreting yeast from a vast excess of negative controls. In addition, we have successfully used this technique to screen a metagenomic library for secreted antibiotics that kill the human pathogen Staphylococcus aureus. Our results establish the practical utility of the screening platform, and we anticipate that the accessible nature of our methods will enable others seeking to identify and engineer the next generation of antibacterial biomolecules. © 2013 Wiley Periodicals, Inc.
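A back-of-envelope check on the quoted rates (our arithmetic, not the paper's): at the stated lower bound of 3,000 drops/s, sorting one day's 5 million clones takes well under an hour of actual cytometry time, so the daily throughput is presumably bounded by emulsification, incubation and setup rather than by sorting:

```python
SORT_RATE = 3000            # drops per second (lower bound quoted above)
CLONES_PER_DAY = 5_000_000  # screening throughput quoted above

sort_seconds = CLONES_PER_DAY / SORT_RATE
print(f"pure sort time for one day's screen: {sort_seconds / 60:.0f} min")
```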

  18. PRISM: A Data Management System for High-Throughput Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Kiebel, Gary R.; Auberry, Kenneth J.; Jaitly, Navdeep; Clark, Dave; Monroe, Matthew E.; Peterson, Elena S.; Tolic, Nikola; Anderson, Gordon A.; Smith, Richard D.

    2006-03-01

    Advanced proteomic research efforts involving areas such as systems biology or biomarker discovery are enabled by the use of high level informatics tools that allow the effective analysis of large quantities of differing types of data originating from various studies. Performing such analyses on a large scale is not feasible without a computational platform that performs data processing and management tasks. Such a platform must be able to provide high-throughput operation while having sufficient flexibility to accommodate evolving data analysis tools and methodologies. The Proteomics Research Information Storage and Management System (PRISM) provides a platform that serves the needs of the accurate mass and time tag approach developed at PNNL. PRISM incorporates a diverse set of analysis tools and allows a wide range of operations to be incorporated by using a state machine that is accessible to independent, distributed computational nodes. The system has scaled well as data volume has increased over several years, while allowing adaptability for incorporating new and improved data analysis tools for more effective proteomics research.

  19. High Throughput Multispectral Image Processing with Applications in Food Science

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision is gaining attention in food science as well as in food industry concerning food quality assessment and monitoring. Into the framework of implementation of Process Analytical Technology (PAT) in the food industry, image processing can be used not only in estimation and even prediction of food quality but also in detection of adulteration. Towards these applications on food science, we present here a novel methodology for automated image analysis of several kinds of food products e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low cost information extraction and faster quality assessment, without human intervention. Image processing’s outcome will be propagated to the downstream analysis. The developed multispectral image processing method is based on unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples. PMID:26466349

  20. High-throughput screening technologies for drug glucuronidation profiling.

    Science.gov (United States)

    Trubetskoy, Olga; Finel, Moshe; Trubetskoy, Vladimir

    2008-08-01

    A significant number of endogenous and exogenous compounds, including many therapeutic agents, are metabolized in humans via glucuronidation, catalysed by uridine diphosphoglucuronosyltransferases (UGTs). The study of the UGTs is a growing field of research, with constantly accumulated and updated information regarding UGT structure, purification, substrate specificity and inhibition, including clinically relevant drug interactions. Development of reliable UGT assays for the assessment of individual isoform substrate specificity and for the discovery of novel isoform-specific substrates and inhibitors is crucial for understanding the function and regulation of the UGT enzyme family and its clinical and pharmacological relevance. High-throughput screening (HTS) is a powerful technology used to search for novel substrates and inhibitors for a wide variety of targets. However, application of HTS in the context of UGTs is complicated because of the poor stability, low levels of expression, low affinity and broad substrate specificity of the enzymes, combined with difficulties in obtaining individual UGT isoforms in purified format, and insufficient information regarding isoform-specific substrates and inhibitors. This review examines the current status of HTS assays used in the search for novel UGT substrates and inhibitors, emphasizing advancements and challenges in HTS technologies for drug glucuronidation profiling, and discusses possible avenues for future advancement of the field.

  1. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    Science.gov (United States)

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-12-01

The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects, and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective of reliably accelerating the drug discovery process against Chagas disease, several advances have been made in the last few years. The availability of engineered reporter-gene-expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS), as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high-content microscopy approaches have also been used to identify new parasite inhibitors. These in vitro and in vivo early drug discovery approaches, which will hopefully contribute to bringing better anti-T. cruzi drug entities in the near future, are reviewed here.

  2. Hypothesis testing in high-throughput screening for drug discovery.

    Science.gov (United States)

    Prummer, Michael

    2012-04-01

    Following the success of small-molecule high-throughput screening (HTS) in drug discovery, other large-scale screening techniques are currently revolutionizing the biological sciences. Powerful new statistical tools have been developed to analyze the vast amounts of data in DNA chip studies, but have not yet found their way into compound screening. In HTS, characterization of single-point hit lists is often done only in retrospect after the results of confirmation experiments are available. However, for prioritization, for optimal use of resources, for quality control, and for comparison of screens it would be extremely valuable to predict the rates of false positives and false negatives directly from the primary screening results. Making full use of the available information about compounds and controls contained in HTS results and replicated pilot runs, the Z score and from it the p value can be estimated for each measurement. Based on this consideration, we have applied the concept of p-value distribution analysis (PVDA), which was originally developed for gene expression studies, to HTS data. PVDA allowed prediction of all relevant error rates as well as the rate of true inactives, and excellent agreement with confirmation experiments was found.
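The per-measurement Z score and p value mentioned above can be computed directly from plate controls. This sketch uses simulated signals (an assumed inhibition-type readout in which actives suppress the signal), not real screen data:

```python
import math
import random

random.seed(1)

# Simulated negative (inactive) controls from one plate
neg_controls = [random.gauss(100.0, 10.0) for _ in range(64)]
mu = sum(neg_controls) / len(neg_controls)
sd = (sum((x - mu) ** 2 for x in neg_controls) / (len(neg_controls) - 1)) ** 0.5

def z_and_p(signal):
    """Z score against the control distribution and the one-sided
    (lower-tail) p value, since actives suppress the signal here."""
    z = (signal - mu) / sd
    p = 0.5 * math.erfc(-z / math.sqrt(2.0))  # standard normal CDF at z
    return z, p

z_hit, p_hit = z_and_p(60.0)        # strongly suppressed well
z_null, p_null = z_and_p(101.0)     # typical inactive well
print(z_hit, p_hit, z_null, p_null)
```

Pooling such p values across the whole screen yields the p-value distribution that PVDA analyzes: an approximately flat component from true inactives plus a spike near zero from actives, from which false-positive and false-negative rates can be estimated before any confirmation experiments are run.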

  3. High-throughput mass spectrometric cytochrome P450 inhibition screening.

    Science.gov (United States)

    Lim, Kheng B; Ozbal, Can C; Kassel, Daniel B

    2013-01-01

We describe here a high-throughput assay to support rapid evaluation of drug discovery compounds for possible drug-drug interactions (DDI). Each compound is evaluated for its DDI potential by incubation over a range of eight concentrations against a panel of six cytochrome P450 (CYP) enzymes: 1A2, 2C8, 2C9, 2C19, 2D6, and 3A4. The method utilizes automated liquid handling for sample preparation, and online solid-phase extraction/tandem mass spectrometry (SPE/MS/MS) for sample analysis. The system is capable of generating two 96-well assay plates in 30 min, and completes the data acquisition and analysis of both plates in about 30 min. Many laboratories that perform CYP inhibition screening automate only part of the process, leaving a throughput bottleneck within the workflow. The protocols described in this chapter aim to streamline the entire process, from assay to data acquisition and processing, by incorporating automation and utilizing high-precision instruments to maximize throughput and minimize bottlenecks.
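Each eight-point concentration series is typically reduced to an IC50 by fitting a four-parameter logistic curve. The sketch below fits synthetic data by a coarse grid search for transparency (a production pipeline would use nonlinear least squares); the dilution series, the "true" IC50 and all other values are hypothetical:

```python
def four_pl(conc, top, bottom, ic50, hill):
    """Four-parameter logistic: percent CYP activity remaining at `conc`."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Eight-point, three-fold dilution series (concentrations in uM, assumed)
concs = [0.01 * 3 ** i for i in range(8)]          # 0.01 ... ~21.9 uM
# Synthetic responses for an inhibitor with a "true" IC50 of 1 uM
observed = [four_pl(c, 100.0, 0.0, 1.0, 1.0) for c in concs]

# Coarse grid search over IC50 and Hill slope, minimizing squared error
best_sse, best_ic50, best_hill = min(
    (sum((four_pl(c, 100.0, 0.0, ic50, hill) - y) ** 2
         for c, y in zip(concs, observed)), ic50, hill)
    for ic50 in (0.1 * 1.3 ** k for k in range(20))
    for hill in (0.5, 1.0, 1.5, 2.0)
)
print(f"fitted IC50 ~ {best_ic50:.2f} uM (hill {best_hill})")
```

Running such a fit per compound per CYP isoform turns the raw SPE/MS/MS peak areas into the DDI-potential summary the screen reports.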

  4. High throughput screening for drug discovery of autophagy modulators.

    Science.gov (United States)

    Shu, Chih-Wen; Liu, Pei-Feng; Huang, Chun-Ming

    2012-11-01

Autophagy is an evolutionarily conserved process by which cells clear abnormal proteins and organelles in a lysosome-dependent manner. Growing numbers of studies have shown that defective or excessively induced autophagy contributes to many diseases, including aging, neurodegeneration, pathogen infection, and cancer. However, the precise involvement of autophagy in health and disease remains controversial because the theories are built on limited assays and chemical modulators, indicating that the role of autophagy in diseases may require further verification. Many Food and Drug Administration (FDA)-approved drugs modulate autophagy signaling, suggesting that modulation of autophagy with pharmacological agonists or antagonists provides a potential therapy for autophagy-related diseases. This makes the discovery of chemical modulators of autophagy an attractive goal. High throughput screening (HTS) is becoming a powerful drug discovery tool that may accelerate the identification of specific autophagy modulators and thereby help clarify the role of autophagy in disease. Herein, this review lays out current autophagy assays that specifically measure autophagy components such as LC3 (the mammalian homologue of yeast Atg8) and Atg4. These assays have proven feasible or successful for HTS with certain chemical libraries, which should be informative for this rapidly growing field, both as research tools and, hopefully, for developing new drugs for autophagy-related diseases.

  5. High-Throughput Screening Using Mass Spectrometry within Drug Discovery.

    Science.gov (United States)

    Rohman, Mattias; Wingfield, Jonathan

    2016-01-01

In order to detect a biochemical analyte with a mass spectrometer (MS), it is necessary to ionize the analyte of interest. The analyte can be ionized by a number of different mechanisms; one common method is electrospray ionization (ESI). Droplets of analyte are sprayed through a highly charged field, the droplets pick up charge, and this is transferred to the analyte. High levels of salt in the assay buffer can steal charge from the analyte and suppress the MS signal. To avoid this suppression, salt is often removed from the sample prior to injection into the MS. Traditional ESI MS relies on liquid chromatography (LC) to remove the salt and reduce matrix effects; however, this is a lengthy process. Here we describe the use of RapidFire™ coupled to a triple-quadrupole MS for high-throughput screening. This system uses solid-phase extraction to de-salt samples prior to injection, reducing processing time such that a sample is injected into the MS approximately every 10 s.

  6. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    Directory of Open Access Journals (Sweden)

    Julio Alonso-Padilla

    2014-12-01

Full Text Available The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects, and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective of reliably accelerating the drug discovery process against Chagas disease, several advances have been made in the last few years. The availability of engineered reporter-gene-expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS), as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high-content microscopy approaches have also been used to identify new parasite inhibitors. These in vitro and in vivo early drug discovery approaches, which will hopefully contribute to bringing better anti-T. cruzi drug entities in the near future, are reviewed here.

  7. The JCSG high-throughput structural biology pipeline.

    Science.gov (United States)

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets, from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  8. Probabilistic Assessment of High-Throughput Wireless Sensor Networks.

    Science.gov (United States)

    Kim, Robin E; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F; Song, Junho

    2016-05-31

Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a larger number of packets delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can probabilistically assess the long-term performance of the network is proposed. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. An empirical limit-state function is then determined and used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small-scale and a full-bridge wireless network. By performing the proposed analysis on complex sensor networks, an optimized sensor topology can be achieved.

  9. Surface Chemistry of Semiconducting Quantum Dots: Theoretical Perspectives.

    Science.gov (United States)

    Kilina, Svetlana V; Tamukong, Patrick K; Kilin, Dmitri S

    2016-10-18

Colloidal quantum dots (QDs) are near-ideal nanomaterials for energy conversion and lighting technologies. However, their photophysics exhibits extreme sensitivity to surface passivation and defects, whose control is problematic. The role of passivating ligands in photodynamics remains questionable and is a focus of ongoing research. The optically forbidden nature of surface-associated states makes direct measurements on them challenging. Therefore, computational modeling is imperative for insights into surface passivation and its impact on light-driven processes in QDs. This Account discusses challenges and recent progress in understanding surface effects on the photophysics of QDs addressed via quantum-chemical calculations. We overview different methods, including the effective mass approximation (EMA), time-dependent density functional theory (TDDFT), and multiconfiguration approaches, considering their strengths and weaknesses relevant to modeling of QDs with a complicated surface. We focus on CdSe, PbSe, and Si QDs, where calculations successfully explain experimental trends sensitive to surface defects, doping, and ligands. We show that the EMA accurately describes both linear and nonlinear optical properties of large-sized CdSe QDs (>2.5 nm), while TDDFT is required for smaller QDs where surface effects dominate. Both approaches confirm efficient two-photon absorption enabling applications of QDs as nonlinear optical materials. TDDFT also describes the effects of morphology on the optical response of QDs: the photophysics of stoichiometric, magic-sized XnYn (X = Cd, Pb; Y = S, Se) QDs is less sensitive to their passivation compared with nonstoichiometric Xn≠mYm QDs. In the latter, surface-driven optically inactive midgap states can be eliminated by anionic ligands, explaining the better emission of metal-enriched QDs compared with nonmetal-enriched QDs. Ideal passivation of magic-sized QDs by amines and phosphine oxides leaves lower-energy transitions

  10. High-throughput Screening: Synthesis of Novel Fluorescent Microspheres

    Institute of Scientific and Technical Information of China (English)

    LI Song-Jun; LIU Bai-ling

    2004-01-01

As efficient analytical reagents, fluorescent microspheres have proven useful in many biochemical and biomedical processes. Recent applications of fluorescent microspheres have included cytokine quantitation, single-nucleotide polymorphism genotyping, phosphorylated-protein detection, and characterization of the molecular interactions of nuclear receptors. These applications, coupled with rapid advances in molecular biology and drug synthesis techniques, have provided a basis for drug screening in a high-throughput format. Based on fluorescent microspheres, earlier high-throughput screening (HTS) assay formats relied mainly on proximity-dependent energy transfer, including the scintillation proximity assay (SPA) (Amersham Pharmacia Biotech) and FlashPlates™ (NEN Life Science Products, Boston, MA). Indeed, drug screening based on such fluorescent emission still accounts for roughly 20-50% of current HTS content, and SPA is almost a standard technique in HTS laboratories. In the literature, SPA microspheres are generally prepared from inorganic scintillators such as yttrium silicate and hydrophobic polymers such as polyvinyltoluene. In HTS research, however, such microspheres often suffer from strong hydrophobicity and low quantum efficiency: the hydrophobicity stems mainly from the hydrophobic monomer, vinyltoluene, and the low quantum efficiency results from the low transparency of the polymer, polyvinyltoluene. The subsequent treatments required for such microspheres, such as coating a polyhydroxy film to decrease the hydrophobicity, are therefore considerably complicated. It is well known that poly(methyl methacrylate) (PMMA), a biocompatible polymer with both adequate mechanical strength and excellent transparency, is an ideal candidate material for a fluorescent matrix. In the present study, methyl methacrylate was used as the monomer and 2,5-diphenyloxazole (DPO) as the fluorescent dye to prepare the fluorescent microspheres.

  11. High-Throughput Neuroimaging-Genetics Computational Infrastructure

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2014-04-01

    Full Text Available Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate and disseminate novel scientific methods, computational resources and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval and aggregation. Computational processing involves the necessary software, hardware and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical and phenotypic data and meta-data. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services. These pipeline workflows are represented as portable XML objects which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer’s and Parkinson’s data, we provide several examples of translational applications using this infrastructure.
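    The portable XML workflow objects described above can be sketched in a few lines. Note that the element and attribute names below are invented for illustration; they are not the actual LONI Pipeline schema.

```python
import xml.etree.ElementTree as ET

def build_workflow(name, steps):
    """Serialize a linear analysis protocol as a portable XML object.

    The <pipeline>/<module>/<connection> vocabulary is hypothetical,
    standing in for whatever schema a real pipeline server consumes."""
    root = ET.Element("pipeline", name=name)
    for i, (tool, params) in enumerate(steps):
        mod = ET.SubElement(root, "module", id=f"m{i}", tool=tool)
        for key, value in params.items():
            ET.SubElement(mod, "param", name=key, value=str(value))
        if i > 0:  # chain each module to its predecessor
            ET.SubElement(root, "connection", source=f"m{i-1}", target=f"m{i}")
    return ET.tostring(root, encoding="unicode")

# Two-step example: skull stripping followed by registration
xml_doc = build_workflow("skull-strip-then-register", [
    ("bet", {"fractional_intensity": 0.5}),
    ("flirt", {"dof": 12}),
])
print(xml_doc)
```

    A real workflow object would additionally carry execution metadata (provenance, server endpoints, user specifications) for the remote scheduler to consume.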

  12. Emerging metrology for high-throughput nanomaterial genotoxicology.

    Science.gov (United States)

    Nelson, Bryant C; Wright, Christa W; Ibuki, Yuko; Moreno-Villanueva, Maria; Karlsson, Hanna L; Hendriks, Giel; Sims, Christopher M; Singh, Neenu; Doak, Shareen H

    2017-01-01

    The rapid development of the engineered nanomaterial (ENM) manufacturing industry has accelerated the incorporation of ENMs into a wide variety of consumer products across the globe. Unintentionally or not, some of these ENMs may be introduced into the environment or come into contact with humans or other organisms, resulting in unexpected biological effects. It is thus prudent to have rapid and robust analytical metrology in place that can be used to critically assess and/or predict the cytotoxicity, as well as the potential genotoxicity, of these ENMs. Many of the traditional genotoxicity test methods [e.g. unscheduled DNA synthesis assay, bacterial reverse mutation (Ames) test, etc.] for determining the DNA damaging potential of chemical and biological compounds are not suitable for the evaluation of ENMs, due to a variety of methodological issues ranging from potential assay interferences to problems centered on low sample throughput. Recently, a number of sensitive, high-throughput genotoxicity assays/platforms (CometChip assay, flow cytometry/micronucleus assay, flow cytometry/γ-H2AX assay, automated 'Fluorimetric Detection of Alkaline DNA Unwinding' (FADU) assay, ToxTracker reporter assay) have been developed, based on substantial modifications and enhancements of traditional genotoxicity assays. These new assays have been used for the rapid measurement of DNA damage (strand breaks), chromosomal damage (micronuclei) and for detecting upregulated DNA damage signalling pathways resulting from ENM exposures. In this critical review, we describe and discuss the fundamental measurement principles and measurement endpoints of these new assays, as well as the modes of operation, analytical metrics and potential interferences, as applicable to ENM exposures. An unbiased discussion of the major technical advantages and limitations of each assay for evaluating and predicting the genotoxic potential of ENMs is also provided. Published by Oxford University Press on

  13. Mining Chemical Activity Status from High-Throughput Screening Assays

    KAUST Repository

    Soufan, Othman

    2015-12-14

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we carried out a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid-stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.
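    The class-imbalance problem described above (few actives among many inactives) is the motivation for minority oversampling. The sketch below illustrates the generic SMOTE-style interpolation idea that such techniques build on; it is not DRAMOTE's actual modification, and the descriptor vectors are toy values.

```python
import random

def oversample_minority(actives, n_new, seed=0):
    """SMOTE-style oversampling: synthesize new minority-class (active)
    feature vectors by interpolating between random pairs of real
    actives.  Generic illustration only, not DRAMOTE's variant."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a, b = rng.sample(actives, 2)   # pick two distinct real actives
        t = rng.random()                # interpolation fraction in [0, 1)
        synthetic.append([ai + t * (bi - ai) for ai, bi in zip(a, b)])
    return synthetic

# Toy 2-D descriptor vectors for three "active" compounds
actives = [[0.1, 1.0], [0.2, 0.8], [0.4, 0.9]]
new_points = oversample_minority(actives, n_new=5)
print(len(new_points))  # 5 synthetic actives
```

    Each synthetic point lies on a segment between two real actives, so the oversampled class stays inside the region the real actives occupy.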

  14. Mining Chemical Activity Status from High-Throughput Screening Assays.

    Directory of Open Access Journals (Sweden)

    Othman Soufan

    Full Text Available High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we carried out a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid-stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.

  15. Scanning fluorescence detector for high-throughput DNA genotyping

    Science.gov (United States)

    Rusch, Terry L.; Petsinger, Jeremy; Christensen, Carl; Vaske, David A.; Brumley, Robert L., Jr.; Luckey, John A.; Weber, James L.

    1996-04-01

    A new scanning fluorescence detector (SCAFUD) was developed for high-throughput genotyping of short tandem repeat polymorphisms (STRPs). Fluorescent dyes are incorporated into relatively short DNA fragments via the polymerase chain reaction (PCR) and are separated by electrophoresis in short, wide polyacrylamide gels (144 lanes with well-to-read distances of 14 cm). Excitation light from an argon laser with primary lines at 488 and 514 nm is introduced into the gel through a fiber optic cable, dichroic mirror, and 40X microscope objective. Emitted fluorescent light is collected confocally through a second fiber. The confocal head is translated across the bottom of the gel at 0.5 Hz. The detection unit utilizes dichroic mirrors and band-pass filters to direct light with 10-20 nm bandwidths to four photomultiplier tubes (PMTs). PMT signals are independently amplified with variable gain and then sampled at a rate of 2500 points per scan using a computer-based A/D board. LabView software (National Instruments) is used for instrument operation. Currently, three fluorescent dyes (Fam, Hex and Rox) are simultaneously detected with peak detection wavelengths of 543, 567, and 613 nm, respectively. The detection limit for fluorescein-labeled primers is about 100 attomoles. Planned SCAFUD upgrades include rearrangement of the laser head geometry, use of additional excitation lasers for simultaneous detection of more dyes, and the use of detector arrays instead of individual PMTs. Extensive software has been written for automatic analysis of SCAFUD images. The software enables background subtraction, band identification, multiple-dye signal resolution, lane finding, band sizing and allele calling. Whole-genome screens are currently underway to search for loci influencing such complex diseases as diabetes, asthma, and hypertension. Seven production SCAFUDs are currently in operation. Genotyping output for the coming year is projected to be about one million total genotypes (DNA
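    The "multiple-dye signal resolution" step can be illustrated for two spectrally overlapping dyes: given a known channel-response (crosstalk) matrix, the per-dye amounts follow from a small linear solve. The crosstalk coefficients below are hypothetical, not measured SCAFUD values.

```python
def unmix2(m11, m12, m21, m22, y1, y2):
    """Recover two dye amounts from two PMT channel readings by
    inverting a 2x2 channel-response (crosstalk) matrix with
    Cramer's rule: [y1, y2] = M @ [dye1, dye2]."""
    det = m11 * m22 - m12 * m21
    dye1 = (y1 * m22 - y2 * m12) / det
    dye2 = (m11 * y2 - m21 * y1) / det
    return dye1, dye2

# Hypothetical response matrix: channel 1 picks up 25% of dye-2
# emission, channel 2 picks up 15% of dye-1 emission.
fam, hex_ = unmix2(1.0, 0.25, 0.15, 1.0, 330.0, 165.0)
print(round(fam), round(hex_))  # 300 120
```

    With four PMTs and more dyes the same idea becomes a 4xN least-squares solve, but the principle is unchanged.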

  16. Mining Chemical Activity Status from High-Throughput Screening Assays.

    Science.gov (United States)

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B

    2015-01-01

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we carried out a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid-stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.

  17. High-throughput microfluidic line scan imaging for cytological characterization

    Science.gov (United States)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult due to motion blur and data loss during frame readout causing discontinuity of data acquisition as cells move at relatively high speeds through the chamber. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidics chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X oil immersion, 1.4 NA objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidics chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell which may be subsequently mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable to cell counting by flow cytometry, provided efficient examination including counting, classification, and differentiation of saliva, blood, and cultured human cancer cells.
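    The tight control of line rate against fluid velocity described above amounts to requiring that a cell advance exactly one object-plane pixel per scan line; otherwise images stretch or compress along the flow axis. A back-of-the-envelope sketch follows; the sensor pixel pitch and flow speed are assumed values, not figures from the paper.

```python
def required_line_rate(fluid_velocity_um_s, pixel_pitch_um, magnification):
    """Line rate (lines/s) at which a moving cell advances exactly one
    object-plane pixel per scan line, avoiding stretch/compression."""
    object_plane_pixel_um = pixel_pitch_um / magnification  # demagnified pixel
    return fluid_velocity_um_s / object_plane_pixel_um

# Assumed values: 5 um sensor pixels, 40x objective, 4 mm/s flow
rate = required_line_rate(4_000, 5.0, 40)
print(rate)  # 32000.0 lines/s, within the camera's 40 kHz capability
```

    Running the camera faster or slower than this rate distorts the aspect ratio of every cell, which is why the paper validates the matching with fluorescent microspheres.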

  18. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    Energy Technology Data Exchange (ETDEWEB)

    Ni, Jing [Iowa State Univ., Ames, IA (United States)

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores the surface enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman band, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, a simultaneous detection of a dual-analyte immunoassay was demonstrated. 
The use of this scheme for quantitative analysis was also studied, and preliminary dose-response curves from an immunoassay of a

  19. High-throughput DNA extraction of forensic adhesive tapes.

    Science.gov (United States)

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has, since its introduction in the early 2000s, become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward, while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  20. High Throughput Screening for Drugs that Modulate Intermediate Filament Proteins

    Science.gov (United States)

    Sun, Jingyuan; Groppi, Vincent E.; Gui, Honglian; Chen, Lu; Xie, Qing; Liu, Li

    2016-01-01

    Intermediate filament (IF) proteins have unique and complex cell and tissue distribution. Importantly, IF gene mutations cause or predispose to more than 80 human tissue-specific diseases (IF-pathies), with the most severe disease phenotypes being due to mutations at conserved residues that result in a disrupted IF network. A critical need for the entire IF-pathy field is the identification of drugs that can ameliorate or cure these diseases, particularly since all current therapies target the IF-pathy complication, such as diabetes or cardiovascular disease, rather than the mutant IF protein or gene. We describe a high throughput approach to identify drugs that can normalize disrupted IF proteins. This approach utilizes transduction of lentivirus that expresses green-fluorescent-protein-tagged keratin 18 (K18) R90C in A549 cells. The readout is drug ‘hits’ that convert the dot-like keratin filament distribution, due to the R90C mutation, to a wildtype-like filamentous array. A similar strategy can be used to screen thousands of compounds and can be utilized for practically any IF protein with a filament-disrupting mutation, and could therefore potentially target many IF-pathies. ‘Hits’ of interest require validation in cell culture then using in vivo experimental models. Approaches to study the mechanism of mutant-IF normalization by potential drugs of interest are also described. The ultimate goal of this drug screening approach is to identify effective and safe compounds that can potentially be tested for clinical efficacy in patients. PMID:26795471

  1. Hypoxia-sensitive reporter system for high-throughput screening.

    Science.gov (United States)

    Tsujita, Tadayuki; Kawaguchi, Shin-ichi; Dan, Takashi; Baird, Liam; Miyata, Toshio; Yamamoto, Masayuki

    2015-01-01

    The induction of anti-hypoxic stress enzymes and proteins has the potential to be a potent therapeutic strategy to prevent the progression of ischemic heart, kidney or brain diseases. To realize this idea, small chemical compounds, which mimic hypoxic conditions by activating the PHD-HIF-α system, have been developed. However, to date, none of these compounds were identified by monitoring the transcriptional activation of hypoxia-inducible factors (HIFs). Thus, to facilitate the discovery of potent inducers of HIF-α, we have developed an effective high-throughput screening (HTS) system to directly monitor the output of HIF-α transcription. We generated a HIF-α-dependent reporter system that responds to hypoxic stimuli in a concentration- and time-dependent manner. This system was developed through multiple optimization steps, resulting in the generation of a construct that consists of the secretion-type luciferase gene (Metridia luciferase, MLuc) under the transcriptional regulation of an enhancer containing 7 copies of 40-bp hypoxia responsive element (HRE) upstream of a mini-TATA promoter. This construct was stably integrated into the human neuroblastoma cell line, SK-N-BE(2)c, to generate a reporter system, named SKN:HRE-MLuc. To improve this system and to increase its suitability for the HTS platform, we incorporated the next generation luciferase, Nano luciferase (NLuc), whose longer half-life provides us with flexibility for the use of this reporter. We thus generated a stably transformed clone with NLuc, named SKN:HRE-NLuc, and found that it showed significantly improved reporter activity compared to SKN:HRE-MLuc. In this study, we have successfully developed the SKN:HRE-NLuc screening system as an efficient platform for future HTS.
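    For reporter systems like SKN:HRE-NLuc, suitability for an HTS platform is commonly summarized with the Z'-factor of Zhang et al., which measures how well the positive- and negative-control distributions separate. A minimal implementation follows; the luminescence counts are invented for illustration.

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are conventionally taken to indicate an assay
    with enough separation for high-throughput screening."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# Invented luciferase counts: hypoxia-mimetic positive control vs. vehicle
positive = [9800, 10100, 9950, 10050]
vehicle = [1020, 980, 1010, 990]
print(round(z_prime(positive, vehicle), 3))  # prints 0.95
```

    A secreted or long-half-life luciferase such as NLuc helps here mainly by tightening the control variances, which directly raises Z'.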

  2. High-throughput neuroimaging-genetics computational infrastructure.

    Science.gov (United States)

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D; Franco, Joseph; Toga, Arthur W

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation, and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize

  3. High-throughput screening method for lipases/esterases.

    Science.gov (United States)

    Mateos-Díaz, Eduardo; Rodríguez, Jorge Alberto; de Los Ángeles Camacho-Ruiz, María; Mateos-Díaz, Juan Carlos

    2012-01-01

    High-throughput screening (HTS) methods for lipases and esterases are generally performed by using synthetic chromogenic substrates (e.g., p-nitrophenyl, resorufin, and umbelliferyl esters) which may be misleading since they are not their natural substrates (e.g., partially or insoluble triglycerides). In previous works, we have shown that soluble nonchromogenic substrates and p-nitrophenol (as a pH indicator) can be used to quantify the hydrolysis and estimate the substrate selectivity of lipases and esterases from several sources. However, in order to implement a spectrophotometric HTS method using partially or insoluble triglycerides, it is necessary to find particular conditions which allow a quantitative detection of the enzymatic activity. In this work, we used Triton X-100, CHAPS, and N-lauroyl sarcosine as emulsifiers, β-cyclodextrin as a fatty acid captor, and two substrate concentrations, 1 mM of tributyrin (TC4) and 5 mM of trioctanoin (TC8), to improve the test conditions. To demonstrate the utility of this method, we screened 12 enzymes (commercial preparations and culture broth extracts) for the hydrolysis of TC4 and TC8, which are both classical substrates for lipases and esterases (for esterases, only TC4 may be hydrolyzed). Subsequent pH-stat experiments were performed to confirm the preference of substrate hydrolysis with the hydrolases tested. We have shown that this method is very useful for screening a high number of lipases (hydrolysis of TC4 and TC8) or esterases (only hydrolysis of TC4) from wild isolates or variants generated by directed evolution using nonchromogenic triglycerides directly in the test.
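    In pH-indicator assays of this kind, the fatty acids released by hydrolysis protonate the indicator, and activity is read from the initial slope of absorbance versus time. A minimal least-squares sketch follows; the absorbance readings are invented, not data from the study.

```python
def initial_rate(times, absorbances):
    """Ordinary least-squares slope of absorbance vs. time, used as
    the initial hydrolysis rate (delta-A per minute)."""
    n = len(times)
    mt = sum(times) / n
    ma = sum(absorbances) / n
    num = sum((t - mt) * (a - ma) for t, a in zip(times, absorbances))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Invented p-nitrophenol absorbance readings over 5 minutes:
# acidification by released fatty acids bleaches the indicator.
t = [0, 1, 2, 3, 4, 5]
A = [1.00, 0.96, 0.92, 0.88, 0.84, 0.80]
print(round(initial_rate(t, A), 4))  # prints -0.04
```

    Only the early, linear portion of the progress curve should be fitted; once substrate or indicator buffering is depleted, the slope no longer reflects enzyme activity.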

  4. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2016-10-14

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  5. Application of an active attachment model as a high-throughput demineralization biofilm model

    NARCIS (Netherlands)

    Silva, T.C.; Pereira, A.F.F.; Exterkate, R.A.M.; Bagnato, V.S.; Buzalaf, M.A.R.; de A.M. Machado, M.A.; ten Cate, J.M.; Crielaard, W.; Deng, D.M.

    2012-01-01

    Objectives: To investigate the potential of an active attachment biofilm model as a high-throughput demineralization biofilm model for the evaluation of caries-preventive agents. Methods: Streptococcus mutans UA159 biofilms were grown on bovine dentine discs in a high-throughput active attachment mode

  6. Recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers

    DEFF Research Database (Denmark)

    Andersen, Lars Dyrskjøt; Zieger, Karsten; Ørntoft, Torben Falck

    2007-01-01

individually contributed to the management of the disease. However, the development of high-throughput techniques for simultaneous assessment of a large number of markers has allowed classification of tumors into clinically relevant molecular subgroups beyond those possible by pathological classification. Here, we review the recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers.

  7. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  8. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  9. High-throughput genetic analysis using time-resolved fluorometry and closed-tube detection.

    Science.gov (United States)

    Nurmi, J; Kiviniemi, M; Kujanpää, M; Sjöroos, M; Ilonen, J; Lövgren, T

    2001-12-15

    Robust methods for genetic analysis are required for efficient exploitation of the constantly accumulating genetic information. We describe a closed-tube genotyping method suitable for high-throughput screening of genetic markers. The method is based on allele-specific probes labeled with an environment-sensitive lanthanide chelate, the fluorescence intensity of which is significantly increased upon PCR amplification of a complementary target. Genomic DNA samples were analyzed in an insulin gene single nucleotide polymorphism (SNP) assay using universal amplification primers and probes that recognized the two different alleles. The feasibility of dry-reagent-based all-in-one PCR assays was tested using another diabetes-related genetic marker, human leukocyte antigen DQB1 allele *0302, as a model analyte in a dual-color, closed-tube end-point assay. There was a 100% correlation between the novel SNP assay and a conventional PCR restriction fragment length polymorphism assay. It was also demonstrated that, using real-time monitoring, accurate genotyping results can be obtained despite strongly cross-reacting probes, minimizing the time and effort needed for optimization of the probe sequence. Throughput can be maximized by using predried PCR mixtures that are stable for at least 6 months. This homogeneous, all-in-one dry-reagent assay chemistry permits cost-effective genetic screening on a large scale.
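    Dual-color end-point assays such as this are typically genotyped by comparing the two allele-specific channel intensities. A minimal ratio-based caller is sketched below; the signal values and thresholds are invented for illustration, not taken from the published assay.

```python
def call_genotype(signal_a, signal_b, ratio_cut=3.0, min_total=100.0):
    """Classify one sample from two allele-specific fluorescence
    channels.  Thresholds are illustrative placeholders that a real
    assay would calibrate against control samples."""
    total = signal_a + signal_b
    if total < min_total:
        return "no-call"          # too little signal to genotype
    if signal_a > ratio_cut * signal_b:
        return "A/A"              # allele-A homozygote
    if signal_b > ratio_cut * signal_a:
        return "B/B"              # allele-B homozygote
    return "A/B"                  # both probes lit: heterozygote

print(call_genotype(900, 80))    # A/A
print(call_genotype(450, 400))   # A/B
print(call_genotype(30, 20))     # no-call
```

    The "no-call" branch is what lets strongly cross-reacting probes remain usable: ambiguous wells are flagged for repeat rather than miscalled.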

  10. Freely accessible databases of commercial compounds for high- throughput virtual screenings.

    Science.gov (United States)

    Moura Barbosa, Arménio Jorge; Del Rio, Alberto

    2012-01-01

    In recent decades, computer-aided drug design techniques have been successfully used to guide the selection of new hit compounds with biological activity. These methods, which include a broad range of chemoinformatic and computational chemistry algorithms, are still disciplines in full bloom. In particular, virtual screening procedures have enjoyed great popularity for the rapid and cost-effective assessment of large chemical libraries of commercial compounds. While the use of in silico techniques promises an effective speed-up at the early stages of the development of new active compounds, computational projects starting from scratch with raw chemical data are often burdened by resource- and time-consuming preparation protocols, almost blunting the advantages of using these techniques. To help address these difficulties, several chemoinformatic projects and tools have emerged in the literature in recent years and have been useful in preparing curated databases of chemical compounds for high-throughput virtual screening purposes. The review focuses on a detailed analysis of free databases of commercial chemical compounds that are currently employed in virtual screening campaigns for drug design. The scope of this review is to compare such databases and to advise the reader on how, and under which conditions, the use of these databases is recommended.

  11. High throughput atmospheric pressure plasma-induced graft polymerization for identifying protein-resistant surfaces.

    Science.gov (United States)

    Gu, Minghao; Kilduff, James E; Belfort, Georges

    2012-02-01

    Three critical aspects of searching for and understanding how to find surfaces highly resistant to protein adhesion are addressed here, with specific application to synthetic membrane filtration. They include (i) the discovery of a series of previously unreported monomers, from a large library of monomers, with high protein resistance and subsequent low-fouling characteristics for membrane ultrafiltration of protein-containing fluids; (ii) the development of a new approach to investigate protein-resistance mechanisms from structure-property relationships; and (iii) the adaptation of a new surface modification method, called atmospheric pressure plasma-induced graft polymerization (APP), together with a high-throughput platform (HTP), for low-cost, vacuum-free synthesis of anti-fouling membranes. Several new high-performing chemistries, comprising two polyethylene glycol (PEG) monomers, two amine monomers and one zwitterionic monomer, were identified as strongly protein-resistant from a library of 44 commercial monomers spanning five different classes. Combining our analysis here, using the Hansen solubility parameters (HSP) approach, with data from the literature, we conclude that strong interactions with water (hydrogen bonding) and surface flexibility are necessary for producing the highest protein resistance. Superior protein-resistant surfaces and subsequent anti-fouling performance were obtained with the HTP-APP as compared with our earlier HTP photo-induced graft polymerization (PGP).

  12. Quantum Chemistry, an Eclectic Mix: From Silicon Carbide to Size Consistency

    Energy Technology Data Exchange (ETDEWEB)

    Rintelman, Jamie Marie [Iowa State Univ., Ames, IA (United States)

    2004-12-19

    Chemistry is a field of great breadth and variety. It is this diversity that makes for both an interesting and challenging field. My interests have spanned three major areas of theoretical chemistry: applications, method development, and method evaluation. The topics presented in this thesis are as follows: (1) a multi-reference study of the geometries and relative energies of four atom silicon carbide clusters in the gas phase; (2) the reaction of acetylene on the Si(100)-(2x1) surface; (3) an improvement to the Effective Fragment Potential (EFP) solvent model to enable the study of reactions in both aqueous and nonaqueous solution; and (4) an evaluation of the size consistency of Multireference Perturbation Theory (MRPT). In the following section, the author briefly discusses two topics central to, and present throughout, this thesis: Multi-reference methods and Quantum Mechanics/Molecular Mechanics (QM/MM) methods.

  13. The AFLOW Standard for High-throughput Materials Science Calculations

    Science.gov (United States)

    2015-01-01


  14. High-throughput screening: speeding up porous materials discovery.

    Science.gov (United States)

    Wollmann, Philipp; Leistner, Matthias; Stoeck, Ulrich; Grünker, Ronny; Gedrich, Kristina; Klein, Nicole; Throl, Oliver; Grählert, Wulf; Senkovska, Irena; Dreisbach, Frieder; Kaskel, Stefan

    2011-05-14

    A new tool (Infrasorb-12) for the screening of porosity is described, identifying high surface area materials in a very short time with high accuracy. Further, an example for the application of the tool in the discovery of new cobalt-based metal-organic frameworks is given. © The Royal Society of Chemistry 2011

  15. Uquantchem: A versatile and easy to use Quantum Chemistry Computational Software

    CERN Document Server

    Souvatzis, Petros

    2013-01-01

    In this paper we present the Uppsala Quantum Chemistry package (UQUANTCHEM), a new and versatile computational platform with capabilities ranging from simple Hartree-Fock calculations to state-of-the-art first principles Extended Lagrangian Born-Oppenheimer Molecular Dynamics (XL-BOMD) and diffusion quantum Monte Carlo (DMC). The UQUANTCHEM package is distributed under the general public license and can be directly downloaded from the code website. Together with a presentation of the different capabilities of the UQUANTCHEM code and a more technical discussion of how these capabilities have been implemented, the user-friendly aspects of the package, built on a large number of default settings, are also presented. Furthermore, since the code has been parallelized within the framework of the message passing interface (MPI), the timing of some benchmark calculations is reported to illustrate how the code scales with the number of computational nodes for different levels of chemic...

  16. Green chemistry for large-scale synthesis of semiconductor quantum dots.

    Science.gov (United States)

    Liu, Jin-Hua; Fan, Jun-Bing; Gu, Zheng; Cui, Jing; Xu, Xiao-Bo; Liang, Zhi-Wu; Luo, Sheng-Lian; Zhu, Ming-Qiang

    2008-05-20

    A route for the large-scale synthesis of semiconductor nanocrystals, or quantum dots (QDs), at high concentration and high yield, achieved by simultaneously increasing the precursor concentrations, is introduced. This synthetic route, conducted in diesel, has produced gram-scale CdSe semiconductor quantum dots (under optimized scale-up conditions, the one-pot yield of QDs is up to 9.6 g). The reaction was conducted in open air and at relatively low temperatures (190-230 degrees C) in the absence of expensive organic phosphine ligands, aliphatic amines and octadecene. This is genuinely green chemistry: there is no high energy cost for a high-temperature reaction, and no nonessential toxic chemicals are used apart from Cd, which is the essential building block of the QDs.

  17. High-throughput saccharification assay for lignocellulosic materials.

    Science.gov (United States)

    Gomez, Leonardo D; Whitehead, Caragh; Roberts, Philip; McQueen-Mason, Simon J

    2011-07-03

    Polysaccharides that make up plant lignocellulosic biomass can be broken down to produce a range of sugars that subsequently can be used in establishing a biorefinery. These raw materials would constitute a new industrial platform, which is both sustainable and carbon neutral, to replace the current dependency on fossil fuel. The recalcitrance to deconstruction observed in lignocellulosic materials is produced by several intrinsic properties of plant cell walls. Crystalline cellulose is embedded in matrix polysaccharides such as xylans and arabinoxylans, and the whole structure is encased by the phenolic polymer lignin, which is also difficult to digest (1). In order to improve the digestibility of plant materials we need to discover the main bottlenecks for the saccharification of cell walls and also screen mutant and breeding populations to evaluate the variability in saccharification (2). These tasks require a high-throughput approach, and here we present an analytical platform that can perform saccharification analysis in a 96-well plate format. This platform has been developed to allow the screening of lignocellulose digestibility of large populations from varied plant species. We have scaled down the reaction volumes for gentle pretreatment, partial enzymatic hydrolysis and sugar determination, to allow large numbers to be assessed rapidly in an automated system. This automated platform works with milligram amounts of biomass, performing ball milling under controlled conditions to reduce the plant materials to a standardised particle size in a reproducible manner. Once the samples are ground, the automated formatting robot dispenses specified and recorded amounts of material into the corresponding wells of a 96-deep-well plate (Figure 1). Normally, we dispense the same material into 4 wells to have 4 replicates for analysis. Once the plates are filled with the plant material in the desired layout, they are manually moved to a liquid handling station (Figure 2
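The four-replicate, 96-well layout described in the abstract can be sketched as a small well-assignment helper. This is a hypothetical illustration: the sample names and the row-major fill order are assumptions, not details of the actual platform.

```python
import string

# 96-well plate: rows A-H, columns 1-12, filled in row-major order.
ROWS = string.ascii_uppercase[:8]          # 'A' .. 'H'
WELLS = [f"{row}{col}" for row in ROWS for col in range(1, 13)]

def plate_layout(samples, replicates=4):
    """Map each biomass sample to its block of replicate wells."""
    assert len(samples) * replicates <= len(WELLS), "plate overflow"
    return {s: WELLS[i * replicates:(i + 1) * replicates]
            for i, s in enumerate(samples)}

layout = plate_layout(["wheat_straw", "miscanthus"])
print(layout["wheat_straw"])   # the first block of four wells in row A
```

A real run would record the dispensed mass per well alongside each well ID.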

  18. High throughput modular chambers for rapid evaluation of anesthetic sensitivity

    Directory of Open Access Journals (Sweden)

    Eckmann David M

    2006-11-01

    Background: Anesthetic sensitivity is determined by the interaction of multiple genes; hence, a dissection of genetic contributors would be aided by precise and high-throughput behavioral screens. Traditionally, anesthetic phenotyping has addressed only induction of anesthesia, evaluated with dose-response curves, while ignoring potentially important data on emergence from anesthesia. Methods: We designed and built a controlled-environment apparatus to permit rapid phenotyping of twenty-four mice simultaneously. We used the loss of righting reflex to indicate anesthetic-induced unconsciousness. After fitting the data to a sigmoidal dose-response curve with variable slope, we calculated the MACLORR (EC50), the Hill coefficient, and the 95% confidence intervals bracketing these values. Upon termination of the anesthetic, Emergence timeRR was determined and expressed as the mean ± standard error for each inhaled anesthetic. Results: In agreement with several previously published reports, we find that the MACLORR of halothane, isoflurane, and sevoflurane in 8-12 week old C57BL/6J mice is 0.79% (95% confidence interval = 0.78-0.79%), 0.91% (95% confidence interval = 0.90-0.93%), and 1.96% (95% confidence interval = 1.94-1.97%), respectively. Hill coefficients for halothane, isoflurane, and sevoflurane are 24.7 (95% confidence interval = 19.8-29.7), 19.2 (95% confidence interval = 14.0-24.3), and 33.1 (95% confidence interval = 27.3-38.8), respectively. After roughly 2.5 MACLORR·hr exposures, mice take 16.00 ± 1.07, 6.19 ± 0.32, and 2.15 ± 0.12 minutes to emerge from halothane, isoflurane, and sevoflurane, respectively. Conclusion: This system enabled assessment of inhaled anesthetic responsiveness with a higher precision than that previously reported. It is broadly adaptable for delivering an inhaled therapeutic (or toxin) to a population while monitoring its vital signs, motor reflexes, and providing precise control
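The sigmoidal dose-response curve with variable slope used in this record is a Hill function, with MACLORR as the EC50. The sketch below evaluates that model with the isoflurane parameters reported in the abstract (EC50 = 0.91%, Hill coefficient 19.2); it illustrates the model only and is not the authors' fitting code.

```python
# Hill dose-response: fraction of animals showing loss of righting
# reflex (LORR) at anesthetic concentration `conc` (vol %).
def fraction_lorr(conc, ec50, n):
    return conc**n / (ec50**n + conc**n)

EC50, HILL = 0.91, 19.2   # isoflurane values from the abstract
for c in (0.80, 0.91, 1.00):
    # at c == EC50 exactly half the population shows LORR
    print(f"{c:.2f}% isoflurane -> fraction with LORR = "
          f"{fraction_lorr(c, EC50, HILL):.2f}")
```

The very large Hill coefficients reported explain the steepness: a change of ~0.1% in agent concentration moves the population from mostly responsive to mostly anesthetized.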

  19. High throughput RNAi assay optimization using adherent cell cytometry

    Directory of Open Access Journals (Sweden)

    Pradhan Leena

    2011-04-01

    Background: siRNA technology is a promising tool for gene therapy of vascular disease. Due to the multitude of reagents and cell types, RNAi experiment optimization can be time-consuming. In this study adherent cell cytometry was used to rapidly optimize siRNA transfection in human aortic vascular smooth muscle cells (AoSMC). Methods: AoSMC were seeded at a density of 3000-8000 cells/well of a 96-well plate. 24 hours later AoSMC were transfected with either non-targeting unlabeled siRNA (50 nM) or non-targeting labeled siRNA, siGLO Red (5 or 50 nM), using no transfection reagent, HiPerfect or Lipofectamine RNAiMax. For counting cells, Hoechst nuclei stain or Cell Tracker green were used. For data analysis an adherent cell cytometer (Celigo®) was used. Data were normalized to the transfection-reagent-alone group and expressed as red pixel count/cell. Results: After 24 hours, none of the transfection conditions led to cell loss. Red fluorescence counts were normalized to the AoSMC count. RNAiMax was more potent compared to HiPerfect or no transfection reagent at 5 nM siGLO Red (4.12 +/- 1.04 vs. 0.70 +/- 0.26 vs. 0.15 +/- 0.13 red pixels/cell) and 50 nM siGLO Red (6.49 +/- 1.81 vs. 2.52 +/- 0.67 vs. 0.34 +/- 0.19). Fluorescence expression results supported gene knockdown achieved by using MARCKS-targeting siRNA in AoSMCs. Conclusion: This study underscores that RNAi delivery depends heavily on the choice of delivery method. Adherent cell cytometry can be used as a high-throughput screening tool for the optimization of RNAi assays. This technology can accelerate in vitro cell assays and thus save costs.
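The normalization this record relies on, total red fluorescence per well divided by the cell count in that well, is a one-line computation. The raw counts below are hypothetical, chosen only to reproduce the reported 4.12 red pixels/cell for RNAiMax at 5 nM siGLO Red.

```python
# Hypothetical Celigo-style per-well readouts (illustrative values).
wells = {
    "RNAiMax_5nM":    {"red_pixels": 12360, "cells": 3000},
    "HiPerfect_5nM":  {"red_pixels": 2100,  "cells": 3000},
    "no_reagent_5nM": {"red_pixels": 450,   "cells": 3000},
}

# Normalize fluorescence to cell number: red pixels per cell.
per_cell = {name: w["red_pixels"] / w["cells"] for name, w in wells.items()}
for name, value in sorted(per_cell.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {value:.2f} red pixels/cell")
```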

  20. Emerging high throughput analyses of cyanobacterial toxins and toxic cyanobacteria.

    Science.gov (United States)

    Sivonen, Kaarina

    2008-01-01

    The common occurrence of toxic cyanobacteria causes problems for the health of animals and human beings. More research and good monitoring systems are needed to protect water users. It is important to have rapid, reliable and accurate analyses, i.e. high-throughput methods, to identify the toxins as well as the toxin producers in the environment. Excellent methods, such as ELISA, already exist to analyse cyanobacterial hepatotoxins and saxitoxins, and PPIA for microcystins and nodularins. The LC/MS method can be fast in identifying the toxicants in the samples. Further development of this area should resolve the problems with sampling and sample preparation, which are still the bottlenecks of rapid analyses. In addition, the availability of reliable reference materials and standards should be resolved. Molecular detection methods are now routine in clinical and criminal laboratories and may also become important in environmental diagnostics. One prerequisite for the development of molecular analysis is that pure cultures of the producer organisms are available for identification of the biosynthetic genes responsible for toxin production and for proper testing of the diagnostic methods. Good methods are already available for the microcystin- and nodularin-producing cyanobacteria, such as conventional PCR, quantitative real-time PCR and microarrays/DNA chips. The DNA-chip technology offers an attractive monitoring system for toxic and non-toxic cyanobacteria. Only with these new technologies (PCR + DNA chips) will we be able to study toxic cyanobacteria populations in situ and the effects of environmental factors on the occurrence and proliferation of especially toxic cyanobacteria. This is likely to yield important information for mitigation purposes. Further development of these methods should include all cyanobacterial biodiversity, including all toxin producers and primers/probes to detect producers of neurotoxins, cylindrospermopsins etc. (genes are unknown). The on

  1. High-throughput metal susceptibility testing of microbial biofilms

    Directory of Open Access Journals (Sweden)

    Turner Raymond J

    2005-10-01

    Background: Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results: This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO32-) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic

  2. High throughput comet assay to study genotoxicity of nanomaterials

    Directory of Open Access Journals (Sweden)

    Naouale El Yamani

    2015-06-01

    The unique physicochemical properties of engineered nanomaterials (NMs) have accelerated their use in diverse industrial and domestic products. Although their presence in consumer products represents a major concern for public health safety, their potential impact on human health is poorly understood. There is therefore an urgent need to clarify the toxic effects of NMs and to elucidate the mechanisms involved. In view of the large number of NMs currently being used, high-throughput (HTP) screening technologies are clearly needed for efficient assessment of toxicity. The comet assay is the most used method in nanogenotoxicity studies and has great potential for increasing throughput, as it is fast, versatile and robust; simple technical modifications of the assay make it possible to test many compounds (NMs) in a single experiment. The standard gel of 70-100 μL contains thousands of cells, of which only a tiny fraction are actually scored. Reducing the gel to a volume of 5 μL, with just a few hundred cells, allows twelve gels to be set on a standard slide, or 96 as a standard 8x12 array. For the 12-gel format, standard slides precoated with agarose are placed on a metal template and gels are set on the positions marked on the template. The HTP comet assay, incorporating digestion of DNA with formamidopyrimidine DNA glycosylase (FPG) to detect oxidised purines, has recently been applied to study the potential induction of genotoxicity by NMs via reactive oxygen. In the NanoTEST project we investigated the genotoxic potential of several well-characterized metal and polymeric nanoparticles with the comet assay. All in vitro studies were harmonized; i.e. NMs were from the same batch, and identical dispersion protocols, exposure time, concentration range, culture conditions, and time-courses were used. As a kidney model, Cos-1 fibroblast-like kidney cells were treated with different concentrations of iron oxide NMs, and cells embedded in minigels (12

  3. Introducing Discrete Frequency Infrared Technology for High-Throughput Biofluid Screening

    Science.gov (United States)

    Hughes, Caryn; Clemens, Graeme; Bird, Benjamin; Dawson, Timothy; Ashton, Katherine M.; Jenkinson, Michael D.; Brodbelt, Andrew; Weida, Miles; Fotheringham, Edeline; Barre, Matthew; Rowlette, Jeremy; Baker, Matthew J.

    2016-02-01

    Accurate early diagnosis is critical to patient survival, management and quality of life. Biofluids are key to early diagnosis due to their ease of collection and intimate involvement in human function. Large-scale mid-IR imaging of dried fluid deposits offers a high-throughput molecular analysis paradigm for the biomedical laboratory. The exciting advent of tuneable quantum cascade lasers allows for the collection of discrete frequency infrared data, enabling clinically relevant timescales. By scanning targeted frequencies, spectral quality, reproducibility and diagnostic potential can be maintained while significantly reducing acquisition time and processing requirements, sampling 16 serum spots with 0.6, 5.1 and 15% relative standard deviation (RSD) for 199, 14 and 9 discrete frequencies, respectively. We use this reproducible methodology to show proof-of-concept rapid diagnostics; 40 unique dried liquid biopsies from brain, breast, lung and skin cancer patients were classified in 2.4 cumulative seconds against 10 non-cancer controls with accuracies of up to 90%.
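The reproducibility figures quoted above are relative standard deviations (%RSD = 100 × standard deviation / mean). A minimal sketch, with invented absorbance readings for illustration:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation: 100 * sample stdev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical absorbance readings from replicate dried serum spots
readings = [0.502, 0.498, 0.505, 0.499, 0.501]
print(f"RSD = {rsd_percent(readings):.2f}%")
```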

  4. Implementation of a high throughput spectrograph for Thomson scattering measurements on the Compact Toroidal Hybrid

    Science.gov (United States)

    Goforth, Matthew; Traverso, Peter; Maurer, David

    2013-10-01

    To better understand the equilibrium and stability of Compact Toroidal Hybrid (CTH) plasmas, a multipoint Thomson scattering system is under development at Auburn University. Thomson scattering will be performed at 532 nm using a frequency doubled Continuum PL DLS Nd:YAG laser. The Thomson scattered light will be measured using a high throughput HoloSpec f/1.8i imaging spectrograph with in-line interference filter for spectral discrimination of stray laser light. An image intensified charge coupled device (ICCD) camera employing a Gen III photocathode with quantum efficiency of approximately 50% near the frequency doubled laser line is planned as the detection element for the scattered light. Bench and CTH impurity line emission measurements will be presented quantifying spectrometer and ICCD performance and suitability for scattering measurements over the visible spectral region near 532 nm. This work has been supported by US Department of Energy Grant No. DE-FG02-00ER54610 and the Auburn University Undergraduate Research Fellowship Program.

  5. Code interoperability and standard data formats in quantum chemistry and quantum dynamics: The Q5/D5Cost data model.

    Science.gov (United States)

    Rossi, Elda; Evangelisti, Stefano; Laganà, Antonio; Monari, Antonio; Rampino, Sergio; Verdicchio, Marco; Baldridge, Kim K; Bendazzoli, Gian Luigi; Borini, Stefano; Cimiraglia, Renzo; Angeli, Celestino; Kallay, Peter; Lüthi, Hans P; Ruud, Kenneth; Sanchez-Marin, José; Scemama, Anthony; Szalay, Peter G; Tajti, Attila

    2014-03-30

    Code interoperability and the search for domain-specific standard data formats represent critical issues in many areas of computational science. The advent of novel computing infrastructures such as computational grids and clouds make these issues even more urgent. The design and implementation of a common data format for quantum chemistry (QC) and quantum dynamics (QD) computer programs is discussed with reference to the research performed in the course of two Collaboration in Science and Technology Actions. The specific data models adopted, Q5Cost and D5Cost, are shown to work for a number of interoperating codes, regardless of the type and amount of information (small or large datasets) to be exchanged. The codes are either interfaced directly, or transfer data by means of wrappers; both types of data exchange are supported by the Q5/D5Cost library. Further, the exchange of data between QC and QD codes is addressed. As a proof of concept, the H + H2 reaction is discussed. The proposed scheme is shown to provide an excellent basis for cooperative code development, even across domain boundaries. Moreover, the scheme presented is found to be useful also as a production tool in the grid distributed computing environment.

  6. A high-throughput reactor system for optimization of Mo–V–Nb mixed oxide catalyst composition in ethane ODH

    KAUST Repository

    Zhu, Haibo

    2015-01-01

    75 Mo-V-Nb mixed oxide catalysts with a broad range of compositions were prepared by a simple evaporation method, and were screened for the ethane oxidative dehydrogenation (ODH) reaction. The compositions of these 75 catalysts were systematically changed by varying the Nb loading, and the Mo/V molar ratio. Characterization by XRD, XPS, H2-TPR and SEM revealed that an intimate structure is formed among the 3 components. The strong interaction among different components leads to the formation of a new phase or an "intimate structure". The dependency of conversion and selectivity on the catalyst composition was clearly demonstrated from the results of high-throughput testing. The optimized Mo-V-Nb molar composition was confirmed to be composed of a Nb content of 4-8%, a Mo content of 70-83%, and a V content of 12-25%. The enhanced catalytic performance of the mixed oxides is obviously due to the synergistic effects of the different components. The optimized compositions for ethane ODH revealed in our high-throughput tests and the structural information provided by our characterization studies can serve as the starting point for future efforts to improve the catalytic performance of Mo-V-Nb oxides. This journal is © The Royal Society of Chemistry.
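A composition screen like the one above can be set up by enumerating molar compositions on a grid and filtering to the reported optimal window (Nb 4-8%, Mo 70-83%, V 12-25%). The 2% step size here is an arbitrary choice for illustration, not the authors' actual sampling scheme.

```python
# Enumerate candidate Mo/V/Nb molar compositions (summing to 100%) on a
# coarse grid, keeping only those inside the reported optimal window.
STEP = 2
candidates = [
    (mo, 100 - mo - nb, nb)                 # (Mo %, V %, Nb %)
    for nb in range(4, 9, STEP)             # Nb: 4-8%
    for mo in range(70, 84, STEP)           # Mo: 70-82% on this grid
    if 12 <= 100 - mo - nb <= 25            # V constrained to 12-25%
]
print(f"{len(candidates)} compositions to screen")
for mo, v, nb in candidates[:3]:
    print(f"Mo {mo}% / V {v}% / Nb {nb}%")
```

In a real campaign each tuple would map to a precursor recipe for one reactor channel of the high-throughput system.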

  7. Interactive quantum chemistry: a divide-and-conquer ASED-MO method.

    Science.gov (United States)

    Bosson, Mäel; Richard, Caroline; Plet, Antoine; Grudinin, Sergei; Redon, Stephane

    2012-03-15

    We present interactive quantum chemistry simulation at the atom superposition and electron delocalization molecular orbital (ASED-MO) level of theory. Our method is based on the divide-and-conquer (D&C) approach, which we show is accurate and efficient for this non-self-consistent semiempirical theory. The method has a linear complexity in the number of atoms, scales well with the number of cores, and has a small prefactor. The time cost is completely controllable, as all steps are performed with direct algorithms, i.e., no iterative schemes are used. We discuss the errors induced by the D&C approach, first empirically on a few examples, and then via a theoretical study of two toy models that can be analytically solved for any number of atoms. Thanks to the precision and speed of the D&C approach, we are able to demonstrate interactive quantum chemistry simulations for systems up to a few hundred atoms on a current multicore desktop computer. When drawing and editing molecular systems, interactive simulations provide immediate, intuitive feedback on chemical structures. As the number of cores on personal computers increases, and larger and larger systems can be dealt with, we believe such interactive simulations, even at lower levels of theory, should prove most useful to effectively understand, design and prototype molecules, devices and materials.

  8. A High-Throughput Pipeline for the Design of Real-Time PCR Signatures

    Science.gov (United States)

    2010-06-23

    A high-throughput pipeline for the design of real-time PCR signatures. Ravi Vijaya Satya ... BMC Bioinformatics 2010, 11:340, doi:10.1186/1471-2105-11-340.

  9. A Synthesis of Fluid Dynamics and Quantum Chemistry for the Design of Nanoelectronics

    Science.gov (United States)

    MacDougall, Preston J.

    1998-01-01

    In 1959, during a famous lecture entitled "There's Plenty of Room at the Bottom", Richard Feynman focused on the startling technical possibilities that would exist at the limit of miniaturization, that being atomically precise devices with dimensions in the nanometer range. A nanometer is both a convenient unit of length for medium to large sized molecules, and the root of the name of the new interdisciplinary field of "nanotechnology". Essentially, "nanoelectronics" denotes the goal of shrinking electronic devices, such as diodes and transistors, as well as integrated circuits of such devices that can perform logical operations, down to dimensions in the range of 100 nanometers. The thirty-year hiatus in the development of nanotechnology can figuratively be seen as a period of waiting for the bottom-up and atomically precise construction skills of synthetic chemistry to meet the top-down reductionist aspirations of device physics. The sub-nanometer domain of nineteenth-century classical chemistry has steadily grown, and state-of-the-art supramolecular chemistry can achieve atomic precision in non-repeating molecular assemblies of the size desired for nanotechnology. For nanoelectronics in particular, a basic understanding of the electron transport properties of molecules must also be developed. Quantum chemistry provides powerful computational methods that can accurately predict the properties of small to medium sized molecules on a desktop workstation, and those of large molecules if one has access to a supercomputer. Of the many properties of a molecule that quantum chemistry routinely predicts, the ability to carry a current is one that had not even been considered until recently. "Currently", there is a controversy over just how to define this key property. Reminiscent of the situation in high-Tc superconductivity, much of the difficulty arises from the different models that are used to simplify the complex electronic structure of real materials. A model

  10. A high throughput gas exchange screen for determining rates of photorespiration or regulation of C4 activity.

    Science.gov (United States)

    Bellasio, Chandra; Burgess, Steven J; Griffiths, Howard; Hibberd, Julian M

    2014-07-01

    Large-scale research programmes seeking to characterize the C4 pathway have a requirement for a simple, high-throughput screen that quantifies photorespiratory activity in C3 and C4 model systems. At present, approaches rely on model-fitting to assimilatory responses (A/Ci curves, PSII quantum yield) or real-time carbon isotope discrimination, which are complicated and time-consuming. Here we present a method, and the associated theory, to determine the effectiveness of the C4 carboxylation, carbon concentration mechanism (CCM) by assessing the responsiveness of VO/VC, the ratio of RuBisCO oxygenase to carboxylase activity, upon transfer to low O2. This determination compares concurrent gas exchange and pulse-modulated chlorophyll fluorescence under ambient and low O2, using widely available equipment. Run time for the procedure can take as little as 6 minutes if plants are pre-adapted. The responsiveness of VO/VC is derived for typical C3 (tobacco, rice, wheat) and C4 (maize, Miscanthus, Cleome) plants, and compared with full C3 and C4 model systems. We also undertake sensitivity analyses to determine the impact of RLIGHT (respiration in the light) and the effectiveness of the light-saturating pulse used by fluorescence systems. The results show that the method can readily resolve variations in photorespiratory activity between C3 and C4 plants and could be used to rapidly screen large numbers of mutants or transformants in high-throughput studies.

  11. Quantum Diffusion-Controlled Chemistry: Reactions of Atomic Hydrogen with Nitric Oxide in Solid Parahydrogen.

    Science.gov (United States)

    Ruzi, Mahmut; Anderson, David T

    2015-12-17

    Our group has been working to develop parahydrogen (pH2) matrix isolation spectroscopy as a method to study low-temperature condensed-phase reactions of atomic hydrogen with various reaction partners. Guided by the well-defined studies of cold atom chemistry in rare-gas solids, the special properties of quantum hosts such as solid pH2 afford new opportunities to study the analogous chemical reactions under quantum diffusion conditions in hopes of discovering new types of chemical reaction mechanisms. In this study, we present Fourier transform infrared spectroscopic studies of the 193 nm photoinduced chemistry of nitric oxide (NO) isolated in solid pH2 over the 1.8 to 4.3 K temperature range. Upon short-term in situ irradiation the NO readily undergoes photolysis to yield HNO, NOH, NH, NH3, H2O, and H atoms. We map the postphotolysis reactions of mobile H atoms with NO and document first-order growth in HNO and NOH reaction products for up to 5 h after photolysis. We perform three experiments at 4.3 K and one at 1.8 K to permit the temperature dependence of the reaction kinetics to be quantified. We observe Arrhenius-type behavior with a pre-exponential factor of A = 0.036(2) min^-1 and Ea = 2.39(1) cm^-1. This is in sharp contrast to previous H atom reactions we have studied in solid pH2 that display definitively non-Arrhenius behavior. The contrasting temperature dependence measured for the H + NO reaction is likely related to the details of H atom quantum diffusion in solid pH2 and deserves further study.
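
    The Arrhenius parameters quoted above (A = 0.036 min^-1, Ea = 2.39 cm^-1) imply a modest but measurable slowdown between the two experimental temperatures. A minimal sketch of that arithmetic, using the standard value of Boltzmann's constant in spectroscopic units (~0.695 cm^-1 per K):

```python
import math

A_MIN = 0.036   # pre-exponential factor from the fit, min^-1
EA_CM = 2.39    # activation energy from the fit, cm^-1
KB_CM = 0.695   # Boltzmann constant in spectroscopic units, cm^-1 per K

def k_arrhenius(temp_k):
    """First-order rate constant (min^-1) at temperature temp_k (K)."""
    return A_MIN * math.exp(-EA_CM / (KB_CM * temp_k))

for temp_k in (1.8, 4.3):  # the two experimental temperatures
    print(f"T = {temp_k} K: k = {k_arrhenius(temp_k):.4f} min^-1")
```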

  12. Exponentially more precise quantum simulation of fermions II: Quantum chemistry in the CI matrix representation

    CERN Document Server

    Babbush, Ryan; Kivlichan, Ian D; Wei, Annie Y; Love, Peter J; Aspuru-Guzik, Alán

    2015-01-01

    We present a quantum algorithm for the simulation of molecular systems that is asymptotically more efficient than all previous algorithms in the literature. As in the first paper of this series \cite{BabbushSparse1}, we employ a recently developed technique for simulating Hamiltonian evolution using a truncated Taylor series to obtain logarithmic scaling with the inverse of the desired precision, which is an exponential improvement over methods based on the Trotter-Suzuki decomposition. The algorithm of this paper involves simulation under an oracle for the sparse, first quantized representation of the Hamiltonian known as the configuration interaction (CI) matrix. We construct and query the CI matrix oracle to allow for on-the-fly computation of molecular integrals in a way that is exponentially more efficient than classical numerical methods. Whereas second quantized representations of the wavefunction require $\widetilde{\cal O}(N)$ qubits, where $N$ is the number of single-particle spin-orbitals, the CI m...

  13. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    Science.gov (United States)

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, among others, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  14. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3.04 "Propulsion Systems," Busek Co. Inc. will develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  15. Development of an optimized medium, strain and high-throughput culturing methods for Methylobacterium extorquens

    National Research Council Canada - National Science Library

    Delaney, Nigel F; Kaczmarek, Maria E; Ward, Lewis M; Swanson, Paige K; Lee, Ming-Chun; Marx, Christopher J

    2013-01-01

    .... Here we develop a new system for high-throughput batch culture of M. extorquens in microtiter plates by jointly optimizing the properties of the organism, the growth media and the culturing system...

  16. Wide Throttling, High Throughput Hall Thruster for Science and Exploration Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to Topic S3-04 "Propulsion Systems," Busek proposes to develop a high throughput Hall effect thruster with a nominal peak power of 1-kW and wide...

  17. EMBRYONIC VASCULAR DISRUPTION ADVERSE OUTCOMES: LINKING HIGH THROUGHPUT SIGNALING SIGNATURES WITH FUNCTIONAL CONSEQUENCES

    Science.gov (United States)

    Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...

  18. High Throughput Atomic Layer Deposition Processes: High Pressure Operations, New Reactor Designs, and Novel Metal Processing

    Science.gov (United States)

    Mousa, MoatazBellah Mahmoud

    Atomic Layer Deposition (ALD) is a vapor phase nano-coating process that deposits very uniform and conformal thin film materials with sub-angstrom level thickness control on various substrates. These unique properties made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to other deposition techniques, which hinders its industrial adoption. In addition to the low throughput, process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs. continuous) or process conditions (e.g., low temperature), the absence of an appropriate reactor design for a specific substrate, and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high pressure, low temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate to achieve fast depositions at these challenging processing conditions. Also in this work, two unique high throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head.
Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD based process called sequential vapor

  19. Solid-phase cloning for high-throughput assembly of single and multiple DNA parts

    DEFF Research Database (Denmark)

    Lundqvist, Magnus; Edfors, Fredrik; Sivertsson, Åsa

    2015-01-01

    We describe solid-phase cloning (SPC) for high-throughput assembly of expression plasmids. Our method allows PCR products to be put directly into a liquid handler for capture and purification using paramagnetic streptavidin beads and conversion into constructs by subsequent cloning reactions. We ... at an average success rate above 80%. We report on several applications for SPC and we suggest it to be particularly suitable for high-throughput efforts using laboratory workstations....

  20. A multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans

    OpenAIRE

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-01-01

    The booming nanotech industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At t...

  1. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    Science.gov (United States)

    Luft, JR; Snell, EH; DeTitta, GT

    2011-01-01

    Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  2. Shielding of quantum dots using diblock copolymers: implementing copper catalyzed click chemistry to fluorescent quantum dots

    Science.gov (United States)

    Merkl, Jan-Philip; Ostermann, Johannes; Schmidtke, Christian; Kloust, Hauke; Eggers, Robin; Feld, Artur; Wolter, Christopher; Kreuziger, Anna-Marlena; Flessau, Sandra; Mattoussi, Hedi; Weller, Horst

    2014-03-01

    We describe the design and optimization of an amphiphilic diblock copolymer and its use to provide surface functionalization of colloidal semiconductor nanoparticles (quantum dots, QDs). This polymer coating promotes hydrophilicity of the nanocrystals while providing numerous functional groups ideally suited for biofunctionalization of the QDs using copper-catalyzed azide-alkyne Huisgen 1,3-cycloaddition (i.e., the copper-catalyzed "click" reaction). Copper ions are known to quench the fluorescence of QDs in solution; thus, effective shielding of the nanocrystal surface is essential to apply copper-catalyzed reactions to luminescent QDs without drastically quenching their emission. We have applied a strategy based on micellar encapsulation within poly(isoprene-block-ethylene oxide) diblock copolymers (PI-b-PEO), where three critical factors promote and control the effectiveness of the shielding against copper ion penetration: (1) the excess of PI-b-PEO, (2) the size of PI-b-PEO, and (3) the insertion of an additional PS shell grown via a seeded emulsion polymerization (EP) reaction. Due to the amphiphilic character of the block copolymer, this approach provides a shielding layer surrounding the particles, preventing metal ions from reaching the QD surfaces and maintaining high photoluminescence. The effective shielding allowed the application of copper-catalyzed azide-alkyne 1,3-cycloaddition (CuAAC) to hydrophilic and highly fluorescent QDs, opening up great possibilities for the biofunctionalization of QDs.

  3. The successful merger of theoretical thermochemistry with fragment-based methods in quantum chemistry.

    Science.gov (United States)

    Ramabhadran, Raghunath O; Raghavachari, Krishnan

    2014-12-16

    CONSPECTUS: Quantum chemistry and electronic structure theory have proven to be essential tools to the experimental chemist, in terms of both a priori predictions that pave the way for designing new experiments and rationalizing experimental observations a posteriori. Translating the well-established success of electronic structure theory in obtaining the structures and energies of small chemical systems to increasingly larger molecules is an exciting and ongoing central theme of research in quantum chemistry. However, the prohibitive computational scaling of highly accurate ab initio electronic structure methods poses a fundamental challenge to this research endeavor. This scenario necessitates an indirect fragment-based approach wherein a large molecule is divided into small fragments and is subsequently reassembled to compute its energy accurately. In our quest to further reduce the computational expense associated with the fragment-based methods and overall enhance the applicability of electronic structure methods to large molecules, we realized that the broad ideas involved in a different area, theoretical thermochemistry, are transferable to the area of fragment-based methods. This Account focuses on the effective merger of these two disparate frontiers in quantum chemistry and how new concepts inspired by theoretical thermochemistry significantly reduce the total number of electronic structure calculations needed to be performed as part of a fragment-based method without any appreciable loss of accuracy. Throughout, the generalized connectivity based hierarchy (CBH), which we developed to solve a long-standing problem in theoretical thermochemistry, serves as the linchpin in this merger. The accuracy of our method is based on two strong foundations: (a) the apt utilization of systematic and sophisticated error-canceling schemes via CBH that result in an optimal cutting scheme at any given level of fragmentation and (b) the use of a less expensive second

  4. Polymer surface functionalities that control human embryoid body cell adhesion revealed by high throughput surface characterization of combinatorial material microarrays.

    Science.gov (United States)

    Yang, Jing; Mei, Ying; Hook, Andrew L; Taylor, Michael; Urquhart, Andrew J; Bogatyrev, Said R; Langer, Robert; Anderson, Daniel G; Davies, Martyn C; Alexander, Morgan R

    2010-12-01

    High throughput materials discovery using combinatorial polymer microarrays to screen for new biomaterials with new and improved function is established as a powerful strategy. Here we combine this screening approach with high throughput surface characterization (HT-SC) to identify surface structure-function relationships. We explore how this combination can help to identify surface chemical moieties that control protein adsorption and subsequent cellular response. The adhesion of human embryoid body (hEB) cells to a large number (496) of different acrylate polymers synthesized in a microarray format is screened using a high throughput procedure. To determine the role of the polymer surface properties on hEB cell adhesion, detailed HT-SC of these acrylate polymers is carried out using time of flight secondary ion mass spectrometry (ToF SIMS), X-ray photoelectron spectroscopy (XPS), picolitre-drop sessile water contact angle (WCA) measurements and atomic force microscopy (AFM). A structure-function relationship is identified between the ToF SIMS analysis of the surface chemistry after a fibronectin (Fn) pre-conditioning step and the cell adhesion to each spot using the multivariate analysis technique partial least squares (PLS) regression. Secondary ions indicative of the adsorbed Fn correlate with increased cell adhesion, whereas glycol and other functionalities from the polymers are identified that reduce cell adhesion. Furthermore, a strong relationship between the ToF SIMS spectra of bare polymers and the cell adhesion to each spot is identified using PLS regression. This identifies a role for both the surface chemistry of the bare polymer and the pre-adsorbed Fn, as represented in the ToF SIMS spectra, in controlling cellular adhesion. In contrast, no relationship is found between cell adhesion and wettability, surface roughness, elemental or functional surface composition.
The correlation between ToF SIMS data of the surfaces and the cell adhesion demonstrates
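
    The multivariate step named in this record, PLS regression from high-dimensional ToF SIMS peak intensities to a per-spot cell-adhesion score, can be sketched with a minimal single-response NIPALS implementation. The data below are synthetic stand-ins (a few informative "peaks" drive the response), not the study's spectra:

```python
import numpy as np

def pls1_fitted(X, y, n_components):
    """Minimal NIPALS PLS1: returns fitted y values on the training data."""
    Xr = X - X.mean(axis=0)
    yr = y - y.mean()
    y_hat = np.full(y.shape, y.mean())
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)          # weight vector
        t = Xr @ w                      # latent scores
        tt = float(t @ t)
        p = Xr.T @ t / tt               # X loadings
        q = float(yr @ t) / tt          # y loading
        Xr = Xr - np.outer(t, p)        # deflate X
        yr = yr - q * t                 # deflate y
        y_hat = y_hat + q * t
    return y_hat

rng = np.random.default_rng(0)
n_spots, n_peaks = 200, 50
X = rng.normal(size=(n_spots, n_peaks))     # synthetic "spectra"
w_true = np.zeros(n_peaks)
w_true[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]    # a few informative peaks
y = X @ w_true + 0.1 * rng.normal(size=n_spots)

y_hat = pls1_fitted(X, y, n_components=5)
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"training R^2 = {r2:.3f}")
```

    In practice the loading vectors, not just the fit quality, are the payoff: peaks with large loadings point to secondary ions that correlate positively or negatively with adhesion.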

  5. High throughput screens yield small molecule inhibitors of Leishmania CRK3:CYC6 cyclin-dependent kinase.

    Directory of Open Access Journals (Sweden)

    Roderick G Walker

    Full Text Available BACKGROUND: Leishmania species are parasitic protozoa that have a tightly controlled cell cycle, regulated by cyclin-dependent kinases (CDKs. Cdc2-related kinase 3 (CRK3, an essential CDK in Leishmania and functional orthologue of human CDK1, can form an active protein kinase complex with Leishmania cyclins CYCA and CYC6. Here we describe the identification and synthesis of specific small molecule inhibitors of bacterially expressed Leishmania CRK3:CYC6 using a high throughput screening assay and iterative chemistry. We also describe the biological activity of the molecules against Leishmania parasites. METHODOLOGY/PRINCIPAL FINDINGS: In order to obtain an active Leishmania CRK3:CYC6 protein kinase complex, we developed a co-expression and co-purification system for Leishmania CRK3 and CYC6 proteins. This active enzyme was used in a high throughput screening (HTS platform, utilising an IMAP fluorescence polarisation assay. We carried out two chemical library screens and identified specific inhibitors of CRK3:CYC6 that were inactive against the human cyclin-dependent kinase CDK2:CycA. Subsequently, the best inhibitors were tested against 11 other mammalian protein kinases. Twelve of the most potent hits had an azapurine core with structure activity relationship (SAR analysis identifying the functional groups on the 2 and 9 positions as essential for CRK3:CYC6 inhibition and specificity against CDK2:CycA. Iterative chemistry allowed synthesis of a number of azapurine derivatives with one, compound 17, demonstrating anti-parasitic activity against both promastigote and amastigote forms of L. major. Following the second HTS, 11 compounds with a thiazole core (active towards CRK3:CYC6 and inactive against CDK2:CycA were tested. Ten of these hits demonstrated anti-parasitic activity against promastigote L. major. CONCLUSIONS/SIGNIFICANCE: The pharmacophores identified from the high throughput screens, and the derivatives synthesised, selectively

  6. Optimization of random PEGylation reactions by means of high throughput screening.

    Science.gov (United States)

    Maiser, Benjamin; Dismer, Florian; Hubbuch, Jürgen

    2014-01-01

    Since the first FDA approval of a PEGylated product in 1990, so-called random PEGylation reactions are still used to increase the efficacy of biopharmaceuticals and represent the major technology of all approved PEG-modified drugs. However, the great influence of process parameters on PEGylation degree and the PEG-binding site results in a lack of reaction specificity, which can have a severe impact on the product profile. Consequently, reproducible and well characterized processes are essential to meet increasing regulatory requirements resulting from the quality-by-design (QbD) initiative, especially for this kind of modification. In this study we present a general approach which combines the simple chemistry of random PEGylation reactions with high throughput experimentation (HTE) to achieve a well-defined process. Robot-based batch experiments have been established in a 96-well plate format and were analyzed to investigate the influence of different PEGylation conditions for lysozyme as a model protein. With common SEC analytics, highly reproducible reaction kinetics were measured and a significant influence of PEG-excess, buffer pH, and reaction time could be investigated. Additional mono-PEG-lysozyme analytics showed the impact of varying buffer pH on the isoform distribution, which allowed us to identify optimal process parameters to obtain a maximum concentration of each isoform. Employing Micrococcus lysodeikticus based activity assays, PEG-lysozyme33 was identified to be the isoform with the highest residual activity, followed by PEG-lysozyme1. Based on these results, a control space for a PEGylation reaction was defined with respect to an optimal overall volumetric activity of mono-PEG-lysozyme isoform mixtures.

  7. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    Science.gov (United States)

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis and method simplicity, making it well suited to applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis, such as salting-out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4+ QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and sample analysis throughput suitable for broad applications in life sciences, agricultural chemistry, consumer safety, and beyond.

  8. A high throughput screen for biomining cellulase activity from metagenomic libraries.

    Science.gov (United States)

    Mewis, Keith; Taupp, Marcus; Hallam, Steven J

    2011-02-01

    Cellulose, the most abundant source of organic carbon on the planet, has wide-ranging industrial applications with increasing emphasis on biofuel production (1). Chemical methods to modify or degrade cellulose typically require strong acids and high temperatures. As such, enzymatic methods have become prominent in the bioconversion process. While the identification of active cellulases from bacterial and fungal isolates has been somewhat effective, the vast majority of microbes in nature resist laboratory cultivation. Environmental genomic, also known as metagenomic, screening approaches have great promise in bridging the cultivation gap in the search for novel bioconversion enzymes. Metagenomic screening approaches have successfully recovered novel cellulases from environments as varied as soils (2), buffalo rumen (3) and the termite hind-gut (4) using carboxymethylcellulose (CMC) agar plates stained with congo red dye (based on the method of Teather and Wood (5)). However, the CMC method is limited in throughput, is not quantitative and manifests a low signal-to-noise ratio (6). Other methods have been reported (7,8) but each uses an agar plate-based assay, which is undesirable for high-throughput screening of large insert genomic libraries. Here we present a solution-based screen for cellulase activity using a chromogenic dinitrophenol (DNP)-cellobioside substrate (9). Our library was cloned into the pCC1 copy control fosmid to increase assay sensitivity through copy number induction (10). The method uses one-pot chemistry in 384-well microplates with the final readout provided as an absorbance measurement. This readout is quantitative, sensitive and automated, with a throughput of up to 100 × 384-well plates per day using a liquid handler and plate reader with attached stacking system.
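
    A quantitative absorbance readout lends itself to simple automated hit-calling. The sketch below uses one common rule, thresholding at the negative-control mean plus three standard deviations; the well IDs, values, and the rule itself are illustrative assumptions, not taken from the protocol:

```python
from statistics import mean, stdev

# Absorbance of negative-control wells (e.g., empty-vector fosmid clones)
controls = [0.052, 0.048, 0.050, 0.055, 0.047, 0.051]

# Absorbance of library wells after incubation with a chromogenic substrate
wells = {"A1": 0.049, "A2": 0.312, "A3": 0.055, "A4": 0.198}

# Call a hit when a well clearly exceeds the control distribution
threshold = mean(controls) + 3 * stdev(controls)
hits = sorted(w for w, a in wells.items() if a > threshold)
print(hits)  # → ['A2', 'A4']
```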

  9. A sandwiched microarray platform for benchtop cell-based high throughput screening

    Science.gov (United States)

    Wu, Jinhui; Wheeldon, Ian; Guo, Yuqi; Lu, Tingli; Du, Yanan; Wang, Ben; He, Jiankang; Hu, Yiqiao; Khademhosseini, Ali

    2010-01-01

    The emergence of combinatorial chemistries and the increased discovery of natural compounds have led to the production of expansive libraries of drug candidates and vast numbers of compounds with potentially interesting biological activities. Despite broad interest in high throughput screening (HTS) across varied fields of biological research, there has not been an increase in accessible HTS technologies. Here, we present a simple microarray sandwich system suitable for screening chemical libraries in cell-based assays at the benchtop. The microarray platform delivers chemical compounds to isolated cell cultures by ‘sandwiching’ chemical-laden arrayed posts with cell-seeded microwells. In this way, an array of sealed cell-based assays was generated without cross-contamination between neighboring assays. After chemical exposure, cell viability was analyzed by fluorescence detection of cell viability indicator assays on a per microwell basis in a standard microarray scanner. We demonstrate the efficacy of the system by generating four hits from toxicology screens towards MCF-7 human breast cancer cells. Three of the hits were identified in a combinatorial screen of a library of natural compounds in combination with verapamil, a P-glycoprotein inhibitor. A fourth hit, 9-methoxy-camptothecin, was identified by screening the natural compound library in the absence of verapamil. The method developed here miniaturizes existing HTS systems and enables the screening of a wide array of individual or combinatorial libraries in a reproducible and scalable manner. We anticipate broad application of such a system as it is amenable to combinatorial drug screening in a simple, robust and portable platform. PMID:20965560

  10. Post-CMOS compatible high-throughput fabrication of AlN-based piezoelectric microcantilevers

    Science.gov (United States)

    Pérez-Campos, A.; Iriarte, G. F.; Hernando-Garcia, J.; Calle, F.

    2015-02-01

    A post-complementary metal oxide semiconductor (CMOS) compatible microfabrication process for piezoelectric cantilevers has been developed. The fabrication process is suitable for standard silicon technology and provides low-cost and high-throughput manufacturing. This work reports the design, fabrication and characterization of piezoelectric cantilevers based on aluminum nitride (AlN) thin films synthesized at room temperature. The proposed microcantilever system is a sandwich structure composed of chromium (Cr) electrodes and a sputtered AlN film. The key issue for cantilever fabrication is the room-temperature growth of the AlN layer by reactive sputtering, which makes piezoelectric MEMS devices compatible with already-processed CMOS circuits. AlN and Cr have been etched by inductively coupled plasma (ICP) dry etching using a BCl3-Cl2-Ar plasma chemistry. As part of the novelty of the post-CMOS micromachining process presented here, a silicon Si (1 0 0) wafer has been used both as the substrate and as the sacrificial layer used to release the microcantilevers. In order to achieve this, the Si surface underneath the structure has been wet etched using an HNA (hydrofluoric acid + nitric acid + acetic acid) based solution. X-ray diffraction (XRD) characterization indicated the high crystalline quality of the AlN film. An atomic force microscope (AFM) has been used to determine the Cr electrode surface roughness. The morphology of the fabricated devices has been studied by scanning electron microscope (SEM). The cantilevers have been piezoelectrically actuated and their out-of-plane vibration modes were detected by vibrometry.

  11. Coupled effects of solution chemistry and hydrodynamics on the mobility and transport of quantum dot nanomaterials in the Vadose Zone

    Science.gov (United States)

    To investigate the coupled effects of solution chemistry and vadose zone processes on the mobility of quantum dot (QD) nanoparticles, laboratory scale transport experiments were performed. The complex coupled effects of ionic strength, size of QD aggregates, surface tension, contact angle, infiltrat...

  12. Methodology of Parameterization of Molecular Mechanics Force Field From Quantum Chemistry Calculations using Genetic Algorithm: A case study of methanol

    CERN Document Server

    Li, Ying; Chan, Maria K Y; Sankaranarayanan, Subramanian; Roux, Benoît

    2016-01-01

    In molecular dynamics (MD) simulation, the force field determines the capability of an individual model in capturing physical and chemical properties. The method for generating proper parameters of the force field form is a key component of computational research in chemistry, biochemistry, and condensed-phase physics. Our study shows the feasibility of predicting experimental condensed-phase properties (i.e., density and heat of vaporization) of methanol through a problem-specific force field derived from quantum chemistry information alone. To acquire satisfying parameter sets for the force field, the genetic algorithm (GA) is the main optimization method. For the electrostatic potential energy, we optimized the electrostatic parameters of methanol using the GA method, which leads to low deviations between the quantum mechanics (QM) calculations and the GA-optimized parameters. We optimized the van der Waals (vdW) parameters both using GA and guided-GA methods by calibrating the interaction energy of various met...
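
    The optimization loop described, a genetic algorithm driving force-field parameters toward quantum-chemistry reference data, can be illustrated with a deliberately tiny example. Here the "model" is a one-parameter harmonic bond and the "QM" energies are synthetic; the actual study optimizes electrostatic and van der Waals parameters:

```python
import random

random.seed(1)
R0, K_TRUE = 1.0, 500.0
ref_r = [0.9, 1.0, 1.1, 1.2]                           # toy bond lengths
ref_e = [0.5 * K_TRUE * (r - R0) ** 2 for r in ref_r]  # synthetic "QM" energies

def fitness(k):
    """Negative squared deviation of model energies from the reference data."""
    return -sum((0.5 * k * (r - R0) ** 2 - e) ** 2 for r, e in zip(ref_r, ref_e))

# Elitist GA: keep the best half, refill with mutated copies of survivors
pop = [random.uniform(100.0, 900.0) for _ in range(20)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]
    pop = survivors + [random.choice(survivors) + random.gauss(0.0, 10.0)
                       for _ in range(10)]
best = max(pop, key=fitness)
print(f"recovered force constant k = {best:.1f} (target {K_TRUE})")
```

    Real parameterizations add crossover between parameter vectors and score against many QM configurations at once; the selection-mutation loop is the part that carries over.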

  13. Advancing a distributed multi-scale computing framework for large-scale high-throughput discovery in materials science.

    Science.gov (United States)

    Knap, J; Spear, C E; Borodin, O; Leiter, K W

    2015-10-30

    We describe the development of a large-scale high-throughput application for discovery in materials science. Our point of departure is a computational framework for distributed multi-scale computation. We augment the original framework with a specialized module whose role is to route evaluation requests needed by the high-throughput application to a collection of available computational resources. We evaluate the feasibility and performance of the resulting high-throughput computational framework by carrying out a high-throughput study of battery solvents. Our results indicate that distributed multi-scale computing, by virtue of its adaptive nature, is particularly well-suited for building high-throughput applications.
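
    A sketch of the routing idea: evaluation requests from the high-throughput driver are dispatched to whichever registered resource currently has the least pending work. All names here are hypothetical, and the actual framework's scheduling policy is not specified in this summary:

```python
import heapq

class Dispatcher:
    """Least-loaded routing of evaluation requests to compute resources."""

    def __init__(self, resources):
        self._heap = [(0, name) for name in resources]  # (pending_jobs, name)
        heapq.heapify(self._heap)
        self.assignments = {name: [] for name in resources}

    def submit(self, request):
        load, name = heapq.heappop(self._heap)          # least-loaded first
        self.assignments[name].append(request)
        heapq.heappush(self._heap, (load + 1, name))
        return name

d = Dispatcher(["cluster-a", "cluster-b"])
for i in range(4):
    d.submit(f"solvent-candidate-{i}")
print({name: len(jobs) for name, jobs in d.assignments.items()})
```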

  14. Two-electron reduction of ethylene carbonate: a quantum chemistry re-examination of mechanisms

    CERN Document Server

    Leung, Kevin

    2013-01-01

    Passivating solid-electrolyte interphase (SEI) films arising from electrolyte decomposition on low-voltage lithium ion battery anode surfaces are critical for battery operations. We review the recent theoretical literature on electrolyte decomposition and emphasize the modeling work on two-electron reduction of ethylene carbonate (EC, a key battery organic solvent). One of the two-electron pathways, which releases CO gas, is re-examined using simple quantum chemistry calculations. Excess electrons are shown to preferentially attack EC in the order (broken EC^-) > (intact EC^-) > EC. This confirms the viability of two electron processes and emphasizes that they need to be considered when interpreting SEI experiments. An estimate of the crossover between one- and two-electron regimes under a homogeneous reaction zone approximation is proposed.

  15. Neural Network Based on Quantum Chemistry for Predicting Melting Point of Organic Compounds

    Institute of Scientific and Technical Information of China (English)

    Juan A. Lazzús

    2009-01-01

    The melting points of organic compounds were estimated using a combined method that includes a backpropagation neural network and quantitative structure-property relationship (QSPR) parameters from quantum chemistry. Eleven descriptors that reflect the intermolecular forces and molecular symmetry were used as input variables. QSPR parameters were calculated using molecular modeling and PM3 semi-empirical molecular orbital theory. A total of 260 compounds were used to train the network, which was developed using MATLAB. Then, the melting points of 73 other compounds were predicted and the results were compared to experimental data from the literature. The study shows that the chosen artificial neural network combined with the quantitative structure-property relationship method presents an excellent alternative for the estimation of the melting point of an organic compound, with an average absolute deviation of 5%.
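A backpropagation network of the sort the abstract describes can be sketched with a one-hidden-layer toy. The two "descriptors" and target values below are synthetic stand-ins, not the eleven QSPR descriptors or the 260-compound training set:

```python
import math
import random

# Minimal one-hidden-layer backpropagation network: map two synthetic
# "molecular descriptors" to a property value. Data and sizes are toys.

rng = random.Random(0)
DATA = [((x1, x2), 0.5 * x1 - 0.3 * x2 + 0.2)
        for x1 in (0.0, 0.5, 1.0) for x2 in (0.0, 0.5, 1.0)]

H = 4  # hidden units
w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [rng.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

def train(epochs=500, lr=0.1):
    global b2
    for _ in range(epochs):
        for x, t in DATA:
            h, y = forward(x)
            d = y - t  # gradient of squared error at the linear output
            for j in range(H):
                dh = d * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * d * h[j]
                b1[j] -= lr * dh
                for i in range(2):
                    w1[j][i] -= lr * dh * x[i]
            b2 -= lr * d

train()
mse = sum((forward(x)[1] - t) ** 2 for x, t in DATA) / len(DATA)
```

The same loop structure scales directly to more descriptors and hidden units; in practice one would also hold out a test set, as the paper does with its 73 prediction compounds.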

  16. Sulfobetaine-Vinylimidazole Block Copolymers: A Robust Quantum Dot Surface Chemistry Expanding Bioimaging's Horizons.

    Science.gov (United States)

    Tasso, Mariana; Giovanelli, Emerson; Zala, Diana; Bouccara, Sophie; Fragola, Alexandra; Hanafi, Mohamed; Lenkei, Zsolt; Pons, Thomas; Lequeux, Nicolas

    2015-11-24

    Long-term inspection of biological phenomena requires probes of elevated intra- and extracellular stability and target biospecificity. The high fluorescence and photostability of quantum dot (QD) nanoparticles contributed to foster their promise as bioimaging tools that could overcome limitations associated with traditional fluorophores. However, QDs' potential as a bioimaging platform relies upon a precise control over the surface chemistry modifications of these nano-objects. Here, a zwitterion-vinylimidazole block copolymer ligand was synthesized, which regroups all anchoring groups in one compact terminal block, while the rest of the chain is endowed with antifouling and bioconjugation moieties. By further application of an oriented bioconjugation approach with whole IgG antibodies, QD nanobioconjugates were obtained that display outstanding intra- and extracellular stability as well as biorecognition capacity. Imaging the internalization and intracellular dynamics of a transmembrane cell receptor, the CB1 brain cannabinoid receptor, both in HEK293 cells and in neurons, illustrates the breadth of potential applications of these nanoprobes.

  17. Multireference quantum chemistry through a joint density matrix renormalization group and canonical transformation theory

    Science.gov (United States)

    Yanai, Takeshi; Kurashige, Yuki; Neuscamman, Eric; Chan, Garnet Kin-Lic

    2010-01-01

    We describe the joint application of the density matrix renormalization group and canonical transformation theory to multireference quantum chemistry. The density matrix renormalization group provides the ability to describe static correlation in large active spaces, while the canonical transformation theory provides a high-order description of the dynamic correlation effects. We demonstrate the joint theory in two benchmark systems designed to test the dynamic and static correlation capabilities of the methods, namely, (i) total correlation energies in long polyenes and (ii) the isomerization curve of the [Cu2O2]2+ core. The largest complete active spaces and atomic orbital basis sets treated by the joint DMRG-CT theory in these systems correspond to a (24e,24o) active space and 268 atomic orbitals in the polyenes and a (28e,32o) active space and 278 atomic orbitals in [Cu2O2]2+.

  18. Bridging the Gap Between Quantum Chemistry and Classical Simulations for CO2 Capture

    Energy Technology Data Exchange (ETDEWEB)

    Gagliardi, Laura [Univ. of Minnesota, Minneapolis, MN (United States)

    2015-09-17

    We have developed a systematic procedure to generate transferable force fields to simulate the behavior of CO2 and other gases in open-metal-site metal organic frameworks using high-level quantum chemical calculations. Monte Carlo simulations based on an ab initio force field for CO2 in the Mg2(dobpdc) material have been employed to describe the interactions of CO2 with open metals. Our study has shed some light on the interpretation of thermodynamic data of flue gas in Mg2(dobpdc). This force field accurately describes the chemistry of the open metal sites, and is transferable to other structures.

  19. Many-body quantum chemistry for the electron gas: convergent perturbative theories

    CERN Document Server

    Shepherd, James J

    2013-01-01

    We investigate the accuracy of a number of wavefunction-based methods at the heart of quantum chemistry for metallic systems. Using Hartree-Fock as a reference, perturbative (Møller-Plesset, MP) and coupled cluster (CC) theories are used to study the uniform electron gas model. Our findings suggest that non-perturbative coupled cluster theories are acceptable for modelling electronic interactions in metals whilst perturbative coupled cluster theories are not. Using screened interactions, we propose a simple modification to the widely-used coupled-cluster singles and doubles plus perturbative triples method (CCSD(T)) that lifts the divergent behaviour and is shown to give very accurate correlation energies for the homogeneous electron gas.

  20. The Quixote project: Collaborative and Open Quantum Chemistry data management in the Internet age

    Directory of Open Access Journals (Sweden)

    Adams Sam

    2011-10-01

    Full Text Available Abstract Computational Quantum Chemistry has developed into a powerful, efficient, reliable and increasingly routine tool for exploring the structure and properties of small to medium sized molecules. Many thousands of calculations are performed every day, some offering results which approach experimental accuracy. However, in contrast to other disciplines, such as crystallography, or bioinformatics, where standard formats and well-known, unified databases exist, this QC data is generally destined to remain locally held in files which are not designed to be machine-readable. Only a very small subset of these results will become accessible to the wider community through publication. In this paper we describe how the Quixote Project is developing the infrastructure required to convert output from a number of different molecular quantum chemistry packages to a common semantically rich, machine-readable format and to build repositories of QC results. Such an infrastructure offers benefits at many levels. The standardised representation of the results will facilitate software interoperability, for example making it easier for analysis tools to take data from different QC packages, and will also help with archival and deposition of results. The repository infrastructure, which is lightweight and built using Open software components, can be implemented at individual researcher, project, organisation or community level, offering the exciting possibility that in future many of these QC results can be made publicly available, to be searched and interpreted just as crystallography and bioinformatics results are today. Although we believe that quantum chemists will appreciate the contribution the Quixote infrastructure can make to the organisation and exchange of their results, we anticipate that greater rewards will come from enabling their results to be consumed by a wider community.
As the repositories grow they will become a valuable source of
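The normalization step at the heart of such an infrastructure can be sketched as follows. The patterns are modeled on typical Gaussian and NWChem SCF energy lines, but the flat record schema is a simplified stand-in for Quixote's actual semantically rich format, not its real data model:

```python
import json
import re

# Sketch of converting package-specific log text into one common
# machine-readable record. Only the final SCF energy is extracted;
# the schema below is a simplified illustration, not Quixote's format.

PATTERNS = {
    "gaussian": re.compile(r"SCF Done:\s+E\((\S+)\)\s*=\s*(-?\d+\.\d+)"),
    "nwchem":   re.compile(r"Total SCF energy\s*=\s*(-?\d+\.\d+)"),
}

def to_record(package, text):
    """Map a raw output snippet onto a common record dictionary."""
    m = PATTERNS[package].search(text)
    if package == "gaussian":
        method, energy = m.group(1), float(m.group(2))
    else:
        method, energy = "SCF", float(m.group(1))
    return {"package": package, "method": method, "energy_hartree": energy}

rec = to_record("gaussian", "SCF Done:  E(RHF) =  -76.0267400  A.U.")
print(json.dumps(rec))
```

Because every package's output is reduced to the same record shape, downstream analysis tools and repositories only need to understand one format, which is the interoperability benefit the abstract describes.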

  1. Quantitative determination of cerebrospinal fluid bilirubin on a high throughput chemistry analyzer

    OpenAIRE

    Said Ahmed, Degmo

    2009-01-01

    Background Subarachnoid hemorrhage is a condition with high rates of mortality and morbidity. The diagnosis requires an urgent cerebral computed tomography scan and also a lumbar puncture if the scan fails to demonstrate intracranial blood. In Sweden the cerebrospinal fluid (CSF) is analyzed by spectrophotometric scanning for the presence of hemoglobin and bilirubin. The aim of the study was to develop a quantitative diazo reagent based analysis of cerebrospinal fluid bilirubin as a replaceme...

  2. GPU Linear Algebra Libraries and GPGPU Programming for Accelerating MOPAC Semiempirical Quantum Chemistry Calculations.

    Science.gov (United States)

    Maia, Julio Daniel Carvalho; Urquiza Carvalho, Gabriel Aires; Mangueira, Carlos Peixoto; Santana, Sidney Ramos; Cabral, Lucidio Anjos Formiga; Rocha, Gerd B

    2012-09-11

    In this study, we present some modifications in the semiempirical quantum chemistry MOPAC2009 code that accelerate single-point energy calculations (1SCF) of medium-size (up to 2500 atoms) molecular systems using GPU coprocessors and multithreaded shared-memory CPUs. Our modifications consisted of using a combination of highly optimized linear algebra libraries for both CPU (LAPACK and BLAS from Intel MKL) and GPU (MAGMA and CUBLAS) to hasten time-consuming parts of MOPAC such as the pseudodiagonalization, full diagonalization, and density matrix assembling. We have shown that it is possible to obtain large speedups just by using CPU serial linear algebra libraries in the MOPAC code. As a special case, we show a speedup of up to 14 times for a methanol simulation box containing 2400 atoms and 4800 basis functions, with even greater gains in performance when using multithreaded CPUs (2.1 times in relation to the single-threaded CPU code using linear algebra libraries) and GPUs (3.8 times). This degree of acceleration opens new perspectives for modeling larger structures which appear in inorganic chemistry (such as zeolites and MOFs), biochemistry (such as polysaccharides, small proteins, and DNA fragments), and materials science (such as nanotubes and fullerenes). In addition, we believe that this parallel (GPU-GPU) MOPAC code will make it feasible to use semiempirical methods in lengthy molecular simulations using both hybrid QM/MM and QM/QM potentials.

  3. The High Throughput X-ray Spectroscopy (HTXS) Mission

    Science.gov (United States)

    White, N. E.; Tananbaum, H.; Kahn, S. M.

    1997-01-01

    The HTXS mission concept combines large effective area (approximately 15,000 sq cm at 1 keV), high spectral resolution (E/Delta(E) approximately 300-3000), and broad energy bandpass (0.25-40 keV and possibly up to 100 keV) by using replicated optics together with a complement of spectroscopic instrumentation including reflection gratings read out by charge-coupled device detectors (CCDs), quantum micro-calorimeters, and cadmium zinc telluride (CZT) or comparable high-energy detectors. An essential feature of this concept involves minimization of cost (approximately $350M for development and approximately $500-600M including launches) and risk by building six identical modest satellites to achieve the large area. Current mission and technology studies are targeted towards a new start in the 2002 timeframe, with first launch around 2005-2006. The HTXS mission represents a major advance, providing as much as a factor of 100 increase in sensitivity over currently planned high-resolution X-ray spectroscopy missions. HTXS will mark the start of a new era when high quality X-ray spectra will be obtained for all classes of X-ray sources, over a wide range of luminosity and distance. With its increased capabilities, HTXS will address many fundamental astrophysics questions such as the origin and distribution of the elements from carbon to zinc, the formation and evolution of clusters of galaxies, the validity of general relativity in the strong gravity limit, the evolution of supermassive black holes in active galactic nuclei, the details of supernova explosions and their aftermath, and the mechanisms involved in the heating of stellar coronae and driving of stellar winds.

  4. Fast Quantum Molecular Dynamics Simulations of Shock-induced Chemistry in Organic Liquids

    Science.gov (United States)

    Cawkwell, Marc

    2014-03-01

    The responses of liquid formic acid and phenylacetylene to shock compression have been investigated via quantum-based molecular dynamics simulations with the self-consistent tight-binding code LATTE. Microcanonical Born-Oppenheimer trajectories with precise conservation of the total energy were computed without relying on an iterative self-consistent field optimization of the electronic degrees of freedom at each time step via the Fast Quantum Mechanical Molecular Dynamics formalism [A. M. N. Niklasson and M. J. Cawkwell, Phys. Rev. B, 86, 174308 (2012)]. The conservation of the total energy in our trajectories was pivotal for the capture of adiabatic shock heating as well as temperature changes arising from endo- or exothermic chemistry. Our self-consistent tight-binding parameterizations yielded very good predictions for the gas-phase geometries of formic acid and phenylacetylene molecules and the principal Hugoniots of the liquids. In accord with recent flyer-plate impact experiments, our simulations revealed i) that formic acid reacts at relatively low impact pressures but with no change in volume between products and reactants, and ii) a two-step polymerization process for phenylacetylene. Furthermore, the evolution of the HOMO-LUMO gap tracked on-the-fly during our simulations could be correlated with changes in transient absorption measured during laser-driven shock compression experiments on these liquids.

  5. International journal of quantum chemistry. Quantum Chemistry Symposium Number 27: Proceedings of the International Symposium on Atomic, Molecular, and Condensed Matter Theory and Computational Methods

    Science.gov (United States)

    Lowdin, Per-Olov; Ohrn, N. Y.; Sabin, John R.; Zerner, Michael C.

    1993-03-01

    The topics covered at the 33rd annual Sanibel Symposium, organized by the faculty and staff of the Quantum Theory Project of the University of Florida, and held March 13 - 20, 1993, include advanced scientific computing, interaction of photons and matter, quantum molecular dynamics, electronic structure methods, polymeric systems, and quantum chemical methods for extended systems.

  6. The structural chemistry of metallocorroles: combined X-ray crystallography and quantum chemistry studies afford unique insights.

    Science.gov (United States)

    Thomas, Kolle E; Alemayehu, Abraham B; Conradie, Jeanet; Beavers, Christine M; Ghosh, Abhik

    2012-08-21

    Although they share some superficial structural similarities with porphyrins, corroles, trianionic ligands with contracted cores, give rise to fundamentally different transition metal complexes in comparison with the dianionic porphyrins. Many metallocorroles are formally high-valent, although a good fraction of them are also noninnocent, with significant corrole radical character. These electronic-structural characteristics result in a variety of fascinating spectroscopic behavior, including highly characteristic, paramagnetically shifted NMR spectra and textbook cases of charge-transfer spectra. Although our early research on corroles focused on spectroscopy, we soon learned that the geometric structures of metallocorroles provide a fascinating window into their electronic-structural characteristics. Thus, we used X-ray structure determinations and quantum chemical studies, chiefly using DFT, to obtain a comprehensive understanding of metallocorrole geometric and electronic structures. This Account describes our studies of the structural chemistry of metallocorroles. At first blush, the planar or mildly domed structure of metallocorroles might appear somewhat uninteresting particularly when compared to metalloporphyrins. Metalloporphyrins, especially sterically hindered ones, are routinely ruffled or saddled, but the missing meso carbon apparently makes the corrole skeleton much more resistant to nonplanar distortions. Ruffling, where the pyrrole rings are alternately twisted about the M-N bonds, is energetically impossible for metallocorroles. Saddling is also uncommon; thus, a number of sterically hindered, fully substituted metallocorroles exhibit almost perfectly planar macrocycle cores. Against this backdrop, copper corroles stand out as an important exception. As a result of an energetically favorable Cu(d(x2-y2))-corrole(π) orbital interaction, copper corroles, even sterically unhindered ones, are inherently saddled. Sterically hindered substituents

  7. High throughput chromatography strategies for potential use in the formal process characterization of a monoclonal antibody.

    Science.gov (United States)

    Petroff, Matthew G; Bao, Haiying; Welsh, John P; van Beuningen-de Vaan, Miranda; Pollard, Jennifer M; Roush, David J; Kandula, Sunitha; Machielsen, Peter; Tugcu, Nihal; Linden, Thomas O

    2016-06-01

    High throughput experimental strategies are central to the rapid optimization of biologics purification processes. In this work, we extend common high throughput technologies towards the characterization of a multi-column chromatography process for a monoclonal antibody (mAb). Scale-down strategies were first evaluated by comparing breakthrough, retention, and performance (yields and clearance of aggregates and host cell protein) across miniature and lab scale columns. The process operating space was then evaluated using several integrated formats, with batch experimentation to define process testing ranges, miniature columns to evaluate the operating space, and comparison to traditional scale columns to establish scale-up correlations and verify the determined operating space. When compared to an independent characterization study at traditional lab column scale, the high throughput approach identified the same control parameters and similar process sensitivity. Importantly, the high throughput approach significantly decreased time and material needs while improving prediction robustness. Miniature columns and manufacturing scale centerpoint data comparisons support the validity of this approach, making the high throughput strategy an attractive and appropriate scale-down tool for the formal characterization of biotherapeutic processes in the future if regulatory acceptance of the miniature column data can be achieved. Biotechnol. Bioeng. 2016;113: 1273-1283. © 2015 Wiley Periodicals, Inc.

  8. High-throughput Phenotyping and Genomic Selection: The Frontiers of Crop Breeding Converge

    Institute of Scientific and Technical Information of China (English)

    Llorenc Cabrera-Bosquet; José Crossa; Jarislav von Zitzewitz; Maria Dolors Serret; José Luis Araus

    2012-01-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach, enabling breeders to use genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield.

  9. Human Development VIII: A Theory of “Deep” Quantum Chemistry and Cell Consciousness: Quantum Chemistry Controls Genes and Biochemistry to Give Cells and Higher Organisms Consciousness and Complex Behavior

    Directory of Open Access Journals (Sweden)

    Søren Ventegodt

    2006-01-01

    Full Text Available Deep quantum chemistry is a theory of deeply structured quantum fields carrying the biological information of the cell, making it able to remember, intend, represent the inner and outer world for comparison, understand what it “sees”, and make choices on its structure, form, behavior and division. We suggest that deep quantum chemistry gives the cell consciousness and all the qualities and abilities related to consciousness. We use geometric symbolism, which is a pre-mathematical and philosophical approach to problems that cannot yet be handled mathematically. Using Occam’s razor we have started with the simplest model that works; we presume this to be a many-dimensional, spiral fractal. We suggest that all the electrons of the large biological molecules’ orbitals make one huge “cell-orbital”, which is structured according to the spiral fractal nature of quantum fields. Consciousness of single cells, multi-cellular structures such as organs, multi-cellular organisms and multi-individual colonies (like ants and human societies) can thus be explained by deep quantum chemistry. When biochemical activity is strictly controlled by the quantum-mechanical super-orbital of the cell, this orbital can deliver energetic quanta as biological information, distributed through many fractal levels of the cell to guide the form and behavior of an individual single- or multi-cellular organism. The top level of information is the consciousness of the cell or organism, which controls all the biochemical processes. By this speculative work, inspired by Penrose and Hameroff, we hope to inspire other researchers to formulate stricter and mathematically correct hypotheses on the complex and coherent nature of matter, life and consciousness.

  10. The Study on Application of Quantum Chemistry Software

    Institute of Scientific and Technical Information of China (English)

    郭瑞萍; 刘旭光

    2009-01-01

    The introduction of quantum chemistry software into the teaching of Structural Chemistry plays a positive role in helping students learn, understand and apply the course material, and stimulates students' motivation. It also helps teachers adjust teaching content and pace and optimize the teaching process. The application of quantum chemistry software in teaching, and the principles to observe when applying it, are illustrated with three concrete examples: the variational method as realized in the Gaussian package, Gaussian calculations of the bond lengths of O2+, O2, and O2- with a comparison of the resulting bond orders, and the σ-π back-donation bond.

  11. Recent Progress Using High-throughput Sequencing Technologies in Plant Molecular Breeding

    Institute of Scientific and Technical Information of China (English)

    Qiang Gao; Guidong Yue; Wenqi Li; Junyi Wang; Jiaohui Xu; Ye Yin

    2012-01-01

    High-throughput sequencing is a revolutionary technological innovation in DNA sequencing. This technology has an ultra-low cost per base of sequencing and an overwhelmingly high data output. High-throughput sequencing has brought novel research methods and solutions to the research fields of genomics and post-genomics. Furthermore, this technology is leading to a new molecular breeding revolution that has landmark significance for scientific research and enables us to launch multi-level, multifaceted, and multi-extent studies in the fields of crop genetics, genomics, and crop breeding. In this paper, we review progress in the application of high-throughput sequencing technologies to plant molecular breeding studies.

  12. High-throughput parallel SPM for metrology, defect, and mask inspection

    Science.gov (United States)

    Sadeghian, H.; Herfst, R. W.; van den Dool, T. C.; Crowcombe, W. E.; Winters, J.; Kramer, G. F. I. J.

    2014-10-01

    Scanning probe microscopy (SPM) is a promising candidate for accurate assessment of metrology and defects on wafers and masks; however, it has traditionally been too slow for high-throughput applications, although recent developments have significantly pushed the speed of SPM [1,2]. In this paper we present new results obtained with our previously presented high-throughput parallel SPM system [3,4] that showcase two key advances required for a successful deployment of SPM in high-throughput metrology, defect and mask inspection. The first is very fast (up to 40 lines/s) image acquisition, with a comparison of image quality as a function of speed. The second is a fast approach method: measurements of the scan-head approaching the sample from 0.2 and 1.0 mm distance in under 1.4 and 6 seconds, respectively.

  13. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    Science.gov (United States)

    Green, M. L.; Choi, C. L.; Hattrick-Simpers, J. R.; Joshi, A. M.; Takeuchi, I.; Barron, S. C.; Campo, E.; Chiang, T.; Empedocles, S.; Gregoire, J. M.; Kusne, A. G.; Martin, J.; Mehta, A.; Persson, K.; Trautt, Z.; Van Duren, J.; Zakutayev, A.

    2017-03-01

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. A major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  14. Current impact and future directions of high throughput sequencing in plant virus diagnostics.

    Science.gov (United States)

    Massart, Sebastien; Olmos, Antonio; Jijakli, Haissam; Candresse, Thierry

    2014-08-08

    The ability to provide a fast, inexpensive and reliable diagnostic for any given viral infection is a key parameter in efforts to fight and control these ubiquitous pathogens. The recent developments of high-throughput sequencing (also called Next Generation Sequencing - NGS) technologies and bioinformatics have drastically changed the research on viral pathogens and are now attracting growing interest for virus diagnostics. This review provides a snapshot of the current use and impact of high-throughput sequencing approaches in plant virus characterization. More specifically, this review highlights the potential of these new technologies and their interplay with current protocols for the future of molecular diagnostics of plant viruses. The current limitations that will need to be addressed for a wider adoption of high-throughput sequencing in plant virus diagnostics are thoroughly discussed.

  15. Microfluidics for cell-based high throughput screening platforms - A review.

    Science.gov (United States)

    Du, Guansheng; Fang, Qun; den Toonder, Jaap M J

    2016-01-15

    In the last decades, the basic techniques of microfluidics for the study of cells such as cell culture, cell separation, and cell lysis, have been well developed. Based on cell handling techniques, microfluidics has been widely applied in the field of PCR (Polymerase Chain Reaction), immunoassays, organ-on-chip, stem cell research, and analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics in cell-based high throughput screening. The screening methods mentioned in this paper include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidic based high throughput screening platform for drug discovery.

  16. High-throughput imaging: Focusing in on drug discovery in 3D.

    Science.gov (United States)

    Li, Linfeng; Zhou, Qiong; Voss, Ty C; Quick, Kevin L; LaBarbera, Daniel V

    2016-03-01

    3D organotypic culture models such as organoids and multicellular tumor spheroids (MCTS) are becoming more widely used for drug discovery and toxicology screening. As a result, 3D culture technologies adapted for high-throughput screening formats are prevalent. While a multitude of assays have been reported and validated for high-throughput imaging (HTI) and high-content screening (HCS) for novel drug discovery and toxicology, limited HTI/HCS with large compound libraries have been reported. Nonetheless, 3D HTI instrumentation technology is advancing and this technology is now on the verge of allowing for 3D HCS of thousands of samples. This review focuses on the state-of-the-art high-throughput imaging systems, including hardware and software, and recent literature examples of 3D organotypic culture models employing this technology for drug discovery and toxicology screening.

  17. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Full Text Available Abstract Background Protein-protein interaction data used in the creation or prediction of molecular networks is usually obtained from large scale or high-throughput experiments. This experimental data is liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the number of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68% respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that include sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
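The likelihood-ratio scoring described above can be sketched with a naive-Bayes-style toy. The three features mirror those in the abstract, but the conditional probabilities are invented for illustration; in practice they would be estimated from gold-standard positive and negative interaction sets:

```python
# Toy likelihood-ratio scoring of protein-protein interactions from
# binary genomic features, assuming feature independence (naive Bayes).
# All probabilities below are made up for illustration.

# (P(feature present | true interaction), P(feature present | false))
FEATURES = {
    "domain_pair": (0.30, 0.05),  # structurally known interacting domains
    "shared_go":   (0.60, 0.20),  # shared Gene Ontology annotation
    "homology":    (0.25, 0.04),  # homologous pair interacts elsewhere
}

def likelihood_ratio(observed):
    """Product of per-feature ratios for the observed feature set."""
    lr = 1.0
    for name, (p_true, p_false) in FEATURES.items():
        if name in observed:
            lr *= p_true / p_false
        else:
            lr *= (1 - p_true) / (1 - p_false)
    return lr

# An interaction supported by two features scores far above one
# supported by none, so a cutoff on the ratio filters noisy data.
strong = likelihood_ratio({"domain_pair", "shared_go"})
weak = likelihood_ratio(set())
```

Interactions whose ratio exceeds a chosen cutoff are kept as likely true; the cutoff trades sensitivity against specificity, which is how figures like the 90%/63% quoted in the abstract arise.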

  18. A novel imaging-based high-throughput screening approach to anti-angiogenic drug discovery.

    Science.gov (United States)

    Evensen, Lasse; Micklem, David R; Link, Wolfgang; Lorens, James B

    2010-01-01

    The successful progression to the clinic of angiogenesis inhibitors for cancer treatment has spurred interest in developing new classes of anti-angiogenic compounds. The resulting surge in available candidate therapeutics highlights the need for robust, high-throughput angiogenesis screening systems that adequately capture the complexity of new vessel formation while providing quantitative evaluation of the potency of these agents. Available in vitro angiogenesis assays either are cumbersome, impeding adaptation to high-throughput screening formats, or inadequately model the complex multistep process of new vessel formation. We therefore developed an organotypic endothelial-mural cell co-culture assay system that reflects several facets of angiogenesis while remaining compatible with high-throughput/high-content image screening. Co-culture of primary human endothelial cells (EC) and vascular smooth muscle cells (vSMC) results in assembly of a network of tubular endothelial structures enveloped with vascular basement membrane proteins, thus comprising the three main components of blood vessels. Initially, EC are dependent on vSMC-derived VEGF and sensitive to clinical anti-angiogenic therapeutics. A subsequent phenotypic VEGF-switch renders EC networks resistant to anti-VEGF therapeutics, demarcating a mature vascular phenotype. Conversely, mature EC networks remain sensitive to vascular disrupting agents. Therefore, candidate anti-angiogenic compounds can be interrogated for their relative potency on immature and mature networks and classified as either vascular normalizing or vascular disrupting agents. Here, we demonstrate that the EC-vSMC co-culture assay represents a robust high-content imaging high-throughput screening system for identification of novel anti-angiogenic agents. A pilot high-throughput screening campaign was used to define informative imaging parameters and develop a follow-up dose-response scheme for hit characterization. High-throughput

  19. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide...... the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. Availability and implementation: We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template...
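Two of the key sequence properties named above, aromaticity and hydropathy, are simple to compute directly from a protein fragment's sequence (the isoelectric point is omitted here for brevity). The solubility rule at the end is a hypothetical toy filter with made-up thresholds, not the authors' trained model.

```python
# Minimal sketch of sequence features of the kind used above to guide
# fragment selection; thresholds are illustrative assumptions.

KYTE_DOOLITTLE = {  # Kyte-Doolittle hydropathy scale
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
    "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
    "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
    "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2,
}

def aromaticity(seq):
    """Fraction of aromatic residues (Phe, Trp, Tyr)."""
    return sum(seq.count(a) for a in "FWY") / len(seq)

def gravy(seq):
    """Grand average of hydropathy: mean Kyte-Doolittle score."""
    return sum(KYTE_DOOLITTLE[a] for a in seq) / len(seq)

def looks_soluble(seq, max_gravy=0.0, max_aromaticity=0.1):
    """Toy rule of thumb: favour hydrophilic, low-aromaticity fragments
    (both thresholds are hypothetical)."""
    return gravy(seq) < max_gravy and aromaticity(seq) < max_aromaticity

print(round(aromaticity("ACDEFGH"), 3))  # 0.143
```

In the workflow the record describes, features like these would feed a trained classifier rather than fixed cutoffs.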

  20. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    DEFF Research Database (Denmark)

    Csiszar, Susan A.; Ernstoff, Alexi; Fantke, Peter;

    2016-01-01

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or population per mass...... intakes were associated with body lotion. Bioactive doses derived from high-throughput in vitro toxicity data were combined with the estimated PiFs to demonstrate an approach to estimate bioactive equivalent chemical content and to screen chemicals for risk....
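The PiF-based screening arithmetic sketched in this record reduces to two steps: intake is the PiF times the mass of chemical applied, and a chemical is flagged when the estimated intake reaches a bioactive dose. All numeric values below are invented for illustration.

```python
# Sketch of product-intake-fraction (PiF) exposure screening;
# every number here is a made-up example, not data from the study.

def daily_intake_mg(pif, chem_mass_in_product_mg, uses_per_day):
    """Mass of chemical taken in per day: PiF times mass applied per use."""
    return pif * chem_mass_in_product_mg * uses_per_day

def flag_for_risk(intake_mg_day, bioactive_dose_mg_day):
    """Flag chemicals whose estimated intake reaches the bioactive dose."""
    return intake_mg_day >= bioactive_dose_mg_day

# Hypothetical lotion ingredient: PiF of 5%, 2 mg per application, twice daily.
intake = daily_intake_mg(pif=0.05, chem_mass_in_product_mg=2.0, uses_per_day=2)
print(intake, flag_for_risk(intake, bioactive_dose_mg_day=0.1))  # 0.2 True
```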

  1. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    Science.gov (United States)

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can extract cohorts without delay. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  2. High-throughput screening of small-molecule adsorption in MOF

    OpenAIRE

    Canepa, Pieremanuele; Arter, Calvin A.; Conwill, Eliot M.; Johnson, Daniel H.; Shoemaker, Brian A.; Soliman, Karim Z.; Thonhauser, T.

    2013-01-01

    Using high-throughput screening coupled with state-of-the-art van der Waals density functional theory, we investigate the adsorption properties of four important molecules, H2, CO2, CH4, and H2O in MOF-74-M with M = Be, Mg, Al, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr, Zr, Nb, Ru, Rh, Pd, La, W, Os, Ir, and Pt. We show that high-throughput techniques can aid in speeding up the development and refinement of effective materials for hydrogen storage, carbon capture, and gas separation. ...
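A screen of this shape reduces to a loop over the 25 metals and 4 molecules (100 pairs) with a ranking step at the end. The `adsorption_energy` stub below is a hypothetical stand-in for a real vdW-DF calculation and returns fake numbers; a real screen would dispatch each pair to a DFT job queue.

```python
# Sketch of the metal/molecule screening loop implied by the record above.
# adsorption_energy is a placeholder stub, not a real calculation.

METALS = ["Be", "Mg", "Al", "Ca", "Sc", "Ti", "V", "Cr", "Mn", "Fe",
          "Co", "Ni", "Cu", "Zn", "Sr", "Zr", "Nb", "Ru", "Rh", "Pd",
          "La", "W", "Os", "Ir", "Pt"]
MOLECULES = ["H2", "CO2", "CH4", "H2O"]

def adsorption_energy(metal, molecule):
    """Placeholder for a vdW-DF adsorption-energy calculation (fake eV)."""
    return -0.01 * (len(metal) + len(molecule))  # hypothetical stub

def rank_candidates(molecule, top_n=3):
    """Rank MOF-74-M variants by most negative (strongest) adsorption."""
    results = {m: adsorption_energy(m, molecule) for m in METALS}
    return sorted(results, key=results.get)[:top_n]

print(len(METALS) * len(MOLECULES))  # 100
```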

  3. High-throughput analysis of total nitrogen content that replaces the classic Kjeldahl method.

    Science.gov (United States)

    Yasuhara, T; Nokihara, K

    2001-10-01

    A high-throughput method for determination of total nitrogen content has been developed. The method involves decomposition of samples, followed by trapping and quantitative colorimetric determination of the resulting ammonia. The present method is rapid, facile, economical, and environmentally friendly, and its higher efficiency in handling multiple samples allows it to replace the classic Kjeldahl method. Based on the present method, a novel reactor was constructed to realize routine high-throughput analyses of multiple samples such as those found for pharmaceutical materials, foods, and/or excrements.

  4. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    BACKGROUND: Unbiased flow cytometry-based methods have become the technique of choice in many laboratories for high-throughput, accurate assessments of malaria parasites in bioassays. A method to quantify live parasites based on mitotracker red CMXRos was recently described but consistent...... distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45...

  5. Implementation of an Automated High-Throughput Plasmid DNA Production Pipeline.

    Science.gov (United States)

    Billeci, Karen; Suh, Christopher; Di Ioia, Tina; Singh, Lovejit; Abraham, Ryan; Baldwin, Anne; Monteclaro, Stephen

    2016-12-01

    Biologics sample management facilities are often responsible for a diversity of large-molecule reagent types, such as DNA, RNAi, and protein libraries. Historically, the management of large molecules was dispersed into multiple laboratories. As methodologies to support pathway discovery, antibody discovery, and protein production have become high throughput, the implementation of automation and centralized inventory management tools has become important. To this end, to improve sample tracking, throughput, and accuracy, we have implemented a module-based automation system integrated into inventory management software using multiple platforms (Hamilton, Hudson, Dynamic Devices, and Brooks). Here we describe the implementation of these systems with a focus on high-throughput plasmid DNA production management.

  6. Accelerating Virtual High-Throughput Ligand Docking: current technology and case study on a petascale supercomputer.

    Science.gov (United States)

    Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome

    2014-04-25

    In this paper we give the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one-million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
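The task-parallel pattern behind an MPI docking run like this can be illustrated with a static round-robin partition of the ligand library across ranks. The sketch below is pure Python to keep it self-contained; an actual run would obtain the rank and size from MPI (e.g. via mpi4py) and launch AutoDock on each assigned ligand.

```python
# Round-robin partitioning of a one-million-compound library across
# worker ranks, as a pure-Python illustration of task-parallel docking.

def ligands_for_rank(rank, n_ranks, ligands):
    """Static round-robin assignment: rank r docks ligands r, r+n, r+2n, ..."""
    return ligands[rank::n_ranks]

ligands = [f"ligand_{i:07d}" for i in range(1_000_000)]
n_ranks = 4
parts = [ligands_for_rank(r, n_ranks, ligands) for r in range(n_ranks)]

# Every ligand is docked exactly once across all ranks.
print(sum(len(p) for p in parts))  # 1000000
```

With real MPI, each rank would run this assignment locally and write its docking scores for a post-docking merge step.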

  7. High-Throughput Non-destructive Phenotyping of Traits that Contribute to Salinity Tolerance in Arabidopsis thaliana

    KAUST Repository

    Awlia, Mariam

    2016-09-28

    Reproducible and efficient high-throughput phenotyping approaches, combined with advances in genome sequencing, are facilitating the discovery of genes affecting plant performance. Salinity tolerance is a desirable trait that can be achieved through breeding, where most efforts have aimed at selecting for plants that perform effective ion exclusion from the shoots. To determine overall plant performance under salt stress, it is helpful to investigate several plant traits collectively in one experimental setup. Hence, we developed a quantitative phenotyping protocol using a high-throughput phenotyping system, with RGB and chlorophyll fluorescence (ChlF) imaging, which captures the growth, morphology, color and photosynthetic performance of Arabidopsis thaliana plants in response to salt stress. We optimized our salt treatment by controlling the soil-water content prior to introducing salt stress. We investigated these traits over time in two accessions in soil at 150, 100, or 50 mM NaCl, finding that the plants subjected to 100 mM NaCl showed the most prominent responses in the absence of symptoms of severe stress. In these plants, salt stress induced significant changes in rosette area and morphology, but less prominent changes in rosette coloring and photosystem II efficiency. Clustering of ChlF traits with plant growth of nine accessions maintained at 100 mM NaCl revealed that in the early stage of salt stress, salinity tolerance correlated with non-photochemical quenching processes, and during the later stage, plant performance correlated with quantum yield. This integrative approach allows the simultaneous analysis of several phenotypic traits. In combination with various genetic resources, the phenotyping protocol described here is expected to increase our understanding of plant performance and stress responses, ultimately identifying genes that improve plant performance in salt stress conditions.

  8. High-Throughput Screening by Nuclear Magnetic Resonance (HTS by NMR) for the Identification of PPIs Antagonists.

    Science.gov (United States)

    Wu, Bainan; Barile, Elisa; De, Surya K; Wei, Jun; Purves, Angela; Pellecchia, Maurizio

    2015-01-01

    In recent years the increasingly complex field of drug discovery has embraced novel design strategies based on biophysical fragment screening (fragment-based drug design; FBDD) using nuclear magnetic resonance spectroscopy (NMR) and/or structure-guided approaches, most often using X-ray crystallography and computer modeling. Experience from recent years has revealed that these methods are more effective and less prone to artifacts than biochemical high-throughput screening (HTS) of large collections of compounds in designing protein inhibitors. Hence these strategies are increasingly becoming the most utilized in the modern pharmaceutical industry. Nonetheless, there is still a pressing need to develop innovative and effective strategies to tackle other more challenging targets such as those involving protein-protein interactions (PPIs). While HTS strategies notoriously fail to identify viable hits against such targets, few successful examples of PPI antagonists derived from FBDD strategies exist. Recently, we reported on a new strategy that combines some of the basic principles of fragment-based screening with combinatorial chemistry and NMR-based screening. The approach, termed HTS by NMR, combines the advantages of combinatorial chemistry and NMR-based screening to rapidly and unambiguously identify bona fide inhibitors of PPIs. This review will reiterate the critical aspects of the approach with examples of possible applications.

  9. Quantum Chemistry Meets Spectroscopy for Astrochemistry: Increasing Complexity toward Prebiotic Molecules.

    Science.gov (United States)

    Barone, Vincenzo; Biczysko, Malgorzata; Puzzarini, Cristina

    2015-05-19

    For many years, scientists suspected that the interstellar medium was too hostile for organic species and that only a few simple molecules could be formed under such extreme conditions. However, the detection of approximately 180 molecules in interstellar or circumstellar environments in recent decades has changed this view dramatically. A rich chemistry has emerged, and relatively complex molecules such as C60 and C70 are formed. Recently, researchers have also detected complex organic and potentially prebiotic molecules, such as amino acids, in meteorites and in other space environments. Those discoveries have further stimulated the debate on the origin of the building blocks of life in the universe. Many efforts continue to focus on the physical, chemical, and astrophysical processes by which prebiotic molecules can be formed in the interstellar dust and dispersed to Earth or to other planets. Spectroscopic techniques, which are widely used to infer information about molecular structure and dynamics, play a crucial role in the investigation of planetary atmospheres and the interstellar medium. Increasingly, these astrochemical investigations are assisted by quantum-mechanical calculations of structures as well as spectroscopic and thermodynamic properties, such as transition frequencies and reaction enthalpies, to guide and support observations, line assignments, and data analysis in these new and chemically complicated situations. However, it has proved challenging to extend accurate quantum-chemical computational approaches to larger systems because of the unfavorable scaling with the number of degrees of freedom (both electronic and nuclear). In this Account, we show that it is now possible to compute physicochemical properties of building blocks of biomolecules with an accuracy rivaling that of the most sophisticated experimental techniques, and we summarize specific contributions from our groups. As a test case, we present the underlying computational machinery

  10. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Full Text Available Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  11. Proceedings of the meeting on tunneling reaction and low temperature chemistry, 97 October. Tunneling reaction and quantum medium

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, Tetsuo; Aratono, Yasuyuki; Ichikawa, Tsuneki; Shiotani, Masaru [eds.]

    1998-02-01

    This report presents the proceedings of the 3rd Meeting on Tunneling Reaction and Low Temperature Chemistry, held on Oct. 13 and 14, 1997. The main subject of the meeting was 'Tunneling Reaction and Quantum Medium'. In the meeting, physical and chemical phenomena in liquid helium, such as quantum nucleation, spectroscopy of atoms and molecules, and the tunneling abstraction reaction of the tritium atom, were discussed as the main topics, as well as tunneling reactions in solid hydrogen and organic compounds. Through the meetings held in 1995, 1996, and 1997, tunneling phenomena proceeding at various temperatures (room temperature to mK) in the wide fields of chemistry, biology, and physics were discussed intensively, and the importance of tunneling phenomena in science has become clear. The 12 presented papers are indexed individually. (J.P.N.)

  13. Quantum chemistry calculation-aided structural optimization of combretastatin A-4-like tubulin polymerization inhibitors: improved stability and biological activity.

    Science.gov (United States)

    Jiang, Junhang; Zheng, Canhui; Zhu, Kongkai; Liu, Jia; Sun, Nannan; Wang, Chongqing; Jiang, Hualiang; Zhu, Ju; Luo, Cheng; Zhou, Youjun

    2015-03-12

    A potent combretastatin A-4 (CA-4)-like tubulin polymerization inhibitor, 22b, with strong antitumor activity was reported previously. However, it easily undergoes cis-trans isomerization under natural light, and the resulting decrease in activity limits its further application. In this study, we used quantum chemistry calculations to explore the molecular basis of its instability. Aided by the calculations, two rounds of structural optimization of 22b were conducted. Accelerated quantitative light stability testing confirmed that the stability of the designed compounds was significantly improved, as predicted. Among them, compounds 1 and 3b displayed more potent inhibitory activity on tumor cell growth than 22b. In addition, the potent in vivo antitumor activity of compound 1 was confirmed. Quantum chemistry calculations were used in the optimization of stilbene-like molecules, providing new insight into stilbenoid optimization and important implications for the future development of novel CA-4-like tubulin polymerization inhibitors.

  14. Laboratory study of nitrate photolysis in Antarctic snow. I. Observed quantum yield, domain of photolysis, and secondary chemistry

    DEFF Research Database (Denmark)

    Meusinger, Carl; Berhanu, Tesfaye A.; Erbland, Joseph

    2014-01-01

    Post-depositional processes alter nitrate concentration and nitrate isotopic composition in the top layers of snow at sites with low snow accumulation rates, such as Dome C, Antarctica. Available nitrate ice core records can provide input for studying past atmospheres and climate if such processes...... undergoing secondary (recombination) chemistry. Modeled NOx emissions may increase significantly above measured values due to the observed quantum yield in this study. The apparent quantum yield in the 200 nm band was found to be ∼ 1%, much lower than reported for aqueous chemistry. A companion paper...... presents an analysis of the change in isotopic composition of snowpack nitrate based on the same samples as in this study....

  15. Semi-Empirical Quantum Chemistry Method for Pre-Polymerization Rational Design of Ciprofloxacin Imprinted Polymer and Adsorption Studies

    OpenAIRE

    Marestoni,Luiz D.; Wong,Ademar; Feliciano, Gustavo T.; Marchi,Mary R. R.; Tarley, César R. T.; Sotomayor,Maria D. P. T.

    2016-01-01

    It is well known that selectivity of molecularly imprinted polymers (MIPs) depends on adequate choice of functional monomer before the experimental synthesis. Computational simulation seems to be an ideal way to produce selective MIPs. In this work, we have proposed the use of semi-empirical simulation to obtain the best monomer able to strongly interact with ciprofloxacin. Twenty functional monomers were evaluated through semi-empirical quantum chemistry method and three MIPs were synthesize...

  16. Comparison of Oxygen Gauche Effects in Poly(Oxyethylene) and Poly(ethylene terephthalate) Based on Quantum Chemistry Calculations

    Science.gov (United States)

    Jaffe, Richard; Han, Jie; Yoon, Do; Langhoff, Stephen R. (Technical Monitor)

    1997-01-01

    The so-called oxygen gauche effect in poly(oxyethylene) (POE) and its model molecules such as 1,2-dimethoxyethane (DME) and diglyme (CH3OC2H4OC2H4OCH3) is manifested in the preference for gauche C-C bond conformations over trans. This has also been observed for poly(ethylene terephthalate) (PET). Our previous quantum chemistry calculations demonstrated that the large C-C gauche population in DME is due, in part, to a low-lying tg±g+ conformer that exhibits a substantial 1,5 CH···O attraction. New calculations will be described that demonstrate the accuracy of the original quantum chemistry calculations. In addition, an extension of this work to model molecules for PET will be presented. It is seen that the C-C gauche preference is much stronger in 1,2-diacetoxyethane than in DME. In addition, there exist low-lying tg±g± and g±g±g± conformers that exhibit 1,5 CH···O attractions involving the carbonyl oxygens. It is expected that the -O-C-C-O- torsional properties will be quite different in these two polymers. The quantum chemistry results are used to parameterize rotational isomeric states models (RIS) and force fields for molecular dynamics simulations of these polymers.

  17. High-throughput exploration of thermoelectric and mechanical properties of amorphous NbO2 with transition metal additions

    Energy Technology Data Exchange (ETDEWEB)

    Music, Denis, E-mail: music@mch.rwth-aachen.de; Geyer, Richard W.; Hans, Marcus [Materials Chemistry, RWTH Aachen University, Kopernikusstr. 10, 52074 Aachen (Germany)

    2016-07-28

    To increase the thermoelectric efficiency and reduce the thermal fatigue upon cyclic heat loading, alloying of amorphous NbO2 with all 3d and 5d transition metals has systematically been investigated using density functional theory. It was found that Ta fulfills the key design criteria, namely, enhancement of the Seebeck coefficient and positive Cauchy pressure (ductility gauge). These quantum mechanical predictions were validated by assessing the thermoelectric and elastic properties on combinatorial thin films, which is a high-throughput approach. The maximum power factor is 2813 μW m⁻¹ K⁻² for the Ta/Nb ratio of 0.25, which is a hundredfold increment compared to pure NbO2 and exceeds many oxide thermoelectrics. Based on the elasticity measurements, the consistency between theory and experiment for the Cauchy pressure was attained within 2%. On the basis of the electronic structure analysis, these configurations can be perceived as metallic, which is consistent with low electrical resistivity and ductile behavior. Furthermore, a pronounced quantum confinement effect occurs, which is identified as the physical origin for the Seebeck coefficient enhancement.
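The power factor quoted in this record is the Seebeck coefficient squared divided by the electrical resistivity (S²/ρ, equivalently S²σ). The sketch below shows the unit bookkeeping; the input values are hypothetical, chosen only to land near the quoted order of magnitude.

```python
# Thermoelectric power factor PF = S^2 / rho, with unit conversions.
# Input values below are illustrative, not measurements from the study.

def power_factor_uW_m_K2(seebeck_uV_per_K, resistivity_ohm_m):
    """Power factor S^2/rho, returned in μW m^-1 K^-2."""
    s = seebeck_uV_per_K * 1e-6           # μV/K -> V/K
    pf_W = s * s / resistivity_ohm_m      # W m^-1 K^-2
    return pf_W * 1e6                     # W -> μW

# e.g. a hypothetical S = 75 μV/K and rho = 2e-6 Ω·m give:
print(round(power_factor_uW_m_K2(75, 2e-6), 1))  # 2812.5
```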

  19. High-Throughput Design of Two-Dimensional Electron Gas Systems Based on Polar/Nonpolar Perovskite Oxide Heterostructures

    Science.gov (United States)

    Yang, Kesong; Nazir, Safdar; Behtash, Maziar; Cheng, Jianli

    2016-10-01

    The two-dimensional electron gas (2DEG) formed at the interface between two insulating oxides such as LaAlO3 and SrTiO3 (STO) is of fundamental and practical interest because of its novel interfacial conductivity and its promising applications in next-generation nanoelectronic devices. Here we show that a group of combinatorial descriptors that characterize the polar character, lattice mismatch, band gap, and the band alignment between the perovskite-oxide-based band insulators and the STO substrate, can be introduced to realize a high-throughput (HT) design of SrTiO3-based 2DEG systems from perovskite oxide quantum database. Equipped with these combinatorial descriptors, we have carried out a HT screening of all the polar perovskite compounds, uncovering 42 compounds of potential interests. Of these, Al-, Ga-, Sc-, and Ta-based compounds can form a 2DEG with STO, while In-based compounds exhibit a strain-induced strong polarization when deposited on STO substrate. In particular, the Ta-based compounds can form 2DEG with potentially high electron mobility at (TaO2)+/(SrO)0 interface. Our approach, by defining materials descriptors solely based on the bulk materials properties, and by relying on the perovskite-oriented quantum materials repository, opens new avenues for the discovery of perovskite-oxide-based functional interface materials in a HT fashion.
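The descriptor-based screen this record describes can be sketched as a simple compound filter over bulk properties: polar character, lattice mismatch to SrTiO3, band gap, and band alignment. The field names, thresholds, and candidate property values below are illustrative assumptions, not the study's actual data.

```python
# Sketch of a combinatorial-descriptor screen for STO-based 2DEG
# candidates; thresholds and candidate values are made up for illustration.

STO_LATTICE_A = 3.905  # SrTiO3 pseudocubic lattice constant, Å

def passes_screen(c, max_mismatch=0.05, min_gap=2.0):
    """Keep polar, lattice-matched band insulators whose conduction band
    lies above STO's (so electrons can transfer to the interface)."""
    mismatch = abs(c["a"] - STO_LATTICE_A) / STO_LATTICE_A
    return (c["polar"]
            and mismatch <= max_mismatch
            and c["band_gap"] >= min_gap
            and c["cbm_above_sto"])

candidates = [  # hypothetical property entries
    {"name": "LaAlO3", "polar": True, "a": 3.79, "band_gap": 5.6, "cbm_above_sto": True},
    {"name": "SrVO3", "polar": False, "a": 3.84, "band_gap": 0.0, "cbm_above_sto": False},
]
hits = [c["name"] for c in candidates if passes_screen(c)]
print(hits)  # ['LaAlO3']
```

Applied to a full perovskite repository, a filter like this yields the shortlist that is then checked with interface-level calculations.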

  20. Novel High-Throughput Drug Screening Platform for Chemotherapy-Induced Axonal Neuropathy

    Science.gov (United States)

    2013-05-01

    DATES COVERED: 1 May 201 - 30 Apr 2014. Introduction: Taxol is an antineoplastic agent, which is used for the treatment of

  1. Roche genome sequencer FLX based high-throughput sequencing of ancient DNA

    DEFF Research Database (Denmark)

    Alquezar-Planas, David E; Fordyce, Sarah Louise

    2012-01-01

    Since the development of so-called "next generation" high-throughput sequencing in 2005, this technology has been applied to a variety of fields. Such applications include disease studies, evolutionary investigations, and ancient DNA. Each application requires a specialized protocol to ensure tha...

  2. Comprehensive analysis of high-throughput screens with HiTSeekR

    DEFF Research Database (Denmark)

    List, Markus; Schmidt, Steffen; Christiansen, Helle;

    2016-01-01

    High-throughput screening (HTS) is an indispensable tool for drug (target) discovery that currently lacks user-friendly software tools for the robust identification of putative hits from HTS experiments and for the interpretation of these findings in the context of systems biology. We developed H...

  3. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Full Text Available Abstract Background Preparation of large quantity and high quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as, genetic mapping, TILLING (Targeting Induced Local Lesion IN Genome), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on MagAttract kit. The method yielded large quantity and high quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30 fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample for magnetic beads, plus other consumables that other methods also require.

  4. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  5. Evaluation of simple and inexpensive high-throughput methods for phytic acid determination

    Science.gov (United States)

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal. Biochem. ...

  6. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an

  7. A practical fluorogenic substrate for high-throughput screening of glutathione S-transferase inhibitors.

    Science.gov (United States)

    Fujikawa, Yuuta; Morisaki, Fumika; Ogura, Asami; Morohashi, Kana; Enya, Sora; Niwa, Ryusuke; Goto, Shinji; Kojima, Hirotatsu; Okabe, Takayoshi; Nagano, Tetsuo; Inoue, Hideshi

    2015-07-21

    We report a new fluorogenic substrate for glutathione S-transferase (GST), 3,4-DNADCF, enabling the assay with a low level of nonenzymatic background reaction. Inhibitors against Noppera-bo/GSTe14 from Drosophila melanogaster were identified by high throughput screening using 3,4-DNADCF, demonstrating the utility of this substrate.

  8. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based up

  9. A novel high-throughput irradiator for in vitro radiation sensitivity bioassays

    Science.gov (United States)

    Fowler, Tyler L.

    Given the emphasis on more personalized radiation therapy, there is an ongoing and compelling need to develop high-throughput screening tools to further examine the biological effects of ionizing radiation on cells, tissues, and organ systems in either the research or clinical setting. Conventional x-ray irradiators are designed to provide maximum versatility to radiobiology researchers, typically accommodating small animals, tissue or blood samples, and cellular applications. This added versatility often impedes the overall sensitivity and specificity of an experiment, resulting in a trade-off between the number of absorbed doses (or dose rates) and biological endpoints that can be investigated in vitro in a reasonable amount of time. Modern irradiator designs are therefore incompatible with current high-throughput bioassay technologies. Furthermore, important dosimetry and calibration characteristics (i.e., dose build-up region, beam attenuation, and beam scatter) of these irradiators are typically unknown to the end user, which can lead to significant deviations between delivered and intended dose to cells that adversely impact experimental results. The overarching goal of this research is therefore to design and develop a robust and fully automated high-throughput irradiator for in vitro radiation sensitivity investigations. Additionally, in vitro biological validation of this system was performed by assessing intracellular reactive oxygen species production, physical DNA double-strand breaks, and activation of cellular DNA repair mechanisms. Finally, the high-throughput irradiator was used to investigate autophagic flux, a cellular adaptive response, as a potential biomarker of radiation sensitivity.

  10. Cancer panomics: computational methods and infrastructure for integrative analysis of cancer high-throughput "omics" data

    DEFF Research Database (Denmark)

    Brunak, Søren; De La Vega, Francisco M.; Rätsch, Gunnar

    2014-01-01

    Targeted cancer treatment is becoming the goal of newly developed oncology medicines and has already shown promise in some spectacular cases such as the case of BRAF kinase inhibitors in BRAF-mutant (e.g. V600E) melanoma. These developments are driven by the advent of high-throughput sequencing...

  11. A high-throughput method for GMO multi-detection using a microfluidic dynamic array

    NARCIS (Netherlands)

    Brod, F.C.A.; Dijk, van J.P.; Voorhuijzen, M.M.; Dinon, A.Z.; Guimarães, L.H.S.; Scholtens, I.M.J.; Arisi, A.C.M.; Kok, E.J.

    2014-01-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organisms (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the nu

  12. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    DEFF Research Database (Denmark)

    Csiszar, Susan A.; Ernstoff, Alexi; Fantke, Peter

    2016-01-01

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or population per mass...

  13. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis (Intestinal-

  14. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  15. Development of in-house methods for high-throughput DNA extraction

    Science.gov (United States)

    Given the high-throughput nature of many current biological studies, in particular field-based or applied environmental studies, optimization of cost-effective, efficient methods for molecular analysis of large numbers of samples is a critical first step. Existing methods are either based on costly ...

  16. Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Y; Grossman, JC

    2014-12-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
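    The screening workflow described above (enumerate candidate molecules, locate metastable isomers, compute isomerization enthalpies, shortlist high-enthalpy hits) can be sketched as a simple filter over per-molecule energies. All molecule names, energies, and the threshold below are invented placeholders; a real workflow would obtain the energies from ab initio calculations.

```python
# Sketch of the high-throughput STF screening loop: for each candidate,
# the isomerization enthalpy is the energy difference between the
# metastable isomer and the ground state, i.e. the energy stored per
# molecule. All numbers are hypothetical placeholders.

candidates = {
    # name: (ground-state energy, metastable-isomer energy), arbitrary units
    "mol_A": (-100.0, -98.9),
    "mol_B": (-250.0, -249.8),
    "mol_C": (-80.0, -78.2),
}

def isomerization_enthalpy(e_ground: float, e_metastable: float) -> float:
    """Stored energy per molecule for one ground/metastable pair."""
    return e_metastable - e_ground

THRESHOLD = 1.0  # minimum stored energy to shortlist (placeholder value)

hits = sorted(
    name for name, (eg, em) in candidates.items()
    if isomerization_enthalpy(eg, em) >= THRESHOLD
)
print(hits)  # → ['mol_A', 'mol_C']
```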

  17. HIGH-THROUGHPUT IDENTIFICATION OF CATALYTIC REDOX-ACTIVE CYSTEINE RESIDUES

    Science.gov (United States)

    Cysteine (Cys) residues often play critical roles in proteins; however, identification of their specific functions has been limited to case-by-case experimental approaches. We developed a procedure for high-throughput identification of catalytic redox-active Cys in proteins by se...

  19. Investigation of non-halogenated solvent mixtures for high throughput fabrication of polymer-fullerene solar cells

    NARCIS (Netherlands)

    Schmidt-Hansberg, B.; Sanyal, M.; Grossiord, N.; Galagan, Y.O.; Baunach, M.; Klein, M.F.G.; Colsmann, A.; Scharfer, P.; Lemmer, U.; Dosch, H.; Michels, J.J; Barrena, E.; Schabel, W.

    2012-01-01

    The rapidly increasing power conversion efficiencies of organic solar cells are an important prerequisite for low-cost photovoltaics fabricated at high throughput. In this work we suggest indane as a non-halogenated replacement for the commonly used halogenated solvent o-dichlorobenzene. Indane w

  20. An industrial engineering approach to laboratory automation for high throughput screening

    OpenAIRE

    Menke, Karl C.

    2000-01-01

    Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation.

  1. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens, in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  3. An integrated framework for discovery and genotyping of genomic variants from high-throughput sequencing experiments

    NARCIS (Netherlands)

    Duitama, Jorge; Quintero, Juan Camilo; Cruz, Daniel Felipe; Quintero, Constanza; Hubmann, Georg; Foulquié-Moreno, Maria R.; Verstrepen, Kevin J.; Thevelein, Johan M.; Tohme, Joe

    2014-01-01

    Recent advances in high-throughput sequencing (HTS) technologies and computing capacity have produced unprecedented amounts of genomic data that have unraveled the genetics of phenotypic variability in several species. However, operating and integrating current software tools for data analysis still

  4. High-throughput data pipelines for metabolic flux analysis in plants.

    Science.gov (United States)

    Poskar, C Hart; Huege, Jan; Krach, Christian; Shachar-Hill, Yair; Junker, Björn H

    2014-01-01

    In this chapter we illustrate the methodology for high-throughput metabolic flux analysis. Central to this is developing an end-to-end data pipeline, crucial for integrating the wet-lab experiments and analytics, combining hardware and software automation, and standardizing data representation by providing importers and exporters that support third-party tools. The use of existing software at the start (data extraction from the chromatogram) and the end (MFA analysis) allows for the most flexibility in this workflow. Developing iMS2Flux provided a standard, extensible, platform-independent tool to act as the "glue" between these end points. Most importantly, this tool can easily be adapted to support different data formats, data verification, and data correction steps, allowing it to be central to managing the data necessary for high-throughput MFA. An additional tool was needed to automate the MFA software and, in particular, to take advantage of the coarse-grained parallel nature of high-throughput analysis and available high-performance computing facilities. In combination, these methods show the development of high-throughput pipelines that allow metabolic flux analysis to join as a full member of the omics family.
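    The coarse-grained parallelism described in this record — independent datasets each flowing through an extract → correct → analyze pipeline — can be sketched with Python's standard process pool. The stage functions below are toy stand-ins, not iMS2Flux's actual interfaces.

```python
# Minimal sketch of a coarse-grained parallel analysis pipeline: each
# dataset is processed independently end-to-end, so whole datasets can
# be farmed out to a process pool. The stages are toy stand-ins for the
# chromatogram-extraction, correction, and MFA steps described above.
from multiprocessing import Pool

def extract(raw: str) -> list[float]:
    """Parse a comma-separated measurement string (stand-in for extraction)."""
    return [float(x) for x in raw.split(",")]

def correct(values: list[float]) -> list[float]:
    """Toy baseline correction: subtract the minimum value."""
    baseline = min(values)
    return [v - baseline for v in values]

def analyze(values: list[float]) -> float:
    """Toy 'flux estimate': the mean of the corrected values."""
    return sum(values) / len(values)

def run_pipeline(raw: str) -> float:
    return analyze(correct(extract(raw)))

if __name__ == "__main__":
    datasets = ["1.0,2.0,3.0", "10.0,10.0,16.0"]
    with Pool(2) as pool:
        # one independent worker per dataset — coarse-grained parallelism
        results = pool.map(run_pipeline, datasets)
    print(results)
```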

  5. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross over between biology, informatics, and statistics, which is clearly refle

  6. A perspective on high throughput analysis of pesticide residues in foods

    Institute of Scientific and Technical Information of China (English)

    Kai ZHANG; Jon W WONG; Perry G WANG

    2011-01-01

    The screening of pesticide residues plays a vital role in food safety. Applications of high-throughput analytical procedures are desirable for screening a large number of pesticides and food samples in a time-efficient and cost-effective manner. This review discusses how the sample throughput of pesticide analysis could be improved, with an emphasis on sample preparation, instrumentation, and data analysis.

  8. A high-throughput, precipitating colorimetric sandwich ELISA microarray for Shiga toxins

    Science.gov (United States)

    Shiga toxins 1 and 2 (Stx1 and Stx2) from Shiga toxin-producing E. coli (STEC) bacteria were simultaneously detected with a newly developed, high-throughput antibody microarray platform. The proteinaceous toxins were immobilized and sandwiched between biorecognition elements (monoclonal antibodies)...

  9. High-throughput screening of carbohydrate-degrading enzymes using novel insoluble chromogenic substrate assay kits

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Willats, William George Tycho

    2016-01-01

    for this is that advances in genome and transcriptome sequencing, together with associated bioinformatics tools allow for rapid identification of candidate CAZymes, but technology for determining an enzyme's biochemical characteristics has advanced more slowly. To address this technology gap, a novel high-throughput assay...

  10. PLASMA PROTEIN PROFILING AS A HIGH THROUGHPUT TOOL FOR CHEMICAL SCREENING USING A SMALL FISH MODEL

    Science.gov (United States)

    Hudson, R. Tod, Michael J. Hemmer, Kimberly A. Salinas, Sherry S. Wilkinson, James Watts, James T. Winstead, Peggy S. Harris, Amy Kirkpatrick and Calvin C. Walker. In press. Plasma Protein Profiling as a High Throughput Tool for Chemical Screening Using a Small Fish Model (Abstra...

  11. High-throughput synthesis and characterization of nanocrystalline porphyrinic zirconium metal-organic frameworks.

    Science.gov (United States)

    Kelty, M L; Morris, W; Gallagher, A T; Anderson, J S; Brown, K A; Mirkin, C A; Harris, T D

    2016-06-14

    We describe and employ a high-throughput screening method to accelerate the synthesis and identification of pure-phase, nanocrystalline metal-organic frameworks (MOFs). We demonstrate the efficacy of this method through its application to a series of porphyrinic zirconium MOFs, resulting in the isolation of MOF-525, MOF-545, and PCN-223 on the nanoscale.

  12. High throughput screening of physicochemical properties and in vitro ADME profiling in drug discovery.

    Science.gov (United States)

    Wan, Hong; Holmén, Anders G

    2009-03-01

    Current advances in new technologies, combining robotic automated assays with highly selective and sensitive LC-MS, enable high-speed screening of lead series libraries in many in vitro assays. In this review, we summarize state-of-the-art high-throughput assays for screening of key physicochemical properties such as solubility, lipophilicity, pKa, drug-plasma protein binding, and brain tissue binding, as well as in vitro ADME profiling. We discuss two primary approaches for high-throughput screening of solubility, i.e., an automated 96-well plate assay integrated with LC-MS and a rapid multi-wavelength UV plate reader. We address the advantages of newly developed miniaturized techniques for high-throughput pKa screening by capillary electrophoresis combined with mass spectrometry (CE-MS) with an automated data analysis flow. Several new lipophilicity approaches other than octanol-water partitioning are critically reviewed, including a rapid liquid-chromatographic retention-based approach, immobilized artificial membrane (IAM) partitioning, liposome partitioning, and microemulsion electrokinetic chromatography (MEEKC) for accurate screening of LogP. We highlight sample pooling (namely cassette dosing, all-in-one, cocktail) as an efficient approach for high-throughput screening of physicochemical properties and in vitro ADME profiling, with emphasis on the benefit of on-line quality control. This cassette dosing approach has been widely adopted in drug discovery for rapid screening of in vivo pharmacokinetic parameters, with significantly increased capacity and dramatically reduced animal usage.
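    The cassette-dosing (sample-pooling) approach highlighted in this review amounts to partitioning a compound library into fixed-size pools so that each assay run covers several analytes at once. The chunking helper below is a generic bookkeeping sketch; the pool size and compound identifiers are hypothetical, not values from the paper.

```python
# Sketch of cassette-dosing bookkeeping: partition a compound library
# into fixed-size pools ("cassettes") so one run screens several
# compounds at once. Pool size and compound IDs are illustrative.

def make_cassettes(compounds: list[str], pool_size: int) -> list[list[str]]:
    """Split the library into consecutive pools of at most pool_size."""
    return [compounds[i:i + pool_size]
            for i in range(0, len(compounds), pool_size)]

library = [f"CMPD-{i:03d}" for i in range(1, 11)]   # 10 hypothetical compounds
cassettes = make_cassettes(library, pool_size=4)
print(len(cassettes), cassettes[0])
# 3 pools: two of four compounds and a final pool of two
```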

  13. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free of t...

  14. GiNA, an efficient and high-throughput software for horticultural phenotyping

    Science.gov (United States)

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...

  15. High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field.

    Science.gov (United States)

    Shakoor, Nadia; Lee, Scott; Mockler, Todd C

    2017-08-01

    Effective implementation of technology that facilitates accurate and high-throughput screening of thousands of field-grown lines is critical for accelerating crop improvement and breeding strategies for higher yield and disease tolerance. Progress in the development of field-based high throughput phenotyping methods has advanced considerably in the last 10 years through technological progress in sensor development and high-performance computing. Here, we review recent advances in high throughput field phenotyping technologies designed to inform the genetics of quantitative traits, including crop yield and disease tolerance. Successful application of phenotyping platforms to advance crop breeding and identify and monitor disease requires: (1) high resolution of imaging and environmental sensors; (2) quality data products that facilitate computer vision, machine learning and GIS; (3) capacity infrastructure for data management and analysis; and (4) automated environmental data collection. Accelerated breeding for agriculturally relevant crop traits is key to the development of improved varieties and is critically dependent on high-resolution, high-throughput field-scale phenotyping technologies that can efficiently discriminate better performing lines within a larger population and across multiple environments. Copyright © 2017. Published by Elsevier Ltd.

  17. High-throughput genotoxicity assay identifies antioxidants as inducers of DNA damage response and cell death

    Science.gov (United States)

    Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...

  18. High-throughput semiquantitative analysis of insertional mutations in heterogeneous tumors

    NARCIS (Netherlands)

    Koudijs, M.J.; Klijn, C.; van der Weyden, L.; Kool, J.; ten Hoeve, J.; Sie, D.; Prasetyanti, P.R.; Schut, E.; Kas, S.; Whipp, T.; Cuppen, E.; Wessels, L.; Adams, D.J.; Jonkers, J.

    2011-01-01

    Retroviral and transposon-based insertional mutagenesis (IM) screens are widely used for cancer gene discovery in mice. Exploiting the full potential of IM screens requires methods for high-throughput sequencing and mapping of transposon and retroviral insertion sites. Current protocols are based on

  20. The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping

    Science.gov (United States)

    Wei, Weiqi

    2012-01-01

    Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…